Comprehensive Notes: Parallel Computing (CS 526)
Based on Lectures from Muhammad Nadeem Nadir
Department of Computer Science, The University of Lahore
Slide 1: Introduction to Parallel Computing
Parallel computing means using multiple processors or computers to solve a problem faster. Instead
of one processor doing all the work, the workload is divided.
Example: Instead of one person making an entire pizza, one person prepares the dough, another
adds toppings, and a third bakes it.
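The pizza analogy can be sketched in code. The following is a minimal illustration (a hypothetical example, not from the lecture) of dividing one workload across several workers: a large sum is split into chunks, each handled by its own worker.

```python
# Dividing one workload across several workers: sum a large range
# by splitting it into chunks, one per worker.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker handles only its own slice of the data.
    return sum(chunk)

data = list(range(1_000_000))
n_workers = 4
chunk_size = len(data) // n_workers
chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

with ThreadPoolExecutor(max_workers=n_workers) as pool:
    # Combine the partial results into the final answer.
    total = sum(pool.map(partial_sum, chunks))

print(total == sum(data))  # dividing the work gives the same answer
```

Note that in CPython, true CPU parallelism would require processes rather than threads; the point here is only the divide-and-combine structure.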
Slide 2: Concurrency vs. Parallelism
Concurrency: Multiple tasks execute in overlapping time periods but not necessarily at the same
time.
Example: A person eating and talking: they pause between bites to talk.
Parallelism: Multiple tasks execute at the same time.
Example: A chef cooking while talking on the phone; both actions happen simultaneously.
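The eating-and-talking example can be demonstrated with cooperative tasks. The sketch below (hypothetical, not from the lecture) runs two tasks that take turns on a single event loop: both make progress in overlapping time, but only one runs at any instant, which is concurrency rather than parallelism.

```python
import asyncio

log = []

async def eat():
    for i in range(3):
        log.append(f"bite {i}")
        await asyncio.sleep(0)   # pause between bites: yield control

async def talk():
    for i in range(3):
        log.append(f"word {i}")
        await asyncio.sleep(0)   # pause between words: yield control

async def main():
    # Both tasks overlap in time on one event loop -- concurrency.
    await asyncio.gather(eat(), talk())

asyncio.run(main())
print(log)
```

Parallelism, by contrast, would need the two tasks running on separate cores at the same instant.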
Slide 5: Growth in CPU Transistor Count
Moore's Law observes that the number of transistors on a chip doubles roughly every 18 months to two years, improving performance.
However, increasing clock speed causes excessive heat and power consumption.
Example: A car engine cannot run indefinitely faster because of overheating and fuel limitations.
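As a back-of-the-envelope check (an illustration, not from the slides): doubling every 18 months means growth by a factor of 2^(months/18), which works out to roughly a 100x transistor increase per decade.

```python
# Moore's Law growth: doubling every 18 months over one decade.
months = 10 * 12                # ten years in months
growth = 2 ** (months / 18)     # number of doublings in that span
print(round(growth))            # roughly a 100x increase
```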
Slide 6: Single-core CPU Limitations
Single-core CPUs have three major problems:
- Power Consumption: Higher clock speeds require more power.
- Heat Dissipation: More power generates more heat.
- Limited Memory Access: a single core is constrained by how quickly it can move data to and from memory, so memory becomes a bottleneck.
Example: A single cashier handling customers vs. multiple cashiers for efficiency.
Slide 8: Multi-core vs. Single-core Performance
Single-core processors rely on higher clock speeds, which increase heat and power consumption.
Multi-core processors distribute the workload.
Example: A group of people solving a puzzle together is faster than one person solving it alone.
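A simple calculation (a hedged illustration, not from the slides, with made-up numbers) shows both the benefit of distributing work and why scaling is not perfectly linear: splitting the work shrinks with core count, but a fixed coordination cost does not.

```python
# Ideal vs. realistic scaling: parallel work shrinks with more cores,
# but a fixed coordination cost remains.
work = 100.0          # units of parallelizable work (assumed)
overhead = 5.0        # fixed coordination cost (assumed)
times = {}
for cores in (1, 2, 4, 8):
    times[cores] = overhead + work / cores
    print(cores, "cores ->", times[cores], "time units")
```

With these numbers, 8 cores finish in 17.5 units instead of 105, a speedup of 6x rather than the ideal 8x, because the overhead never shrinks.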
Slide 10: The Cache Coherence Problem
Cache coherence ensures all processor cores see the same data.
Example: If a teacher updates a class schedule but only some students get the new version, others
still follow the old one.
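The stale-schedule problem can be mimicked in a few lines. This is a toy sketch (hypothetical, not a real hardware model): two cores each hold a private copy of a value, and without coherence a write by one core leaves the other reading stale data.

```python
# Two cores cache the same value; without coherence, one core's
# write is invisible to the other.
main_memory = {"schedule": "Mon 9am"}
cache_core0 = dict(main_memory)   # core 0 caches its own copy
cache_core1 = dict(main_memory)   # core 1 caches its own copy

cache_core0["schedule"] = "Tue 10am"   # core 0 updates its cached copy
print(cache_core0["schedule"])          # the new value
print(cache_core1["schedule"])          # stale: still the old value

# A coherence protocol would invalidate core 1's copy on the write,
# forcing it to re-fetch the updated value from core 0 or memory.
```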
Slide 11: Cache Coherence Protocols (MESI & MOESI)
MESI Protocol states that cache blocks can be in one of four states:
- M (Modified): Updated locally but not in main memory.
- E (Exclusive): Present only in one cache, matches main memory.
- S (Shared): Available in multiple caches.
- I (Invalid): The copy is stale and must be re-fetched before use.
MOESI adds an 'Owned' state to allow direct sharing between caches.
Example: A library with multiple copies of a book: some are borrowed (Modified), some are reserved
(Exclusive), and some are on shelves (Shared).
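The four MESI states can be traced with a toy state machine. This is a simplified sketch (hypothetical; real protocols also track bus transactions and write-backs) showing how one cache line moves through the states as local and remote cores touch it.

```python
# Toy MESI transitions for a single cache line in one core's cache.
def on_local_write(state):
    # A local write makes the line Modified; other copies get invalidated.
    return "M"

def on_remote_read(state):
    # Another core reads the line: a Modified/Exclusive copy becomes Shared.
    return "S" if state in ("M", "E") else state

def on_remote_write(state):
    # Another core writes the line: our copy becomes Invalid.
    return "I"

state = "E"                     # loaded alone: Exclusive, matches memory
state = on_local_write(state)   # we write it: Modified
state = on_remote_read(state)   # another core reads it: Shared
state = on_remote_write(state)  # another core writes it: Invalid
print(state)
```

MOESI's extra Owned state would let the writing cache supply the Shared data directly to other caches without first writing it back to main memory.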