
Parallel Computing Concepts - Complete Notes

1. Introduction

Parallel computing is the simultaneous use of multiple compute resources to solve a computational problem faster by dividing the work among them.

2. Key Terminologies

- Process: An independent instance of a program in execution.

- Processor: Hardware that executes instructions.

- Speedup = Sequential Time / Parallel Time

- Efficiency = Speedup / Number of processors (see the worked example after this list)
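For example (illustrative numbers, not from the notes): if a program takes 120 s sequentially and 40 s on 4 processors, then Speedup = 120 / 40 = 3 and Efficiency = 3 / 4 = 0.75, i.e. 75% of ideal linear speedup.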

3. Parallel Architectures

- Shared Memory: Multiple processors access the same memory.

- Distributed Memory: Each processor has its own local memory; communication via message passing.

4. Parallel Programming Models

- Message Passing (MPI): Explicit communication between processes.

- Shared Memory (OpenMP): Threads share a common memory space (minimal sketches of both models follow this list).
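As an illustration of the shared-memory model (a minimal sketch, not from the original notes; assumes a C compiler with OpenMP support, e.g. gcc -fopenmp), threads divide a loop and combine partial results with a reduction:

#include <stdio.h>
#include <omp.h>

int main(void) {
    double sum = 0.0;
    /* Threads share the loop range; reduction(+:sum) safely combines
       each thread's partial sum into the shared variable. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < 1000000; i++) {
        sum += 1.0 / (i + 1);
    }
    printf("sum = %f (up to %d threads)\n", sum, omp_get_max_threads());
    return 0;
}

A corresponding message-passing sketch with explicit communication between two processes (assumes an MPI installation; compile with mpicc, run with mpirun -np 2 or more):

#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        int msg = 42;
        /* Explicit point-to-point send from process 0 to process 1. */
        MPI_Send(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        int msg;
        MPI_Recv(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("process 1 received %d from process 0\n", msg);
    }

    MPI_Finalize();
    return 0;
}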

5. Common Parallel Algorithms

- Matrix Multiplication: Divide matrices into blocks; each processor computes a block.

- Prefix Sum: Compute cumulative sums in parallel using tree-based approach.

- Broadcast and Reduction: Communication operations to distribute or combine data (a prefix-sum sketch and the MPI collective calls follow this list).
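A minimal sketch of the tree-based prefix sum (Hillis-Steele scheme; illustrative, not from the notes). The inner loop of each step is conceptually parallel, so the whole scan finishes in log2(N) steps:

#include <stdio.h>
#include <string.h>

#define N 8

int main(void) {
    int x[N] = {3, 1, 7, 0, 4, 1, 6, 3};
    int tmp[N];

    /* In step d, element i adds the element d positions to its left.
       All additions of a step are independent, so a parallel machine
       can do them at once. */
    for (int d = 1; d < N; d *= 2) {
        memcpy(tmp, x, sizeof x);
        for (int i = d; i < N; i++)     /* conceptually parallel */
            x[i] = tmp[i] + tmp[i - d];
    }

    for (int i = 0; i < N; i++)
        printf("%d ", x[i]);            /* prints 3 4 11 11 15 16 22 25 */
    printf("\n");
    return 0;
}

For Broadcast and Reduction, MPI's collective operations can be used directly, e.g. (assuming the MPI setup shown earlier):

MPI_Bcast(&value, 1, MPI_INT, 0, MPI_COMM_WORLD);                       /* root 0 distributes value to all ranks */
MPI_Reduce(&local, &global, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);    /* combine partial sums at root 0 */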

6. Performance Metrics

- Speedup, Efficiency, Scalability, Load Balancing, Communication Overhead

7. Example Diagram: Recursive Doubling Broadcast on 8-node Ring

Step 1: Node 0 sends to Node 1


0 --> 1

Step 2: Nodes 0 and 1 send to Nodes 2 and 3


0,1 --> 2,3

Step 3: Nodes 0-3 send to Nodes 4-7


0,1,2,3 --> 4,5,6,7

After log2(8) = 3 steps, every node has received the data.
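A small simulation of this recursive doubling pattern (an illustrative sketch; a real implementation would use MPI_Send/MPI_Recv between ranks):

#include <stdio.h>
#include <stdbool.h>

#define P 8   /* number of nodes */

int main(void) {
    bool has_data[P] = { true };        /* only node 0 starts with the data */

    /* In step s, every node that already has the data sends it to the
       node 2^(s-1) positions away, so the informed set doubles: 1, 2, 4, 8. */
    for (int dist = 1, step = 1; dist < P; dist *= 2, step++) {
        for (int src = 0; src < dist; src++) {
            int dst = src + dist;
            has_data[dst] = has_data[src];   /* "send" from src to dst */
            printf("Step %d: node %d sends to node %d\n", step, src, dst);
        }
    }
    return 0;
}

Because the informed set doubles every step, broadcasting to P nodes takes log2(P) communication steps instead of P - 1 with a naive one-by-one send.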
