CS424
Parallel Computing Lab #4
1 MPI Communications
In MPI (Message Passing Interface), communication plays a crucial role in coordinating work: MPI processes run
independently and share no memory, so explicit message passing is the only way for them to coordinate. There are
two types of communication within MPI:
1. Point-to-Point Communication: involves sending and receiving messages between two specific processes
within the same communicator. It allows processes to exchange data directly with each other.
• Types of Point-to-Point Operations:
- Blocking Send/Receive: The call does not return until it is locally complete; for example,
MPI_Send returns only once the send buffer can safely be reused, which may (but need not)
mean the matching receive has started.
- Non-blocking Send/Receive: The call (e.g., MPI_Isend or MPI_Irecv) only initiates the
transfer and returns immediately; the process continues execution and completes the
operation later with MPI_Wait or MPI_Test.
2. Collective Communication: involves all processes in a communicator working together as a group, with every
process making the same call. It uses collective MPI functions, such as the following (a combined sketch
appears after this list):
- MPI_Scatter: Distributes distinct chunks of an array from one root process to every process.
- MPI_Reduce: Combines a value from every process, using a reduction operation (e.g., sum, max),
into a single result on the root.
- MPI_Bcast: Broadcasts data from one root process to all others.
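To make the collective pattern concrete, here is a minimal, self-contained sketch (it is not taken from the lab's
Code folder; the broadcast value of 10 and the array contents are arbitrary illustrative choices). The root
broadcasts a factor to everyone, scatters one array element to each process, each process computes a partial
result, and MPI_Reduce sums the partial results back on the root:

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[]) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* MPI_Bcast: the root's value of 'factor' ends up on every process. */
    int factor = (rank == 0) ? 10 : 0;
    MPI_Bcast(&factor, 1, MPI_INT, 0, MPI_COMM_WORLD);

    /* MPI_Scatter: the root hands each process one element of 'data'. */
    int *data = NULL;
    if (rank == 0) {
        data = malloc(size * sizeof(int));
        for (int i = 0; i < size; i++)
            data[i] = i + 1;                /* 1, 2, ..., size */
    }
    int mine = 0;
    MPI_Scatter(data, 1, MPI_INT, &mine, 1, MPI_INT, 0, MPI_COMM_WORLD);

    int partial = mine * factor;            /* each process works on its piece */

    /* MPI_Reduce: sum every process's partial result on the root. */
    int total = 0;
    MPI_Reduce(&partial, &total, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0) {
        printf("total = %d\n", total);      /* 10 * (1 + 2 + ... + size) */
        free(data);
    }
    MPI_Finalize();
    return 0;
}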
2 Examples
1. The following program implements a simple send/receive communication. Compile and run the program and
study the output of several runs.
Code 1
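The listing itself is distributed with the lab materials and is not reproduced here. As a rough sketch of the kind
of program described (the actual Code 1 may differ), assume each worker process sends one integer result to
process 0, which receives the results in fixed rank order:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        /* Receive from ranks 1, 2, ... in a fixed order, regardless of
           which worker actually finishes first (see Practice item 1). */
        for (int src = 1; src < size; src++) {
            int value;
            MPI_Recv(&value, 1, MPI_INT, src, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            printf("Process 0 received %d from process %d\n", value, src);
        }
    } else {
        int value = rank * rank;            /* arbitrary per-process result */
        MPI_Send(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }
    MPI_Finalize();
    return 0;
}

With a standard MPI installation, a program like this can be built and launched with, for example,
mpicc sendrecv.c -o sendrecv followed by mpirun -np 4 ./sendrecv (the file name here is arbitrary).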
2. The following program demonstrates sending and receiving messages between two processes in a ping-pong
fashion. The program assumes the communicator contains exactly two processes and does not work otherwise.
Note that:
• Process 0: Sends a message to process 1 and then receives a message back.
• Process 1: Receives a message from process 0, sends a response, and then receives another
message.
Compile and run the program and study the output of several runs.
Code 2
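Again, the actual listing ships with the lab; the sketch below only illustrates the ping-pong pattern described
above, with ranks 0 and 1 bouncing a counter back and forth for a fixed number of rounds (the round count of 4 is
an arbitrary choice):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    int rank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    int ball = 0;                           /* the value being bounced */
    const int rounds = 4;                   /* arbitrary number of exchanges */

    if (rank == 0) {
        for (int i = 0; i < rounds; i++) {
            MPI_Send(&ball, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(&ball, 1, MPI_INT, 1, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            printf("Process 0 got the ball back: %d\n", ball);
        }
    } else if (rank == 1) {
        for (int i = 0; i < rounds; i++) {
            MPI_Recv(&ball, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            ball++;                         /* "hit" the ball back */
            MPI_Send(&ball, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        }
    }
    /* In this sketch, ranks other than 0 and 1 simply do nothing. */
    MPI_Finalize();
    return 0;
}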
3 Practice
1. Revisit Code 1 and change the program so that process 0 receives the results in the order in which the
sending processes finish.
2. Explain the execution of Code 2 if the size of the communicator is not 2.
3. Compile and run the code in mpiSumArray.c (in the Code folder), study the behavior of the program across
several runs, and write the proper code lines to address the comments marked as “// ** display…”. Your output
should be similar to the example below.
4. Write an MPI program in C to calculate the sum of two predefined arrays of at least length 12. The program
should ensure that it only runs when the number of processes is four or more. Use parallel execution to
compute partial sums and then combine them to get the final result.