1. MPI One-Sided and Two-Sided Communication Computer Science Colloquium
Akinwunmi (Akin) Odeniyi
Advisor: Dr. Kai Wang
2. MPI One-Sided and Two-Sided Communication: Table of Contents MPI Communication
Introduction to MPI Communication
Example of MPI Communication
MPI Two-Sided Communication
Definition of MPI Two-Sided Communication
Example of MPI Two-Sided Communication
Diagram of MPI Two-Sided Communication
MPI One-Sided Communication
Definition of MPI One-Sided Communication
Example of MPI One-Sided Communication
Diagrams of MPI One-Sided Communication
3. MPI One-Sided and Two-Sided Communication: Table of Contents Transposing an n by n matrix using two processors
Program code using Two-Sided Communication
Program code using One-Sided Communication
Performance study of MPI communication models
Tabulation of data
Graphs of findings
Analysis of results
Conclusion
Review
Sources
Thanks
4. MPI Communication: Introduction MPI (Message Passing Interface) is a
Library specification for message passing.
Standard for writing message-passing programs.
It allows many computers to communicate.
There are two types of MPI Communication:
MPI Two-Sided Communication Model
MPI Standard 1 (MPI-1).
MPI One-Sided Communication Model
MPI Standard 2 (MPI-2).
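For orientation, here is a minimal MPI program in C. It is a sketch, not taken from the slides; it only shows the boilerplate every MPI program shares: initialize the library, query the process rank and count, and shut down.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    int rank, size;
    MPI_Init(&argc, &argv);                /* start the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's id */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes */
    printf("Process %d of %d\n", rank, size);
    MPI_Finalize();                        /* shut down MPI */
    return 0;
}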
5. MPI Communication: Why is Communication Needed? Assume we want to add the numbers 1 to 100.
Let a processor perform an addition in 1 second.
This processor will sum all numbers in 99 seconds.
Two processors can reduce computation time.
Each processor will sum its 50 numbers in 49 seconds.
It requires 1 second to add the two final results.
Processors need to communicate in order to:
Distribute the numbers 1 to 100 evenly.
Determine the sum of their computation results.
Communication is therefore essential for effective MPI programming.
6. Two-Sided Communication in MPI-1: Definition Two-Sided Communication
Evident in the use of most MPI function calls.
Parameters are specified by both the sender and the receiver.
Data moves between the processes' address spaces.
Sender's side: the MPI_Send() function is called.
Receiver's side: the MPI_Recv() function is called.
Implies some degree of synchronization:
A receive can't complete until the send has started.
7. MPI Two-Sided Communication: Example The sum of 1 to 100 described above can be computed with the Two-Sided communication model.
Processor 1
Sends the numbers {1, 3, ..., 99} to processor 2.
Receives the sum of {1, 3, ..., 99} from processor 2.
Processor 2
Receives the sequence {1, 3, ..., 99} from processor 1.
Sends the sum 1 + 3 + ... + 99 to processor 1.
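A sketch of this exchange, assuming rank holds the id returned by MPI_Comm_rank (rank 0 plays "processor 1", rank 1 plays "processor 2"; variable names are illustrative):

int odds[50], sum = 0, i;
if (rank == 0) {
    for (i = 0; i < 50; i++) odds[i] = 2 * i + 1;   /* 1, 3, ..., 99 */
    MPI_Send(odds, 50, MPI_INT, 1, 0, MPI_COMM_WORLD);
    MPI_Recv(&sum, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
} else if (rank == 1) {
    MPI_Recv(odds, 50, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    for (i = 0; i < 50; i++) sum += odds[i];        /* 1 + 3 + ... + 99 = 2500 */
    MPI_Send(&sum, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
}

Each send is matched by a receive on the other side, which is exactly the two-sided pairing the definition above describes.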
8. Two-Sided Communication: Diagram
9. One-Sided Communication in MPI-2: Definition Remote Memory Access (RMA)
One process specifies communication parameters.
Separates communication and synchronization.
The user imposes the right ordering of memory accesses.
Origin: the process that performs the call.
Target: the process in which memory is accessed.
Communication calls
MPI_Get: Remote read.
MPI_Put: Remote write.
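A minimal RMA sketch, assuming two processes and a rank variable from MPI_Comm_rank (everything else is illustrative): the target exposes a single integer through a window, and the origin reads it with MPI_Get between two fences.

int value = 0, result = 0;
MPI_Win win;
if (rank == 1) value = 42;                           /* target fills its memory */
MPI_Win_create(&value, sizeof(int), sizeof(int),     /* collective: every process */
               MPI_INFO_NULL, MPI_COMM_WORLD, &win); /* exposes its own value */
MPI_Win_fence(0, win);                               /* open the access epoch */
if (rank == 0)                                       /* origin reads from rank 1 */
    MPI_Get(&result, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
MPI_Win_fence(0, win);                               /* result is valid from here */
MPI_Win_free(&win);

Note that rank 1 never posts a matching receive: the origin alone specifies the communication parameters, and the fences provide the synchronization.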
10. MPI One-Sided Communication: Example Given a sequence {1, ..., n} and 2 processors.
Processor 1 loops through {1, ..., n}. It
Informs processor 2 each time it finds a prime.
Does not know when the next prime will be found.
Uses MPI_Put to notify processor 2 of primes.
Processor 2 need not loop through {1, ..., n}. It
Does not know how many primes will be found.
Remains idle until processor 1 finds and sends a prime.
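A sketch of this pattern, assuming rank, an upper bound n, and a user-supplied is_prime() test (all illustrative). One caveat: MPI_Win_fence is a collective call, so with this synchronization style processor 2 still takes part in every epoch rather than being fully idle.

int candidate, buf = 0;                              /* buf: window memory on rank 1 */
MPI_Win win;
MPI_Win_create(rank == 1 ? (void *)&buf : NULL, rank == 1 ? sizeof(int) : 0,
               sizeof(int), MPI_INFO_NULL, MPI_COMM_WORLD, &win);
for (candidate = 2; candidate <= n; candidate++) {
    MPI_Win_fence(0, win);                           /* open access epoch */
    if (rank == 0 && is_prime(candidate))            /* origin writes the prime */
        MPI_Put(&candidate, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
    MPI_Win_fence(0, win);                           /* close epoch */
    if (rank == 1 && buf != 0) {                     /* a new prime has arrived */
        printf("prime: %d\n", buf);
        buf = 0;                                     /* reset between epochs */
    }
}
MPI_Win_free(&win);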
11. One-Sided Communication: MPI_Get
12. One-Sided Communication: MPI_Put
13. One-Sided and Two-Sided Communication: Transposing a Matrix Transposing an n x n matrix named M.
For all integers i in [1, n]
For all integers j in [1, n]
M[j][i] = M[i][j]
The programs were written in the C language.
The programs ran on 2 processors on the rio machine.
matTran2.c used MPI_Send and MPI_Recv.
matTran1.c used the MPI_Get function.
The duration of the transposition is given in seconds.
14. Transposing a Matrix with Two-Sided Communication: matTran2.c
t1 = MPI_Wtime();                      /* start timing */
for (i = 0; i < numRows; i++) {
    for (j = 0; j < numCols; j++) {
        if (myid == 0) {               /* sender streams entries in row order */
            matrixVal = matrixArray[k]; k++;
            MPI_Send(&matrixVal, 1, MPI_INT, 1, tag, MPI_COMM_WORLD);
        }
        if (myid == 1) {               /* receiver stores each entry transposed */
            MPI_Recv(&matrixVal, 1, MPI_INT, 0, tag, MPI_COMM_WORLD, &status);
            matrixArray[(numRows * j) + i] = matrixVal;
        }
    }
}
t2 = MPI_Wtime();                      /* stop timing */
15. Transposing a Matrix with One-Sided Communication: matTran1.c
if (myid == 0) {                       /* measure the overhead of two timer calls */
    t1 = MPI_Wtime();
    t2 = MPI_Wtime();
    t3 = t2 - t1;
}
t1 = MPI_Wtime();                      /* start timing */
if (myid == 0)                         /* rank 0 exposes matrixVal in an RMA window */
    MPI_Win_create(&matrixVal, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);
else                                   /* rank 1 creates an empty window */
    MPI_Win_create(NULL, 0, 1, MPI_INFO_NULL, MPI_COMM_WORLD, &win);
for (i = 0; i < numRows; i++) {
    for (j = 0; j < numCols; j++) {
16. Transposing a Matrix with One-Sided Communication: matTran1.c
        if (myid == 0) {               /* rank 0 loads the next entry into the window */
            matrixVal = matrixArray[k]; k++;
            MPI_Win_fence(0, win); MPI_Win_fence(0, win);  /* rank 1's get lands between */
        }
        if (myid == 1) {
            MPI_Win_fence(0, win);
            MPI_Get(&matrixVal, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
            MPI_Win_fence(0, win);     /* the get is complete after this fence */
            matrixArray[(numRows * j) + i] = matrixVal;
        }
    } }                                /* ends for loops */
MPI_Win_free(&win);
t2 = MPI_Wtime();
duration = t2 - t1 - t3;               /* subtract the timer-call overhead t3 */
17. Performance of One-Sided and Two-Sided Communication: Table
18. Performance of One-Sided and Two-Sided Communication: Graph 1
19. Performance of One-Sided and Two-Sided Communication: Graph 2
20. Performance of One-Sided and Two-Sided Communication: Analysis The MPI Two-Sided functions transposed the n by n matrix in less time for all n observed.
The time measured for the One-Sided MPI_Get version increased at a faster rate as n increased.
matTran2 halted for n >= 512 with an out-of-resources error, but matTran1 ran for n > 512.
The elapsed time for n = 1 was greater than that for n = 2 in both communication models.
21. MPI One-Sided and Two-Sided Communication: Summary For efficiency, the MPI Two-Sided communication model is the better and faster choice.
Every Two-Sided communication problem can be solved with One-Sided communication.
Moreover, some problems cannot be solved with the MPI Two-Sided communication model.
Only the MPI One-Sided model will work for the prime number example mentioned earlier.
22. MPI One-Sided and Two-Sided Communication: Future MPI_Probe() in MPI Standard 1:
Can solve problems otherwise handled with One-Sided communication.
Allows an incoming message to be checked for.
Does not receive incoming messages.
Future research:
Comparing performance of the two models.
Solving a one-sided communication problem:
Using MPI_Send, MPI_Recv, and MPI_Probe (sketched below).
Using MPI_Get and/or MPI_Put.
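A sketch of the probe-based approach on the receiving side (rank 1), assuming rank 0 sends each prime with PRIME_TAG and a final message with DONE_TAG; both tags are illustrative constants. MPI_Probe blocks until a message is available but leaves it unreceived, so the receiver needs no advance knowledge of how many primes will arrive.

enum { PRIME_TAG = 1, DONE_TAG = 2 };                /* assumed message tags */
MPI_Status status;
int prime, done = 0;
while (!done) {
    MPI_Probe(0, MPI_ANY_TAG, MPI_COMM_WORLD, &status);  /* wait, don't receive */
    MPI_Recv(&prime, 1, MPI_INT, 0, status.MPI_TAG,
             MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    if (status.MPI_TAG == DONE_TAG)
        done = 1;                                    /* sender has finished its loop */
    else
        printf("prime: %d\n", prime);                /* a PRIME_TAG message */
}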
23. MPI One-Sided and Two-Sided Communication: Sources MPICH Home Page: http://www.mcs.anl.gov/research/projects/mpi/mpich1/
MPICH2: High-performance and widely portable MPI http://www.mcs.anl.gov/research/projects/mpich2/
High Performance Computing Class Notes: http://www.usd.edu/~Kai.Wang/Teach.html
24. MPI One-Sided and Two-Sided Communication: Appreciation Thanks to computer science faculty and staff.
Thank you for your time and attention.
Gratitude goes to friends and family.
Glory be to God!
Questions?