
Parallel Computing Using MPI





  1. Parallel Computing Using MPI • Parallel Computing • MPI • CS Lab setup • Simple MPI program • MPI data types and user-defined MPI data types

  2. Parallel Computing • Traditional computing is sequential: only one instruction can be executed at any given moment in time. • Parallel computing uses multiple computers (or multiple processors on the same machine) to execute multiple instructions simultaneously.

  3. Parallel Computing • Motivations • Solve a problem more quickly • Solve very large problems (scalability issue) • Save money by using a cluster of cheap computers instead of a supercomputer

  4. Parallel Computing • Cluster computing: multiple independent computers combined into a parallel computing unit • Hardware is readily available • Nodes of the cluster communicate by exchanging messages • Two free message-passing packages • MPI: Message Passing Interface • PVM: Parallel Virtual Machine

  5. MPI • Provides library functions for passing messages in a parallel environment • Fortran, C and C++ programs are written as normal and then linked against the MPI library

  6. MPI • Three software packages that support MPI for parallel computing • MPICH – a free, portable implementation of MPI • MPICH2 – an all-new implementation of MPI • LAM/MPI – another implementation of MPI

  7. CS Lab Setup • The Lazar lab is set up with MPICH, so each Linux machine can be used as a node in a parallel computing cluster
