
CSE 664 Parallel Computer Architecture Definitions






Presentation Transcript


  1. CSE 664 Parallel Computer Architecture Definitions. GRAIN SIZE: Fine, Medium, Coarse. Grain size is the size of the basic program segment chosen for parallel processing.

  2. Latency: the communication time between subsystems. The main forms are memory latency and synchronization latency. Balance granularity and latency to obtain better performance.

  3. Parallelism Levels • Instruction Level: fewer than 20 instructions; FINE GRAIN • Loop Level: fewer than 500 instructions • Procedure Level: subroutines of roughly 2000 instructions; MEDIUM GRAIN • Subprogram Level: message passing, multiprogramming • Program Level: COARSE GRAIN. (A loop-level sketch follows below.)
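To illustrate loop-level parallelism, here is a minimal sketch in C using OpenMP, where each independent loop iteration is a fine-grain unit of work. The array size and loop body are hypothetical examples, not taken from the slides.

```c
#include <stdio.h>

#define N 1000  /* hypothetical problem size */

int main(void) {
    double a[N], b[N], c[N];

    /* Initialize inputs (serial). */
    for (int i = 0; i < N; i++) {
        a[i] = i;
        b[i] = 2.0 * i;
    }

    /* Loop-level parallelism: each iteration is an independent,
       small unit of work that a parallelizing compiler or OpenMP
       runtime can distribute across threads. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++) {
        c[i] = a[i] + b[i];
    }

    printf("c[%d] = %f\n", N - 1, c[N - 1]);
    return 0;
}
```

Compiled with OpenMP support (e.g., `cc -fopenmp`), the pragma spreads the iterations over threads; without it, the code still runs serially.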

  4. Fine grain provides a higher degree of parallelism but incurs heavy communication and scheduling overhead; it is assisted by a parallelizing compiler. • Coarse grain relies heavily on an effective OS and on the efficiency of the parallel algorithm. • Medium grain parallelism relies on both the programmer and the compiler.

  5. Shared-variable communication is used to support fine- and medium-grain computations. • N tasks communicating pairwise with each other require N(N-1)/2 communication links. (A worked sketch follows below.)
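To make the N(N-1)/2 count concrete, here is a small C sketch that computes the number of point-to-point links for a few task counts; the specific values of N are assumptions for illustration only.

```c
#include <stdio.h>

/* Links needed so every pair of the N tasks can communicate
   directly: N * (N - 1) / 2. */
static unsigned links_needed(unsigned n) {
    return n * (n - 1) / 2;
}

int main(void) {
    /* Hypothetical task counts, not from the slides. */
    unsigned counts[] = {2, 4, 8, 16};
    for (int i = 0; i < 4; i++) {
        printf("N = %2u tasks -> %u links\n",
               counts[i], links_needed(counts[i]));
    }
    return 0;
}
```

For example, 4 tasks need 6 links and 16 tasks need 120, which is why fully connected shared communication scales poorly as N grows.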

  6. Depending on the application, choose whether to exploit fine-, medium-, or coarse-grain parallelism.
