
Presentation Transcript


  1. HPC Middleware (HPC-MW): Infrastructure for Scientific Applications on HPC Environments. Overview and Recent Progress. Kengo Nakajima, RIST. 3rd ACES WG Meeting, June 5th & 6th, 2003, Brisbane, QLD, Australia.

  2. 3rd ACES WG mtg. June 2003. My Talk • HPC-MW (35 min.) • Hashimoto/Matsu’ura Code on ES (5 min.)

  3. 3rd ACES WG mtg. June 2003. Table of Contents • Background • Basic Strategy • Development • On-Going Works/Collaborations • XML-based System for User-Defined Data Structure • Future Works • Examples

  4. 3rd ACES WG mtg. June 2003. Frontier Simulation Software for Industrial Science (FSIS) http://www.fsis.iis.u-tokyo.ac.jp • Part of the "IT Project" in MEXT (Ministry of Education, Culture, Sports, Science & Technology) • HQ at the Institute of Industrial Science, Univ. of Tokyo • 5 years, 12M USD/yr. (decreasing ...) • 7 internal projects, > 100 people involved • Focused on • Industry/Public Use • Commercialization

  5. 3rd ACES WG mtg. June 2003. Frontier Simulation Software for Industrial Science (FSIS) (cont.) http://www.fsis.iis.u-tokyo.ac.jp • Quantum Chemistry • Quantum Molecular Interaction System • Nano-Scale Device Simulation • Fluid Dynamics Simulation • Structural Analysis • Problem Solving Environment (PSE) • High-Performance Computing Middleware (HPC-MW)

  6. 3rd ACES WG mtg. June 2003. Background • Various Types of HPC Platforms • Parallel Computers • PC Clusters • MPP with Distributed Memory • SMP Clusters (8-way, 16-way, 256-way): ASCI, Earth Simulator • Power, HP-RISC, Alpha/Itanium, Pentium, Vector PEs • GRID Environment -> Various Resources • Parallel/Single-PE Optimization is important!! • Portability under a GRID environment • Machine-dependent optimization/tuning • Everyone knows that ... but it's a big task, especially for application experts and scientists.

  8. 3rd ACES WG mtg. June 2003. Reordering for SMP Cluster with Vector PEs: ILU Factorization
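
The idea behind this reordering can be illustrated with a small sketch (not taken from the slides; construction of the CM-RCM coloring itself is omitted and all array names are hypothetical). Once the rows of the ILU(0) factor are grouped into colors such that rows sharing a color do not depend on each other, the forward substitution becomes a short sequential loop over colors with an independent, vectorizable/parallelizable loop over the rows inside each color:

/* Sketch: forward substitution of an ILU(0) factor after multicolor (e.g.
 * CM-RCM) reordering.  color_ptr[c]..color_ptr[c+1]-1 are the rows of color
 * c; row_ptr/col_idx/lval hold the strictly lower part of the factor in CRS
 * form, and every off-diagonal entry of a row points to a row of an earlier
 * color, so rows within one color can be processed in any order. */
void ilu_forward_colored(int n, int ncolor, const int *color_ptr,
                         const int *row_ptr, const int *col_idx,
                         const double *lval, const double *diag_inv,
                         const double *b, double *x)
{
    for (int i = 0; i < n; i++) x[i] = b[i];

    for (int c = 0; c < ncolor; c++) {                           /* sequential over colors */
        for (int i = color_ptr[c]; i < color_ptr[c + 1]; i++) {  /* vectorizable           */
            double s = x[i];
            for (int k = row_ptr[i]; k < row_ptr[i + 1]; k++)
                s -= lval[k] * x[col_idx[k]];                    /* earlier colors only    */
            x[i] = s * diag_inv[i];
        }
    }
}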

  9. 3rd ACES WG mtg. June 2003. 3D Elastic Simulation: Problem Size vs. GFLOPS on the Earth Simulator, single SMP node (8 PEs). Legend: ● PDJDS/CM-RCM, ■ PCRS/CM-RCM, ▲ natural ordering.

  10. 3rd ACES WG mtg. June 2003. 3D Elastic Simulation: Problem Size vs. GFLOPS on Intel Xeon 2.8 GHz, 8 PEs. Legend: ● PDJDS/CM-RCM, ■ PCRS/CM-RCM, ▲ natural ordering.

  11. 3rd ACES WG mtg. June 2003. Parallel Volume Rendering: MHD Simulation of the Outer Core

  12. 3rd ACES WG mtg. June 2003. Volume Rendering Module using Voxels. On PC Cluster: hierarchical background voxels, linked list. On Earth Simulator: globally fine voxels, static array.

  13. 3rd ACES WG mtg. June 2003. Background (cont.) • Simulation methods such as FEM, FDM, etc. share several typical computational processes.

  14. 3rd ACES WG mtg. June 2003. "Parallel" FEM Procedure. Pre-Processing: initial grid data, partitioning. Main: data input/output, matrix assembly, linear solvers, domain-specific algorithms/models. Post-Processing: post-processing, visualization.

  15. 3rd ACES WG mtg. June 2003. Background (cont.) [Diagram: applications sitting directly on compilers/MPI leave a big gap; HPC-MW inserts a middleware layer between applications and compilers/MPI.] • Simulation methods such as FEM, FDM, etc. share several typical computational processes. • How about "hiding" these processes from users with middleware between applications and compilers? • Development: efficient, reliable, portable, easy to maintain • accelerates advancement of the applications (= physics) • HPC-MW = middleware close to the "Application Layer"

  16. 3rd ACES WG mtg. June 2003. Example of HPC Middleware: Simulation Methods Include Some Typical Processes. [Diagram: FEM decomposed into I/O, matrix assembly, linear solver, and visualization.]

  17. 3rd ACES WG mtg. June 2003. Example of HPC Middleware: Individual Processes Can Be Optimized for Various Types of MPP Architectures. [Diagram: the FEM components (I/O, matrix assembly, linear solver, visualization) mapped onto MPP-A, MPP-B, and MPP-C.]

  18. 3rd ACES WG mtg. June 2003. Example of HPC Middleware: Library-Type HPC-MW for Existing HW. [Diagram: an FEM code developed on a PC.]

  19. 3rd ACES WG mtg. June 2003. Example of HPC Middleware: Library-Type HPC-MW for Existing HW. [Diagram: the FEM code developed on a PC above three optimized builds (HPC-MW for Xeon Cluster, for Earth Simulator, for Hitachi SR8000), each providing I/O, matrix assembly, linear solver, and visualization.]

  20. 3rd ACES WG mtg. June 2003. Example of HPC Middleware: Library-Type HPC-MW for Existing HW. [Diagram: the FEM code developed on a PC calls interfaces (I/F) for I/O, matrix assembly, solvers, and visualization, backed by the HPC-MW builds for Earth Simulator, Hitachi SR8000, and Xeon Cluster.]

  21. 3rd ACES WG mtg. June 2003. Example of HPC Middleware: Parallel FEM Code Optimized for ES. [Diagram: same structure, linked against the HPC-MW build for the Earth Simulator.]

  22. 3rd ACES WG mtg. June 2003. Example of HPC Middleware: Parallel FEM Code Optimized for Intel Xeon. [Diagram: same structure, linked against the HPC-MW build for the Xeon Cluster.]

  23. 3rd ACES WG mtg. June 2003. Example of HPC Middleware: Parallel FEM Code Optimized for SR8000. [Diagram: same structure, linked against the HPC-MW build for the Hitachi SR8000.]
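
As a rough illustration of the library-type idea (a minimal C sketch only; the mw_* names below are hypothetical stand-ins, not the actual HPC-MW interfaces), the user-level FEM driver is written once and stays identical across platforms; only the middleware library it is linked against changes:

#include <stdio.h>

/* Stub bodies so the sketch compiles stand-alone; an architecture-specific
 * middleware build would supply optimized implementations behind the same
 * calls (I/O, matrix assembly, solvers, visualization). */
static void mw_input_mesh(const char *f)  { printf("read %s\n", f); }
static void mw_assemble_matrix(void)      { printf("assemble matrix\n"); }
static void mw_solve(void)                { printf("solve linear system\n"); }
static void mw_visualize(const char *img) { printf("write %s\n", img); }

int main(void)
{
    mw_input_mesh("local_mesh");   /* distributed local mesh data        */
    mw_assemble_matrix();          /* element loops, boundary conditions */
    mw_solve();                    /* e.g. preconditioned Krylov solver  */
    mw_visualize("result.img");    /* patch/image file for visualization */
    return 0;
}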

  24. 3rd ACES WG mtg. June 2003. • Background • Basic Strategy • Development • On-Going Works/Collaborations • XML-based System for User-Defined Data Structure • Future Works • Examples

  25. 3rd ACES WG mtg. June 2003. HPC Middleware (HPC-MW)? • Based on the "plug-in" idea in GeoFEM.

  26. 3rd ACES WG mtg. June 2003. System Config. of GeoFEM http://geofem.tokyo.rist.or.jp/

  27. 3rd ACES WG mtg. June 2003. Local Data Structure: Node-based Partitioning (internal nodes, elements, external nodes).
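
A minimal C sketch of what such a node-based local data set might contain (field names are hypothetical, not the GeoFEM/HPC-MW definitions): each PE owns its internal nodes, keeps copies of the external (halo) nodes referenced by its local elements, and stores import/export tables describing the communication needed to refresh those copies.

typedef struct {
    int     n_internal;   /* nodes owned by this PE                          */
    int     n_external;   /* halo nodes owned by neighbouring PEs            */
    int     n_elem;       /* local elements (may reference external nodes)   */
    double *coords;       /* (n_internal + n_external) x 3 coordinates       */
    int    *elem_conn;    /* element connectivity in local node numbering    */
    int     n_neighbor;   /* number of neighbouring PEs                      */
    int    *neighbor_pe;  /* ranks of the neighbouring PEs                   */
    int    *import_ptr, *import_item;  /* local slots that receive halo data */
    int    *export_ptr, *export_item;  /* owned nodes sent to neighbours     */
} LocalMesh;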

  28. 3rd ACES WG mtg. June 2003. What can we do with HPC-MW? • We can easily develop optimized/parallel code on HPC-MW from user code developed on a PC. • Library-Type • most fundamental approach • optimized library for each individual architecture • Compiler-Type • Next-Generation Architectures • Irregular Data • Network-Type • GRID Environment (heterogeneous) • Large-Scale Computing (Virtual Petaflops), Coupling.

  29. 3rd ACES WG mtg. June 2003. HPC-MW Procedure. [Workflow diagram: the initial entire mesh data and control data go through utility software for parallel mesh generation & partitioning, producing distributed local mesh data; the user's FEM source code is linked with the library-type HPC-MW (built from HW data) into a parallel FEM executable, which reads the FEM control data and distributed local mesh data and writes distributed local results plus patch/image files for visualization; compiler-type and network-type HPC-MW generators take HPC-MW prototype source files, HW data, and network information (MPICH, MPICH-G) and generate optimized source code, e.g. linear solver source code.]

  30. 3rd ACES WG mtg. June 2003. Library-Type HPC-MW: Parallel FEM Code Optimized for ES. [Diagram: the FEM code developed on a PC calls interfaces for I/O, matrix assembly, solvers, and visualization, linked against the HPC-MW build for the Earth Simulator (builds for Hitachi SR8000 and Xeon Cluster shown as alternatives).]

  31. 3rd ACES WG mtg. June 2003. What can we do with HPC-MW? • We can easily develop optimized/parallel code on HPC-MW from user code developed on a PC. • Library-Type • most fundamental approach • optimized library for each individual architecture • Compiler-Type • Next-Generation Architectures • Irregular Data • Network-Type • GRID Environment • Large-Scale Computing (Virtual Petaflops), Coupling.

  32. 3rd ACES WG mtg. June 2003. Compiler-Type HPC-MW: optimized code is generated by a special language/compiler from analysis-model data (cache blocking etc.) and H/W information. [Diagram: the FEM components (I/O, matrix assembly, linear solver, visualization), data for the analysis model, and H/W parameters feed a special compiler that emits code for MPP-A, MPP-B, and MPP-C.]
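
As an illustration only (a generic dense, row-major kernel rather than the FEM kernels HPC-MW targets; BLOCK is a hypothetical value a generator would derive from the H/W parameters such as cache size), the kind of cache-blocked loop such a generator could emit looks like this:

#define BLOCK 256   /* would be chosen per target machine by the generator */

/* y += A*x for a row-major n x n matrix; blocking over columns keeps a
 * slice of x resident in cache while it is reused across all rows. */
void matvec_blocked(int n, const double *A, const double *x, double *y)
{
    for (int jb = 0; jb < n; jb += BLOCK) {
        int jend = (jb + BLOCK < n) ? jb + BLOCK : n;
        for (int i = 0; i < n; i++) {
            double s = 0.0;
            for (int j = jb; j < jend; j++)
                s += A[(long)i * n + j] * x[j];
            y[i] += s;   /* accumulate this column block's contribution */
        }
    }
}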

  33. 3rd ACES WG mtg. June 2003. What can we do with HPC-MW? • We can easily develop optimized/parallel code on HPC-MW from user code developed on a PC. • Library-Type • most fundamental approach • optimized library for each individual architecture • Compiler-Type • Next-Generation Architectures • Irregular Data • Network-Type • GRID Environment (heterogeneous) • Large-Scale Computing (Virtual Petaflops), Coupling.

  34. 3rd ACES WG mtg. June 2003. Network-Type HPC-MW: Heterogeneous Environment, a "Virtual" Supercomputer. [Diagram: one FEM analysis model space spread across multiple networked machines.]
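
A tiny sketch of how such a heterogeneous run might be organized at the MPI level (assumptions only: an MPICH/MPICH-G style MPI and a per-machine "site" id passed on the command line; this is not the network-type HPC-MW implementation). The global communicator spans the whole "virtual" supercomputer, while a split gives each site its own communicator for locally optimized kernels, leaving MPI_COMM_WORLD for coupling traffic between sites:

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int world_rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

    /* hypothetical site id, e.g. 0 = PC cluster, 1 = vector machine */
    int site = (argc > 1) ? atoi(argv[1]) : 0;

    MPI_Comm site_comm;
    MPI_Comm_split(MPI_COMM_WORLD, site, world_rank, &site_comm);

    int site_rank, site_size;
    MPI_Comm_rank(site_comm, &site_rank);
    MPI_Comm_size(site_comm, &site_size);
    printf("world rank %d -> site %d, local rank %d of %d\n",
           world_rank, site, site_rank, site_size);

    MPI_Comm_free(&site_comm);
    MPI_Finalize();
    return 0;
}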

  35. 3rd ACES WG mtg. June 2003. What is new, what is nice? • Application-Oriented (limited to FEM at this stage) • Various types of capabilities for parallel FEM are supported. • NOT just a library • Optimized for Individual Hardware • Single-PE Performance • Parallel Performance • Similar Projects • Cactus • GridLab (on Cactus)

  36. 3rd ACES WG mtg. June 2003. • Background • Basic Strategy • Development • On-Going Works/Collaborations • XML-based System for User-Defined Data Structure • Future Works • Examples

  37. 3rd ACES WG mtg. June 2003. Schedule (FY.2002-FY.2006). [Timeline: basic design and prototype; library-type HPC-MW (scalar, then vector); compiler-type and network-type HPC-MW; FEM codes on HPC-MW; public release.]

  38. 3rd ACES WG mtg. June 2003. System for Development. [Diagram: HW vendors supply HW info to the HPC-MW infrastructure (library-, compiler-, and network-type; components: I/O, visualization, solvers, coupler, AMR, DLB, matrix assembly); public releases go to public users, who return feedback and comments; FEM codes on HPC-MW (solid, fluid, thermal) supply application info and feedback.]

  39. 3rd ACES WG mtg. June 2003. FY. 2003 • Library-Type HPC-MW • Fortran90, C, PC-Cluster Version • Public Release • Sept. 2003: Prototype Released • March 2004: Full Version for PC Clusters • Demonstration of Network-Type HPC-MW at SC2003, Phoenix, AZ, Nov. 2003 • Evaluation by FEM Codes

  40. 3rd ACES WG mtg. June 2003. Library-Type HPC-MW (Mesh Generation is not considered) • Parallel I/O: I/F for commercial codes (NASTRAN etc.) • Adaptive Mesh Refinement (AMR) • Dynamic Load Balancing using pMETIS (DLB) • Parallel Visualization • Linear Solvers (GeoFEM + AMG, SAI) • FEM Operations (Connectivity, Matrix Assembly) • Coupling I/F • Utility for Mesh Partitioning • On-line Tutorial

  41. 3rd ACES WG mtg. June 2003. AMR+DLB

  42. 3rd ACES WG mtg. June 2003. Parallel Visualization. Scalar field: surface rendering, interval volume fitting, volume rendering, topological map. Vector field: streamlines, particle tracking, LIC, volume rendering. Tensor field: hyperstreamlines. Extensions of functions, dimensions, and data types.

  43. 3rd ACES WG mtg. June 2003. PMR (Parallel Mesh Relocator) • Data size is potentially very large in parallel computing. • Handling the entire mesh on one PE is impossible. • Parallel mesh generation and visualization are difficult due to the requirement for global info. • Adaptive Mesh Refinement (AMR) and grid hierarchy.

  44. 3rd ACES WG mtg. June 2003. Parallel Mesh Generation using AMR • Prepare Initial Mesh • with a size as large as a single PE can handle. [Diagram: initial mesh.]

  45. 3rd ACES WG mtg. June 2003. Parallel Mesh Generation using AMR • Partition the Initial Mesh into Local Data. • potentially very coarse. [Diagram: the initial mesh partitioned into local data on each PE.]

  46. 3rd ACES WG mtg. June 2003. Parallel Mesh Generation using AMR • Parallel Mesh Relocation (PMR) by Local Refinement on Each PE. [Diagram: initial mesh -> partition -> local data -> PMR.]

  47. 3rd ACES WG mtg. June 2003. Parallel Mesh Generation using AMR • Parallel Mesh Relocation (PMR) by Local Refinement on Each PE. [Diagram: each PE refines its local data.]

  48. 3rd ACES WG mtg. June 2003. Parallel Mesh Generation using AMR • Parallel Mesh Relocation (PMR) by Local Refinement on Each PE. [Diagram: each PE refines its local data again.]

  49. 3rd ACES WG mtg. June 2003. Parallel Mesh Generation using AMR • Hierarchical Refinement History • can be utilized for visualization & multigrid. [Diagram: the refinement hierarchy from the initial mesh through repeated local refinements.]

  50. 3rd ACES WG mtg. June 2003. Parallel Mesh Generation using AMR • Hierarchical Refinement History • can be utilized for visualization & multigrid • mapping results back from the local fine mesh to the initial coarse mesh. [Diagram: initial mesh.]
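
A minimal C sketch (hypothetical layout, not the PMR data structures) of how a per-level parent table supports this reverse mapping: each element of a refined level remembers which coarser element it was split from, so fine-level results can be averaged back onto the initial coarse mesh, and the same hierarchy can serve visualization or multigrid.

#include <stdlib.h>

typedef struct {
    int  level;    /* 0 = initial coarse mesh, increasing with refinement  */
    int  n_elem;   /* elements on this level (local to the PE)             */
    int *parent;   /* parent[e] = element on the previous level it refines */
} MeshLevel;

/* Average fine-level element values back onto their coarse parents. */
void restrict_to_coarse(const MeshLevel *fine, const double *val_fine,
                        int n_coarse, double *val_coarse)
{
    int *count = calloc(n_coarse, sizeof *count);
    if (!count) return;
    for (int i = 0; i < n_coarse; i++) val_coarse[i] = 0.0;
    for (int e = 0; e < fine->n_elem; e++) {
        val_coarse[fine->parent[e]] += val_fine[e];
        count[fine->parent[e]]++;
    }
    for (int i = 0; i < n_coarse; i++)
        if (count[i] > 0) val_coarse[i] /= count[i];
    free(count);
}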
