
Blue Waters: An Extraordinary Resource for Extraordinary Science


Presentation Transcript


  1. Blue Waters: An Extraordinary Resource for Extraordinary Science

    Thom Dunning, William Kramer, Marc Snir, William Gropp, Wen-mei Hwu, Cristina Beldica, Brett Bode, Robert Fiedler, Merle Giles, Scott Lathrop, Mike Showerman
    National Center for Supercomputing Applications, Department of Chemistry, Department of Computer Science, and Department of Electrical & Computer Engineering
  2. Sustained petascale computing will enable advances in a broad range of science and engineering disciplines: molecular science, weather & climate forecasting, astrophysics, health, life science, materials, astronomy, and earth science
  3. Background: NSF's Strategy for High-End Computing
    [Chart: science and engineering capability (logarithmic scale) vs. fiscal year, FY'07 through FY'11]
    Track 1 System: UIUC/NCSA (~1 PF sustained)
    Track 2 Systems: UT/ORNL (~1 PF peak), TACC (500+ TF peak), Track 2d (?)
    Track 3 Systems: leading university HPC centers (10-100 TF)
  4. Survey of Scientific Community
    D. Baker, University of Washington: protein structure refinement and determination
    M. Campanelli, RIT: computational relativity and gravitation
    D. Ceperley, UIUC: quantum Monte Carlo molecular dynamics
    J. P. Draayer, LSU: ab initio nuclear structure calculations
    P. Fussell, Boeing: aircraft design optimization
    C. C. Goodrich: space weather modeling
    M. Gordon, T. Windus, Iowa State University: electronic structure of molecules
    S. Gottlieb, Indiana University: lattice quantum chromodynamics
    V. Govindaraju: image processing and feature extraction
    M. L. Klein, University of Pennsylvania: biophysical and materials simulations
    J. B. Klemp et al., NCAR: weather forecasting/hurricane modeling
    R. Luettich, University of North Carolina: coastal circulation and storm surge modeling
    W. K. Liu, Northwestern University: multiscale materials simulations
    M. Maxey, Brown University: multiphase turbulent flow in channels
    S. McKee, University of Michigan: analysis of ATLAS data
    M. L. Norman, UCSD: simulations in astrophysics and cosmology
    J. P. Ostriker, Princeton University: virtual universe
    J. P. Schaefer, LSST Corporation: analysis of LSST datasets
    P. Spentzouris, Fermilab: design of new accelerators
    W. M. Tang, Princeton University: simulation of fine-scale plasma turbulence
    A. W. Thomas, D. Richards, Jefferson Lab: lattice QCD for hadronic and nuclear physics
    J. Tromp, Caltech/Princeton: global and regional seismic wave propagation
    P. R. Woodward, University of Minnesota: astrophysical fluid dynamics
  5. What These Scientists Told Us They Needed
    Maximum core performance … to minimize the number of cores needed for a given level of performance and to lessen the impact of sections of code with limited scalability
    Low latency, high bandwidth communications fabric … to maximize the scalability of science and engineering applications
    Large, low latency, high bandwidth memory subsystem … to enable the solution of memory-intensive problems
    Large capacity, high bandwidth I/O subsystem … to enable the solution of data-intensive problems
    Reliable operation … to enable long-running simulations
  6. Diverse Large Scale Computational Science
  7. Goals of Blue Waters Project
    Science and Engineering: provide the knowledge, expertise, and services to help researchers develop applications that take full advantage of Blue Waters
    Computing System Hardware and Software: sustain ≥1 petaflops on a range of science and engineering applications; enhance the petascale applications development environment and systems software
    Education: prepare the next generation of scientists and engineers for research at the frontiers of petascale computing and computation
    Industrial Engagement: enable industry to utilize petascale computing to address their most challenging problems and enhance their competitive position
  8. Focus on Sustained Performance
    Blue Waters and NSF are focusing on sustained performance in a way few have before. Sustained performance is the computer's performance on the broad range of applications that scientists and engineers use every day.
    Time to solution is the metric, not ops/s; tests include the time to read data and write the results.
    NSF's call emphasized sustained performance demonstrated on a collection of application benchmarks (application + problem set), not just simplistic metrics (e.g., High Performance Linpack).
    Applications include both petascale applications (which effectively use the full machine, solving scalability problems for both compute and I/O) and applications that use a fraction of the system. The metric is the time to solution.
    The Blue Waters project focus is on delivering sustained petaflops performance to all applications: develop the tools, techniques, and samples that exploit all parts of the system, and explore new tools, programming models, and libraries to help applications get the most from the system.
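    As a rough illustration of the time-to-solution metric described above, the sketch below computes sustained performance from end-to-end wall-clock time, including reading inputs and writing results. All numbers are hypothetical placeholders, not Blue Waters benchmark data.

```c
/* Minimal sketch of the "time to solution" metric: sustained performance is the
 * application's total operation count divided by end-to-end wall-clock time,
 * including I/O.  All numbers below are illustrative, not measured results. */
#include <stdio.h>

int main(void) {
    double total_ops  = 3.6e18;   /* operations performed by the full run (hypothetical) */
    double t_read     = 120.0;    /* seconds spent reading input data (hypothetical)     */
    double t_compute  = 3000.0;   /* seconds spent computing (hypothetical)              */
    double t_write    = 180.0;    /* seconds spent writing results (hypothetical)        */

    double time_to_solution = t_read + t_compute + t_write;        /* end-to-end wall clock */
    double sustained_pflops = total_ops / time_to_solution / 1e15; /* ops/s -> PFLOPS       */

    printf("Time to solution:      %.0f s\n", time_to_solution);
    printf("Sustained performance: %.2f PFLOPS\n", sustained_pflops);
    return 0;
}
```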
  9. Blue Waters Project Components
    Petascale Education, Industry and Outreach
    Petascale Applications (Computing Resource Allocations)
    Great Lakes Consortium for Petascale Computing
    Petascale Application Collaboration Team Support
    Outstanding User and Production Support: WAN connections, consulting, system management, security, operations, …
    Value-added software: collaborations
    Value-added hardware and software
    Blue Waters Base System: processors, memory, interconnect, on-line storage, system software, programming environment
    Petascale Computing Facility
  10. Blue Waters Petascale Computing System
    System Attribute                 Typical Cluster (NCSA Abe)   Track 2 (TACC)   Blue Waters*
    Vendor                           Dell                         Sun              IBM
    Processor                        Intel Xeon 5300              AMD              Power 7
    Peak Performance (PF)            0.090                        0.58             –
    Sustained Performance (PF)       ~0.005                       ~0.06            ~1.0
    Number of Cores                  9,600                        62,976           >200,000
    Amount of Memory (PB)            0.0144                       0.12             >0.8
    Amount of Disk Storage (PB)      0.1                          1.73             >10
    Amount of Archival Storage (PB)  5                            2.5              >500
    External Bandwidth (Gbps)        40                           10               100-400
    * Reference petascale computing system (no accelerators).
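    To put the memory figure in context, a quick back-of-the-envelope check using only the lower bounds quoted in the table (and treating 1 PB as 10^6 GB) gives roughly 4 GB of memory per core:

```c
/* Back-of-the-envelope arithmetic using the table's lower bounds:
 * >0.8 PB of memory spread over >200,000 cores is about 4 GB per core. */
#include <stdio.h>

int main(void) {
    double memory_pb = 0.8;        /* amount of memory, PB (quoted lower bound) */
    double cores     = 200000.0;   /* number of cores (quoted lower bound)      */

    double memory_gb = memory_pb * 1.0e6;   /* 1 PB = 1e6 GB (decimal units) */
    printf("Approx. memory per core: %.1f GB\n", memory_gb / cores);
    return 0;
}
```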
  11. From Chip to Entire Integrated System
    [Diagram: hierarchy from the Power7 chip, to the quad-chip MCM, to a rack/building block of multiple MCMs, to the full Blue Waters system with on-line and near-line storage, housed in the PCF; color indicates the relative amount of public information]
  12. Power7 Chip: Computational Heart of Blue Waters
    Base technology: 45 nm, 576 mm2, 1.2 B transistors
    Chip: 8 cores; 12 execution units/core; 1-, 2-, 4-way SMT per core
    Caches: 32 KB I- and D-cache and 256 KB L2 per core; 32 MB L3 (private/shared)
    Memory: dual DDR3 memory controllers; 100 GB/s sustained memory bandwidth
    [Images: Power7 chip and quad-chip MCM]
  13. Memory Solutions
  14. RAM Technologies
    DIMMs and dense, fast SRAM are industry standard; eDRAM is an IBM technology
    eDRAM was used in Power 4, 5, and 6 for off-chip L3 cache
    In Power7 it is used for on-chip cache (to avoid pin limitations and support the bandwidth of 8 cores)
  15. Cache Structure Innovation
    Combines the dense, low-power attributes of eDRAM with the speed and bandwidth advantages of SRAM, all on the same chip
    Provides low latency L1 and L2 dedicated per core (~3x lower latency than the L3 local region); keeps a 256 KB working set; reduces L3 power requirements and improves throughput
    Provides a large, shared L3 with ~3x lower latency than memory
    Automatically migrates per-core private working-set footprints (up to 4 MB) to a fast local region per core at ~5x lower latency than the full L3 cache
    Automatically clones shared data to multiple per-core private regions
    Enables a subset of cores to utilize the entire large shared L3 cache when the remaining cores are not using it
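    One practical implication of the ~4 MB fast local L3 region described above is that codes benefit from keeping a tile-sized working set per core. The sketch below is illustrative only (not IBM-provided code, and the sizes are hypothetical): it tiles a matrix transpose so the source and destination tiles together stay near 4 MB.

```c
/* Illustrative cache-blocking sketch: tile a matrix transpose so each pair of
 * tiles (2 x 512 x 512 doubles ~= 4 MB) fits in a per-core fast L3 region.
 * Sizes are hypothetical and not tuned for any particular machine. */
#include <stdio.h>
#include <stdlib.h>

#define N     4096
#define TILE  512          /* two 512x512 tiles of doubles ~= 4 MB working set */

int main(void) {
    double *a = malloc((size_t)N * N * sizeof *a);
    double *b = malloc((size_t)N * N * sizeof *b);
    if (!a || !b) return 1;
    for (size_t i = 0; i < (size_t)N * N; i++) a[i] = (double)i;

    /* b = transpose(a), tile by tile, so the strided writes to b reuse
     * cache lines while each tile is still resident. */
    for (size_t ii = 0; ii < N; ii += TILE)
        for (size_t jj = 0; jj < N; jj += TILE)
            for (size_t i = ii; i < ii + TILE; i++)
                for (size_t j = jj; j < jj + TILE; j++)
                    b[j * N + i] = a[i * N + j];

    printf("b[1] = %.1f (expect %.1f)\n", b[1], (double)N);
    free(a); free(b);
    return 0;
}
```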
  16. Illinois Petascale Computing Facility at a Glance
    Building: 88,000 GSF over two stories, 45' tall; 30,000+ GSF of raised floor with 20,000+ GSF unobstructed net for computers; 6' clearance under the raised floor
    Power: 24 MW initial power feeds plus backup (three 8 MW feeds + one 8 MW for backup); 13,800 volt power to the site; 480 volt distribution to the computers
    Cooling: 5,400 tons of cooling; full water-side economization for 50%+ of the year; automatic mixing of mechanical and ambient chilled water for optimal efficiency; adjacent to a (new) 6.5M gallon thermal storage tank
    Energy efficiency: PUE ~1.02 to <1.2 (projected); USGBC LEED Silver (Gold?) certification target (Native+5)
    Partners: EYP MCF/Gensler, IBM, Yahoo!
    www.ncsa.uiuc.edu/BlueWaters
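    For reference, the PUE figure quoted above is total facility power divided by the power delivered to the computing equipment. The sketch below works through that ratio with purely illustrative numbers, not measured PCF values.

```c
/* PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
 * All inputs are illustrative placeholders, not measured PCF values. */
#include <stdio.h>

int main(void) {
    double it_power_mw      = 20.0;  /* hypothetical power drawn by the computers */
    double cooling_power_mw = 1.5;   /* hypothetical cooling overhead             */
    double electrical_loss  = 0.5;   /* hypothetical distribution losses          */

    double total_mw = it_power_mw + cooling_power_mw + electrical_loss;
    printf("PUE = %.2f\n", total_mw / it_power_mw);  /* 22.0 / 20.0 = 1.10 */
    return 0;
}
```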
  17. Illinois Petascale Computing Facility
    [Photos: PCF and its cooling towers; the PCF is near university power and cooling infrastructure]
  18. IO model: global, parallel shared file system (>10 PB) and archival storage (GPFS/HPSS); MPI I/O
    Environment: traditional (command line) and Eclipse IDE (application development, debugging, performance tuning, job and workflow management)
    Languages: C/C++, Fortran (77-2008, including CAF), UPC
    Performance tuning: HPC and HPCS toolkits, open source tools
    Resource manager: batch and interactive access
    Parallel debugging at full scale
    Full-featured OS (AIX or Linux); sockets, threads, shared memory, checkpoint/restart
    Libraries: MASS, ESSL, PESSL, PETSc, visualization, …
    Programming models: MPI/MPI-2, OpenMP, PGAS, Charm++, Cactus
    Low-level communications API supporting active messages (LAPI)
    Hardware: multicore POWER7 processor with Simultaneous MultiThreading (SMT) and Vector MultiMedia Extensions (VSX); private L1 and L2 cache per core, shared L3 cache per chip; 128 GB RAM; high-performance, low-latency interconnect supporting RDMA
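    As a concrete illustration of the programming models listed above, here is a minimal hybrid MPI + OpenMP example. It uses only standard MPI and OpenMP calls; nothing in it is specific to Blue Waters.

```c
/* Minimal hybrid MPI + OpenMP sketch of the programming models listed above.
 * Only standard MPI and OpenMP features are used; nothing is system-specific. */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int provided, rank, nranks;

    /* Request threaded MPI so OpenMP threads can coexist with message passing. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    #pragma omp parallel
    {
        #pragma omp single
        printf("rank %d of %d is running %d OpenMP threads\n",
               rank, nranks, omp_get_num_threads());
    }

    MPI_Finalize();
    return 0;
}
```

    Built with an MPI wrapper compiler and an OpenMP flag (for example, mpicc -fopenmp) and launched with mpirun, each rank reports its thread count.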
  19. Illinois-IBM Collaborative Projects. I
    Computing Systems Software. Goal: enhance IBM's HPC software stack. Examples: Integrated System Management Console; Petascale Application Development Environment; computational libraries; programming models
    Science and Engineering Applications. Goal: prepare applications to fully utilize Blue Waters' capabilities. Process:
    Before hardware: extensive use of processor and interconnect simulators and Track 2 systems to optimize processor and communications performance
    Modeling: two modeling teams, LANL's PAL (Hoisie) and SDSC's PMaC (Snavely), are funded to fully engage with application teams
    After hardware: further optimization for the Power7 processor, node, …, full-scale system
  20. Illinois-IBM Collaborative Projects. II
    Computing Systems Software and Hardware. Goal: enhance the performance of the base Blue Waters system. Examples: innovative data management using new file-system features; evaluation of accelerators
    Petascale Computing Facility. Goal: advance "green" computing by optimizing PUE (Power Usage Effectiveness). Elements: focus on direct-liquid cooling; on-site cooling towers with ambient water; automated control to use the optimal cooling mix; efficient electrical distribution system; PUE < 1.2
  21. Illinois-IBM Collaborative Projects. III
    Ease of Use. Goal: improve the productivity of application teams using Blue Waters. Examples: Common Communication Infrastructure; debugging and code management (Eclipse framework integration); tuning (RENCI performance tools, others); workflow management and resource scheduling; visualization
    Improving the System Effectiveness. Examples: interconnect routing; cyber-security; others possible (resiliency)
  22. Blue Waters Project: Petascale Computing Resource Allocations
    Biological Sciences
    Computational Microscope (Klaus Schulten, Laxmikant Kale, University of Illinois at Urbana-Champaign)
    Petascale Simulations of Complex Biological Behavior in Fluctuating Environments (Ilias Tagkopoulos, University of California, Davis)
    Engineering
    Petascale Computations for Complex Turbulent Flows (Pui-Kuen Yeung, James Riley, Robert Moser, Amitava Majumdar, Georgia Institute of Technology)
  23. Blue Waters Project: Petascale Computing Resource Allocations
    Geosciences
    Petascale Research in Earthquake System Science on Blue Waters (Thomas Jordan, Jacobo Bielak, University of Southern California)
    Enabling Large-Scale, High-Resolution, and Real-Time Earthquake Simulations on Petascale Parallel Computers (L. Wang and P. Chen, University of Wyoming)
    Testing Hypotheses about Climate Prediction at Unprecedented Resolutions on the Blue Waters System (David Randall, Ross Heikes, Colorado State University; William Large, Richard Loft, John Dennis, Mariana Vertenstein, National Center for Atmospheric Research; Cristiana Stan, James Kinter, Institute for Global Environment and Society; Benjamin Kirtman, University of Miami)
    Understanding Tornados and Their Parent Supercells Through Ultra-High Resolution Simulation/Analysis (Robert Wilhelmson, Brian Jewett, Matthew Gilmore, University of Illinois at Urbana-Champaign)
  24. Blue Waters Project: Petascale Computing Resource Allocations
    Mathematics & Physical Sciences: Astronomical Sciences
    Computational Relativity and Gravitation at Petascale: Simulating and Visualizing Astrophysically Realistic Compact Binaries (Manuela Campanelli, Carlos Lousto, Hans-Peter Bischof, Joshua Faber, Yosef Zlochower, Rochester Institute of Technology)
    Enabling Science at the Petascale: From Binary Systems and Stellar Core Collapse to Gamma-Ray Bursts (Eric Schnetter, Gabrielle Allen, Mayank Tyagi, Peter Diener, Christian Ott, Louisiana State University)
    Formation of the First Galaxies: Predictions for the Next Generation of Observatories (Brian O'Shea, Michigan State University; Michael Norman, University of California at San Diego)
  25. Blue Waters Project: Petascale Computing Resource Allocations
    Mathematics & Physical Sciences: Astronomical Sciences
    Peta-Cosmology: Galaxy Formation and Virtual Astronomy (Kentaro Nagamine, University of Nevada at Las Vegas; Jeremiah Ostriker, Princeton University; Renyue Cen, Greg Bryan)
    Petascale Simulation of Turbulent Stellar Hydrodynamics (Paul Woodward, Pen-Chung Yew, University of Minnesota, Twin Cities)
    Mathematics & Physical Sciences: Chemistry
    Computational Chemistry at the Petascale (Monica Lamm, Mark Gordon, Theresa Windus, Masha Sosonkina, Brett Bode, Iowa State University)
    Super Instruction Architecture for Petascale Computing (Rodney Bartlett, Erik Deumens, Beverly Sanders, University of Florida; Ponnuswamy Sadayappan, Ohio State University)
  26. Blue Waters Project: Petascale Computing Resource Allocations
    Mathematics & Physical Sciences: Materials Research
    Breakthrough Petascale Quantum Monte Carlo Calculations (Shiwei Zhang, College of William and Mary)
    Electronic Properties of Strongly Correlated Systems Using Petascale Computing (Sergey Savrasov, University of California, Davis; Kristjan Haule, Gabriel Kotliar, Rutgers University)
    Mathematics & Physical Sciences: Physics
    Lattice QCD on Blue Waters (Robert Sugar, University of California at Santa Barbara)
  27. Blue Waters Project: Petascale Computing Resource Allocations
    Social, Behavioral and Economic Sciences
    Simulation of Contagion on Very Large Social Networks with Blue Waters (Keith Bisset, Xizhou Feng, Virginia Polytechnic Institute and State University)
  28. Allocations for Blue Waters
    Petascale Computing Resource Allocations (PRAC): anyone can apply! The vast majority (80%) of the resource is allocated in this manner. Awards are selected by NSF based on the need for a sustained petascale platform to carry out ground-breaking research and on being likely to be ready to use Blue Waters effectively in 2011. PRAC awardees receive travel funds and "provisional time". Applications will be accepted on a continuing basis in the future, and Blue Waters application and consulting staff will support awardees in preparing codes.
    Industry: allocation process under development; cost reimbursable; up to 7% of the resources; contact the BW PSP program for details
    Education: allocation process under development by the GLCPC Allocation Committee
    Great Lakes Consortium for Petascale Computing (GLCPC) members: 50,000,000 hours from the Director's reserve; allocation process under development
    Director's Reserve: for high-risk start-up projects; allocation process under development
  29. Great Lakes Consortium for Petascale Computation
    Goal: facilitate the widespread and effective use of petascale computing to address frontier research questions in science, technology and engineering at research, educational and industrial organizations across the region and nation.
    Charter members: Argonne National Laboratory, Fermi National Accelerator Laboratory, Illinois Mathematics and Science Academy, Illinois Wesleyan University, Indiana University*, Iowa State University, Krell Institute, Inc., Los Alamos National Laboratory, Louisiana State University, Michigan State University*, Northwestern University*, Parkland Community College, Pennsylvania State University*, Purdue University*, The Ohio State University*, Shiloh Community Unit School District #1, Shodor Education Foundation, Inc., SURA (60-plus universities), University of Chicago*, University of Illinois at Chicago*, University of Illinois at Urbana-Champaign*, University of Iowa*, University of Michigan*, University of Minnesota*, University of North Carolina–Chapel Hill, University of Wisconsin–Madison*, Wayne City High School
    * CIC universities
  30. Petascale Education, Industry and Outreach
    Education Program
    Undergraduate Petascale Education Program: professional development workshops for faculty offered throughout the year; support for undergraduate faculty to develop course materials; support for undergraduate student year-long internships with petascale research projects. Apply at www.computationalscience.org/upep
    Virtual School for Science and Engineering: Summer Schools proposed for summer 2010 on Many Core Processors (covering GPGPU and CUDA), Scaling to Petascale, and Large Data Handling. Content from the 2008 and 2009 Summer Schools is freely available on-line. Looking for sites that want to host participants linked together by HD video-conferencing. http://www.ncsa.illinois.edu/BlueWaters/eot.html
    Industrial Partner Program: facility; industrial use of petascale resources; cost recovery program
    Outreach Programs: to other communities. Examples: GLCPC; proposal in process with the University of Chicago for PRAC proposal consultation
  31. Undergraduate Petascale Education Program (UPEP)
    Led by Bob Panoff, Shodor, with emphasis on engaging faculty from under-represented communities and institutions (MSI, EPSCoR, 2- and 4-year)
    Three areas of emphasis:
    Professional development via campus visits and ~10 workshops per year for undergraduate faculty to incorporate computational thinking and petascale resources in the undergraduate classroom
    Support for undergraduate faculty development, over 3 years, of 30 modules in contemporary science, from desktop to grid to petascale, incorporating quantitative reasoning, computational thinking, and multi-scale modeling
    Support for immersion of 15 undergraduate students per year in year-long petascale research projects
    On-going invitation for faculty and students to apply: www.computationalscience.org/upep
  32. Graduate Education
    The Virtual School of Computational Science and Engineering is the graduate education component of Blue Waters. The Virtual School brings together faculty and staff from research universities around the nation to fill the knowledge gap in CSE. A primary activity of the Virtual School is to organize annual Summer Schools for graduate students.
  33. Summer School 2009
    Two week-long, multi-site workshops were offered in the summer of 2009:
    Scaling to Petascale, August 3–7, 2009, http://www.vscse.org/summerschool/2009/scaling/
    Many-Core Processors, August 10–14, 2009, http://www.vscse.org/summerschool/2009/manycore/
    A total of 232 participants attended these workshops.
  34. Summer School 2010
    Proposed workshops for 2010: Scaling to Petascale; Many-Core Processing; Big Data for Science (new this year)
    http://www.vscse.org/summerschool/2010/workshops.html
    Call for Participation in 2010: http://www.vscse.org/
    We look forward to hearing from you!
  35. How Can You Get Involved?
    Everyone: education program (summer schools, courses, faculty workshops, …)
    Software developers: contact us if you are developing tools for HPC
    Application developers: modest time allocations through GLCPC; large awards via the NSF PRAC program. Contact the Blue Waters team if you are planning a submission! We can offer tips for preparing your proposal.
  36. Acknowledgements This research is part of the Blue Waters sustained-petascale computing project, which is supported by the National Science Foundation (award number OCI 07-25070) and the state of Illinois. Blue Waters is a joint effort of the University of Illinois at Urbana-Champaign, its National Center for Supercomputing Applications, IBM, and the Great Lakes Consortium for Petascale Computation. The work described is only achievable through the efforts of the Blue Waters Project.
  37. Questions? Dr. Brett Bode, Blue Waters Software Development Manager, NCSA/University of Illinois, bbode@ncsa.uiuc.edu, http://www.ncsa.uiuc.edu/BlueWaters, (217) 244-5187