
Blue Waters An Extraordinary Computing Resource for Advancing Science and Engineering


Presentation Transcript


  1. Blue Waters: An Extraordinary Computing Resource for Advancing Science and Engineering
Thom Dunning, Rob Pennington,* Bill Kramer, Marc Snir, Bill Gropp, Wen-mei Hwu and Ed Seidel*
* Currently at NSF
CI Days • 22 February 2010 • University of Kentucky

  2. Background: March to Petascale Computing
[Figure: Rmax of the #1 Top 500 system versus year, 1992–2012, climbing from roughly 10^9 to 10^16 flop/s]
Top 500 #1 system milestones: 1 GF in the late 1980s, 1 TF in 1997, 1 PF in 2008.
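To put that pace in numbers (an arithmetic aside, not on the slide): going from 1 TF in 1997 to 1 PF in 2008 is a factor of 1,000 in 11 years,

$$ 1000^{1/11} \approx 1.9, $$

i.e., the fastest machine in the world roughly doubled in performance every year.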

  3. Background: NSF's Strategy for High-End Computing
• Three Resource Levels
  • Track 3: University owned and operated
  • Track 2: Several NSF-funded supercomputer & specialized computing centers (TeraGrid)
  • Track 1: NSF-funded leading-edge computer center
• Computing Resources
  • Track 3: 10s–100s TF
  • Track 2: 500–1,000 TF, ~100+ TB of memory
  • Track 1: see following slides

  4. Blue Waters Project: Fielding a Sustained Petascale Computing System

  5. Blue Waters Project: Technology Selection: Who We Consulted
• D. Baker, University of Washington: protein structure refinement and determination
• M. Campanelli, RIT: computational relativity and gravitation
• D. Ceperley, UIUC: quantum Monte Carlo molecular dynamics
• J. P. Draayer, LSU: ab initio nuclear structure calculations
• P. Fussell, Boeing: aircraft design optimization
• C. C. Goodrich: space weather modeling
• M. Gordon, T. Windus, Iowa State University: electronic structure of molecules
• S. Gottlieb, Indiana University: lattice quantum chromodynamics
• V. Govindaraju: image processing and feature extraction
• M. L. Klein, University of Pennsylvania: biophysical and materials simulations
• J. B. Klemp et al., NCAR: weather forecasting/hurricane modeling
• R. Luettich, University of North Carolina: coastal circulation and storm surge modeling
• W. K. Liu, Northwestern University: multiscale materials simulations
• M. Maxey, Brown University: multiphase turbulent flow in channels
• S. McKee, University of Michigan: analysis of ATLAS data
• M. L. Norman, UCSD: simulations in astrophysics and cosmology
• J. P. Ostriker, Princeton University: virtual universe
• J. P. Schaefer, LSST Corporation: analysis of LSST datasets
• P. Spentzouris, Fermilab: design of new accelerators
• W. M. Tang, Princeton University: simulation of fine-scale plasma turbulence
• A. W. Thomas, D. Richards, Jefferson Lab: lattice QCD for hadronic and nuclear physics
• J. Tromp, Caltech/Princeton: global and regional seismic wave propagation
• P. R. Woodward, University of Minnesota: astrophysical fluid dynamics

  6. Blue Waters Project: Attributes of a Sustained Petascale System
• Maximum Core Performance … to minimize the number of cores needed for a given performance level and lessen the impact of sections of code with limited scalability
• Low Latency, High Bandwidth Interconnect … to enable science and engineering applications to scale to tens to hundreds of thousands of cores (a simple cost model follows this slide)
• Large, Fast Memories … to solve the most memory-intensive problems
• Large, Fast I/O System and Data Archive … to solve the most data-intensive problems
• Reliable Operation … to enable the solution of Grand Challenge problems
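One way to see why latency matters alongside bandwidth is the standard alpha-beta cost model (my illustration, not from the slides): the time to deliver an n-byte message is roughly

$$ T(n) = \alpha + n/\beta, $$

where $\alpha$ is the per-message latency and $\beta$ is the bandwidth. As an application scales to hundreds of thousands of cores, messages become smaller and more frequent, so the $\alpha$ term dominates unless the interconnect's latency is low.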

  7. Blue Waters Project: Track 1 System: Blue Waters

| System Attribute                | Track 2 (TACC) | Track 1 (NCSA) | Track 1 : Track 2 |
|---------------------------------|----------------|----------------|-------------------|
| Vendor                          | Sun            | IBM            |                   |
| Processor                       | AMD Barcelona  | IBM Power7     |                   |
| Peak Performance (PF)           | 0.579          |                |                   |
| Sustained Performance (PF)      | ~0.05          | ~1             | >20               |
| Number of Cores/Chip            | 4              | 8              | 2                 |
| Number of Processor Cores       | 62,976         | >300,000       | >3                |
| Amount of Memory (TB)           | 123            | >1,000         | >6                |
| Amount of Disk Storage (PB)     | 1.73           | >10            | >5                |
| Amount of Archival Storage (PB) | 2.5 (20)       | >500           | >200              |
| External Bandwidth (Gbps)       | 10             | 100-400        | >10               |
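Reading the performance rows together (arithmetic only): the Track 2 system sustains roughly

$$ \frac{\sim 0.05\ \mathrm{PF}}{0.579\ \mathrm{PF}} \approx 9\% $$

of its peak on real applications, which is why the Track 1 requirement is stated as ~1 PF sustained rather than as a peak figure.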

  8. Blue Waters Project: Building Blue Waters
Blue Waters will be the most powerful computer in the world for scientific research when it comes online in the summer of 2011. It is built from components that can also be used to build systems with a wide range of capabilities, from deskside to beyond Blue Waters:
• Power7 Chip: 8 cores, 32 threads; L1, L2, L3 cache (32 MB); up to 256 GF (peak); 45 nm technology
• Multi-chip Module (MCM): 4 Power7 chips; 128 GB memory; 512 GB/s memory bandwidth; 1 TF (peak)
• IH Server Node: 8 MCMs (256 cores); 1 TB memory; 8 TF (peak); fully water cooled
• Router: 1,128 GB/s bandwidth
• Blue Waters Building Block: 32 IH server nodes; 32 TB memory; 256 TF (peak); 4 storage systems; 10 tape drive connections
• Blue Waters (full system): ~1 PF sustained; >300,000 cores; >1 PB of memory; >10 PB of disk storage; ~500 PB of archival storage; >100 Gbps connectivity
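As a quick consistency check on those peak figures (the arithmetic is mine, not on the slide), the per-level numbers multiply up the hierarchy:

$$ 4 \times 256\ \mathrm{GF} \approx 1\ \mathrm{TF}\ \text{(MCM)}, \qquad 8 \times 1\ \mathrm{TF} = 8\ \mathrm{TF}\ \text{(node)}, \qquad 32 \times 8\ \mathrm{TF} = 256\ \mathrm{TF}\ \text{(building block)}. $$

Note also that ~1 PF sustained spread over >300,000 cores averages to only ~3 GF per core, an order of magnitude below the 32 GF per-core peak, consistent with the project's emphasis on sustained rather than peak performance.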

  9. Blue Waters Project: Selected Unique Features of Blue Waters
• Shared/Distributed Memory Computing System
  • Powerful shared-memory multi-chip module (QCM)
  • New high-performance fabric interconnecting all nodes
  • Hardware support for global shared memory (a sketch of this programming style follows this slide)
• I/O and Data Archive Systems
  • High-performance I/O subsystems
  • On-line disks fully integrated with the archival storage system
• Natural Growth Path
  • Full range of systems, from servers to supercomputers
  • Facilitates software development
  • Addresses science and engineering problems at all levels
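To make "global shared memory" concrete, here is a minimal sketch using standard MPI-2 one-sided (RMA) calls; it is my illustration, not code from the presentation, and Blue Waters also supported models such as Global Arrays and OpenSHMEM for this style of programming. One rank writes directly into another rank's exposed memory window without the target issuing a receive, which is exactly the access pattern that hardware global-address-space support accelerates:

```c
/* one_sided.c: minimal MPI-2 one-sided (RMA) sketch.
 * Compile: mpicc one_sided.c -o one_sided; run with at least 2 ranks. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Each rank exposes one double in a window: in effect a tiny
     * "global" array with one element per rank. */
    double local = -1.0;
    MPI_Win win;
    MPI_Win_create(&local, sizeof(double), sizeof(double),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);
    if (rank == 0) {
        /* Rank 0 writes into rank 1's memory; rank 1 posts no receive. */
        double value = 3.14;
        MPI_Put(&value, 1, MPI_DOUBLE, 1 /* target rank */,
                0 /* displacement */, 1, MPI_DOUBLE, win);
    }
    MPI_Win_fence(0, win);

    if (rank == 1)
        printf("rank 1 saw %.2f, written remotely by rank 0\n", local);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```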

  10. Blue Waters Project: National Petascale Computing Facility
Partners: EYP MCF/Gensler, IBM, Yahoo!
• Energy Efficiency
  • LEED certified Gold (goal: Platinum)
  • PUE < 1.2 (< 1.1 for much of the year)
• Modern Data Center
  • 90,000+ ft² total
  • 30,000 ft² raised floor
  • 20,000 ft² machine room
www.ncsa.uiuc.edu/BlueWaters
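For reference, PUE (power usage effectiveness) is a standard data-center metric not defined on the slide:

$$ \mathrm{PUE} = \frac{P_{\text{total facility}}}{P_{\text{IT equipment}}}, $$

so PUE < 1.2 means that cooling, power distribution, and other overheads add less than 20% on top of the power consumed by the computing equipment itself.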

  11. Blue Waters Project: Great Lakes Consortium for Petascale Computation
Goal: Facilitate the widespread and effective use of petascale computing to address frontier research questions in science, technology and engineering at research, educational and industrial organizations across the region and nation.
Charter Members:
• Argonne National Laboratory
• Fermi National Accelerator Laboratory
• Illinois Mathematics and Science Academy
• Illinois Wesleyan University
• Indiana University*
• Iowa State University
• Krell Institute, Inc.
• Louisiana State University
• Michigan State University*
• Northwestern University*
• Parkland Community College
• Pennsylvania State University*
• Purdue University*
• The Ohio State University*
• Shiloh Community Unit School District #1
• Shodor Education Foundation, Inc.
• SURA (60+ universities)
• University of Chicago*
• University of Illinois at Chicago*
• University of Illinois at Urbana-Champaign*
• University of Iowa*
• University of Michigan*
• University of Minnesota*
• University of North Carolina–Chapel Hill
• University of Wisconsin–Madison*
• Wayne City High School
* CIC universities

  12. Science & Engineering Research on Blue Waters

  13. Blue Waters Project: Computational Science and Engineering
Petascale computing will enable advances in a broad range of science and engineering disciplines:
• Molecular Science
• Weather & Climate Forecasting
• Health
• Astronomy
• Earth Science

  14. Petascale Computing Resource Allocations
• Solicitation: NSF 08-529 (PRAC)
• Selection Criteria
  • Compelling science or engineering research question
  • Question that can only be answered using a system of the scale of Blue Waters (cycles, memory, I/O bandwidth, etc.)
  • Evidence, or a convincing argument, that the application code can make effective use of Blue Waters
  • Source (or sources) of funds to support the research work and any needed code development effort
• Funding
  • Allocation or provisional allocation of time on Blue Waters
  • Travel funds to enable teams to work closely with the Blue Waters project team
• Next Due Date
  • March 17, 2010 (annually thereafter)

  15. Blue Waters Engagement with Research Teams
• Provide Details on Blue Waters System
• Provide Assistance with Blue Waters Software
  • Numerical libraries
  • MPI, OpenMP, ARMCI/Global Arrays, LAPI, OpenSHMEM, etc. (a hybrid-programming sketch follows this list)
  • Compilers (Fortran, C, UPC, Co-array Fortran)
• Provide Assistance with Blue Waters Hardware
  • Chip and network simulators
  • Staged access to Power7 hardware
• Provide Training
  • On-line documentation
  • Webinars/tutorials/on-line courses (~8 per year)
  • Workshops (~2 per year)
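As a concrete illustration of two of the programming models listed above, here is a minimal hybrid MPI + OpenMP sketch in C (my example, not from the presentation): MPI distributes work across nodes while OpenMP threads share memory within a node, a natural fit for a machine built from large shared-memory multi-chip modules.

```c
/* hybrid.c: minimal hybrid MPI + OpenMP sketch.
 * Compile: mpicc -fopenmp hybrid.c -o hybrid */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    /* Request thread support so OpenMP threads can coexist with MPI. */
    int provided;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each rank sums its share of 1..N with its own threads... */
    const long N = 1000000;
    long chunk = N / size, lo = rank * chunk + 1;
    long hi = (rank == size - 1) ? N : lo + chunk - 1;

    long local = 0;
    #pragma omp parallel for reduction(+:local)
    for (long i = lo; i <= hi; i++)
        local += i;

    /* ...then MPI combines the per-rank partial sums. */
    long total = 0;
    MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum(1..%ld) = %ld (ranks=%d, threads/rank=%d)\n",
               N, total, size, omp_get_max_threads());

    MPI_Finalize();
    return 0;
}
```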

  16. For More Information
• Blue Waters website: http://www.ncsa.uiuc.edu/BlueWaters
• IBM Power Systems website: http://www-03.ibm.com/systems/power/
• PRAC solicitation: http://www.nsf.gov/pubs/2008/nsf08529/nsf08529.htm

  17. Questions?
