
ADVANCED SCIENTIFIC COMPUTING RESEARCH An Overview


Presentation Transcript


  1. ADVANCED SCIENTIFIC COMPUTING RESEARCH: An Overview Michael Strayer, Associate Director, Office of Science Michael.Strayer@science.doe.gov CASC, May 3, 2007

  2. Department of Energy Organizational Structure CASC, May 3, 2007

  3. Advanced Scientific Computing Research ASCR Mission: Steward of DOE's computational science, applied mathematics, computer science, and high-performance computing and networking research for open science; deploy and operate high performance computing user facilities at LBNL, ANL, and ORNL. ASCR Vision: Best in class at advancing science and technological innovation through modeling and simulation. HPC & network facilities & testbeds: NERSC, Oak Ridge LCF, Argonne LCF, ESnet. http://www.science.doe.gov/ascr CASC, May 3, 2007

  4. ASCR High Performance Computing Resources High Performance Production Computing Facility (NERSC): delivers high-end capacity computing to the entire DOE SC research community; large number of projects (200-300); medium- to very-large-scale projects that occasionally need very high capability; annual allocations. Leadership Computing Facilities: deliver the highest computational capability to national and international researchers through the peer-reviewed Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program; small number of projects (10-20); multiple-year allocations. CASC, May 3, 2007

  5. Current Facilities NERSC: 10 teraflop IBM RS/6000 SP (Seaborg) with 6,080 375 MHz POWER3 processors and 7.2 terabytes of aggregate memory; 6.7 teraflop IBM POWER5 (Bassi) with 888 processors and 3.5 terabytes of aggregate memory; 3.1 teraflop Linux Networx Opteron cluster (Jacquard) with 712 processors and 2.1 terabytes of aggregate memory. LCF at Oak Ridge: 119 teraflop Cray XT3/XT4 (Jaguar) with 11,708 dual-core AMD Opteron processor nodes and 46 terabytes of aggregate memory; 18.5 teraflop Cray X1E (Phoenix) with 1,024 multi-streaming vector processors. Argonne LCF: 5.7 teraflop IBM Blue Gene/L (BGL) with 2,048 PPC processors. (A rough cross-check of these peak figures appears in the sketch below.) CASC, May 3, 2007
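
The peak-performance figures above can be sanity-checked with simple arithmetic. A minimal sketch in Python; the 5.2 GF/s per-core rate (a 2.6 GHz dual-core Opteron issuing two floating-point operations per cycle) is an assumption about the Jaguar hardware, not a number from the slide:

```python
# Back-of-the-envelope check of a quoted peak-performance figure.

def peak_teraflops(cores: int, gflops_per_core: float) -> float:
    """Theoretical peak in teraflops: cores x per-core GF/s / 1000."""
    return cores * gflops_per_core / 1000.0

# Jaguar (Cray XT3/XT4): 11,708 dual-core nodes -> 23,416 Opteron cores.
# Assumed ~5.2 GF/s per core (2.6 GHz x 2 floating-point ops per cycle).
jaguar_tf = peak_teraflops(11_708 * 2, 5.2)
print(f"Jaguar estimated peak: {jaguar_tf:.0f} TF (slide quotes 119 TF)")
```

The estimate comes out near 122 TF against the quoted 119 TF, a plausible gap given that the machine mixed XT3 and XT4 cabinets with slightly different parts.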

  6. Future Facility Upgrades • ALCF • 100 teraflop IBM Blue Gene/P delivered by end of 2007 • 250-500 teraflop upgrade to IBM Blue Gene/P in late 2008 • LCF – Oak Ridge • Cray XT4 upgraded to 250 TF by end of 2007 • 1 Petaflop Cray Baker system to be delivered by end of 2008 • NERSC • 100+ teraflop Cray XT4 in operation by October 2007 CASC, May 3, 2007

  7. Scientific Discovery through Advanced Computing (SciDAC) • Innovative and Novel Computational Impact on Theory and Experiment (INCITE) • Ensuring Hardware Productivity CASC, May 3, 2007

  8. Scientific Discovery through Advanced Computing (SciDAC) Create a comprehensive scientific computing software infrastructure to enable scientific discovery in the physical, biological, and environmental sciences at the petascale. Develop a new generation of data management and knowledge discovery tools for large data sets (obtained from scientific user facilities and simulations). http://www.scidac.gov CASC, May 3, 2007

  9. 2001-2006 SciDAC Accomplishments • SciDAC teams • created the first laboratory-scale flame simulation in three dimensions to better understand combustion, which provides 80% of the energy used in the U.S. • simulated techniques for re-fueling fusion reactors • developed new methods for simulating improvements in future particle accelerators • SciDAC partnerships improved the effectiveness of scientific application codes by 275% to over 10,000% (see the sketch below for what such percentages mean as speedups) • Example: decreased time to solution for the Agile-Boltzmann baseline run from 4 weeks to 4 days (8 angles, 12 energy groups, with a spatial resolution of 100) • The SciDAC data mining tool, Sapphire, won a prestigious 2006 R&D 100 Award [Figure: hydroxyl radical in a turbulent jet flame] SciDAC Review and Scientific Discovery document numerous SciDAC accomplishments CASC, May 3, 2007
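
The percentage improvements above read most naturally as percent speedups. A minimal sketch of that conversion, using the 4-weeks-to-4-days example from the slide; interpreting "improvement" as speedup minus one, expressed as a percentage, is an assumption:

```python
# Convert a before/after wall-clock pair into a percent improvement.

def percent_improvement(t_before: float, t_after: float) -> float:
    """Speedup minus one, as a percentage: (t_before / t_after - 1) * 100."""
    return (t_before / t_after - 1.0) * 100.0

# 4 weeks (28 days) down to 4 days: a 7x speedup, i.e. 600% improvement.
print(f"Agile-Boltzmann baseline: {percent_improvement(28, 4):.0f}% improvement")
```

On this reading, 275% corresponds to roughly 3.75x faster and 10,000% to roughly 101x faster.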

  10. SciDAC-2: Computational Collaborations to Drive Scientific Discovery • Statistics • 31 SciDAC projects • 9 Centers • 4 Institutes • 18 efforts in 11 application areas • Astrophysics, climate, biology, fusion, petabyte data, materials & chemistry, nuclear physics, high energy physics, QCD, turbulence, groundwater • New performers? About 60% of the funds! Total: $61.5M CASC, May 3, 2007

  11. Institutes and Centers: Attributes. Institutes (university-led centers of excellence): focus on major software issues; employ a range of collaborative research interactions; reach out to engage a broader community of scientists in scientific discovery through advanced computation and collaboration; conduct training and outreach in high performance computing topics. Centers for Enabling Technology (work directly with applications): develop technology that enables scientific simulation codes to take full advantage of tera- to petascale resources; ensure critical computer science and applied mathematics issues are addressed in a timely and comprehensive fashion; address issues associated with the research software lifecycle. CASC, May 3, 2007

  12. SciDAC Outreach Center: "Build Collaborations to Drive Scientific Discovery" Innovative web and software services: tools that make SciDAC researchers more effective at delivering their technologies (web hosting and authenticated wiki-like portals); services that provide an easy interface between SciDAC and the outside computational world (a central web, email, and phone point of contact for SciDAC inquiries). Workshops and training sessions: getting the right people together to forge collaborations. http://outreach.scidac.gov help@outreach.scidac.gov CASC, May 3, 2007

  13. Conference CASC, May 3, 2007

  14. Innovative and Novel Computational Impact on Theory and Experiment (INCITE) • Initiated in 2004 • Provides Office of Science computing resources to a small number of computationally intensive, large-scale research projects that can make high-impact scientific advances through the use of a large allocation of computer time and data storage • Open to national and international researchers, including industry • No requirement of DOE Office of Science funding • Peer-reviewed • 2004 awards: 4.9 million processor hours at NERSC awarded to three projects • 2005 awards: 6.5 million processor hours at NERSC awarded to three projects CASC, May 3, 2007

  15. 2007 INCITE Allocations by Disciplines 95 million processor hours allocated to 45 projects (the sketch below gives a sense of scale) CASC, May 3, 2007
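
For a sense of what 95 million processor hours buys, here is a minimal sketch; the 20,000-core comparison machine is hypothetical, not a system named in the slides:

```python
# Scale of the 2007 INCITE allocation in machine-year terms.

TOTAL_HOURS = 95_000_000  # processor hours awarded by INCITE in 2007
PROJECTS = 45

avg = TOTAL_HOURS / PROJECTS
print(f"Average award: {avg / 1e6:.1f} M processor hours")  # ~2.1 M

# A hypothetical 20,000-core system running flat out all year delivers
# 20,000 * 24 * 365 = 175.2 M processor hours, about 1.8x the 2007 pool.
machine_year = 20_000 * 24 * 365
print(f"One 20,000-core machine-year: {machine_year / 1e6:.1f} M hours")
```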

  16. 2008 INCITE The new 2008 Call for Proposals, covering over 0.25 billion processor hours of INCITE allocations, should be announced in mid-May at http://hpc.science.doe.gov CASC, May 3, 2007

  17. Road to Exascale Simulation and Modeling at the Exascale for Energy, Ecological Sustainability, and Global Security (E3SGS) Initiative: The planned petascale computer systems, and the progress toward exascale systems, provide an unprecedented opportunity for science. They will make it possible to use computation not only as a critical tool, alongside theory and experiment, for understanding the behavior of the fundamental components of nature, but also for fundamental discovery and exploration of the behavior of complex systems with billions of components, including those involving humans. CASC, May 3, 2007

  18. Critical Challenges Energy: ensuring global sustainability requires reliable and affordable pathways to low-carbon energy production (e.g., bio-fuels, fusion, and fission) and distribution on a massive scale. Ecological Sustainability: the effort toward sustainability involves characterizing the conditions for balance in the climate system; the ability to fit energy production and industrial emissions within balanced global climate and chemical cycles is the major scientific and technical challenge for this century. Security: the internet, along with the instrumentation and control systems for the energy infrastructure, is central to the well-being of our society. CASC, May 3, 2007

  19. Programmatic Themes Engage the top scientists and engineers, computer scientists, and applied mathematicians in the country to develop the science of complexity as well as new science-driven computer architectures and algorithms tied to the needs of scientific computing at all scales; correspondingly, recruit and develop the next generation of computational and mathematical scientists. Invest in pioneering large-scale science, modeling, and simulation that contribute to advancing energy, ecology, and global security. Develop the scalable analysis algorithms, data systems, and storage architectures needed to accelerate discovery from large-scale experiments and to enable verification and validation of the results of the pioneering applications; additionally, develop visualization and data management systems that manage the output of large-scale computational science runs and integrate data analysis with modeling and simulation in new ways. Accelerate the build-out and future development of the DOE open computing facilities to realize the large-scale, systems-level science required to advance the energy, ecology, and global security program. CASC, May 3, 2007

  20. E3SGS Town Hall Meetings Three "town hall meetings" on the proposed E3SGS initiative: Lawrence Berkeley National Laboratory hosted the first meeting on April 17-18; Oak Ridge National Laboratory follows on May 17-18; and Argonne National Laboratory on May 31-June 1. CASC, May 3, 2007
