NSF’s CyberInfrastructure Vision for 21st Century Discovery, Innovation, and Learning

This presentation, given at the GridChem Workshop, reviews the history of CyberInfrastructure (CI) at NSF, strategic planning, and current and future investments. It explores the vision of a national-level integrated system of resources and services to enable new paradigms of research and education.

Presentation Transcript


  1. NSF’s CyberInfrastructure Vision for 21st Century Discovery, Innovation, and Learning. GridChem Workshop, March 9, 2006, Austin, TX. Miriam Heller, Ph.D., Program Director, Office of Cyberinfrastructure, mheller@nsf.gov

  2. Outline • CyberInfrastructure (CI) at NSF: Then and Now • Strategic Planning: Setting Directions • OCI Investments: Now and Later • Concluding Remarks

  3. Historical NSF Contributions [Timeline graphic, ca. 1985–2008: Supercomputer Centers core support from ’85 (PSC, NCSA, SDSC, JvNC, CTC) and NSFNet; the Hayes and Branscomb reports; Partnerships for Advanced Computational Infrastructure from ’97 (Alliance, NCSA-led; NPACI, SDSC-led); the PITAC report; ITR projects and Terascale Computing Systems from ’00; ETF management & operations; NMI; NCSA and SDSC core support; discipline-specific CI projects; the Atkins report; ANIR → SCI (’03) → OCI (’05).]

  4. Cyberinfrastructure Vision • “Atkins report” – blue-ribbon panel, chaired by Daniel E. Atkins • Called for a national-level, integrated system of hardware, software, & data resources and services • New infrastructure to enable new paradigms of science & engineering research and education with increased efficiency • www.nsf.gov/od/oci/reports/toc.jsp

  5. [Slide shows three report covers, dated Fall 2003, July 2005, and January 2006.]

  6. Office of CyberInfrastructure [Organization chart] • Office Director: Dan Atkins (arriving June); Debra Crawford (Acting) • Deputy Office Director: José Muñoz • Staff: Judy Hayden, Priscilla Bezdek, Mary Daley, Irene Lombardo, Allison Smith • Program Directors: Kevin Thompson, Guy Almes, Miriam Heller, Fillia Makedon, Steve Meacham, Doug Gatchell, Vittal Rao, (vacancy: software); SBE POC: Frank Scioli • Portfolio: ETF GIG; resource providers (ANL, IU, PU, ORNL, TACC, NCSA, PSC, SDSC); NCSA and SDSC core support; HPC acquisition; MRI; REU Sites; EIN; IRNC; Condor; NMI integration and development; OptIPuter; CI-TEAM; EPSCoR; GriPhyN; DiSUN; CCG; STI; CyberSecurity; SBE CyberTools

  7. CyberInfrastructure (CI) Governance • CyberInfrastructure Council (CIC) • NSF ADs and ODs, chaired by Dr. Bement (NSF Director) • CIC responsible for shared stewardship and ownership of NSF’s CyberInfrastructure Portfolio • SCI → OCI realignment • SCI / CISE → Office of the Director / Office of CyberInfrastructure (OCI) • Budget transferred • Ongoing projects and personnel transferred • OCI focuses on provisioning “production-quality” CI to enable 21st century research and education breakthroughs • CISE remains focused on basic CS research and education mission • Advisory Committee for NSF’s CI activities and portfolio • Cyberinfrastructure User Advisory Committee (CUAC)

  8. Burgeoning Number of CI Systems [Slide graphic; examples include the LHC.]

  9. CyberInfrastructure Budgets • OCI budget: $127M (FY06); $182.42M requested for FY07 • [Pie chart: NSF FY2006 CI budget, OCI 25%, research directorates 75%] • ETF + core support account for 56%, reflecting HPC hardware acquisitions, O&M, and user support as a fraction of NSF’s overall CI budget

  10. NSF’s Cyberinfrastructure Vision (FY 2006–2010) • Ch. 1: Call to Action • Visions for: • Ch. 2: High Performance Computing • Ch. 3: Data, Data Analysis & Visualization • Ch. 4: Collaboratories, Observatories and Virtual Organizations • Ch. 5: Learning and Workforce Development • http://www.nsf.gov/od/oci/ci_v5.pdf • To be completed in Summer 2006

  11. NSF states intent to “play a leadership role” • “NSF will play a leadership role in the development and support of a comprehensive cyberinfrastructure essential to 21st century advances in science and engineering research and education. • NSF is the only agency within the U.S. government that funds research and education across all disciplines of science and engineering. ... Thus, it is strategically placed to leverage, coordinate and transition cyberinfrastructure advances in one field to all fields of research.” • From NSF’s Cyberinfrastructure Vision for 21st Century Discovery

  12. CI Vision: 4 Interrelated Perspectives • High Performance Computing • Data, Data Analysis & Visualization • Collaboratories, Observatories & Virtual Organizations • Learning & Workforce Development

  13. Enabling and Motivating Trends [Diagram: technology “push” and application “pull”; digital convergence; content increasingly structured and processable.] (Atkins, Symposium on KES: Past, Present and Future)

  14. Some Computation: TeraGrid • Provides: • Unified user environment to support high-capability, production-quality cyberinfrastructure services for science & engineering research • New S&E opportunities using new ways to distribute resources and services • Integrated grid services, incl.: • HPC • Data collections • Visualization servers • Portals • Distributed, open architecture • GIG responsible for: • SW integration (incl. CTSS) • Base infrastructure (security, networking, and operations) • User support • Community engagement (e.g. Science Gateways) • 8 RPs: PSC, TACC, NCSA, SDSC, ORNL, Indiana, Purdue, Chicago/ANL • Other institutions participate as sub-awardees of the GIG

  15. Content • Digital everything: exponential growth; conversion and born-digital • S&E literature is digital; microfilm → digital for preservation; digital libraries are real and getting better • Distributed (global-scale), multi-media, multi-disciplinary observational data; huge volume • Need for large-scale, enduring, professionally managed/curated data repositories • Increasing demand for easier finding and reuse: data mining, interdisciplinary data federation • New modes of scholarly communication: what’s publishing? what’s a publication? • IP, openness, ownership, privacy, security issues (Atkins, Symposium on KES: Past, Present and Future)

  16. Interactivity • Networking – machine to machine • IRNC program • Internet2 • Interfaces – human to machine • Smart sensors, instruments, arrays – machine to physical world • CEO:P program • Organizational – interactive distributed systems; knowledge (work) environments; virtual communities • NSF Workshop on Cyberinfrastructure for the Social Sciences, 2005 • Next Generation CyberTools (Atkins, Symposium on KES: Past, Present and Future)

  17. Comprehensive & Synergistic View of IT & the Future of the Research University (Atkins, Symposium on KES: Past, Present and Future)

  18. “Borromean Ring*” Teams Needed for Cyberinfrastructure Success • Computer & Information Science & Engineering • Social & Behavioral Sciences • Disciplinary and multi-disciplinary research communities • People & Society • Iterative, participatory design; collateral learning. *Three symmetric, interlocking rings, no two of which are interlinked; removing one destroys the synergy. (Atkins, Symposium on KES: Past, Present and Future)

  19. OCI INVESTMENT HIGHLIGHTS • Midrange HPC Acquisition ($30M) • Leadership-Class High-Performance Computing System Acquisition ($50M) • Data- and Collaboration-Intensive Software Services ($25.7M) • Conduct applied research and development • Perform scalability/reliability tests to explore tool viability • Develop, harden and maintain software tools and services • Provide software interoperability • Cyberinfrastructure Training, Education, Advancement, and Mentoring (CI-TEAM) ($10M)

  20. Acquisition Strategy [Chart: science and engineering capability (logarithmic scale) vs. fiscal year, FY06–FY10; Track 1 system(s) sit above Track 2 systems, which sit above typical university HPC systems.]

  21. HPC Acquisition Activities • HPC acquisition will be driven by the needs of the S&E community • RFI held for interested resource providers and HPC vendors on 9 Sep 2005 • First in a series of HPC S&E requirements workshops held 20–21 Sep 2005 • Attended by 77 scientists and engineers • Generated Application Benchmark Questionnaire

  22. Science-Driven Cyberinfrastructure Trade-offs [Diagram: processors (P) and memory (M) connected by an interconnect.] • Interconnect fabric • Processing power • Memory • I/O
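One common way to make this trade-off concrete is a roofline-style balance check: a kernel whose arithmetic intensity (flops per byte of memory traffic) falls below the machine's flop-to-bandwidth ratio is limited by memory, not by processing power. A minimal sketch in C; the hardware numbers are illustrative assumptions, not figures from the talk:

    #include <stdio.h>

    /* Roofline-style check: is a kernel limited by the processors (P)
     * or by the memory system (M)?  Hardware numbers are assumed. */
    int main(void) {
        double peak_gflops = 10.0;  /* assumed peak compute rate, Gflop/s */
        double mem_bw_gbs  = 5.0;   /* assumed memory bandwidth, GB/s     */
        double balance = peak_gflops / mem_bw_gbs;  /* flops per byte */

        /* triad-like kernel: 2 flops per 24 bytes of memory traffic */
        double intensity = 2.0 / 24.0;

        printf("machine balance:  %.3f flops/byte\n", balance);
        printf("kernel intensity: %.3f flops/byte -> %s-bound\n",
               intensity, intensity < balance ? "memory" : "compute");
        return 0;
    }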

  23. Computing: One Size Doesn’t Fit All [Scatter chart, courtesy SDSC: data capability (increasing I/O and storage) vs. compute capability (increasing FLOPS). The SDSC Data Science Environment occupies the high-data region (data storage/preservation; extreme I/O that can’t be done on the Grid because I/O exceeds the WAN), with applications such as SCEC simulation and visualization (3D + time, out-of-core), climate, EOL, ENZO simulation and visualization, NVO, and CIPRes. The traditional HEC environment occupies the high-compute region (e.g. CPMD, QCD, protein folding); distributed-I/O-capable applications (e.g. CFD) and campus, departmental, and desktop computing fill out the space.]

  24. Benchmarking • Broad inter-agency interest • Use of benchmarking for performance prediction • Valuable when target systems are not readily available because they are: • Inaccessible (e.g. secure) • Non-existent at sufficient scale • In various stages of design • Useful for “what-if” analysis • Suppose I double the memory on my Red Storm? • Nirvana (e.g., Snavely/SDSC): • Abstract away the application: application signatures • Platform independent • Abstract away the hardware: platform signature • Convolve the signatures to provide an assessment
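The "convolve the signatures" step can be read as a weighted sum: platform-independent operation counts from the application signature multiplied by measured per-operation costs from the platform signature. A minimal sketch of that reading in C; the operation categories, counts, and costs are hypothetical, not taken from the Snavely/SDSC framework itself:

    #include <stdio.h>

    #define NOPS 3  /* hypothetical categories: flop, memory ref, message */

    /* Predict runtime by convolving an application signature (operation
     * counts) with a platform signature (seconds per operation). */
    double predict_seconds(const double counts[NOPS], const double cost[NOPS]) {
        double t = 0.0;
        for (int i = 0; i < NOPS; i++)
            t += counts[i] * cost[i];
        return t;
    }

    int main(void) {
        double app[NOPS]      = { 1e12, 4e11, 1e6 };    /* hypothetical counts */
        double platform[NOPS] = { 1e-10, 5e-10, 1e-5 }; /* hypothetical costs  */
        /* "what-if": a memory upgrade that halves the per-reference cost */
        double upgraded[NOPS] = { 1e-10, 2.5e-10, 1e-5 };

        printf("baseline prediction: %.0f s\n", predict_seconds(app, platform));
        printf("what-if prediction:  %.0f s\n", predict_seconds(app, upgraded));
        return 0;
    }

The same structure supports the "what-if" analysis above: change one entry of the platform signature and re-convolve, without rerunning the application anywhere.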

  25. HPC Benchmarking • HPC Challenge Benchmarks (http://icl.cs.utk.edu/hpcc/) • HPL – the Linpack TPP benchmark, which measures the floating point rate of execution for solving a linear system of equations • DGEMM – measures the floating point rate of execution of double precision real matrix-matrix multiplication • STREAM – a simple synthetic benchmark program that measures sustainable memory bandwidth (in GB/s) and the corresponding computation rate for a simple vector kernel • PTRANS (parallel matrix transpose) – exercises communications where pairs of processors communicate with each other simultaneously; a useful test of the total communications capacity of the network • RandomAccess – measures the rate of integer random updates of memory (GUPS) • FFTE – measures the floating point rate of execution of double precision complex one-dimensional Discrete Fourier Transform (DFT) • Communication bandwidth and latency – a set of tests to measure latency and bandwidth of a number of simultaneous communication patterns; based on b_eff (effective bandwidth benchmark)
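To make one of these concrete: the core of STREAM's triad test is just the loop below, with bandwidth computed as bytes moved divided by elapsed time. This is a single-pass sketch, not the official benchmark (which repeats each kernel several times and reports the best run); the array size and the POSIX timer are choices made here:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 10000000L  /* 10M doubles per array, ~240 MB total */

    int main(void) {
        double *a = malloc(N * sizeof *a);
        double *b = malloc(N * sizeof *b);
        double *c = malloc(N * sizeof *c);
        if (!a || !b || !c) return 1;
        for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        double s = 3.0;
        for (long i = 0; i < N; i++)
            a[i] = b[i] + s * c[i];  /* the triad kernel: 24 bytes, 2 flops */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("triad: %.2f GB/s (check: %f)\n", 24.0 * N / secs / 1e9, a[N / 2]);
        free(a); free(b); free(c);
        return 0;
    }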

  26. HPC Acquisition – Track 1 • Increased funding will support the first phase of a petascale system acquisition • Over four years, NSF anticipates investing $200M • Acquisition is critical to NSF’s multi-year plan to deploy and support a world-class HPC environment • Collaborating with sister agencies with a stake in HPC: DARPA, HPCMOD, DOE/OS, DOE/NNSA, NIH

  27. NSF Middleware Initiative • Program to design, develop, test, deploy, and sustain a set of reusable and expandable middleware functions that benefit many science and engineering applications in a networked environment • Define open-source, open-architecture standards for on-line (international) collaboration and resource sharing that is sustainable, scalable, and securable • Examples include: • Community-wide access to experimental data on the Grid • Authorized resource access across multiple campuses using common tools • Web-based portals that provide a common interface to wide-ranging Grid-enabled computation resources • Grid access to instrumentation such as accelerators and telescopes

  28. NMI-funded Activities in S&E Research • From 2001–2004, funded >40 development awards plus integration awards • Integration award highlights include the NMI Grids Center (e.g. Build and Test), Campus Middleware Services (e.g. Shibboleth) and NanoHub • Condor – mature distributed computing system installed on thousands of CPU “pools” and tens of thousands of CPUs • GridChem – open-source Java application that launches and monitors computational chemistry calculations (Gaussian03, GAMESS, NWChem, Molpro, Q-Chem, ACES) on CCG supercomputers remotely • NanoHub – extends NSF Network for Computational Nanotechnology applications, e.g. NEMO3D, nanoMOS, to a distributed environment over TeraGrid, U Wisconsin, and other grid assets using In-VIGO, Condor-G, etc.
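For flavor, running a job on a Condor pool like those above takes only a short submit description file handed to condor_submit; a minimal sketch, where the executable and file names are hypothetical:

    # job.sub - minimal Condor submit description (hypothetical job)
    universe   = vanilla
    executable = run_chem        # hypothetical analysis binary
    arguments  = input.dat
    output     = job.out
    error      = job.err
    log        = job.log
    queue

Submitted with "condor_submit job.sub" and monitored with condor_q; Condor matches the job to an idle machine in the pool and ships the output files back.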

  29. Other Middleware Funding • OCI made a major award in middleware in November 2005 to Foster/Kesselman: “Community Driven Improvement of Globus Software”, $13.3M over 5 years • Ongoing funding to Virtual Data Toolkit (VDT) middleware via OCI and MPS OSG activities, including: • DiSUN, a 5-year, $12M award for computational, storage, and middleware resources at four Tier-2 sites • GriPhyN and iVDGL, which target VDT and VDS but are ending soon • Ongoing funding to VDT middleware via TeraGrid as part of the CTSS

  30. Learning and Our 21st Century CI Workforce – CI-TEAM: Demonstration Projects • Input: 70 projects / 101 proposals / 17 (24%) collaborative projects • Outcomes: • Invested $2.67M in awards for projects up to $250K total over 1–2 years • 15.7% success rate: 11 Demonstration Projects (14 proposals) across BIO, CISE, EHR, ENG, GEO, MPS disciplines • Broadening Participation for CI Workforce Development: • Alvarez (FIU) – CyberBridges • Crasta (VaTech) – Project-Centric Bioinformatics • Fortson (Adler) – CI-Enabled 21st c. Astronomy Training for HS Science Teachers • Fox (IU) – Bringing Minority Serving Institution Faculty into CI & e-Science Communities • Gordon (OSU) – Leveraging CI to Scale-Up a Computational Science U/G Curriculum • Panoff (Shodor) – Pathways to CyberInfrastructure: CI through Computational Science • Takai (SUNY Stony Brook) – High School Distributed Search for Cosmic Rays (MARIACHI) • Developing and Implementing CI Resources for CI Workforce Development: • DiGiano (SRI) – Cybercollaboration Between Scientists and Software Developers • Figueiredo (UF) – In-VIGO/Condor-G MW for Coastal & Estuarine Science CI Training • Regli (Drexel) – CI for Creation and Use of Multi-Disciplinary Engineering Models • Simpson (PSU) – CI-Based Engineering Repositories for Undergraduates (CIBER-U)

  31. Cyberinfrastructure Training, Education, Advancement, and Mentoring for Our 21st Century Workforce (CI-TEAM) • Aims to prepare the science and engineering workforce with the knowledge and skills needed to create, advance and use cyberinfrastructure for discovery, learning and innovation across and within all areas of science and engineering • Exploits the power of cyberinfrastructure to cross digital, disciplinary, institutional, and geographic divides and fosters inclusion of diverse groups of people and organizations, with particular emphasis on traditionally underrepresented groups • Focus on workforce development activities; <50% tool development • FY06 program funds ~$10M for two types of awards: • Demonstration Projects (like FY05 projects: exploratory in nature, may be limited in scope and scale, have potential to expand into future scale implementation activities; ≤ $250,000) • Implementation Projects (larger in scope or scale, draw on prior experience with proposed activities or teams, expected to deliver sustainable learning and workforce development activities that complement ongoing NSF investment in cyberinfrastructure; ≤ $1,000,000) • New CI-TEAM solicitation due June 5, 2006

  32. Concluding Thoughts • NSF has taken a leadership role in CI and is working to define the vision and future directions • Successful past investments position CI for the revolution • Achieving the goal of provisioning CI for 21st century breakthrough science and engineering research and education depends on successful investment in the development and deployment of useful, appropriate, usable, used, and sustainable CI resources, tools, and services, complemented by investment in a cyber-savvy workforce to design, deploy, use and support them • NSF needs PIs to: • Advise NSF on CI needs • Track growing CI use • Demonstrate breakthrough research and education

  33. Thank You! Questions? Miriam Heller, Ph.D., Program Director, Office of Cyberinfrastructure, National Science Foundation. Tel: +1.703.292.7025. Email: mheller@nsf.gov

  34. 2005 International Research Network Connections (IRNC) Awards • TransPAC2 (U.S.–Japan and beyond) • GLORIAD (U.S.–China–Russia–Korea) • TransLight/PacificWave (U.S.–Australia) • TransLight/StarLight (U.S.–Europe) • WHREN (U.S.–Latin America)
