
The Boulder Area Teragrid (BAT) “Step up to the Plate”



Presentation Transcript


  1. The Boulder Area Teragrid (BAT) “Step up to the Plate”
  Marla Meehl, Peter O’Neil, Jim Van Dyke

  2. Outline
  • Motivations for connecting to the Teragrid
  • The pyramid and hierarchies of networking
  • CENIC (California) and I-Wire (Illinois)
  • NSF ANIR’s hierarchy for funding
  • Distributed Terascale Facility vision
  • Pacific Light Rail/Teragrid access to DTF
  • Applications and benefits
  • Costs and potential funding partners

  3. Motivations
  • Make UCAR/NCAR, NOAA/NIST, and CU-Boulder (BAT) facilities (supercomputers, mass storage, visualization, models, datasets, and networking technologies) available to the Teragrid community
  • Enable advanced scientific application services
  • Invest in infrastructure to position the BAT for emerging federal program funding

  4. Motivations
  • Initiate the BAT collaboration in national and international “experimental” grid networking
  • Accelerate the exchange of technical expertise
  • A natural extension of BRAN
  • A prudent and scalable path for the long term
  • Political clout of being “on” the Teragrid

  5. PITAC Report Considerations
  • President’s Information Technology Advisory Committee (PITAC)
  • NSF centers are one generation (no more) behind ASCI
  • Support for massive petabyte databases
  • Near future:
    • Bandwidth across the country as fast as, if not faster than, bandwidth within centers (see the sketch below)
    • 10 GHz processors by the end of the decade for petaflop computing
  • LSN projects and funding driven by PITAC report(s)
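To make the bandwidth bullet concrete, here is a back-of-the-envelope sketch (not from the slides; the line rates are illustrative) of how long a petabyte-scale transfer takes at OC-12 speed versus the 10–40 Gig waves discussed on later slides:

```python
# Back-of-the-envelope only: days to move a dataset at a sustained line
# rate, ignoring protocol overhead and loss. Rates are illustrative.

def transfer_days(total_bytes: float, gbps: float) -> float:
    """Days needed to move total_bytes at gbps gigabits per second."""
    seconds = (total_bytes * 8) / (gbps * 1e9)
    return seconds / 86_400  # seconds per day

PETABYTE = 1e15  # bytes
for rate_gbps in (0.622, 10.0, 40.0):  # OC-12, 10 Gig wave, 40 Gig wave
    print(f"{rate_gbps:>6.3f} Gbps -> {transfer_days(PETABYTE, rate_gbps):7.1f} days")
# ~148.9 days at OC-12, ~9.3 days at 10 Gbps, ~2.3 days at 40 Gbps
```

At these scales the wide-area link, not the machine room, dominates turnaround time, which is the PITAC point about cross-country bandwidth needing to match bandwidth within centers.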

  6. NSF Review of NCAR/SCD
  • “…advanced networking capabilities will be required to make this data available to the community that NCAR serves.”
  • “NCAR must lead the way to this future.”
  • Be a “player” on the national stage – not follow or lag behind
  • “SCD will need to increase its investment in research – research carefully directed at the most critical problems faced by the computational atmospheric sciences community.”
  • “The panel recognizes that the provision of computing, datasets, data analysis, and networking services to the university community is central to the mission of SCD.”
  • “NCAR must continue to develop its networking capabilities; otherwise, it will not be able to serve its proper role as a data repository for the atmospheric sciences community.”

  7. Our Challenge
  • “The world has seemingly grown smaller through astonishing advances in telecommunications, but we have, at the same time, a vastly greater appreciation of the complexity and interrelatedness of the physical and human spheres that form the coupled Earth system. NSF now uses new terms such as ‘planetary metabolism’ and ‘planetary ecology’ to capture the need to think in a more integrated sense about humanity’s relationship with the natural world. These scientific challenges are indeed grand.”
  • NCAR Strategic Plan - 2001

  8. SCD’s Challenge
  • “New to SCD is a focus on using this [intellectual] leadership as a transformational engine for NCAR and its community through convergence of elements of the information technology revolution, such as collaborative environments and connection to NSF’s Teragrid of distributed computing and data services.”
  • NCAR Strategic Plan - 2001

  9. CENIC Pyramid

  10. Networking – ANIR/NSF
  • Future Networks:
    • Operational High Performance (Production) Networks
    • Experimental Infrastructure Networks
    • Research Networks

  11. Networking – ANIR/NSF
  • High Performance (Production) Networks
    • Abilene, vBNS+, FedNets (ESnet, DREN, NREN)
    • Essential tool for research and education
    • Always available and dependable 24/7
    • High performance
    • International connectivity
    • Exciting future
    • NSF support for focused activities, e.g., middleware, measurement initiatives, network simulation

  12. Networking – ANIR/NSF
  • Experimental Networks – PLR/NLR, I-Wire, Teragrid
    • High-performance trials of cutting-edge networks
    • Based on advanced application needs unsupported by existing production network services
    • Ultra high speed – one or more 10–40 Gig waves
    • Robust enough to support application-dictated development of application software toolkits, middleware, computing, and networking
    • Provide experimental services on a persistent basis, yet encourage experimentation with innovative and novel concepts
    • International connectivity

  13. Networking – ANIR/NSF
  • Research Net – Point-to-Point Waves, DTF
    • Experiment with disruptive technologies:
      • Design of experiments
      • Implementation of experiments
      • Evaluation of results
    • Smaller-scale network prototypes that enable basic scientific and engineering network research and testing of component technologies, protocols, and network architectures

  14. “The Network is the Supercomputer”

  15. Distributed/Extended Machine Rooms (MR2MR)
  [diagram: interconnected DTF machine rooms]

  16. Increasingly, with broadband and even private waves, fiber is needed for e2e experimental/developmental networks
  [map, 12/5/01: DTF critical-mass sites – top 10 research universities, next 15 research universities, key centers and labs, international 10 Gig and wave connections]

  17. Leverage Regional Connections
  • Incent fatter/dedicated pipes
  • Enable significant e2e
  • Connect scientists/labs/devices
  • Establish Tera/MetaPoP centers
  [map legend, draft 12/4/01: critical-mass sites – top 10 research universities, next 15 research universities, centers and labs, international 10 Gig and wave connections, key hubs]

  18. Enabling a New Class of Applications
  • Data-intensive computing
  • Collaboration technology
  • Distance visualization
  • Workflow management and collaborative problem-solving environments
  • Management of large-scale, distributed, multi-institutional systems, e.g., the Grid
  • Sensornet
  • Hierarchical data delivery (see the sketch after this list)
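“Hierarchical data delivery” is the most mechanism-like item on this list. The following is a minimal sketch of the idea, assuming nothing about actual BAT or Teragrid software (the tier names and dataset key are hypothetical): a request walks the hierarchy from the fastest tier to the slowest and promotes hits toward the requester, so popular datasets migrate close to the compute.

```python
from typing import Optional

class Tier:
    """One level of a storage hierarchy, e.g. local scratch, regional cache, archive."""
    def __init__(self, name: str):
        self.name = name
        self.objects: dict[str, bytes] = {}

    def get(self, key: str) -> Optional[bytes]:
        return self.objects.get(key)

def fetch(key: str, tiers: list[Tier]) -> Optional[bytes]:
    """Walk the hierarchy fastest-first; copy a hit into every faster tier."""
    for i, tier in enumerate(tiers):
        data = tier.get(key)
        if data is not None:
            for faster in tiers[:i]:  # promote so later reads stay local
                faster.objects[key] = data
            return data
    return None  # not found anywhere in the hierarchy

# Hypothetical three-level hierarchy; the dataset initially lives only in the archive.
local, regional, archive = Tier("local"), Tier("regional"), Tier("archive")
archive.objects["climate-run-0001"] = b"..."
assert fetch("climate-run-0001", [local, regional, archive]) is not None
assert local.get("climate-run-0001") is not None  # promoted to the local tier
```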

  19. Applications
  • Turbulence
  • Big Data:
    • 1000-year climate data
    • Earth Systems Grid
  • Data Portals:
    • Atmospheric reanalysis
    • Windows to the Universe
  • NSF Cyber-infrastructure – eScience
    • NCAR is being asked to be a leader here
  • Doppler radar networks

  20. What If DTF Fails?
  • The data-, network-, and application-focused research and development could still make the Teragrid a success
  • The BAT could be a big part of this success:
    • Custom (tailor-made) networks
    • Efficient networks that are geographically well placed
    • Data repositories
    • Agile optical transport networks
    • Enhancing and scaling networks

  21. Benefit to Non-Teragrid BAT Users
  • Benefits all users if we can access big datasets on the NCAR Mass Storage System and other storage systems
  • Pre-positions the BAT to be highly competitive in the e-Science funding environment
  • Access to broader data repositories
  • Testbeds for refining standards and technology in a limited environment, enabling application-dictated services on a broader basis (in cooperation with the private sector)
  • Positions Colorado for state-level advanced networking efforts

  22. Additional Benefits to UCAR Universities of an Experimental Network
  • “The pursuit of knowledge drives a researcher’s experimental design, which in turn determines the scientific resources required, which then drives the information resources and services required. Or, that is how it should be, from an application’s point of view. A major complaint from application scientists is that historic funding mechanisms for FedNets create the opposite order, whereby networks define the limits of the applications. Furthermore, end-to-end requirements have not been addressed; the problem of routinely getting from science machines in the sites/campuses to the high-performance wide-area network is unsolved.”
  • NSF CISE Grand Challenges in e-Science Workshop Report 2002

  23. Experimental Networks to Incubate a Paradigm Shift
  • “Networks should be described as collections of application services rather than by their circuits, their theoretical bandwidth, or their architectures, and experimental networks are the only likely means for incubating this paradigm shift…
  • e-Science developers care only about services delivered at the application level – such as observed data transfer rates, video frame rates, reliable multicasting, and inter-organizational security and authentication capabilities. Delivery of application services requires a vertical integration effort – from the network infrastructure level all the way to the application layer, requiring a ‘paradigm shift’ in the way the Nation thinks about high-performance networks.”
  • NSF CISE Grand Challenges in e-Science Workshop Report 2002
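One concrete reading of “observed data transfer rates”: what matters to an application is end-to-end goodput, not the theoretical bandwidth of the circuit underneath. A minimal measurement sketch, assuming nothing about Teragrid tooling (the URL is a placeholder):

```python
import time
import urllib.request

def observed_rate_mbps(url: str, chunk: int = 1 << 20) -> float:
    """Download url and return the application-level goodput in megabits/second."""
    start = time.monotonic()
    total = 0
    with urllib.request.urlopen(url) as resp:
        while True:
            block = resp.read(chunk)
            if not block:
                break
            total += len(block)
    elapsed = time.monotonic() - start
    return (total * 8) / (elapsed * 1e6)

# Usage (placeholder URL): print(observed_rate_mbps("https://example.org/dataset.nc"))
```

A number like this, measured machine to machine, spans every layer the report’s “vertical integration” names: disks, hosts, campus networks, and the wide-area path.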

  24. Cost Sharing and Funding Opportunities
  • UCAR/NCAR, NOAA/NIST, and CU-Boulder cost share for experimental/research efforts
  • NSF ANIR and ATM support for NCAR interconnectivity to the Teragrid
  • Year 3 Teragrid funds ($37.5M)
  • January solicitation for TeraPoPs as core nodes of “experimental” network infrastructure for optical application transfer
  • March solicitation for experimental network research
  • Potential DOE funding

  25. Funding Criteria
  • Setting priorities among areas of research ultimately requires tradeoffs between different goals, such as quality of life and expanding the frontiers of human knowledge and understanding
  • The allocation of funds to research is primarily a political process
  • Promote tools, technologies, or facilities that can accelerate the pace of discovery in the geosciences
  • Enable cyber-infrastructure for integrated and interdisciplinary studies
  • CRA Testimony to Senate Advisory Committee

  26. Budget Assumptions
  • Two cost options
  • Includes cost of spare equipment
  • Boulder fiber interconnect costs undetermined
  • No direct attachment to the DTF

  27. Teragrid/Light Rail Cost Summary – Worst Case

  28. Teragrid/Light Rail Cost Summary – Current Realistic Estimate

  29. Other Cost Considerations
  • Savings (totaled in the check below):
    • UCAR/NCAR/NOAA FRGP OC12 = $100,000/year
    • CU-Boulder FRGP/4-campus OC12 = $100,000/year
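A trivial check of the combined figure, assuming the two line items above are additive:

```python
# Sanity check: total annual savings from the two OC12 circuits listed above.
savings_per_year = {
    "UCAR/NCAR/NOAA FRGP OC12": 100_000,
    "CU-Boulder FRGP/4-campus OC12": 100_000,
}
print(f"Total: ${sum(savings_per_year.values()):,}/year")  # $200,000/year
```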

  30. Summary
  • Facilitates/defines the BAT national presence
  • Participation strongly encouraged by SDSC, NCSA, PSC, NSF, and Light Rail
  • Critical to ensure long-term funding opportunities
  • Valuable long-term investment: incremental cost of network upgrades is minimized
  • “We can’t afford not to participate”
