
The OptIPuter – From SuperComputers to SuperNetworks

The OptIPuter – From SuperComputers to SuperNetworks. GEON Meeting San Diego Supercomputer Center, UCSD La Jolla, CA November 19, 2002. Dr. Larry Smarr Director, California Institute for Telecommunications and Information Technologies Professor, Dept. of Computer Science and Engineering



Presentation Transcript


  1. The OptIPuter – From SuperComputers to SuperNetworks. GEON Meeting, San Diego Supercomputer Center, UCSD, La Jolla, CA, November 19, 2002. Dr. Larry Smarr, Director, California Institute for Telecommunications and Information Technologies; Professor, Dept. of Computer Science and Engineering, Jacobs School of Engineering, UCSD

  2. California Has Initiated Four New Institutes for Science and Innovation: the California Institute for Bioengineering, Biotechnology, and Quantitative Biomedical Research (UCSF, UC Berkeley, UC Santa Cruz); the Center for Information Technology Research in the Interest of Society (UC Berkeley, UC Davis, UC Merced, UC Santa Cruz); the California NanoSystems Institute (UCLA, UC Santa Barbara); and the California Institute for Telecommunications and Information Technology (UC San Diego, UC Irvine). www.ucop.edu/california-institutes

  3. Cal-(IT)2: An Integrated Approach to the Future Internet. 220 UC San Diego & UC Irvine Faculty Working in Multidisciplinary Teams With Students, Industry, and the Community. The State’s $100 M Creates Unique Buildings, Equipment, and Laboratories. www.calit2.net

  4. SDSC and Cal-(IT)2 Have the Same Knowledge and Data Team by Construction! [Layered diagram, top to bottom: Applications (medical informatics, biosciences, ecoinformatics, …); Visualization; Data Mining, Simulation Modeling, Analysis, Data Fusion; Knowledge-Based Integration; Advanced Query Processing; Grid Storage Filesystems, Database Systems; High-Speed Networking; Networked Storage (SAN), Instruments, Sensornets; Storage Hardware. Spanning the SDSC Data and Knowledge Systems Program and the Cal-(IT)2 Knowledge and Data Engineering Laboratory.]

  5. The Move to Data-Intensive Science & Engineering: e-Science Community Resources (Sloan Digital Sky Survey, ALMA, LHC ATLAS)

  6. Why Optical Networks Are Emerging as the 21st Century Driver for the Grid (Scientific American, January 2001)

  7. The Rapid Increase in Bandwidth Is Driven by Parallel Lambdas on Single Optical Fibers (WDM). Parallel Lambdas Will Drive This Decade the Way Parallel Processors Drove the 1990s
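The parallelism analogy on this slide can be made concrete with back-of-the-envelope arithmetic; the channel count and per-lambda rate below are illustrative assumptions (typical of circa-2002 DWDM systems), not figures from the talk.

```python
# Aggregate capacity of one fiber under WDM: parallel wavelengths ("lambdas")
# multiply bandwidth the way parallel processors multiplied compute.
# Both parameters are assumed values for illustration, not from the slide.
lambdas_per_fiber = 40   # assumed DWDM channel count
gbps_per_lambda = 10     # assumed per-wavelength line rate (e.g., OC-192)

fiber_capacity_gbps = lambdas_per_fiber * gbps_per_lambda
print(fiber_capacity_gbps)  # 400 Gb/s over a single fiber
```

Doubling either the channel count or the line rate doubles the fiber's capacity, which is why wavelength parallelism, rather than faster serial links alone, drove the bandwidth growth the slide describes.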

  8. A LambdaGrid Will Be the Backbone for an e-Science Network. [Diagram: Apps, Middleware, and Clusters ride on Dynamically Allocated Lightpaths over Switch Fabrics, coordinated by a Control Plane with Physical Monitoring.] Source: Joe Mambretti, NU

  9. The Next S-Curves of Networking: Exponential Technology Growth. [Chart: successive networking-technology S-curves of penetration (0% research/early adopters to 100% production/mass market) over time: Gigabit Testbeds (~1990s), Internet2 Abilene and the Connections Program (2000), and DWDM Lambda Grids / Experimental Networks (2010).]

  10. Data-Intensive Scientific Applications Require Experimental Optical Networks
  • Large Data Challenges in Neuro and Earth Sciences: Each Data Object is 3D and Gigabytes; Data are Generated and Stored in Distributed Archives; Research is Carried Out on Federated Repositories
  • Requirements: Computing → PC Clusters; Communications → Dedicated Lambdas Over Fiber; Data → Large Peer-to-Peer Lambda-Attached Storage; Visualization → Collaborative Volume Algorithms
  • Response: The OptIPuter Research Project

  11. Illinois’ I-WIRE: The First State Dark Fiber Experimental Network. [Map of dark-fiber pair counts (2–18 pairs per segment) linking Starlight (NU-Chicago), Argonne, Qwest 455 N. Cityfront, UC Gleacher 450 N. Cityfront, UIC, UIUC/NCSA, McLeodUSA 151/155 N. Michigan, Doral Plaza, Level(3) 111 N. Canal, the Illinois Century Network (James R. Thompson Ctr, City Hall, State of IL Bldg), UChicago, and IIT.] Source: Charlie Catlett, 12/2001

  12. CENIC Plans to Create a Dark Fiber Experimental and Research Network: The SoCal Component

  13. From Telephone Conference Calls to Access Grid International Video Meetings. Can We Modify This Technology to Create a GEON Research Virtual Laboratory? Access Grid Lead: Argonne. NSF STARTAP Lead: UIC’s Elec. Vis. Lab

  14. iGrid 2002, September 24–26, 2002, Amsterdam, The Netherlands
  • Fifteen Countries/Locations Proposing 28 Demonstrations: Canada, CERN, France, Germany, Greece, Italy, Japan, The Netherlands, Singapore, Spain, Sweden, Taiwan, United Kingdom, United States
  • Applications Demonstrated: Art, Bioinformatics, Chemistry, Cosmology, Cultural Heritage, Education, High-Definition Media Streaming, Manufacturing, Medicine, Neuroscience, Physics, Tele-science
  • Grid Technologies: Grid Middleware, Data Management/Replication Grids, Visualization Grids, Computational Grids, Access Grids, Grid Portals
  Sponsors: HP, IBM, Cisco, Philips, Level(3), Glimmerglass, etc. UIC, www.startap.net/igrid2002

  15. iGrid 2002 Was Sustaining 1–3 Gigabit/s; Total Available Bandwidth Between Chicago and Amsterdam Was 30 Gigabit/s

  16. The NSF TeraGrid: A LambdaGrid of Linux SuperClusters. [Sites on the 40 Gbps TeraGrid backbone: Caltech (0.5 TF, 0.4 TB memory, 86 TB disk); Argonne (1 TF, 0.25 TB memory, 25 TB disk); NCSA (8 TF, 4 TB memory, 240 TB disk); SDSC (4.1 TF, 2 TB memory, 250 TB disk).] This Will Become the National Backbone to Support Multiple Large-Scale Science and Engineering Projects (Applications, Visualization, Compute, Data). Partners: Intel, IBM, Qwest, Myricom, Sun, Oracle. $53 Million from NSF

  17. From SuperComputers to SuperNetworks: Changing the Grid Design Point
  • The TeraGrid is Optimized for Computing
  • 1024 IA-64 Node Linux Cluster
  • Assume 1 GigE per Node = 1 Terabit/s I/O
  • Grid Optical Connection 4x10Gig Lambdas = 40 Gigabit/s
  • Optical Connections are Only 4% of Bisection Bandwidth
  • The OptIPuter is Optimized for Bandwidth
  • 32 IA-64 Node Linux Cluster
  • Assume 1 GigE per Processor = 32 Gigabit/s I/O
  • Grid Optical Connection 4x10GigE = 40 Gigabit/s
  • Optical Connections are Over 100% of Bisection Bandwidth
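The 4% and over-100% figures on this slide follow directly from the node counts; a small sketch (the function and parameter names are mine, not from the talk) reproduces the arithmetic:

```python
def optical_fraction(nodes, gige_per_node=1, wan_lambdas=4, gbps_per_lambda=10):
    """Ratio of a cluster's wide-area optical bandwidth to its aggregate
    cluster I/O (1 GigE per node), as in the slide's comparison."""
    cluster_io_gbps = nodes * gige_per_node   # internal I/O capacity
    wan_gbps = wan_lambdas * gbps_per_lambda  # 4 x 10 Gig lambdas = 40 Gb/s
    return wan_gbps / cluster_io_gbps

print(f"TeraGrid (1024 nodes): {optical_fraction(1024):.0%}")  # ~4%
print(f"OptIPuter (32 nodes):  {optical_fraction(32):.0%}")    # 125%, i.e. over 100%
```

The design point flips simply by shrinking the cluster until the 40 Gb/s wide-area connection exceeds the cluster's own aggregate I/O, so the network, not the compute, sets the ceiling.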

  18. The OptIPuter is an Experimental Network Research Project
  • Driven by Large Neuroscience and Earth Science Data: EarthScope and BIRN
  • Multiple Lambdas Linking Clusters and Storage
  • LambdaGrid Software Stack: Integration with PC Clusters; Interactive Collaborative Volume Visualization; Lambda Peer-to-Peer Storage With Optimized Storewidth; Enhance Security Mechanisms; Rethink TCP/IP Protocols
  • NSF Large Information Technology Research Proposal: UCSD and UIC Lead Campuses (Larry Smarr PI); USC, UCI, SDSU, NW Partnering Campuses
  • Industrial Partners: IBM, Telcordia/SAIC, Chiaro Networks, CENIC
  • $13.5 Million Over Five Years

  19. Metro Optically Linked Visualization Walls with Industrial Partners Set Stage for Federal Grant
  • Driven by SensorNets Data: Real-Time Seismic; Environmental Monitoring; Distributed Collaboration; Emergency Response
  • Linked UCSD and SDSU Control Rooms over 44 Miles of Cox Fiber
  • Dedication March 4, 2002
  Partners: Cox, Panoram, SAIC, SGI, IBM, TeraBurst Networks, SD Telecom Council

  20. OptIPuter NSF Proposal Partnered with National Experts and Infrastructure. [Network map: SURFnet, CERN, Asia Pacific, CA*net4 (Vancouver, Seattle), Pacific Light Rail, Portland, Chicago (UIC, NU), NYC, PSC, San Francisco, TeraGrid DTFnet, NCSA, CENIC, USC and UCI (Los Angeles), UCSD and SDSU (San Diego, SDSC), Atlanta, AMPATH.] Source: Tom DeFanti and Maxine Brown, UIC

  21. OptIPuter LambdaGrid Enabled by Chiaro Networking Router. [Diagram: a Chiaro Enstara router links campus switches at the San Diego Supercomputer Center, Scripps Institution of Oceanography, Medical Imaging and Microscopy, and Chemistry, Engineering, and Arts sites, carrying Cluster–Disk, Disk–Disk, Viz–Disk, DB–Cluster, and Cluster–Cluster traffic.] www.calit2.net/news/2002/11-18-chiaro.html

  22. The UCSD OptIPuter Deployment: The UCSD OptIPuter LambdaGrid Testbed. [Campus map (½-mile scale) showing Phase I (Fall ’02) and Phase II (2003) sites and collocation points: SDSC, SDSC Annex, Preuss High School, JSOE (Engineering), CRCA (Arts), SOM (Medicine), Undergrad College, 6th College, Chemistry, Phys. Sci–Keck, Node M (collocation), Earth Sciences (SIO), with links to other OptIPuter sites.]

  23. OptIPuter Transforms Individual Laboratory Visualization, Computation, & Analysis Facilities. Applications: Anatomy and Neuroscience (Visible Human Project: NLM, Brooks AFB, SDSC); SDSC Volume Explorer (Dave Nadeau, SDSC, BIRN), with fast polygon and volume rendering; with stereographics + GeoWall = 3D; Underground Earth Science (GeoFusion GeoMatrix Toolkit: Rob Mellors and Eric Frost, SDSU); The Preuss School UCSD OptIPuter Facility
