
Presentation Transcript


  1. From the WEB to the GRID. Industrial potential of the technology. Fabrizio GAGLIARDI, CERN, Geneva, Switzerland. EU-DataGrid Project Leader. October 2001. F.Gagliardi@cern.ch

  2. Talk summary • Introduction • From the WEB to the Grid • EU DataGrid background • Future Plans • Potential for industry and commerce • Conclusions

  3. From the WEB to the GRID • The history of computing is one of solutions in search of problems to solve • In the mid-80s, the problem facing physicists at CERN was exchanging multimedia information within international, world-wide scientific collaborations

  4. The WEB example • All the elements of the solution were there: • Internet • Reasonably powerful PCs • Friendly user interfaces • Hypertext, invented long before • Tim Berners-Lee had the vision in 1989 • A good invention that had to migrate to the US to become a phenomenal success

  5. Technological evolution • Networks: QoS, availability, cost • Metcalfe's law: the usefulness of a network grows with the square of the number of its nodes • Internet traffic grows exponentially (doubles every 12 months) • PC • Moore's law: CPU power doubles every 18 months • User interfaces: Mosaic, Netscape, Portals
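To make these growth figures concrete, here is a minimal illustrative sketch (not part of the original talk) that projects relative network traffic and CPU power from the doubling periods quoted on the slide; the ten-year horizon is an arbitrary choice for illustration.

```python
# Illustrative only: project relative growth from the doubling periods on the slide.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Relative growth after `years`, given a fixed doubling period."""
    return 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    for years in (1, 5, 10):
        traffic = growth_factor(years, 1.0)   # traffic doubles every 12 months
        cpu = growth_factor(years, 1.5)       # Moore's law: doubles every 18 months
        print(f"after {years:2d} year(s): traffic x{traffic:7.1f}, CPU power x{cpu:6.1f}")
```

Over ten years those rates give roughly a 1000x increase in traffic against a roughly 100x increase in CPU power.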

  6. NSF PACI Network Connections

  7. DataTAG project [network diagram: European research networks (UK SuperJANET4, NL SURFnet, IT GARR-B) linked via GEANT and CERN to US networks (Abilene, ESNET, MREN) through New York, STAR-LIGHT and STAR-TAP]

  8. Asian Pacific Grid • Common framework for Asia-Pacific Grid researchers • Represent AP interests to the GGF • Collaborate with APAN/TransPAC • Voluntary framework: not a project funded from a single source

  9. New step in technology • Wide area networking is becoming as powerful, reliable and affordable as local area networking • A PC today has the power of a computer centre of "only" 10 years ago • Powerful graphics and friendly interfaces make access to computing resources very easy • In short: the time is ripe for a new vision

  10. The CERN problem

  11. CERN, the European Organisation for Nuclear Research • 20 European countries • 2,500 staff • 6,000 users

  12. 27 km of tunnel stuffed with magnets and klystrons

  13. One of the four LHC detectors • 40 MHz (40 TB/sec) from the online system • Multi-level trigger: filter out background, reduce data volume • Level 1 (special hardware): 75 kHz (75 GB/sec) • Level 2 (embedded processors): 5 kHz (5 GB/sec) • Level 3 (PCs): 100 Hz (100 MB/sec) • Data recording & offline analysis
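The trigger chain above is essentially a cascade of rate reductions. The sketch below (an illustration, not CERN software) recomputes the stage-to-stage reduction factors from the figures quoted on the slide.

```python
# Illustrative only: recompute the reduction factor at each trigger level
# from the rates and bandwidths quoted on the slide.

levels = [
    ("detector output",               40_000_000, 40e12),  # 40 MHz, ~40 TB/s
    ("level 1 (special hardware)",        75_000, 75e9),   # 75 kHz, ~75 GB/s
    ("level 2 (embedded processors)",      5_000, 5e9),    # 5 kHz,  ~5 GB/s
    ("level 3 (PC farm)",                    100, 100e6),  # 100 Hz, ~100 MB/s
]

prev_rate = None
for name, rate_hz, bytes_per_sec in levels:
    note = f"  (keeps ~1 in {prev_rate / rate_hz:,.0f})" if prev_rate else ""
    print(f"{name:32s} {rate_hz:>12,} Hz  {bytes_per_sec / 1e9:>10,.3f} GB/s{note}")
    prev_rate = rate_hz
```

End to end this is a reduction of about 400,000 in event rate, bringing the stream from ~40 TB/sec at the detector down to the ~100 MB/sec that is actually recorded for offline analysis.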

  14. The LHC Detectors (ATLAS, CMS, LHCb) • ~6-8 PetaBytes / year • ~10^8 events / year

  15. Funding • Requirements growing faster than Moore's law • CERN's overall budget is fixed • Estimated cost of facility at CERN ~30% of offline requirements* • Budget level in 2000 for all physics data handling • *assumes physics in July 2005, rapid ramp-up of luminosity [chart: cost breakdown across R&D testbed, physics WAN, systems administration, mass storage, disks, processors]

  16. World Wide Collaboration → distributed computing & storage capacity • CMS: 1800 physicists, 150 institutes, 32 countries

  17. LHC Computing Model [diagram: tiered model with CERN at the centre, Tier 1 regional centres (e.g. FermiLab and Brookhaven in the USA, plus centres in the UK, France, Italy, NL and Germany), Tier 2 centres, university physics departments and desktops] les.robertson@cern.ch
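The tiered structure in this diagram can be written down as a simple hierarchy. The sketch below is a hypothetical illustration of that structure only; the site names are placeholders, not an actual site list from the project.

```python
# Hypothetical illustration of the tiered LHC computing model:
# CERN feeds regional Tier 1 centres, which feed Tier 2 centres,
# which in turn serve physics departments and desktops.

from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Site:
    name: str
    tier: str
    children: list[Site] = field(default_factory=list)

    def add(self, child: Site) -> Site:
        self.children.append(child)
        return child

    def show(self, indent: int = 0) -> None:
        print("  " * indent + f"{self.tier}: {self.name}")
        for child in self.children:
            child.show(indent + 1)

cern = Site("CERN", "Centre")
tier1 = cern.add(Site("Regional centre (placeholder)", "Tier 1"))
tier2 = tier1.add(Site("National/lab centre (placeholder)", "Tier 2"))
tier2.add(Site("Physics department desktop (placeholder)", "Desktop"))
cern.show()
```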

  18. The solution: the GRID

  19. The GRID metaphor • Analogous to the electrical power grid • Unlimited, ubiquitous, distributed computing • Transparent access to multi-petabyte distributed databases • Easy to plug in • Hidden complexity of the infrastructure Ian Foster and Carl Kesselman, editors, "The Grid: Blueprint for a New Computing Infrastructure," Morgan Kaufmann, 1999, http://www.mkp.com/grids

  20. EU DataGrid background • Motivated by the challenge of LHC computing • Large amount of data (~10 Pbytes/year starting in 2006) • Distributed computing resources and skills • Geographically distributed worldwide community (VO) • Excellent match between the Grid computing model and HEP requirements (Foster's quote: HEP is Grid computing "par excellence") • Transition from supercomputers to commodity computing already done • Distributed job-level parallelism (no strong need for MPI; see the sketch below) • High-throughput computing rather than supercomputing • VO tradition already long established • Prototype Grid activity in some CERN member states
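The job-level parallelism point is worth spelling out: HEP events are independent, so the workload splits into jobs that never need to talk to each other. A minimal hypothetical sketch (not DataGrid middleware) of that pattern:

```python
# Hypothetical sketch of job-level parallelism: events are independent, so a
# dataset splits into chunks processed by separate workers with no
# inter-process communication (hence no strong need for MPI).

from concurrent.futures import ProcessPoolExecutor

def process_events(event_ids: list[int]) -> int:
    """Stand-in for a real analysis job; here it just counts its events."""
    return len(event_ids)

def split(events: list[int], n_jobs: int) -> list[list[int]]:
    """Partition the event list into n_jobs independent chunks."""
    return [events[i::n_jobs] for i in range(n_jobs)]

if __name__ == "__main__":
    events = list(range(1_000_000))          # pretend event identifiers
    with ProcessPoolExecutor(max_workers=8) as pool:
        total = sum(pool.map(process_events, split(events, n_jobs=8)))
    print(f"processed {total} events as independent jobs")
```

On a real Grid the chunks would be dispatched to remote sites rather than local processes, but the key property is the same: throughput scales with the number of jobs because nothing needs to be synchronised between them.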

  21. Main project goals and characteristics • To build a significant prototype of the LHC computing model • To collaborate with and complement other European and US projects • To develop a sustainable computing model applicable to other sciences and industry: biology, earth observation etc. • Specific project objectives: • Middleware for fabric & Grid management (mostly funded by the EU): evaluation, test, and integration of existing M/W S/W and research and development of new S/W as appropriate • Large scale testbed (mostly funded by the partners) • Production quality demonstrations (partially funded by the EU) • Open source and communication: • Global GRID Forum • Industry and Research Forum

  22. Main Partners • CERN – International (Switzerland/France) • CNRS - France • ESA/ESRIN – International (Italy) • INFN - Italy • NIKHEF – The Netherlands • PPARC - UK

  23. Associated Partners • Research and Academic Institutes • CESNET (Czech Republic) • Commissariat à l'énergie atomique (CEA) – France • Computer and Automation Research Institute,  Hungarian Academy of Sciences (MTA SZTAKI) • Consiglio Nazionale delle Ricerche (Italy) • Helsinki Institute of Physics – Finland • Institut de Fisica d'Altes Energies (IFAE) - Spain • Istituto Trentino di Cultura (IRST) – Italy • Konrad-Zuse-Zentrum für Informationstechnik Berlin - Germany • Royal Netherlands Meteorological Institute (KNMI) • Ruprecht-Karls-Universität Heidelberg - Germany • Stichting Academisch Rekencentrum Amsterdam (SARA) – Netherlands • Swedish Natural Science Research Council (NFR) - Sweden • Industrial Partners • Datamat (Italy) • IBM (UK) • CS-SI (France)

  24. Project scope • 9.8 M Euros EU funding over 3 years • 90% for middleware and applications (HEP, EO and biology) • Three year phased developments & demos (2001-2003) • Possible extensions (time and funds) on the basis of first successful results: • DataTAG (2002-2003) • CrossGrid (2002-2004) • GridStart (2002-2004) • … More info on www.eu-datagrid.org

  25. Potential for industry and commerce • New business model (open source + added-value services) • Endorsed by three DataGrid partners • IBM's recent announcements and plans • Integration and service providers • Opportunity for ASPs • Electronic commerce enabler

  26. A few industrial examples • NASA: on-line diagnostics • Boeing: HPC simulation for engineering design • ESA: several compute- and data-intensive EO applications • VCs exploring other business opportunities (see the Index Ventures presentations at GGF3 in Frascati)

  27. A few scientific examples

  28. What we will be able to do if Grids and networks continue to grow • Online analysis of instrument data • Tele-immersion / distance collaboration • Trans-Atlantic remote visualization and steering • Record-setting distributed supercomputing • Collaborative data mining

  29. Example Application: Online Instrumentation • Advanced Photon Source (DOE X-ray source grand challenge: ANL, USC/ISI, NIST, U.Chicago) • Real-time collection • Tomographic reconstruction • Wide-area dissemination • Desktop & VR clients with shared controls • Archival storage

  30. Improving Severe Storm Forecasting: Using the Grid to Gather the Initial Data • NEXRAD Doppler radar • Automated surface networks • Upper-air balloons • Satellites • Commercial aircraft

  31. Conclusions • EU DataGrid is well on its way to demonstrating that the Grid is the right solution for CERN and LHC computing • The intense flourishing of Grid projects in other disciplines demonstrates that the Grid is good for science • I believe that industry and commerce will be next, provided we manage to build secure Grids with internationally accepted standards • The recently launched Global Grid Forum should contribute to this process (www.gridforum.org)
