
Grids for Science in Europe: planning for sustainability

Jürgen Knobloch, CERN IT-Department
Presented at the Gridforum.nl Annual Business Day, Eindhoven, April 2, 2008



Presentation Transcript


1. Grids for Science in Europe
Jürgen Knobloch, CERN IT-Department
Presented at the Gridforum.nl Annual Business Day, Eindhoven, April 2, 2008
planning for sustainability

2. CERN
• Annual budget: ~1000 MSFr (~600 M€)
• Staff members: 2650, plus 270 Fellows, 440 Associates, and 8000 CERN users
• Member states: 20
• Mission: basic research on fundamental questions
• A high-energy accelerator is a giant microscope (p = h/λ)
• Collisions generate new particles (E = mc²)
• Recreating Big Bang conditions
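To make the "giant microscope" concrete, here is a worked example (a sketch using the standard de Broglie relation; the numbers are illustrative and not taken from the slide). The resolving power of a probe is set by its wavelength, and for a 7 TeV proton

\[
\lambda = \frac{h}{p} = \frac{hc}{pc} \approx \frac{1.24\ \mathrm{GeV\,fm}}{7000\ \mathrm{GeV}} \approx 1.8 \times 10^{-4}\ \mathrm{fm},
\]

about four orders of magnitude below the ~0.8 fm proton radius, which is why higher collision energy means a sharper view of substructure.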

3. Large Hadron Collider (LHC)
• 27 km circumference
• Cost: ~3000 M€ (+ detectors)
• Proton-proton (or lead-ion) collisions at 7+7 TeV
• Bunches of 10^11 protons cross every 25 ns
• 600 million collisions per second
• Physics questions: origin of mass (Higgs?), dark matter, matter-antimatter symmetry, forces and supersymmetry, the early universe (quark-gluon plasma), …
(Slide annotations: 1.9 K operating temperature; 10^-9 to 10^-12; 16 micron squeeze)

4. LHC gets ready …

5. LHC Computing Challenge
• Signal/noise: 10^-9
• Data volume: high rate × large number of channels × 4 experiments → 15 PetaBytes of new data each year
• Compute power: event complexity × number of events × thousands of users → 100k CPUs (cores)
• Worldwide analysis & funding: computing funded locally in major regions and countries; efficient analysis everywhere → GRID technology
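A quick consistency check of these figures (a minimal sketch; the aggregate recording rate and the 10^7 live seconds per accelerator year are illustrative assumptions, not numbers from the slides):

# Back-of-envelope check of the LHC data volume quoted on this slide.
# Assumptions (for illustration only): ~1.5 GB/s aggregate recording
# rate across the 4 experiments, ~1e7 live seconds per year.

SECONDS_PER_YEAR = 1e7      # typical accelerator "live" time per year
AGGREGATE_RATE_GBS = 1.5    # GB/s summed over the four experiments

annual_pb = AGGREGATE_RATE_GBS * SECONDS_PER_YEAR / 1e6   # 1 PB = 1e6 GB
print(f"~{annual_pb:.0f} PB/year")   # -> ~15 PB/year, matching the slide

# Slide 11's 1.25 GB/s heavy-ion recording rate gives the same order:
print(f"~{1.25 * SECONDS_PER_YEAR / 1e6:.1f} PB")   # -> ~12.5 PB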

6. Requirements match?
• Tape and disk requirements: more than 10 times what CERN can provide
• Substantial computing required outside CERN

7. Timeline: Grids
(Timeline diagram.) US projects: GriPhyN, iVDGL, PPDG → Grid3 → OSG. European projects: EU DataGrid → EGEE 1 → EGEE 2 → EGEE 3. LHC computing: LCG 1 → LCG 2 → WLCG, inter-operating with OSG. Milestones along the way: Data Challenges, Service Challenges, CCRC08 (Common Computing Readiness Challenge), cosmics, first physics.

8. WLCG Collaboration
• The Collaboration: 4 LHC experiments; ~250 computing centres; 12 large centres (Tier-0, Tier-1); 38 federations of smaller "Tier-2" centres; growing to ~40 countries; grids: EGEE, OSG, NorduGrid
• Technical Design Reports: WLCG and the 4 experiments, June 2005
• Memorandum of Understanding: agreed in October 2005
• Resources: 5-year forward look

9. Events at LHC
• Luminosity: 10^34 cm^-2 s^-1
• Bunch crossings at 40 MHz (every 25 ns)
• ~20 events overlaying per crossing
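The pile-up number follows from luminosity times the inelastic proton-proton cross-section (a sketch; the ~60 mb cross-section is a round pre-LHC estimate, not a figure from the slide):

\[
R = L\,\sigma_{\mathrm{inel}} \approx 10^{34}\ \mathrm{cm^{-2}s^{-1}} \times 6\times10^{-26}\ \mathrm{cm^{2}} = 6\times10^{8}\ \mathrm{s^{-1}},
\]

which matches the 600 million collisions per second quoted on slide 3; dividing by the 4×10^7 s^-1 crossing rate gives roughly 15-20 overlapping events per crossing, as on this slide.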

10. Trigger & Data Acquisition

11. Data Recording: 1.25 GB/sec (ions)

12. Tier-0, Tier-1, Tier-2
• Tier-0 (CERN): data recording, first-pass reconstruction, data distribution
• Tier-1 (11 centres): permanent storage, re-processing, analysis
• Tier-2 (>250 centres): simulation, end-user analysis
• Target data transfer out of Tier-0: 2 GB/sec
Note: for a site to be reliable, many things have to work simultaneously!
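What the 2 GB/sec target implies per site, as a minimal sketch (the even split across Tier-1s is an assumption for illustration; real shares were set by the experiments' computing models and MoU pledges):

# Fan-out implied by the Tier-0 export target on this slide.

TIER0_EXPORT_GBS = 2.0   # target aggregate rate out of CERN (slide figure)
N_TIER1 = 11             # number of Tier-1 centres (slide figure)

per_site_mbs = TIER0_EXPORT_GBS * 1000 / N_TIER1
per_site_tb_day = per_site_mbs * 86400 / 1e6   # MB/s -> TB/day

print(f"~{per_site_mbs:.0f} MB/s sustained per Tier-1 link")   # ~182 MB/s
print(f"~{per_site_tb_day:.0f} TB/day per Tier-1")             # ~16 TB/day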

13. Middleware
• Security: Virtual Organization Management (VOMS), MyProxy
• Data management: file catalogue (LFC), File Transfer Service (FTS), Storage Element (SE), Storage Resource Management (SRM)
• Job management: Workload Management System (WMS), Logging and Bookkeeping (LB), Computing Element (CE), Worker Nodes (WN)
• Information system: BDII (Berkeley Database Information Index) and R-GMA (Relational Grid Monitoring Architecture) aggregate service information from multiple grid sites; monitoring has since moved to SAM (Site Availability Monitoring)
• Monitoring & visualization: GridView, Dashboard, GridMap, etc.
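A minimal sketch of how a user exercises this stack end to end, wrapped in Python (the VO name "myvo" and the file hello.jdl are hypothetical; voms-proxy-init, glite-wms-job-submit and glite-wms-job-status are the gLite-era command-line tools corresponding to the components above):

import subprocess

def run(cmd):
    # Run a command, fail loudly on error, and return its standard output.
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

# Security: obtain a VOMS proxy credential for the virtual organization.
run(["voms-proxy-init", "--voms", "myvo"])

# Job management: submit a job described in JDL to the WMS
# (-a auto-delegates the proxy). hello.jdl might contain, e.g.:
#   Executable = "/bin/hostname";
#   StdOutput = "std.out";
#   OutputSandbox = {"std.out"};
out = run(["glite-wms-job-submit", "-a", "hello.jdl"])
job_id = next(line for line in out.splitlines() if line.startswith("https://"))

# Logging and Bookkeeping: query the job's state.
print(run(["glite-wms-job-status", job_id]))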

14. Increasing workloads
• More than one third of the workload is now non-LHC
• A factor-3 capacity ramp-up is in progress for LHC operation

15. Many Grid Applications
At present there are about 20 applications from more than 10 domains on the EGEE Grid infrastructure:
• Astronomy & astrophysics: MAGIC, Planck
• Computational chemistry
• Earth sciences: earth observation, solid-earth physics, hydrology, climate
• Fusion
• High-energy physics: 4 LHC experiments (ALICE, ATLAS, CMS, LHCb), BaBar, CDF, DØ, ZEUS
• Life sciences: bioinformatics (drug discovery, GPS@, Xmipp_MLrefine, etc.)
• Condensed-matter physics
• Computational fluid dynamics
• Computer science/tools
• Civil protection
• Finance (through the Industry Task Force)

16. Grid Applications
(Collage of application examples: seismology, medicine, chemistry, astronomy, particle physics, fusion.)

17. Computing at the Terra-Scale
Tier-1 centres: TRIUMF (Canada); GridKa (Germany); IN2P3 (France); CNAF (Italy); SARA/NIKHEF (NL); Nordic Data Grid Facility (NDGF); ASCC (Taipei); RAL (UK); BNL (US); FNAL (US); PIC (Spain)

18. Evolution
(Diagram: e-Infrastructure evolving from national to European to global scale.)

19. Sustainability

20. European Grid Initiative
Goal: long-term sustainability of grid infrastructures in Europe
Approach: establishment of a new federated model bringing together NGIs to build the EGI Organisation
EGI Organisation:
• Coordination and operation of a common multi-national, multi-disciplinary grid infrastructure
• To enable and support international grid-based collaboration
• To provide support and added value to NGIs
• To liaise with corresponding infrastructures outside Europe
www.eu-egi.org

21. EGI_DS Consortium members
• Johannes Kepler Universität Linz (GUP)
• Greek Research and Technology Network S.A. (GRNET)
• Istituto Nazionale di Fisica Nucleare (INFN)
• CSC - Scientific Computing Ltd. (CSC)
• CESNET, z.s.p.o. (CESNET)
• European Organization for Nuclear Research (CERN)
• Verein zur Förderung eines Deutschen Forschungsnetzes - DFN-Verein (DFN)
• Science & Technology Facilities Council (STFC)
• Centre National de la Recherche Scientifique (CNRS)

22. EGI: European Grid Initiative
• The EGI Design Study started in September 2007 to establish a sustainable pan-European grid infrastructure after the end of EGEE-3 in 2010
• The main foundations of EGI are 37 National Grid Initiatives (NGIs)
• The project is funded by the European Commission's 7th Framework Programme
www.eu-egi.org

23. EGI_DS Schedule
Duration: 27 months. Timeline milestones spanning 2008-2010, in approximate order: start of the EGI Design Study; develop the EGI proposal; EU call deadline for the EGI proposal; start of EGEE-III; final draft of the EGI blueprint proposal; NGIs signing the proposal; submission of the EGI blueprint proposal; EGI entity in place; EGEE-III transition to an EGI-like structure; EGI operational. (EGEE-II and EGEE-III each span two years.)

24. EGI Functionality Overview
• Management, outreach & dissemination: representation of EU grid efforts
• Operations, resource provisioning & security
• Application support & training
• Middleware (build & test, component selection/validation/deployment)
• Standardisation & policies
• Industry take-up

25. EGI Management Structure
(Organisation chart.) Political bodies (such as e-IRG, EU, ESFRI, national bodies, ...) and a Strategy Committee advise the EGI Council, supported by further Advisory Committees. Within the EGI Organisation, a Director and staff oversee a CTO (Development), a CAO (Admin + PR) and a COO (Operations), each heading the corresponding group. www.eu-egi.org

26. Characteristics of NGIs (work in progress; things may change!)
Each NGI:
• should be a recognized national body with a single point of contact
• should mobilise national funding and resources
• should ensure an operational national grid infrastructure
• should support user communities (application-independent, and open to new user communities and resource providers)
• should contribute and adhere to international standards and policies
Responsibilities between NGIs and EGI are split so as to be federated and complementary.

27. EGI Operations (work in progress; things may change!)
• NGIs operate their national grid infrastructures, including monitoring and accounting, following a general SLA
• Each NGI has full responsibility towards the other NGIs but acts independently
• EGI provides a coordination level to guarantee effective, integrated operations through the adoption of standard services

28. EGI Resource Strategy (work in progress; things may change!)
• EGI itself owns very few hardware resources
• Resource provisioning is the responsibility of the NGIs → freedom to choose what resources to offer
• Development of a strategy for simple entry of new Virtual Organizations

29. EGI Middleware (work in progress; things may change!)
• Convergence of middleware stacks towards full interoperability (through general standards interfaces)
• Distinction between core and high-level functions → EGI coordinates the core services
• Standards-compliance tests for core services provided by EGI
• A common build & test system is needed in Europe

30. EGI Application Support (work in progress; things may change!)
• Application support implemented and provided by the NGIs
• EGI performs a coordination role: validation of user support by NGIs, facilitating the exchange of experts between NGIs, organization of user forums
• EGI to coordinate training events

31. EGI Funding (work in progress; things may change!)
The EGI organisation has three major, separate cost centres:
• general central services, funded through contributions from the NGIs
• service provisioning, funded through service charges
• developments, funded through project grants
In general, no cross-subsidies are possible between these cost centres.

32. Issues being discussed
• Small vs. large central EGI
• Roles of a "core EGI" vs. the NGIs
• Transition from an (EU) project-funded grid to self-sustained operations
• Maintaining the available expertise
• Choice of legal structure and location

33. EGI: European Grid Initiative
• The future EGI Organisation will be the "glue" between the various grid communities in Europe and beyond
• EGI_DS defines the required mechanisms and functionalities of the EGI Organisation
→ Towards a sustainable environment for the application communities using grid infrastructures in their everyday work
