Beyond GridPP2

Presentation Transcript


  1. Beyond GridPP2 • Tony Doyle • Collaboration Board

  2. Outline • Beyond GridPP2: what are the estimated resource requirements in the LHC exploitation era? • Background: • PMB preliminary discussions in September • PPAP presentation in October: resources needed in the medium-long term? • (09/07-08/10) Exploitation, medium-term • (09/10-08/14) Exploitation, long-term • Focus on resources needed in 2008 • GridPP Oversight Committee outline on Monday • Initial ideas for discussion here.

  3. Tier 0 and LCG: Foundation Programme • Aim: build upon Phase 1 • Ensure development programmes are linked • Project management: GridPP and LCG • Shared expertise • F. LHC Computing Grid Project (LCG Phase 2) [review]: • LCG establishes the global computing infrastructure • Allows all participating physicists to exploit LHC data • Earmarked UK funding being reviewed • Required foundation: LCG deployment

  4. Tier 0 and LCG: RRB meeting (October) • Jos Engelen proposal to RRB members (Richard Wade [UK]) on how a 20 MCHF shortfall for LCG Phase II can be funded • Spain to fund ~2 staff. Others at this level? • Funding from UK (£1m), France, Germany and Italy for 5 staff. Others? • Now vitally important that the LCG effort established predominantly via UK funding (40%) is sustained at this level (~10%) • Proposal to SC in preparation • Value to the UK? • Required foundation: LCG deployment

  5. Grid and e-Science Support in 2008 • What areas require support? Four layers: I. Experiment Layer; II. Application Middleware; III. Grid Middleware; IV. Facilities and Fabrics • IV: Running the Tier-1 data centre • IV: Hardware annual upgrade • IV: Contribution to Tier-2 sysman effort for (non-PPARC) hardware • IV: Frontend Tier-2 hardware • IV: Contribution to Tier-0 support • III: One M/S/N expert in each of 6 areas • III: Production manager and four Tier-2 coordinators • II: Application/Grid experts (UK support) • I: ATLAS computing MoU commitments and support • I: CMS computing MoU commitments and support • I: LHCb core tasks and computing support • I: ALICE computing support • I: Future experiments adopt e-Infrastructure methods • No GridPP management (assumes production mode established and management devolved to Institutes)

  6. PPARC Financial Input: GridPP1 Outturn • Grid Application Development: LHC and US experiments + Lattice QCD • UK Tier-1/A Regional Centre: hardware and manpower • Management, travel etc. • LHC Computing Grid Project (LCG): applications, fabrics, technology and deployment • European DataGrid (EDG): middleware development

  7. PPARC Financial Input: GridPP2 Components • A. Management, Travel, Operations • B. Middleware, Security, Network Development • C. Grid Application Development: LHC and US experiments + Lattice QCD + Phenomenology • D. Tier-2 Deployment: 4 Regional Centres, M/S/N support and system management • E. Tier-1/A Deployment: hardware, system management, experiment support • F. LHC Computing Grid Project (LCG Phase 2) [review]

  8. IV. Hardware Support • Between October and December the UK Tier-1 LHC estimates were reduced (see Dave's talk): now more realistic • Global Tier-1 shortfall in October was -13% for CPU and -55% for disk • UK Tier-1 estimated input in December now corresponds to ~20% (~7%) of global disk (CPU) • LCG MoU commitments required by April 2005 • UK Tier-2 CPU and disk resources are significant • Rapid physics analysis turnaround is a necessity • Priority is to ensure that ALL required software (experiment, middleware, OS) is routinely deployed on this hardware well before 2008

  9. III. Middleware, Security and Network • The six M/S/N areas: Network Monitoring, Configuration Management, Grid Data Management, Storage Interfaces, Information Services, Security • Require some support expertise in each of these areas in order to maintain the Grid • M/S/N builds upon UK strengths as part of international development

  10. II. Application Middleware • AliEn • BaBar • GANGA • Phenomenology • Lattice QCD • SAMGrid • CMS • Require some support expertise in each of these areas in order to maintain the Grid applications • Need to develop e-Infrastructure portals for new experiments starting up in the exploitation era

  11. ATLAS UK e-science forward look (Roger Jones) • Current core and infrastructure activities: • Run Time Testing and Validation Framework, tracking and trigger instantiations • Provision of ATLAS distributed analysis & production tools • Production management • GANGA development • Metadata development • ATLFast simulation • ATLANTIS event display • Physics software tools • ~11 FTEs, mainly ATLAS e-science with some GridPP & HEFCE • Current tracking and trigger e-science: • Alignment effort ~6 FTEs • Core software ~2.5 FTEs • Tracking tools ~6 FTEs • Trigger ~2 FTEs • The current e-science funding will only take us (at best) to first data • Expertise required for the real-world problems and maintenance • Note that for the HLT, installation and commissioning will continue into the running period because of staging • Both will move from development to optimisation & maintenance • Need ~15 FTE (beyond existing rolling grant) in 2007/9: continued e-science/GridPP support

  12. CMS UK e-science forward look (Dave Newbold) • Computing system / support: development and tuning of computing model + system; management • User support for T1 / T2 centres (globally); liaison with LCG ops • Monitoring / DQM: online data gathering and 'expert systems' for CMS tracker, trigger • Tracker / ECAL software: installation and calibration support; low-level reconstruction codes • Data management: Phedex system for bulk offline data movement and tracking • System-level metadata; movement of HLT farm data online (new area) • Analysis system: CMS-specific parts of distributed analysis system on LCG • NB: 'first look' estimates; will inevitably change as we approach running • Need ~9 FTE (beyond existing rolling grant) in 2007/9: continued e-science/GridPP support

  13. LHCb UK e-science forward look (Nick Brook) • Current RICH & VELO e-science: • RICH: UK provides the bulk of the RICH s/w team, including the s/w coordinator; ~7 FTEs, split roughly 50:50 between e-science funding and rolling grant/HEFCE • VELO: UK provides the bulk of the VELO s/w team, including the s/w coordinator; ~4 FTEs, split roughly 50:50 between e-science funding and rolling grant/HEFCE • ALL essential alignment activities for both detectors run through e-science funding • Will move from development to maintenance and operational alignment: ~3 FTEs for alignment in 2007-9 • Current core activities: • GANGA development • Provision of DIRAC & production tools • Development of conditions DB • The production bookkeeping DB • Data management & metadata • Tracking • Data challenge production manager • ~10 FTEs, mainly GridPP, e-science and studentships, with some HEFCE support • Will move from development to maintenance phase; UK pro rata share of LHCb core computing activities ~5 FTEs • Need ~9 FTE (core + alignment + UK support) in 2007/9: continued e-science support

  14. Grid and e-Science funding requirements • Simple model (an illustrative sketch of such a roll-up follows below)
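The "simple model" itself is not reproduced in this transcript. Purely as an illustrative sketch of how such a model could roll up the 2008 support posts from slide 5 into an annual figure, the snippet below adds a staff line to a hardware line. The FTE counts marked "assumed", the cost rate, and the hardware line are hypothetical placeholders, not GridPP numbers; only the two counts taken from slide 5 come from the source.

    # Illustrative roll-up only: the rate and several counts are invented
    # placeholders (see above); the layer labels follow slide 5.
    FTE_RATE = 0.055  # GBP millions per FTE per year (assumed, fully loaded)

    posts = {
        "IV  Tier-1 centre, Tier-0 and Tier-2 support":   14,  # assumed
        "III M/S/N experts, one per area (slide 9)":       6,  # from slide 5
        "III Production manager + 4 Tier-2 coordinators":  5,  # from slide 5
        "II  Application/Grid experts (UK support)":       9,  # assumed
        "I   Experiment MoU commitments and support":     20,  # assumed
    }

    staff_cost = sum(posts.values()) * FTE_RATE  # staff line, GBP m p.a.
    hardware_cost = 2.9                          # hardware line, assumed
    total = staff_cost + hardware_cost

    print(f"staff: GBP {staff_cost:.2f}m p.a. ({sum(posts.values())} FTE)")
    print(f"hardware: GBP {hardware_cost:.2f}m p.a.")
    print(f"total: GBP {total:.2f}m p.a.")

With the real FTE counts and rates in place, a roll-up of this shape is what would produce the £5.6m Grid and £2.7m e-Science totals quoted on the next slide.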

  15. Grid and e-Science funding requirements • Priorities in context of a financial snapshot in 2008 • Grid (£5.6m p.a.) and e-Science (£2.7m p.a.) • Assumes no GridPP project management • Savings? EGEE Phase 2 (2006-08) may contribute • UK e-Science context is: NGS (National Grid Service), OMII (Open Middleware Infrastructure Institute), DCC (Digital Curation Centre) • Timeline? To be compared with Road Map • Not a bid: preliminary input
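As an arithmetic check, the two headline figures sum exactly to the 2008 estimate quoted on the Summary slide: £5.6m + £2.7m = £8.3m p.a.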

  16. Grid and e-Science Exploitation Timeline? • PPAP initial input Oct 2004 • Science Committee initial input • PPARC call assessment (2007-2010) 2005 • Science Committee outcome Oct 2005 • PPARC call Jan 2006 • PPARC close of call May 2006 • Assessment Jun-Dec 2006 • PPARC outcome Dec 2006 • Institute recruitment/retention Jan-Aug 2007 • Grid and e-Science exploitation Sep 2007 onwards • Note: if the assessment from PPARC internal planning differs significantly from this preliminary advice from PPAP and SC, then earlier planning is required.

  17. Summary • Resources needed for Grid and e-Science in the medium-long term? • Current Road Map: ~£6m p.a. • Resources needed in 2008 estimated at £8.3m p.a. • Timeline for decision-making outlined • PP community-supported strategy required
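Read together, the Summary figures set the scale of the decision: £8.3m against the ~£6m Road Map leaves roughly £2.3m p.a. of additional funding to be found, an uplift of about 40%, assuming the ~£6m Road Map figure is held flat.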
