GridPP Status GridPP20 Collaboration Meeting, Dublin

Presentation Transcript


  1. GridPP Status GridPP20 Collaboration Meeting, Dublin David Britton, University of Glasgow, 11/March/2008

  2. Status [Maps of Grid sites in March 2007 and March 2008.] Status in 2007: 177 sites, 32,412 CPUs, 13,282 TB storage. Monitoring via Grid Operations Centre.

  3. 2007 Outturn In 2007, UKI provided 17.9% of the EGEE CPU (down from 27% in 2006).

  4. CPU by Subregion [Charts: CPU delivered by subregion: London, NorthGrid, SouthGrid, ScotGrid, GridIreland, and the Tier-1.]

  5. GridView

  6. CPU Efficiency
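
CPU efficiency here is the usual accounting ratio: total CPU time over total wall-clock time for completed jobs. A minimal sketch of that calculation, assuming a hypothetical (cpu_seconds, wall_seconds) record per job rather than any real accounting schema:

```python
# Sketch: CPU efficiency = total CPU time / total wall-clock time.
# The (cpu_seconds, wall_seconds) job records are hypothetical.

def cpu_efficiency(jobs):
    """jobs: iterable of (cpu_seconds, wall_seconds) per completed job."""
    total_cpu = sum(cpu for cpu, _ in jobs)
    total_wall = sum(wall for _, wall in jobs)
    return total_cpu / total_wall if total_wall else 0.0

# Three example jobs: (3500 + 7000 + 900) / (4000 + 8000 + 2000) = 81.4%
jobs = [(3500, 4000), (7000, 8000), (900, 2000)]
print(f"CPU efficiency: {cpu_efficiency(jobs):.1%}")
```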

  7. Storage

  8. Data Transfers to RAL

  9. SAM Tests

  10. ATLAS User Tests Steve’s ATLAS tests: percentage of successful ATLAS test jobs each day, averaged over all GridPP sites, for 2007 and 2008.
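
The per-day success percentage behind a plot like this reduces to a simple aggregation over test-job outcomes. A sketch, assuming a hypothetical (date, site, passed) record per test job; the site names are only examples:

```python
# Sketch: percentage of successful test jobs per day across all sites.
# The (date, site, passed) record format is hypothetical.
from collections import defaultdict

def daily_success_rate(results):
    """results: iterable of (date_str, site, passed_bool)."""
    totals = defaultdict(lambda: [0, 0])  # date -> [passed, total]
    for date, _site, passed in results:
        totals[date][0] += int(passed)
        totals[date][1] += 1
    return {d: 100.0 * p / n for d, (p, n) in sorted(totals.items())}

results = [
    ("2008-03-01", "UKI-SCOTGRID-GLASGOW", True),
    ("2008-03-01", "UKI-LT2-IC-HEP", False),
    ("2008-03-02", "UKI-SCOTGRID-GLASGOW", True),
]
for day, pct in daily_success_rate(results).items():
    print(f"{day}: {pct:.0f}% of test jobs succeeded")
```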

  11. ATLAS User Tests [Same plot, annotated: one dip labelled “Power failure”, another labelled “CCRC?”.]

  12. CCRC Combined Computing Readiness Challenge: Feb 4th to 29th and May 5th to 30th. “It went better than we expected but not as well as we hoped.” (Jamie Shiers to the LCG GDB, March 5th.) Concurrent data exports from all experiments occurred for several days, with a new peak record of 2.3 GB/s (roughly 1 GB/s more than the average sustained during SC4).
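
For scale, a sustained 2.3 GB/s corresponds to roughly 200 TB exported per day:

```python
# Back-of-envelope: daily export volume at the CCRC peak rate.
rate_gb_per_s = 2.3
seconds_per_day = 24 * 3600              # 86,400 s
tb_per_day = rate_gb_per_s * seconds_per_day / 1000
print(f"~{tb_per_day:.0f} TB/day")       # ~199 TB/day
```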

  13. CCRC - CMS (Will hear more from Stuart in the next session)

  14. Critical Services In preparation for the CCRC, a set of Experiment Critical Services was defined for each experiment.

  15. SRM v2.2 Related to, and concurrent with, the CCRC, SRM v2.2 has been rolled out. Plot shows the number of SRM v2.2 endpoints as advertised in the WLCG information system over the CCRC period. (Will hear more from Jens in the next session)
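
Endpoint counts like this could be reproduced by querying a top-level BDII for GlueService entries. A hedged sketch using python-ldap; the host, base DN, and filter values are assumptions, and GLUE publishing of SRM versions varied across deployments:

```python
# Sketch: count SRM v2.2 endpoints advertised in a top-level BDII.
# Host, base DN, and attribute values are assumptions, not a fixed recipe.
import ldap  # python-ldap

bdii = ldap.initialize("ldap://lcg-bdii.cern.ch:2170")  # assumed BDII host/port
entries = bdii.search_s(
    "o=grid", ldap.SCOPE_SUBTREE,
    "(&(objectClass=GlueService)(GlueServiceType=srm)"
    "(GlueServiceVersion=2.2.0))",
    ["GlueServiceEndpoint"],
)
print(f"{len(entries)} SRM v2.2 endpoints advertised")
```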

  16. Meanwhile... Since GridPP19 in Ambleside:
  • The GridPP3 grants were issued.
  • The Tier-2 hardware allocations were made.
  • The Tier-2 hardware grants were submitted.
  • The Tier-1 review was performed.
  • The Oversight Committee met in October.
  • The Tier-1 hardware for 2008 has been procured.
  • A funding crisis has hit UK particle physics.

  17. Tier-1 Review The final GridPP site review, of the Tier-1, took place on Nov 21st. Review panel:
  • GridPP Project Leader (Chairman): Prof. Tony Doyle
  • GridPP Project Manager: Prof. David Britton
  • Chair of GridPP User Board: Dr Glenn Patrick (RAL)
  • Associate Chair of GridPP User Board: Dr Dave Newbold
  • ATLAS-UK Computing Chair: Prof. Roger Jones
  • Director of Particle Physics, STFC: Prof. Norman McCubbin
  • Head of Particle Physics Computing, STFC (Secretary): Dr David Kelsey
  • STFC e-Science Programme Manager: Trish Mullins
  • Tier-0 Representative: Dr Bernd Panzer-Steindel
  • Tier-1 Representative: Dr Michael Ernst

  18. Tier-1 Feedback The Tier-1 was complimented in 10 different areas:
  • Availability and reliability of the UK’s Tier-1
  • Improvement in CASTOR following dedicated efforts
  • UK’s influence and leadership in global deployment
  • The improved procurement process
  • The progress towards a new machine room
  • Collaboration in the wider e-science context
  • Enabling VOs and VO-specific services
  • The dedicated and skilled Tier-1 team
  • The overall expertise of the Tier-1 management
  • The presentations to the review panel

  19. Tier-1 Feedback Concerns were expressed in 11 areas:
  • The short-term effort needed on CASTOR
  • Out-of-hours response planning
  • User support
  • Formal management procedures
  • A proactive as opposed to reactive approach
  • Integration with the wider deployment team
  • Lack of a service delivery plan
  • Lack of comprehensive disaster planning
  • Lack of metrics to measure delivery
  • Site networking problems
  • The lack of a Tier-1 dashboard
  An additional 20 specific recommendations were made. Many of these points have been, or are being, addressed.

  20. Oversight Committee The Oversight Committee met in October and expressed concerns about:
  • CASTOR
  • Disaster Planning
  • User Experience
  • 24 x 7 coverage at the Tier-1
  • Future Funding (application interface posts)
  One committee member also noted after the meeting that: "The role of the Oversight Committee is to monitor, review and advise. In our discussions we focused on areas where we had concerns or where we thought improvements were needed. I would like to balance this by saying that I think the GridPP project is being managed superbly well, and a very tricky task is being discharged professionally and successfully."

  21. Tier-1 Hardware In FY07 GridPP has spent £2.1m on Tier-1 hardware. 1.5 PB of disk has been delivered, installed, and tested by the vendor; it is currently undergoing the RAL 28-day load test. This more than doubles the currently available disk. 3,000 KSI2K of CPU has been purchased and is being delivered; this triples the currently available CPU. 1.0 PB of tape media and 12 drives have been purchased, plus other spend on non-capacity hardware items.

  22. Funding As mentioned by Steve, GridPP has been ranked med-high priority in the programme review exercise. Projects in the med-high category will be cut in the range 5% to 15%, but, in advance of formal notification, we anticipate GridPP will be at the lower end of this range. This will effectively be the fourth time we have been cut: 100% in the proposal → the 70% scenario; £1.2m of GridPP2 savings removed; £1.2m of Working Allowance spent. Given that the grants have already been issued, this will be extremely challenging to manage, but we do not anticipate job losses for people already in post.

  23. Summary The 2007 outturn was OK: we are not “ahead of the pack” but certainly “in the pack”. Current status is OK: CCRC was a success, but much more still needs to be done. The outlook is mixed: large quantities of new hardware are coming online, tempered by funding problems. The LHC is getting close; there are exciting times ahead (June 2008).
