
LHCb Computing Activities in UK Current activities UK GRID activities RICH s/w activities



Presentation Transcript


  1. LHCb Computing Activities in UK • Current activities • UK GRID activities • RICH s/w activities

  2. Architecture of LHCb Computing Model - based on distributed multi-Tier regional centres • Processing of real data at CERN (production centre) • Regional-centre simulation production

  3. Present Facilities: • Liverpool MAP - 300 node facility • RAL NT farm - closed in February after LHCb MC production • RAL CSF facility - 120 node Linux facility • RAL datastore - IBM 3494 tape robot • RAL NT delivered approx 100k RAWH events/week • MAP can deliver ~16k DST2 events/week if dedicated to LHCb

  4. [Diagram: LHCb-UK “Testbed”, sites marked Exists / Planned / Proposed: CERN (pcrd25.cern.ch, lxplus009.cern.ch); RAL CSF (120 Linux CPUs, IBM 3494 tape robot); RAL DataGrid Testbed; Liverpool MAP (300 Linux CPUs); institutes: RAL (PPD), Bristol, Imperial College, Oxford, Cambridge; Glasgow/Edinburgh “Proto-Tier 2”]

  5. 250 PC99 machines, JREI funding ~£0.4M

  6. Prototype “testbed” Architecture • Based around existing production facilities • Intel PCs running Red Hat Linux 6.1 • Mixture of batch systems (LSF at CERN, PBS at RAL, FCS at MAP) • Globus 1.1.3 everywhere • Standard file transfer tools (e.g. globus-rcp, GSIFTP) • GASS servers for secondary storage? • Java tools for controlling production, bookkeeping, etc. • MDS/LDAP for bookkeeping database(s)

  7. GRID Activities • Globus has been used to remotely submit scripts and run SICBMC (CERN/RAL/Liverpool) • LHCb batch (PBS) jobs run on RAL CSF via Globus • ongoing investigations of the Globus toolkit: globus-rcp, GSIFTP, RAL datastore as a GASS server • work ongoing to “open up” MAP for general use, i.e. analysis-type activities • co-ordinating LHCb external computing - work ongoing on the baseline DataModel • work ongoing to harness University NT resources • major input to the GridPP proposal (the most detailed forward look of all UK LHC experiments) - disappointment with the emphasis of the final document

  8. Summary of current UK LHCb resources • 2,775 SI95 shared for LHCb between RAL & Liverpool (the lion’s share being MAP) • will increase to 3,400 SI95 with the RAL upgrade in March • the “Scotch” facility will add a further 2,250 SI95 by end of 2001 •  total of 5,650 SI95 by end of 2001 • by end of 2001, 3.5 TB of disk space distributed across RAL/Edinburgh/Liverpool • at RAL, an additional 13 TB of robotic tape by year end

  9. Estimation of needs by 2003 • Assume 10-15% of the 2006-2007 resources are needed in 2003, and a UK contribution of 20-25%  10,800-20,250 SI95 needed, 7-14 TB of disk, 23-48 TB of “robotic” storage • Figures consistent with the “bottom-up” estimation performed at CERN • Factors of 2-4 greater than the resources available by end of 2001
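The CPU estimate above can be checked with a short back-of-envelope script. The total 2006-2007 LHCb-wide requirement is not stated on the slide; the value of ~540,000 SI95 used below is an assumption back-solved from the quoted 10,800-20,250 SI95 range, and is flagged as such in the code.

```python
# Back-of-envelope check of the slide-9 estimate.
# TOTAL_SI95 is an ASSUMPTION back-solved from the quoted range;
# it is not given on the slide itself.
TOTAL_SI95 = 540_000            # assumed LHCb-wide 2006-07 CPU requirement

need_2003 = (0.10, 0.15)        # 10-15% of 2006-07 resources needed in 2003
uk_share = (0.20, 0.25)         # assumed UK contribution of 20-25%

lo = TOTAL_SI95 * need_2003[0] * uk_share[0]
hi = TOTAL_SI95 * need_2003[1] * uk_share[1]
print(f"UK 2003 CPU need: {lo:,.0f}-{hi:,.0f} SI95")

# Compare with the UK total available by end of 2001 (slide 8)
available_2001 = 5_650
print(f"shortfall factor: {lo / available_2001:.1f}-{hi / available_2001:.1f}")
```

The resulting shortfall factor of roughly 1.9-3.6 matches the slide's statement that the need is "factors 2-4 greater" than the resources available by end of 2001.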

  10. RICH software • UK is co-ordinating the s/w effort • current FORTRAN simulation written by the UK • the FORTRAN simulation is still used in current RICH studies (e.g. alignment studies & optimisation wrt T11) • since the TDR, a move towards OO - UK prominent (e.g. GEANT4 studies, interfacing current FORTRAN databanks to the OO framework…) • investigation of fast RICH reconstruction

  11. RICH simulation with OO software (work just getting underway) • Simulation of the testbeam setup: testing of Čerenkov radiation within GEANT4 • Interfacing information from the current FORTRAN simulation into the OO environment • Interest in photodetector simulation in the OO environment

  12. Fast Online RICH Particle Identification • developing a PID algorithm complementary to the offline method, for use in online applications • online PID searches for potential Čerenkov rings for a single track; much less computation is involved, and it is therefore faster • initial results with the algorithm are encouraging and studies into potential online applications are progressing • possible gains in global PID performance when online results are merged with the global algorithm, for example in background suppression
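The single-track ring search above rests on standard Čerenkov kinematics: for each mass hypothesis the expected ring angle follows from cos θ = 1/(nβ), so only a narrow annulus of photodetector hits needs examining per hypothesis. A minimal sketch of that prediction step, assuming a C4F10-like gas radiator with n ≈ 1.0014 (an illustrative value, not taken from the slides):

```python
import math


def cherenkov_angle(p_gev, mass_gev, n):
    """Expected Cherenkov angle (rad) for a track of momentum p_gev (GeV/c)
    and mass mass_gev (GeV/c^2) in a radiator of refractive index n.
    Returns None below threshold (n * beta <= 1): no ring is emitted."""
    beta = p_gev / math.sqrt(p_gev**2 + mass_gev**2)
    cos_theta = 1.0 / (n * beta)
    if cos_theta > 1.0:
        return None
    return math.acos(cos_theta)


# Illustrative pi/K/p hypotheses for a 10 GeV/c track in a gas radiator.
# n = 1.0014 is an assumed C4F10-like index; masses are PDG values.
N_RADIATOR = 1.0014
for name, mass in [("pi", 0.1396), ("K", 0.4937), ("p", 0.9383)]:
    theta = cherenkov_angle(10.0, mass, N_RADIATOR)
    print(name, "no ring" if theta is None else f"{1e3 * theta:.1f} mrad")
```

At this momentum the pion ring is wider than the kaon ring, and the proton is below threshold, which is exactly the separation the online single-track search exploits.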

  13. Summary • UK in a prominent position within LHCb, leading the Grid effort • vital that this momentum is maintained and built upon through the “GridPP” effort • UK in a leading position in RICH s/w - the UK effort is now moving into OO s/w and use of the RICH “online”
