
HEP Applications status in EELA




Presentation Transcript


  1. HEP Applications status in EELA Lukas Nellen I. de Ciencias Nucleares, UNAM 2nd EELA Workshop Itacuruça, Brasil

  2. Overview
  • Selected HEP applications
    • ALICE
    • LHCb
  • Status of installations
  • Future applications
  • Summary
  Itacuruça, 2nd EELA WS, 26.06.2006

  3. Selected HEP applications
  • Reminder: applications available or considered
  • Initial applications
    • ALICE: heavy-ion physics at the LHC (CIEMAT and UNAM)
    • LHCb: B physics at the LHC (UFRJ)
  • Other LHC applications
    • ATLAS, general purpose (UNLP)
    • CMS, general purpose (not in EELA, but has LA participants)
  • New projects
    • HEP theory (UTFSM)
    • Pierre Auger Observatory (UNAM and UNLP; others in EGEE or partner projects)

  4. VO boxes: extending the middleware
  • The experiments developed their Grid computing models in parallel with the EGEE middleware
  • The experiments need services not provided by LCG or gLite
  • Solution: the VO box
    • Installed at the resource centre (RC)
    • ALICE: one per RC; LHCb: at least one per Tier-1 centre
    • Basic installation by the RC; detailed configuration by the experiment
    • Runs experiment-specific services
    • Can hold the experiment's software installation

  5. The ALICE experiment
  • ALICE Collaboration: roughly half the size of ATLAS or CMS, about twice LHCb
    • ~1000 people, 30 countries, ~80 institutes
  • Detector: total weight 10,000 t; overall diameter 16.00 m; overall length 25 m; magnetic field 0.4 T
  • Trigger and data flow:
    • 8 kHz (160 GB/s): level 0, special hardware
    • 200 Hz (4 GB/s): level 1, embedded processors
    • 30 Hz (2.5 GB/s): level 2, PCs
    • 30 Hz (1.25 GB/s): data recording and offline analysis

  6. ALICE computational framework
  [Diagram: the AliRoot framework, built on ROOT. A Virtual MC interface abstracts the transport codes G3, G4 and FLUKA; event generators (EVGEN) include PYTHIA6, HIJING, ISAJET and MEVSIM. Detector modules (ITS, TPC, TRD, TOF, PHOS, EMCAL, PMD, MUON, ZDC, RICH, FMD, CRT, START, STRUCT) sit on the STEER core and feed AliSimulation and AliReconstruction; ESDs are analysed with AliAnalysis (HBTAN, JETAN, RALICE). The Grid layer is AliEn + LCG, with EELA attached.]

  7. ALICE job flow
  [Diagram: the user submits a job to the ALICE central services, which register it in the ALICE Job Catalogue and resolve data through the ALICE File Catalogue (GUID, LFC, SURL). On each site, the VO box runs the Computing Agent, an Optimizer and PackMan; the agent sends workload requests through the LCG RB and CE, the user job runs on a WN with file access via xrootd, and the output is registered back, with storage through SRM to the SA and MSS.]
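The diagram describes a pull model: jobs wait in the central ALICE Job Catalogue, and the site-side Computing Agent fetches work only when local resources are free. A minimal sketch of that idea (all class and job names here are hypothetical, not AliEn's actual API):

```python
import queue

class JobCatalogue:
    """Stand-in for the central ALICE task queue."""
    def __init__(self, jobs):
        self.q = queue.Queue()
        for j in jobs:
            self.q.put(j)

    def match(self):
        """Hand out a waiting job, or None if the queue is empty (pull model)."""
        try:
            return self.q.get_nowait()
        except queue.Empty:
            return None

class ComputingAgent:
    """Runs on the site VO box; pulls work only while worker nodes are free."""
    def __init__(self, catalogue, free_slots):
        self.catalogue = catalogue
        self.free_slots = free_slots
        self.done = []

    def run(self):
        while self.free_slots > 0:
            job = self.catalogue.match()
            if job is None:
                break                   # central queue empty: agent idles
            self.done.append(job)       # stand-in for execution on a WN
            self.free_slots -= 1
        return self.done

catalogue = JobCatalogue(["sim-001", "sim-002", "sim-003"])
agent = ComputingAgent(catalogue, free_slots=2)
print(agent.run())  # the agent pulls at most 2 jobs
```

The point of the pull model is that sites never receive work they cannot run: matching happens centrally, but only on an explicit request from a site with free capacity.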

  8. ALICE data challenge
  • Data challenge '06 is currently running
    • The last (!) exercise before data taking
  • The test of the system started with simulation
    • Up to 3600 jobs running in parallel
  • Next come reconstruction and analysis
  • An 8 h job on a 2.5 GHz Xeon produces ~800 MB. Currently installed RC links:
    • UNAM: 8 Mbit/s
    • CIEMAT: 240 Mbit/s
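A back-of-envelope check of what those links can sustain, using only the figures on the slide (one 8 h job producing ~800 MB; the helper names are illustrative):

```python
def rate_mbit_per_s(output_mb, duration_h):
    """Average output rate of one job in Mbit/s."""
    return output_mb * 8 / (duration_h * 3600)

def max_concurrent_jobs(link_mbit_per_s, job_rate):
    """How many jobs a link can drain at steady state (small epsilon
    guards against float truncation just below an integer)."""
    return int(link_mbit_per_s / job_rate + 1e-9)

job = rate_mbit_per_s(800, 8)          # ~0.22 Mbit/s per running job
print(round(job, 2))
print(max_concurrent_jobs(8, job))     # UNAM's 8 Mbit/s link: ~36 jobs
print(max_concurrent_jobs(240, job))   # CIEMAT's 240 Mbit/s link: ~1080 jobs
```

So an 8 Mbit/s uplink is the binding constraint long before a site runs out of CPUs, which is why the network requirements deserve the review called for in the conclusions.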

  9. ALICE distributed analysis
  [Diagram: a user job over many events starts with a File Catalogue query for the data set (ESDs, AODs). The Job Optimizer splits it into sub-jobs 1…n, grouped by the SE where the files are located; the Job Broker submits each sub-job to a CE with the closest SE. The sub-jobs run in parallel and produce output files 1…n, which a final file-merging job combines into the job output.]
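The split/merge flow in the diagram can be sketched as follows; the function names and the in-memory "catalogue" are purely illustrative stand-ins for the catalogue query, the sub-job execution and the merging step:

```python
from collections import defaultdict

def split_by_se(files):
    """Group catalogue entries by the SE hosting them (the Job Optimizer step).
    `files` is a list of (logical_file_name, storage_element) pairs."""
    groups = defaultdict(list)
    for lfn, se in files:
        groups[se].append(lfn)
    return dict(groups)

def run_subjob(se, lfns):
    """Stand-in for a sub-job submitted to the CE closest to `se`."""
    return f"output-{se}({len(lfns)} files)"

def merge(outputs):
    """Stand-in for the final file-merging job."""
    return " + ".join(sorted(outputs))

catalogue = [("esd1.root", "SE_A"), ("esd2.root", "SE_B"), ("esd3.root", "SE_A")]
groups = split_by_se(catalogue)
outputs = [run_subjob(se, lfns) for se, lfns in groups.items()]
print(merge(outputs))
```

The design choice being illustrated: jobs are moved to the data, not the other way around, so each sub-job reads only from its local SE and the only wide-area traffic is the (much smaller) merge step.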

  10. ALICE in EELA
  • Two groups: CIEMAT (starting) and UNAM
  • At least three resource centres are set up for ALICE jobs
    • Each has to provide a VO box
  • Starting a collaboration on physics for EELA
    • Define a physics problem of interest
    • Run it on the EELA infrastructure

  11. The LHCb experiment
  [Diagram of the LHCb detector.]

  12. LHCb computing model
  [Diagram of the LHCb computing model.]

  13. LHCb in EELA
  • One group: UFRJ
  • Can run on various resource centres
    • The VO box can be shared
  • No plans for EELA-initiated runs of the LHCb application
  • In the data challenge: a package of 500 events every 20 h, 0.5 GB in size; negligible impact on the network
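The "negligible impact" claim checks out numerically: spreading one 0.5 GB package over 20 h averages well under 0.1 Mbit/s (the helper name is illustrative, and GB is taken as decimal):

```python
def avg_rate_mbit_per_s(size_gb, interval_h):
    """Average link load of a periodic bulk transfer, in Mbit/s."""
    return size_gb * 1000 * 8 / (interval_h * 3600)

print(round(avg_rate_mbit_per_s(0.5, 20), 3))  # ~0.056 Mbit/s
```

Even on the slowest EELA uplink quoted earlier (8 Mbit/s), this is below one percent of the capacity.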

  14. HEP theory in EELA
  • UTFSM proposed to run HEP theory code in EELA
    • In the TA, but not selected for the initial phase
    • Suggestions and a successful demo are needed
  • Potential to
    • Port a new application to the Grid
    • Create a group of experts
  • We will follow this up

  15. ATLAS in EELA
  • ATLAS is not mentioned in the TA
  • UNLP is in the process of joining the ATLAS collaboration
  • UNLP is requesting funding for a resource centre for ATLAS production
    • To be shared with EELA

  16. Pierre Auger Observatory
  • Two EELA partners: UNAM and UNLP
  • Located in Malargüe, Mendoza, Argentina
  • The application is not yet Grid-enabled
  • The AugerAccess project is funded by the EU
    • Data transfer uses the LA network infrastructure
  • Office building: lat = -35.46325636, lon = -69.5847700276 (look it up on Google Earth)

  17. Auger location
  [Map of the observatory site.]

  18. The lost brother…
  • CMS is the only LHC experiment without participation in EELA
  • CMS groups in LA: Brasil and Mexico
  • CMS uses the same network as EELA
    • We have to start talking with CMS
  • The LA groups have a connection to Fermilab
    • They use the Open Science Grid, not the EGEE middleware
    • We leave the interoperability as a challenge to WP2 ;-)

  19. Conclusions
  • Two applications were selected (D3.1.1): ALICE and LHCb
    • The infrastructure to support them is set up; many thanks to WP2
    • We have to review the network requirements for LHC data taking: the information so far relates to data challenges
  • Ready to run (D3.1.2, due M06)
    • They run in configurations of interest to EELA users because of their participation in the LHC
    • We still need to set up EELA-initiated jobs
  • We have more candidate applications in EELA: HEP theory, ATLAS, Auger (?)
  • CMS is the only LHC experiment not in EELA
  Acknowledgement: thanks to the experts from ALICE and LHCb for providing the technical information presented
