
Computing for ALICE at the LHC





Presentation Transcript


  1. Computing for ALICE at the LHC

  2. Outline Physics at the Large Hadron Collider • Higgs – ATLAS & CMS • Quark-Gluon Plasma – ALICE (+ATLAS,CMS) Computing for ALICE • Present: Processing LHC Run-1 (2010-13) • Near Future: Run-2 (2015-17) • Long-Term Development: Run-3 (after 2018)

  3. The Large Hadron Collider collides protons and lead ions at >99.99999% of the speed of light to study the most fundamental particles and their interactions. Experiments: LHCb, ATLAS, ALICE, CMS

  4. Search for the Higgs Boson • a quantum field fills the universe • the field gives mass to the elementary particles: W/Z, quarks, leptons • a new particle → the Higgs boson Predicted in 1964 by • Peter Higgs • R. Brout and F. Englert • G. S. Guralnik, C. R. Hagen and T. W. B. Kibble Tom Dietel

  5. ATLAS Higgs Candidate

  6. Discovery of the Higgs Boson at the LHC • Spring 2010: start of data taking • 4 July 2012: discovery of a new particle • March 2013: it’s a Higgs! • October 2013: Nobel prize Extremely rare: a few hundred Higgs bosons in a quadrillion (10^15) collisions

  7. Mass of the Proton - the other 99% The proton contains 3 quarks • 2 up quarks: mu ≈ 2.5 MeV • 1 down quark: md ≈ 5 MeV The proton is much heavier than its 3 quarks • 2u + 1d: mass ≈ 10 MeV • mp ≈ 938 MeV • about 100 times heavier Where does the mass come from? • Quantum Chromodynamics • confinement: no free quarks
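The mass gap on this slide is a one-line calculation. The sketch below uses the slide's approximate quark masses and the standard proton mass of about 938 MeV (the slide's 931 MeV is the atomic mass unit, not the proton mass):

```python
# Approximate quark masses from the slide (MeV)
m_u = 2.5   # up quark
m_d = 5.0   # down quark

# The proton is uud: sum the valence-quark masses
m_quarks = 2 * m_u + m_d     # = 10 MeV
m_proton = 938.0             # MeV

ratio = m_proton / m_quarks  # ~94: "about 100 times heavier"
print(f"quark sum: {m_quarks} MeV, ratio: {ratio:.0f}")
```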

  8. Quark-Gluon Plasma • Compression: reduce the distance between nucleons • Heating: thermally create pions that fill the space between nucleons • hadrons overlap • quarks roam freely (deconfinement) • Quark-Gluon Plasma

  9. Heavy-Ion Physics • Can the quarks inside the protons and neutrons be freed? • What happens to matter when it is heated to 100000 times the temperature at the centre of the Sun? • Why do protons and neutrons weigh 100 times more than the quarks they are made of? → collisions of heavy nuclei (Pb) at high energies

  10. ALICE Event Display

  11. CERN and South Africa • SA-CERN • home to all CERN research in South Africa • 5 universities + 1 national lab • more than 60 scientists • ALICE • heavy-ion physics • quark-gluon plasma • UCT, iThemba • ATLAS • particle physics • Higgs physics • SUSY, BSM • UCT, UKZN, UJ, Wits • ISOLDE • rare isotope facility • nuclear and atomic physics • UKZN, UWC, Wits, iThemba • Theory • particle, heavy-ion and nuclear physics • UCT, UJ, Wits

  12. ALICE Data Flow
  • Event: one readout of the detectors; approx. one collision (but: pile-up, empty events); data block of 1 MB (pp) to 100 MB (Pb-Pb); events are independent → embarrassingly parallel processing
  • Simulation: event generators (model of known physics; compare experiment and theory); particle transport (model of the detectors; correct for detector effects)
  • Reconstruction: merge signals from the same particle; determine particle properties (momentum, energy, species)
  • User Analysis: extraction of physics results, based on reconstructed data; hundreds of different analyses
  • Storage: disk buffer (short-term, random-access working copy); long-term storage (“tape”) as backup
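Because events are independent, they can be processed in any order with no communication between workers. A minimal sketch of this embarrassingly parallel pattern, using Python's multiprocessing and a hypothetical reconstruct() placeholder (not the actual ALICE reconstruction code):

```python
from multiprocessing import Pool

def reconstruct(event):
    # Hypothetical stand-in for reconstruction: in reality this would merge
    # signals from the same particle and determine momentum, energy, species.
    # Here we just sum the raw signals as a placeholder.
    return sum(event["signals"])

if __name__ == "__main__":
    # Each event is one detector readout; no event depends on another,
    # so a worker pool can farm them out to any core or grid node.
    events = [{"signals": [i, i + 1, i + 2]} for i in range(8)]
    with Pool(processes=4) as pool:
        results = pool.map(reconstruct, events)
    print(results)  # → [3, 6, 9, 12, 15, 18, 21, 24]
```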

  13. Reconstruction – Bubble Chambers

  14. Raw Data Production: 7 PB • pp@0.9-7 TeV • PbPb@2.76 TeV • pp@2.76-7 TeV • pp@8 TeV • PbPb@2.76 TeV • pPb@5.02 TeV Big Data!

  15. ALICE Grid Computing • Tier-0: CERN (+ Budapest); reconstruction, simulation, analysis; 1 copy of the raw data • Tier-1: reconstruction, simulation, analysis; 1 shared copy of the raw data • Tier-2: simulation, analysis; no access to raw data
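The tier roles on this slide can be written down as a small lookup table. This is an illustrative sketch of the policy as stated here, not the real AliEn/WLCG scheduling logic:

```python
# Tier roles as described on the slide (illustrative only)
TIER_ROLES = {
    0: {"tasks": {"reconstruction", "simulation", "analysis"}, "raw_data": "full copy"},
    1: {"tasks": {"reconstruction", "simulation", "analysis"}, "raw_data": "shared copy"},
    2: {"tasks": {"simulation", "analysis"},                   "raw_data": None},
}

def can_run(tier, task):
    """Check whether a job type may run at a given tier level."""
    return task in TIER_ROLES[tier]["tasks"]

print(can_run(2, "reconstruction"))  # → False: Tier-2 has no raw data
print(can_run(2, "simulation"))      # → True
```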

  16. ALICE Computing Resources • disk storage: 28.7 PB total • CPU: 44 000 cores total • tape storage: 22.8 PB (Tier-0) + 13.1 PB (Tier-1) • network • human resources

  17. ALICE Grid Sites

  18. South African Tier-2 at CHPC: iQudu Cluster • IBM e1350 cluster • 160 nodes, each with 2 dual-core AMD Opteron @ 2.6 GHz and 16 GB RAM • Ethernet + InfiniBand • 100 TB storage (XRootD) • launched in 2007 • high power consumption, aging hardware • used by ALICE since October 2012

  19. ALICE Computing at CHPC • average: 348 running jobs • 1% of all ALICE jobs

  20. Completed Jobs at CHPC • 20 000 jobs / month • start of grid @ CHPC • network switch failure

  21. Resource Sharing • CPU delivered in 2012: South Africa 0.3% • projection for 2013: ~1%
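The ~1% share quoted on the CHPC slides can be cross-checked against the totals from slide 16. Assuming roughly one running job per core (an assumption made only for this estimate):

```python
chpc_jobs = 348        # average running jobs at CHPC (slide 19)
alice_cores = 44_000   # total ALICE CPU cores (slide 16)

# CHPC's approximate share of ALICE computing, at one job per core
share = chpc_jobs / alice_cores
print(f"{share:.1%}")  # → 0.8%, consistent with the quoted ~1% of all ALICE jobs
```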

  22. CPU Requirements – Run-2: +60%

  23. Disk Requirements – Run-2: ×2.3
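Applying the two growth factors above to the Run-1 totals from slide 16 gives a rough sense of the Run-2 scale. This is illustrative arithmetic only, not official resource pledges:

```python
# Run-1 totals (slide 16) and Run-2 growth factors (slides 22-23)
run1_cores = 44_000
run1_disk_pb = 28.7

run2_cores = run1_cores * 1.6       # CPU: +60%
run2_disk_pb = run1_disk_pb * 2.3   # disk: x2.3

print(f"Run-2 scale: ~{run2_cores:,.0f} cores, ~{run2_disk_pb:.0f} PB disk")
```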

  24. CHPC Upgrade
  • WLCG: sign MoU in (April) 2014; representation in WLCG
  • replace grid cluster (iQudu) in the first quarter of 2014: 2000 cores @ 3.2 GHz, 900 TB storage, serving ALICE + ATLAS
  • additional human resources
  • goal: Tier-1
  Parallel session “CHPC Roadmap”, Friday morning

  25. ALICE LS2 Upgrade 2018/19 (LHC 2nd Long Shutdown) • 50 kHz Pb-Pb collisions ALICE hardware upgrade • Inner Tracking System (ITS) • Time Projection Chamber (TPC) Change of strategy • all data into the online computing farm • continuous readout of detectors • massive online processing

  26. ALICE Challenges for Run-3 • data rates: reduce 1 TB/s to 30 GB/s • data compression: use partial reconstruction • overlapping events: process time-slices • major change in data model
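The data-rate target above corresponds to a compression factor of roughly 33. Quick arithmetic:

```python
input_rate_gb_s = 1000   # detector readout: ~1 TB/s
output_rate_gb_s = 30    # written to storage: ~30 GB/s

# Online processing must shrink the data stream by this factor
reduction = input_rate_gb_s / output_rate_gb_s
print(f"required compression factor: ~{reduction:.0f}x")  # → ~33x
```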

  27. Computing Working Groups • CWG1 Architecture • CWG2 Tools • CWG3 Dataflow • CWG4 Data Model • CWG5 Computing Platforms • CWG6 Calibration • CWG7 Reconstruction • CWG8 Physics Simulation • CWG9 QA, DQM • CWG10 Control, Configuration • CWG11 Software Lifecycle • CWG12 Computing Hardware

  28. Summary Present ALICE computing • part of WLCG • more than 40 000 CPU cores • almost 30 PB of data • Big Data! • South Africa – CHPC: 1% of ALICE resources Near future • growth within the current computing model • upgrade of CHPC – towards Tier-1 Long-term future • major ALICE upgrade → extreme data rates • new computing concepts → huge R&D effort

  29. Backup

  30. AliRoot

  31. O2 Project • boards: Computing Board, Online Institution Board, Institution Boards • working groups include CWG7 Reconstruction, CWG8 Simulation, CWG9 QA/DQM/Visualization, CWG10 Control, CWG11 Software Lifecycle, CWG12 Hardware, CWG13 Software Framework • ~50 people active in 1-3 CWGs • service tasks

  32. O2 Hardware System • trigger detectors send L0/L1 to the FTP • detectors (ITS, TPC, TRD, EMC, PHO, TOF, Muon) read out over 10 Gb/s links into ~250 FLPs (First Level Processors) • farm network (2 × 10 or 40 Gb/s) connects FLPs to ~1250 EPNs (Event Processing Nodes) • storage network (10 Gb/s) connects EPNs to data storage • ~2500 links in total

  33. Dataflow Model
