
LHC Experiment’s Software




Presentation Transcript


  1. LHC Experiment’s Software. Lucia Silvestris, INFN-Bari. LISHEP 2006, INTERNATIONAL SCHOOL ON HIGH ENERGY PHYSICS, Rio de Janeiro - Brazil

  2. Large Hadron Collider & Experiments: the startup

  3. Large Hadron Collider: 27 km in circumference. The trigger is a challenging task for the LHC!

  4. LHC Detector Requirements • Very good electromagnetic calorimetry for electron and photon identification (e.g. H → γγ) • Good hadronic calorimetry for jet reconstruction and missing transverse energy measurement • Efficient, high-resolution tracking for particle momentum measurements, b-quark tagging, τ tagging and vertexing (primary and secondary vertices) • Excellent muon identification with precise momentum reconstruction

  5. A Generic Multipurpose LHC Detector [Schematic detector cross-section showing typical particle signatures.] About 10 λ (interaction lengths) are needed to shield the muon system from hadrons produced in p-p collisions.

  6. Experiments at LHC • CMS (Compact Muon Solenoid) • ATLAS (A Toroidal LHC ApparatuS) • LHCb (study of CP violation in B-meson decays at the LHC) • ALICE (A Large Ion Collider Experiment)

  7. LHC startup plan • Stage 1: initial commissioning; 43x43 to 156x156 bunches, N = 3x10^10; zero to partial squeeze; L = 3x10^28 - 2x10^31 cm^-2 s^-1 • Stage 2: 75 ns operation; 936x936 bunches, N = 3-4x10^10; partial squeeze; L = 10^32 - 4x10^32 cm^-2 s^-1 • Stage 3: 25 ns operation; 2808x2808 bunches, N = 3-5x10^10; partial to near-full squeeze; L = 7x10^32 - 2x10^33 cm^-2 s^-1

  8. Pilot Run • Luminosity: ~30 days, maybe less (?); 43x43 bunches, then 156x156 bunches • [Plot: integrated luminosity (0.1 - 10 pb^-1) and pile-up versus instantaneous luminosity (10^28 - 10^31 cm^-2 s^-1), assuming an LHC efficiency of 20% (optimistic!)]
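As a rough cross-check of the scales on this slide (an illustrative estimate, not from the presentation): a 30-day pilot run at a sustained luminosity of 10^30 cm^-2 s^-1 with 20% LHC efficiency integrates

    \int L\,dt \approx 10^{30}\,\mathrm{cm^{-2}\,s^{-1}} \times (30 \times 86400\,\mathrm{s}) \times 0.20 \approx 5\times10^{35}\,\mathrm{cm^{-2}} \approx 0.5\,\mathrm{pb^{-1}}

using 1 pb^-1 = 10^36 cm^-2, consistent with the 0.1 - 10 pb^-1 integrated-luminosity range shown.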

  9. Startup plan and Software • Turn-on is fast: pile-up increasing rapidly, timing evolution (43x43 to 75 ns to 25 ns), LOTS of physics • For all detectors: commission detector and readout; commission trigger systems; calibrate/align detector(s); commission computing and software systems; rediscover the Standard Model • Software domains involved: Simulation, Reconstruction, Trigger, Monitoring, Calibration/Alignment calculation applications, user-level data object selection, Analysis, Visualization, SW Development Tools, Documentation

  10. LHC startup: CMS/ATLAS • Integrated luminosity with the current LHC plans, assuming an LHC efficiency of 30% (optimistic!) • [Plot: instantaneous luminosity 10^31 - 10^33 cm^-2 s^-1 and integrated luminosity up to ~1.9 fb^-1 (1 fb^-1 optimistic?), with physics milestones marked: top re-discovery, Z' → muons, SUSY, Higgs (?)]

  11. Physics Startup plans • ALICE: minimum-bias proton-proton interactions; a standard candle for the heavy-ion runs • LHCb: B_s mixing, repeat of the sin 2β measurement (if the Tevatron has not done it already) • ATLAS-CMS: measure jet, Z and W production; in 15 pb^-1 they will have ~30K W's and ~4K Z's into leptons; measure cross sections and the W and Z charge asymmetry (pdfs; top!) • Luminosity?

  12. Startup physics (ALICE): can publish two papers 1-2 weeks after LHC startup • Multiplicity paper outline: Introduction; Detector system (Pixel & TPC); Analysis method; Presentation of data (dN/dη and multiplicity distribution, s dependence); Theoretical interpretation (ln²(s) scaling?, saturation, multi-parton interactions...); Summary • pT paper outline: Introduction; Detector system (TPC, ITS); Analysis method; Presentation of data (pT spectra and pT-multiplicity correlation); Theoretical interpretation (soft vs. hard, mini-jet production...); Summary

  13. Where are we? Common Software

  14. LCG Application Area • Deliver the common physics applications software for the LHC experiments (http://lcgapp.cern.ch/) • Organized to ensure focus on real experiment needs: experiment-driven requirements and monitoring; architects in management and execution; open information flow and decision making; participation of experiment developers; frequent releases enabling iterative feedback • Success is defined by adoption and validation of the products by the experiments: integration, evaluation, successful deployment

  15. Software Domain Decomposition [Diagram: Simulation, Reconstruction and Analysis programs sit on top of the experiment frameworks (event, detector, calibration, algorithms); these rest on common domains (simulation engines and generators, data management, persistency, databases, file catalogs, conditions, grid and batch services) built on core libraries (geometry, histograms, fitters, ntuples, physics, math libs, I/O, GUI, 2D/3D graphics, plugin manager, dictionary, interpreter, collections) and on foundation utilities / OS bindings.]

  16. Simplified Software Decomposition • Applications: experiment software applications are built on top of the frameworks and implement the required algorithms • Exp. Framework: every experiment has a framework for basic services and various specialized frameworks (event model, detector description, visualization, persistency, interactivity, simulation, etc.) • Simulation, Data Mgt., Distrib. Analysis: specialized domains that are common among the experiments • Core Libraries: core libraries and services that are widely used and provide basic functionality • Non-HEP specific software packages: many non-HEP libraries widely used • The layers below the experiment frameworks constitute the common SW

  17. Application Area Projects • ROOT - Core Libraries and Services: foundation class libraries, math libraries, framework services, dictionaries, scripting, GUI, graphics, SEAL libraries, etc. • POOL - Persistency Framework: storage manager, file catalogs, event collections, relational access layer, conditions database, etc. • SIMU - Simulation project: simulation framework, physics validation studies, MC event generators, Garfield, participation in Geant4 and Fluka • SPI - Software Process Infrastructure: software and development services: external libraries, Savannah, software distribution, support for build, test, QA, etc.

  18. ROOT: Core Library and services • ROOT activity at CERN is fully integrated in the LCG organization (planning, milestones, reviews, resources, etc.) • The main change during the last year has been the merge of the SEAL and ROOT projects: single development team; adiabatic migration of the software products into a single set of core software libraries; ~50% of the SEAL functionality has been migrated into ROOT (MathLib, reflection, Python scripting, etc.) • ROOT is now at the "root" of the software for all the LHC experiments • Web page: http://root.cern.ch/

  19. ROOT: Core Library and services • Current work packages (SW components): • BASE: foundation and system classes, documentation and releases • DICT: reflection system, meta classes, CINT and Python interpreters • I/O: basic I/O, Trees, queries • PROOF: Parallel ROOT Facility, xrootd • MATH: mathematical libraries, histogramming, fitting • GUI: graphical user interfaces and object editors • GRAPHICS: 2-D and 3-D graphics • GEOM: geometry system • SEAL: maintenance of the existing SEAL packages • Web page: http://root.cern.ch/

  20. ROOT: I/O • Recent developments of ROOT I/O and Trees • General I/O: STL collections; data compression using reduced precision; alternatives to default constructors; other I/O improvements (increase precision, save space)

  21. ROOT: I/O - TTree Extensions • New features: fast merging, indexing of TChains, TTree interface enhancements, TRef and pool::Reference browsing • Browse extensions: split objects, unsplit objects, collections; can now also see simple member functions, transient members, persistent members • Main focus: consolidation (thread safety), generic object reference support • Important features requested by the experiments are implemented
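The TTree/TChain machinery these extensions build on can be illustrated with a minimal, self-contained sketch (generic ROOT usage, not code from the presentation; the file, tree and branch names are invented):

    #include "TFile.h"
    #include "TTree.h"
    #include "TChain.h"

    void writeAndIndex() {
      // Write a small tree with run/event numbers and one payload branch
      TFile out("events1.root", "RECREATE");
      TTree tree("Events", "toy event data");
      Int_t run = 1, event = 0;
      Double_t pt = 0.;
      tree.Branch("run", &run, "run/I");
      tree.Branch("event", &event, "event/I");
      tree.Branch("pt", &pt, "pt/D");
      for (event = 0; event < 1000; ++event) { pt = 0.1 * event; tree.Fill(); }
      tree.Write();
      out.Close();

      // Read such files back as one logical tree and build an index,
      // so entries can be looked up by (run, event), cf. "indexing of TChains"
      TChain chain("Events");
      chain.Add("events1.root");   // more files could be added here
      chain.BuildIndex("run", "event");
      Long64_t entry = chain.GetEntryNumberWithIndex(1, 42);
      chain.GetEntry(entry);
    }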

  22. ROOT: Math • New developments of the ROOT mathematical libraries: new Vector package (3D and 4D vectors), SMatrix (for small matrices with fixed size) • Fitting and minimization: Minuit2 (C++), linear fitter, robust fitter, SPlot (unfolding)
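A small sketch of what the new Vector and SMatrix packages look like in user code (generic ROOT::Math usage under current class names, not code from the slides):

    #include "Math/Vector3D.h"
    #include "Math/Vector4D.h"
    #include "Math/SMatrix.h"

    void mathExample() {
      // 3D vector (cartesian) and 4D vector (pt, eta, phi, E) from the Vector package
      ROOT::Math::XYZVector v(1.0, 2.0, 3.0);
      ROOT::Math::PtEtaPhiEVector p4(25.0, 1.2, 0.5, 60.0);
      double mass = p4.M();
      double dz   = v.Dot(ROOT::Math::XYZVector(0., 0., 1.));

      // Fixed-size 3x3 matrix from SMatrix (no dynamic allocation)
      ROOT::Math::SMatrix<double, 3, 3> m;
      m(0, 0) = 1.0; m(1, 1) = 2.0; m(2, 2) = 3.0;
      (void)mass; (void)dz;
    }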

  23. ROOT: Graphics - Detector Geometries [Examples: ALICE, LHCb, ATLAS and CMS detector geometries rendered with ROOT]

  24. ROOT: Graphics - Events [Example event displays]

  25. Data Management • FILES - based on ROOT I/O: targeted at complex data structures (event data, analysis data); based on Reflex object dictionaries; management of object relationships via file catalogues; interface to Grid file catalogs and Grid file access • Relational Databases - Oracle, MySQL, SQLite: suitable for conditions, calibration, alignment and detector description data, possibly produced by online systems; complex use cases and requirements, multiple 'environments', difficult to satisfy with a single solution • Isolating applications from the database implementations with a standardized relational database interface: facilitates the life of the application developers; no change in the application to run in different environments; encodes "good practices" once for all • Focus moving into deployment and experiment support

  26. Persistency framework [Diagram: user code sits on the POOL and COOL APIs; POOL combines the storage manager and collections (ROOT I/O) with a file catalog, and uses CORAL for relational database access to Oracle, MySQL and SQLite.] • The AA/POOL project is delivering a number of "products" (http://pool.cern.ch/): POOL - object and references persistency framework; CORAL - generic database access interface; ORA - mapping of C++ objects into relational databases; COOL - detector conditions database • Object storage and references successfully used in large-scale production in ATLAS, CMS, LHCb • Need to focus on database access and deployment on the Grid: basically starting now

  27. CORAL: generic database access interface (http://pool.cern.ch/coral/) • CORAL interfaces: C++ abstract classes forming the user-level API, with CORAL C++ types (row buffers, Blob, Date, TimeStamp, ...) • Common implementation with developer-level interfaces: connection service, relational service, monitoring service, authentication service (environment, XML), lookup service (LFC, XML) • RDBMS implementations for Oracle, MySQL, SQLite and Frontier • Plug-in libraries, loaded at run-time, interacting only through the interfaces

  28. CORAL: generic database access interface • A software system for vendor-neutral access to relational databases: C++, SQL-free API • CORAL is integrated in the software of the LHC experiments (CMS, ATLAS and LHCb) directly (i.e. on-line applications) and indirectly (COOL, POOL) • Example 1: table creation

    coral::ISchema& schema = session.nominalSchema();
    coral::TableDescription tableDescription;
    tableDescription.setName( "T_t" );
    tableDescription.insertColumn( "I", "long long" );
    tableDescription.insertColumn( "X", "double" );
    schema.createTable( tableDescription );

  Generated SQL - Oracle: CREATE TABLE "T_t" ( I NUMBER(20), X BINARY_DOUBLE ); MySQL: CREATE TABLE T_t ( I BIGINT, X DOUBLE PRECISION )
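To complete the picture, here is a sketch of how rows might be inserted and read back through the same vendor-neutral API (based on the CORAL interfaces named above; the session/schema object is assumed to exist as in the slide, and include paths or call details may differ between CORAL versions):

    #include "RelationalAccess/ISchema.h"
    #include "RelationalAccess/ITable.h"
    #include "RelationalAccess/ITableDataEditor.h"
    #include "RelationalAccess/IQuery.h"
    #include "RelationalAccess/ICursor.h"
    #include "CoralBase/AttributeList.h"
    #include "CoralBase/Attribute.h"

    void fillAndRead( coral::ISchema& schema ) {
      // Insert one row into the table created above
      coral::ITableDataEditor& editor = schema.tableHandle( "T_t" ).dataEditor();
      coral::AttributeList row;
      row.extend<long long>( "I" );
      row.extend<double>( "X" );
      row["I"].data<long long>() = 1;
      row["X"].data<double>() = 3.14;
      editor.insertRow( row );

      // Read the rows back; CORAL generates the vendor-specific SQL
      coral::IQuery* query = schema.tableHandle( "T_t" ).newQuery();
      coral::ICursor& cursor = query->execute();
      while ( cursor.next() ) {
        const coral::AttributeList& r = cursor.currentRow();
        long long i = r["I"].data<long long>();
        double x = r["X"].data<double>();
        (void)i; (void)x;
      }
      delete query;
    }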

  29. Conditions DataBase • Databases to store time-varying data • COOL: holds conditions data for reconstruction and analysis; accesses data from PVSS, the file catalog (LFC) and bookkeeping; implementations in Oracle, MySQL and SQLite • Now in deployment phase (ATLAS and LHCb), fully integrated in the experiment frameworks • Benefits from other LCG projects: CORAL, SEAL/ROOT and the 3D project • http://pool.cern.ch/CondDB/

  30. Simulation • MC generators: specialized in different physics domains and developed by different authors; need to guarantee support for the LHC experiments and collaboration with the authors • Simulation engines: Geant4 and Fluka are well-established products; common additional utilities required by the experiments; interoperability between MC generators and simulation engines; interactivity, visualization and analysis facilities; geometry and event data persistency; comparison and validation (between engines and against real data) • http://lcgapp.cern.ch/project/simu
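For orientation, the skeleton of a Geant4 application that such a simulation framework wraps looks roughly like this (a generic sketch, not from the presentation; MyDetectorConstruction, MyPhysicsList and MyPrimaryGenerator are hypothetical user classes the experiment would provide, deriving from the corresponding Geant4 base classes):

    #include "G4RunManager.hh"

    int main() {
      G4RunManager* runManager = new G4RunManager;

      // Mandatory user initialization: geometry and physics (hypothetical classes)
      runManager->SetUserInitialization(new MyDetectorConstruction);
      runManager->SetUserInitialization(new MyPhysicsList);

      // Mandatory user action: primary particle generation (e.g. fed from HepMC)
      runManager->SetUserAction(new MyPrimaryGenerator);

      runManager->Initialize();
      runManager->BeamOn(100);   // simulate 100 events

      delete runManager;
      return 0;
    }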

  31. Simulation framework utilities • HepMC: C++ event record for Monte Carlo generators • GDML: Geometry Description Markup Language; geometry interchange format or geometry source; GDML writers and readers exist for Geant4 and ROOT • Geant4 geometry persistency: saving/retrieving Geant4 geometries with ROOT I/O • FLUGG: using Geant4 geometry from FLUKA • Framework for comparing simulations: example applications have been developed • Python interface to Geant4: provides Python bindings to G4 classes; steering Geant4 applications from Python scripts • Utilities for MC truth handling
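As an illustration of the ROOT side of geometry persistency (a generic TGeo sketch, not code from the presentation; the .gdml variant mentioned in the comment assumes a ROOT release with GDML support):

    #include "TGeoManager.h"
    #include "TGeoMaterial.h"
    #include "TGeoMedium.h"
    #include "TGeoVolume.h"

    void geometryRoundTrip() {
      // Build (or otherwise obtain) a trivial geometry
      TGeoManager* geom = new TGeoManager("world", "toy geometry");
      TGeoMaterial* matVacuum = new TGeoMaterial("Vacuum", 0, 0, 0);
      TGeoMedium*   vacuum    = new TGeoMedium("Vacuum", 1, matVacuum);
      TGeoVolume*   top       = geom->MakeBox("TOP", vacuum, 100., 100., 100.);
      geom->SetTopVolume(top);
      geom->CloseGeometry();

      // Persist the geometry with ROOT I/O ...
      geom->Export("geom.root");
      // ... and read it back later (the same call also handles .gdml files
      // in ROOT versions with GDML support)
      TGeoManager* geom2 = TGeoManager::Import("geom.root");
      (void)geom2;
    }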

  32. Simulation components [Schematic: MC generators (e.g. Pythia) produce HepMC events that feed Geant4, or FLUKA via Flugg; generator events can also come from an MCDB; geometry is exchanged between Geant4 and ROOT's TGeo through GDML and geom.root files (read/write); MC truth is written to ROOT files; steering is done via Python scripts maintained with a text editor.]

  33. Distributed data analysis • A full spectrum of different analysis applications will be co-existing • Data analysis applications using the full functionality provided by the experiment's framework (analysis tools, databases, etc.): require a big fraction of the available software packages and are very demanding on computing and I/O; typically batch processing • Final analysis of ntuple-like data (ROOT trees): fast turn-around (interactive); easy migration from local to distributed (PROOF) • Tools to help the physicists are being made available: large-scale grid job submission (GANGA); parallelization of the analysis jobs (PROOF)
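The local-to-PROOF migration mentioned above typically looks like this from a ROOT macro (a generic sketch; the master host name, input files and "MySelector" selector are invented for illustration):

    #include "TChain.h"
    #include "TProof.h"

    void runAnalysis(bool usePROOF = false) {
      // Local analysis: process a chain of ROOT trees with a TSelector
      TChain chain("Events");
      chain.Add("data/run1.root");   // hypothetical input files
      chain.Add("data/run2.root");

      if (usePROOF) {
        // Same analysis, parallelized on a PROOF cluster
        TProof::Open("proofmaster.example.org");  // hypothetical master
        chain.SetProof();
      }

      // "MySelector" is a hypothetical TSelector implementing the analysis
      chain.Process("MySelector.C+");
    }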

  34. Application Area Highlights - SPI • SPI is concentrating on the following areas: • Savannah service (bug tracking, task management, etc.): >160 hosted projects, >1350 registered users (doubled in one year); web page: http://savannah.cern.ch/ • Software services (installation and distribution of software): >90 external packages installed in the external service • Software development service: tools for development, testing, profiling, QA; web, HyperNews, documentation • SPI web page: http://lcgapp.cern.ch/project/spi/

  35. SPI - Software Configuration • An LCG configuration is a combination of packages and versions which are coherent and compatible • Configurations are given names like "LCG_40" • Experiments build their application software based on a given LCG configuration • Interfaces to the experiments' configuration systems are provided (SCRAM, CMT) • Concurrent configurations are an everyday situation • Configurations are decided in the AF

  36. SPI - Software Releases [Stack diagram repeated: Applications / Exp. Framework / Simulation, DataMgt., Distrib. Analysis / Core Libraries / non-HEP specific software packages, with an arrow indicating release order.] • The AA/experiments software stack is quite large and complex, and many steps and many teams are involved • Only 2-3 production-quality releases per year are affordable: complete documentation, complete platform set, complete regression tests, test coverage, etc. • Feedback is required before the production release is made; there is no clear solution yet on how to achieve this (currently under discussion) • Bug-fix releases as often as needed: quick reaction time and minimal time to release

  37. Where are we? Individual experiments

  38. Software Domain Decomposition [The domain decomposition diagram from slide 15 is shown again before turning to the experiment-specific software.]

  39. Experiments Software Architecture & Frameworks

  40. Frameworks: ATLAS+LHCb (I) • ATLAS+LHCb: Athena/Gaudi [Architecture diagram: the Application Manager steers a sequence of Algorithms; Algorithms access data through the Event Data Service (Transient Event Store), the Detector Data Service (Transient Detector Store) and the Histogram Service (Transient Histogram Store); Persistency Services and Converters move data to and from data files; other services include the Message Service, JobOptions Service and Particle Properties Service.]
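In Gaudi/Athena the user-visible unit is the Algorithm; a minimal one has roughly this shape (a sketch based on the standard Gaudi Algorithm interface; the concrete class name is invented, and real algorithms would retrieve event data through the Event Data Service and register new objects in the transient store):

    #include "GaudiKernel/Algorithm.h"

    // Hypothetical algorithm, scheduled by the Application Manager
    class HelloAlg : public Algorithm {
    public:
      HelloAlg(const std::string& name, ISvcLocator* svcLoc)
        : Algorithm(name, svcLoc) {}

      StatusCode initialize() { return StatusCode::SUCCESS; } // job start
      StatusCode execute()    { return StatusCode::SUCCESS; } // once per event
      StatusCode finalize()   { return StatusCode::SUCCESS; } // job end
    };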

  41. Frameworks: ALICE (II) • ALICE: AliRoot, built directly on top of ROOT

  42. Framework CMS: Component Architecture (III) • CMS: new framework in 2005 • Five types of dynamically loadable processing components: Source (provides the Event to be processed), OutputModule (stores the data from the Event), Producer (creates new data to be placed in the Event), Filter (decides if processing should continue for an Event), Analyzer (studies properties of the Event) • Components only communicate via the Event • Components are configured at the start of a job using a ParameterSet
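A sketch of what a Producer and an Analyzer look like in this model (based on the legacy CMSSW EDM interfaces; the class names, the TrackPtVector product type and the "toytracks" module label are illustrative assumptions, and include paths may differ between releases):

    #include "FWCore/Framework/interface/EDProducer.h"
    #include "FWCore/Framework/interface/EDAnalyzer.h"
    #include "FWCore/Framework/interface/Event.h"
    #include "FWCore/ParameterSet/interface/ParameterSet.h"
    #include "DataFormats/Common/interface/Handle.h"
    #include <memory>
    #include <vector>

    typedef std::vector<double> TrackPtVector;   // illustrative data product

    // Producer: creates new data and places it in the Event
    class ToyTrackProducer : public edm::EDProducer {
    public:
      explicit ToyTrackProducer(const edm::ParameterSet&) {
        produces<TrackPtVector>();            // declare what this module adds
      }
      virtual void produce(edm::Event& event, const edm::EventSetup&) {
        std::auto_ptr<TrackPtVector> tracks(new TrackPtVector);
        tracks->push_back(42.0);
        event.put(tracks);                    // hand ownership to the Event
      }
    };

    // Analyzer: reads data from the Event, never modifies it
    class ToyTrackAnalyzer : public edm::EDAnalyzer {
    public:
      explicit ToyTrackAnalyzer(const edm::ParameterSet&) {}
      virtual void analyze(const edm::Event& event, const edm::EventSetup&) {
        edm::Handle<TrackPtVector> tracks;
        event.getByLabel("toytracks", tracks);   // module label from the config
      }
    };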

  43. Framework CMS: Processing Model (IV) • The Source creates the Event (e.g. reading it from a POOL file) • The Event is passed to execution paths • A Path is an ordered list of Producer/Filter/Analyzer modules • Producers add data to the Event • The OutputModule is given the Event if certain Paths run to completion, and writes it to a POOL file

  44. Framework CMS: Accessing Event Data (VI) • The Event class allows multiple ways to access data:

    // Ask by module label and default product label
    Handle<TrackVector> trackPtr;
    event.getByLabel( "tracker", trackPtr );

    // Ask by module and product label
    Handle<SimHitVector> simPtr;
    event.getByLabel( "detsim", "pixel", simPtr );

    // Ask by type
    vector<Handle<SimHitVector> > allPtr;
    event.getByType( allPtr );

    // Ask by Selector
    ParameterSelector<int> coneSel( "coneSize", 5 );
    Handle<JetVector> jetPtr;
    event.get( coneSel, jetPtr );

  45. Framework CMS: Job Configuration (IX) • Job configuration is done in the configuration file • After configuration is complete, all components will have been loaded into the application process

    RECO = {
      source = PoolSource { string filename = "test.root" }
      module tracker = TrackFinderProducer {}
      module out = PoolOutputModule { string filename = "test2.root" }
      path p = { tracker, out }
    }

  46. Simulation and Detector Description in the experiments

  47. Simulation (I) • Geant4: a success story; deployed by all experiments • Functionality essentially complete; detailed physics studies performed by all experiments • Very reliable in production (failure rate better than 1:10^5) • Good collaboration between the experiments and the Geant4 team; lots of feedback on physics (e.g. from test beams) • LoH (Level of Happiness): very high • Geometry sizes: ALICE ~3 million volumes, LHCb ~18 million volumes

  48. Simulation: ATLAS (II) [ATLAS detector description, figures]

  49. Simulation: ATLAS (III) [New developments, figures]

  50. Simulation: ALICE (IV) • FLUKA VMC implementation completed; testing well advanced • TGeo/FLUKA validation completed; good agreement with G3 and test beam data • The FLUKA VMC will be used in the next ALICE physics data challenge • Plan to use Geant4 as an alternative simulation engine (under development)
