
Components‐based software in the HARP PS214 experiment at CERN


Presentation Transcript


  1. Maria Gabriella Catanesi (INFN Bari, Italy), 11th ICATPP Conference, Villa Olmo, Como (Italy), 5-9 October 2009. Components‐based software in the HARP PS214 experiment at CERN

  2. HARP physics motivations • Input for the prediction of neutrino fluxes for the MiniBooNE and K2K experiments • Pion/kaon yields for the design of the proton driver and target system of Neutrino Factories and SPL-based Super-Beams • Input for precise calculation of the atmospheric neutrino flux (from yields of secondary π, K) • Input for Monte Carlo generators (GEANT4, e.g. for LHC or space applications)

  3. Existing measurements @ 1999 (HARP proposal). [Plot: daughter energy (1 GeV to 1 TeV) vs parent energy (1 GeV to 1 TeV); boxes show the importance of each phase-space region for contained atmospheric neutrino events; existing measurement coverage marked as 1-2 pT points, 3-5 pT points, >5 pT points (Eichten et al., Barton et al., Abbott et al.).] Overall quoted errors: absolute rates ~15%, ratios ~5%. These figures are typical of this kind of detector setup.

  4. Running neutrino experiments

  5. Design of future projects. Primary energy, target material and geometry, collection scheme • maximizing the π+, π− production rate /proton/GeV • knowing the pT distribution with high precision (<5%) • CERN scenario: 2.2-5 GeV/c proton linac • Phase rotation: longitudinally freeze the beam by slowing down earlier particles and accelerating later ones • good knowledge of the pL distribution is also needed

  6. The HARP Collaboration: 24 institutes, ~120 collaborators
  • Università degli Studi e Sezione INFN, Bari, Italy
  • Rutherford Appleton Laboratory, Chilton, Didcot, UK
  • Institut für Physik, Universität Dortmund, Germany
  • Joint Institute for Nuclear Research, JINR Dubna, Russia
  • Università degli Studi e Sezione INFN, Ferrara, Italy
  • CERN, Geneva, Switzerland
  • TU Karlsruhe, Germany
  • Section de Physique, Université de Genève, Switzerland
  • Laboratori Nazionali di Legnaro dell'INFN, Legnaro, Italy
  • Institut de Physique Nucléaire, UCL, Louvain-la-Neuve, Belgium
  • Università degli Studi e Sezione INFN, Milano, Italy
  • P.N. Lebedev Institute of Physics (FIAN), Russian Academy of Sciences, Moscow, Russia
  • Institute for Nuclear Research, Moscow, Russia
  • Università "Federico II" e Sezione INFN, Napoli, Italy
  • Nuclear and Astrophysics Laboratory, University of Oxford, UK
  • Università degli Studi e Sezione INFN, Padova, Italy
  • LPNHE, Université de Paris VI et VII, Paris, France
  • Institute for High Energy Physics, Protvino, Russia
  • Università "La Sapienza" e Sezione INFN Roma I, Roma, Italy
  • Università degli Studi e Sezione INFN Roma III, Roma, Italy
  • Dept. of Physics, University of Sheffield, UK
  • Faculty of Physics, St Kliment Ohridski University, Sofia, Bulgaria
  • Institute for Nuclear Research and Nuclear Energy, Academy of Sciences, Sofia, Bulgaria
  • Università di Trieste e Sezione INFN, Trieste, Italy
  • Univ. de Valencia, Spain

  7. Detector layout. Forward Spectrometer: • 30 mrad < θ < 210 mrad • 750 MeV/c < p < 6.5 GeV/c • K2K, MiniBooNE, cosmic rays. Large Angle Spectrometer: • 0.35 rad < θ < 2.15 rad • 100 MeV/c < p < 700 MeV/c • Super Beams, Neutrino Factories. More details in the NIM paper "The HARP Detector at the CERN PS".

  8. Main features of the project • Fast readout: ~10³ events/PS spill, one spill = 400 ms • Event rate ~2.5 kHz • Around 10⁶ events/day (very ambitious for a TPC!) • Short time scale! (February 2000 – summer 2001) • Make use (where possible) of existing material and/or detectors • Cost optimisation • Minimize effort and time scale

  9. The HARP experiment. [Block diagram: Software (Online, Offline); Beam Instrumentation and Trigger (Beam Cherenkovs, Beam TOFs, Trackers, MWPC, Inner Trigger, Forward Trigger); Large Angle Detectors (TPC, Barrel RPC); Forward Spectrometer (Forward RPC, Drift Chambers, Cherenkov, TOF Wall, EM Wall, Muon Catcher).] Big parallel effort: design and construction at the same time.

  10. HARP Construction TPC July 2000

  11. Forward Spectrometer July 2000

  12. HARP technical run October 2000

  13. August 2001:

  14. A coordinated effort was needed to have running software in a short time! Two possible approaches: 1. Try to produce the results using all the programs we already have; don't attempt any general integration, accepting that some work will be lost; postpone the creation of the "true" software environment. 2. Create a Software Architecture ASAP; push on the integration of the different software modules; stop development with the already-used programs.

  15. Choice 1 pro & con • Pro • No time spent on organizational matters • No formalisms, defined rules or discipline • Effort concentrated only on the primary goal • Con • A (large) fraction of the work will not be usable in the future • Possible errors due to lack of communication between different programs • Possible interferences and conflicts at integration time

  16. Choice 2 pro & con • Pro • A (large) fraction of the work will be usable in the future • Optimizes the resources and minimizes the problems at integration time • Allows a coherent effort and minimizes conflicts • Con • Time spent on organizational matters • Formalisms, defined rules and discipline are a must, and consensus on this point is mandatory • Needs (sometimes) specialized manpower

  17. Software: the HARP strategy. HARP was approved in February 2000 and data taking started in summer 2001. To optimize quality, development time and human resources, we decided to use software engineering tools and in particular to produce an Architectural Design. This choice was successful and we built a first running version of the full chain of our software in a few months, from June to October 2001.

  18. But what does this mean in practice? Use a well-tested method to follow the software development from the very early stages (Project and Management Plans, User and Software Requirements) to the final product tests (Test Plan and release procedure). The starting point should always be the Software Project Management Plan, which describes: 1. The objectives to be reached by the program and the organization of the work 2. The tasks and responsibilities 3. The schedule 4. The strategy to minimize risks

  19. The objectives to be reached by the program and the organization of the work. User Requirements: define the functionality that the software must have to reach the results. Example: UR-4: The user shall be able to describe the detector geometry consistently, while allowing different representations for different applications. Type: capability (or constraint). Priority: essential. Status: implemented. SRD references: SR-19, SR-20, SR-21
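A requirement like UR-4 can be made concrete with a minimal C++ sketch. All class and member names below are hypothetical illustrations, not the actual HARP code: one authoritative detector description is consumed by several application-specific representations, so they stay consistent by construction.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Single source of truth for the detector geometry (hypothetical).
struct Volume {
    std::string name;
    std::string material;
    double halfLengthCm;   // simplified shape parameter
};

struct DetectorDescription {
    std::vector<Volume> volumes;
};

// Each application builds its own representation from the same description.
class GeometryRepresentation {
public:
    virtual ~GeometryRepresentation() = default;
    virtual void build(const DetectorDescription& dd) = 0;
};

class SimulationGeometry : public GeometryRepresentation {     // e.g. for simulation
public:
    void build(const DetectorDescription& dd) override {
        for (const auto& v : dd.volumes)
            std::cout << "simulation solid: " << v.name << " (" << v.material << ")\n";
    }
};

class ReconstructionGeometry : public GeometryRepresentation { // e.g. for tracking
public:
    void build(const DetectorDescription& dd) override {
        for (const auto& v : dd.volumes)
            std::cout << "reconstruction surface: " << v.name << "\n";
    }
};

int main() {
    DetectorDescription dd{{{"TPC", "Ar/CO2", 75.0}, {"RPC_barrel", "glass", 2.0}}};
    SimulationGeometry sim;
    ReconstructionGeometry rec;
    sim.build(dd);   // both representations read the same description
    rec.build(dd);
}
```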

  20. The Software Requirements derive from the User Requirements and from strategic decisions taken in the collaboration. • SR-1: The software shall run on PCs with the Linux operating system. Type: constraint. Priority: essential. Status: implemented. • SR-2: The software shall be developed in the C/C++ programming language (at least for the collaboration-wide software). Type: constraint. Priority: essential. Status: implemented.

  21. What is an Architectural Design? The architecture of a software system is realized by determining the components (domain decomposition) and the set of dependency relations between the identified elements. Each component is an independent unit of source code and libraries. The connections between the different domains define the dependencies and produce the time sequence for the test and release of the official code. Software management is part of the process. A correct decomposition allows parallel development of the different pieces of code and the optimization of resources. Big effort during the summer of 2000.

  22. As an example… [Component diagram: Simulation, Display and Reconstruction sit on top of FRAME, Detector Response, Detector Description and Event Model, which in turn rely on the external libraries ROOT, CLHEP and GEANT4.]
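A compact, hypothetical C++ illustration of such a decomposition (the namespaces and types are invented for this sketch, not the real HARP packages): each namespace stands for one component, and the only dependencies are the downward ones shown in the diagram, e.g. Reconstruction uses the Event Model and Detector Description but never Simulation or Display.

```cpp
#include <iostream>
#include <vector>

namespace EventModel {                 // lowest layer: shared data objects
    struct Hit { double x, y, z; };
    struct Track { std::vector<Hit> hits; };
}

namespace DetectorDescription {        // geometry, independent of event data
    struct TPCGeometry { double innerRadiusCm = 10.0, outerRadiusCm = 40.0; };
}

namespace Reconstruction {             // depends on EventModel + DetectorDescription,
                                       // never on Simulation or Display
    EventModel::Track fit(const std::vector<EventModel::Hit>& hits,
                          const DetectorDescription::TPCGeometry&) {
        return EventModel::Track{hits};   // placeholder for the real track fit
    }
}

int main() {
    DetectorDescription::TPCGeometry geom;
    std::vector<EventModel::Hit> hits{{12, 0, 5}, {20, 3, 10}, {35, 7, 20}};
    auto track = Reconstruction::fit(hits, geom);
    std::cout << "fitted track with " << track.hits.size() << " hits\n";
}
```

Because the dependencies only point downwards, the components above the Event Model can be developed and tested in parallel, which is the resource optimization mentioned on the previous slide.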

  23. Tasks and responsibilities: • In general it is convenient to match the responsibility assignment to the results of the domain decomposition • In this way no ambiguity is left between the subcomponents • Sharing the work is easier and parallel development is maximized, reducing conflicts and/or interferences • Unit test, system test and release procedures were defined and implemented • Software verification and validation are included • User and software documentation are essential parts

  24. The schedule: • The schedule was defined on the basis of the domain dependency structure and of the definition of the testing and release procedure • For each step of the development you can define a schedule, and the result of this effort should be a Software Release • It will evolve into a new one following the same test procedures

  25. Configuration • Software: HARP_DEV , HARP_FILES • code packages + ext.libs + data files • Analysis: HARP_ANALYSIS • analysis package (+ lib, exe) • Production: HARP_PROD • production scripts and executables + run files

  26. The software is not a static object: • The development is organized in cycles • For each cycle all the steps should be revised (scopes, schedules, etc.) • As an example, in HARP we have so far completed several cycles to reach different goals: • Technical Run 2000 • Start of Run 2001 • SPSC presentation October 2001 • Large Angle Analysis May 2002 • Mock Data Challenge • Migration from Objectivity to Oracle (2003) • …. • Full Data and MC Production v7r8

  27. • DAQ: data acquisition software based on the DATE package (ALICE). • HarpEvent: HARP transient event model, including a structured description of settings and reconstruction objects, based on Gaudi (LHCb). • HarpDD: HARP detector geometry and materials data (including alignment and calibration). • DetRep: geometrical representations of the detector (physics applications), based on the GEANT4 solid modelling. • EventSelector: event selection and data navigation functionality. • Simulation: based on GEANT4. • Reconstruction: computation of reconstructed objects at various levels (including Kalman filters). • HarpUI: event display, also used online, based on ROOT. • DetResponse: the component implementing the digitization of the main detectors. The HARP software components described have been developed and used for detector calibration and performance studies, trigger and background studies, beam particle identification, on-line applications, data quality, and productions for data analysis. They were also used for the T9 beam simulation, and for understanding and resolving trigger rate problems.
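The idea behind the EventSelector/ObjectCnv pair can be sketched as follows. This is a hedged illustration with invented class names, not the real Gaudi-based interfaces: a physics application iterates over events through one abstraction and does not care whether the events come from online data, the offline store, or Monte Carlo output.

```cpp
#include <iostream>
#include <string>

struct Event {                      // stand-in for the transient event model
    long id;
    std::string source;
};

class EventSource {                 // the abstraction the application sees
public:
    virtual ~EventSource() = default;
    virtual bool next(Event& e) = 0;   // fills the next event, false when done
};

class MonteCarloSource : public EventSource {   // one possible concrete source
    long n_ = 0;
public:
    bool next(Event& e) override {
        if (n_ >= 3) return false;
        e = {n_++, "MC"};
        return true;
    }
};

// The application depends only on EventSource; swapping in an online or
// offline-database source requires no change here.
void analyse(EventSource& source) {
    Event e;
    while (source.next(e))
        std::cout << "event " << e.id << " from " << e.source << "\n";
}

int main() {
    MonteCarloSource mc;
    analyse(mc);
}
```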

  28. TPC Track Reconstruction: Equalisation → Clustering → Pattern recognition → Track fit (helix) → Momentum fit
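A minimal pipeline sketch of these steps. Every type and stage body below is an invented placeholder; only the ordering of the chain reflects the slide.

```cpp
#include <iostream>
#include <vector>

struct PadSignal { int pad; double charge; };
struct Cluster   { double x, y, z, charge; };
struct TrackCandidate { std::vector<Cluster> clusters; };
struct FittedTrack    { double curvaturePerCm, momentumGeV; };

std::vector<PadSignal> equalise(std::vector<PadSignal> raw) {
    for (auto& s : raw) s.charge *= 1.0;            // placeholder per-pad gain correction
    return raw;
}

std::vector<Cluster> cluster(const std::vector<PadSignal>& signals) {
    double q = 0;
    for (const auto& s : signals) q += s.charge;    // group neighbouring pads
    return {{1.0, 2.0, 3.0, q}};
}

std::vector<TrackCandidate> findTracks(const std::vector<Cluster>& clusters) {
    return {{clusters}};                            // pattern recognition placeholder
}

FittedTrack fitHelix(const TrackCandidate&) {
    return {0.05, 0.0};                             // helix fit -> curvature (1/cm)
}

FittedTrack fitMomentum(FittedTrack t, double fieldT = 0.7) {
    // toy p_T [GeV/c] = 0.3 * B[T] * R[m], with R = 1/curvature converted from cm
    t.momentumGeV = 0.3 * fieldT / t.curvaturePerCm * 1e-2;
    return t;
}

int main() {
    std::vector<PadSignal> raw{{1, 9.5}, {2, 11.0}};
    auto tracks = findTracks(cluster(equalise(raw)));
    for (const auto& c : tracks)
        std::cout << "p ~ " << fitMomentum(fitHelix(c)).momentumGeV << " GeV/c\n";
}
```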

  29. FW: PID principle. [Figure: forward particle identification combining the Cherenkov detectors, the TOF wall and the calorimeter (CAL).]

  30. Persistency • ObjyHarp: HARP persistent event model. It is based on the Objectivity/DB database and mirrors the transient event model. • ObjectCnv: unpacking of the raw data and construction of the transient C++ objects used by the physics applications. It can use transparently both online data and stored offline data, as well as Monte Carlo output. • ObjyPersistency: the component implementing the adapter used to access the Objectivity or Oracle databases, while allowing the physics applications not to depend at compile time on the I/O solution. In 2003 (following a CERN decision) HARP migrated its data to Oracle, so an equivalent component implementing the HARP persistent event model in Oracle exists. The transition was transparent for the other users/developers.
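The adapter idea described above can be sketched in a few lines of C++. This is a hedged sketch with hypothetical names and a deliberately simple write-only interface, not the real component: physics code talks to an abstract store, so moving from Objectivity to Oracle means providing another concrete adapter with no compile-time dependency of the applications on either database.

```cpp
#include <iostream>
#include <memory>
#include <string>

struct PersistentEvent { long id; std::string payload; };

class EventStore {                       // interface seen by physics applications
public:
    virtual ~EventStore() = default;
    virtual void write(const PersistentEvent& e) = 0;
};

class ObjectivityStore : public EventStore {        // pre-2003 backend
public:
    void write(const PersistentEvent& e) override {
        std::cout << "Objectivity <- event " << e.id << "\n";
    }
};

class OracleStore : public EventStore {             // post-migration backend
public:
    void write(const PersistentEvent& e) override {
        std::cout << "Oracle <- event " << e.id << "\n";
    }
};

// The backend is chosen at run time (e.g. from configuration); the code below
// never names a concrete database, which is why the migration was transparent.
std::unique_ptr<EventStore> makeStore(const std::string& backend) {
    if (backend == "oracle") return std::make_unique<OracleStore>();
    return std::make_unique<ObjectivityStore>();
}

int main() {
    auto store = makeStore("oracle");
    store->write({42, "raw+reco objects"});
}
```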

  31. iDST • iDSTmySQL is the component implementing the DST concept for distribution within the collaboration. It contains the persistent-capable physics objects (including reconstruction, simulation, geometry, and event model objects). It supports both a neutral file format and Linux MySQL. • The entire software chain can be rerun on the data produced/retrieved in iDSTmySQL without needing to access the CERN mass storage system (CASTOR) and central databases (Objectivity, Oracle). • This allows full distribution of the analysis work among the HARP institutes.

  32. Data taking summary. HARP took data at the CERN PS T9 beamline in 2001-2002, with solid and cryogenic targets. Total: 420 M events, ~300 settings. [Event-display figure: top, simulated track and noise hits in the TPG; middle, highlighted hits are those assigned by the pattern recognition to the same track; bottom, track fitted on the selected hits.]

  33. Performances. HARP data are reconstructed in production at a rate varying from 0.3 s/evt/GHz (large angle) to 2.1 s/evt/GHz (all forward detectors). HARP data are simulated in production at a rate of 1.7 s/evt/GHz for both the large-angle and forward-detector simulation. These rates allow in all cases more than one million events per day per 10 GHz (i.e. 3 standard processors clocked at 3 GHz). Productions: all HARP data (with a few exceptions) were successfully reconstructed and analyzed (~50 TByte including calibration data). The same software release (v7r8) was used without bug fixing to process all data. A sample of corresponding MC events larger by a factor 5-10 has also been produced.

  34. Far/Near Ratio in K2K. [Figures: predicted flux shape at the near and far detectors, and predicted far/near ratio.] HARP gives ~ factor 2 error reduction across all energies. Nucl.Phys. B732:1-45, 2006; hep-ex/0510039

  35. MiniBooNE: 8.9 GeV proton beam hitting a beryllium target. The HARP data cover this region. Published in EPJ C, hep-ex/0702024v2

  36. Results: π+ production, HARP Be target, 8.9 GeV, 5% target. [Figure: HARP forward spectrometer acceptance.]

  37. Neutrino factory study: π+/π− yields per kinetic energy interval and dσ/dθ cross-sections can be fed into neutrino factory studies to find the optimum design. Warning: the above has a fixed integration range, but the optimization may be momentum dependent.

  38. The next step: MICE, the International Muon Ionization Cooling Experiment

  39. From the MICE Proposal: the full software simulation and reconstruction chain of the TPG was obtained using the HARP pattern recognition and track fitting programs with minor modifications. [Figure: a) hits (including noise); b) track finding; c) fitted tracks.]

  40. TPG: from simulation to prototyping. (R&D) TPG-head module: • 3-GEM amplification stage • 30 cm diameter • read-out: ~7×10⁵ hexagonal pads grouped in 3 sets of strips dephased by 120° (Hexaboard) • FADC electronics (100 ns sampling time, from the HARP TPC); prototype of the ALTRO chip. Test bed: cylindrical field cage (80 cm diameter, 150 cm length), solenoidal B field ~0.7 T (max). Full-volume concept: HARP solenoid, [0, 0.7] T field; field cage drift length 150 cm; source diameter 80 cm; gas: Ar(90%)+CO2(10%); cathode plane: 25 kV. [Figure: guard ring, GEM-1, TPG head, E and B field directions, beam.]

  41. Using the HARP software: ⁹⁰Sr source, B ~ 0.07 T, Vdrift ~ 1 cm/μs, R ~ 27 mm, pT ~ 0.57 MeV/c, σres ≈ 40 μm.
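As a consistency check, these numbers agree with the standard relation between transverse momentum, field and curvature radius: pT [GeV/c] ≈ 0.3 · B[T] · R[m] = 0.3 × 0.07 × 0.027 ≈ 0.57 MeV/c.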

  42. Large GEM prototypes: for ν near detectors, T2K (but also… LHC, LC). E. Radicioni, Elba 2006. [Figures: side view; one GEM module on the pad plane; completed stack under HV test; front view.] • 4 mm induction gaps, Ar/CO2 90/10 • Edrift = 160 V/cm, σT = 200 μm (minimum) • HVGEM = 320 V, HVIND = 780 V (~2 kV/cm, σT maximum)

  43. Conclusions • The HARP experiment was built in a very short time • To cope with this aggressive schedule, we decided to use software engineering tools for the software, and in particular to produce an Architectural Design • This choice was successful and we built a first running version of the full chain of our software in a few months, from June to October 2001 • A (large) fraction of the work was easily reused in other applications
