
News from ATLAS SW Week March 2004, ATLAS SW CZ Seminar, April 2004






Presentation Transcript


  1. News from ATLAS SW Week March 2004. ATLAS SW CZ Seminar, April 2004. Jiří Chudoba, FzÚ AV ČR

  2. ATLAS Computing Timeline
  • 2003: POOL/SEAL release (done); ATLAS release 7, with POOL persistency (done); LCG-1 deployment (in progress); complete Geant4 validation (done)
  • 2004 (now): ATLAS release 8; DC2 Phase 1: simulation production; DC2 Phase 2: intensive reconstruction (the real challenge!); combined test beams (barrel wedge); Computing Model paper
  • 2005: Computing Memorandum of Understanding (moved to end 2004); ATLAS Computing TDR and LCG TDR
  • 2006: DC3: produce data for PRR and test LCG-n; Physics Readiness Report
  • 2007: start commissioning run; GO!

  3. Near-term Software Release Plan
  • 7.5.0: 14th Jan 2004
  • 7.6.0: 4th Feb
  • 7.7.0: 25th Feb (SPMB decision 3rd Feb)
  • 8.0.0: 17th Mar (DC2 & CTB simulation release)
  • 8.1.0: 7th Apr
  • 8.2.0: 28th Apr
  • 8.3.0: 19th May
  • 9.0.0: 9th Jun (DC2 & CTB reconstruction release)
  • 9.1.0: 30th Jun
  • 9.2.0: 21st Jul
  • 9.3.0: 11th Aug
  Key external dates: 15th Feb: LAr technical run starts; 1st May: baseline DC-2 simulation starts; 10th May: testbeam starts; 1st Jul: baseline DC-2 reconstruction starts; 14th Jul: complete testbeam; 15th Jul: baseline DC-2 physics analysis starts.
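The plan follows a fixed three-week cadence between developer releases. As a quick sanity check of the dates above (copied from the slide), a minimal Python sketch:

```python
from datetime import date

# Developer releases from the near-term plan; the stated pattern is a
# three-week cadence, so each date should fall 21 days after the last.
releases = [
    ("7.5.0", date(2004, 1, 14)),
    ("7.6.0", date(2004, 2, 4)),
    ("7.7.0", date(2004, 2, 25)),
    ("8.0.0", date(2004, 3, 17)),
    ("8.1.0", date(2004, 4, 7)),
    ("8.2.0", date(2004, 4, 28)),
    ("8.3.0", date(2004, 5, 19)),
    ("9.0.0", date(2004, 6, 9)),
    ("9.1.0", date(2004, 6, 30)),
    ("9.2.0", date(2004, 7, 21)),
    ("9.3.0", date(2004, 8, 11)),
]

for (prev_name, prev), (name, cur) in zip(releases, releases[1:]):
    print(f"{prev_name} -> {name}: {(cur - prev).days} days")
```

Every gap comes out at 21 days, including across the leap-year February.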

  4. Release 8
  • Expected major milestones:
  • GEANT4 simulation: DC2 (in validation); test beam (underway)
  • Pile-up, digitization in Athena (debugging)
  • GeoModel: Inner Detector, Muon Spectrometer
  • Conversion to CLHEP units (mm, MeV, [-pi, pi])
  • POOL/SEAL persistency
  • Bytestream converters
  • Preliminary conditions capabilities
  • Other stepping stones: move to InstallArea; jobOption.txt --> jobOption.py (see the sketch below); distribution kits
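The jobOption.txt --> jobOption.py item refers to Athena job configuration moving from the old text format to Python. A minimal sketch of what such a Python jobOptions file could look like; it only runs inside athena itself (which provides theApp, Algorithm, and the INFO constant), and the library, algorithm, and property names here are hypothetical, not taken from the slides:

```python
# jobOption.py -- minimal sketch of Python-era Athena job configuration.
# Hypothetical names throughout; athena supplies theApp/Algorithm/INFO.

theApp.Dlls += ["MyAnalysisTools"]           # load a component library
theApp.TopAlg += ["MyAnalysisAlg/analysis"]  # schedule an algorithm instance
theApp.EvtMax = 100                          # process 100 events

analysis = Algorithm("analysis")             # handle for setting properties
analysis.OutputLevel = INFO
analysis.HistogramFile = "analysis.root"
```

Compared with the flat jobOption.txt format, the Python form lets configurations be composed and manipulated programmatically.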

  5. ATLAS DC2. ATLAS Software Workshop, 2 March 2004. Gilbert Poulard, CERN PH-ATC

  6. DC2: goals
  • At this stage the goals include:
  • Full use of Geant4, POOL, and the LCG applications
  • Pile-up and digitization in Athena (see the toy sketch below)
  • Deployment of the complete Event Data Model and the Detector Description
  • Simulation of the full ATLAS detector and the 2004 combined testbeam
  • Testing the calibration and alignment procedures
  • Wide use of Grid middleware and tools
  • Large-scale physics analysis
  • Computing-model studies (document due end 2004)
  • Running as much of the production as possible on LCG-2
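"Pile-up" here means overlaying minimum-bias events on each signal event before digitization. A toy illustration of the bookkeeping, assuming a Poisson-distributed number of in-time interactions per bunch crossing; the mean of 4.6 is indicative of the low-luminosity (2 x 10^33) scenario, not a number from the slides, and the real Athena machinery also handles out-of-time crossings and detector-specific time windows:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Toy pile-up bookkeeping: for each signal event, draw how many
# minimum-bias events to overlay in the in-time bunch crossing.
mu = 4.6          # indicative mean interactions/crossing (assumption)
n_signal = 5

for evt in range(n_signal):
    n_minbias = rng.poisson(mu)
    print(f"signal event {evt}: overlay {n_minbias} min-bias events")
```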

  7. DC2 operation
  • DC2 is a three-part operation:
  • Part I: production of simulated data (May-June 2004)
  • needs Geant4, digitization and pile-up in Athena, POOL persistency
  • "minimal" reconstruction, just enough to validate the simulation suite
  • will run on any computing facilities we can get access to around the world
  • Part II: test of Tier-0 operation (July 2004)
  • needs the full reconstruction software following the RTF report design, plus definitions of AODs and TAGs
  • (calibration/alignment and) reconstruction will run on the Tier-0 prototype as if data were coming from the online system (at 10% of the rate)
  • output (ESD+AOD) will be distributed to Tier-1s in real time for analysis
  • Part III: test of distributed analysis on the Grid (August-October 2004)
  • access to event and non-event data from anywhere in the world, in both organized and chaotic ways
  • in parallel: run distributed reconstruction on simulated data

  8. DC2: Scenario & time scale
  • September 03: Release 7. Put in place, understand & validate: Geant4; POOL; LCG applications; Event Data Model; digitization; pile-up; byte-stream; conversion of DC1 data to POOL; large-scale persistency tests and reconstruction
  • March 17th: Release 8 (production). Testing and validation; run test-production
  • May 3rd 04: start final validation; start simulation; pile-up & digitization; event mixing
  • July 1st 04: "DC2". Transfer data to CERN; intensive reconstruction on "Tier-0"; distribution of ESD & AOD; calibration and alignment
  • August 1st: start physics analysis; reprocessing

  9. DC2 resources

  10. Atlas Production System schema
  Task = [job]*, Dataset = [partition]*
  [Diagram: tasks and jobs are defined in a job-description database (AMI), keyed by transformation definitions (executable name, release version, physics signature), with location hints from the Data Management System and human intervention at the definition level. Four supervisors dispatch jobs (one dataset partition each) to executors for the different resources: a US Grid executer (Chimera), an NG executer (NorduGrid), an LCG executer (via resource brokers), and an LSF executer (local batch).]
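The schema's core is the decomposition Task = [job]* and Dataset = [partition]*: a task is expanded into jobs, each producing one partition of a dataset, and supervisors hand those jobs to resource-specific executors. A minimal sketch of that decomposition; the class and field names are illustrative, not the actual production-system API:

```python
from dataclasses import dataclass, field

# Illustrative model of the slide's schema: Task = [job]*,
# Dataset = [partition]*. All names here are ours, not the real system's.

@dataclass
class Job:
    transformation: str   # executable name + release version
    partition: int        # which dataset partition this job produces

@dataclass
class Task:
    transformation: str
    dataset: str
    n_partitions: int
    jobs: list[Job] = field(default_factory=list)

    def expand(self) -> None:
        """Expand the task definition into one job per partition."""
        self.jobs = [Job(self.transformation, p)
                     for p in range(self.n_partitions)]

# A supervisor would pull jobs like these and pass them to an executor
# for one flavour of resources (LCG, NorduGrid, US Grid, local LSF).
task = Task("AtlasG4sim-8.0.0", "dc2.simul.A1", n_partitions=4)
task.expand()
for job in task.jobs:
    print(job)
```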

  11. Tiers in DC2
  • Tier-0:
  • 20% of the simulation will be done at CERN
  • all data in ByteStream format (~16 TB) will be copied to CERN
  • reconstruction will be done at CERN (in ~10 days)
  • reconstruction output (ESD) will be exported from Tier-0 in 2 copies (2 × ~5 TB)
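These numbers imply modest but sustained data rates at Tier-0. A back-of-the-envelope check, assuming the ~16 TB input copy and the 2 x ~5 TB ESD export are spread evenly over the ~10-day reconstruction window (an assumption; the slides do not specify the transfer schedule):

```python
# Back-of-the-envelope Tier-0 rates for the DC2 exercise.
SECONDS = 10 * 24 * 3600          # ~10-day reconstruction window

bytestream_tb = 16                # ByteStream input copied to CERN
esd_export_tb = 2 * 5             # two ESD copies shipped to Tier-1s

def rate_mb_s(terabytes: float, seconds: float) -> float:
    """Average rate in MB/s for moving `terabytes` in `seconds`."""
    return terabytes * 1e6 / seconds   # 1 TB = 1e6 MB

print(f"input:  {rate_mb_s(bytestream_tb, SECONDS):.1f} MB/s sustained")
print(f"export: {rate_mb_s(esd_export_tb, SECONDS):.1f} MB/s sustained")
```

This works out to roughly 19 MB/s in and 12 MB/s out, sustained over the full window.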

  12. Tiers in DC2
  • Tier-1s will have to:
  • host the simulated data produced by them or coming from Tier-2s, plus the ESD (& AOD) coming from Tier-0
  • run reconstruction in parallel to the Tier-0 exercise (~2 months); this includes links to MC truth
  • produce and host ESD and AOD
  • provide access to ATLAS VO members
  • Tier-2s will:
  • run simulation (and other components if they wish)
  • copy (replicate) their data to a Tier-1
  • ATLAS is committed to LCG: all information should be entered into the relevant database and catalog

  13. Core sites and commitments
  • Initial LCG-2 core sites, plus other firm commitments
  • The other 20 LCG-1 sites will be brought in as quickly as possible

  14. Comments on schedule
  • The change of schedule has been driven by:
  • On the ATLAS side:
  • the readiness of the software (the combined test beam has the highest priority)
  • the availability of the production tools (integration with the Grid is not always easy)
  • On the Grid side:
  • the readiness of LCG (we would prefer to run on the Grid only!)
  • priorities are not defined by ATLAS alone
  • For the Tier-0 exercise, it will be difficult to fix a starting date before we have a better idea of how the pile-up and event-mixing processes work

  15. Status of the Software for Data Challenge 2. Armin Nairz
  Conclusions:
  • Generation and GEANT4 simulation (as in release 7.5.0+) are already in a production-like state: stable and robust, with CPU times per event and event sizes within specifications
  • Digitisation (as in release 7.6.0+) could not yet be tested for all sub-detectors (some missing or not working); for the tested ones digitisation is working and stable, which gives confidence in a working digitisation procedure for the whole detector in/after release 7.7.0
  • Pile-up is not yet fully functional
  • Documentation on pre-production activities is available from the DC webpage: http://atlas.web.cern.ch/Atlas/GROUPS/SOFTWARE/DC/DC2/preprod
  • It also contains how-tos for running event generation, simulation, and digitisation (see the sketch below)
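In practice each stage of that chain was a separate athena job over its own jobOptions file, with each stage reading the previous stage's POOL output. A hedged sketch of driving the chain from Python; the jobOptions file names are placeholders, not the actual DC2 ones (those are in the how-tos linked above):

```python
import subprocess

# Hypothetical jobOptions names; the real ones come from the DC2 how-tos.
steps = [
    "MyGeneration_jobOptions.py",    # event generation
    "MySimulation_jobOptions.py",    # GEANT4 simulation
    "MyDigitization_jobOptions.py",  # digitisation
]

for job_options in steps:
    # Run each stage as a separate athena job; stop the chain
    # immediately if any stage fails.
    subprocess.run(["athena.py", job_options], check=True)
```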

  16. Other meetings
  • GRID
  • Analysis Tools
  • Distributed Analysis
  • ...
  • Atlantis Tutorial
  • Athena Tutorial
  chudoba@fzu.cz
