Fermi Large Area Telescope (LAT) Integration and Test (I&T) Data Experience and Lessons Learned
LSST Camera Workshop, Brookhaven, March 2012
Tony Johnson (for Richard Dubois), Fermi Data Handling Manager (but not involved in I&T)
Most slides from Anders Borgland (Science Verification, Analysis and Calibration manager)
Fermi LAT <> LSST
• We only dealt with I&T for the LAT
  • not the spacecraft itself (thanks NASA)
• NASA did have specific requirements for documenting I&T of the LAT
Fermi LAT I&T overview
• Integration and Test
  • Before we launched Fermi we went through a long Integration & Test phase of the LAT.
  • We had an Engineering Model of one of the towers.
  • When we got the complete towers:
    • Integrated them one by one into the 4x4 grid.
    • We took cosmic-ray data and Van de Graaff test data along the way.
  • In addition we took beam test data on prototype modules
    • At CERN, SLAC and GSI
• In parallel:
  • Intense development of Reconstruction/Analysis and Monte Carlo simulations
    • Detector modeling and response
  • Extensive "Data Challenge" program
• Wherever possible the same software was used for all of these activities (see the sketch after this list):
  • Data formats for raw, reconstructed and "housekeeping" data
  • Reconstruction and Analysis code
  • Data Catalog, automated data processing (pipeline)
  • Electronic logbook
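A minimal sketch of the "same code, different configuration" idea referenced above. The real LAT software (GlastRelease) is a C++/Gaudi framework; the function names, configuration keys and toy algorithm below are invented purely to illustrate one reconstruction path serving MC, cosmic-ray and beam-test runs.

```python
# Hypothetical sketch: one reconstruction driver, many run types.
# Names and keys are illustrative, not the GlastRelease API.

def reconstruct(events, config):
    """Apply one reconstruction path to every event, driven by configuration."""
    threshold = config["cal_threshold_mev"]        # tunable per run type
    return [{"run_type": config["run_type"],
             "energy_mev": sum(h for h in event if h > threshold)}
            for event in events]

if __name__ == "__main__":
    # The same driver handles simulated and real runs; only the config differs.
    toy_events = [[0.5, 3.2, 4.1], [1.7, 0.2]]
    for cfg in ({"run_type": "mc",      "cal_threshold_mev": 1.0},
                {"run_type": "cosmics", "cal_threshold_mev": 2.0}):
        print(reconstruct(toy_events, cfg))
```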
Data Products
• Test data flow: DAQ -> raw data -> Reconstruction/Analysis -> reconstructed data
• Alongside the event data, everything needed to interpret it:
  • Housekeeping data: voltages, temperatures, beam conditions, etc.
  • Detector configuration, geometry, etc.
  • Code versions, configuration, log files (the code itself?)
  • Electronic logbook, formal test report
• The complete set of data needs to be recorded and cataloged (see the sketch after this list)
• Needs to remain accessible/usable for 10+ years
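A minimal sketch of the kind of catalog record that ties a raw dataset to its housekeeping, configuration, code version and logbook entry. The table, column names and values are hypothetical, not the actual Fermi LAT data catalog schema.

```python
# Hypothetical catalog record for one I&T run; field names are illustrative,
# not the real Fermi LAT data catalog schema.
import sqlite3

conn = sqlite3.connect("itest_catalog.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS datasets (
        run_id          TEXT PRIMARY KEY,
        raw_file        TEXT,     -- path to raw DAQ output
        recon_file      TEXT,     -- path to reconstructed data
        housekeeping    TEXT,     -- voltages, temperatures, beam conditions
        detector_config TEXT,     -- geometry / configuration identifier
        code_version    TEXT,     -- software release used for reconstruction
        logbook_entry   TEXT      -- link to electronic logbook / test report
    )""")
conn.execute(
    "INSERT OR REPLACE INTO datasets VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("run_000123", "raw/run_000123.evt", "recon/run_000123.root",
     "hk/run_000123.db", "4x4_grid", "EngineeringModel-example-tag", "elog/4567"))
conn.commit()
```

The point is less the storage technology than that every dataset carries enough metadata to be found and re-interpreted years later.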
Software/Tools Evolution
• Reconstruction/analysis code
  • Had to deal with both real data taking and MC simulations.
  • Had to be flexible enough to deal with different detector configurations
    • Preferably via configuration files, without the need to rebuild
    • Fermi allows the detector geometry to be read from an XML file (see the sketch after this list)
• Fermi LAT code:
  • Made up of individual software packages:
    • Cal reconstruction, energy estimate, track finding, etc.
  • Version control using CVS
  • Software release:
    • A collection of a consistent set of software packages
    • Currently >90 packages
    • Called 'GlastRelease'
• Original idea:
  • GlastRelease:
    • MC oriented.
    • Bleeding edge: all development takes place here.
  • EngineeringModel:
    • A frozen version of GlastRelease for real data taking:
      • Stable: resynched to GlastRelease once in a while.
      • Bug 'free': bugs fixed in GlastRelease before making it into EngineeringModel
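Reading the detector description from a file is what lets the same binaries cover a one-tower engineering model and the full 4x4 grid. Below is a minimal sketch using Python's standard XML library; the element and attribute names are invented for illustration and are not the actual LAT geometry schema.

```python
# Hypothetical geometry file reader: element and attribute names are invented,
# not the actual Fermi LAT XML geometry schema.
import xml.etree.ElementTree as ET

def load_towers(path):
    """Return a list of tower descriptions from an XML detector file."""
    root = ET.parse(path).getroot()
    return [{"id": int(t.get("id")),
             "x_mm": float(t.get("x_mm")),
             "y_mm": float(t.get("y_mm"))}
            for t in root.findall("tower")]

# A single-tower engineering model and the full 4x4 grid can be described by
# different files, with no rebuild of the reconstruction code.
example = '<lat><tower id="0" x_mm="0.0" y_mm="0.0"/></lat>'
with open("engineering_model.xml", "w") as f:
    f.write(example)
print(load_towers("engineering_model.xml"))
```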
Real World
• Data is different from MC:
  • Requirements from processing data are different from doing MC.
  • May have to add new features to process data.
• The data taking schedule is different from the MC schedule:
  • Data taking drives when and how fast to add new features.
• In short:
  • EngineeringModel was quite often more advanced than GlastRelease.
  • They diverged to a large degree.
  • Often out of synch.
  • It was non-trivial to deal with all this.
• The I&T team needs to test new software features/bug fixes quickly:
  • You usually need bugs fixed yesterday!
• You need to give quick feedback to the developers:
  • The more details you can give, the better.
• Excellent communication between the I&T team and the software developers is essential.
Lessons Learned
• Planned well in advance to use common software for I&T and real data taking
  • Different requirements and timescales make this challenging
• Planned to use the same tools for I&T and real data
  • Evolution of software makes this hard
• The version of the reconstruction software used for I&T is now very old
  • Current code can still read old data, but for how long?
  • Need to record the code version used, and may need to maintain old versions of the software used for I&T (see the sketch after this list)
• Used an electronic logbook, a database for housekeeping, a data catalog and a processing pipeline to record the processing history
  • In all cases newer versions of these tools have evolved
  • I&T data is still accessible, but only through old/obsolete versions of the tools
  • Need to maintain the old tools, or plan to actively migrate data over time
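One concrete way to act on the "record the code version" lesson is to write a provenance record next to every output at the moment it is produced. A minimal sketch, assuming the release tag and configuration files are known to the processing job; all field names and paths are hypothetical.

```python
# Hypothetical provenance record written alongside each output file; the fields
# mirror the lessons above (code version, configuration, processing history).
import datetime
import json
import platform

def write_provenance(output_file, release_tag, config_files, input_files):
    record = {
        "output": output_file,
        "release": release_tag,      # e.g. the GlastRelease/EngineeringModel tag used
        "configs": config_files,     # geometry, calibration, job options
        "inputs": input_files,       # raw data this output was derived from
        "host": platform.node(),
        "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
    }
    with open(output_file + ".provenance.json", "w") as f:
        json.dump(record, f, indent=2)

write_provenance("run_000123.recon.root",
                 "EngineeringModel-example-tag",
                 ["geometry/full_lat.xml"],
                 ["raw/run_000123.evt"])
```

Even if the catalog or pipeline tools are later replaced, a flat record like this keeps the code version and configuration tied to the data it produced.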