
Report on the LCG Applications Area

This report provides an overview of the LCG Applications Area, covering its organization, implementation of the architecture blueprint, personnel, schedule and planning, project status (POOL, SEAL, PI, Simulation, SPI), and external participation and collaboration.


Presentation Transcript


  1. Report on the LCG Applications Area Torre Wenaus, BNL/CERN LCG Applications Area Manager http://cern.ch/lcg/peb/applications LHCC Meeting May 22, 2003

  2. Outline • Organization and overview • Implementing the Architecture Blueprint • Personnel • Schedule and planning • Brief project status • POOL, SEAL, PI, Simulation, SPI • External participation, collaboration • Concluding remarks

  3. Applications Area Organisation [Organisation chart: the applications area manager and the architects forum take decisions and set strategy for the POOL, SEAL, PI, Simulation and SPI projects; the applications area meeting serves as the consultation forum]

  4. Management and Communication • Architects Forum • Attendees: architects, project leaders, EP/SFT leader • Good atmosphere, effective, agreement generally comes easily. No problems so far • Minutes made public after internal circulation • Meetings ~1-2/month • Applications area meeting • 25-50 attendees, local and remote • Project status, release news, activities of interest (internal and external), software usage and feedback • Meetings 2-3/month • Many meetings at project and work package level • We encourage holding them in the afternoon with a phone connection

  5. Focus on Experiment Need • Project structured and managed to ensure a focus on real experiment needs • SC2/RTAG process to identify, define (need-driven requirements), initiate and monitor common project activities in a way guided by the experiments themselves • Architects Forum to involve experiment architects in day-to-day project management and execution • Open information flow and decision making • Direct participation of experiment developers in the projects • Tight iterative feedback loop to gather user feedback from frequent releases • Early deployment and evaluation of LCG software in experiment contexts • Success defined by experiment adoption and production deployment. Substantive evaluation and feedback are still to come, as are (of course) adoption and production deployment

  6. Applications Area Projects • Software Process and Infrastructure (SPI)(operating – A.Aimar) • Librarian, QA, testing, developer tools, documentation, training, … • Persistency Framework (POOL) (operating – D.Duellmann) • POOL hybrid ROOT/relational data store • Core Tools and Services (SEAL)(operating – P.Mato) • Foundation and utility libraries, basic framework services, object dictionary and whiteboard, math libraries, (grid enabled services) • Physicist Interface (PI)(operating – V.Innocente) • Interfaces and tools by which physicists directly use the software. Interactive analysis, visualization, (distributed analysis & grid portals) • Simulation (launched – T.Wenaus et al) • Generic framework, Geant4, FLUKA integration, generator services … The set of projects is complete unless/until a distributed analysis project is opened

  7. Project Relationships [Diagram: relationships between the LCG Applications Area projects – Persistency (POOL), Physicist Interface (PI), Simulation, Software Process & Infrastructure (SPI), Core Libraries & Services (SEAL) – the LHC experiments, and other LCG projects in other areas]

  8. Implementing the Architecture Blueprint • Use what exists: almost all work leverages existing software • ROOT, Gaudi/Athena components, Iguana components, CLHEP, AIDA, HepUtilities, SCRAM, Oval, NICOS, Savannah, Boost, MySQL, GSL, Minuit, gcc-xml, RLS, … • Component-ware: followed, and working; e.g. the rapidity of integration of SEAL components into POOL • Object dictionary: in place and meeting POOL needs; its application is now expanding to interactivity • Component bus: both the Python environment and its integration with ROOT/CINT are progressing well • Object whiteboard: still to come • Distributed operation: essentially no activity, still not in scope • ROOT: the ‘user/provider’ relation is working: good ROOT/POOL cooperation – POOL gets needed modifications, ROOT gets debugging/development input
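
To make the component-ware point concrete, here is a minimal, purely illustrative C++ sketch of the pattern the blueprint mandates: services are located through a registry and used only via abstract interfaces, so implementations can be swapped. The names (IComponent, ComponentRegistry, MessageSvc) are hypothetical and are not the actual SEAL classes.

```cpp
// Hypothetical sketch only: these names are illustrative, not the real
// SEAL/blueprint interfaces.
#include <map>
#include <memory>
#include <string>
#include <iostream>

// A minimal "component" contract: services are looked up by name and used
// only through abstract interfaces (the blueprint's component-ware idea).
class IComponent {
public:
  virtual ~IComponent() = default;
  virtual std::string name() const = 0;
};

class ComponentRegistry {
public:
  void add(std::shared_ptr<IComponent> c) { comps_[c->name()] = c; }
  std::shared_ptr<IComponent> get(const std::string& n) const {
    auto it = comps_.find(n);
    return it != comps_.end() ? it->second : nullptr;
  }
private:
  std::map<std::string, std::shared_ptr<IComponent>> comps_;
};

// Example component: a message/logging service behind the interface.
class MessageSvc : public IComponent {
public:
  std::string name() const override { return "MessageSvc"; }
  void report(const std::string& msg) { std::cout << msg << "\n"; }
};

int main() {
  ComponentRegistry registry;
  registry.add(std::make_shared<MessageSvc>());
  if (auto c = std::dynamic_pointer_cast<MessageSvc>(registry.get("MessageSvc")))
    c->report("component located through the registry");
}
```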

  9. Domain Decomposition [Domain decomposition diagram; the products mentioned are examples, not a comprehensive list. There is project activity in all expected areas except grid services.]

  10. Applications Area Personnel Status • Personnel spreadsheets of the apps area, EP/SFT and LCG merged into a single LCG-managed spreadsheet • Consistent and current information on contributions and activities • LCG apps area hires essentially complete • 22 working; the target in the Sep 2001 LCG proposal was 23 • Contributions from the UK, Spain, Switzerland, Germany, Sweden, Israel, Portugal, the US, India, and Russia • Similar contribution level from CERN (IT and EP/SFT people without experiment affiliation) • The gathering of people in EP/SFT under John Harvey is working very well • Similar again from the experiments (including EP/SFT people with experiment affiliation) http://lcgapp.cern.ch/project/mgmt/AppManpower.xls [superseded this week by the merged LCG spreadsheet]

  11. Personnel

  12. Personnel Sources (FTEs)

  13. Current Personnel Distribution

  14. FTEs by Source and Activity [Chart: FTEs by source and activity across the LCG apps area project activities]

  15. Comparing reality with the Blueprint resource plan

  16. Estimate of Required Effort / Personnel Resources – Required and Available [Chart: stacked FTE requirements per quarter, Sep-02 through Mar-05, broken down by POOL, SEAL, Simulation, Generator services, Physicist interface, Math libraries and SPI, on a 0–60 FTE scale; blue = available and pledged effort; completing the plan depends on the experiment ramp-up.] The future estimate is based on 20 LCG, 16 CERN and 23 experiment FTEs, i.e. the present LCG+CERN level plus a ramp to 10 ATLAS, 10 CMS and 3 LHCb. In addition, ALICE contributes 4.5 FTEs via ROOT.

  17. Messages in this data • The apps area is getting the mandated job done with the manpower levels estimated as required by the blueprint RTAG • Manpower is being used efficiently, in the mandated places • In every project, the manpower external to the experiments (LCG, IT, some EP/SFT) exceeds or equals that contributed by the experiments • Common resources used in common are making these projects possible • Experiment participation in the projects exceeds 13 FTEs • The experiments are directly invested and participating: our projects are their projects, and often use their software • Another ~13 FTEs needed to complete the anticipated scope • With much of it (~8 FTEs) grid-related

  18. Contributions to experiment effort • The 9/2001 LCG proposal allocated 6 FTEs to the experiments for integration of experiment applications with the grid • Present status: ~5 FTEs identified (~3 working, ~2 to be hired) • Two full-time LCG people in ATLAS and ALICE (2 FTE) • One GDB/ALICE (mostly ALICE) LCG person (~0.8 FTE) • EP/SFT person half-time in CMS (0.5 FTE) • Made possible by LCG people filling the gap • Two LCG hires in the priority list (INFN, Germany) for LHCb, CMS (2 FTE) • Will assign people from the projects to specific experiments to help with take-up of LCG software • A fraction, not all, of their time • The person may be an experiment person in some cases! • This is in the works now in POOL

  19. +/- 90 Day Milestones

  20. L1 Milestones (1) On target, except that serious evaluation of the releases by the experiments has only recently started, and it will take work to make the software ‘production-capable’. Some milestones are voided by the continuing absence of grid/distributed analysis work from the project scope.

  21. L1 Milestones (2) Voided by the continuing absence of grid/distributed analysis work from the project scope.

  22. Distributed Analysis Status • 14 months after launch, there is (almost) no ‘G’ in the LCG applications area • This is disruptive to the project and contrary to expectations at launch (the estimate then: 2002Q4/2003Q1) • 50% of our L1 LHCC milestones are out the window • Personnel are idling, thanks to the SC2 • We have to deflect people interested in collaborating • POOL needs an understanding of its metadata responsibilities; that will come when this area gets attention • The LCG has no involvement in grid-enabled/grid-enabling applications • The present “starting to develop contingency plans” situation with middleware highlights the fact that the LCG should be involved in applications – the level at which real-world problems are exposed and workaround needs are identified and can then be acted on • The LCG cannot otherwise meet its responsibility to ensure that LHC computing capability is in place

  23. POOL Schedule Tracking

  24. MS Project Integration – POOL Milestones

  25. Apps area planning materials • Planning page linked from applications area page • Project plans for the various projects • WBS, schedule (milestones & deliverables) • Personnel spreadsheet • Merged into overall LCG project spreadsheet • Applications area plan document: overall project plan • Incomplete draft • Applications area plan spreadsheet: overall project plan • High level schedule, personnel resource requirements • Risk analysis (new) http://lcgapp.cern.ch/project/mgmt/

  26. Applications Area Risk Analysis

  27. The only current Level 4 risk: licensing • We must have a GPL-compatible license for applications area software • A position I have stated, and one decisively endorsed in strong statements from all four experiments • We rely on GPL software such as GSL and MySQL • Relying on commercial alternatives for some of these, such as MySQL, is not a defensible option in our community • Trends of recent years clearly indicate we should not build barriers to using GPL software • It is an issue because an existing agreement with PPARC says the LCG will use essentially the EDG license, which is non-GPL • Should be close to being settled (positively)

  28. Persistency Framework (POOL) Project Dirk Duellmann • To deliver the physics data store for ATLAS, CMS, LHCb • POOL V1.0 released last week • Preparatory to the L1 LHCC milestone next month • Almost ‘feature complete’ with respect to June needs; the focus is now on debugging, performance, documentation • On target to meet the June milestone • All functionality asked for by the experiments for the June production release should be there • Should provide a stably supported (1 year) format for data files • Serious experiment trials of and feedback on releases have only just started • We will surely uncover bugs and surprises; we hope not major ones • Expect to be ready to deploy on LCG-1 in July • Initial users: CMS, ATLAS, later LHCb
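
As background to the hybrid ROOT/relational data store described on slide 6, the sketch below illustrates the idea in a few lines of C++: object data is streamed to ROOT files, a catalog maps logical to physical file names, and an object's location is captured as a token. All names (Token, FileCatalog, write) are hypothetical stand-ins, not the POOL 1.0 API.

```cpp
// Illustrative sketch only: these class and function names are invented and
// are NOT the real POOL 1.0 interfaces; they show the hybrid-store idea of
// object data in ROOT files with locations tracked through a catalog.
#include <map>
#include <string>
#include <iostream>

struct Token {                       // a persistent address: file + container + entry
  std::string file, container;
  long entry = -1;
};

class FileCatalog {                  // stands in for the XML/relational file catalog
public:
  void registerFile(const std::string& lfn, const std::string& pfn) { lfn2pfn_[lfn] = pfn; }
  std::string lookup(const std::string& lfn) const { return lfn2pfn_.at(lfn); }
private:
  std::map<std::string, std::string> lfn2pfn_;
};

struct Event { int run = 1; int number = 42; };

// Writing returns a token; a real store would stream the object with ROOT I/O.
Token write(const Event&, const std::string& file, const std::string& container) {
  return Token{file, container, 0};
}

int main() {
  FileCatalog catalog;
  catalog.registerFile("lfn:events.root", "/data/events.root");

  Event evt;
  Token t = write(evt, catalog.lookup("lfn:events.root"), "Events");
  std::cout << "stored at " << t.file << "/" << t.container
            << " entry " << t.entry << "\n";
}
```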

  29. POOL (2) • The next several months of serious trials will tell how close we are to a truly usable product • The manpower situation is not bad • Temporary shortfalls in manpower from the experiments are made up by increased contributions from IT/DB • Effort becomes available as the data migration of COMPASS et al is completed • Dirk is not asking for new manpower • Sent the SC2 a proposal to bring common conditions database work into the project scope (it would be a work package distinct from POOL)

  30. Core Libraries and Services (SEAL) Project Pere Mato • Provide foundation and utility libraries and tools, basic framework services, object dictionary, component infrastructure • Facilitate coherence of LCG software and integration with non-LCG software • Development uses/builds on existing software from experiments (e.g. Gaudi, Iguana elements) and C++, HEP communities (e.g. Boost) • Basically on schedule, and manpower is OK • Successfully delivering POOL’s needs, the top priority • CLHEP accepted our proposal to ‘host’ the project • Also reflects appeal of SPI-supported services • Math library project incorporated into SEAL as work package • Attention to grid services on hold
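
The following toy example hints at what the object dictionary mentioned above provides: type and member information available at run time, which a persistency or interactive layer can consult without compile-time knowledge of the class. The Dictionary/ClassInfo/MemberInfo classes are invented for illustration; the real dictionary is generated from C++ headers with the help of gcc-xml, listed among the leveraged tools on slide 8.

```cpp
// Hypothetical sketch: a toy "object dictionary" illustrating what the SEAL
// dictionary offers to POOL and interactive tools; these are not the real
// SEAL classes.
#include <cstddef>
#include <string>
#include <vector>
#include <map>
#include <iostream>

struct MemberInfo { std::string name, type; std::size_t offset; };

struct ClassInfo {
  std::string name;
  std::vector<MemberInfo> members;
};

class Dictionary {                          // in reality filled from generated code
public:
  void add(const ClassInfo& c) { classes_[c.name] = c; }
  const ClassInfo* find(const std::string& n) const {
    auto it = classes_.find(n);
    return it != classes_.end() ? &it->second : nullptr;
  }
private:
  std::map<std::string, ClassInfo> classes_;
};

struct Track { double pt; int charge; };

int main() {
  Dictionary dict;
  dict.add({"Track", {{"pt", "double", offsetof(Track, pt)},
                      {"charge", "int", offsetof(Track, charge)}}});

  // A persistency or interactive layer can now inspect Track at run time
  // without compile-time knowledge of the class.
  if (const ClassInfo* ci = dict.find("Track"))
    for (const auto& m : ci->members)
      std::cout << ci->name << "::" << m.name << " : " << m.type
                << " @ offset " << m.offset << "\n";
}
```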

  31. SEAL Schedule [Schedule chart: releases on 14 & 26/02/03 and on 04/04/03]

  32. Physicist Interface (PI) Project Vincenzo Innocente • Analysis Services – active • AIDA and its interfaces to ROOT, POOL, SEAL • Improvements to the AIDA interface to histograms and tuples implemented and offered to users for evaluation • Evaluation in progress • The changes make AIDA more amenable to a ROOT implementation, which is proceeding • Will be the supported LCG implementation of AIDA • Analysis Environment – partly on hold by SC2 • Interactivity, visualization, bridge to/from ROOT • Interactivity and the ROOT bridge are joint work with SEAL • POOL & Grid PI – on hold by SC2 • Event & Detector Visualization – on hold by SC2
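
Since the AIDA interfaces are the user-facing piece here, a short example may help. The sketch below books and fills a histogram through the standard AIDA 3 C++ interfaces; the bootstrap function and the store type string "root" are assumptions about how a ROOT-backed implementation (such as the one PI is working toward) would be driven, and the exact setup may differ by implementation.

```cpp
// Minimal AIDA 3 usage sketch (assumptions noted in comments).
#include "AIDA/IAnalysisFactory.h"
#include "AIDA/ITreeFactory.h"
#include "AIDA/ITree.h"
#include "AIDA/IHistogramFactory.h"
#include "AIDA/IHistogram1D.h"

int main() {
  // Assumed entry point of the AIDA 3 C++ binding.
  AIDA::IAnalysisFactory* af = AIDA_createAnalysisFactory();
  AIDA::ITreeFactory* tf = af->createTreeFactory();

  // A tree backed by a file store; the "root" store type assumes a
  // ROOT-based implementation is available.
  AIDA::ITree* tree = tf->create("histos.root", "root", false, true);
  AIDA::IHistogramFactory* hf = af->createHistogramFactory(*tree);

  // Book a 1D histogram: path "10", 100 bins from 0 to 50.
  AIDA::IHistogram1D* h = hf->createHistogram1D("10", "pT of tracks", 100, 0., 50.);
  for (int i = 0; i < 1000; ++i)
    h->fill(0.05 * i);              // fill with some dummy values

  tree->commit();                   // write the histograms to the store
  tree->close();
  delete af;
  return 0;
}
```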

  33. Simulation Project Torre Wenaus et al • Activity is ramping up following a work plan approved by the SC2 in March • Principal development activity will be a generic simulation framework • Expect to build on existing ALICE work • Incorporates CERN/LHC Geant4 work • FLUKA team participating for framework integration, physics validation • Simulation physics validation subproject very active already • Generator services subproject starting up under MC4LHC oversight • Shower parameterisation subproject not yet fleshed out
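
To illustrate what a generic simulation framework with pluggable engines means in practice, here is a hypothetical C++ sketch: user code talks to an abstract engine interface, and a Geant4 or FLUKA implementation is selected behind it. The class names are illustrative only and do not reflect the project's actual design, which is still to be decided at the 2003/6 milestone.

```cpp
// Hypothetical sketch of the "generic framework" idea; class names are
// invented for illustration and are not the project's design.
#include <memory>
#include <string>
#include <iostream>

class ISimulationEngine {
public:
  virtual ~ISimulationEngine() = default;
  virtual std::string name() const = 0;
  virtual void initialize() = 0;
  virtual void processEvent(int eventNumber) = 0;
};

class Geant4Engine : public ISimulationEngine {
public:
  std::string name() const override { return "Geant4"; }
  void initialize() override { std::cout << "Geant4 geometry/physics setup\n"; }
  void processEvent(int n) override { std::cout << "Geant4 transports event " << n << "\n"; }
};

class FlukaEngine : public ISimulationEngine {
public:
  std::string name() const override { return "FLUKA"; }
  void initialize() override { std::cout << "FLUKA setup\n"; }
  void processEvent(int n) override { std::cout << "FLUKA transports event " << n << "\n"; }
};

// Engine choice would in reality come from job configuration.
std::unique_ptr<ISimulationEngine> makeEngine(const std::string& which) {
  if (which == "FLUKA") return std::make_unique<FlukaEngine>();
  return std::make_unique<Geant4Engine>();
}

int main() {
  auto engine = makeEngine("Geant4");
  engine->initialize();
  for (int i = 0; i < 3; ++i) engine->processEvent(i);
}
```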

  34. Project Organization [Organisation chart: the Simulation Project Leader oversees the subprojects – generic Framework, Geant4, FLUKA integration, Physics Validation, Shower Parameterisation and Generator Services – each broken into work packages; the project interfaces with the Geant4 project, the FLUKA project, experiment validation, and MC4LHC]

  35. Simulation Subprojects • Generic simulation framework • Subproject leader: Andrea Dell’Acqua • Geant4 • Subproject leader: John Apostolakis • FLUKA integration • Subproject leader: Alfredo Ferrari • Physics validation • Subproject leader: Fabiola Gianotti • Generator services • Subproject leader: Paolo Bartalini

  36. Simulation Project High Level Milestones • 2003/6: Decide generic framework high level design, implementation approach, software to be reused • 2003/6: Generator librarian and alpha version of support infrastructure in place • 2003/7: Simulation physics requirements revisited • 2003/8: Detector description proposal to SC2 • 2003/9: 1st cycle of EM physics validation complete • 2003/12: Generic simulation framework prototype available with G4 and FLUKA engines • 2004/1: 1st cycle of hadronic physics validation complete • 2004/3: Simulation test and benchmark suite available • 2004/9: First generic simulation framework production release • 2004/12: Final physics validation document complete

  37. Software Process and Infrastructure (SPI) Project Alberto Aimar • All tools and services in place, most in use, some (the nightly build system) still being deployed • Important contributions from the experiments (SCRAM, Oval, NICOS, …) • Policies on code standards and organization, testing, documentation almost complete • Good QA activity, applied mainly to POOL so far • Currently navigating through personnel transitions • Planned transitions of personnel to other projects • Generally with a continuing ‘maintenance’ level of SPI participation • Two new LCG full-timers added in Dec/Jan, another expected in June, plus two new participants from EP/SFT • Manpower level OK; we just have to sustain it at an adequate level • Plan and personnel time profile expected at the end of this month • The Savannah portal is a great success, with 54 projects and 275 users at present • Used by LCG-App, -GD, -Fabric, 3 experiments, CLHEP

  38. Savannah

  39. Documentation and training • All projects instrumented with code referencing tools (LXR, Doxygen, ViewCVS), but work is needed on written documentation • June releases targeted for complete ‘user’ documentation • Documentation requirements being formalized, with SPI providing guidelines and templates • Documentation requirements as prerequisites to a release • Doing the same with testing • Active training program • Very successful ROOT course; a second has now been scheduled • SCRAM course ready, public course soon • POOL, SEAL courses will follow, after the June major releases • Starting to record courses for web presentation (Syncomat) • Apps area document registry established
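
As an example of what the code referencing tools pick up automatically, the snippet below shows Doxygen-style comments on a small, purely hypothetical class; Doxygen generates browsable reference pages from such comments, while the written user documentation mentioned above still has to be produced separately.

```cpp
/// Doxygen picks up "///" comments like these and generates reference pages
/// browsed alongside LXR/ViewCVS. The class itself is a made-up example.

/// A calibration constant for one detector channel.
class ChannelCalibration {
public:
  /// Construct from a channel identifier and a gain value.
  /// \param channelId  hardware channel number
  /// \param gain       multiplicative gain correction
  ChannelCalibration(int channelId, double gain);

  /// Apply the calibration to a raw ADC count.
  /// \return the calibrated value in arbitrary units (illustrative only)
  double calibrate(double rawAdc) const;

private:
  int    m_channelId;  ///< hardware channel number
  double m_gain;       ///< gain correction factor
};
```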

  40. External Participation • Examples: • POOL collections (US) • POOL RDBMS data storage back end (India) • POOL tests (UK) • POOL-driven ROOT I/O development & debugging (US) • SEAL basic framework services (France) • SPI tools – Oval, NICOS (France, US) • Math libraries (India) • Opportunities: • Throughout the simulation project • Several PI work packages • Tried to engage an external group in one; didn’t work • Unit and integration tests • E.g. POOL storage manager, persistency manager, data service • End-user examples

  41. External Participation • Engaging external participation well is hard, but we are working at it • Problems exist on both sides • Being remote is difficult • More than it needs to be, e.g. the VRVS physical facilities issue – improving this month, but further improvements are needed and are much too difficult to arrange • Remote resources can be harder to control and fully leverage, and may be less reliably available • We work around it and live with it, because we must support and encourage remote participation

  42. Collaborations • Examples… • Apart from the obvious (the experiments, ROOT)… • GD: Requirements from apps for LCG-1; Savannah • Fabrics: POOL and SPI hardware, Savannah-Castor • GTA: Grid file access, … • Grid projects: EDG-RLS, EDG testbed contribution, software packaging/distribution • Geant4 • FLUKA • CLHEP hosting

  43. Take-up in the experiments • The real measure of success or failure… • Experiments now actively engaged in evaluation, integration, and planning deployment • ATLAS integrating POOL and SEAL into experiment framework and infrastructure • No significant problems so far, but not yet exercising POOL/SEAL functionality strenuously • CMS evaluated POOL and decided this month to target it for early (July) deployment in simulation production (‘PCP’ leading up to DC04) • LHCb will begin this summer • Beginning to define project milestones measuring take-up, tied to these experiment programs • As mentioned, will assign project people to assist

  44. Coherence and Focus • We are developing the coherent, component based architecture we are charged to build • Coherence evident in cross-project (POOL, SEAL, PI) collaboration on defining, developing, deploying components • Apart from “ROOT already does it all” arguments, apps area development efforts are not redundant and duplicative • Approach/design is chosen, in some cases after short explorations of the options, and set as the basis for work • Efforts that are divergent, redundant, or under consideration are isolated (in contrib) and not accounted as project effort (and are few in number)

  45. Concluding Remarks • POOL, SEAL and some PI software is now out there • Take-up is starting • On target for the major June POOL/SEAL releases • The L1 milestone set a year ago should be met • Manpower is appropriate • It is at the level the experiments themselves estimated to be required • It is being used effectively and efficiently for the common good • It is delivering what we are mandated to deliver • The apps area is impatient to have grid-related work in scope
