U.S. ATLAS Computing

Presentation Transcript


  1. U.S. ATLAS Computing • Overview • Status of ATLAS computing • U.S. ATLAS • Project Management Organization • Status of efforts • Core software • Subsystems • Facilities • Schedule • FY 00 funding request • Summary

  2. Scale of Computing Effort • Rough scaling factors of x5 to x1,000 in relevant parameters relative to Tevatron experiments • Manpower x5 • CPU/event x1,000 (event complexity) • Data volume x10 to x100 (channel count) • Distribution of data x10 • U.S. effort comparable in scale to a Tevatron experiment • Effort ~$15M/year
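
To make the scaling concrete, a toy calculation along these lines is sketched below; the Tevatron-era baseline numbers are placeholders chosen for illustration, not figures from this talk.

```cpp
// Illustrative only: applies the scale factors quoted on the slide to
// placeholder Tevatron-era baselines (the baselines are assumptions).
#include <cstdio>

int main() {
    const double tevatronCpuPerEvent  = 1.0;   // arbitrary units
    const double tevatronDataVolumeTB = 30.0;  // placeholder baseline
    const double tevatronManpowerFTE  = 20.0;  // placeholder baseline

    const double cpuScale      = 1e3;   // event complexity
    const double dataScaleLow  = 10.0;  // channel count, lower bound
    const double dataScaleHigh = 1e2;   // channel count, upper bound
    const double manpowerScale = 5.0;

    std::printf("CPU/event:   x%.0f -> %.0f units\n",
                cpuScale, tevatronCpuPerEvent * cpuScale);
    std::printf("Data volume: %.0f-%.0f TB\n",
                tevatronDataVolumeTB * dataScaleLow,
                tevatronDataVolumeTB * dataScaleHigh);
    std::printf("Manpower:    %.0f FTE\n",
                tevatronManpowerFTE * manpowerScale);
    return 0;
}
```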

  3. Scales from experience

  4. Goals for the next year • Project organization • Management • Identify areas of responsibility • Integration of efforts into ATLAS • Inception/development of software • U.S. support facilities • Planning/development of infrastructure • Prepare for reviews

  5. International ATLAS • New Computing Coordinator • Norman McCubbin (RAL) • Available full time from November • Approval vote - ATLAS CB June 10th • Responsibility: core software • New Physics Coordinator • Fabiola Gianotti (CERN) • Approval vote - ATLAS CB June 10th • Detector-specific sim/reconstruction • Organized within subsystems

  6. Architecture Taskforce • Software partitioned into work packages • Katsuya Amako, KEK • Laurent Chevalier, CEA • Andrea Dell’Acqua, CERN • Fabiola Gianotti, CERN • Steve Haywood, RAL (Chair) • Jurgen Knobloch, CERN • Norman McCubbin, RAL • David Quarrie, LBL • R.D. Schaffer, LAL • Marjorie Shapiro, LBNL • Valerio Vercesi, Pavia

  7. Architecture T.F. Status • Three meetings so far • Directions: • Language: C++ (allowing for later migration to other languages, e.g. Java) • Examine GAUDI (LHCb) architecture • Adoption of “use cases” • Goals for October • Outline of architecture design • Appointment of Chief Architect • Commission work on prototyping of parts of the design • Create use cases, requirements document • Define packages and relations (package diagram)
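
As a point of reference for what examining GAUDI implies, here is a minimal sketch of a GAUDI-style algorithm interface in C++; the class and method names are illustrative stand-ins, not the actual GAUDI headers or the ATLAS design.

```cpp
// Sketch of a GAUDI-style algorithm interface (illustrative; the real GAUDI
// base classes differ in detail).  Algorithms are steered by the framework
// through a small, fixed set of entry points.
#include <string>

class StatusCode {
public:
    enum Code { SUCCESS, FAILURE };
    StatusCode(Code c = SUCCESS) : m_code(c) {}
    bool isSuccess() const { return m_code == SUCCESS; }
private:
    Code m_code;
};

class Algorithm {                        // framework-owned base class
public:
    explicit Algorithm(const std::string& name) : m_name(name) {}
    virtual ~Algorithm() {}
    virtual StatusCode initialize() = 0; // called once, before the event loop
    virtual StatusCode execute()    = 0; // called once per event
    virtual StatusCode finalize()   = 0; // called once, after the event loop
    const std::string& name() const { return m_name; }
private:
    std::string m_name;
};

// A user algorithm only fills in the three hooks; the framework decides when
// they run, which keeps physics code independent of the event loop.
class TrackFitAlg : public Algorithm {
public:
    TrackFitAlg() : Algorithm("TrackFitAlg") {}
    StatusCode initialize() { return StatusCode::SUCCESS; }
    StatusCode execute()    { /* fetch hits, fit tracks, record results */ return StatusCode::SUCCESS; }
    StatusCode finalize()   { return StatusCode::SUCCESS; }
};
```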

  8. Stages in Software Management

  9. Quality Control • Recommend software performance specifications, review process • Makoto Asai, Hiroshima • Dario Barberis, Genoa • Martine Bosman, Barcelona • Bob Jones, CERN • Jean-Francois LaPorte, CEA • Helge Meinhard, CERN • Maya Stavrianakou, CERN

  10. Action on other Groups • National Board • Supported platforms • Regional centers • Training • Network of national contacts for training • C++, OO programming • GEANT 4 • ATLAS Specific

  11. ATLAS/CERN Schedule ‘00 • Sept. ‘99 • Start of Cashmore/Hoffman review • Oct. ‘99 • Report of architecture T.F. • Commissioning of prototyping code • Jan ‘00 • Start preparations for bilateral agreements • Fall ‘00 • Report of Cashmore/Hoffman review • MOU preparations

  12. U.S. Participation • Frank Paige - Co-convenor of SUSY working group • David Malon - Co-leader of database group • Craig Tull - Architecture Group • Ian Hinchliffe - Leader of Event Generator group • David Quarrie, Marjorie Shapiro - Architecture Task Force • John Parsons - Co-convenor of Top working group • Misha Leltchouk - LAr simulation coordinator • Michael Shupe - Convenor of Background working group • Fred Luehring - TRT software coordinator • Steve Goldfarb - Muon Database Coordinator • Tom LeCompte - Tilecal Database Coordinator • Krzys Sliwa - Chair of ATLAS World-wide computing group • Frank Merritt - Training contact, Tilecal Reconstruction coordinator • Bruce Gibbard - Regional center contact • John Huth - National Board contact

  13. U.S. ATLAS Computing • NSF, DOE: LHC computing activities are now “projectized” • Implications for U.S. ATLAS: • Direct reporting lines through Project Manager (Bill Willis) and BNL Directorate (Tom Kirk) • Appointment of Associate Project Manager for Computing and Physics (John Huth) • Implications for Agencies: • Must clarify reporting lines, operations

  14. Proposed Management

  15. Management Structure • Reflects flow of deliverables to and from ATLAS • Appointments (2-year renewable terms) • Physics: Ian Hinchliffe (LBNL) • Facilities: Bruce Gibbard (BNL) + deputy • Issues • Software manager • Availability within U.S. ATLAS - hire? • Flatter structure for the time being? • Bring on soon? Most desirable!

  16. Management Plan • Associate Project Manager • Member of E.C. • Develop and execute project plan • Establish and maintain project organization + tracking • Develop annual budget requests • Liaison to ATLAS Computing Management • Appoint L2 managers • Review and approve MOUs to CERN and institutes • Exercise change control authority • Establish advisory committees where appropriate • Provide reports and organize reviews

  17. Implications for APM • APM is a time-consuming job • Actions for John Huth: • Relief from MDT electronics coordinator role (Jay Chapman, U. Michigan) • Relief from U.S. ATLAS Inst. Bd. Chair (Jim Siegrist, LBNL) • Teaching relief - spring terms ‘00 and ‘01 • Granted by Harvard University

  18. Level 2 Managers • Appointed by APM, with concurrence of Exec. Comm. • Members of E.C. (+ APM, + deputy) • Two-year renewable terms • Generic responsibilities • Develop definition of milestones and deliverables • Define, with APM, organizational substructure of level 2 • Develop, with APM, annual budget proposals • Identify resource imbalances within subprojects and recommend adjustments • Deliver scope of subproject on time and within budget • Maintain cost and schedule • Provide reports to APM, PM • Liaison with counterparts at CERN

  19. Specific Responsibilities • Physics Manager • Generators, physics objects, benchmark studies, mock data challenge • Software • Core • Detector specific sim/recon • Training • Facilities • Tier 1,2, networking, support

  20. Project Engineer • Same role as the project engineer for the construction project • Tracking • Reviews, oversight • Reporting • Technical input

  21. Proposed Names • Physics: Ian Hinchliffe (LBNL) • Contacted, accepted • Policy question: physicists on project • Software: search underway • Facilities: Bruce Gibbard (+deputy) • Deputy: Jim Shank

  22. Facilities Manager • Bruce Gibbard (BNL) • Proposal to add U.S. ATLAS Deputy • U.S. ATLAS and RHIC Deputies • General agreement with Bruce, Tom Kirk • Begin to fill in other areas • Networking • Tier 1 • Remote sites • Support

  23. Detector Contacts • LAr - Srini Rajagopalan (BNL) • Tilecal - Frank Merritt (U. Chicago) • ID - Laurent Vacavant (LBNL) • Muon - Bing Zhou (U. Michigan) • TRT - Keith Baker (Hampton) • Trigger/DAQ - Andy Lankford (UCI)

  24. Planning Activities • Writing/preparation assignments for review - next week • L2 managers where appropriate (i.e. facilities, physics) • Core • Sim/recon • Training • Management (PMP for computing) • MRE/IT “team” • Review in October

  25. WBS Structure • Should be flexible while project definition is underway (level 3 and beyond) • Level 2s should be fixed now • Commensurate with management structure • Adopt lead number “2”

  26. High Levels of WBS • Draft WBS • 2.1 Physics • Generators, benchmarks, mock data challenges, physics objects • 2.2 Software • 2.2.1 Core • Control/Framework, database, event model, analysis tools • 2.2.2 Detector-specific simulation and recon. • 2.2.3 Collaborative tools • 2.2.4 Training • 2.3 Facilities • Regional center, remote sites, networking, support

  27. Near Term Activities/Issues • U.S. ATLAS Web-site • Weekly video conferences • Support role of BNL • Gathering FY 00 requests • Advisory group appointment • Writing assignments for proposal • NSF MRE/IT proposal - Tier 2 centers • Discussions of deliverables with ATLAS • Interactions with agencies • JOG, Computing review

  28. Software • Core Software • Control/Framework (Tull) • Database, Tilecal Pilot Project (Malon) • Event Model (Rajagopalan) • Detector-specific sim/reconstruction • Representatives from subsystems chosen • Training (Merritt) • Establishment of OO courses (BNL, U. Chicago)

  29. General Requirements • Software must last over the lifetime of the experiment, yet track language changes • Well-defined interface layers • Maintainability and engineering are critical • Large number of users; use of software professionals • Adaptability to distributed environments • Learn from experiments working on OO (BaBar, D0, CDF, STAR)

  30. Database • David Malon (ANL) • Tilecal pilot project • Tilecal testbeam data in object database • Testbed for ATLAS technologies and strategies • Early feedback to developers • Generalized to other subsystems • Database core software • Transient and persistent object mapping • Definition of database/control interface • Specifications • Examine alternatives to Objectivity
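
As an illustration of the transient/persistent split mentioned above, a minimal sketch follows; the type names and the in-memory backend are hypothetical, and no Objectivity-specific code is shown.

```cpp
// Illustrative converter pattern for transient <-> persistent mapping
// (names are hypothetical; no Objectivity-specific code appears here).
#include <map>
#include <string>

struct TileCell {                 // transient object used by reconstruction
    int    id;
    double energy;
};

// Narrow interface behind which the storage technology is hidden, so an
// object database could later be swapped for an alternative without
// touching client code.
class IPersistencySvc {
public:
    virtual ~IPersistencySvc() {}
    virtual void     write(const std::string& key, const TileCell& cell) = 0;
    virtual TileCell read (const std::string& key) const                 = 0;
};

// Trivial in-memory stand-in for a real object database backend.
class MapPersistencySvc : public IPersistencySvc {
public:
    void write(const std::string& key, const TileCell& cell) { m_store[key] = cell; }
    TileCell read(const std::string& key) const {
        std::map<std::string, TileCell>::const_iterator it = m_store.find(key);
        return it != m_store.end() ? it->second : TileCell();
    }
private:
    std::map<std::string, TileCell> m_store;
};
```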

  31. Tilecal Model

  32. Database Schedule

  33. Database Milestones • Jan ‘00 • Survey of mapping strategies • Feb. ‘00 • Infrastructure for developers deployed • April ‘00 • Validation of strategies in testbed • July ‘00 • Database management infrastructure defined • Oct ‘00 • Infrastructure for distributed access deployed • Jan ‘01 • Scalability test • Oct ‘01 • Beta release

  34. Control/Framework • Craig Tull (LBNL) • Working on user requirements document (w/ Hinchliffe, Shapiro, Vacavant) • Market survey of framework systems • Object component model • AC++ • Compatibility with ATLAS architecture • Resource-loaded work plan exists • Work with A.T.F. on design requirements • Already have tested prototype designs

  35. Framework Milestones

  36. Framework Schedule

  37. One Framework Model • GUI Interface (e.g. Tk, …) • Scripting Interface (e.g. Tcl, …) • Command Marshalling (e.g. SWIG, …) • Software Bus (e.g. CORBA) • Component Class Adapters • Component C++ Classes/Objects
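
To make the layering above concrete, here is a minimal sketch of the adapter/command-marshalling idea in C++; the component and its commands are hypothetical examples, not part of any ATLAS design.

```cpp
// A plain C++ component wrapped by a thin adapter that accepts string
// commands -- the sort of interface a scripting layer or software bus
// could drive without compile-time knowledge of the component.
#include <iostream>
#include <string>

class HistogramComponent {                 // ordinary C++ component
public:
    void book(const std::string& title) { std::cout << "booking " << title << "\n"; }
    void reset()                         { std::cout << "resetting\n"; }
};

class HistogramAdapter {                   // command-marshalling layer
public:
    explicit HistogramAdapter(HistogramComponent& c) : m_comp(c) {}
    // A scripting interface only needs to pass strings across the bus.
    bool invoke(const std::string& command, const std::string& arg) {
        if (command == "book")  { m_comp.book(arg); return true; }
        if (command == "reset") { m_comp.reset();   return true; }
        return false;                      // unknown command
    }
private:
    HistogramComponent& m_comp;
};

int main() {
    HistogramComponent histos;
    HistogramAdapter   bus(histos);
    bus.invoke("book", "track pT");        // e.g. issued from a Tcl prompt
    bus.invoke("reset", "");
    return 0;
}
```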

  38. Event Model • Type of objects that are stored • Client’s view of the event • Physical organization of the event • Client’s interface to the event • Mechanism to navigate between objects • Transient to Persistent Object mapping
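
A minimal sketch of what such an event model could look like, assuming keyed collections and index-based navigation; the class names and layout are illustrative, not the ATLAS design.

```cpp
// Minimal event-model sketch: clients see the event as collections, and
// navigation between objects goes through indices rather than raw pointers,
// so the same links survive a transient -> persistent round trip.
#include <cstddef>
#include <vector>

struct Hit   { double x, y, z; };
struct Track {
    std::vector<std::size_t> hitIndices;   // navigation by index, not pointer
};

class Event {
public:
    std::vector<Hit>&   hits()   { return m_hits; }
    std::vector<Track>& tracks() { return m_tracks; }

    // Client's view of navigation: resolve a track's hits on demand.
    const Hit& hitOfTrack(const Track& t, std::size_t i) const {
        return m_hits[t.hitIndices[i]];
    }
private:
    std::vector<Hit>   m_hits;
    std::vector<Track> m_tracks;
};
```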

  39. Event Model Milestones • Nov. 99 • Review of models in other exp’s • Requirements documents • Dec. 99 • Resource loaded schedule • Mar. 00 • Baseline design • June 00 • Alpha release • Aug. 00 • Review experience • FY 01 • Beta release

  40. Framework Options

  41. Some Detector Activities • TRT/ID • Put full TRT simulation into GEANT4 • LAr • Coil, cryostats in GEANT4 (Nevis) • Accordion structure in GEANT4 (BNL) • Tilecal • Pilot project
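
For orientation, a minimal Geant4 detector-construction sketch is shown below; the geometry and material are placeholders (not the TRT or LAr description), and the header names follow current Geant4 conventions, which may differ from the 1999 release.

```cpp
// Minimal Geant4 detector-construction sketch (dimensions and material are
// placeholders, not a real ATLAS geometry).
#include "G4VUserDetectorConstruction.hh"
#include "G4Box.hh"
#include "G4LogicalVolume.hh"
#include "G4PVPlacement.hh"
#include "G4Material.hh"
#include "G4ThreeVector.hh"
#include "G4SystemOfUnits.hh"

class ToyDetectorConstruction : public G4VUserDetectorConstruction {
public:
    G4VPhysicalVolume* Construct() {
        // World volume filled with argon gas (placeholder material).
        G4Material* argon =
            new G4Material("ArgonGas", 18., 39.95*g/mole, 1.782*mg/cm3);

        G4Box*           worldBox = new G4Box("World", 2.*m, 2.*m, 2.*m);
        G4LogicalVolume* worldLog =
            new G4LogicalVolume(worldBox, argon, "World");

        // Place the world volume; detector volumes would be placed inside it.
        return new G4PVPlacement(0, G4ThreeVector(), worldLog,
                                 "World", 0, false, 0);
    }
};
```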

  42. Some Detector Activities • Muon • Study of noise in Higgs -> 4 muons • Combined performance of ID + muon system (A reconstruction) • CSC into simulation • Trigger/DAQ • Comparison of switching architectures • Background studies • Optimization of shielding (100 MeV muon background)

  43. Training • New paradigm of OO programming • Training courses (F. Merritt) • Course offered at BNL (near future) • Course offered at Chicago • Successful programs seen at other experiments (CDF, D0, BaBar) • Ongoing need for training throughout course of experiment • Documentation • ATLAS-specific

  44. Software Development • Asymptotic level - est. 10 software professionals • Peak load (circa 2003) - est. 16 S.P.s • Extrapolations based on existing experiments, proposed areas of responsibility, and the fraction of U.S. participation • Choice of technology can influence actual needs strongly (e.g. BaBar, STAR in database)

  45. Planned Training • All by Object Mentor (BaBar, others) • Organized by Frank Merritt • Courses approximately 1 week long • Aug. 9 - BNL - OO Design - 13 people • Sept. 20 - U.C. - OO Design - 15 people • Oct. 18 - ANL or BNL - Advanced OO - 10 people • Nov. 8 - FNAL - GEANT 4 - 14 people

  46. Facilities • BNL ramping up support facility • Taps into RHIC Computing Facility • Major issue of Tier 1/2 facilities • Scale of “Tier 2’s” • Size for support staff, infrastructure • Computing model for U.S. (e.g. grids) • Being addressed in NSF MRE/IT proposal • In the process of developing policy on usage, support of platforms at institutions

  47. Facilities • Tier 1 (Regional Center) • BNL • Leverages RCF • ATLAS-specific needs, however • Primary support function for U.S. • Code release, support • Major processing, event store • Personnel scale estimate: • Roughly linear ramp from 2 FTEs (now) to 22 or more (depending on computing model)

  48. MONARC • Models of Networked Analysis at Regional Centres (ATLAS + CMS) • Alexander Nazarenko (Tufts hire) • Tasks: • Validate simulation models • Perform first simulations of LHC architectures • After Dec. ‘99, focus on planning for regional centers • Model validation - end of September • Understanding of U.S. computing facilities
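
In the spirit of this regional-centre modelling, a toy estimate is sketched below; every number in it is an assumption chosen for illustration, not a MONARC result.

```cpp
// Toy estimate: how long it takes to ship an event sample from the Tier 1
// to a remote site over a shared WAN link.  All numbers are assumptions.
#include <cstdio>

int main() {
    const double eventsToShip   = 1.0e6;   // size of the analysis sample
    const double kBPerEvent     = 100.0;   // assumed event size, kB
    const double linkMbps       = 155.0;   // assumed OC-3 class link
    const double usableFraction = 0.4;     // share of the link actually usable

    const double totalMb = eventsToShip * kBPerEvent * 8.0 / 1000.0; // megabits
    const double seconds = totalMb / (linkMbps * usableFraction);
    std::printf("Transfer time: %.1f hours\n", seconds / 3600.0);
    return 0;
}
```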

  49. NSF MRE/IT Proposal • Tier 2 centers • Approx. 5 total • 256 node systems • 100 TB tape system • Low maintenance • Linked by computing grid • Computing professionals • Dual role - user/developers

  50. Review Process • Send proposals to Advisory Group (Aug. 4th) • Charge: • Identify areas of overlap/commonality • Suggest simplifications/savings • Coherence with ATLAS effort • Prioritize • Meet with agencies to establish scale for FY 00, initial request (now) • Confer with Advisory Group on feedback • Prepare and review documentation for Dec. review (mid-late Sept.)
