
Presentation Transcript


  1. Opening and Overview
     Paul Avery, University of Florida
     http://www.phys.ufl.edu/~avery/
     avery@phys.ufl.edu
     GriPhyN External Advisory Meeting, Marina del Rey, April 12, 2001

  2. Who We Are
     U Florida, U Chicago, Boston U, Caltech, U Wisconsin (Madison), USC/ISI, Harvard, Indiana, Johns Hopkins, Northwestern, Stanford, U Illinois at Chicago, U Penn, U Texas (Brownsville), U Wisconsin (Milwaukee), UC Berkeley, UC San Diego, San Diego Supercomputer Center, Lawrence Berkeley Lab, Argonne, Fermilab, Brookhaven

  3. GriPhyN = App. Science + CS + Grids
     • GriPhyN = Grid Physics Network
       • US-CMS (high energy physics)
       • US-ATLAS (high energy physics)
       • LIGO/LSC (gravitational wave research)
       • SDSS (Sloan Digital Sky Survey)
     • Strong partnership with computer scientists
     • Design and implement production-scale grids
       • Investigation of the "Virtual Data" concept (see the sketch after this slide)
       • Integration into 4 major science experiments
       • Develop common infrastructure, tools and services
       • Builds on existing foundations: Globus tools
     • Multi-year project
       • Grid R&D
       • Development and deployment of "Tier 2" hardware and personnel
       • Education & outreach
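
     The "Virtual Data" idea is that a derived data product is described by its derivation (its transformation and inputs) and can either be fetched from an existing replica or regenerated on demand. The following is a minimal, hypothetical Python sketch of that idea only; the class and method names are invented for illustration and are not the interfaces of the GriPhyN Virtual Data Toolkit.

         # Hypothetical illustration of the "virtual data" concept: a derived
         # product is either found in a replica store or recomputed from its
         # recorded derivation. Not the GriPhyN Virtual Data Toolkit API.
         from dataclasses import dataclass, field
         from typing import Callable, Dict, List

         @dataclass
         class Derivation:
             transformation: Callable[..., bytes]   # program that produces the data
             inputs: List[str]                      # logical names of input datasets

         @dataclass
         class VirtualDataCatalog:
             derivations: Dict[str, Derivation] = field(default_factory=dict)
             replicas: Dict[str, bytes] = field(default_factory=dict)  # materialized data

             def materialize(self, name: str) -> bytes:
                 """Return a dataset: reuse a replica if one exists,
                 otherwise recompute it from its recorded derivation."""
                 if name in self.replicas:
                     return self.replicas[name]
                 d = self.derivations[name]
                 data = d.transformation(*(self.materialize(i) for i in d.inputs))
                 self.replicas[name] = data   # cache the regenerated product
                 return data

         # Usage: register a raw dataset and a derived one, then request the derived product.
         vdc = VirtualDataCatalog()
         vdc.replicas["raw_events"] = b"raw detector data"
         vdc.derivations["reco_events"] = Derivation(
             transformation=lambda raw: b"reconstructed:" + raw,   # stand-in for a reconstruction job
             inputs=["raw_events"],
         )
         print(vdc.materialize("reco_events"))

     The point of the sketch is only the lookup-or-recompute decision; in a real data grid that decision would also weigh storage, network and compute costs across sites.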

  4. GriPhyN Data Grid Challenge
     "Global scientific communities, served by networks with bandwidths varying by orders of magnitude, need to perform computationally demanding analyses of geographically distributed datasets that will grow by at least 3 orders of magnitude over the next decade, from the 100 Terabyte to the 100 Petabyte scale."
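
     A back-of-the-envelope check (not from the slide itself) makes the quoted growth concrete: a factor of 1000 over ten years corresponds to

         (100 PB / 100 TB)^(1/10) = 1000^(1/10) = 10^(0.3) ≈ 2,

     i.e., the datasets must roughly double every year to move from the 100 Terabyte to the 100 Petabyte scale within a decade.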

  5. Data Grid Hierarchy
     [Figure: tiered site hierarchy, Tier 0 through Tier 4]
     • Tier 0: CERN
     • Tier 1: National laboratory
     • Tier 2: Regional center at a university
     • Tier 3: University workgroup
     • Tier 4: Workstation
     • GriPhyN: R&D, Tier 2 centers, unifying all IT resources

  6. LHC Global Grid Hierarchy
     [Figure: LHC experiment data flow through the tier hierarchy]
     • Experiment: bunch crossings every 25 nsec, ~100 triggers per second, each event ~1 MByte; ~PBytes/sec off the detector
     • Online System → CERN Computer Center (Tier 0+1, >20 TIPS, HPSS mass storage): ~100 MBytes/sec
     • Tier 0 → Tier 1 national centers (France, Italy, UK, USA; each with HPSS): 2.5 Gbits/sec
     • Tier 1 → Tier 2 regional centers: 2.5 Gbits/sec
     • Tier 2 → Tier 3 institutes (~0.25 TIPS each, with a physics data cache): ~622 Mbits/sec
     • Tier 3 → Tier 4 workstations and other portals: 100–1000 Mbits/sec
     • Physicists work on analysis "channels"; each institute has ~10 physicists working on one or more channels
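
     The rates on this slide are consistent with one another. As a sanity check, the short Python sketch below (an illustration only, with variable names invented here and the figures taken from the slide) verifies the ~100 MBytes/sec Tier 0 input rate and estimates how long one day of data would take to replicate over the quoted 2.5 Gbits/sec Tier 1 link, ignoring protocol overhead.

         # Back-of-the-envelope check of the data-flow figures quoted on this slide.
         # Rates come from the slide; variable names are invented for illustration.

         trigger_rate_hz = 100        # ~100 triggers per second
         event_size_mbyte = 1.0       # each event is ~1 MByte

         tier0_input = trigger_rate_hz * event_size_mbyte           # MBytes/sec
         print(f"Tier 0 input rate: {tier0_input:.0f} MBytes/sec")  # ~100, as on the slide

         daily_volume_tbyte = tier0_input * 86_400 / 1e6            # one day of running
         print(f"Daily volume: {daily_volume_tbyte:.2f} TBytes")    # ~8.6 TBytes/day

         # Shipping one day's data to a Tier 1 center over the quoted 2.5 Gbits/sec link:
         tier1_link = 2.5e3 / 8                                     # MBytes/sec
         hours = daily_volume_tbyte * 1e6 / tier1_link / 3600
         print(f"Transfer time over a Tier 1 link: {hours:.1f} hours")  # roughly 7-8 hours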

  7. GriPhyN I Funded (R&D)
     • NSF results announced Sep. 13, 2000
       • $11.9M from the NSF Information Technology Research program
       • $1.4M in matching funds from universities
       • Largest of all ITR awards
     • Scope of ITR funding
       • Major costs are for people, especially students and postdocs
       • 2/3 CS + 1/3 application science
       • Industry partnerships needed to realize full scope; still being pursued
     • Education and outreach
       • Reach non-traditional students and other constituencies
       • University partnerships
       • Grids are "natural" for integrating intellectual resources from all locations
       • E/O led by UT Brownsville (Romano, Campanelli)

  8. GriPhyN Management Needs
     • GriPhyN is a complex project
       • 17 universities, SDSC, 3 labs, >40 active participants
       • 4 physics experiments providing frontier challenges
     • GriPhyN I funded primarily as an IT research project (2/3 CS + 1/3 physics)
     • Need to balance and coordinate
       • Research creativity with project goals and deliverables
       • The GriPhyN schedule with the schedules of the 4 experiments
       • GriPhyN design and architecture with that of other projects whose work will be used by LHC or other experiments (PPDG, EU DataGrid)
       • GriPhyN deliverables with those of other data grid projects

  9. GriPhyN Management Organization
     • Project Leadership
       • Project Directors: Paul Avery, Ian Foster
       • Project Coordinator (active search)
     • Advisory Committees
       • Project Coordination Group (weekly meetings)
       • Collaboration Board (has not met yet)
       • External Advisory Board (1–2 times per year)
     • Coordinators
       • Industrial Programs
       • Outreach/Education
       • System Integration
     • NSF Review Committee

  10. [Figure: GriPhyN organization chart]
      • Project Directors: Paul Avery, Ian Foster
      • External Advisory Board; NSF Review Committee
      • Collaboration Board (chair: Paul Avery)
      • Project Coordination Group, with Project Coordinator and System Integration
      • Industrial Programs: Alex Szalay
      • Outreach/Education: Joseph Romano
      • External connections: Internet 2, NSF PACIs, DOE Science, other Grid projects, major physics experiments
      • Applications (coord.: H. Newman): ATLAS (Rob Gardner), CMS (Harvey Newman), LSC/LIGO (Bruce Allen), SDSS (Alexander Szalay)
      • CS Research (coord.: I. Foster): Execution Management (Miron Livny), Performance Analysis (Valerie Taylor), Request Planning & Scheduling (Carl Kesselman), Virtual Data (Reagan Moore)
      • Technical Coordination Committee (chair: J. Bunn): Networks (H. Newman, T. DeFanti), Databases (A. Szalay, M. Franklin), Visualization (T. DeFanti), Digital Libraries (R. Moore), Grids (C. Kesselman), Collaborative Systems (P. Galvez, R. Stevens)
      • VD Toolkit Development (coord.: M. Livny): Requirements Definition & Scheduling (Miron Livny), Integration & Testing (Carl Kesselman?), Documentation & Support (TBD)

  11. GriPhyN Management Organization (cont.)
      • Technical Organization
        • Computer Science Research
        • Virtual Data Toolkit Development
        • Application (physics experiment) Projects
      • Liaison with Experiments
        • Representatives on the Project Coordination Group
        • Subgroups within the Application Projects organization
        • Directors have direct contact with experiment computing leaders
      • Liaison with Other Data Grid Projects
        • Common participants with PPDG
        • Cross-committee memberships with the EU DataGrid
        • Data Grid coordination meetings: first was March 4 in Amsterdam; next is June 23 in Rome

  12. A Common Infrastructure Opportunity
      • Particle Physics Data Grid (US, DOE)
        • Data Grid applications for HENP
        • Funded 2000, 2001
        • http://www.ppdg.net/
      • GriPhyN (US, NSF)
        • Petascale Virtual-Data Grids
        • Funded 9/2000 – 9/2005
        • http://www.griphyn.org/
      • European Data Grid (EU)
        • Data Grid technologies, EU deployment
        • Funded 1/2001 – 1/2004
        • http://www.eu-datagrid.org/
      • In common: HEP applications, a focus on infrastructure development & deployment, and international scope

  13. Data Grid Project Collaboration
      • GriPhyN + PPDG + EU DataGrid + national efforts (France, Italy, UK, Japan)
      • Have agreed to collaborate and develop joint infrastructure
        • Initial meeting March 4 in Amsterdam to discuss issues
        • Future meetings in June and July
      • Preparing a management document
        • Joint management and technical boards plus a steering committee
        • Coordination of people and resources
        • An expectation that this will lead to real work
      • Collaborative projects
        • Grid middleware
        • Integration into applications
        • Grid testbed: iVDGL
        • Network testbed: T3 = Transatlantic Terabit Testbed

  14. iVDGL
      • International Virtual-Data Grid Laboratory
        • A place to conduct Data Grid tests at scale
        • A concrete manifestation of world-wide grid activity
        • A continuing activity that will drive Grid awareness
        • A basis for further funding
      • Scale of effort
        • National and international scale Data Grid tests and operations
        • Computationally and data-intensive computing
        • Fast networks
      • Who
        • Initially US, UK, EU; other world regions later
        • Discussions with Russia, Japan, China, Pakistan, India, South America

  15. Status of Data Grid Projects
      • GriPhyN
        • $12M funded by the NSF ITR 2000 program (5-year R&D)
        • 2001 supplemental funds requested for initial deployments
        • Submitting a 5-year proposal ($15M) to NSF to deploy iVDGL
      • Particle Physics Data Grid
        • Funded in 1999 and 2000 by DOE ($1.2M per year)
        • Submitting a 3-year proposal ($12M) to the DOE Office of Science
      • EU DataGrid
        • €10M funded by the EU (3 years, 2001 – 2004)
        • Submitting a proposal in April for additional funds
      • GridPP in the UK
        • Submitted proposal April 3 ($30M)
      • Japan, others?

  16. GriPhyN Activities Since Sept. 2000
      • All-hands meeting Oct. 2–3, 2000
      • Architecture meeting Dec. 20
      • Smaller meetings between CS and experiment groups
      • Preparation of requirements documents by the experiments
      • Architecture document(s)
        • Included in the architecture definition for the EU DataGrid
      • Mar. 4 meeting to discuss collaboration of Grid projects
      • All-hands meeting April 9, 2001
      • Hiring still proceeding (2/3 finished)
      • Submitting new proposal Apr. 25, 2001

  17. Discussion Points
      • Maintaining the right balance between research and development
      • Maintaining focus vs. accepting broader scope
        • E.g., international collaboration
        • E.g., GriPhyN in the large (GriPhyN II)
        • E.g., Terascale
      • Creating a national cyberinfrastructure
        • What is our appropriate role?

  18. Discussion Points (cont.)
      • Outreach to other disciplines
        • Biology, NEES, …
      • Outreach to other constituencies
        • Small universities, K-12, the public, international partners, …
      • Virtual Data Toolkit: inclusive or focused?
      • The resource issue, again
        • Achieving a critical mass of resources to deliver on the complete promise
