
Grid Computing

Explore global resource management in grid computing. Learn about clusters, LHC computing grid, collaborations among universities, and AustrianGrid project development.



Presentation Transcript


  1. Grid Computing. Reinhard Bischof, ECFA Meeting, March 26th 2004, Innsbruck

  2. Grid: From local to global resource management. [Architecture diagram] Two example clusters (among many others), each with a master node running a local resource manager (Condor on Cluster 1, PBS on Cluster 2) and PC worker nodes, plus a grid interface consisting of a gatekeeper, an information system, and GridFTP in front of local storage. A top-level information system answers "Which resources are available?" and a Replica Location Service answers "Where are the files I need?". Users and hosts are identified by certificates, e.g. /C=AT/O=UIBK/OU=HEPG/CN=Nemo for a user and /C=AT/O=UIBK/OU=HEPG/CN=host/grid.uibk.ac.at for a host. At each site a mapfile maps the certificate subject to a local account (here /C=AT/O=UIBK/OU=HEPG/CN=Nemo to the pool .atlas), so the job is executed e.g. as local user atlas003.
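The mapfile step described on this slide can be sketched as follows. The quoted-DN-plus-account line format follows the Globus grid-mapfile convention; the DN and the pooled account "atlas003" come from the slide, while the pool-expansion logic (".atlas" becoming atlas001, atlas002, ...) is an illustrative assumption, not the actual Globus implementation.

```python
# Minimal sketch of a grid-mapfile lookup: a certificate's subject DN
# is mapped to a local Unix account before a grid job is executed.

def parse_grid_mapfile(text):
    """Parse lines of the form: "subject DN" local_account"""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # The DN is quoted because it generally contains spaces.
        dn, _, account = line.rpartition('" ')
        mapping[dn.lstrip('"')] = account
    return mapping

def map_user(dn, mapping, pool_counter=3):
    """Return the local account for a DN; expand pooled accounts.

    Pool expansion (".atlas" -> "atlas003") is a simplified assumption.
    """
    account = mapping.get(dn)
    if account is None:
        raise PermissionError("DN not authorized: " + dn)
    if account.startswith("."):          # pooled account, e.g. ".atlas"
        return "%s%03d" % (account[1:], pool_counter)
    return account

mapfile = '"/C=AT/O=UIBK/OU=HEPG/CN=Nemo" .atlas\n'
mapping = parse_grid_mapfile(mapfile)
print(map_user("/C=AT/O=UIBK/OU=HEPG/CN=Nemo", mapping))  # -> atlas003
```

An unlisted DN raises an error rather than being mapped, which mirrors the gatekeeper's role as the authorization point of a site.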

  3. Atlas Data Challenge 2. Three grid flavours will be used:
  • Nordugrid: Sweden (15 sites), Denmark (10), Norway (3), Finland (3), Slovakia (1), Estonia (1). Nordugrid develops grid middleware and assists in software deployment; it is not dedicated to physics and not limited to the Nordic countries.
  • LHC Computing Grid (LCG): based on the middleware developed by the DataGrid project; resources in Europe, the USA, Taiwan and Japan. Goal: the first truly worldwide grid with 24h/7d operation (at least two Grid Operations Centres in two different time zones).
  • Grid 2003 (US project): 25 sites across the USA and South Korea.
  The High Energy Physics group in Innsbruck participates in LCG as a Tier-2 site.

  4. LCG. [Cluster diagram] The LCG cluster of UIBK-HEPHY, a Tier-2 site under evaluation: grid01 is the grid gate, grid02-04 and grid06 are worker nodes (WN), one node serves as user interface (UI), grid05 is the storage element (SE) with 0.6 TB, and an LCFG server handles cluster installation. The Tier-1 centre GridKa Karlsruhe, as primary site, supports HEPHY Innsbruck in installing and testing the LCG cluster.

  5. Current grid sizes:
  • Grid 2003: 2736 CPUs, http://www.ivdgl.org/grid2003/
  • Nordugrid: 1499 CPUs, http://www.nordugrid.org
  • LCG core sites (CNAF, CERN, FNAL, FZK, NIKHEF, PIC Barcelona, RAL, Taipei): 1844 CPUs (~1300 kSI2k); planned for Q2: 3009 kSI2k, http://goc.grid-support.ac.uk

  6. Access for grid users to students' PC labs in Innsbruck during weekends, holidays, and whenever the labs are not needed for teaching. A cooperation between:
  • the High Energy Physics group (part of the pilot grid project 1)): installation, providing scripts, testing, documentation for users
  • the Central Information Technology Service (ZID)
  • the Institute of Computer Science
  1) Innsbrucker Hochenergie-Grid-Projekt (Federal Ministry for Education, Science and Culture)

  7. ZID-Grid (details). Strategy: be as flexible as possible.
  • One cluster of adequate size for each group (virtual organization); cluster size can be changed easily (default: one lab = one cluster)
  • The resource management system can be changed easily (OpenPBS, Condor or Sun Grid Engine); one configuration file per cluster (other services can be added)
  • One timetable per cluster defines the grid opening hours
  • One wake-on-LAN server per location starts the machines of a cluster if the timetable permits
  • Status information of all clusters (Ganglia and Globus MDS) is collected by a server (agrid)
  • Users: High Energy Physics, DPS (Distributed and Parallel Systems, Computer Science), the Institute of Structural Analysis (starting next week), ... and members of the AustrianGrid project
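The timetable-driven wake-on-LAN step above can be sketched as follows. The magic-packet layout (six 0xFF bytes followed by the target MAC address repeated sixteen times) is the standard wake-on-LAN format; the MAC address, port, and timetable contents are illustrative assumptions, not values from the slides.

```python
# Sketch of a wake-on-LAN server consulting a per-cluster timetable:
# if the current hour is marked as grid time, a magic packet is
# broadcast to wake a lab machine.
import socket

def magic_packet(mac):
    """Build the 102-byte wake-on-LAN magic packet for a MAC address."""
    raw = bytes.fromhex(mac.replace(":", ""))
    return b"\xff" * 6 + raw * 16

def grid_hours(timetable, day, hour):
    """timetable: {day: set of hours during which the lab is free}."""
    return hour in timetable.get(day, set())

def wake(mac, broadcast="255.255.255.255", port=9):
    """Broadcast the magic packet on the local network (UDP port 9)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

# Hypothetical timetable: weekends fully open, weekday evenings only.
timetable = {"Sat": set(range(24)), "Sun": set(range(24)),
             "Mon": set(range(20, 24))}
if grid_hours(timetable, "Sat", 10):
    pkt = magic_packet("00:11:22:33:44:55")  # hypothetical lab-PC MAC
    # wake("00:11:22:33:44:55")  # would broadcast pkt on the LAN
```

The actual broadcast is commented out here; in a deployment the server would run this check periodically and wake all machines of a cluster whose timetable window has opened.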

  8. ZID-Grid, http://agrid.uibk.ac.at/ [Architecture diagram] The agrid server handles administration, collects status information, and accepts job submissions, e.g. from the DPS group (Distributed and Parallel Systems). The grid machines in the students' labs at the Technik, Innrain and Sowi locations are organized as clusters with a master (e.g. arch_14) and slave nodes; a wake-on-LAN server per location starts the machines, and jobs and status information flow between agrid and the cluster masters.

  9. AustrianGrid, http://www.gup.uni-linz.ac.at/austriangrid/
  • The AustrianGrid project starts April 1st 2004; initiated by High Energy Physics; partners in Vienna, Linz and Innsbruck
  • 1st technical meeting, December 4th 2003: several resources already available via grid:
  • Linz: SCI cluster (9 nodes/18 CPUs), Myrinet cluster (8 nodes/16 CPUs), SGI Altix 3000 (128 CPUs), SGI Origin 3800 (128 CPUs)
  • Vienna: cluster Bridge (16x4 CPUs), cluster Gescher (16 CPUs), other clusters
  • Innsbruck: ZIDGrid cluster (~192 CPUs), 14 clusters
  • AustrianGrid certification authority in Vienna (L. Lifka, VCPC); it needs to be accepted by partners outside Austria (CPS, Certificate Practice Statement, prepared in Innsbruck and reviewed by the partners)
  • Upcoming activities: an information system collecting the status of all resources; adding grid gates to more resources
