
UKQCD Achievements and Future Priorities


Presentation Transcript


  1. UKQCD Achievements and Future Priorities
     • QCDgrid: who and what
     • Achievements
     • QCDgrid middleware
     • Future priorities
     • Demo of meta-data catalogue browser
     Alan Irving, University of Liverpool, UKQCD

  2. UKQCD and the Grid: QCDgrid architecture
     • Phase 1: data grid
     • Phase 2: distributed processing
     • Roll out to all UKQCD
     • ILDG: International Lattice Data Grid (ILDG meeting in Edinburgh: Dec 19/20)

  3. QCDgrid people: the main players…
     • James Perry (EPCC, jamesp@epcc.ed.ac.uk): Globus, EDG and QCDgrid middleware, browser (OGSA)
     • Chris Maynard (Edinburgh, cmaynard@ph.ed.ac.uk): XML schema, QCDgrid administrator
     • Craig McNeile (Liverpool, mcneile@amtp.liv.ac.uk): UKQCD Software Manager, QCDgrid user interface

  4. QCDgrid Achievements
     • We have a working data grid (J Perry, EPCC)
     • We have used it for routine work (C McNeile, Liverpool)
     • We can build our own 0.6 Tbyte RAID disk arrays for < £2000 (S Downing, Liverpool)
     • We have explicitly tested RAID systems by reformatting one drive (C Allton, Swansea); a sketch of such a test follows this list
     • We have a draft XML schema for lattice QCD data (C Maynard, Edinburgh)
     • We have started an International Lattice Data Grid project (R Kenway, Edinburgh)
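
For concreteness, here is a minimal sketch of the kind of drive-failure test described above, written for Linux software RAID with mdadm. The device names are placeholders, and mdadm itself is an assumption: the original test on the UKQCD arrays may well have used different tooling.

    #!/bin/sh
    # Sketch: exercise a software RAID array by failing and re-adding one drive.
    # /dev/md0 and /dev/sdb1 are placeholder device names.
    MD=/dev/md0
    DISK=/dev/sdb1

    mdadm --detail "$MD"                   # record the healthy state
    mdadm --manage "$MD" --fail "$DISK"    # mark one member drive as failed
    mdadm --manage "$MD" --remove "$DISK"  # remove it from the array
    # the array should now serve data in degraded mode; verify reads still work
    mdadm --manage "$MD" --add "$DISK"     # re-add the (reformatted) drive
    cat /proc/mdstat                       # watch the rebuild progress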

  5. QCDgrid middleware (Globus 2.0, Linux 7.x)
     • Grid administration: control-thread.sh, add-qcdgrid-node, disable-qcdgrid-node, enable-qcdgrid-node, remove-qcdgrid-node, retire-qcdgrid-node, unretire-qcdgrid-node, get-disk-space, qcdgrid-checksum, qcdgrid-filetime, rebuild-qcdgrid-rc, create-qcdgrid-rc, delete-qcdgrid-rc, verify-qcdgrid-rc
     • Configuration: qcdgrid.conf, nodes.conf, nodeprefs.conf
     • User commands: qcdgrid-list, put-file-on-qcdgrid, get-file-from-qcdgrid, i-like-this-file, qcdgrid-delete (see the usage sketch after this list)
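
To make the user-level commands concrete, a hypothetical session is sketched below. The argument conventions (local file first for put, logical grid names mirroring the directory-style names seen in slides 12 and 13) are assumptions, not documented behaviour.

    # store a gauge configuration on the grid (argument order assumed)
    put-file-on-qcdgrid D52C202K3500U014060.tar \
        NF2/BETA52/CLOVER202/V16X32/KAPPA3500/GAUGE/D52C202K3500U014060.tar

    # list the files currently catalogued
    qcdgrid-list

    # fetch a configuration back from the grid
    get-file-from-qcdgrid NF2/BETA52/CLOVER202/V16X32/KAPPA3500/GAUGE/D52C202K3500U014060.tar

    # hint to the replication system that this file deserves an extra copy
    i-like-this-file NF2/BETA52/CLOVER202/V16X32/KAPPA3500/GAUGE/D52C202K3500U014060.tar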

  6. QCDgrid: the future

  7. QCDOC and the grid
     • UKQCD is a distributed physics organisation: distributed computing → Grid, with local disk farms and clusters; QCDOC is the source of configurations
     • Primary data from QCDOC: QCDOC is not directly on the grid; output is binary + XML; the front end feeds data to QCDgrid (a sketch of this feed step follows this list)
     • Data storage on RAID: RAID arrays act as QCDgrid nodes
     • Analyses on clusters: Phase I, analysis on specified nodes pulling/pushing data from QCDgrid; Phase II, cluster processing via grid middleware?
     [Diagram: QCDOC, its front end (FE), and a grid node]
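
A minimal sketch of the "front end feeds data to QCDgrid" step, assuming each binary configuration is written alongside a sibling XML metadata file and that put-file-on-qcdgrid takes a local path followed by a grid name (both assumptions):

    #!/bin/sh
    # Sketch: push each QCDOC output configuration plus its XML metadata
    # onto the data grid. OUTDIR and GRIDDIR are placeholder names.
    OUTDIR=/qcdoc/output
    GRIDDIR=NF2/BETA52/CLOVER202/V16X32/KAPPA3500/GAUGE

    for cfg in "$OUTDIR"/*.tar; do
        name=$(basename "$cfg")
        put-file-on-qcdgrid "$cfg" "$GRIDDIR/$name"
        # metadata travels with the data, as a sibling XML document
        put-file-on-qcdgrid "${cfg%.tar}.xml" "$GRIDDIR/${name%.tar}.xml"
    done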

  8. QCDgrid: future plans
     Short term
     • Decide on database: native XML (Xindice) or relational + XML interface tools (a query sketch follows this list)
     • Complete meta-data catalogue browser
     • Expand the number of sites in the Grid to 4 UKQCD sites + RAL
     • Decide on data-binding or other strategy for QCDOC I/O
     • Construct more RAID arrays
     Longer term
     • Agree ILDG standard XML schema
     • Agree ILDG middleware functionality at web-services level
     • Monitor EDG middleware (M/W)
     • Investigate EDG M/W for remote job processing
     • Test interoperability with other ILDG sites
     • Develop M/W to implement any new ILDG strategy
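
If the native-XML route is taken, catalogue searches become XPath queries over stored metadata documents. The sketch below uses the Xindice command-line tool from memory of its 1.x syntax; the exact flags, the /db/ukqcd collection name, and the metadata element names are all assumptions.

    # query all NF=2, beta=5.26 ensembles in a hypothetical /db/ukqcd collection
    # (Xindice 1.x style invocation; treat the exact flags as an assumption)
    xindice xpath -c /db/ukqcd -q "//ensemble[nf='2' and beta='5.26']"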

  9. QCDgrid and HPC resources requested
     Hardware
     • ‘Tier 1’: QCDOC + FE [JIF]
     • ‘Tier 2’ [PPARC, ‘son-of-SRIF’]: 6 × 100-node PC clusters and 6 × 20 Tbyte disk farms (could be integrated with LHC Tier 2 facilities?)
     Staff
     • Grid specific [PPARC/GridPP, eScience]: 2 FTE × 3 years for QCDgrid and ILDG development; 1 FTE × 3 years for maintenance and dissemination (shared with other theory groups)
     • Core software development [PPARC/HPC]: 4 physicist programmers across all UKQCD
     • Subdetector specific: N.A.

  10. QCDgrid logical name browser

  11. QCDgrid metadata catalogue browser

  12. Output from Metadata Catalogue search
      trumpton.ph.ed.ac.uk:aci|aci> Query returned 427 results
      Grid filename: NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/D526C195K3450U017500.tar
      Grid filename: NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/D526C195K3450U017600.tar
      Grid filename: NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/D526C195K3450U017700.tar
      Grid filename: NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/D526C195K3450U017800.tar
      Grid filename: NF2/BETA526/CLOVER195...
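
The logical names returned above encode the physics parameters (NF2, BETA526, CLOVER195, lattice volume, kappa) in the path, so a search result can be post-processed with standard text tools. A small sketch, assuming the output shown has been saved to query.out (a hypothetical file name):

    # extract just the grid filenames from the catalogue output
    grep '^Grid filename:' query.out | awk '{print $3}' > filelist.txt

    # pull out the trailing configuration number,
    # e.g. D526C195K3450U017500.tar -> 017500
    sed 's/.*U\([0-9]*\)\.tar/\1/' filelist.txt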

  13. QCDgrid command line use (get-file-from-qcdgrid)
      ulgbcm.liv.ac.uk:aci|gridwork> getmoreconfigs
      creating an initial todo list
      7 configs in initial list
      todo list shows 7 configs still to do
      but 5 have .tar already done, so delete any duplicates
      updating todolist
      now 2 configs still to be done
      getting NF2/BETA52/CLOVER202/V16X32/KAPPA3500/GAUGE/D52C202K3500U014060.tar from the grid
      getting NF2/BETA52/CLOVER202/V16X32/KAPPA3500/GAUGE/D52C202K3500U014070.tar from the grid
      now have:-
      -rw-r--r-- 1 aci aci 42139648 Dec 11 21:41 /users/aci/configs/D52C202K3500U014010.tar
      -rw-r--r-- 1 aci aci 42139648 Jan 27 21:07 /users/aci/configs/D52C202K3500U014020.tar
      -rw-r--r-- 1 aci aci 42139648 Jan 27 21:52 /users/aci/configs/D52C202K3500U014030.tar
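
getmoreconfigs appears to be a local wrapper script rather than part of the QCDgrid middleware itself. A minimal reconstruction of what such a wrapper might look like is sketched below; the todo-list format, the local paths, and get-file-from-qcdgrid's argument convention are all assumptions.

    #!/bin/sh
    # Sketch: fetch any catalogued configurations not yet on local disk.
    LOCAL=/users/aci/configs    # placeholder local configuration store
    TODO=todolist

    echo "creating an initial todo list"
    grep '^Grid filename:' query.out | awk '{print $3}' > "$TODO"
    echo "$(wc -l < "$TODO") configs in initial list"

    # delete any duplicates: drop entries whose .tar already exists locally
    : > todolist.new
    while read -r gridfile; do
        [ -f "$LOCAL/$(basename "$gridfile")" ] || echo "$gridfile" >> todolist.new
    done < "$TODO"
    mv todolist.new "$TODO"
    echo "$(wc -l < "$TODO") configs still to be done"

    # fetch what is left from the grid
    while read -r gridfile; do
        echo "getting $gridfile from the grid"
        get-file-from-qcdgrid "$gridfile" "$LOCAL/$(basename "$gridfile")"
    done < "$TODO"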
