

  1. The UK e-Science Program
  Bryan Lawrence, Head, British Atmospheric Data Centre, Rutherford Appleton Laboratory
  b.n.lawrence@rl.ac.uk
  (with thanks to Tony Hey, Director of the UK e-Science Program, who provided most of the slides: Tony.Hey@epsrc.ac.uk)
  CEOS Meeting, Frascati, May 2002

  2. Outline
  • The Grid, UK e-Science and the context
  • The UK e-Science initiative
  • The Core Programme
  • Support for e-Science projects and international involvement
  • The Grid Network Team
  • Grid Middleware R&D
  • Projects: e-Healthcare (MIAS), MyGrid, ClimatePrediction.com, the NERC DataGrid
  • Concluding statements

  3. e-Science and the Grid
  'e-Science is about global collaboration in key areas of science, and the next generation of infrastructure that will enable it.'
  'e-Science will change the dynamic of the way science is undertaken.'
  John Taylor, Director General of Research Councils, Office of Science and Technology

  4. UK e-Science Initiative
  • £120M programme over 3 years
  • £75M for Grid applications in all areas of science and engineering
  • £10M for a supercomputer upgrade
  • £35M 'Core Programme' to encourage development of generic 'industrial strength' Grid middleware
  • Requires £20M of additional 'matching' funds from industry

  5. Excerpt from the e-Science Director's job objectives
  'Develop effective collaborative Core Programme projects between the science base, industry and national funding agencies, and ensure the application and outcomes from the projects.'

  6. UK e-Science Grid (1)
  [Map of the UK e-Science Grid: National Centre at Edinburgh, plus regional centres at Glasgow, Newcastle, Belfast, Manchester, Daresbury Laboratory (DL), Oxford, Hinxton, RAL, Cardiff, London and Southampton]

  7. UK e-Science Grid (2)
  • All e-Science Centres donating resources to form a UK 'national' Grid
  - supercomputers, clusters, storage, facilities
  • All Centres will run the same Grid software
  - starting point is Globus, Storage Resource Broker and Condor (see the sketch below)
  • Work with the Global Grid Forum and major computing companies (IBM, Oracle, Microsoft, Sun, ...)
  • Aim to 'industry harden' Grid software so it can realise a secure virtual organisation (VO) vision
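To make the starting-point stack concrete, here is a minimal sketch of driving a Grid job submission from Python, assuming a Globus Toolkit 2 installation with a valid proxy credential; globus-job-run is the standard GT2 command-line submission tool, but the gatekeeper hostname below is invented for illustration.

```python
import subprocess

# Hypothetical gatekeeper host, for illustration only; any Globus
# Toolkit 2 gatekeeper contact string would do here.
GATEKEEPER = "grid-gate.example.ac.uk"

def run_remote(executable, *args):
    """Run a simple job on a remote resource via globus-job-run.

    globus-job-run ships with Globus Toolkit 2 (the starting point named
    on this slide); it submits the executable to the remote gatekeeper
    and returns its output.
    """
    cmd = ["globus-job-run", GATEKEEPER, executable, *args]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    # Ask the remote resource to identify itself.
    print(run_remote("/bin/hostname"))
```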

  8. Support for e-Science Projects
  • 'Grid Starter Kit', continually updated
  - maintains a library of Open Source Grid middleware
  - http://www.gridsupport.ac.uk/gridcentre.shtml
  • Grid Support Centre in operation
  - leads the Grid Engineering Group; supports users
  • Training courses: first courses given
  • National e-Science Institute Research Seminar Programme: see http://www.nesc.ac.uk

  9. Support for International Involvement
  • 'GridNet' funding supports participation in the Global Grid Forum (GGF)
  • 'Grid Fellowships' in Geneva and the US
  • Links with major US centres: San Diego Supercomputer Center and NCSA
  • Joint UK-NSF 'N+N' meeting on e-Science, held in San Francisco last year
  • Other international collaborations: China, Singapore, India, ...

  10. Grid Network Team
  • Expert group to identify end-to-end network bottlenecks and other network issues (a crude throughput probe is sketched below)
  - e.g. problems with multicast for the Access Grid
  • Identifies e-Science project network requirements
  • Funding a £0.5M traffic engineering/QoS project with PPARC, UKERNA and Cisco
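As a flavour of the kind of diagnostic such a team runs, here is a minimal end-to-end throughput probe in Python; the test URL is a placeholder, and real measurements would use dedicated test endpoints and tooling rather than an ad hoc HTTP fetch.

```python
import time
import urllib.request

# Placeholder endpoint: in practice one would probe real end-to-end
# paths between e-Science centres using a known-size test file.
TEST_URL = "http://example.org/testfile.bin"

def measure_throughput(url, chunk=64 * 1024):
    """Time a bulk HTTP transfer and return achieved throughput in Mbit/s.

    A deliberately crude probe: a low number on a multi-gigabit backbone
    usually points at a campus link or host-tuning bottleneck rather
    than the core network itself.
    """
    total = 0
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        while True:
            block = resp.read(chunk)
            if not block:
                break
            total += len(block)
    elapsed = time.monotonic() - start
    return (total * 8) / (elapsed * 1e6)

if __name__ == "__main__":
    print(f"{measure_throughput(TEST_URL):.1f} Mbit/s")
```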

  11. SuperJanet4, June 2002
  [Diagram of the SuperJanet4 backbone topology: a 20Gbps/10Gbps core linking WorldCom points of presence at Glasgow, Edinburgh, Leeds, Manchester, Reading, London, Bristol and Portsmouth, with 2.5Gbps, 622Mbps and 155Mbps links out to the regional networks (Scotland via Glasgow and Edinburgh, Northern Ireland, NNW, NorMAN, YHMAN, EMMAN, MidMAN, EastNet, TVN, LMN, Kentish MAN, LeNSE, SWAN & BWEMAN, South Wales MAN) and external links]

  12. Grid Middleware R&D
  • £16M funding available for industrial collaborative projects
  • £11M allocated to Centres' projects plus £5M for 'Open Call' projects
  • Two Task Forces set up:
  - Database Task Force (chaired by Norman Paton, Manchester Centre)
  - Architecture Task Force (chaired by Malcolm Atkinson, Director of NeSC)

  13. Generic Grid Middleware R&D
  • Reports on Globus, SRB/databases and .NET middleware
  - limitations of present Grid middleware
  • Developing a UK 'road map' for the evolution of present Grid middleware
  - short-term improvements (6-12 months)
  - longer-term plans: adaptive, intelligent infrastructure; database interfaces allowing querying and extended transactions (a sketch of one such interface follows)
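To illustrate what 'database interfaces allowing querying' could mean in practice, here is a hedged sketch of a query service wrapping a conventional DBMS; the class and method names are invented (this is not a real middleware API), and sqlite3 merely stands in for a production database engine.

```python
import sqlite3

class GridDatabaseService:
    """Hypothetical sketch of the kind of database interface the road
    map points at: a thin service wrapping an ordinary DBMS so Grid
    clients can query it without caring which engine sits behind it.
    """

    def __init__(self, path):
        # sqlite3 stands in for any backend DBMS (Oracle, DB2, ...).
        self.conn = sqlite3.connect(path)

    def query(self, sql, params=()):
        """Run a read-only query and return rows as dictionaries.

        Extended transactions spanning several such services would need
        a coordinator (e.g. two-phase commit) on top of this interface.
        """
        cur = self.conn.execute(sql, params)
        cols = [d[0] for d in cur.description]
        return [dict(zip(cols, row)) for row in cur.fetchall()]

# Example use against an in-memory stand-in database:
if __name__ == "__main__":
    svc = GridDatabaseService(":memory:")
    svc.conn.execute("CREATE TABLE runs (id INTEGER, site TEXT)")
    svc.conn.execute("INSERT INTO runs VALUES (1, 'RAL')")
    print(svc.query("SELECT * FROM runs WHERE site = ?", ("RAL",)))
```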

  14. UK e-Science Projects
  • £75M for e-Science application 'pilots', spanning all sciences and engineering
  • Particle Physics and Astronomy (PPARC): £20M GridPP and £6M AstroGrid
  • Engineering and Physical Sciences (EPSRC): funding 6 projects at around £3M each
  • Biology, Medical and Environmental Science: £20M fund supporting a number of projects

  15. Core Funded Projects
  Many projects: some core-funded, some jointly funded with Research Council (RC) funding

  16. e-Healthcare Grand Challenge
  • Interdisciplinary Research Centre (IRC), MIAS: 'From Medical Images and Signals to Clinical Information'
  • £2M funding for joint IRC projects with MIAS on e-Healthcare applications
  • Example: breast cancer surgery
  - normalisation of mammography and ultrasound scans (a toy normalisation step is sketched below)
  - finite-element (FE) modelling of breast tissue
  - delivering useful clinical information to the surgeon while ensuring privacy and security
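As a toy illustration of the normalisation step (not the MIAS method itself, which is far more sophisticated and physics-based), the sketch below z-scores a scan array so that images from different scanners share a common intensity scale before registration or modelling.

```python
import numpy as np

def normalise(image):
    """Z-score an image so scans from different machines are comparable.

    A toy stand-in for the 'normalisation' step on this slide: the goal
    is simply to put images on a common intensity scale.
    """
    img = np.asarray(image, dtype=np.float64)
    return (img - img.mean()) / img.std()

if __name__ == "__main__":
    # Fake 4x4 'scan' with an arbitrary intensity offset and gain.
    scan = np.array([[10, 12, 11, 13],
                     [12, 15, 14, 12],
                     [11, 14, 16, 13],
                     [10, 12, 13, 11]], dtype=float)
    print(normalise(scan).round(2))
```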

  17. Particle Physics and Astronomy (PPARC)
  • GridPP
  - links to the EU DataGrid, the CERN LHC Computing Project, the US GriPhyN and PPDataGrid projects, and the iVDGL global Grid project
  • AstroGrid
  - links to the EU AVO and US NVO projects

  18. EPSRC e-Science Projects
  • 6 projects: Comb-e-Chem, DAME, RealityGrid, MyGrid, GEODISE, Discovery Net
  • Example: MyGrid: Personalised Extensible Environments for Data Intensive in silico Experiments in Biology
  - Manchester, EBI, Southampton, Nottingham, Newcastle, Sheffield, GSK, AstraZeneca, IBM, Sun

  19. MyGrid e-Science Workbench
  • Goal is to develop a 'workbench' to support:
  - the experimental process of data accumulation
  - use of community information
  - scientific collaboration
  • Provides facilities for resource selection, data management and process enactment (a toy enactment loop is sketched below)
  • Bioinformatics applications: functional genomics, pattern database annotation
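To ground 'process enactment', here is a minimal, hypothetical sketch of a workflow runner that chains toy bioinformatics steps and records provenance; none of these function names come from the real MyGrid code, whose workbench is far richer than this.

```python
# A workflow is an ordered list of named steps; enacting it runs each
# step while keeping a provenance trail so the in silico experiment can
# be repeated and annotated later. The steps are invented stand-ins.

def fetch_sequence(accession):
    # Stand-in for a sequence-database lookup by accession number.
    return "ATGGCGTACGCTTGA"

def gc_content(seq):
    # Fraction of bases that are G or C.
    return (seq.count("G") + seq.count("C")) / len(seq)

def enact(workflow, data):
    """Run each named step in order, threading results through and
    recording what each step produced."""
    provenance = []
    for name, step in workflow:
        data = step(data)
        provenance.append((name, repr(data)))
    return data, provenance

if __name__ == "__main__":
    workflow = [("fetch", fetch_sequence), ("gc", gc_content)]
    result, trail = enact(workflow, "AB123456")
    print(result)   # 0.533...
    print(trail)
```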

  20. ClimatePrediction.com
  • NERC thematic, e-Science and national Core Programme demonstrator funding!
  • Partnership between [partner organisations shown as logos, not captured in the transcript]

  21. ClimatePrediction.com
  • Estimating climate uncertainty
  - current estimates are based on a handful of models
  - need to consider predictions based on thousands of ensemble members
  - harness the power of tens of thousands of PCs by providing downloadable model experiments (the toy calculation below shows why ensemble size matters)
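The toy calculation below (invented numbers, numpy) illustrates the statistical point: with only a handful of members the spread of predicted warming is sampled very poorly, while volunteer-PC-scale ensembles pin it down.

```python
import numpy as np

# Toy illustration of why ensemble size matters when estimating climate
# uncertainty: the spread of a predicted quantity across perturbed
# ensemble members *is* the uncertainty estimate, and a handful of
# members samples that spread badly. All numbers here are invented.
rng = np.random.default_rng(42)

def summarise(n_members):
    # Pretend each member predicts a global-mean warming (K); in reality
    # each value would come from one full climate-model run on a PC.
    warming = rng.normal(loc=3.0, scale=1.2, size=n_members)
    return warming.mean(), warming.std(ddof=1)

for n in (7, 1_000, 100_000):  # a handful of models vs volunteer-PC scale
    mean, spread = summarise(n)
    print(f"{n:>7} members: mean {mean:5.2f} K, spread {spread:4.2f} K")
```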

  22. Managing a petabyte-scale, massively-distributed data archive
  [Architecture diagram: 1Pb of total model output lives on the PCs of around 1M participants, with 100Tb of key output replicated at 10-20 sites via the ESG-II/NERC DataGrid (GridFTP) and conventional FTP/HTTP. Scientific investigators reach the data through DODS URLs served by a Live Access Server, with peer-to-peer visualisation and datamining; participants and policy-makers fetch summary statistics over HTTP]
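A hedged sketch of the lightweight end of this diagram: fetching a summary-statistics product through a DODS-style URL over plain HTTP. The hostname and query string below are invented; only the access pattern is taken from the slide.

```python
import urllib.request

# Invented DODS-style URL: participants and policy-makers pull small
# ASCII summary products over HTTP, while bulk data moves by GridFTP.
SUMMARY_URL = ("http://las.example.ac.uk/dods/"
               "ensemble_summary.ascii?global_mean_temp")

def fetch_summary(url):
    """Fetch a summary-statistics product: kilobytes over HTTP,
    never the petabyte archive itself."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("ascii", errors="replace")

if __name__ == "__main__":
    print(fetch_summary(SUMMARY_URL))
```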

  23. The NERC DataGrid
  • Proposal to the Natural Environment Research Council
  • Collaboration between two professional data centres, the CLRC e-Science centre, and PCMDI (+ESG)
  • Aim to improve the ability to locate and use both observational and simulation data: build software clients
  • Eventual expansion to include all NERC disciplines including Earth Observation (NEODC an early adopter)

  24. NDG expected evolution
  [Architecture diagram, based on ESG and evolving to web services: XML catalogue and client server(s) fed by a Catalogue Ingestor from the NERC DDC data repositories; catalogue clients, local catalogues, a Python API, computation and graphics at user institutions; other nodes, e.g. PML/ESSC; components numbered 1-6 in the original figure]
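The slide names a Python API sitting between the catalogue clients and the XML catalogue server. The sketch below imagines what a minimal such client could look like; every class, method and endpoint name is invented for illustration, and the real NDG interface may differ entirely, not least once it evolves to web services.

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

class CatalogueClient:
    """Hypothetical sketch of a Python client for an XML catalogue
    server like the one in the diagram. All names are illustrative."""

    def __init__(self, endpoint):
        self.endpoint = endpoint  # e.g. an XML catalogue server URL

    def search(self, **terms):
        """Query the catalogue and return dataset records as dicts."""
        query = urllib.parse.urlencode(terms)
        with urllib.request.urlopen(f"{self.endpoint}/search?{query}") as resp:
            tree = ET.parse(resp)  # server returns an XML result set
        return [{el.tag: el.text for el in record}
                for record in tree.getroot().iter("dataset")]

# Example use (hypothetical endpoint and search terms):
# client = CatalogueClient("http://ndg.example.ac.uk/catalogue")
# for rec in client.search(parameter="temperature", source="ERS-2"):
#     print(rec.get("title"), rec.get("location"))
```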

  25. Concluding Statements
  • A wide variety of UK application projects using clusters, supercomputers, data repositories, remote working tools, etc.
  • Emphasis on support for data federation and annotation as much as computation
  • Metadata and ontologies are key to higher-level Grid services
  • For commercial success the Grid needs an interface to DBMSs
