
Future UK e-Science Grid Middleware


Presentation Transcript


  1. Future UK e-Science Grid Middleware Dr Steven Newhouse London e-Science Centre Department of Computing, Imperial College London

  2. Contents • Grid Middleware • UK e-Science Core Programme I & II • LeSC Activities

  3. Status of the Grid • Today: ‘early adoption’ phase, just as the Web was in its early days • Tomorrow: sophisticated combinations of services to locate information, applications to process it, and computer systems to run them • Requirement: an infrastructure to support: • e-Science • Virtual Organisations • e-Commerce • e-Utilities

  4. Exposing Resources as Services • Compute, storage, and software resources are exposed through Service Level Agreements (SLAs) • An SLA defines: • What? • Who? • When? • [Diagram: resources exposed to a community through permissible SLAs]
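
The "What? Who? When?" triple on this slide can be pictured as a small data type. The sketch below is a minimal, hypothetical Java rendering; the class and field names are assumptions for illustration, not part of the ICENI or Globus codebases.

```java
import java.time.Instant;
import java.util.Set;

// Hypothetical sketch of the "What? Who? When?" fields an SLA might carry
// when a compute, storage, or software resource is exposed as a service.
public final class ServiceLevelAgreement {
    private final String resourceId;           // What: the resource being exposed
    private final Set<String> authorisedUsers; // Who: identities permitted to use it
    private final Instant validFrom;           // When: start of the agreed window
    private final Instant validUntil;          // When: end of the agreed window

    public ServiceLevelAgreement(String resourceId, Set<String> authorisedUsers,
                                 Instant validFrom, Instant validUntil) {
        this.resourceId = resourceId;
        this.authorisedUsers = authorisedUsers;
        this.validFrom = validFrom;
        this.validUntil = validUntil;
    }

    // Is the given user permitted to use the resource at the given time?
    public boolean permits(String userId, Instant when) {
        return authorisedUsers.contains(userId)
                && !when.isBefore(validFrom)
                && !when.isAfter(validUntil);
    }
}
```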

  5. Open Grid Services Architecture • OGSA addresses architectural issues related to broadly interoperable Grid Services • OGSI, based on the Grid Service Specification (GSS), provides mandatory features such as service invocation, lifetime management, a service data interface, and security interfaces • GT3 is an implementation of OGSI plus equivalents of the GT2 services • [Diagram: OGSI alongside GT3, GT2, Jini and Jxta]
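
For illustration, the mandatory Grid Service features listed above can be paraphrased as a Java interface. This is only a sketch of the idea under assumed names; it is not the actual OGSI or GT3 API.

```java
import java.util.List;

// Illustrative paraphrase of the features OGSI makes mandatory on a Grid
// Service: invocation, service data access, and lifetime management.
// Method names and signatures are assumptions, not the real OGSI interfaces.
public interface GridService {
    // Service invocation: a generic operation entry point.
    Object invoke(String operation, Object... args) throws Exception;

    // Service data interface: typed, queryable state exposed by the service.
    List<Object> findServiceData(String queryExpression);

    // Lifetime management: request that the service live until the given time
    // (milliseconds since the epoch); the hosting environment may grant less.
    long requestTerminationAfter(long terminationTimeMillis);

    // Explicit destruction of the service instance.
    void destroy();
}
```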

  6. UK e-Science Core Programme • [Map: UK e-Science Centres at Edinburgh, Glasgow, Newcastle, Manchester, Belfast, DL, Cambridge, Oxford, Hinxton, RAL, Cardiff, London, Southampton] • From Tony Hey’s slides at the EPSRC Pilot Project meeting (end of Jan 03) • CP I: e-Science Centres & open calls • CP II: Plans, not promises, as dependent on funding • SR2002: £16M • DTI: £??M

  7. Core Programme 2: Overall Rationale • Assist development of essential, well-engineered, generic, Grid middleware usable by both e-scientists and industry • Provide necessary infrastructure support for UK e-Science Research Council projects • Collaborate with the international e-Science and Grid communities • Work with UK industry to develop industrial-strength Grid middleware

  8. Key Activities • UK e-Science Grid/Centres and e-Science Institute • Grid Support Centre and Network Monitoring • Core Middleware engineering • National Data Curation Centre • e-Science Exemplars/New Opportunities • Outreach and International involvement

  9. The e-Science Grid/Centres and the e-Science Institute • Continuation of the e-Science Centres & Institute • 2-year extension for infrastructure after review • Further development of the UK e-Science Grid • Collaborative Industrial Projects • Call for DTI collaborative industrial projects targeted at key middleware areas • Grid Support Centre & Network Monitoring

  10. Core Grid Middleware Activity • Need to develop an open-source, open-standards-compliant Grid middleware stack that will integrate and federate with industrial solutions • Software engineering focus as well as R&D • Aim is to produce robust, well-documented, re-usable software that is maintainable and can evolve to embrace emerging Grid Service standards • Link UK activities with Europe & US • Reduce duplication of effort • Standards development & compliance testing

  11. Possible Core Grid Middleware Themes • Heterogeneous Database Integration • Security • Legal and regulatory • Accounting Systems for VOs • Collaborative Decision-making Networks • Enterprise Computing Systems • Outsourcing/e-Utilities • Real-time High End Computing • Enterprise Application Integration • Business Processes for VOs Links to EPSRC CS Research? • Autonomic Computing • Semantic Grid • Rapid Customised Assembly of Services • Trusted Ubiquitous Systems • ….

  12. ICENI • Imperial College e-Science Networked Infrastructure • Developed by the LeSC Grid Middleware Group • Collect and provide relevant Grid meta-data • Use it to define and develop higher-level services • Interaction with other frameworks: Web Services, Jxta, etc. The Iceni, under Queen Boudicca, united the tribes of South-East England in a revolt against the occupying Roman forces in AD 60.

  13. ICENI Architecture • [Diagram: users reach resources through services governed by policy. Public computational communities (resource browser, application portal, design tools) connect via Web Services gateways to a private administrative domain containing compute, storage, software and network resources, together with identity, domain, policy and resource managers, an application mapper and a component broker, built on JavaCoG/Globus. The gateway bridges the private and public regions.]

  14. Service Oriented Architecture • ICENI interfaces for services & discovery • Platform-neutral interfaces • Resource: abstraction for a capability • Policy: how & where the resource is exposed • Service: route for user interaction with a resource • [Diagram: ICENI services sit on an integration & interoperability layer (Jini, Jxta, OGSA) and are exposed to users & clients through an OGSA portal, subject to policy]
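
The Resource / Policy / Service split on this slide could be sketched as three platform-neutral Java interfaces. The names and methods below are illustrative assumptions rather than the real ICENI interfaces.

```java
// Hypothetical sketch of the role split described on this slide: a Resource
// abstracts a capability, a Policy decides how and where it is exposed, and a
// Service is the route through which a user interacts with the resource.
public final class IceniRoles {

    // Abstraction for a capability (e.g. "compute", "storage", "software").
    public interface Resource {
        String capability();
    }

    // Decides whether a resource may be exposed to a given community
    // (public computational community, private administrative domain, ...).
    public interface Policy {
        boolean mayExpose(Resource resource, String community);
    }

    // Route for user interaction with a resource.
    public interface Service {
        Resource resource();
        Object invoke(String operation, Object... args) throws Exception;
    }
}
```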

  15. User Interaction • API to discover & interact with services • Exploit the NetBeans Application Framework
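
A client-side view of "discover & interact with services" might look like the hypothetical sketch below. ServiceRegistry, ServiceDescription and the query string are assumed names, not the actual ICENI or NetBeans APIs.

```java
import java.util.List;

// Hypothetical client-side sketch of service discovery and interaction.
public class DiscoveryExample {

    // Assumed description of a discovered service.
    public interface ServiceDescription {
        String name();
        Object invoke(String operation, Object... args) throws Exception;
    }

    // Assumed registry through which services are discovered.
    public interface ServiceRegistry {
        List<ServiceDescription> find(String query);
    }

    public static void run(ServiceRegistry registry) throws Exception {
        // Discover compute services advertised to this user's community...
        for (ServiceDescription service : registry.find("capability = 'compute'")) {
            // ...and interact with one of them.
            System.out.println("Found: " + service.name());
            service.invoke("submitJob", "my-application", 16);
        }
    }
}
```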

  16. Grid Economic Services Architecture (GGF WG) • [Diagram: a Grid user/actor interacts with an OGSA Chargeable Grid Service through its service data and service interfaces and a Grid Economic Service Interface. An OGSA Resource Usage Service records resource usage; contract negotiation, contract verification, charging and economic service data connect to an OGSA Grid Banking Service.]
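
The flow implied by the GESA diagram, where usage is recorded, priced, and then settled, could be sketched roughly as follows. The class names and the flat-rate pricing helper are illustrative assumptions, not part of the GGF specification.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.time.Duration;

// Rough sketch of the GESA flow: a chargeable Grid service reports usage,
// a charging service turns usage into a cost, and a banking service settles it.
public class GesaSketch {

    // Record of what a chargeable service consumed on the user's behalf.
    public record UsageRecord(String user, String service, Duration wallClock, int processors) {}

    public interface ChargingService {
        BigDecimal price(UsageRecord usage);        // turn usage into a cost
    }

    public interface BankingService {
        void debit(String user, BigDecimal amount); // settle the cost
    }

    // Example pricing rule: a flat rate per processor-hour.
    public static BigDecimal flatRate(UsageRecord u, BigDecimal perProcessorHour) {
        BigDecimal hours = BigDecimal.valueOf(u.wallClock().toMinutes())
                .divide(BigDecimal.valueOf(60), 2, RoundingMode.HALF_UP);
        return perProcessorHour.multiply(hours).multiply(BigDecimal.valueOf(u.processors()));
    }
}
```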

  17. How it might work… • [Sequence between a user and container factories:] • Request a price for users (…) to run jobs over 16 processors for the next 2 hours using auctioning; 30 s lifetime to complete the auction • Request use of a software library for the next 2 hours; accept a flat-rate fee of £2/hour
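
As a rough illustration of this exchange, the hypothetical Java sketch below models the two price requests. All type names, and the example user identities, are assumptions made for the sketch rather than part of GESA or ICENI.

```java
import java.math.BigDecimal;
import java.time.Duration;

// Hypothetical model of the price-request interaction on this slide.
public class PricingExample {

    public record PriceRequest(String[] users, int processors, Duration runtime,
                               String pricingMechanism, Duration offerLifetime) {}

    public record PriceOffer(BigDecimal totalPrice, String terms) {}

    public interface ContainerFactory {
        PriceOffer requestPrice(PriceRequest request);
    }

    public static void negotiate(ContainerFactory computeFactory, ContainerFactory libraryFactory) {
        // Auctioned price for compute: 16 processors for 2 hours, 30 s to settle
        // the auction ("alice" and "bob" are hypothetical user identities).
        PriceOffer compute = computeFactory.requestPrice(new PriceRequest(
                new String[] {"alice", "bob"}, 16, Duration.ofHours(2),
                "auction", Duration.ofSeconds(30)));

        // Flat-rate offer for the software library: £2/hour for 2 hours.
        PriceOffer library = libraryFactory.requestPrice(new PriceRequest(
                new String[] {"alice"}, 0, Duration.ofHours(2),
                "flat-rate", Duration.ofSeconds(30)));

        System.out.println("Compute offer: " + compute.totalPrice());
        System.out.println("Library offer: " + library.terms());
    }
}
```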

  18. What’s next? • Ontologies • For Scientific Software • For ICENI Services • For resources • Fuzzy Service Matching • Malleable & Ductile Scientific Components • Hard deadline scheduling for network & calculation • Computational Markets

  19. Acknowledgements • Director: Professor John Darlington • Technical Director: Dr Steven Newhouse • Research Staff: • Anthony Mayer, Nathalie Furmento • Stephen McGough, James Stanton • Yong Xie, William Lee • Marko Krznaric, Murtaza Gulamali • Asif Saleem, Laurie Young, Gary Kong • Support Staff: • Keith Sephton, Oliver Jevons, Sue Brookes • Contact: • http://www.lesc.ic.ac.uk/ • e-mail: lesc@ic.ac.uk
