
MoSGrid AP2: Portals



  1. MoSGrid AP2: Portals. Sandra Gesing, sandra.gesing@uni-tuebingen.de, Simulation Biologischer Systeme, Eberhard-Karls-Universität Tübingen, 21.06.2010

  2. Contents • Evaluation • Architecture • P-GRADE demonstration • Current work • SHIWA • IWSG’10

  3. Evaluation • Portal frameworks • Liferay • Pluto • GateIn (JBoss + eXo) • Workflow-enabled grid portal • P-GRADE

  4. Evaluation • Administrator side • JSR 168/286 portlet standards (see the portlet sketch after this slide) • UNICORE 6 support • Time and effort for installation/implementation • Support • Security • Monitoring • User side (focus of the evaluation) • Usability • Efficiency • Workflow support • Security • Monitoring
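All of the evaluated frameworks (Liferay, Pluto, GateIn) host portlets written against the JSR 168/286 standards, so a portlet built on the standard API stays portable between them. A minimal sketch, assuming nothing beyond the standard javax.portlet API (the class name HelloGridPortlet is hypothetical, not part of MoSGrid):

    import java.io.IOException;
    import java.io.PrintWriter;

    import javax.portlet.GenericPortlet;
    import javax.portlet.PortletException;
    import javax.portlet.RenderRequest;
    import javax.portlet.RenderResponse;

    // Minimal JSR 286 portlet: extend GenericPortlet and render markup in
    // doView(); any JSR 168/286-compliant container can deploy it.
    public class HelloGridPortlet extends GenericPortlet {

        @Override
        protected void doView(RenderRequest request, RenderResponse response)
                throws PortletException, IOException {
            response.setContentType("text/html");
            PrintWriter out = response.getWriter();
            out.println("<p>Hello from a portal portlet.</p>");
        }
    }

Deployment specifics (portlet.xml, packaging) differ per container, which is exactly the kind of administrator-side effort the evaluation criteria above capture.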

  5. Evaluation

  6. Evaluation • P-GRADE • Workflows • Workflow editor • Grid support • Monitoring • 20 person-years of development • 30 developers in the community • Installation is laborious • Switch to Liferay

  7. Grid interoperation by the P-GRADE portal • The P-GRADE Portal enables simultaneous usage of several production Grids at workflow level • Currently connectable grids: • LCG-2 and gLite: EGEE, SEE-GRID, BalticGrid • GT-2: UK NGS, US OSG, US TeraGrid • Campus Grids with PBS or LSF • BOINC desktop Grids • ARC: NorduGrid • In prototype: • Clouds (Eucalyptus, Amazon) • Planned: • UNICORE: D-Grid (joint work with MoSGrid)

  8. P-GRADE portal family timeline (2003–2010; open source since Jan. 2008) • 2003: basic concept • P-GRADE portal 2.4: GEMLCA (Grid Legacy Code Architecture) • P-GRADE portal 2.5: parameter sweep • NGS P-GRADE portal: GEMLCA, repository concept • P-GRADE portal 2.8, then 2.9.1 (current release) • WS-PGRADE Portal: beta release 3.1, then release 3.2

  9. Main features of the P-GRADE portal • Supports • generic, workflow-oriented applications • parameter sweep (PS) applications with the new super-workflow concept • A. Balasko: Flexible PS application management in P-GRADE portal • 3-level parallelism (MPI, WF-branch, PS) • Simultaneous access to a wide variety of resources • Z. Farkas: PBS and ARC integration to P-GRADE portal • P. Kacsuk: P-GRADE and WS-PGRADE portals supporting desktop grids and clouds • Access to a workflow repository • Akos Balasko and Miklos Kozlovszky: SEE-GRID and EGEE portal applications • Development of application-specific portals • Andreas Quandt and Lucia Espona Pernas: Portal for Proteomics • Tamas Kiss, Gabor Terstyanszky, Zsolt Lichtenberger, Christopher Reynolds: Rendering Portal Service for the Blender User Community

  10. WS-PGRADE and gUSE • New product in the P-GRADE portal family: • WS-PGRADE (Web Services Parallel Grid Runtime and Developer Environment) • WS-PGRADE uses the high-level services of the gUSE (Grid User Support Environment) architecture • Integrates and generalizes P-GRADE portal and NGS P-GRADE portal features • Advanced data-flows (PS features; see the parameter-sweep sketch after this slide) • Built-in GEMLCA • Built-in workflow repository • gUSE advanced features • Scalable architecture (written as a set of services, installable on one or more servers) • Can execute a very large number of jobs simultaneously (100,000–1,000,000) • Various grid submission services (GT2, GT4, LCG-2, gLite, BOINC, local) • Built-in inter-grid broker (seamless access to various types of resources and grids) • Comfort features • Different separated user views supported by the gUSE application repository • See details in: • M. Kozlovszky and Peter Kacsuk: WS-PGRADE portal and its usage in the CancerGrid project • WS-PGRADE portal tutorial • Drawback: • Not yet as stable and mature as P-GRADE
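The parameter sweep (PS) concept above means that a single workflow node fans out into many independent instances, one per parameter value. A minimal sketch of that idea in plain Java; this is not the gUSE or WS-PGRADE API, and all names are illustrative:

    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    // Conceptual parameter sweep: one task template, instantiated once per
    // parameter value; the instances run independently of each other.
    public class ParameterSweepSketch {

        public static void main(String[] args) {
            List<Double> temperatures = List.of(280.0, 300.0, 320.0, 340.0);

            ExecutorService pool = Executors.newFixedThreadPool(4);
            for (double t : temperatures) {
                // In WS-PGRADE each instance would become a grid job; a
                // local task stands in for it here.
                pool.submit(() -> {
                    System.out.printf("simulating at T=%.1f K%n", t);
                });
            }
            pool.shutdown();
        }
    }

In the portal the same fan-out happens at workflow level; combined with MPI jobs and parallel workflow branches, this yields the 3-level parallelism mentioned on slide 9.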

  11. P-GRADE portal family summary

  12. Simultaneous use of production Grids at workflow level: the jobs of one workflow run concurrently on the UK NGS (GT2 resources, e.g. Leeds and Manchester) and on EGEE-VOCE (gLite resources, e.g. Budapest, Athens and Brno), all driven from the P-GRADE Portal server at SZTAKI; both direct and brokered (WMS) job submission are supported.

  13. Architecture (layered, top to bottom) • P-GRADE Portal (integrated workflow editor) • Workflow engine, repository • Grid middleware (UNICORE 6): services, repository • Hardware (local, or attached via grid, cloud or internet), batch system (a submission sketch follows this slide)
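One way to drive the UNICORE 6 layer from the portal side is the UNICORE Commandline Client (ucc). A minimal sketch, assuming ucc is installed and configured for the target grid and that "ucc run <jobfile>" submits a JSON job description (verify both against the UCC version in use):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    // Sketch: write a minimal UCC-style job description and hand it to the
    // grid middleware via the ucc command-line client.
    public class UnicoreSubmitSketch {

        public static void main(String[] args) throws IOException, InterruptedException {
            // Job description keys follow UCC's JSON job format; check them
            // against your UCC documentation.
            String job = "{\n"
                    + "  \"Executable\": \"/bin/echo\",\n"
                    + "  \"Arguments\": [\"hello from the portal\"]\n"
                    + "}\n";
            Path jobFile = Files.createTempFile("portal-job", ".u");
            Files.writeString(jobFile, job);

            // inheritIO() lets the submission log appear on this console.
            Process p = new ProcessBuilder("ucc", "run", jobFile.toString())
                    .inheritIO()
                    .start();
            System.exit(p.waitFor());
        }
    }

A production portlet would call such a client (or a UNICORE client library) directly rather than a standalone main method; the command-line route is shown only because it needs the fewest assumptions.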

  14. Current work • P-GRADE installation • Gaussian/Gromacs portlets • UNICORE integration

  15. SHIWA: SHaring Interoperable Workflows for Large-Scale Scientific Simulations on Available DCIs. Introduction, 2010-04-22. Start date: 2010-07-01. Duration: 24 months. SHIWA consortium, http://shiwa-workflow.eu. SHIWA is supported by the FP7 Capacities Programme under contract No RI-261585.

  16. Main objectives of SHIWA • To enable developing workflows, uploading them to a repository, and searching, downloading and re-using them inside and across Virtual Research Communities • To achieve coarse- and fine-grained workflow interoperability, enabling workflow sharing • To support Virtual Research Communities in designing and implementing workflows for running in-silico experiments • To improve interoperability among Distributed Computing Infrastructures (DCIs) at workflow level • To simplify access to DCIs so that workflows can run on multiple DCIs • To promote the use of European e-Infrastructures among simulation communities from different disciplines

  17. Workflow interoperability by SHIWA

  18. Project partners of SHIWA

  19. Organisation of work (WPs)

  20. IWSG’10 • International Workshop on Science Gateways for e-Science • Follow-up workshop to IWPLS’09 • 20–22 September 2010 • Catania, Sicily • Talks, lightning talks, poster session • Submission deadline: end of August

  21. Thank you for your attention. Questions?
