
PRACE and the Greek Tier-1

PRACE Autumn School 2014, Athens, Greece




  1. PRACE and the Greek Tier-1 • PRACE Autumn School 2014, Athens, Greece

  2. Overview • PRACE • Status • History • Tier-0 Resources • The Greek Tier-1 System • Status • Technical Details

  3. PRACE, the European HPC Research Infrastructure • 8 billion core hours granted since 2010 (equivalent to a 900k-core system running for 1 year) • 303 scientific projects enabled from 38 countries • More than 20 SMEs and industrial users granted access in the first year • 360 PATC training days • 2,734 people trained • 170 applications enabled • 25 members since 2010 • 6 supercomputers in 4 hosting countries, nearly 15 Pflop/s • Common operating procedures across 35 centres in Europe • 22 prototypes evaluated • 169 white papers produced • 1,500 communications from our users • 166 theses • HPC community building: 183 events • Enabling world-class science through large-scale simulations • Providing HPC services on leading-edge capability systems • Operating as a single entity to give access to world-class supercomputers • Attracting, training and retaining competences • Leading the integration of a highly effective HPC ecosystem • Offering its resources through a single, fair pan-European peer-review process to academia and industry

  4. 25 PRACE Members • Legal entity (AISBL) PRACE with seat in Brussels • 67+ million € from EC FP7 for the preparatory and implementation phases (Grants INFSO-RI-211528, 261557, and 283493), complemented by ~50 million € from PRACE members • Austria: JKU - Johannes Kepler University of Linz • Belgium: DGO6-SPW - Service Public de Wallonie • Bulgaria: NCSA - Executive agency • Cyprus: CaSToRC - The Cyprus Institute • Czech Republic: VŠB - Technical University of Ostrava • Denmark: DCSC - Danish Center for Scientific Computing • Finland: CSC - IT Center for Science Ltd. • France: GENCI - Grand Equipement National de Calcul Intensif • Germany: GCS - GAUSS Centre for Supercomputing e.V. • Greece: GRNET - Greek Research and Technology Network S.A. • Hungary: NIIFI - National Information Infrastructure Development Institute • Ireland: ICHEC - Irish Centre for High-End Computing • Israel: IUCC - Inter-University Computation Center • Italy: CINECA - Consorzio Interuniversitario • Norway: SIGMA - UNINETT Sigma AS • The Netherlands: SURFsara - SARA Computing and Networking Services • Poland: PSNC - Instytut Chemii Bioorganicznej PAN • Portugal: FCTUC - Faculdade Ciencias e Tecnologia da Universidade de Coimbra • Slovenia: ULFME - University of Ljubljana, Faculty of Mechanical Engineering • Spain: BSC - Barcelona Supercomputing Center - Centro Nacional de Supercomputación • Sweden: SNIC - Vetenskapsrådet - Swedish Research Council • Switzerland: ETH - Eidgenössische Technische Hochschule Zürich • Turkey: UYBHM - Ulusal Yuksek Basarimli Hesaplama Merkezi • UK: EPSRC - The Engineering and Physical Sciences Research Council

  5. PRACE History (2004–2014) • HPCEUR and HET: HPC becomes part of the ESFRI Roadmap; creation of a vision involving 15 European countries; creation of the Scientific Case • PRACE Initiative: signature of the MoU • PRACE Preparatory Phase Project • Creation of the PRACE Research Infrastructure (PRACE RI) • Implementation projects PRACE-1IP, PRACE-2IP and PRACE-3IP

  6. Three pillars of the PRACE Mission • HPC for Science: implement the ESFRI vision of a European HPC service at the top of an HPC provisioning pyramid • HPC for Industry: guarantee European industry independent access to HPC competence • HPC by Vendors: help European hardware and software vendors to foster their technology and HPC competence

  7. Realizing the ESFRI Vision for an HPC RI • European HPC facilities at the top of an HPC provisioning pyramid (capability increases toward the top of the pyramid, the number of systems toward its base) • Tier-0: European centres for Petaflop/s • Tier-1: national centres • Tier-2: regional/university centres • Creation of a European HPC ecosystem • HPC service providers on all tiers • Scientific and industrial user communities • The European HPC hard- and software industry • Other e-Infrastructures

  8. Tier-0 Petaflop Capability in PRACE • BlueGene/Q 5.87 Petaflop/s, PRACE@GCS@Jülich • BlueGene/Q 1 Petaflop/s, PRACE@CINECA • Bull cluster Curie 1.8 Petaflop/s, PRACE@GENCI@CEA • Cray Hornet 4 Petaflop/s, PRACE@GCS@HLRS • IBM SuperMUC 3 Petaflop/s, PRACE@GCS@LRZ • MareNostrum 1 Petaflop/s, PRACE@BSC

  9. Getting Access to PRACE Resources • Access is … • … granted based on scientific merit only, through peer review • … open to all researchers in public and private organisations • … free of charge for all users • Different types of access are offered • Preparatory access • only technical peer review • prepare for project access, optionally with PRACE support • Project access • both technical and scientific peer review by 3 independent reviewers • 12-month allocations, 2 calls per year: watch www.prace-ri.eu • Multi-year access (pilot in next call: April 17) • both technical and scientific peer review • demand has been assessed through a call for EoIs

  10. Scientific Achievements: some examples (Academia) • CLIMATE: PAVING THE PATH TO THE 6th IPCC CAMPAIGN: an international collaboration towards new high-resolution and ensemble climate models. Team: F. Doblas-Reyes (Catalan Institute of Climate Sciences), C. Jones (SMHI, Sweden), E. Maisonnave (CERFACS), W. Hazeleger (KNMI). 38+50 million core hours on MareNostrum (Spain) • IMAGING A WHOLE SEISMIC AREA: 1st numerical mapping of the North of Italy. Team: Dr. Andrea Morelli, Istituto Nazionale di Geofisica e Vulcanologia, Italy. 53.4 million core hours on SuperMUC (Germany)

  11. Industrial impacts: OPEN R&D examples (1/2) • Industry: Large Companies • SAFER CARS: multiplying crash simulation parameters → anticipating the new European safety rules (EuroNCAP6 in 2015). Team: M. Pariente, Y. Tourbier (Renault), A. Kamoulakos (ESI Group) and Mines St Etienne. 42 million core hours on CURIE (France) • SOLVE INSTABILITIES IN HELICOPTER ENGINES: optimized and reliable turbines → predict combustion instabilities on gas turbines and piston engines → combustion = 86% of the use of energy on Earth. Team: Anthony Roux (Safran), G. Staffelbach (CERFACS). 15 million core hours on CURIE (France)

  12. Industrial impacts: OPEN R&D examples (2/2) • Industry: SMEs • IMPROVING SHIP SURVIVABILITY UNDER WAVE IMPACT, AND AQUAPLANING FOR AUTOMOTIVE → major step: 32k-core simulations, increased EU visibility. Team: M. De Leffe, D. Guilber et al. (HydrOcean), G. Oger, N. Grenier et al. (Ecole Centrale de Nantes). 20.8 million core hours on Hermit (Germany), 8.2 million core hours on CURIE (France) • USING HPC TO DISCOVER NEW THERAPIES → creation of a database containing a comprehensive range of possible protein targets for every drug available. Team: Dompé / University of Parma. 200,000 core hours on CURIE (France)

  13. HPC to accelerate growth and innovation for European SMEs through open R&D • HydrOcéan was again awarded, under the PRACE 6th Call for Proposals for Project Access, 8.2 million core hours on CURIE (GENCI@CEA, France)

  14. Total core hours dedicated in Calls 1–7 (selected domains): Earth System Sciences 9%, Fundamental Physics 19%, Engineering & Energy 13% • “Starting mid-2014, PRACE 2.0 will come into life, continuing and increasing the access to Europe’s largest and most capable systems through a single peer-review process”

  15. Tier-1 Resource Exchange Programme (DECI) • Calls twice per year, projects for 1 year, enabling support provided • For European researchers • Access to the most powerful national (Tier-1) computing resources in Europe • Resource exchange programme: gives EU researchers the ability to use architectures and systems not available in their own countries • Cray XE6, Cray XC30, IBM Blue Gene/Q, Intel clusters (various processor and memory configurations) and hybrid systems (clusters with GPGPU accelerators)

  16. PRACE-GR: Supercomputing services for the Greek Research and Academic Community (MIS 379417)

  17. Who we are • GRNET: the Greek Research and Technology Network • Academic network provider and more: also a major national e-Infrastructure provider • Grids: HellasGrid / member of EGI • Cloud computing: ~okeanos IaaS cloud • HPC: PRACE-GR / member of PRACE • Our mandate: to provide state-of-the-art IT infrastructure and services to the Greek research and academic community

  18. What is PRACE-GR • Goals: • the development of a national HPC infrastructure • offering resources to PRACE as a Tier-1 system • Targeted call by the Ministry of Development (2011), part of the overall strategy to support Greece’s involvement in the ESFRI projects, PRACE in particular • Budget: 3.5 M€ • procurement and installation of HPC infrastructure (3.2 M€) • operation and provision of support services (0.3 M€)

  19. How it all started • HellasHPC network of excellence • ran for 8 months (2010–2011) • 35 partners, MoU • HPC state of the art • national strategy • feasibility study • Based on the results of the HellasHPC feasibility study • national survey among 29 academic and research institutes (summer 2010) • collected requirements from 200 scientific applications developed by 162 research teams across various scientific domains

  20. What the system will look like • 170–180 Tflop/s (Linpack) • Infiniband 1:1 fabric • User data / home directories • Scratch space for applications • Monitoring/management • Access nodes, admin nodes, other services

  21. Current status • Open tender announced on 5/9/2013 and closed on 18/11/2013 • Formal proposal evaluation completed and the contract has been awarded • IBM NextScale: 426 nodes, dual-socket Intel Xeon E5-2680v2 (10 cores per socket, 2.8 GHz), 8,520 cores in total, 64 GB RAM per node • Infiniband FDR-14 (56 Gbps) non-blocking fabric • GPFS-based parallel storage (~1 PB) • Cooling infrastructure • plus other support systems and networks
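As a back-of-the-envelope check (our arithmetic, not from the slides): assuming 8 double-precision FLOPs per cycle per core with AVX on Ivy Bridge, the quoted core count and clock give a theoretical peak consistent with the 170–180 Tflop/s Linpack figure above:

```python
# Theoretical peak for the configuration above.
# Assumption (not from the slides): 8 double-precision FLOPs per
# cycle per core with AVX on Ivy Bridge (E5-2680v2).
CORES = 8520            # total cores listed on the slide
CLOCK_HZ = 2.8e9        # E5-2680v2 base clock
FLOPS_PER_CYCLE = 8     # AVX: 4-wide DP add + 4-wide DP multiply

peak_tflops = CORES * CLOCK_HZ * FLOPS_PER_CYCLE / 1e12
print("Theoretical peak: %.1f Tflop/s" % peak_tflops)
```

A 170–180 Tflop/s Linpack result would be roughly 89–94% of this ~191 Tflop/s peak, a plausible efficiency for a non-blocking FDR Infiniband cluster.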

  22. User / Application Software • PGI CDK Cluster Development Kit • GNU Compiler Collection • Intel Cluster Studio XE • FFTW • BLAS and GotoBLAS • LAPACK • GNU Scientific Library • GAUSSIAN • NWChem • GROMACS • NAMD • CP2K • Code_Saturne • WRF • HDF5 • NetCDF • GRIB • MPI libraries (MVAPICH2; MPI 1.2 and 2.2), OpenMP 3.1 • UPC (Berkeley runtime + GCC UPC)
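FFTW appears in the stack above. As a reminder of what it computes (a sketch only: FFTW itself is a C library driven from C/Fortran codes, and this naive version is O(n²) rather than FFTW's O(n log n)):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform:
    X[k] = sum_j x[j] * exp(-2*pi*i*j*k/n).
    FFTW computes the same quantity with fast O(n log n) algorithms."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n)
                for j in range(n))
            for k in range(n)]

# A constant signal concentrates all energy in the zero-frequency bin:
spectrum = dft([1.0, 1.0, 1.0, 1.0])
```

In practice, codes on the system would call FFTW (or use it through the higher-level packages listed) rather than hand-rolling the transform.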

  23. Programming Languages • Fortran 77 (X3.9-1978), 90 (ISO/IEC 1539:1991), 95 (ISO/IEC 1539-1:1997) and 2003 (ISO/IEC 1539-1:2004) • C (ISO/IEC 9899:1999) • C++ (ISO/IEC 14882:1998) • Python 2.6, 2.7 and 3.3 • Java SE 6 and 7 • Perl 5 and 6 • Tcl/Tk 8.5.5 and 8.6 • UPC • Co-Array Fortran

  24. How you can take advantage of the system • Open calls and a peer-review procedure similar to PRACE’s • Early access • Preparatory and regular access • Establishment of review committees • Official procedures to be defined in the coming months

  25. What’s next • Future expansions of the system • GPUs / accelerators • Fat nodes (more cores and memory per box) for better support of shared-memory applications • More storage • Expansions have already been taken into account, to make sure the hosting infrastructure will be able to support them

  26. More info • GRNET site: http://www.grnet.gr • Mailing list: allmembers@lists.hellashpc.gr
