
Intensive computation at LUTH

Intensive computation at LUTH: Towards Grand Challenge Simulations. Yann Rasera.


Presentation Transcript


  1. Intensive computation at LUTH: TOWARDS GRAND CHALLENGE SIMULATIONS. Yann Rasera, 17/03/09, AERES LUTH

  2. Intensive computation • I. Resources of LUTH • II. Distributed computing: EGEE 3 • III. Towards Massively Parallel Processing • IV. Grand Challenge simulations

  3. INTENSIVE COMPUTATION AT LUTH • Physics and astrophysics theory: gravity, plasma physics, galaxy formation, interstellar medium chemistry, solar wind MHD • Simulations and analysis on supercomputers and grids: massively parallel runs, hybrid simulations, distributed computing • Numerical algorithm development: spectral methods, Poisson solvers, radiative transfer, chemistry solvers • Framework for the interpretation of observational data: HESS, ALMA, Herschel, Planck, CoRoT, GAIA

  4. I. RESOURCES OF LUTH • Local computing resources: 222 cores and 33 TB • Important use of the SIO mesocenter • Use of and active participation in EGEE grid development • Many parallel codes written or developed locally • Allocations on the three main supercomputing centers in France (ranked 14th, 16th, and 48th in the Top500) • A scientific computing engineer, expert in code parallelization

  5. II. DISTRIBUTED COMPUTING: EGEE 3. ACTIVE PARTICIPATION AND USE OF THE A&A CLUSTER • PHYSICS AND CHEMISTRY OF THE INTERSTELLAR MEDIUM • Meudon PDR code (F. Le Petit, J. Le Bourlot, E. Roueff) • UV radiative transfer, chemistry, thermal processes • Detailed observations → strong constraints • Explore the parameter space (density, CR flux, dust…), as sketched after this slide • Hundreds of models in 3 days (instead of several months) • VERY HIGH ENERGY γ-RAY EMISSION FROM AGN • SSC code (K. Katarzynski, J-P. Lenain, H. Sol) • Synchrotron Self-Compton emission • HESS observations • 25 parameters • 30,000 jobs (60,000 mono-CPU hours) in only three months
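Both use cases are embarrassingly parallel: each parameter combination is an independent job on the grid. A minimal sketch of how such a parameter scan might be enumerated is given below; the parameter names, values, and file format are illustrative placeholders, not the actual inputs of the Meudon PDR or SSC codes.

```python
# Hypothetical sketch of a grid parameter scan: one independent job per model.
# Parameter names and values are illustrative, not those of the Meudon PDR code.
from itertools import product

densities   = [1e2, 1e3, 1e4, 1e5]    # gas density [cm^-3] (example values)
cr_rates    = [1e-17, 1e-16, 1e-15]   # cosmic-ray ionisation rate [s^-1]
dust_ratios = [0.5, 1.0, 2.0]         # dust-to-gas ratio relative to solar

models = list(product(densities, cr_rates, dust_ratios))
for i, (n, zeta, d) in enumerate(models):
    # Each model is independent, so it can be submitted as a separate grid job;
    # here we only write one input file per parameter combination.
    with open(f"model_{i:04d}.in", "w") as f:
        f.write(f"density={n}\ncr_rate={zeta}\ndust={d}\n")

print(f"{len(models)} independent models prepared for submission")
```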

  6. III. TOWARDS MASSIVELY PARALLEL PROCESSING • SUPERNOVA REMNANTS AND JETS FROM YOUNG STARS • HYDRO-MUSCL (C. Nguyen, C. Cavet, C. Michaut) • Hydrodynamics and cooling • Finite volume method: Riemann solver • Parallelization with MPI • Radiative transfer (under development) • BINARY BLACK HOLE ORBITS • KADATH (P. Grandclément, E. Gourgoulhon, J. Novak) • General relativity: spectral solver • Decomposition on Chebyshev polynomials • Parallel computation of the Jacobian, column by column (see the sketch after this slide) • Inversion of the Jacobian matrix (200,000 × 200,000) • Uses the MUMPS and SuperLU parallel libraries • Parallel version under development: scaling is promising
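The column-by-column structure is what makes the Jacobian computation parallel: each column depends only on one perturbed input, so columns can be assigned to different MPI processes. A minimal finite-difference sketch of that structure is shown below (serial for clarity); it is a toy illustration, not KADATH's actual spectral Jacobian.

```python
# Minimal sketch of a column-by-column Jacobian: each column is independent,
# so in a parallel version column j could be computed by process j % nprocs.
import numpy as np

def jacobian_by_columns(F, x, eps=1e-7):
    """Finite-difference Jacobian of F: R^n -> R^m, built one column at a time."""
    x = np.asarray(x, dtype=float)
    F0 = np.asarray(F(x))
    J = np.empty((F0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += eps                 # perturb one input only
        J[:, j] = (np.asarray(F(xp)) - F0) / eps
    return J

# Toy nonlinear system standing in for the discretised field equations
F = lambda x: np.array([x[0]**2 + x[1] - 1.0, x[0] - x[1]**3])
print(jacobian_by_columns(F, [0.5, 0.5]))
```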

  7. GALAXY FORMATION • COSMO3D (J-M Alimi, S. Courty, F. Roy, R. Teyssier, J-P Chièze, E. Audit) • Poisson, hydrodynamics, and chemistry solvers • Domain decomposition (illustrated after this slide) • Runs on hundreds of processors • MAGNETIC STELLAR ATMOSPHERES • CARATSTRAT (G. Alecian, M. J. Stift) • Polarized transfer, atomic diffusion, abundance stratification • Radiative transfer and diffusion equation solvers • Hybrid MPI/Ada version under development • Up to 128 processes at CINES
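Domain decomposition is the generic pattern behind these MPI codes: each process owns one sub-domain and exchanges only boundary (ghost) cells with its neighbours. The sketch below shows the idea for a 1-D slab decomposition with mpi4py; it is a schematic illustration under assumed grid sizes, not COSMO3D's actual scheme.

```python
# Illustrative 1-D slab domain decomposition with ghost-cell exchange.
# Run with e.g.: mpirun -n 4 python slab_decomposition.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 64                      # global number of cells (assumes size divides N)
nloc = N // size            # cells owned by this process
u = np.zeros(nloc + 2)      # local slab plus one ghost cell per side
u[1:-1] = rank              # fill the interior with a recognisable value

left  = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Exchange boundary cells with neighbours so each process can update its interior
comm.Sendrecv(sendbuf=u[1:2],   dest=left,  recvbuf=u[-1:], source=right)
comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[:1],  source=left)

if rank == 0:
    print("ghost-cell exchange done on", size, "processes")
```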

  8. IV. GRAND CHALLENGE SIMULATIONS. AN EXAMPLE: THE DARK ENERGY UNIVERSE SIMULATION SERIES. J-M Alimi, Y. Rasera, F. Roy, J. Courtin, P-S Corasaniti, A. Füzfa, V. Boucher, F. Fraschetti, R. Teyssier. GOAL: Imprints of DARK ENERGY on COSMIC STRUCTURE FORMATION

  9. N-BODY SIMULATION SERIES • THREE DARK ENERGY COSMOLOGIES • ΛCDM (standard model) • Quintessence with Ratra-Peebles potential (RPCDM) • Quintessence with Sugra potential (SUCDM) • Calibrated on the latest WMAP CMB data and UNION SNIa data • THREE BOX LENGTHS • 3.6 Gpc: good statistics on clusters • 900 Mpc: good statistics on Milky-Way-size halos, internal structure of clusters • 225 Mpc: small halos, internal structure, redshift evolution • Probe from cosmological to subgalactic scales • NINE SIMULATIONS WITH 1 BILLION PARTICLES EACH (see the particle-mass estimate after this slide) • Up to 7 billion resolution elements • Resolve scales from 4 kpc to 4 Gpc • Resolve halos from 3×10^10 Msun to 8×10^15 Msun • Up to 500,000 resolved halos per simulation • Up to 3,000,000 particles per halo • THE LARGEST DARK ENERGY SIMULATION SERIES TO DATE
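The quoted mass range follows from the particle mass, m_p = Ω_m ρ_crit (L / N^(1/3))^3, for roughly 1 billion particles per box. The back-of-envelope sketch below illustrates that relation; the Ω_m value and the box-length units (Mpc/h) are assumptions for illustration, not figures quoted on the slide.

```python
# Back-of-envelope particle mass per box: m_p = Omega_m * rho_crit * (L / N^(1/3))^3.
# Omega_m and the Mpc/h units are assumed here for illustration only.
RHO_CRIT = 2.775e11      # critical density in h^2 Msun / Mpc^3
OMEGA_M  = 0.26          # assumed WMAP-era matter density parameter
N        = 1024**3       # ~1 billion particles per run

for L in (3600.0, 900.0, 225.0):             # box lengths (assumed Mpc/h)
    m_p = OMEGA_M * RHO_CRIT * L**3 / N
    print(f"L = {L:6.0f} Mpc/h  ->  particle mass ~ {m_p:.1e} Msun/h")
```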

  10. SCALABILITY AND RUNS. NEEDS A SUITE OF PARALLEL CODES WITH GOOD SCALABILITY: 4096 processes, only 512 MB of memory per process • Initial conditions: MPGRAFIC (S. Prunet, C. Pichon) + QUINT (Y. Rasera) • N-body solver: RAMSES (R. Teyssier) + QUINT (Y. Rasera) • Quick power spectrum for tests: POWERGRID (S. Prunet) + parallelization (Y. Rasera); a toy P(k) estimator is sketched after this slide • Analysis: parallel Friends-of-Friends halo finder (F. Roy), developed for this run! NEEDS A LOT OF CPU TIME: 5,000,000 mono-CPU hours (600 years) on Babel (IDRIS) • Allocation possible thanks to the Horizon Project • First to use Babel with up to 24,576 processors • Found an I/O node problem and an MPI bug • Many supercomputer crashes → efficient restarts
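For context, a power-spectrum check of this kind amounts to Fourier-transforming the density contrast on a grid and averaging |δ_k|² in spherical shells of k. The sketch below shows that shell-averaging on a toy grid; the normalisation, grid size, and binning are illustrative and do not reproduce POWERGRID itself.

```python
# Toy shell-averaged matter power spectrum P(k) on a periodic density grid.
import numpy as np

def power_spectrum(delta, boxsize, nbins=16):
    """FFT the density contrast and average |delta_k|^2 in spherical k-shells."""
    n = delta.shape[0]
    dk = np.fft.fftn(delta)
    power = np.abs(dk)**2 * boxsize**3 / n**6          # simple P(k) normalisation
    kfreq = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    kx, ky, kz = np.meshgrid(kfreq, kfreq, kfreq, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2).ravel()
    bins = np.linspace(0.0, kmag.max(), nbins + 1)
    idx = np.digitize(kmag, bins) - 1                  # shell index per mode
    counts = np.bincount(idx, minlength=nbins + 1)[:nbins]
    sums = np.bincount(idx, weights=power.ravel(), minlength=nbins + 1)[:nbins]
    return 0.5 * (bins[1:] + bins[:-1]), sums / np.maximum(counts, 1)

delta = np.random.randn(32, 32, 32)                    # toy density contrast field
k, pk = power_spectrum(delta, boxsize=225.0)
print(k[:4], pk[:4])
```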

  11. LARGE VOLUME DATA AND POST-PROCESSING. NEEDS A GOOD NETWORK AND BACKUP SYSTEM: 216 snapshots + 6 lightcones + 3 samples = 40 TB • LUTH computer room moved to a gigabit connection • Recently purchased: backup system (10 TB) + Horizon 2 server (7 TB) • 13 TB stored locally + 5 TB of additional copies. NEEDS TO ANALYSE AND ORGANIZE DATA: creation of a parallel halo finder (F. Roy); a minimal friends-of-friends sketch follows this slide • Parallel version using domain decomposition • Tested up to 2048³ particles and 4096 processes • Sorts particles by region or halo • Subsequent analysis is therefore communication-free. NEEDS TO DISSEMINATE DATA: a dark energy universe virtual observatory • Project: website, "Dark energy virtual observatory", with the Horizon collaboration
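Friends-of-friends grouping links any two particles closer than a chosen linking length and takes the connected components as halos. The sketch below is a minimal serial, non-periodic version of that idea; the actual parallel halo finder adds domain decomposition and particle exchange, which are omitted here.

```python
# Minimal serial friends-of-friends grouping (non-periodic, illustrative only).
import numpy as np
from scipy.spatial import cKDTree

def friends_of_friends(pos, linking_length):
    """Label particles whose chains of neighbours lie within linking_length."""
    parent = np.arange(len(pos))

    def find(i):                          # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i, j in cKDTree(pos).query_pairs(linking_length):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri               # merge the two groups

    return np.array([find(i) for i in range(len(pos))])   # halo label per particle

pos = np.random.rand(1000, 3)             # toy particle positions in [0, 1)^3
labels = friends_of_friends(pos, linking_length=0.02)
print("number of groups:", len(np.unique(labels)))
```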

  12. [Figure: ΛCDM, Sugra, and Ratra-Peebles density fields shown at 225 Mpc, 56 Mpc, and 14 Mpc scales.]

  13. FIRST RESULTS [Figure: ΛCDM, Sugra, and Ratra-Peebles snapshots at z = 2.3, z = 1, and z = 0.] • Unprecedented range of masses and scales for dark energy simulations • Dark energy mass functions and power spectra with unprecedented accuracy • Differences between cosmologies help break degeneracies between dark energy models • Differences with analytical predictions help extend analytical models

  14. CONCLUSION • Intensive computation is a strong component at LUTH • LUTH is an active participant in the EGEE III grid for astrophysics • A leading actor in the "A&A cluster" • Use of the grid: up to 30,000 jobs in 3 months • LUTH is moving towards Massively Parallel Processing • Several parallel applications with up to 120 processes • Development and scalability tests to move to higher numbers of processes • LUTH has already performed one Grand Challenge simulation: up to 4096 processes, 5 million mono-CPU hours • LUTH is preparing for petaflop computing
