
1 17/03/09 AERES LUTH

2 I. Resources of LUTH
II. Distributed computing: EGEE III
III. Towards Massively Parallel Processing
IV. Grand Challenge simulations
17/03/09 AERES LUTH

3 Physics and astrophysics theory: gravity, plasma physics, galaxy formation, interstellar medium chemistry, solar wind MHD
Numerical algorithm development: spectral methods, Poisson solver, radiative transfer, chemistry solver
Simulations and analysis on supercomputers and grids: massively parallel runs, hybrid simulations, distributed computing
Framework for the interpretation of observational data: HESS, ALMA, Herschel, Planck, COROT, GAIA
17/03/09 AERES LUTH

4 Local computing resources: 222 cores and 33 TB
Important use of the SIO mesocenter
Use of and active participation in EGEE grid development
Many parallel codes written or developed locally
Allocations at the three main supercomputing centers in France (ranked 14th, 16th, and 48th in the Top500)
Engineer in scientific computing: expert in code parallelization
17/03/09 AERES LUTH

5 PHYSICS AND CHEMISTRY OF THE INTERSTELLAR MEDIUM
Meudon PDR code (F. Le Petit, J. Le Bourlot, E. Roueff)
UV radiative transfer – chemistry – thermal processes
Detailed observations → strong constraints
Explore the parameter space (density, CR flux, dust…): hundreds of models in 3 days (instead of several months)
VERY HIGH ENERGY γ-RAY EMISSION FROM AGN
SSC code (K. Katarzynski, J-P. Lenain, H. Sol)
Synchrotron Self-Compton emission
HESS observations, 25 parameters
30 000 jobs (60 000 mono-CPU hours) in only three months
ACTIVE PARTICIPATION IN AND USE OF THE A&A CLUSTER
17/03/09 AERES LUTH
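Both use cases follow the same embarrassingly parallel pattern: one independent grid job per point of a parameter grid. A minimal sketch of that pattern is given below; the parameter names, values, and the `pdr_model` command are illustrative placeholders, not the actual PDR or SSC inputs.

```python
# Sketch: generate one independent grid job per point of a parameter grid,
# the pattern behind the "hundreds of models in 3 days" figure.
import itertools

# Hypothetical parameter ranges; the real grids over density, cosmic-ray
# flux and dust properties are not given in the slide.
densities = [1e2, 1e3, 1e4, 1e5]      # cm^-3
cr_fluxes = [1e-17, 5e-17, 1e-16]     # s^-1
dust_ratios = [0.5, 1.0, 2.0]         # relative to standard

jobs = []
for i, (n, zeta, dust) in enumerate(
        itertools.product(densities, cr_fluxes, dust_ratios)):
    jobs.append({
        "id": i,
        "command": f"./pdr_model --density {n} --cr-flux {zeta} --dust {dust}",
    })

# Each entry becomes one independent grid job (e.g. one job description
# submitted to EGEE), so the whole grid of models runs in parallel.
print(f"{len(jobs)} independent jobs to submit")
```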

6 SUPERNOVA REMNANTS AND JETS FROM YOUNG STARS
HYDRO-MUSCL (C. Nguyen, C. Cavet, C. Michaut)
Hydrodynamics and cooling
Finite-volume method: Riemann solver
Parallelization with MPI
Radiative transfer (under development)
BINARY BLACK HOLE ORBITS
KADATH (P. Grandclément, E. Gourgoulhon, J. Novak)
General relativity: spectral solver
Decomposition on Chebyshev polynomials
Parallel computation of the Jacobian, column by column
Inversion of the Jacobian matrix (200 000 × 200 000)
Uses the MUMPS and SuperLU parallel libraries
Parallel version under development: scaling is promising
17/03/09 AERES LUTH
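The column-by-column strategy is what makes the Jacobian computation naturally parallel: each process can build its own subset of columns independently by finite differences of the residual. A toy sketch of the idea follows (KADATH itself is C++ and its residual comes from the spectral discretisation of Einstein's equations; the `residual` function and problem size below are placeholders).

```python
# Toy sketch of "Jacobian column per column" parallelism with mpi4py.
import numpy as np
from mpi4py import MPI

def residual(u):
    # Placeholder nonlinear system F(u) = 0 standing in for the real one.
    return u**3 - np.roll(u, 1) - 1.0

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 64                     # toy size (the real matrix is ~200 000 x 200 000)
u0 = np.ones(n)
f0 = residual(u0)
eps = 1e-7

# Each rank computes the columns j = rank, rank+size, rank+2*size, ...
local_cols = {}
for j in range(rank, n, size):
    du = u0.copy()
    du[j] += eps
    local_cols[j] = (residual(du) - f0) / eps

# Gather the columns on rank 0 and assemble the dense Jacobian; the real
# run hands the matrix to MUMPS or SuperLU for the parallel solve.
gathered = comm.gather(local_cols, root=0)
if rank == 0:
    J = np.empty((n, n))
    for cols in gathered:
        for j, col in cols.items():
            J[:, j] = col
    print("Jacobian assembled:", J.shape)
```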

7 GALAXY FORMATION
COSMO3D (J-M Alimi, S. Courty, F. Roy, R. Teyssier, J-P Chièze, E. Audit)
Poisson, hydrodynamics, and chemistry solvers
Domain decomposition
Runs on hundreds of processors
MAGNETIC STELLAR ATMOSPHERES
CARATSTRAT (G. Alecian, M. J. Stift)
Polarized transfer, atomic diffusion, abundance stratification
Radiative transfer and diffusion equation solvers
Hybrid MPI/ADA version under development
Up to 128 processes at CINES
17/03/09 AERES LUTH
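Domain decomposition is the common ingredient behind these scaling figures: each process owns one piece of the computational grid and only exchanges boundary (ghost) cells with its neighbours. A toy 1-D sketch of the pattern, assuming a simple slab decomposition (the production codes are compiled Fortran/C with MPI, not Python):

```python
# Toy 1-D slab decomposition with ghost-cell exchange (mpi4py).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_global = 1024
n_local = n_global // size          # assumes size divides n_global
# one ghost cell on each side for the hydrodynamics stencil
u = np.zeros(n_local + 2)
u[1:-1] = rank                      # dummy initial data

left = (rank - 1) % size
right = (rank + 1) % size

# Exchange ghost cells with the neighbouring slabs (periodic domain).
comm.Sendrecv(sendbuf=u[1:2],   dest=left,  recvbuf=u[-1:], source=right)
comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)

# After the exchange, each rank updates its interior cells independently,
# which is what lets the solver scale to hundreds of processes.
```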

8 AN EXAMPLE: THE DARK ENERGY UNIVERSE SIMULATION SERIES
J-M Alimi, Y. Rasera, F. Roy, J. Courtin, P-S Corasaniti, A. Füzfa, V. Boucher, F. Fraschetti, R. Teyssier
GOAL: Imprints of DARK ENERGY on COSMIC STRUCTURE FORMATION
17/03/09 AERES LUTH

9 THREE DARK ENERGY COSMOLOGIES
ΛCDM (standard model)
Quintessence with Ratra-Peebles potential (RPCDM)
Quintessence with Sugra potential (SUCDM)
→ Calibrated on the latest WMAP CMB data and UNION SNIa data
THREE BOX LENGTHS
3.6 Gpc: good statistics on clusters
900 Mpc: good statistics on Milky-Way-size halos – internal structure of clusters
225 Mpc: small halos – internal structure – redshift evolution
→ Probes scales from cosmological to subgalactic
NINE SIMULATIONS WITH 1 BILLION PARTICLES EACH
Up to 7 billion resolution elements
Resolve scales from 4 kpc to 4 Gpc
Resolve halos from 3×10^10 Msun to 8×10^15 Msun
Up to 500 000 resolved halos per simulation
Up to 3 000 000 particles per halo
THE LARGEST DARK ENERGY SIMULATION SERIES TO DATE
17/03/09 AERES LUTH
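As a rough consistency check (not from the slides), the mass resolution follows directly from the box size and particle number. The sketch below assumes 1024^3 particles per run and a flat cosmology with Ωm ≈ 0.26 and h ≈ 0.72, none of which are stated on the slide:

```python
# Back-of-the-envelope particle mass for each box (assumed cosmology).
rho_crit = 2.775e11          # h^2 Msun / Mpc^3
omega_m, h = 0.26, 0.72      # assumed values, not from the slide
n_part = 1024**3             # ~1 billion particles per run

for box_mpc in (3600.0, 900.0, 225.0):
    box_mpc_h = box_mpc * h                                   # Mpc/h
    m_particle = omega_m * rho_crit * box_mpc_h**3 / n_part   # Msun/h
    print(f"L = {box_mpc:6.0f} Mpc : particle mass ~ {m_particle / h:.2e} Msun")

# The 225 Mpc box gives a few 1e8 Msun per particle, so the quoted minimum
# halo mass of 3e10 Msun corresponds to roughly a hundred particles.
```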

10 NEEDS A SUITE OF PARALLEL CODES WITH GOOD SCALABILITY
Initial conditions: MPGRAFIC (S. Prunet, C. Pichon) + QUINT (Y. Rasera)
N-body solver: RAMSES (R. Teyssier) + QUINT (Y. Rasera)
Quick power spectrum for tests: POWERGRID (S. Prunet) + PARALLEL (Y. Rasera)
Analysis: parallel Friends-of-Friends halo finder (F. Roy) → developed for this run!
4096 processes – only 512 MB of memory per process
NEEDS A LOT OF CPU TIME
5 000 000 mono-CPU hours (600 years) on Babel (IDRIS)
Allocation possible thanks to the Horizon Project
First to use Babel with up to 24576 processors
Found an I/O node problem and an MPI bug
Many crashes of the supercomputer → efficient restart
17/03/09 AERES LUTH
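For a quick diagnostic of the kind POWERGRID provides, a matter power spectrum can be estimated directly from an FFT of the density-contrast grid. A serial toy version is sketched below (the actual tool is parallel; the binning and normalisation choices here are illustrative):

```python
# Toy FFT-based power-spectrum estimator for a cubic density-contrast grid.
import numpy as np

def power_spectrum(delta, box_size, n_bins=16):
    """delta: cubic density-contrast grid; box_size: comoving box length."""
    n = delta.shape[0]
    delta_k = np.fft.rfftn(delta)
    # P(k) estimator with numpy's unnormalised FFT convention
    pk3d = np.abs(delta_k)**2 * box_size**3 / n**6

    # |k| of every Fourier mode
    kf = 2.0 * np.pi * np.fft.fftfreq(n, d=box_size / n)
    kr = 2.0 * np.pi * np.fft.rfftfreq(n, d=box_size / n)
    kx, ky, kz = np.meshgrid(kf, kf, kr, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2)

    # spherical average in linear bins of |k| (DC mode excluded)
    bins = np.linspace(kmag[kmag > 0].min(), kmag.max(), n_bins + 1)
    which = np.digitize(kmag.ravel(), bins)
    pk = np.array([pk3d.ravel()[which == i].mean() for i in range(1, n_bins + 1)])
    return 0.5 * (bins[:-1] + bins[1:]), pk

# Example on white noise:
rng = np.random.default_rng(0)
k, pk = power_spectrum(rng.normal(size=(64, 64, 64)), box_size=225.0)
```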

11 NEEDS A GOOD NETWORK AND BACKUP SYSTEM
216 snapshots + 6 lightcones + 3 samples = 40 TB
LUTH computer room moved to a gigabit connection
Recently bought: backup system (10 TB) + horizon 2 server (7 TB)
13 TB stored locally + 5 TB of additional copy
NEEDS TO ANALYSE AND ORGANIZE THE DATA
Creation of a parallel halo finder (F. Roy)
Parallel version using domain decomposition
Tested up to 2048^3 particles and 4096 processes
Particles are sorted on a region or halo basis
Subsequent analysis is therefore communication-free
NEEDS TO DISSEMINATE THE DATA
Dark Energy Universe virtual observatory
Project: website, "Dark energy virtual observatory", Horizon collaboration
17/03/09 AERES LUTH
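The friends-of-friends algorithm behind the halo finder links every pair of particles closer than a fixed linking length and takes the resulting connected groups as halos. A serial toy version for illustration only; the production finder adds the domain decomposition and MPI layer described above:

```python
# Serial toy friends-of-friends (FoF) grouping using a k-d tree + union-find.
import numpy as np
from scipy.spatial import cKDTree

def fof_groups(positions, linking_length):
    """Return one group label per particle: particles joined by chains of
    neighbours closer than the linking length share a label."""
    tree = cKDTree(positions)
    pairs = tree.query_pairs(linking_length)

    # Union-find over the linked pairs
    parent = np.arange(len(positions))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in pairs:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri
    return np.array([find(i) for i in range(len(positions))])

# Example: linking length of 0.2 times the mean inter-particle distance.
pos = np.random.default_rng(1).random((5000, 3))
labels = fof_groups(pos, linking_length=0.2 * (1.0 / 5000) ** (1.0 / 3.0))
```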

12 [Figure slide: panels labelled ΛCDM, Sugra, and Ratra-Peebles]

13 Unprecedented range of masses and scales for dark energy simulations
Dark energy mass functions and power spectra with unprecedented accuracy
Differences between cosmologies: help break degeneracies between dark energy models
Differences from analytical predictions: help extend analytical models
[Figure: results for ΛCDM, Sugra, and Ratra-Peebles at z = 0, 1, and 2.3]
17/03/09 AERES LUTH

14 Intensive computation is a strong component at LUTH
LUTH is an active participant in the EGEE III grid in astrophysics: leading actor for the "A&A cluster"
Use of the grid: up to 30 000 jobs in 3 months
LUTH is moving towards Massively Parallel Processing
Several parallel applications with up to 120 processes
Development and scalability tests to move to higher numbers of processes
LUTH has already performed one Grand Challenge simulation: up to 4096 processes, 5 million mono-CPU hours
LUTH is preparing for petaflop computing
17/03/09 AERES LUTH

