1 Nick Gnedin, FCPA Retreat: Computational Astrophysics and Cosmology

2 Contents
- Dreams and reality
- Science
- Methods and Tools
- Computational Cosmology in the World… and @ FNAL

3 Ultimate Dream of Every Cosmologist
Noah's Ark of the Universe: a quantitative model of the universe in a volume large enough to represent all types of galaxies in sufficient detail.
"Every cosmologist carries a digital universe in his pack." (Napoleon Bonaparte)

4 … and Reality
If we cannot have this…
- Large-scale simulations
- Small-scale simulations

5 Science: Computational Cosmology for Fundamental Physics
Weak lensing:
- Directly probes the matter distribution.
- Unfortunately, we exist.
- Baryonic effects can be calibrated by simulations.
- The required computational effort is massive.
(Rudd et al. 2007)

6 Science: Computational Cosmology for Fundamental Physics
Baryon Acoustic Oscillations:
- Potentially the cleanest test for dark energy.
- Galaxies are biased.
- Baryonic effects can be calibrated by simulations.
- We are still very far from reliably predicting galaxy colors from simulations.
(Tegmark et al. 2006)

7 Science: Computational Cosmology for Fundamental Physics
Galaxy Clusters:
- Sensitive probes of dark energy (they live on the exponential tail of the mass function).
- Complex physical systems.
- Baryonic effects can be calibrated by simulations.
- It is not clear how much physics we need to include to model clusters with high precision (even ignoring the central 20%).
(Nagai et al. 2007)

8 Science: Computational Cosmology for Fundamental Astrophysics
Galaxy Formation:
- Allows direct comparison with most observations.
- Important physics (such as star formation) will not be understood any time soon.
- Simple scaling laws exist in the ISM, but models based on those laws produce wrong galaxies.

9 Science: Computational Cosmology for Fundamental Astrophysics
Reionization & 1st stars:
- The physics is reasonably well understood.
- Radiative transfer is a hard computational problem, though substantial progress has been achieved.
- A very large dynamic range is required (1 kpc to 100 Mpc).

10 Science: Computational Cosmology for Fundamental Astrophysics
Supermassive Black Holes:
- Probably the most important unsolved astrophysical problem today.
- Very large dynamic range and complex physics; AMR can reach it now.
- We may not have even the most rudimentary, conceptual understanding of the physics of AGN.

11 Methods and Tools
- Dark Matter: similar to plasma simulations; particle methods (N-body).
- Gas Dynamics: mostly Lagrangian methods; little use of engineering expertise.
- Radiative Transfer: a 6D problem (see the transfer equation below); useful approximations exist, but a full computational solution is still lacking.
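
For reference, the six dimensions of the radiative transfer problem are those of the specific intensity: three spatial coordinates, two propagation angles, and frequency (plus time). A standard form of the transfer equation, with emissivity j_nu and absorption coefficient alpha_nu, is:

```latex
\frac{1}{c}\,\frac{\partial I_\nu}{\partial t} + \hat{n}\cdot\nabla I_\nu
  = j_\nu - \alpha_\nu I_\nu ,
\qquad I_\nu = I_\nu(\mathbf{x}, \hat{n}, \nu; t)
```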

12 Cosmological N-body
The problem of cosmological N-body simulations is essentially solved:
- PM = particle-mesh (a minimal sketch follows after this list)
- P3M = particle-particle particle-mesh
- Tree
- Hybrid codes: TreePM, Adaptive P3M
- Adaptive Mesh Refinement (AMR)
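
As a minimal illustration of the particle-mesh idea (not the algorithm of any particular production code), the sketch below deposits particles onto a grid with nearest-grid-point weights, solves the Poisson equation by FFT in a periodic unit box, and differences the potential. The grid size, particle number, equal particle masses, and G = 1 units are illustrative assumptions.

```python
import numpy as np

def pm_accelerations(pos, ngrid=64, box=1.0):
    """One particle-mesh force evaluation: NGP deposit, FFT Poisson solve, gradient."""
    cell = box / ngrid
    idx = np.floor(pos / cell).astype(int) % ngrid          # nearest-grid-point cell indices
    rho = np.zeros((ngrid, ngrid, ngrid))
    np.add.at(rho, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)  # deposit unit masses
    rho /= cell**3
    rho -= rho.mean()                                        # only the overdensity sources gravity

    # Solve nabla^2 phi = 4 pi G rho in Fourier space (units with G = 1).
    k = 2.0 * np.pi * np.fft.fftfreq(ngrid, d=cell)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                                        # avoid dividing by zero for the mean mode
    phi_k = -4.0 * np.pi * np.fft.fftn(rho) / k2
    phi_k[0, 0, 0] = 0.0
    phi = np.fft.ifftn(phi_k).real

    # Acceleration g = -grad(phi), central differences on the periodic grid.
    g = np.stack(
        [-(np.roll(phi, -1, axis=a) - np.roll(phi, 1, axis=a)) / (2.0 * cell)
         for a in range(3)],
        axis=-1,
    )
    return g[idx[:, 0], idx[:, 1], idx[:, 2]]                # read accelerations back at particle cells

# Example: accelerations for 10,000 random particles in a unit periodic box.
pos = np.random.rand(10_000, 3)
acc = pm_accelerations(pos)
```

Production PM codes use cloud-in-cell (or higher-order) interpolation for both the deposit and the force read-back, which keeps the force smooth across cell boundaries; P3M and TreePM then add a short-range correction below the grid scale.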

13 A 10-Billion-Particle Simulation

14 Cosmological Gas Dynamics
Little use of engineering expertise:
- very high resolution required
- complex physics
- the gas is self-gravitating
- no solid boundaries

15 20th Century (1st Generation Codes)
- Eulerian schemes: R ~ 1,000, N ~ 10^9, const
- "Simple Path to Hell" (SPH, properly Smoothed Particle Hydrodynamics): R ~ 30,000, N ~ 10^8, const
- Arbitrary Lagrangian-Eulerian (ALE): R ~ 10,000, N ~ 10^7, const

16 21st Century (2nd Generation Codes)
Adaptive Mesh Refinement (AMR): R > 100,000, N ~ 10^9, variable
- can follow fragmentation
- non-uniform initial conditions
- spatially variable resolution (a toy refinement test is sketched after this list)
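
To make the cell-by-cell refinement idea concrete, here is a toy refinement test. The Cell fields and every threshold are hypothetical illustrations, not the criteria used by ART or any other production AMR code.

```python
from dataclasses import dataclass

@dataclass
class Cell:
    level: int              # current refinement level of this cell
    n_particles: int        # dark-matter particles currently inside the cell
    gas_overdensity: float  # gas density in units of the cosmic mean

def should_refine(cell, max_level=10, particle_threshold=4, overdensity_threshold=8.0):
    """Refine a cell when it becomes crowded or dense enough to need more resolution."""
    if cell.level >= max_level:        # cap the total dynamic range
        return False
    crowded = cell.n_particles > particle_threshold
    dense = cell.gas_overdensity > overdensity_threshold
    return crowded or dense

# Example: a level-5 cell holding 9 particles would be flagged for refinement.
print(should_refine(Cell(level=5, n_particles=9, gas_overdensity=2.0)))  # True
```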

17 Our Tools
We (FNAL+KICP) have several cosmological codes. The main one is the Adaptive Refinement Tree (ART) code:
- an implementation of the Adaptive Mesh Refinement method
- refines on a cell-by-cell basis
- fully 4D adaptive (in space and time)
- includes dark matter dynamics, gas dynamics and chemistry, radiative transfer, star formation, etc.

18 Computational Cosmology in the World
Europe: research is mostly done by large groups:
- Virgo Consortium [>30 people] (Germany + UK)
- Project Horizon [>20 people] (France)
USA/Canada: a large number of small groups (10-12 people):
- Chicago (FNAL+KICP), UWashington, Harvard, CITA, SLAC, LANL, Princeton, UCLA, UIUC, Berkeley, UMass…

19 Computing Needs
- Modern state-of-the-art cosmological simulations require in excess of 100,000 CPU-hours (roughly 140 CPU-months; see the conversion below). The biggest ones use > 1,000,000 CPU-hours.
- A single simulation produces a terabyte-scale data set; even rudimentary analysis requires intermediate-level computing capability.
- The days of "here is a workstation to analyze your simulation" are over.
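
The CPU-month figure is just a unit conversion; assuming a 30-day month:

```latex
\frac{100{,}000\ \text{CPU-hours}}{24 \times 30\ \text{hours per month}} \approx 139\ \text{CPU-months}
```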

20 Solution (a la LQCD)
- National consortium/collaboration (the level of integration may vary)
- Powerful local, intermediate-level resources (~10,000 CPUs, which can be spread over a few separate sites: FNAL, SLAC, ANL, LANL)
- A united approach to funding agencies and national supercomputer centers

21 Our Plan
- Build up local resources on par with other US groups
- Create a Chicago-wide collaboration (FNAL, KICP, ANL)
- Charm Mont, the PAC, and the Director
- Overtake other US groups in local computing resources (reach 1,000-2,000 cores; there are good reasons why FNAL is the best place for that)
- Move towards a national consortium/collaboration: join with DOE labs (LANL, LBL, SLAC) and invite universities to join
- Get lots of money along the way… (people, software support)

