Computational Astrophysics: Research to Teaching and Beyond
Adam Frank, University of Rochester

A Cast of Many: AstroBEAR MHD / Clumpy Flows
- Sorin Mitran (UNC)
- Andrew Cunningham (UR, UC Berkeley, LLNL)
- Alexei Poludnenko (UR, NRL)
- Kris Yirak (UR grad student)
- Jonathan Carroll (UR grad student)
Thanks to: NSF, DOE, NASA, UR Laboratory for Laser Energetics

Simulations: Numerical Experiments with Differential Equations and PDEs
[Figure: space-time diagram showing where initial and boundary conditions are imposed]
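As a minimal illustration (not from the talk; all names and parameters here are illustrative), the sketch below runs such a "numerical experiment": the 1-D linear advection equation advanced from an initial condition with periodic boundary conditions.

```python
# Minimal "numerical experiment" sketch: solve u_t + a u_x = 0 with a
# first-order upwind scheme, periodic boundaries, and a Gaussian initial
# condition. Illustrative only; not AstroBEAR code.
import numpy as np

nx, a, cfl = 200, 1.0, 0.8
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dx = x[1] - x[0]
dt = cfl * dx / a                      # time step from the CFL condition

u = np.exp(-200.0 * (x - 0.5) ** 2)    # initial condition at t = 0

t = 0.0
while t < 0.5:
    # upwind difference; np.roll enforces the periodic boundary condition
    u -= a * dt / dx * (u - np.roll(u, 1))
    t += dt

print(u.max())  # the pulse has advected (and diffused numerically)
```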

The 3rd Age of Simulations
- 1st Age: 1-D calculations on early computers (CDC 7600, etc.)
- 2nd Age: moderate-resolution 2-D calculations (supercomputers: Cray, etc.)
- 3rd Age: high-resolution, multi-dimensional, multi-physics calculations (grid computing, massively parallel clusters)
We are talking about "virtual reality" on a scientific level. Simulation data sets are now as "rich" as real data sets: petabytes and petaflops (a petabyte is one quadrillion bytes). This is not just simulation but "cyberscience": integrating information technology into every level of scientific practice.

History: Research Simulation of Shocked Clumps
Astrophysical environments are very heterogeneous: winds, blast waves, and the ISM are all "clumpy." The interaction of a single clump with a passing wind is a well-studied problem (Woodward 1976). The critical parameter is t_cc, the cloud-crushing time.
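For reference, the standard definition of the cloud-crushing time (Klein, McKee & Colella 1994) is:

```latex
t_{\mathrm{cc}} \;=\; \frac{\chi^{1/2}\, R_c}{v_s},
\qquad
\chi \equiv \frac{n_c}{n_w},
```

where R_c is the clump radius, v_s the shock speed in the ambient medium, and χ the clump-to-ambient density contrast.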

Frontiers of Algorithm Development
- Adaptive Mesh Refinement (AMR): the code automatically places grid cells where they are needed (see the sketch below).
- Multi-physics: the code simulates many physical processes simultaneously: magnetic fields, gravity, radiation transport, chemistry, ionization dynamics.
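To make the AMR idea concrete, here is a minimal sketch of one common refinement criterion (gradient-based flagging); AstroBEAR's actual criteria may differ, and the function and threshold here are illustrative.

```python
# Sketch of gradient-based refinement flagging: mark cells near large
# relative density jumps as candidates for a finer grid level.
import numpy as np

def flag_for_refinement(rho, threshold=0.1):
    """Flag cells whose relative density jump to a neighbor exceeds threshold."""
    jump = np.abs(np.diff(rho)) / np.minimum(rho[:-1], rho[1:])
    flags = np.zeros(rho.shape, dtype=bool)
    flags[:-1] |= jump > threshold   # cell to the left of a large jump
    flags[1:]  |= jump > threshold   # cell to the right of a large jump
    return flags

rho = np.ones(100)
rho[40:60] = 10.0                    # a "clump" embedded in ambient gas
print(np.where(flag_for_refinement(rho))[0])  # cells at the clump edges
```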

UR Computational Group: AstroBEAR
A multi-year, multi-grad-student effort. Begun in 2002; now on its 4th generation of students.

Simulation in the Age of Cyberscience
How do you maintain a 10^6-line code base across N years, M revisions, and K student generations? Static documentation doesn't work. Our solution: a wiki with self-compiling documentation.

Adaptive Mesh Refinement (AMR) [figure: R. Deiterding]
Different AMR methods:
- grid-based regridding
- cell-based regridding
The University of Rochester code AstroBEAR is grid-based: a hierarchy of grids. This requires prolongation/restriction operators to carry data from one grid level to another:
- prolongation (coarse to fine)
- restriction (fine to coarse)
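A minimal sketch of the simplest conservative prolongation/restriction pair, assuming a 1-D grid with refinement ratio 2 (AstroBEAR's actual operators are higher order and multidimensional):

```python
# Illustrative conservative prolongation/restriction in 1-D, ratio 2.
import numpy as np

def prolong(coarse):
    """Coarse -> fine: piecewise-constant injection (conservative)."""
    return np.repeat(coarse, 2)

def restrict(fine):
    """Fine -> coarse: average each pair of fine cells (conservative)."""
    return 0.5 * (fine[0::2] + fine[1::2])

u_coarse = np.array([1.0, 4.0, 2.0])
u_fine = prolong(u_coarse)                       # [1, 1, 4, 4, 2, 2]
assert np.allclose(restrict(u_fine), u_coarse)   # round trip preserves data
```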

[Figure: Mach 10 radiatively cooled bullet, showing AMR grid generation in the system]
Pre-planetary nebula CRL 618: an "explosion" from a dying solar-type star.

AMR MHD: div B = 0
- Hydro: need conservative prolongation/restriction operators.
- MHD: must also maintain the solenoidal condition; need divergence-free operators on a "staggered mesh" (Cunningham et al. 2007).
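Schematically, the constraint and its discrete counterpart on a 2-D staggered mesh with face-centered field components read:

```latex
\nabla \cdot \mathbf{B} = 0
\quad\longrightarrow\quad
\frac{B^x_{i+\frac12,j} - B^x_{i-\frac12,j}}{\Delta x}
+ \frac{B^y_{i,j+\frac12} - B^y_{i,j-\frac12}}{\Delta y} = 0 .
```

Prolongation and restriction of B must preserve this discrete identity to machine precision, which is why ordinary conservative operators are not sufficient for MHD.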

Results I: MHD Shocked Clump, "Standard" Test
Parameters: M = 10, χ = n_c/n_w, β = 4.
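Assuming the conventional definitions (the slide lists the symbols without them), the parameters quoted here and on the next slide are:

```latex
M = \frac{v_s}{c_w}, \qquad
\chi = \frac{n_c}{n_w}, \qquad
\beta = \frac{8\pi p}{B^2},
```

i.e., the shock Mach number, the clump-to-wind density contrast, and the plasma beta (thermal-to-magnetic pressure ratio).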

Many Clumps: Radiative MHD Shocks in Heterogeneous Media (Cunningham et al. 2007)
Parameters: M = 10, χ = n_c/n_w, β = 10.

How AMR Changes the Game: Resolution and Convergence
Convergence is formally defined as approach to a known analytic solution, but for complex nonlinear problems such solutions rarely exist. Instead, define convergence as the change relative to the highest-resolution simulation possible (a sketch of this test follows below). For the adiabatic shocked clump, convergence appears at N = 120 cells per clump radius R_c.
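A minimal sketch of this self-convergence test (the function names and grids are illustrative, and the fine grid is assumed to be an integer multiple of the coarse one):

```python
# Self-convergence: measure the change in a solution relative to the
# highest-resolution run available, rather than an analytic solution.
import numpy as np

def restrict_to(u, n):
    """Average a fine 1-D solution down onto an n-cell grid for comparison."""
    r = len(u) // n
    return u.reshape(n, r).mean(axis=1)

def self_convergence_error(u_coarse, u_reference):
    """Mean L1 difference between a run and the reference, on the coarse grid."""
    u_ref = restrict_to(u_reference, len(u_coarse))
    return np.abs(u_coarse - u_ref).mean()

# usage sketch: the error should stop shrinking once runs are "converged"
# err = self_convergence_error(run_lowres, run_highres)
```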

Resolution and Convergence: Radiative Clumps
Radiative cooling allows post-shock flows to collapse, but how far? AMR lets us run the highest-resolution radiative clump simulations to date: N ≈ 1500 cells/R_c.

Resolution and Convergence: Radiative Clumps
Radiative cooling allows post-shock flows to collapse, but how far? Increases in resolution show qualitatively new behaviors once the cell size Δx drops below a critical length scale.
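One plausible reading of this criterion, assuming the critical scale is the post-shock cooling length:

```latex
L_{\mathrm{cool}} \sim v_{\mathrm{ps}}\, t_{\mathrm{cool}},
\qquad
\Delta x \lesssim L_{\mathrm{cool}},
```

where v_ps is the post-shock flow speed and t_cool the radiative cooling time; only when the grid resolves L_cool can the collapse of the cooling layer be captured.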

Resolution and Convergence: Radiative Clumps
What measures and metrics can we trust at a given resolution? [Figure: vorticities (K. Yirak)]

Radiative MHD Clumps with Self-consistent Fields

How Many MHD-AMR Codes Are There? Not many:
- AstroBEAR
- Flash
- ENZO
- Orion
- AMRVAC
- Athena

Computation in Teaching: My Adventure in the E-Ed Biz
- 2000: NSF CAREER Award; AstroFlow, a simulation outreach tool
- A planetarium asks to buy a copy (?!?)
- Created Truth-N-Beauty LLC with UR, an e-education digital media company
- Produced simulation-based modules for McGraw-Hill, Prentice Hall, etc.

Second Avenue Software: The E-Biz Goes It Alone
In 2006, Truth-N-Beauty became Second Avenue Software.

What We Built
- Celestial Sphere
- Phases of the Moon
- Seasons
- Solar System Builder
- Planetary Atmospheres

Going Beyond Teaching
- Outreach: use partners in new media to create interactives for websites (Discover, Astronomy, Scientific American).
- "Serious games": twitch games with a science theme.

Going Beyond Teaching

Star Formation

Conclusions
Computation:
- Advanced AMR/multi-physics codes allow a new era of simulation.
- "Weather vs. climate": what to do with petaflops and petabytes?
Teaching by simulation:
- New opportunities if done well (graphics, pedagogy).
- New opportunities outside the classroom.