SciDAC 2005, June 29, San Francisco
Terascale Supernova Initiative: Discovering New Dynamics of Core-Collapse Supernova Shock Waves
John M. Blondin, NC State University
Scientific Discovery through Advanced Computing


A Long History of Computational Physics…
1966: Colgate and White, neutrino-driven prompt explosion
1985: Bethe and Wilson, shock reheating via neutrino energy deposition
1992: Herant, Benz, and Colgate, convective instability above the neutrinosphere

The Modern Picture…

It all starts with core collapse…

The first generation of 2D supernova models hinted at a low-order asymmetry in the shock wave at late times (hundreds of milliseconds after bounce). Burrows, Hayes & Fryxell 1995

Dynamics of the Supernova Shock Wave
When, where, and how is spherical symmetry broken?

Modeling the post-bounce shock

SN Code Verification
This post-bounce model provides an opportunity to verify supernova codes against the results of a linear perturbation analysis. Houck and Chevalier 1992; Blondin and Mezzacappa 2005
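One concrete way to do such a comparison, sketched below purely for illustration, is to fit the exponential growth rate of a seeded perturbation and check it against the eigenvalue from linear theory. The amplitude series and the target rate here are placeholders, not numbers from Houck & Chevalier (1992) or from the TSI codes.

    # Hypothetical verification sketch: measure the growth rate of a seeded
    # shock perturbation and compare it with the linear-theory prediction.
    # The synthetic amplitude series stands in for values extracted from a
    # simulation; omega_linear is a placeholder, not a published eigenvalue.
    import numpy as np

    omega_linear = 0.02                      # growth rate from linear analysis (placeholder)
    t = np.linspace(0.0, 200.0, 50)          # time after bounce, code units
    amp = 1e-3 * np.exp(omega_linear * t)    # stand-in for measured mode amplitude |a_l(t)|

    # In the linear regime |a_l(t)| ~ exp(omega * t), so log|a_l| is linear in t.
    omega_fit = np.polyfit(t, np.log(amp), 1)[0]
    print(f"fit: {omega_fit:.4f}  linear theory: {omega_linear:.4f}  "
          f"relative error: {abs(omega_fit - omega_linear) / omega_linear:.2%}")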

Spherical Accretion Shock Instability

SASI
Standing pressure waves within the cavity of a spherical accretion shock are amplified with each oscillation. The shock becomes significantly distorted after only a few periods. In core-collapse supernovae, SASI will operate in conjunction (competition?) with neutrino-driven convection.
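A standard diagnostic for this kind of distortion is to expand the shock radius R(θ) in Legendre polynomials and track the low-order mode amplitudes. The sketch below uses a toy axisymmetric shock surface as an assumption; in practice one would extract R(θ) from each simulation output and plot the l = 1, 2 amplitudes versus time to see the oscillatory growth.

    # Hypothetical diagnostic sketch: project an axisymmetric shock surface
    # R(theta) onto Legendre polynomials P_l(cos theta) to get the l = 0, 1, 2
    # mode amplitudes whose growth characterizes the SASI. The shock surface
    # here is a toy stand-in, not simulation data.
    import numpy as np
    from numpy.polynomial import legendre

    theta = np.linspace(0.0, np.pi, 257)     # polar angle samples
    R = 150.0 + 5.0 * np.cos(theta)          # toy shock surface: 150 km + 5 km l=1 kick

    mu = np.cos(theta)
    dtheta = theta[1] - theta[0]
    for l in range(3):
        Pl = legendre.Legendre.basis(l)(mu)
        # Orthogonality: a_l = (2l+1)/2 * integral R(theta) P_l(mu) sin(theta) dtheta
        integrand = R * Pl * np.sin(theta)
        integral = np.sum((integrand[:-1] + integrand[1:]) / 2.0) * dtheta
        a_l = (2 * l + 1) / 2.0 * integral
        print(f"l={l}: a_l = {a_l:.3f} km")   # recovers a_0 ~ 150, a_1 ~ 5, a_2 ~ 0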


Must move to 3D!
This initial SASI discovery with axisymmetric 2D simulations pointed to the obvious need for models in full 3D. To better understand the challenges of 3D, let us first look at the process of discovery for the initial 2D models.

Hurdles for Large-Scale 3D
Simulation code: not a problem.
Floating point: thank you, DOE.
Data output: it works.
Data transport: does not work.
Visualization and analysis: I can't see!

First Results: SASI Exists in 3D
3D Cartesian grid, 100 million zones, hundreds of processors, hundreds of GB in a full run.
With the data stuck on the West Coast, this was science in the dark!

Science Begins with Data
Scientific discovery is done with interactive access to data:
Must have interactive access on a large-memory computer for analysis and visualization.
Must have high bandwidth in accessing the data.
Must have sufficient storage to hold the data for weeks to months.
A billion-cell simulation on the Cray X1 takes 30 hours and generates 4 terabytes, which must move through a shared file system to the visualization platform.
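The quoted 4 TB is consistent with simple accounting, as the sketch below shows. Only the cell count and the total come from the slide; the per-cell variable count, precision, and snapshot cadence are assumptions made for illustration.

    # Back-of-the-envelope check of the quoted 4 TB for a billion-cell run.
    # nvars, precision, and snapshot count are illustrative assumptions.
    cells = 1e9
    nvars = 5                   # e.g., density, pressure, three velocity components
    bytes_per_val = 4           # single precision
    snapshot_tb = cells * nvars * bytes_per_val / 1e12   # ~0.02 TB per dump
    snapshots = 200             # assumed output cadence over the 30-hour run
    print(f"{snapshot_tb * 1e3:.0f} GB per snapshot -> "
          f"{snapshot_tb * snapshots:.1f} TB for the full run")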

Interactive Visualization of TB Datasets
A commodity Linux cluster provides all the 'must haves.' Data is sliced into slabs and stored on local disks on the cluster nodes. EnSight Gold provides an easy visualization solution, including remote client-server operation and collaboration.
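A minimal sketch of the slab idea follows, with hypothetical array shapes, node count, and file names (the actual layout used with EnSight Gold is not described on the slide): split the domain along one axis so that each node's reader touches only its local disk.

    # Minimal sketch of slicing a 3D dataset into z-slabs, one per cluster
    # node, so each visualization process reads only node-local data.
    # Shapes, node count, and file names are illustrative assumptions.
    import numpy as np

    nx = ny = nz = 256
    data = np.random.rand(nz, nx, ny).astype(np.float32)  # stand-in for one variable

    nodes = 8
    for rank, slab in enumerate(np.array_split(data, nodes, axis=0)):
        # In practice each slab would land on the local disk of node `rank`;
        # here we simply write one file per rank in the working directory.
        np.save(f"density_slab{rank:02d}.npy", slab)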


We have jumped the hurdle, but there is much more to be gained. Current research in scientific visualization is providing glimpses of very powerful new techniques. The next step is to get these tools into the hands of application scientists so they can explore their data.

LoRS (Logistical Runtime System) tools / IBP (Internet Backplane Protocol) depots

Data Flow Continues to Evolve
Run the simulation on hundreds to thousands of CPUs on the supercomputer (flops), move the data across a logistical network, and run parallel analysis and visualization on the distributed data on the analysis cluster (interactive).
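In outline, that flow might be mocked up as below. Every function is a placeholder for a stage the slide only names (simulation, logistical staging, interactive analysis); none of this is real TSI software.

    # Schematic of the evolving data flow: simulate on the supercomputer,
    # stage data through a logistical network, analyze in parallel on a
    # cluster. All three stages are placeholders for illustration only.
    from concurrent.futures import ProcessPoolExecutor

    def simulate_step(step):        # placeholder for the supercomputer stage
        return f"slab data for step {step}"

    def stage_to_depot(data):       # placeholder for the logistical-network transfer
        return data                 # e.g., upload to an IBP depot

    def analyze(data):              # placeholder for interactive cluster analysis
        return len(data)

    if __name__ == "__main__":
        staged = [stage_to_depot(simulate_step(s)) for s in range(8)]
        with ProcessPoolExecutor() as pool:   # parallel analysis of distributed slabs
            print(list(pool.map(analyze, staged)))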


Forecast looks challenging
Current 'full physics' models in 2 spatial dimensions (256²) produce 70 GB per run. Current 'limited physics' models in 3 spatial dimensions ( ) produce 4 TB per run. We know this problem must be attacked in 3D with accurate nuclear physics and neutrino transport. With advances in code development and computing platforms, we are looking at PB datasets in the near future!
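A rough scaling argument shows how quickly petabytes arrive. Only the 4 TB baseline is taken from the talk; the multiplication factors below are illustrative assumptions.

    # Rough scaling sketch from the quoted 4 TB 'limited physics' 3D run to a
    # petabyte-class dataset. All factors are illustrative assumptions.
    base_tb = 4.0               # current 3D 'limited physics' run (from the talk)
    resolution_factor = 8       # doubling the grid in each of 3 dimensions
    physics_factor = 10         # extra fields: nuclear composition, neutrino moments
    duration_factor = 3         # longer post-bounce evolution
    projected = base_tb * resolution_factor * physics_factor * duration_factor
    print(f"projected dataset: {projected / 1e3:.2f} PB")   # ~1 PB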