Fig. 1. A wiring diagram for the SCEC computational pathways of earthquake system science (left) and large-scale calculations exemplifying each of the pathways (below). (0) SCEC Broadband Platform simulations used to develop ground motion prediction equations (GMPEs) for the Eastern U.S. (1) Uniform California Earthquake Rupture Forecast, UCERF3, run on TACC Stampede. (2) CyberShake ground motion prediction model 14.2, run on NCSA Blue Waters. (3) Dynamic rupture model including fractal fault roughness, run on XSEDE Kraken. (4) 3D velocity model for the Southern California crust, CVM-S4.26, run on ALCF Mira. Model components include dynamic and kinematic fault rupture (DFR and KFR), anelastic wave propagation (AWP), nonlinear site response (NSR), and full-3D tomography (F3DT). (Panel labels: UCERF3; full-3D tomographic model CVM-S4.26 of S. California; CyberShake 14.2 seismic hazard model for the LA region, SA-3s, 2% PoE in 50 years; dynamic rupture model of fractal roughness on the SAF, depth = 6 km; SCEC Broadband Platform NGA-East GMPE development.)
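To make the idea of a "wiring diagram" concrete, the sketch below represents each computational pathway as an ordered chain of the model components named in the caption. This is purely illustrative: the component abbreviations come from the caption, but the exact composition assigned to each pathway here is an assumption for the sake of the example, not a statement of the SCEC platform design.

```python
# Illustrative sketch only: SCEC computational pathways as ordered chains of
# model components. Abbreviations are from the Fig. 1 caption; the wiring
# assigned to each pathway below is an assumption, not SCEC's actual design.
PATHWAYS = {
    0: ["KFR", "AWP", "NSR"],    # Broadband Platform ground-motion simulation
    1: ["UCERF"],                # earthquake rupture forecasting
    2: ["UCERF", "KFR", "AWP"],  # CyberShake physics-based hazard model
    3: ["DFR", "AWP"],           # dynamic rupture with fault roughness
    4: ["AWP", "F3DT"],          # full-3D tomographic velocity-model inversion
}

def describe(pathway_id: int) -> str:
    """Return a human-readable wiring string for one pathway."""
    return " -> ".join(PATHWAYS[pathway_id])

if __name__ == "__main__":
    for pid in sorted(PATHWAYS):
        print(f"Pathway {pid}: {describe(pid)}")
```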

Figure 2. Comparison of two seismic hazard models for the Los Angeles region from CyberShake Study 14.2, completed in early March 2014. The left panel (BBP-1D) is based on an average 1D velocity model, and the right panel is based on the F3DT-refined structure CVM-S4.26. The 3D model shows important amplitude differences from the 1D model, several of which are annotated on the right panel: (1) lower near-fault intensities due to 3D scattering; (2) much higher intensities in near-fault basins due to directivity-basin coupling; (3) higher intensities in the Los Angeles basins; and (4) lower intensities in hard-rock areas. The maps are computed for 3-s response spectra at an exceedance probability of 2% in 50 years. Both models include all fault ruptures in the Uniform California Earthquake Rupture Forecast, version 2 (UCERF2), and each comprises about 240 million seismograms.
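The "2% in 50 years" level in the caption follows the standard probabilistic seismic hazard convention, which a short worked example makes concrete. Assuming Poissonian earthquake occurrence (a standard PSHA assumption, not something stated in the caption), the exposure-time exceedance probability converts to an annual rate and return period as follows:

```python
import math

# Convert a design exceedance probability over an exposure time into an
# equivalent annual rate and return period, assuming Poissonian occurrence.
# This is the standard PSHA relation, not code from the CyberShake platform.
def annual_rate(poe: float, years: float) -> float:
    """Annual exceedance rate lam such that 1 - exp(-lam * years) = poe."""
    return -math.log(1.0 - poe) / years

lam = annual_rate(0.02, 50.0)
print(f"annual rate   = {lam:.3e} /yr")          # ~4.04e-4 per year
print(f"return period = {1.0 / lam:.0f} years")  # ~2475 years
```

The resulting return period of roughly 2,475 years is the familiar hazard level used in U.S. building-code ground motion maps.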

Figure 3. Left: scaling of SCEC HPC applications; weak scaling of AWP-ODC on XK7, XE6, and HPS250 systems. Right: strong scaling of Hercules (computing wall-clock time) on Kraken, Blue Waters, Mira, and Titan. Benchmarks are based on a variety of problem sizes. AWP-ODC sustained 2.3 PFLOPS on the XK7 and 653 TFLOPS on the XE6. Hercules was measured on a 2.8 Hz problem with 1.5 billion finite elements for the complete scaling experiments (continuous lines) and on problems of other sizes for the isolated data points (blue dots), as indicated in the figure. Hercules core counts correspond to CPU cores, but the runs on Titan also used NVIDIA GPU accelerators.
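For readers unfamiliar with the benchmark terminology, here is a minimal sketch of how the strong-scaling numbers in the right panel are derived from wall-clock times: the problem size stays fixed while the core count grows. The timing values below are invented placeholders, not measured Hercules data.

```python
# A minimal sketch (not SCEC code) of strong-scaling speedup and parallel
# efficiency computed from wall-clock benchmark times for a FIXED problem size.
def strong_scaling(times: dict[int, float]) -> None:
    """times maps core count -> wall-clock seconds for one fixed problem."""
    base_cores = min(times)
    base_time = times[base_cores]
    for cores in sorted(times):
        speedup = base_time / times[cores]
        ideal = cores / base_cores
        efficiency = speedup / ideal
        print(f"{cores:>7} cores: speedup {speedup:6.2f}x "
              f"(ideal {ideal:6.2f}x), efficiency {efficiency:5.1%}")

# Hypothetical timings: doubling cores should ideally halve the runtime.
strong_scaling({8192: 1000.0, 16384: 520.0, 32768: 275.0})
```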

Figure 4: CyberShake workflow. Circles indicate computational modules and rectangles indicate files and databases. We have automated these processing stages using the Pegasus-WMS software to ensure that processes with data dependencies run in the required order. Workflow tools also increase the robustness of our calculations by providing error detection and restart capabilities, enabling us to resume a partially completed workflow from an intermediate point in the processing rather than from the beginning.
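As a toy illustration of the two workflow properties described above, dependency ordering and restart from an intermediate point, here is a minimal sketch. It is not the Pegasus-WMS API, and the stage names are hypothetical stand-ins for CyberShake modules.

```python
import os

# Toy sketch of (1) dependency-ordered execution and (2) restart from an
# intermediate point via completion markers. NOT the Pegasus-WMS API; the
# task names are hypothetical stand-ins for CyberShake processing stages.
TASKS = {
    "mesh":       [],             # velocity mesh generation
    "sgt":        ["mesh"],       # strain Green tensor simulation
    "seismogram": ["sgt"],        # synthetic seismogram synthesis
    "hazard":     ["seismogram"], # hazard-curve computation
}

def run(task: str) -> None:
    marker = f"{task}.done"
    if os.path.exists(marker):    # restart capability: skip finished stages
        print(f"skip {task} (already done)")
        return
    for dep in TASKS[task]:       # data dependencies always run first
        run(dep)
    print(f"run  {task}")
    open(marker, "w").close()     # record completion for future restarts

run("hazard")
```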

Figure 3. (Left) AWP-ODC-GPU weak scaling and sustained performance, using AWP in single precision. The solid (dashed) black line is the measured (ideal) speedup on Titan; round, triangle, and cross points are Flop/s performance on Titan, Blue Waters, and Keeneland, respectively. A perfect linear speedup is observed between 16 and 8,192 nodes, and a sustained 2.3 Pflop/s was recorded on 16,384 Titan nodes. (Right) SORD performance on OLCF.
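Weak scaling, the metric in the left panel, differs from the strong scaling shown for Hercules above: the problem size grows in proportion to the node count, so ideal behavior is constant wall-clock time rather than decreasing time. A minimal sketch, with invented placeholder timings:

```python
# A minimal sketch (not SCEC code) of weak-scaling efficiency: problem size
# per node is fixed, so ideal behavior is constant wall-clock time and an
# efficiency of 1.0. The timing values below are invented placeholders.
def weak_scaling(times: dict[int, float]) -> None:
    """times maps node count -> wall-clock seconds, work per node fixed."""
    base_nodes = min(times)
    base_time = times[base_nodes]
    for nodes in sorted(times):
        efficiency = base_time / times[nodes]  # 1.0 = perfect weak scaling
        print(f"{nodes:>6} nodes: {times[nodes]:7.1f} s, "
              f"efficiency {efficiency:5.1%}")

weak_scaling({16: 100.0, 1024: 102.0, 8192: 106.0, 16384: 111.0})
```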

Figure 4: Hercules scalability curves based on measured performance on the ALCF Blue Gene/Q (Mira), NSF Track 1 (Blue Waters), and XSEDE Track 2 (Kraken) systems. Benchmarking on Mira for this allocation was completed only for the strong-scaling curve from 8K to 32K cores (using 32 processes per node). These initial results, though limited, indicate that Hercules will sustain the excellent scalability shown on the other machines, for which the computational readiness of the code is well established.