
1 Simulations of Large Earthquakes on the Southern San Andreas Fault. Amit Chourasia, Visualization Scientist, San Diego Supercomputer Center. Presented to: Latin American Journalists, July 11, 2007

2 Global Seismic Hazard Source: Global Seismic Hazard Assessment Program

3 Growth of Earthquake Risk. Expansion of urban centers in tectonically active areas is driving an exponential increase in earthquake risk. Growth of cities 2000-2015 (Source: National Geographic); increasing loss. Slide: Courtesy Kim Olsen

4 Risk Equation. Risk = Probable Loss (lives & dollars) = Hazard × Exposure × Fragility. Hazard: faulting, shaking, landsliding, liquefaction. Exposure: extent & density of built environment. Fragility: structural vulnerability. Slide: Courtesy Kim Olsen
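To make the structure of the risk equation concrete, here is a minimal Python sketch; the function name and the numbers in the example are illustrative assumptions, not figures from the slide.

```python
def expected_annual_loss(hazard, exposure_value, fragility):
    """Risk = Hazard x Exposure x Fragility (the conceptual form on the slide).

    hazard:          annual probability of damaging shaking at the site
    exposure_value:  replacement value of the built environment, in dollars
    fragility:       expected damage ratio given that shaking occurs
    """
    return hazard * exposure_value * fragility

# Purely illustrative numbers -- not taken from the slide
loss = expected_annual_loss(hazard=0.01, exposure_value=500e9, fragility=0.02)
print(f"Expected annual loss: ${loss / 1e6:.0f} million")   # $100 million
```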

5 Seismic Hazard Analysis. Definition: specification of the maximum intensity of shaking expected at a site during a fixed time interval. Example: national seismic hazard maps (http://geohazards.cr.usgs.gov/eq/). Intensity measure: peak ground acceleration (PGA). Interval: 50 years. Probability of exceedance: 2%. Slide: Courtesy Kim Olsen
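As a worked example of the hazard-map convention on this slide (2% probability of exceedance in 50 years), the following Python sketch converts that probability into an equivalent return period, assuming a Poisson (memoryless) occurrence model; the Poisson assumption is mine, not stated on the slide.

```python
import math

def return_period(p_exceed, window_years):
    """Return period T implied by a Poisson model:
    P(at least one exceedance in t years) = 1 - exp(-t / T)."""
    return -window_years / math.log(1.0 - p_exceed)

# National hazard-map convention from the slide: 2% in 50 years
T = return_period(0.02, 50.0)
print(f"Equivalent return period: about {T:.0f} years")   # ~2475 years
```

Under that model, the 2%-in-50-years level corresponds to a return period of roughly 2,475 years.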

6 The FEMA 366 Report, “HAZUS’99 Estimates of Annual Earthquake Losses for the United States”, September 2000. U.S. annualized earthquake loss (AEL) is about $4.4 billion/yr. For 25 states, AEL > $10 million/yr. 74% of the total is concentrated in California; 25% is in Los Angeles County alone. Slide: Courtesy Kim Olsen

7 Southern California: a Natural Laboratory for Understanding Seismic Hazard and Managing Risk • Tectonic diversity • Complex fault network • High seismic activity • Excellent geologic exposure • Rich data sources • Large urban population with densely built environment → high risk • Extensive research program coordinated by the Southern California Earthquake Center (SCEC) under NSF and USGS sponsorship. Slide: Courtesy Kim Olsen

8 1994 Northridge • When: 17 Jan 1994 • Where: San Fernando Valley • Damage: $20 billion • Deaths: 57 • Injured: >9,000. Slide: Courtesy Kim Olsen

9 Major Earthquakes on the San Andreas Fault, 1690-present: ~1690 M 7.7, 1857 M 7.9, 1906 M 7.8; recurrence intervals of 146 (+91/-60) yrs and 220 ± 13 yrs. Slip deficit on the southern SAF since the last event (~1690): 315 years × 16 mm/year = 5.04 m → Mw 7.7. Slide: Courtesy Kim Olsen
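The slip-deficit arithmetic above (315 years × 16 mm/yr = 5.04 m) can be carried one step further in a rough Python sketch; the rupture dimensions and rigidity below are illustrative assumptions, not values from the slide, chosen only to show how a ~5 m slip deficit maps to roughly the Mw 7.7 quoted there.

```python
import math

# Slip deficit from the slide: 315 years of ~16 mm/yr loading
slip_m = 315 * 16e-3                 # 5.04 m

# Assumed for illustration only (NOT on the slide): rupture geometry and rigidity
length_m = 200e3                     # hypothetical rupture length
width_m = 15e3                       # hypothetical seismogenic width
rigidity_pa = 30e9                   # typical crustal rigidity

moment = rigidity_pa * length_m * width_m * slip_m     # seismic moment, N*m
mw = (2.0 / 3.0) * math.log10(moment) - 6.07           # Hanks & Kanamori (1979)
print(f"Slip deficit {slip_m:.2f} m -> implied Mw ~ {mw:.1f}")   # ~Mw 7.7
```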

10 TeraShake Simulation Region • 600 km x 300 km x 80 km • Spatial resolution = 200 m • Mesh dimensions: 3000 x 1500 x 400 = 1.8 billion mesh points • Simulated time = 4 minutes • Number of time steps = 22,728 (0.011 s time step) • 60 s source duration from Denali • 3D crustal structure: subset of SCEC CVM3.0 • Near-surface S-wave velocity truncated at 500 m/s, up to 0.5 Hz
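The mesh and time figures on this slide follow directly from the domain size, the 200 m spacing, and the 0.011 s time step; a small Python sketch reproduces that bookkeeping (the memory estimate at the end assumes nine single-precision fields per mesh point, which is an assumption for illustration, not a number from the slide).

```python
# Domain and discretization from the slide
domain_km = (600, 300, 80)   # x, y, z extent in km
dx_m = 200                   # spatial resolution
dt_s = 0.011                 # time step
n_steps = 22_728             # number of time steps

nx, ny, nz = (int(d * 1000 / dx_m) for d in domain_km)
n_points = nx * ny * nz

print(f"Mesh: {nx} x {ny} x {nz} = {n_points / 1e9:.1f} billion points")
print(f"Simulated time: {n_steps * dt_s:.0f} s (~{n_steps * dt_s / 60:.1f} min)")

# Illustrative memory estimate: assume 9 single-precision fields per point
# (3 velocity + 6 stress components in a staggered-grid scheme) -- an
# assumption, not a figure from the slide.
bytes_per_point = 9 * 4
print(f"Rough field memory: {n_points * bytes_per_point / 2**30:.0f} GiB")
```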

11 Computational Challenge!

12 TeraShake-2 Data Flow [workflow diagram] Simulation runs: TS2.dyn.200m (30x, 256 procs, 12 hrs, TeraGrid IA-64/GPFS), TS2.dyn.100m (10x, 1024 procs, 35 hrs), and TS2.wav.200m (3x, 1024 procs, 35 hrs). Inputs and intermediate products: Okaya 200 m and 100 m media, initial 200 m and 100 m stress modifications, 100 m reformatting/transform/filtering, and the 200 m moment rate. Compute platforms: SDSC IA-64, NCSA IA-64, DataStar p690 and p655. Outputs (velocity magnitude & cumulative peak, displacement magnitude & cumulative peak, seismograms) move over the network (TeraGrid GPFS-WAN, NCSA SAN, SDSC SAN) to visualization and analysis, and are registered to the digital library on SRB, SAM-QFS, HPSS, and DataStar GPFS. Slide: Courtesy Yifeng Cui

13 Challenges for Porting and Optimization. Before optimization: • code handled up to 24 million mesh nodes • scaled up to 512 processors • ran on local clusters only • no checkpoint/restart capability • wave propagation simulation only • researcher's own code • mesh partition and solver in one • initialization not scalable, large memory need • I/O not scalable, not portable. After optimization: • code enhanced to handle 32 billion mesh nodes • excellent speed-up to 40,960 processors, 6.1 Tflop/s • ported to p655, BG/L, IA-64, XT3, Dell Linux, etc. • checkpoint/restart/checksum capability added • dynamic rupture + wave propagation integrated as one code • serves as SCEC Community Velocity Model • mesh partition separated from the solver (sketched below) • 10x speed-up of initialization, now scalable with reduced memory • MPI-I/O improved 10x, scaled up to 40k processors. Slide: Courtesy Yifeng Cui
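One change listed above is separating mesh partitioning from the solver. The following toy Python sketch is not the TeraShake code; it only illustrates the idea of precomputing which contiguous slab of the 3000 x 1500 x 400 mesh each rank owns, using a hypothetical processor grid.

```python
# Toy illustration of static 3-D domain decomposition (not the actual TeraShake solver).
def partition(n_points, n_blocks):
    """Split n_points into n_blocks contiguous, nearly equal index ranges."""
    base, extra = divmod(n_points, n_blocks)
    ranges, start = [], 0
    for b in range(n_blocks):
        size = base + (1 if b < extra else 0)
        ranges.append((start, start + size))
        start += size
    return ranges

mesh = (3000, 1500, 400)     # TeraShake mesh dimensions (from the slides)
proc_grid = (40, 32, 8)      # hypothetical processor grid: 40 * 32 * 8 = 10,240 ranks

x_parts = partition(mesh[0], proc_grid[0])
y_parts = partition(mesh[1], proc_grid[1])
z_parts = partition(mesh[2], proc_grid[2])

# Sub-block owned by the rank at position (0, 0, 0) of the processor grid:
print(x_parts[0], y_parts[0], z_parts[0])   # (0, 75) (0, 47) (0, 50)
```

Doing this split ahead of time lets each rank read only its own slab of the input volumes instead of having every process load and carve the full mesh.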

14 Data from TeraShake 1.1. Scalar surface (floats): 3000 x 1500 (i.e., 600 km x 300 km) = 17.2 MB per timestep; 20,000 timesteps; 3 variables (Vx, Vy, Vz velocity components); total scalar data = 1.1 TB. Scalar volume (floats): 3000 x 1500 x 400 (i.e., 600 x 300 x 80 km^3) = 7.2 GB per timestep; 2,000 timesteps; 3 variables (Vx, Vy, Vz); total volume data = 43.2 TB. Other data: checkpoints, etc. Grand total = 47.4 TB. Aggregate data: 160 TB (seven simulations)
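The sizes on this slide are just products of grid footprint, 4-byte floats, time-step counts, and the three velocity components; the short Python sketch below re-derives them. Note that the per-timestep 17.2 MB figure is a binary (MiB) value while the totals are decimal TB; the sketch follows the same mixed convention so the printed numbers match the slide.

```python
BYTES_PER_FLOAT = 4
N_COMPONENTS = 3                           # Vx, Vy, Vz velocity components

def snapshot_bytes(nx, ny, nz):
    """Bytes for one component of one time step on an nx x ny x nz grid."""
    return nx * ny * nz * BYTES_PER_FLOAT

# Surface output: 3000 x 1500 grid, 20,000 time steps
surf_step = snapshot_bytes(3000, 1500, 1)
surf_total = surf_step * N_COMPONENTS * 20_000

# Volume output: 3000 x 1500 x 400 grid, 2,000 time steps
vol_step = snapshot_bytes(3000, 1500, 400)
vol_total = vol_step * N_COMPONENTS * 2_000

print(f"Surface: {surf_step / 2**20:.1f} MiB/step, {surf_total / 1e12:.1f} TB total")
print(f"Volume:  {vol_step / 1e9:.1f} GB/step,  {vol_total / 1e12:.1f} TB total")
```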

15 Visualization Movie (1.5 MB)

16 Comparative Visualization Movie (11 MB)

17 Scenario Comparison: PGV (NW-SE rupture) vs. PGV (SE-NW1 rupture)

18 Topography Deformation Movie (11 MB)

19 Glimpse of Visualization Movie (65 MB)

20 Visualization • Over 130,000 images • Consumed 40,000 hrs of compute time • More than 50 unique animations

21 Does Viz work?

22

23 TeraShake Results. TeraShake-1: NW-directed rupture on the southern San Andreas Fault is highly efficient in exciting the L.A. Basin • maximum amplification from focusing associated with waveguide contraction • peak ground velocities exceeding 100 cm/s over much of the L.A. basin • uncertainties remain from the simplistic source description. TeraShake-2: extremely nonlinear dynamic rupture propagation • effect of 3D velocity structure: SE-NW and NW-SE dynamic models are NOT interchangeable • stress/strength tapering: a weak layer is required in the upper ~2 km to avoid super-shear rupture velocity • dynamic ground motions: the kinematic pattern persists in the dynamic results, but peak motions are 50-70% smaller than the kinematic values due to a less coherent rupture front. Slide: Courtesy Yifeng Cui

24 Summary • TeraShake demonstrated that optimization and enhancement of major application codes are essential for using large resources (number of CPUs, number of CPU-hours, TBs of data produced) • TeraShake showed that multiple types of resources are needed for large problems: initialization, run-time execution, analysis resources, and long-term collection management • The TeraShake code is now a community code used by the wider SCEC community • Significant TeraGrid allocations are required to advance seismic hazard analysis to a more accurate level • Next: PetaShake! Slide: Courtesy Yifeng Cui

25 References • Chourasia, A., Cutchin, S. M., Olsen, K. B., Minster, B., Day, S., Cui, Y., Maechling, P., Moore, R., and Jordan, T. (2007), "Visual insights into high-resolution earthquake simulations", IEEE Computer Graphics & Applications (Discovering the Unexpected), Sept-Oct 2007, in press. • Cui, Y., Moore, R., Olsen, K., Chourasia, A., Maechling, P., Minster, B., Day, S., Hu, Y., Zhu, J., Majumdar, A., and Jordan, T. (2007), "Enabling very-large scale earthquake simulations on parallel machines", Advancing Science and Society through Computation, International Conference on Computational Science 2007, Part I, Lecture Notes in Computer Science 4487, pp. 46-53, Springer. • Olsen, K. B., Day, S. M., Minster, J. B., Cui, Y., Chourasia, A., Faerman, M., Moore, R., Maechling, P., and Jordan, T. (2006), "Strong shaking in Los Angeles expected from southern San Andreas earthquake", Geophys. Res. Lett. 33, L07305, doi:10.1029/2005GL025472

26 TeraShake Collaboration: Large-Scale Earthquake Simulation on the Southern San Andreas. 33 researchers, 8 institutions • Southern California Earthquake Center • San Diego Supercomputer Center • Information Sciences Institute • Institute of Geophysics and Planetary Physics (UC) • University of Southern California • San Diego State University • University of California, Santa Barbara • Carnegie Mellon University • ExxonMobil. Slide: Courtesy Marcio Faerman

27 Acknowledgements • Southern California Earthquake Center (SCEC) • San Diego Supercomputer Center (SDSC) • Funding: National Science Foundation

28 Thanks for your patience. Q&A. Websites: http://www.sdsc.edu/us/sac (Computation), http://epicenter.usc.edu/cmeportal/TeraShake.html (Seismology), http://visservices.sdsc.edu/projects/scec/terashake (Visualization)

