
1 On Parallel Time Domain Finite Difference Computation of the Elastic Wave Equation and Perfectly Matched Layers (PML) Absorbing Boundary Conditions (With Some SCEC Perspectives). Kim Olsen and Carey Marcinkovich, Institute for Crustal Studies, University of California at Santa Barbara. UTAM Meeting 2003, University of Utah, February 13.

2 Growth of Earthquake Risk. Expansion of urban centers in tectonically active areas is driving an exponential increase in earthquake risk. [Figure: growth of cities 2000-2015 and increasing loss; source: National Geographic]

3 Risk Equation. Risk = probable loss (lives & dollars) = Hazard × Exposure × Fragility, where Hazard covers faulting, shaking, landsliding, and liquefaction; Exposure is the extent and density of the built environment; and Fragility is structural vulnerability.

4 The FEMA 366 Report, “HAZUS’99 Estimates of Annual Earthquake Losses for the United States”, September 2000. The U.S. annualized earthquake loss (AEL) is about $4.4 billion/yr; for 25 states, AEL > $10 million/yr. 74% of the total is concentrated in California, and 25% is in Los Angeles County alone.

5 1994 Northridge Earthquake. When: 17 Jan 1994. Where: San Fernando Valley. Damage: $20 billion. Deaths: 57. Injured: >9000.

6 Southern California Earthquake Center (http://www.scec.org)
Consortium of 14 core institutions and 26 other participating organizations, founded as an NSF STC in 1991. Co-funded by NSF and USGS under the National Earthquake Hazards Reduction Program (NEHRP).
Mission:
– Gather all kinds of data on earthquakes in Southern California
– Integrate the information into a comprehensive, physics-based understanding of earthquake phenomena
– Communicate this understanding to end-users and the general public to increase earthquake awareness, reduce economic losses, and save lives
Core institutions: California Institute of Technology; Columbia University; Harvard University; Massachusetts Institute of Technology; San Diego State University; Stanford University; U.S. Geological Survey (3 offices); University of California, Los Angeles; University of California, San Diego; University of California, Santa Barbara; University of Nevada, Reno; University of Southern California (lead)

7 SCEC/ITR Project
Goal: to develop a cyberinfrastructure that can support system-level earthquake science – the SCEC Collaboratory
Funding: $10M grant over 5 yrs from the NSF/ITR program (CISE and Geoscience Directorates)
Start date: Oct 1, 2001
[Diagram: the SCEC/ITR Project connects NSF, the SCEC institutions, IRIS, USGS, ISI, and SDSC, spanning information science and earth science]

8 ITR Goals. Develop an information infrastructure for system-level earthquake science that can:
– Capture and manipulate the knowledge that will permit a variety of users with different levels of sophistication to configure complex computational pathways.
– Enable execution of physics-based simulations and data inversions that incorporate advances in fault-system dynamics, rupture dynamics, wave propagation, and non-linear site response.
– Manage large, distributed collections of simulation results, as well as the large sets of geologic, geodetic, and seismologic data required to validate the simulations and constrain parameter values.
– Provide access to SHA products and methodologies to practicing engineers, emergency managers, decision-makers, and the general public.

9 Computational Pathways (AWM = Anelastic Wave Model, SRM = Site Response Model, FSM = Fault System Model, RDM = Rupture Dynamics Model)
– Pathway 1: Standard seismic hazard analysis (earthquake forecast model + attenuation relationship → intensity measures)
– Pathway 2: Ground motion simulation (AWM + SRM → ground motions)
– Pathway 3: Physics-based earthquake forecasting (FSM + RDM)
– Pathway 4: Ground motion inverse problem (invert ground motions and other data: geology, geodesy)
All pathways build on a Unified Structural Representation (faults, motions, stresses, anelastic model).

10 Short-Term Objectives
– Development and verification of the computational modules
– Standardization of data structures and interfaces needed for interoperability
– Development of object classes, control vocabularies, and ontologies for knowledge management and interoperability
– Construction of computational and data grid testbeds
– Development of user interfaces for knowledge ingest and acquisition, code execution, and visualization
– Incorporating Knowledge Representation and Reasoning (KR&R)

11 Validation of AWMs: a sampling of academic and industry codes on a uniform half-space test. [Figures: waveform comparisons before and after validation]

12 No-memory-variable formulation (Maxwell) versus coarse memory-variable formulation (8 relaxation mechanisms); reference solution (FK) in red.
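For reference, one common form of the memory-variable (generalized standard-linear-solid) stress relation behind the comparison on slide 12 is sketched below; the exact convention and coefficients used in the code are not shown in the talk and may differ:

    \sigma(t) = M_U\,\varepsilon(t) - \sum_{\ell=1}^{n} \zeta_\ell(t),
    \qquad
    \frac{d\zeta_\ell}{dt} + \omega_\ell\,\zeta_\ell = \omega_\ell\,\delta M_\ell\,\varepsilon(t),

where M_U is the unrelaxed modulus, and the relaxation frequencies \omega_\ell and modulus defects \delta M_\ell (here n = 8 mechanisms) are chosen to fit a target Q(\omega). The coarse-grained variant distributes the n mechanisms over neighboring grid points instead of storing all of them at every point, which is what keeps its memory cost low.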

13 Fourth-order Staggered-grid Finite Differences

14 Fourth-order staggered-grid finite difference modeling method: velocity-stress formulation of Hooke's and Newton's laws. [Diagram: leapfrog time stepping, with stresses at t(n) and t(n+1) and velocities at t(n+1/2); the underlying equations are sketched below]
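The slide shows only the time-staggering diagram; as a sketch of the underlying system (standard notation, not necessarily the exact discretization in the code), the first-order velocity-stress equations and their staggered leapfrog update read:

    \rho\,\partial_t v_i = \partial_j \sigma_{ij} + f_i                                  (Newton)
    \partial_t \sigma_{ij} = \lambda\,\delta_{ij}\,\partial_k v_k + \mu\,(\partial_i v_j + \partial_j v_i)   (Hooke)

    v^{n+1/2} = v^{n-1/2} + \frac{\Delta t}{\rho}\, D_4\, \sigma^{n},
    \qquad
    \sigma^{n+1} = \sigma^{n} + \Delta t\, C\, D_4\, v^{n+1/2},

where C stands for the Hooke's-law stiffness terms and D_4 is the fourth-order staggered difference operator

    D_4 u \approx \frac{1}{\Delta x}\left[\tfrac{9}{8}\,(u_{+1/2} - u_{-1/2}) - \tfrac{1}{24}\,(u_{+3/2} - u_{-3/2})\right].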

15 Perfectly Matched Layers (PML) Absorbing Boundary Conditions

16 PML Discretization
– Split the wave equation into components parallel and perpendicular to the boundary
– Apply damping to the perpendicular component
– Sum the components to recover the total field (see the split-field sketch below)
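As a generic illustration of the split-field idea (which variables are split, and the damping profile d(x), are assumptions here, not read off the slide), the x-boundary PML update for the velocity component v_x can be written as:

    v_x = v_x^{\perp} + v_x^{\parallel},
    \qquad
    \partial_t v_x^{\perp} + d(x)\, v_x^{\perp} = \frac{1}{\rho}\,\partial_x \sigma_{xx},
    \qquad
    \partial_t v_x^{\parallel} = \frac{1}{\rho}\left(\partial_y \sigma_{xy} + \partial_z \sigma_{xz}\right).

Only the derivative normal to the boundary is damped; d(x) grows from zero at the inner edge of the layer to its maximum at the outer grid boundary, and the stress components are split in the same way.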


18 Double-couple Point Source versus Reference

19 Spectral Performance

20 PML(20) versus PML(5) versus Cerjan(20). [Figure: comparison panels for PML(5), PML(20), and Cerjan(20)]

21 PML(20) versus Cerjan(20)

22 Extended Fault Rupture (lots of low frequencies)


24 3D velocity model: PML(10) versus Cerjan(20)

25 PML Efficiency (memory and CPU time normalized to the Cerjan et al. scheme with a 20-point zone)

Scheme               Memory (norm.)   CPU time (norm.)
Cerjan et al. (20)   1.00             1.00
PML(5)               0.79             0.41
PML(10)              1.03             0.58
PML(20)              1.63             1.04

26 PML Stability. Instabilities were encountered only for heterogeneous 3D models, roughly when ||grad(v)|| > ~5; smoothing the velocity model inside the PML usually restores stability (a sketch of such smoothing follows).
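A minimal sketch in Python of the kind of PML-zone smoothing meant here. The filter type, window size, and PML width are illustrative assumptions; the slide does not specify the actual smoother, and in practice the free-surface side would be left untouched:

    import numpy as np
    from scipy.ndimage import uniform_filter

    def smooth_velocity_in_pml(v, npml=10, window=5):
        """Return a copy of the 3D velocity model v (shape nx, ny, nz) with a
        moving-average filter applied only inside PML strips of width npml
        grid points along every face (hypothetical parameter values)."""
        v_smooth = uniform_filter(v, size=window, mode="nearest")
        out = v.copy()
        mask = np.zeros(v.shape, dtype=bool)
        for axis in range(v.ndim):
            lo = [slice(None)] * v.ndim
            hi = [slice(None)] * v.ndim
            lo[axis] = slice(0, npml)        # strip at the low side of this axis
            hi[axis] = slice(-npml, None)    # strip at the high side of this axis
            mask[tuple(lo)] = True
            mask[tuple(hi)] = True
        out[mask] = v_smooth[mask]           # replace values only inside the PML
        return out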

27 Short-Term Objectives
– Development and verification of the computational modules
– Standardization of data structures and interfaces needed for interoperability
– Development of object classes, control vocabularies, and ontologies for knowledge management and interoperability
– Construction of computational and data grid testbeds
– Development of user interfaces for knowledge ingest and acquisition, code execution, and visualization
– Incorporating Knowledge Representation and Reasoning (KR&R)

28 Computational Grid
(1) The scientist issues a request (compute or data retrieval) to the Job Manager.
(2) The Job Manager talks to a testbed computer via Grid service communication protocols.
(3) The testbed computer performs the requested actions.


31 Scaling as a Function of Subdomain Configuration: decomposing the model volume among processors as layers, columns, or cubes (a rough communication-cost sketch follows).
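The reason the subdomain shape matters is the amount of halo (ghost-zone) data each process must exchange per time step. The grid size, process count, and halo width below are made-up illustrative numbers, not values from the talk:

    import math

    # Rough per-process halo-exchange volume (grid points sent per step)
    # for three ways of splitting an N^3 grid over P processes.
    def halo_points(N=960, P=64, width=2):
        """width = halo thickness; 2 suits a fourth-order staggered stencil."""
        p = round(P ** (1.0 / 3.0))   # processes per axis for cubes
        q = round(math.sqrt(P))       # processes per axis for columns
        return {
            # layers: split along one axis -> two N x N faces per subdomain
            "layers":  2 * N * N * width,
            # columns: split along two axes -> four N x (N/q) faces
            "columns": 4 * N * (N // q) * width,
            # cubes: split along all three axes -> six (N/p) x (N/p) faces
            "cubes":   6 * (N // p) * (N // p) * width,
        }

    for name, pts in halo_points().items():
        print(f"{name:8s}: ~{pts:,} halo points exchanged per process")

With these illustrative numbers the ordering is layers > columns > cubes, which is why cube-shaped subdomains usually scale best once the per-process volume becomes small.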

32 Short-Term Objectives
– Development and verification of the computational modules
– Standardization of data structures and interfaces needed for interoperability
– Development of object classes, control vocabularies, and ontologies for knowledge management and interoperability
– Construction of computational and data grid testbeds
– Development of user interfaces for knowledge ingest and acquisition, code execution, and visualization
– Incorporating Knowledge Representation and Reasoning (KR&R)

33 Parallel I/O (MPI-IO); a minimal usage sketch follows.
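A minimal sketch of collective MPI-IO output with mpi4py: each rank writes its own slab of the wavefield into one shared file at a computed byte offset. The file name, array sizes, and slab layout are hypothetical; the talk does not show the actual I/O code:

    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Slab of the wavefield owned by this rank (illustrative sizes)
    nz_local, ny, nx = 16, 128, 128
    slab = np.full((nz_local, ny, nx), rank, dtype=np.float32)

    # Open one shared file and write each slab at its own byte offset,
    # using a collective call so the MPI library can aggregate requests.
    fh = MPI.File.Open(comm, "snapshot.bin",
                       MPI.MODE_WRONLY | MPI.MODE_CREATE)
    fh.Write_at_all(rank * slab.nbytes, slab)
    fh.Close()

Collective writes like this let all ranks dump snapshots or seismogram volumes to a single file without funneling the data through one process.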


