Working Group on California Earthquake Probabilities (WGCEP) Development of a Uniform California Earthquake Rupture Forecast (UCERF)



WGCEP Goals:
- Provide the California Earthquake Authority (CEA) with a statewide, time-dependent ERF that uses "best available science," is endorsed by the USGS, CGS, and SCEC, and is evaluated by the Scientific Review Panel (SRP) and by CEPEC & NEPEC
- Coordinate with the next National Seismic Hazard Mapping Program (NSHMP) time-independent model
- CEA will use this to set earthquake insurance rates (they want 5-year forecasts, maybe 1-year in the future)

[Diagram] UCERF Model Components: (A) Fault Model(s) → (B) Deformation Model(s) → (C) Earthquake Rate Model(s) → (D) Earthquake Prob Model(s), with a "black box" calculation between each component.

[Diagram] UCERF Model Components, with supporting databases feeding in: Fault Section Database, Paleo Sites Database, GPS Database, Historical Earthquake Catalog, and Instrumental Earthquake Catalog → (A) Fault Model(s) → (B) Deformation Model(s) → (C) Earthquake Rate Model(s) → (D) Earthquake Prob Model(s).

Delivery Schedule
- February 8, 2006 (to CEA): UCERF 1.0 & S. SAF Assessment
- Aug 31, 2006 (to CEA): Fault Section Database 2.0; Earthquake Rate Model 2.0 (preliminary for NSHMP)
- April 1, 2007 (to NSHMP): Final, reviewed Earthquake Rate Model (for use in 2007 NSHMP revision)
- September 30, 2007 (to CEA): UCERF 2.0 (reviewed by SRP and CEPEC)
- UCERFs ≥3 later

Aug 31 Deliverables:
i. Fault Section Database 2.0: "… will contain the parameters describing a revised statewide set of fault sections. This statewide set will be sufficient for building Earthquake Rate Model 2.0."
ii. Earthquake Rate Model 2.0: "… will be the product delivered to the USGS as the USC-SCEC/USGS/CGS input to NSHMP (2007)."
Reality: this will be a preliminary version of what's ultimately used, as revisions up to the last minute are inevitable (versions 2.X). NSHMP will have their own public review in October. Thus, how polished does the Aug. 31 delivery need to be? Branch weights will not be included (except maybe preliminary ones).

Aug 31 Deliverables:
Incremental changes to NSHMP (2002) because:
- We want to keep track of what changes matter
- Don't promise more than we can deliver
- NSHMP consumers don't want more than this (more ambitious changes in versions ≥3.0)
i. Fault Section Database 2.0: "… will contain the parameters describing a revised statewide set of fault sections. This statewide set will be sufficient for building Earthquake Rate Model 2.0."
ii. Earthquake Rate Model 2.0: "… will be the product delivered to the USGS as the USC-SCEC/USGS/CGS input to NSHMP (2007)."

Aug 31 Deliverables:
Development strategy:
- "Focus on what's important rather than what's interesting"
- Statewide consistency
- Simplify wherever possible
Question: What hazard or loss metric should we use?
i. Fault Section Database 2.0: "… will contain the parameters describing a revised statewide set of fault sections. This statewide set will be sufficient for building Earthquake Rate Model 2.0."
ii. Earthquake Rate Model 2.0: "… will be the product delivered to the USGS as the USC-SCEC/USGS/CGS input to NSHMP (2007)."

[Diagram] Hazard Calculation: an Earthquake-Rupture Forecast (with its list of adjustable parameters and a time span) and an Intensity-Measure Relationship (with its supported intensity-measure types and site-related parameters) are combined with a site location and an intensity-measure type & level (IMT & IML) to compute Prob(IMT ≥ IML).
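To make the flowchart concrete, here is a minimal Poissonian hazard-curve sketch in Python. The two-rupture "forecast" and the lognormal ground-motion numbers are invented stand-ins for a real ERF and intensity-measure relationship (e.g., as implemented in OpenSHA):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical forecast: (annual rate, median IM in g, sigma of ln IM)
# for each rupture affecting the site -- stand-ins for a real ERF + IMR.
ruptures = [
    (0.010, 0.35, 0.6),
    (0.002, 0.80, 0.6),
]

def prob_exceed(iml, time_span_yrs=50.0):
    """Poissonian P(IMT >= IML) at the site over the given time span."""
    annual_rate = sum(
        rate * norm.sf(np.log(iml / median) / sigma)  # P(IM > iml | rupture)
        for rate, median, sigma in ruptures
    )
    return 1.0 - np.exp(-annual_rate * time_span_yrs)

for iml in (0.1, 0.2, 0.4, 0.8):
    print(f"P(PGA >= {iml} g in 50 yr) = {prob_exceed(iml):.3f}")
```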

[Diagram] UCERF Model Components (as above): (A) Fault Model(s) → (B) Deformation Model(s) → (C) Earthquake Rate Model(s) → (D) Earthquake Prob Model(s). Note: the Earthquake Rate Model is itself a time-independent ERF.

The ERF Adjustable Parameters are the epistemic uncertainties:

Those in Earthquake Rate Model 2.0 include (so far):

These could (should?) be added to Earthquake Rate Model 2.0:

Easily Added:
- Fault Section Database 2.0 uncertainties for: slip rate; upper and lower seismogenic depth; aseismicity factors (if available); average dip for each fault section
- Fraction of moment rate on A & B sources put into smaller events*
- Additional epistemic uncertainty on A- & B-fault magnitudes*

More Work:
- Other slip-per-event assumptions on A-Faults (other than characteristic slip, D_sr = D_s): WG02 slip (D_sr proportional to v_s); uniform/boxcar slip (D_sr = D_r); tapered slip (D_sr decreases toward the ends of the rupture); others?
- Uncertainties in segment mean recurrence intervals
- Uncertainties in segment boundaries for A-Faults
- Combining some adjacent B-Fault sources

Definitely Not Included in Earthquake Rate Model 2.0:
- Deformation Models ≥3.0
- Fault-to-fault rupture jumps (via generalized inverse or simulations)
- Relaxation of assumed segmentation while honoring paleoseismic data (which might demand a segmented model)
These inherent limits should be kept in mind when deciding how much more work should be put into Earthquake Rate Models 2.x.

The outer branches of a logic tree represent an “ERF Epistemic List” (a list of ERFs w/ diff. param. settings & associated weights): How extensively should the logic-tree be sampled? Example with WGCEP-2002 … What will actually be used?
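Computationally, an ERF Epistemic List is just an enumeration of branch combinations with product weights. A sketch, with hypothetical branch names and weights (not the actual UCERF branches):

```python
import itertools

# Hypothetical branch options and weights (illustration only)
branches = {
    "mag_area":  [("Ellsworth-B", 0.5), ("HanksBakun2002", 0.5)],
    "def_model": [("D2.1", 0.6), ("D2.2", 0.4)],
    "b_value":   [(0.8, 0.5), (1.0, 0.5)],
}

def erf_epistemic_list():
    """Yield (parameter settings, weight) for every outer logic-tree branch."""
    names = list(branches)
    for combo in itertools.product(*branches.values()):
        settings = {n: v for n, (v, _) in zip(names, combo)}
        weight = 1.0
        for _, w in combo:
            weight *= w
        yield settings, weight

erfs = list(erf_epistemic_list())
print(len(erfs), "ERFs; total weight =", sum(w for _, w in erfs))
```

With three two-option branches this yields 2^3 = 8 ERFs whose weights sum to 1; the sampling question above is whether every such combination, or only a subset, gets carried through the hazard calculation.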

Basic Questions for SRP:
- What more should be accomplished by Aug. 31st?
- What should be accomplished by April 1 (NSHMP deadline)?
- What hazard or loss metric should be used?
- How extensively should the logic-tree be sampled?
- How shall the branch weights be assigned?
- How shall the formal review proceed?
Items to review: Fault Section Database 2.0; Fault Models 2.1 & 2.2; Deformation Models 2.x; Earthquake Catalog; Regional Seismicity Constraints; Magnitude-Area Relations; Segment Recurrence Data; Alt. A-Fault Rupture Models; Type-B Fault & C-zone models; Recipe for combining everything.

Intro to Fault Section Database, Fault Models, and Deformation Models

Somerville (2006): M = 3.87 + 1.05·log(A)
Ellsworth A: M = 4.1 + log(A)
Ellsworth B: M = 4.2 + log(A)
Hanks & Bakun (2002): M = 3.98 + log(A) if A ≤ 537 km²; M = 3.07 + (4/3)·log(A) if A > 537 km²
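A small sketch of two of these relations as code, using the Ellsworth-B and Hanks & Bakun (2002) coefficients as published (area in km²):

```python
import math

def magnitude_from_area(area_km2, relation):
    """Median magnitude from rupture area (km^2)."""
    log_a = math.log10(area_km2)
    if relation == "ellsworth_b":        # M = 4.2 + log(A)
        return 4.2 + log_a
    if relation == "hanks_bakun_2002":   # bilinear in log(A)
        if area_km2 <= 537.0:
            return 3.98 + log_a
        return 3.07 + (4.0 / 3.0) * log_a
    raise ValueError(f"unknown relation: {relation}")

print(magnitude_from_area(1000.0, "hanks_bakun_2002"))  # ~7.07
```

Note the Hanks & Bakun form is continuous at A = 537 km² (both branches give M ≈ 6.71).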

Type-A Fault Rupture Models
If there are S segments, then there are R = S(S+1)/2 different ruptures involving contiguous segments. We want the long-term rate (f_r) of each r-th rupture. We know for each segment: slip rate (v_s) and mean recurrence interval (T_s = 1/λ_s). The constraints are:

Equation Set (1), slip-rate balance: v_s = Σ_r G_sr · D_sr · f_r
Equation Set (2), segment event rates: λ_s = Σ_r G_sr · f_r
Equation Set (3), positivity: f_r ≥ 0

where D_sr is the average slip in the r-th rupture on the s-th segment, and G_sr is a matrix indicating whether the r-th rupture involves the s-th segment (1 if so, 0 if not). The system is under-determined (infinite number of solutions).

WGCEP-2002 Solution
- If an end segment can't go alone, then neither can its neighbor.
- Implicitly assumes D_sr is proportional to v_s.
- Requires moment balancing.

Current WGCEP Solution
With compiled T_s (spreadsheet) and v_s from the chosen deformation model, assume characteristic slip (like WGCEP-1995), D_sr = D_s = v_s/λ_s = v_s·T_s, and solve for the following (by hand):
- Minimum Rate Solution: that which minimizes the total rate of ruptures (and therefore maximizes event magnitudes), consistent with observations.
- Maximum Rate Solution: that which maximizes the total rate of ruptures (and therefore minimizes event magnitudes), consistent with observations.
- Equal Rate Solution: that which makes all f_r as equal as possible.
- Geological Insight Solution: that which makes all f_r as close as possible to a complete set defined by geologists, consistent with observations.
(These don't span solution space, but should span hazard/loss space?)
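A toy sketch of the minimum/maximum-rate idea as a linear program (segment numbers invented for illustration). Under the characteristic-slip assumption, matching the segment rates of Equation Set (2) automatically satisfies the slip-rate balance of Equation Set (1), since Σ_r G_sr D_sr f_r = v_s T_s Σ_r G_sr f_r = v_s; so only the segment-rate equalities appear below:

```python
import numpy as np
from scipy.optimize import linprog

S = 3                                   # segments (toy case)
ruptures = [(i, j) for i in range(S) for j in range(i, S)]  # contiguous
R = len(ruptures)                       # R = S(S+1)/2 = 6

G = np.zeros((S, R))                    # G[s, r] = 1 if rupture r uses segment s
for r, (i, j) in enumerate(ruptures):
    G[i:j + 1, r] = 1.0

seg_rate = 1.0 / np.array([150.0, 200.0, 250.0])   # lambda_s = 1 / T_s

for label, sign in (("min total rate", 1.0), ("max total rate", -1.0)):
    res = linprog(c=sign * np.ones(R), A_eq=G, b_eq=seg_rate,
                  bounds=[(0, None)] * R)
    print(label, np.round(res.x, 5))
```

As expected, the maximum-rate extreme puts everything into single-segment ruptures, while the minimum-rate extreme favors the longest (largest-magnitude) ruptures.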

Current WGCEP Solution — further details:
- Aseismic slip factor applied as a reduction of area or slip rate.
- Each rupture given a Gaussian magnitude PDF with default sigma = 0.12 and truncation at ±2 sigma.

Current WGCEP Solution — magnitude-area relations:
Somerville (2006): M = 3.87 + 1.05·log(A)
Ellsworth A: M = 4.1 + log(A)
Ellsworth B: M = 4.2 + log(A)
Hanks & Bakun (2002): M = 3.98 + log(A) if A ≤ 537 km²; M = 3.07 + (4/3)·log(A) if A > 537 km²

Current WGCEP Solution — what needs further consideration(?):
- Dependence of tabulated mean recurrence intervals (T_s) on the deformation model (v_s).
- Some rupture models are not exactly rate balanced.
- Influence of mean recurrence interval uncertainties. How well & consistently are these uncertainties defined?
- Alternatives to the characteristic slip assumption (currently trying to implement a generalized inversion solution using the NNLS algorithm of Lawson and Hanson (1974); a sketch follows).
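A sketch of what such an NNLS-based inversion could look like (toy numbers; a uniform/boxcar slip model is used so that the two equation sets are not redundant, and the row normalization is an assumption, not the actual WGCEP recipe):

```python
import numpy as np
from scipy.optimize import nnls

S = 3
ruptures = [(i, j) for i in range(S) for j in range(i, S)]
R = len(ruptures)                       # R = S(S+1)/2 = 6

G = np.zeros((S, R))                    # G[s, r] = 1 if rupture r uses segment s
for r, (i, j) in enumerate(ruptures):
    G[i:j + 1, r] = 1.0

v = np.array([20.0, 25.0, 20.0])        # slip rates, mm/yr (toy values)
T = np.array([150.0, 200.0, 250.0])     # mean recurrence intervals, yr

# Toy uniform/boxcar slip: every segment in rupture r gets the same slip
# D_r (here the mean of v_s * T_s over the involved segments).
D_r = np.array([(v[i:j + 1] * T[i:j + 1]).mean() for (i, j) in ruptures])
D = G * D_r[None, :]                    # D_sr

# Stack Eq. Set (1) (slip-rate balance) and Eq. Set (2) (segment rates),
# normalizing each row by its data value so b is all ones; nnls itself
# enforces Eq. Set (3), f_r >= 0.
A = np.vstack([D / v[:, None], G * T[:, None]])
b = np.ones(2 * S)
f, resid = nnls(A, b)
print("rupture rates f_r:", np.round(f, 6), "misfit:", round(resid, 4))
```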

Current WGCEP Solution: We also have an Un-Segmented option for Type-A faults (same as for Type-B faults).

Type-B Fault Rupture Models
One for each fault section in the database that is not part of a Type-A fault (although for Concord-Green Valley & Greenville, sections are combined).
Set the following:
1) Deformation Model
2) Aseismicity Factor — reduced area?
3) Mag-Area Relationship (to get mean and upper mag for char and GR dists)
4) % Char vs GR
5) Mag Sigma (for char dist)
6) Truncation Level (for char dist)
7) B-Faults b-value (for GR dist)
GR lower mag = 6.5. If GR upper mag < 6.5, all moment rate goes into the Char dist.
NSHMP-2002 used a truncation level of 1.25 sigma (rather than the 2 sigma of WGCEP-2002), but also added an additional ±0.2 epistemic uncertainty to both the Char and Upper-GR mags (not yet done here).
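A sketch of building the characteristic part of such a source — a truncated-Gaussian MFD scaled to carry a target moment rate (toy values; the char-vs-GR split and other bookkeeping of the actual model are simplified away):

```python
import numpy as np
from scipy.stats import norm

def char_mfd(moment_rate, m_char, sigma=0.12, trunc=2.0, dm=0.01):
    """Truncated-Gaussian characteristic MFD scaled to a target moment rate."""
    mags = np.arange(m_char - trunc * sigma, m_char + trunc * sigma + dm, dm)
    shape = norm.pdf((mags - m_char) / sigma)    # relative rates
    moment = 10.0 ** (1.5 * mags + 9.05)         # M0 in N*m per event
    rates = shape * moment_rate / np.sum(shape * moment)
    return mags, rates                           # incremental event rates

mags, rates = char_mfd(moment_rate=1.0e17, m_char=7.0)  # toy: 1e17 N*m/yr
print("total event rate:", rates.sum(), "/yr")
```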

Type-B Fault Rupture Models — NSHMP-2002 B-Fault exceptions (special cases with "fixed" values):
- Owl Lake (M6.5, rate 0.002/yr)
- Owens Valley (M7.6): magnitude fixed at that of the 1872 earthquake
- Honey Lake (M6.9): recurrence of about 1500 years based on the paleoseismic study by Wills and Borchardt (1991)
- Eureka Peak (M6.4): the 5,000-yr recurrence is similar to the other Mojave faults
- Burnt Mountain (M6.5): the 5,000-yr recurrence is the same as other Mojave faults
- Cucamonga (M6.9): maximum magnitude and recurrence of about 650 years based on 2 m average displacement
- Sierra Madre-San Fernando (M6.7, rate 0.001/yr): magnitude based on the San Fernando earthquake; recurrence of 1 ka based on USGS (1996)
- Palos Verdes (M7.2): recurrence of 650 years based on McNeilan et al. (1996)
- Blackwater (M6.9), Calico-Hidalgo (M7.2), Gravel Hills-Harper Lake (M7.0), Helendale-S. Lockhart (M7.2), Lenwood-Lockhart-Old Woman Springs (M7.5), Pisgah-Bullion Mountain-Mesquite Lake (M7.2), South Emerson-Copper Mountain (M6.9), Johnson Valley (M6.7): based on paleoseismic studies by Hecker et al. (1993), Rubin and Sieh (1993), and Herzberg and Rockwell
- Landers (M7.3): based on the same paleoseismic studies; magnitude based on the 1992 Landers earthquake
- Maacama (M7.5): floated a M7.5 earthquake because the discontinuous strands of the fault were not thought to be capable of rupturing in a larger event

Type-C Zone Rupture Models
For events in areas where deformation is occurring over a wide area on poorly known or unknown faults: modeled as strike-slip events oriented along the structural trend (fixed strike); GR between mag 6.5 and 7.3 (except the Foothills zone gets mag 6–7); moment rate computed as μ·L·h·slipRate. 4 new zones:
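For example, the μ·L·h·slipRate moment-rate bookkeeping (zone dimensions below are invented; μ = 3×10^10 Pa is the usual crustal shear-modulus assumption):

```python
MU = 3.0e10  # shear modulus, Pa (standard crustal assumption)

def zone_moment_rate(length_km, seismo_thickness_km, slip_rate_mm_yr):
    """Moment rate of a C zone: mu * L * h * slipRate, in N*m/yr."""
    return (MU * length_km * 1.0e3 * seismo_thickness_km * 1.0e3
            * slip_rate_mm_yr * 1.0e-3)

# Toy example: a 100-km-long zone, 12 km seismogenic thickness, 2 mm/yr
print(f"{zone_moment_rate(100.0, 12.0, 2.0):.3e} N*m/yr")  # ~7.2e16
```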

In computing total regional seismicity parameters, Karen recommends several revisions to the NSHMP-2002 method, including:
- making corrections for magnitude error and rounding before calculating a-values
- using only modern instrumental data to calculate the b-value
- leaving aftershocks in the catalog when calculating parameters for the entire study region
- using an inverse power law rather than a Gaussian smoothing kernel when calculating spatially variable seismicity rates
Background Seismicity Results:
- 14% lower total rate of M≥5 events
- b-value of 1.0 rather than 0.8
- a different spatial distribution of seismicity

(Recommendations and results as on the previous slide.)
Her best estimate of the total rate of M≥5 events is 10 ± 1.2 per year including aftershocks, and 5.4 ± 0.8 per year otherwise (assuming the definition of aftershocks applied by NSHMP-2002). However, for our definition of "California" (the RELM-test region), multiply these by 0.75 to get:
- Rate of M≥5 = 7.5 ± 0.9 including aftershocks
- Rate of M≥5 = 4.0 ± 0.6 excluding aftershocks

Background Seismicity
Background seismicity is computed as the total target rate minus the rates implied by the Type-A, Type-B, and C-zone sources. The adjustable parameters (epistemic uncertainties) are:
- Total rate of M≥5 events (default = 7.5 including aftershocks; 4.0 excluding)
- Regional b-value (default = 1.0)
- Maximum magnitude of the background (default = 7, as in NSHMP)
The magnitude-frequency distribution for all background seismicity combined is computed as follows (sketched below):
1) Create a target cumulative Gutenberg-Richter distribution between magnitude 5.0 and the maximum magnitude of the background seismicity (truncated on the incremental distribution) with the specified b-value and total rate of M≥5 events.
2) From this target distribution, subtract the cumulative magnitude-frequency distribution of all the Type-A, Type-B, and C-zone sources.
3) Set any negative rates in the resultant magnitude-frequency distribution to zero.
What remains is then applied as background seismicity using the relative spatial distribution given in Appendix G (taking care to lower the relative rates accordingly over the A-Fault, B-Fault, and C-zone sources). The total regional moment rate of the model varies with the assumed max-mag of the background.
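A sketch of steps 1–3. The cumulative form below follows from truncating the incremental GR distribution at the maximum magnitude; the all-zeros source MFD is a placeholder for the actual fault/zone sources:

```python
import numpy as np

def background_cum_mfd(mags, rate_ge_5, b, m_max, source_cum_mfd):
    """Target truncated-GR cumulative MFD minus source MFD, floored at zero."""
    tail = 10.0 ** (-b * (m_max - 5.0))
    target = rate_ge_5 * (10.0 ** (-b * (mags - 5.0)) - tail) / (1.0 - tail)
    target = np.where(mags <= m_max, target, 0.0)   # nothing above m_max
    return np.maximum(target - source_cum_mfd, 0.0)  # step 3: zero negatives

mags = np.arange(5.0, 8.01, 0.1)
sources = np.zeros_like(mags)   # placeholder cumulative MFD of A/B/C sources
bg = background_cum_mfd(mags, rate_ge_5=7.5, b=1.0, m_max=7.0,
                        source_cum_mfd=sources)
print(bg[0], "background events/yr with M>=5 (toy case)")
```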

Results Magnitude frequency distributions (MFDs) obtained from Earthquake Rate Model 2.0 with default parameters (listed in Table 1 and shown in Figure 3). The bold black line is the total model MFD, blue is for all Type A sources, green is for the Gutenberg-Richter part of all Type B sources, charcoal is for the characteristic part of all Type B sources, and hot pink is for the background sources. Red is simply a target MFD that should be ignored at high magnitudes.

Results after changing: Mag-Area Relationship = Ellsworth-B; Regional b-value = 0.9.

Those in Earthquake Rate Model 2.0 include (so far):

These could (should?) be added to Earthquake Rate Model 2.0:

Easily Added:
- Fault Section Database 2.0 uncertainties for: slip rate; upper and lower seismogenic depth; aseismicity factors (if available); average dip for each fault section
- Fraction of moment rate on A & B sources put into smaller events*
- Additional epistemic uncertainty on A- & B-fault magnitudes*

More Work:
- Other slip-per-event assumptions on A-Faults (other than characteristic slip, D_sr = D_s): WG02 slip (D_sr proportional to v_s); uniform/boxcar slip (D_sr = D_r); tapered slip (D_sr decreases toward the ends of the rupture); others?
- Uncertainties in segment mean recurrence intervals
- Uncertainties in segment boundaries for A-Faults
- Combining some adjacent B-Fault sources

Definitely Not Included in Earthquake Rate Model 2.0:
- Deformation Models ≥3.0
- Fault-to-fault rupture jumps (via generalized inverse or simulations)
- Relaxation of assumed segmentation while honoring paleoseismic data (which might demand a segmented model)
These inherent limits should be kept in mind when deciding how much more work should be put into Earthquake Rate Models 2.x.

The outer branches of a logic tree represent an “ERF Epistemic List” (a list of ERFs w/ diff. param. settings & associated weights): How extensively should the logic-tree be sampled? Example with WGCEP-2002 … What will actually be used?

Basic Questions for SRP:
- What more should be accomplished by Aug. 31st?
- What should be accomplished by April 1 (NSHMP deadline)?
- What hazard or loss metric should be used?
- How extensively should the logic-tree be sampled?
- How shall the branch weights be assigned?
- How shall the formal review proceed?
Items to review: Fault Section Database 2.0; Fault Models 2.1 & 2.2; Deformation Models 2.x; Earthquake Catalog; Regional Seismicity Constraints; Magnitude-Area Relations; Segment Recurrence Data; Alt. A-Fault Rupture Models; Type-B Fault & C-zone models; Recipe for combining everything.

[Diagram] WGCEP Organization & Funding Sources: funding from NSF, CEA, USGS, and the State of CA flows to the geoscience organizations (SCEC, CGS, USGS Menlo Park, USGS Golden), overseen by a Management Oversight Committee (MOC). The WGCEP Executive Committee (ExCom) provides working-group leadership over task-oriented subcommittees (Subcom. A, B, C, …), with a Scientific Review Panel (SRP) reviewing. SCEC will provide CEA with a single-point interface to the project.

WGCEP Management Oversight Committee (MOC):
- SCEC: Thomas H. Jordan (CEA contact)
- USGS, Menlo Park: Rufus Catchings
- USGS, Golden: Jill McCarthy
- CGS: Michael Reichle
WGCEP Management is in charge of resource allocation and of approving all project plans, budgets, and schedules. Their signoff will constitute the SCEC/USGS/CGS endorsement.

WGCEP Executive Committee:
- Edward (Ned) Field, SCEC/USGS, Pasadena
- Thomas Parsons, USGS, Menlo Park
- Chris Wills, CGS
- Ray Weldon, SCEC/U of O
- Mark Petersen, USGS, Golden
- Ross Stein, USGS, Menlo Park
Responsible for convening experts, reviewing options, making decisions, and orchestrating implementation of the model and supporting databases. The role of leadership is not to advocate models, but to accommodate whatever models are appropriate.
Contributors (key scientists): provide expert opinion and/or specific model elements, likely receiving funding & documenting their contributions.

Scientific Review Panel: Bill Ellsworth (chair), Art Frankel, David Jackson, Jim Dieterich, Lloyd Cluff, Allin Cornell, Mike Blanpied, David Schwartz
CEPEC: Lucile Jones, Duncan Agnew, Tom Jordan, Mike Reichle, Jim Brune, David Oppenheimer, William Lettis, Paul Segall, John Parrish
This group will ultimately decide whether we've chosen a minimum set of alternative models that adequately spans the range of viable 5-year forecasts for California.

Issues/Possible Innovations
1) Statewide model
2) Use of CFM (including alternatives)
3) Use GPS data via kinematically consistent deformation model(s)
4) Relax strict segmentation assumptions
5) Allow fault-to-fault jumps
6) Apply elastic-rebound-motivated renewal models in (4) & (5)
7) Include earthquake triggering effects
8) Deploy as extensible, adaptive (living) model
9) Simulation enabled

Decision Making Process
Two types of decisions:
1) what model components to include (logic-tree branches)
2) what weights to apply to each
Decisions will be made on a case-by-case (or branch-by-branch) basis (web site has details).

Decision Making Process — in general:
1. The ExCom hosts meetings/workshops to solicit expert opinion.
2. The ExCom, perhaps with assistance from others, drafts proposed branches and preliminary weights with full documentation and posts these on the web.
3. Feedback is requested from the broader community and responses are entered into an official record.
4. The ExCom revises and documents accordingly.
5. The SRP reviews the entire process and iterates with the ExCom if need be (the MOC serves as referee).

Decision Making Process This entire decision making process will be well documented for posterity. We will also strive to establish a quantitative basis for setting weights, both for numerical reproducibility and future modifications. However, it may be that "gut feeling" will in some cases be the best or only way to assimilate a large number of constraints.

Validation & Verification
- Verification will be conducted via standard practice in software development (e.g., JUnit testing for our Java classes).
- Validation via participation in RELM/CSEP testing efforts (although these won't be definitive anytime soon).
- Test the assumptions that go into the models.
- Examine simulated catalogs.
Both validation and verification will be addressed on a case-by-case basis; we will have explicit sections dedicated to each in the formal documentation of all model components.

More Info?
- UCERF 1 vs UCERF 2
- UCERF 2 Logic Tree
- Possible Innovations:
1) Statewide model
2) Use of CFM (including alternatives)
3) Use GPS data via kinematically consistent deformation model(s)
4) Relax strict segmentation assumptions
5) Allow fault-to-fault jumps
6) Apply elastic-rebound-motivated renewal models in (4) & (5)
7) Include earthquake triggering effects
8) Deploy as extensible, adaptive (living) model
9) Simulation enabled

UCERF 1 vs 2
- Updated/revised fault models, slip rates, and aseismic-slip-factor estimates
- Revision of rupture models for Type-A faults based on new information, and to achieve more statewide consistency with respect to the range of segmented vs. cascade vs. floating-rupture models
- Reexamination of Type-B faults and their magnitude-frequency distributions
- Reconsideration of how historical seismicity is smoothed to generate the distribution of background events
- Application of the range of time-dependent probability models considered by WGCEP-2002 on a consistent, statewide basis (making adjustments/improvements where necessary)

Issues/Possible Innovations
1) Statewide model
2) Use of CFM (including alternatives)
3) Use GPS data via kinematically consistent deformation model(s)
4) Relax strict segmentation assumptions
5) Allow fault-to-fault jumps
6) Apply elastic-rebound-motivated renewal models in (4) & (5)
7) Include earthquake triggering effects
8) Deploy as extensible, adaptive (living) model
9) Simulation enabled

Issues/Possible Innovations — 1) Statewide model [UCERF coverage map]

Issues/Possible Innovations — 2) Use of CFM (including alternatives). Take this statewide?

Issues/Possible Innovations — 3) Use GPS data via kinematically consistent deformation model(s), e.g., Peter Bird's NeoKinema.

[Diagram: UCERF model components] For the Deformation Model(s) we want:
1) Improved slip rates on major faults
2) Strain rates elsewhere
3) GPS data included (from the Geodesy group)
4) Kinematically consistent
5) Accommodates all important faults
6) Can accommodate alternative fault models
7) Accounts for geologic and geodetic data uncertainties
8) Includes viscoelastic effects
9) Includes significant 3D effects
10) Statewide application

[Diagram: UCERF model components] Deformation model wish list (as above, with "deformation rates elsewhere"). No model has all these attributes. Are any existing models better than sticking with what we have? WGCEP recommending pursuit of:
- NeoKinema
- Harvard-MIT Block Model
- Parsons' FEM
- Shen & Zeng
- Perhaps others …

[Diagram: UCERF model components] No model has all these attributes. Are any existing models better than doing nothing? WGCEP recommending pursuit of NeoKinema, the Harvard-MIT Block Model, Parsons' FEM, and perhaps others. Delivery will be revised slip rates on modeled faults (and perhaps deformation rates elsewhere, and stressing rates on faults).

Issues/Possible Innovations — 4) Relax strict segmentation assumptions. All previous WGCEPs have assumed strict segmentation, more recently allowing both single- and multi-segment ruptures (cascades). E.g., WGCEP-2002:

Issues/Possible Innovations — 4) Relax strict segmentation. But… [figure: viable interpretations of S. SAF paleoseismic data (Weldon et al.)]

Issues/Possible Innovations — 4) Relax strict segmentation. Does it matter (all models are discretized to some extent)? [Maps: SJF, SB-SJV segment intersection; SAF, Mojave-San Bernardino intersection; 50% in 50 yrs]

Issues/Possible Innovations — 5) Allow fault-to-fault jumps. No previous WGCEPs have allowed such ruptures.

Issues/Possible Innovations — 5) Allow fault-to-fault jumps. But… [figure]

Issues/Possible Innovations — 5) Allow fault-to-fault jumps.
"Fault Interactions and Large Complex Earthquakes in the Los Angeles Area," Anderson, Aagaard, & Hudnut (2003, Science 302): "… We find that … a large northern San Jacinto fault earthquake could trigger a cascading rupture of the Sierra Madre-Cucamonga system, potentially causing a moment magnitude 7.5 to 7.8 earthquake on the edge of the Los Angeles metropolitan region."

Issues/Possible Innovations — 5) Allow fault-to-fault jumps. Can dynamic rupture modelers help define fault-to-fault jumping probabilities?

Issues/Possible Innovations — 6) Figure out how to apply elastic-rebound-motivated renewal models "properly." Problem: how to compute conditional time-dependent probabilities when you allow both single- and multi-segment ruptures (let alone relaxing segmentation)? The way previous WGCEPs have modeled this seems logically inconsistent …

From WGCEP-2002:

How then do we compute conditional probabilities where we have single and multi-segment ruptures? We have ideas …
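One ingredient is straightforward: the conditional probability for a single renewal model, e.g. a Brownian Passage Time (inverse-Gaussian) distribution, sketched below with illustrative numbers. The hard part — combining this across overlapping single- and multi-segment ruptures — is what remains open:

```python
from scipy.stats import invgauss

def bpt_conditional_prob(mean_ri, alpha, t_elapsed, duration):
    """P(event in (t, t+duration] | quiet through t) for a BPT renewal model.

    scipy's invgauss(mu=alpha**2, scale=mean_ri/alpha**2) has mean mean_ri
    and coefficient of variation alpha (the aperiodicity).
    """
    dist = invgauss(mu=alpha ** 2, scale=mean_ri / alpha ** 2)
    f = dist.cdf
    return (f(t_elapsed + duration) - f(t_elapsed)) / (1.0 - f(t_elapsed))

# Illustrative: 200-yr mean recurrence, aperiodicity 0.5, 150 yr elapsed,
# conditional probability over the next 30 years
print(bpt_conditional_prob(200.0, 0.5, 150.0, 30.0))
```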

Issues/Possible Innovations — 7) Include earthquake triggering effects.
- CEA wants 1- to 5-year forecasts.
- This is between the 30-year forecasts of previous WGCEPs (renewal models) and the 24-hour forecasts of STEP (the CEPEC-endorsed model based on aftershock statistics).
- We are attempting to design a framework that could accommodate a variety of alternative approaches (e.g., from RELM).

Issues/Possible Innovations — 8) Deploy as extensible, adaptive (living) model; i.e., modifications can be made as warranted by scientific developments, the collection of new data, or following the occurrence of significant earthquakes. The model can be "living" to the extent that the update & evaluation process can occur in short order. CEA wants this.

Issues/Possible Innovations — 9) Simulation enabled (i.e., can generate synthetic catalogs).
- Needed to go beyond next-event forecasts if stress interactions and earthquake triggering are included
- Helpful (if not required) for understanding model behavior
- Can be used to calibrate the model (e.g., % moment in aftershocks)
- If we can deal with simulated events, we'll be ready for any real events

Implementation Plan — guiding principles:
- If it ain't broke, don't fix it
- Some of the hoped-for innovations won't work out
- Everything will take longer than we think
- Build components in parallel (not in series)
- Get a basic version of each component implemented ASAP, and add improved versions when available
- We cannot miss the NSHMP and CEA delivery deadlines!

[Diagram] UCERF 1.0 (by Feb. 8, 2006): Fault Model 1.0 = NSHMP-2002 fault model; Deformation Model 1.0 = NSHMP-2002 fault slip rates; Earthquake Rate Model 1.0 = NSHMP-2002 earthquake rate model; Earthquake Prob Model 1.0 = simple conditional probabilities based on dates of last earthquakes.

[Diagram] UCERF 2.0 (by Sept. 30, 2007): Fault Models 2.X = revision of the NSHMP-2002 fault model based on the SCEC CFM (including alternatives) & any desired changes for N. California; Deformation Model 2.0 = revised slip rates for elements in Fault Model 2.0, perhaps constrained by GPS data; Earthquake Rate Model 2.0 = new model based on reevaluation of fault segmentation and cascades (e.g., based on Weldon et al.), which may relax segmentation and allow fault-to-fault jumps — we can at least deliver this for the NSHMP 2007 time-independent hazard maps (by June 1, 2007); Earthquake Prob Model 2.0 = application of more sophisticated time-dependent probability calculations — this goes to CEA.

[Diagram] UCERF 3.0: Deformation Model 3.0 = use of a more sophisticated, California-wide deformation model such as NeoKinema; Earthquake Rate Model 3.0 = e.g., relax segmentation, allow fault-to-fault jumps, and use off-fault deformation rates to help constrain off-fault seismicity; Earthquake Prob Model 3.0 = e.g., enable real-time modification of probabilities based on stress or seismicity-rate changes.

[Same UCERF 3.0 diagram as above] All of these are relatively ambitious, and delivery cannot be guaranteed by 2007.

[Diagram] UCERF Model Components: (A) Fault Model(s) → (B) Deformation Model(s), delivering fault-slip rates (at least) → (C) Earthquake Rate Model(s), delivering the long-term rate of all possible events (on and off modeled faults) → (D) Earthquake Prob Model(s), delivering time-dependent probabilities.
