Review of Catalogs and Rate Determination in UCERF2 and Plans for UCERF3
Andy Michael

Historical Earthquake Catalog: 1850–1932
Primary compilation: Toppozada and Branum (2003), M≥5.5; magnitudes are based on the areas shaken at MMI V, VI, and VIII and rely on many other sources for individual events. 417 events in this period; locations and magnitudes are solved for independently.
Bakun (1999, 2000, 2006) independently analyzed 84 events, solving location and magnitude jointly by fitting intensity observations at individual points to a ground motion prediction relationship.
The magnitudes generally agree well. In the Bay Area, 15 events are given the same magnitude, 17 differ slightly, and three differ by more (0.4, 0.5, and 0.6), reflecting sparse data.

Instrumental Catalog: 1932–2006
CGS compilation of the Southern and Northern California catalogs; for Nevada, the NEIC and Nevada Seismological Laboratory catalogs are also used.
Mostly ML for M≥4, with some Md in northern California; Mw for larger recent events is taken from the Global CMT catalog.
From 1972 on, the ANSS merged catalog is used. Some updated SCSN solutions were missing from ANSS, so the updated SCSN versions of those events were used; this appears to have been fixed.

Magnitude Error and Rounding
Toppozada and Branum: ±0.3 units when data are sparse.
Bakun: individually estimated errors, of similar size when data are sparse. Bakun's uncertainties could be underestimates because the magnitude uncertainty does not account for location uncertainty.
When the magnitudes agree and Bakun's errors are smaller, Toppozada's magnitude is used with Bakun's uncertainty.
When the error is unknown, ±0.333 is used, but the true errors could be higher; if the errors are higher than estimated, the rates could be overestimated.

Magnitude Error and Rounding
SCSN: new uncertainties calculated by bootstrapping the amplitudes used to compute the magnitudes.
NCSN: uncertainties for most events are listed in the catalogs.
Global CMT: 0.09 uncertainty from Kagan et al. (2006), consistent with the 0.08 estimated by Bakun (1999).
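A minimal sketch of the amplitude-bootstrap idea for a single event, assuming per-station magnitude estimates (each already combining an amplitude with a distance correction) are in hand; this is an illustration, not the SCSN implementation:

```python
# Resample the stations contributing to one event's magnitude and take the
# spread of the recomputed network magnitudes as the magnitude uncertainty.
import numpy as np

def bootstrap_magnitude_sigma(station_mags, n_boot=1000, seed=0):
    rng = np.random.default_rng(seed)
    station_mags = np.asarray(station_mags, dtype=float)
    boot_means = np.empty(n_boot)
    for i in range(n_boot):
        sample = rng.choice(station_mags, size=station_mags.size, replace=True)
        boot_means[i] = sample.mean()       # network magnitude = mean of stations
    return boot_means.std(ddof=1)

# Hypothetical per-station ML estimates for one event
print(bootstrap_magnitude_sigma([4.1, 4.3, 3.9, 4.2, 4.0, 4.4]))
```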

Rounding:
Historic events: 0.1
SCSN: …
NCSN: …
Global CMT: calculated to 0.01 from the moment
…: 0.1 or 0.5 depending on the event (estimated from histograms of magnitudes)

log N = a − bM
The a- and b-values found here are calculated using the methods employed by the 1996 and 2002 National Hazard Maps, with several revisions: correcting for magnitude error and rounding before calculating a-values, using only modern instrumental data to calculate the b-value, and using a new, comprehensive, spatially variable assessment of the magnitude completeness threshold as a function of time.
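As a concrete illustration of the first revision, here is a minimal sketch (not the UCERF2 code; function and parameter names are my own) of a Gutenberg-Richter fit that applies a bin-width correction to the maximum-likelihood b-value for rounded magnitudes and removes the rate inflation caused by Gaussian magnitude errors, exp(β²σ²/2) with β = b ln 10 (a standard correction often attributed to Tinti and Mulargia, 1985); the error level σ and completeness threshold mc must be supplied by the user:

```python
import numpy as np

def gr_rates(mags, mc, years, sigma=0.0, bin_width=0.1):
    """Return (a, b) for log10 N(>=M) = a - b*M, with N as an annual rate.
    sigma is the assumed standard deviation of the magnitude errors (set it to
    the catalog's estimated error, e.g. 0.3 for sparse historical data)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]                                   # events above completeness
    # Maximum-likelihood b-value with a bin-width correction for rounding
    b = np.log10(np.e) / (m.mean() - (mc - bin_width / 2.0))
    beta = b * np.log(10.0)
    rate_obs = m.size / years                        # observed annual rate, M >= mc
    # Remove the apparent rate inflation caused by symmetric magnitude errors
    rate_true = rate_obs * np.exp(-0.5 * beta**2 * sigma**2)
    a = np.log10(rate_true) + b * mc
    return a, b

# Synthetic check: a b = 1 catalog rounded to 0.1, complete above M 4
rng = np.random.default_rng(0)
mags = np.round((3.0 + rng.exponential(scale=np.log10(np.e), size=50_000)) / 0.1) * 0.1
print(gr_rates(mags, mc=4.0, years=75.0))            # b should come out near 1
```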

We also calculate the seismicity rate in several different ways to account for the fact that the rate may change with time (for example, seismicity rates in the San Francisco Bay Area were higher before 1927 than after), and we perform simulations to evaluate how accurately the seismicity rate averaged over the last 156 years represents the true long-term rate (a simple version of such a simulation is sketched below).
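One hedged sketch of such a simulation, under the simplifying assumption of a time-independent Poisson process with an invented long-term rate (a real test could add clustering and rate changes, which would widen the spread):

```python
# Draw many synthetic 156-year catalogs from a known long-term rate and see
# how far the 156-year average rate typically falls from the truth.
import numpy as np

rng = np.random.default_rng(42)
true_rate = 7.5          # hypothetical long-term rate of target events per year
window = 156             # years of catalog, matching the 1850-2006 span
n_sims = 10_000

counts = rng.poisson(true_rate * window, size=n_sims)
rates = counts / window

low, high = np.percentile(rates, [2.5, 97.5])
print(f"95% of simulated {window}-yr averages fall in [{low:.2f}, {high:.2f}] "
      f"events/yr for a true rate of {true_rate}")
```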

Finally, the National Hazard Maps have traditionally used only the historical earthquake solutions of Toppozada, most recently compiled in Toppozada et al. (2002). We do our calculations both with the Toppozada solutions and with 84 of them replaced by the historical earthquake solutions of Bakun (1999, 2000, 2006). We find that this substitution produces an insignificant 0.6% increase in the statewide seismicity rate, although it may produce larger differences at the regional level.

Completeness
Schorlemmer et al. (2006) base completeness thresholds in southern California on observations of how frequently individual seismic stations detect earthquakes of different magnitudes at different distances. For each location, completeness is determined by proximity to stations that record earthquakes with sufficient quality. The approach is extended to historical sources.
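A minimal sketch of the proximity idea, with made-up detection-threshold coefficients (c0, c1) and a hypothetical requirement that k stations detect an event before it can be located; the actual Schorlemmer et al. (2006) method uses empirically derived per-station detection probabilities rather than a simple distance law:

```python
import numpy as np

def completeness_at_point(point, stations, k=4, c0=0.5, c1=1.2):
    """point: (x, y) in km; stations: list of (x, y) in km.
    c0 and c1 are assumed, uncalibrated detection-threshold coefficients."""
    d = np.linalg.norm(np.asarray(stations, float) - np.asarray(point, float), axis=1)
    d = np.maximum(d, 1.0)                    # avoid log10(0) for co-located points
    m_detect = c0 + c1 * np.log10(d)          # smallest magnitude each station sees
    return np.sort(m_detect)[k - 1]           # complete once k stations can detect

stations = [(0, 0), (30, 10), (60, -20), (15, 45), (80, 40)]
print(completeness_at_point((40, 20), stations, k=4))
```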

Plans for UCERF3:
Further consider uncertainties and biases in intensity assignments and magnitudes of historic earthquakes. Could these change the seismicity rate in the historic catalog? Could this change the empirical model? Could this improve the fit between observed and modeled rates? What can be done without redoing the entire historic catalog?
Declustering: traditionally done with Gardner and Knopoff (1974), which changes the b-value from 1 to about 0.8 (a window-based sketch follows this slide); ETAS models instead use the same magnitude-frequency distribution for mainshocks and aftershocks. Consider other declustering methods. How will this affect rates? How do we do this while remaining consistent with the national maps?
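For reference, a minimal sketch of window-based declustering in the spirit of Gardner and Knopoff (1974), using one commonly cited fit to their space-time windows and a crude flat-earth distance; this is an illustration, not the exact procedure used for the national maps:

```python
import numpy as np

def gk_window(mag):
    """Space (km) and time (days) windows, a common fit to the Gardner-Knopoff table."""
    d_km = 10 ** (0.1238 * mag + 0.983)
    t_days = 10 ** (0.032 * mag + 2.7389) if mag >= 6.5 else 10 ** (0.5409 * mag - 0.547)
    return d_km, t_days

def decluster(times_days, lats, lons, mags):
    """Return a boolean mask that is True for events kept as mainshocks."""
    times, lats, lons, mags = map(np.asarray, (times_days, lats, lons, mags))
    keep = np.ones(times.size, dtype=bool)
    for i in np.argsort(mags)[::-1]:              # treat the largest events first
        if not keep[i]:
            continue                              # already flagged as an aftershock
        d_km, t_days = gk_window(mags[i])
        dt = times - times[i]
        dist = 111.2 * np.hypot(lats - lats[i],
                                (lons - lons[i]) * np.cos(np.radians(lats[i])))
        after = (dt > 0) & (dt <= t_days) & (dist <= d_km) & (mags < mags[i])
        keep[after] = False                       # drop smaller events in the window
    return keep
```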

More Plans for UCERF3:
Changes in instrumental magnitudes.
Characterize off-fault seismicity: focal mechanisms and Mmax.
New assignments of historic events to faults (?)
New approaches to smoothed seismicity rates.
Magnitude-frequency distributions: characteristic versus Gutenberg-Richter.