ShakeAlert CISN Testing Center (CTC) Development


1 ShakeAlert CISN Testing Center (CTC) Development
Philip Maechling, Maria Liukis, Thomas H. Jordan
Southern California Earthquake Center (SCEC)
14 October 2010
SCEC: An NSF + USGS Research Center

2 CTC Progress in 2010
Operating the algorithm evaluation system, with California-based performance reports and raw data available (2008-present).
Changed our automated software testing infrastructure from a web-based (Joomla) system to a server-based (CSEP) system.
Added a ShakeMap RSS reader to CSEP as a source of authorized observational data for evaluating earthquake parameter and ground motion forecasts.
Implemented a prototype EEW forecast evaluation test that plots the PGV used in ShakeMaps for each event.
Began nightly automated retrieval of observational data from the ShakeMap RSS feed and creation of observation-based ground motion maps.
Started implementing the ground motion forecast evaluation defined in the 2008 CISN Testing Document.

3 EEW Testing Center Provides On-going Performance Evaluation
Performance summaries available through login.
Evaluation results for 2010 include 144 M4+ earthquakes in the CA Testing Region.
Cumulative raw summaries (2008-present) posted at scec.usc.edu/scecpedia/Earthquake_Early_Warning.

4 EEW Testing Center Provides On-going Performance Evaluation
Example performance information from the Algorithm Testing System for 2010:
Total events M4.0+ in the California Testing Region in 2010: 146
Events M4.0+ in region in 2010 with EEW triggers: 57
Events M4.0+ in region in 2010 with only good triggers: 45
Events M4.0+ in region in 2010 with only missed triggers: 1
Events M4.0+ in region in 2010 with both types of triggers: 11
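The trigger tallies above can be reproduced mechanically from per-event trigger records. A minimal sketch, assuming a hypothetical mapping from event IDs to the set of trigger outcomes ("good", "missed") recorded for each event:

```python
from collections import Counter

def summarize_triggers(events):
    """Tally events by EEW trigger outcome.

    `events` maps event IDs to the set of trigger outcomes
    ("good", "missed") for that event -- a hypothetical format,
    not the actual CTC data model.
    """
    counts = Counter()
    for outcomes in events.values():
        counts["total"] += 1
        if not outcomes:
            continue  # no EEW trigger for this event
        counts["triggered"] += 1
        if outcomes == {"good"}:
            counts["only_good"] += 1
        elif outcomes == {"missed"}:
            counts["only_missed"] += 1
        else:
            counts["both"] += 1
    return counts

# Toy example: four events, one without any EEW trigger.
demo = {
    "ev1": {"good"},
    "ev2": {"missed"},
    "ev3": {"good", "missed"},
    "ev4": set(),
}
summary = summarize_triggers(demo)
```

The same one-pass tally scales directly to the 146-event yearly summary.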

5 CTC Progress in 2010

6 CISN Testing Center (CTC) Forecast Evaluation Processing System
[Diagram: ShakeAlert earthquake parameter and ground motion forecasts flow from the CISN Decision and User Modules into the CTC. Ground motion observations are retrieved from the ShakeMap RSS feed, and the ANSS earthquake catalog is filtered to produce an observed catalog of earthquake parameter and ground motion data. Evaluation tests compare the forecasts against these observations, and results are loaded as reports on the CISN EEW Testing Center (CTC) web site.]
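The processing flow on this slide can be sketched as one nightly cycle: retrieve observations, filter the catalog, evaluate each event, load a report. Everything below (function names, data shapes) is a hypothetical stand-in for the real CSEP components:

```python
def run_nightly_cycle(retrieve_observations, retrieve_catalog,
                      min_magnitude, evaluate, load_report):
    """One pass of the evaluation cycle sketched in the diagram.
    All callables are hypothetical stand-ins for CSEP components."""
    observations = retrieve_observations()                      # ShakeMap RSS
    catalog = [ev for ev in retrieve_catalog()                  # ANSS catalog,
               if ev["mag"] >= min_magnitude]                   # filtered
    results = [evaluate(ev, observations) for ev in catalog]    # tests
    load_report(results)                                        # CTC web site
    return results

# Toy stand-ins for the real data sources and evaluation test.
reports = []
results = run_nightly_cycle(
    retrieve_observations=lambda: {"ev1": 3.2},   # PGV by event (cm/s)
    retrieve_catalog=lambda: [{"id": "ev1", "mag": 4.4},
                              {"id": "ev2", "mag": 2.9}],
    min_magnitude=3.0,
    evaluate=lambda ev, obs: (ev["id"], obs.get(ev["id"])),
    load_report=reports.append,
)
```

The sub-threshold event is filtered out before evaluation, mirroring the catalog-filtering stage in the diagram.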

7

8

9 CTC Progress in 2010

10

11 CTC Progress in 2010

12 CTC Progress in 2010
The current ShakeAlert CTC retrieves ShakeMap RSS data and plots observations for all M3.0+ earthquakes in the California Testing Region, as shown (left).

13 CTC Progress in 2010

14 CISN Testing Center (CTC) Forecast Evaluation Processing System

15 CTC Progress in 2010

16 CTC Progress in 2010
The initial CTC evaluation test is defined in the 2008 CISN EEW Testing Document (as updated July 2010). The previous Algorithm Testing Center did not implement this summary. Access to ShakeMap RSS ground motion observations makes an automated implementation practical.

17 Related EEW Activities
Caltech helped us with EEW analysis for the SCEC M8 scenario earthquake simulation, a north-to-south (Cholame to Bombay Beach) rupture. The trigger station list and warning times for each station are posted on a web page.
Caltech Civil Engineering helped us with building response studies for this event using the Caltech Frame3D system, producing animations of an 18-story steel frame building at various sites in California.
A Visual Guide to the Modified Mercalli Intensity Scale, using YouTube videos as examples of MMI levels, has been posted.

18 Scientific and Technical Coordination Issues
Prioritize the forecast evaluation tests to be implemented in the CTC.
Coordinate ShakeAlert information transfer to the CTC.
SCEC science planning of EEW forecast evaluation experiments.
Use of EEW in time-dependent PSHA information.
Consider extending the ShakeMap format as a CAP-based forecast exchange format. Send forecast information (and time of report) to produce: ShakeMap intensity maps; ShakeMap uncertainty maps.
Consider ShakeAlert interfaces to support comparative EEW performance tests. Provide access to information for each trigger: stations used in the trigger; stations available when declaring the trigger; software version declaring the trigger.

19 End

20 Proposed CTC Evaluation Tests

21 Design of an Experiment
Rigorous CISN EEW testing will involve the following definitions:
Define a forecast
Define the testing area
Define the input data used in forecasts
Define the reference observation data
Define the measures of success for forecasts
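The five definitions above could be captured as a structured experiment specification. A hypothetical sketch (all field names and example values are illustrative, not the CTC's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class EEWExperiment:
    """Hypothetical container for the five definitions a rigorous
    CISN EEW test requires: forecast, region, inputs, reference
    observations, and success measures."""
    forecast_name: str
    testing_region: str
    input_data: list
    reference_data: str
    success_measures: list = field(default_factory=list)

# Illustrative instance only.
exp = EEWExperiment(
    forecast_name="ShakeAlert earthquake parameter forecast",
    testing_region="California Testing Region",
    input_data=["real-time waveform triggers"],
    reference_data="ANSS catalog + ShakeMap RSS observations",
    success_measures=["magnitude X-Y diagram", "cumulative location error"],
)
```

Making the experiment an explicit object keeps each evaluation reproducible: two runs with equal specifications are directly comparable.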

22 Proposed Performance Measures
Summary reports for each M ≥ M-min. The key document is the 3 March 2008 document, which specifies six types of tests:
Summary 1: Magnitude
Summary 2: Location
Summary 3: Ground Motion
Summary 4: System Performance
Summary 5: False Triggers
Summary 6: Missed Triggers

23 Experiment Design Summary 1.1: Magnitude X-Y Diagram
Measure of goodness: data points fall on the diagonal line. Relevant: T2, T3, T4. Drawbacks: the timeliness element is not represented; it is unclear which estimate in the series of magnitude updates should be used in the plot.
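The diagonal-line criterion amounts to checking that per-event residuals (forecast minus observed magnitude) are near zero. A small sketch with illustrative magnitude pairs:

```python
def magnitude_errors(pairs):
    """Per-event residual (forecast - observed) for a magnitude
    X-Y diagram, plus the mean absolute error. Points on the
    diagonal give residual 0. Input pairs are illustrative."""
    residuals = [round(f - o, 2) for o, f in pairs]
    mae = sum(abs(r) for r in residuals) / len(residuals)
    return residuals, mae

# (observed, forecast) magnitude pairs - toy values only.
residuals, mae = magnitude_errors([(4.1, 4.3), (5.0, 4.8), (4.5, 4.5)])
```

A choice still has to be made, as noted above, about which update in the estimate series supplies the forecast magnitude.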

24 Experiment Design Summary 1.2: Initial magnitude error by magnitude
Measure of goodness: data points fall on a horizontal line. Relevant: T2, T3, T4. Drawbacks: the timeliness element is not represented.

25 Experiment Design Summary 1.3: Magnitude accuracy by update
Measure of goodness: data points fall on a horizontal line. Relevant: T3, T4. Drawbacks: the timeliness element is not represented.

26 Proposed Performance Measures

27 Experiment Design Summary 2.1: Cumulative Location Errors
Measure of goodness: data points fall on the vertical zero line. Relevant: T3, T4. Drawbacks: does not consider magnitude accuracy or timeliness.
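Cumulative location errors require a distance between the forecast and observed epicenters. A standard great-circle (haversine) sketch; the coordinates in the example are arbitrary:

```python
import math

def epicentral_error_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in km between forecast
    and observed epicenters."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Identical points give zero error; one degree of latitude is ~111 km.
zero = epicentral_error_km(34.0, -118.0, 34.0, -118.0)
one_degree = epicentral_error_km(34.0, -118.0, 35.0, -118.0)
```

The cumulative plot is then just the distribution of these per-event distances.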

28 Experiment Design Summary 2.2: Magnitude and Location Error by Time After Origin
Measure of goodness: data points fall on the horizontal zero line. Relevant: T3, T4. Drawbacks: event-specific, not cumulative.

29 Proposed Performance Measures

30 Experiment Design Summary 3.1 : Intensity Map Comparisons
Measure of goodness: the forecast map matches the observed map. Relevant: T4. Drawbacks: not a quantitative result.

31 Experiment Design Summary 3.2: Intensity X-Y Diagram
Measure of goodness: data points fall on the diagonal line. Relevant: T1, T2, T4. Drawbacks: the timeliness element is not represented; it is unclear which intensity estimate in the series should be used in the plots (T3).

32 Experiment Design Summary 3.3: Intensity Ratio by Magnitude
Measure of goodness: data points fall on a horizontal line. Relevant: T1, T2, T4. Drawbacks: the timeliness element is not represented; it is unclear which intensity estimate in the series should be used in the plot. [Note: add to each slide whether the report is cumulative or per-event.]

33 Experiment Design Summary 3.3: Predicted to Observed Intensity Ratio by Distance and Magnitude
Measure of goodness: data points fall on a horizontal line. Relevant: T1, T2, T4. Drawbacks: the timeliness element is not represented; it is unclear which intensity estimate in the series should be used in the plot. [Note: add to each slide whether the report is cumulative or per-event.]

34 Summary 3.3: Evaluate Conversion from PGV to Intensity
The group has proposed to evaluate algorithms by comparing intensities, and provides a formula for converting PGV to intensity. [Note: add to each slide whether the report is cumulative or per-event.]
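The presentation does not reproduce the group's conversion formula. As an illustration only, here is one widely used PGV-to-MMI regression (Wald et al., 1999; PGV in cm/s), which is not necessarily the formula the group proposed:

```python
import math

def pgv_to_mmi(pgv_cm_s):
    """Convert peak ground velocity (cm/s) to Modified Mercalli
    Intensity via the Wald et al. (1999) regression,
    MMI = 3.47 * log10(PGV) + 2.35 (valid roughly MMI V-IX).
    Shown as an illustrative conversion only; the testing group's
    actual formula is not given in this presentation."""
    return 3.47 * math.log10(pgv_cm_s) + 2.35

# Example: PGV of 10 cm/s maps to roughly MMI VI.
mmi = pgv_to_mmi(10.0)
```

Whatever relation is adopted, applying the same conversion to forecast and observed PGV keeps the intensity comparison internally consistent.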

35 Summary 3.4: Evaluate Conversion from PGV to Intensity
The group has proposed to evaluate algorithms by comparing intensities, and provides a formula for converting PGV to intensity. [Note: add to each slide whether the report is cumulative or per-event.]

36 Experiment Design Summary 3.5: Statistical Error Distribution for Magnitude and Intensity
Measure of goodness: no missed events or false alarms in the testing area. Relevant: T4. Drawbacks:

37 Experiment Design Summary 3.6: Mean Time to First Location or Intensity Estimate (small blue plot)
Measure of goodness: peak of measures at zero. Relevant: T1, T2, T3, T4. Drawbacks: cumulative, and does not involve the accuracy of the estimates.
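The mean-time-to-first-estimate summary reduces to simple latency statistics over events. A sketch with illustrative latencies (seconds from origin time to first parameter estimate):

```python
def latency_stats(latencies_s):
    """Mean and median time (seconds) from origin to first
    location or intensity estimate, for the latency summary.
    Input latencies are illustrative values."""
    xs = sorted(latencies_s)
    n = len(xs)
    mean = sum(xs) / n
    median = xs[n // 2] if n % 2 else (xs[n // 2 - 1] + xs[n // 2]) / 2
    return mean, median

stats = latency_stats([4.0, 6.0, 5.0, 9.0])  # (6.0, 5.5)
```

As the drawback above notes, these numbers say nothing about whether the first estimate was accurate, only how quickly it arrived.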

38 Proposed Performance Measures

39 Experiment Design
No examples are defined for the System Performance Summary (Summary 4.1: ratio of reporting versus non-reporting stations).

40 Proposed Performance Measures

41 Experiment Design Summary 5.1: Missed event and False Alarm Map
Measure of goodness: no missed events or false alarms in the testing area. Relevant: T3, T4. Drawbacks: definitions for missed events and false alarms must be developed; does not reflect timeliness.
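Once definitions for missed events and false alarms exist, the map's underlying classification is a partition of events into hits, misses, and false alarms. A sketch using simple event-ID matching as a placeholder definition (the real definition would need a magnitude threshold and an association window in time and space):

```python
def classify_alerts(observed_ids, alerted_ids):
    """Partition events into hits, missed events, and false alarms.
    Uses plain ID matching as a placeholder; actual missed-event
    and false-alarm definitions are still to be developed."""
    observed, alerted = set(observed_ids), set(alerted_ids)
    hits = observed & alerted          # observed and alerted
    missed = observed - alerted        # observed, never alerted
    false_alarms = alerted - observed  # alerted, no matching event
    return hits, missed, false_alarms

# Toy catalogs: event "a" is missed, alert "x" is a false alarm.
hits, missed, false_alarms = classify_alerts(["a", "b", "c"],
                                             ["b", "c", "x"])
```

The map then plots each category at its epicenter, so clusters of misses or false alarms reveal spatial weaknesses in network coverage.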

42 Experiment Design Summary 5.2: Missed event and False Alarm Map
Measure of goodness: no missed events or false alarms in the testing area. Relevant: T3, T4. Drawbacks: definitions for missed events and false alarms must be developed; does not reflect timeliness.

43 Proposed Performance Measures

44 Experiment Design Summary 6.1: Missed Event map
Measure of goodness: no missed events in the testing region. Relevant: T3, T4. Drawbacks: a missed event must be defined; does not indicate timeliness.

45 End

