Application of Forecast Verification Science to Operational River Forecasting in the National Weather Service Julie Demargne, James Brown, Yuqiong Liu.


Application of Forecast Verification Science to Operational River Forecasting in the National Weather Service. Julie Demargne, James Brown, Yuqiong Liu, and D.-J. Seo. NROW, November 4-5, 2009, UCAR.

2 Approach to river forecasting. [Diagram: Observations and Models feed Input forecasts; Forecasters turn these into Forecast products for Users.]

3 Where is the verification? In the past: limited verification of hydrologic forecasts. How good are the forecasts for application X?

4 Where is the verification? Now: verification experts, verification systems, verification products, and papers.

5 Hydrologic forecasting: a multi-scale problem. Scales range from the national level to major river systems, river basins with river forecast points, forecast groups, headwater basins with radar rainfall grids, and high-resolution flash flood basins. Hydrologic forecasts must be verified consistently across all spatial scales and resolutions.

6 Hydrologic forecasting: a multi-scale problem. Seamless probabilistic water forecasts are required for all lead times and all users; so is verification information. [Diagram: forecast uncertainty and benefits as a function of lead time, from minutes to hours, days, weeks, months, seasons, and years, for uses including protection of life and property, flood mitigation and navigation, reservoir control, hydropower, agriculture, commerce, health, recreation, ecosystems, the environment, and state/local planning.]

7 Need for hydrologic forecast verification. In 2006, the NRC recommended that the NWS expand verification of its uncertainty products and make the results easily available to all users in near real time. Users decide whether to take action through risk-based decisions, so users must be educated on how to interpret forecast and verification information.

8 River forecast verification service. Reports: NWS-Hydrologic-Forecast-Verification-Team_Final-report_Sep09.pdf; rfcdev/docs/Final_Verification_Report.pdf

9 River forecast verification service. To help us answer:
- How good are the forecasts for application X?
- What are the strengths and weaknesses of the forecasts?
- What are the sources of error and uncertainty in the forecasts?
- How are new science and technology improving the forecasts and the verifying observations?
- What should be done to improve the forecasts?
- Do forecasts help users in their decision making?

10 River forecast verification service. [Diagram: verification systems and verification products added alongside the river forecasting system, linking observations, models, input forecasts, forecast products, forecasters, and users.]

11 River forecast verification service. A Verification Service within the Community Hydrologic Prediction System (CHPS) to:
- compute metrics
- display data and metrics
- disseminate data and metrics
- provide real-time access to metrics
- analyze uncertainty and error in forecasts
- track performance

12 Verification challenges.
- Verification is useful if the information generated leads to decisions about the forecast or system being verified: verification needs to be user oriented.
- No single verification measure provides complete information about the quality of a forecast product: several verification metrics and products are needed.
- To facilitate communication of forecast quality, common verification practices and products are needed across weather, climate, and water forecasts: collaborations between the meteorology and hydrology communities are needed (e.g., Thorpex-Hydro, HEPEX).

13 Verification challenges: two classes of verification.
- Diagnostic verification: to diagnose and improve model performance; done off-line with archived forecasts or hindcasts to analyze forecast quality relative to different conditions and processes.
- Real-time verification: to help forecasters and users make decisions in real time; done in real time (before the verifying observation occurs) using information from historical analogs and/or past forecasts and verifying observations under similar conditions.

14 Diagnostic verification products. Key verification metrics at four levels of information for single-valued and probabilistic forecasts:
1. Observation-forecast comparisons (scatter plots, box plots, time series plots)
2. Summary verification (e.g., MAE/mean CRPS, skill score)
3. More detailed verification (e.g., measures of reliability, resolution, discrimination, correlation, results for specific conditions)
4. Sophisticated verification (e.g., for specific events with ROC)
To be evaluated by forecasters and forecast users.
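As an illustration of the level-2 summary metrics, MAE and the mean CRPS can be computed from archived forecast-observation pairs. Below is a minimal Python sketch (illustrative only, not the operational IVP/EVS implementation), using the energy form of the CRPS, CRPS = E|X - y| - 0.5 E|X - X'|, for a finite ensemble:

```python
import numpy as np

def mae(forecasts, observations):
    """Mean absolute error of single-valued forecasts."""
    forecasts = np.asarray(forecasts, dtype=float)
    observations = np.asarray(observations, dtype=float)
    return float(np.mean(np.abs(forecasts - observations)))

def mean_crps(ensembles, observations):
    """Mean CRPS of ensemble forecasts, averaged over forecast cases,
    using the energy form CRPS = E|X - y| - 0.5 E|X - X'|."""
    total = 0.0
    for ens, y in zip(ensembles, observations):
        ens = np.asarray(ens, dtype=float)
        term1 = np.mean(np.abs(ens - y))                      # E|X - y|
        term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))  # 0.5 E|X - X'|
        total += term1 - term2
    return total / len(observations)
```

For a single-member "ensemble" the CRPS reduces to the absolute error, which is one reason MAE and mean CRPS are often presented side by side for single-valued and ensemble forecasts.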

15 Diagnostic verification products. Examples for level 1: scatter plot and box-and-whiskers plot. [Plot: forecast value against observed value, with a user-defined threshold.]

16 Diagnostic verification products. Example for level 1: box-and-whiskers plot of forecast errors (forecast minus observed, in mm) against observed daily total precipitation (mm), for 24-hr precipitation ensembles at lead day 1 on the American River in California. Each box-and-whiskers summarizes the 'errors' for one forecast (minimum, 10%, 20%, median, 80%, 90%, maximum); the zero-error line separates low bias from high bias and highlights "blown" forecasts.

17 Diagnostic verification products. Examples for level 2: skill score maps by month (January, April, October); the smaller the score, the better.
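Skill scores like those mapped above are typically computed against a reference forecast such as climatology or persistence. A minimal sketch of the common convention SS = 1 - S/S_ref for a negatively oriented metric (e.g., MAE or mean CRPS); note that the orientation of the quantity shown in a particular display may differ:

```python
def skill_score(score_forecast, score_reference):
    """Generic skill score for a negatively oriented metric (e.g., MAE,
    mean CRPS): 1 is perfect, 0 matches the reference (e.g., climatology),
    and negative values are worse than the reference."""
    return 1.0 - score_forecast / score_reference
```

Because it is normalized by the reference, the skill score allows results to be compared and aggregated across basins with very different flow regimes.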

18 Diagnostic verification products. Examples for level 3: more detailed plots. [Plots: a score under different conditions, and performance for different months.]

19 Diagnostic verification products. Examples for level 4: event-specific plots, for the event "exceeds the 85th percentile of the observed distribution". [Plots: ROC curve of probability of detection (POD) against probability of false detection (POFD), showing discrimination; reliability diagram of observed frequency against predicted probability, with the diagonal indicating perfect reliability.]
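The ROC curve for such an event is built by sweeping a probability threshold over the forecast probabilities and computing POD and POFD from the resulting contingency table at each threshold. A minimal sketch (function name is illustrative; assumes the sample contains both events and non-events):

```python
import numpy as np

def roc_points(event_probs, event_obs, prob_thresholds):
    """(POFD, POD) pairs for an ROC curve. event_probs are forecast
    probabilities of the event; event_obs are 1 where the event
    occurred and 0 where it did not."""
    probs = np.asarray(event_probs, dtype=float)
    obs = np.asarray(event_obs, dtype=int)
    points = []
    for t in prob_thresholds:
        warned = probs >= t  # forecast "yes" at this probability threshold
        hits = int(np.sum(warned & (obs == 1)))
        misses = int(np.sum(~warned & (obs == 1)))
        false_alarms = int(np.sum(warned & (obs == 0)))
        correct_negatives = int(np.sum(~warned & (obs == 0)))
        pod = hits / (hits + misses)                              # probability of detection
        pofd = false_alarms / (false_alarms + correct_negatives)  # probability of false detection
        points.append((pofd, pod))
    return points
```

A forecast system with good discrimination produces points well above the POFD = POD diagonal.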

20 Diagnostic verification products. Example for level 4: user-friendly spread-bias plot. 60% of the time, the observation should fall in the window covering the middle 60% of the ensemble (i.e., the median ±30%). [Plot: "hit rate" against window width, with the diagonal indicating a perfect ensemble; labels include "Underspread", "Hit rate" = 90%, 60%, and "Perfect".]
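The hit rate behind a spread-bias plot can be estimated by checking, for each forecast case, whether the observation falls inside the central ensemble interval of a given nominal coverage. A minimal sketch under that interpretation (function name is illustrative):

```python
import numpy as np

def coverage_hit_rate(ensembles, observations, coverage=0.6):
    """Fraction of cases in which the observation falls inside the central
    interval covering `coverage` of the ensemble (e.g., median +/-30% for
    coverage=0.6). For a reliable ensemble the hit rate should match
    `coverage`; repeating this over several windows gives the plot."""
    lo_q = 50.0 - 50.0 * coverage  # lower percentile of the window
    hi_q = 50.0 + 50.0 * coverage  # upper percentile of the window
    hits = 0
    for ens, y in zip(ensembles, observations):
        lo, hi = np.percentile(ens, [lo_q, hi_q])
        if lo <= y <= hi:
            hits += 1
    return hits / len(observations)
```

A hit rate consistently above the nominal coverage indicates an overspread ensemble, and one consistently below it an underspread ensemble.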

21 Diagnostic verification analyses.
- Analyze any new forecast process with verification.
- Use different temporal aggregations: analyze verification statistics as a function of lead time; if performance is similar across lead times, the data can be pooled.
- Perform spatial aggregation carefully: analyze results for each basin and plot them on spatial maps; use normalized metrics (e.g., skill scores); aggregate verification results across basins with similar hydrologic processes (e.g., by response time).
- Report verification scores with sample size; in the future, with confidence intervals.

22 Diagnostic verification analyses.
- Evaluate forecast performance under different conditions: with time conditioning (by month, by season); with atmospheric/hydrologic conditioning (low/high probability thresholds, absolute thresholds such as PoP or Flood Stage). Check that the sample size is not too small.
- Analyze sources of uncertainty and error: verify the forcing input forecasts as well as the output forecasts; for extreme events, verify both stage and flow.
- Sensitivity analyses to be set up at all RFCs: 1) what is the optimized QPF horizon for hydrologic forecasts? 2) do run-time modifications made on the fly improve the forecasts?
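Conditional verification of this kind amounts to stratifying the forecast-observation pairs before scoring and always reporting the sample size alongside each score, so that small subsamples are treated with caution. A minimal sketch for time conditioning by month (function name is illustrative):

```python
from collections import defaultdict
import numpy as np

def mae_by_month(months, forecasts, observations):
    """MAE stratified by calendar month (1-12), reported together with
    the sample size of each stratum as (mae, n) pairs."""
    groups = defaultdict(list)
    for m, f, o in zip(months, forecasts, observations):
        groups[m].append(abs(f - o))  # absolute error for this case
    return {m: (float(np.mean(errs)), len(errs))
            for m, errs in sorted(groups.items())}
```

The same pattern applies to conditioning on hydrologic thresholds: replace the month key with, e.g., an above/below Flood Stage flag.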

23 Diagnostic verification software. The Interactive Verification Program (IVP), developed at OHD, verifies single-valued forecasts at given locations/areas.

24 Diagnostic verification software. The Ensemble Verification System (EVS), developed at OHD, verifies ensemble forecasts at given locations/areas.

25 Dissemination of diagnostic verification. Example: the WR water supply website, with data visualization of error (MAE, RMSE, conditional on lead time and year), skill (skill relative to climatology, conditional), and categorical measures (FAR, POD, contingency table, based on climatology or user-definable thresholds).

26 Dissemination of diagnostic verification. Example: the OHRFC bubble plot available online.

27 Real-time verification. How good could the 'live' forecast be? [Plot: live forecast alongside observations.]

28 Real-time verification. Select analogs from a pre-defined set of historical events and compare them with the 'live' forecast. [Diagram: live forecast with analog forecasts 1-3 and their verifying observations; conclusion: "Live forecast for flood is likely to be too high."]

29 Real-time verification. Adjust the 'live' forecast based on information from the historical analogs. [Diagram: the live forecast against what happened; the live forecast was too high.]

30 Real-time verification. Example for ensemble forecasts. [Plot: temperature (°F) against forecast lead day, showing the live forecast (L), analog forecasts (H) selected so that μH = μL ± 1.0°C, and the analog observations; conclusion: "Day 1 forecast is probably too high."]
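The analog-selection step in this example can be sketched as follows: keep the past ensemble forecasts whose ensemble mean lies within a tolerance of the live ensemble mean, then summarize the errors of those analogs as guidance on the likely error of the live forecast. This is an illustrative sketch of the idea, not the operational method; the function name and tolerance are hypothetical:

```python
import numpy as np

def select_analogs(live_ensemble, past_forecasts, past_observations, tol):
    """Mean error (forecast ensemble mean minus verifying observation) of
    the historical analogs whose ensemble mean is within `tol` of the live
    forecast's ensemble mean; returns None if no analog qualifies."""
    live_mean = float(np.mean(live_ensemble))
    errors = []
    for ens, obs in zip(past_forecasts, past_observations):
        m = float(np.mean(ens))
        if abs(m - live_mean) <= tol:
            errors.append(m - obs)  # error of this analog forecast
    if not errors:
        return None  # no analogs found under this tolerance
    return float(np.mean(errors))  # > 0 suggests the live forecast may be too high
```

A consistently positive analog error, as in the slide's example, supports the statement "Day 1 forecast is probably too high."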

31 Real-time verification. Build an analog query prototype using multiple criteria:
- Seeking analogs for precipitation: "Give me past forecasts for the 10 largest events relative to hurricanes for this basin."
- Seeking analogs for temperature: "Give me all past forecasts with lead time 12 hours whose ensemble mean was within 5% of the live ensemble mean."
- Seeking analogs for flow: "Give me all past forecasts with lead times of hours whose probability of flooding was >= 0.95, where the basin-averaged soil moisture was > x and the immediately prior observed flow exceeded y at the forecast issue time."
This requires forecasters' input!

32 Outstanding science issues.
- Define meaningful reference forecasts for skill scores.
- Separate timing error and amplitude error in forecasts.
- Verify rare events and specify sampling uncertainty in metrics.
- Analyze sources of uncertainty and error in forecasts.
- Consistently verify forecasts on multiple space and time scales.
- Verify multivariate forecasts (issued at multiple locations and for multiple time steps) by accounting for statistical dependencies.
- Account for observational error (measurement and representativeness errors) and rating curve error.
- Account for non-stationarity (e.g., climate change).

33 Verification service development. Participants: OHD, OCWWS, NCEP, forecasters, users, academia, forecast agencies, and the private sector. Activities: the OHD-NCEP Thorpex-Hydro project; the HEPEX Verification Test Bed (CMC, Hydro-Quebec, ECMWF); the OHD-Deltares collaboration for CHPS enhancements; the COMET-OHD-OCWWS collaboration on training.

34 Looking ahead. 2012: information on the quality of the forecast service available online; real-time and diagnostic verification implemented in CHPS; RFC verification standard products available online along with the forecasts. 2015: leveraging grid-based verification tools.

35 Thank you. Questions?

36 Extra slide

37 Diagnostic verification products. Key verification metrics from the NWS Verification Team report.