Verification of Precipitation Areas Beth Ebert Bureau of Meteorology Research Centre Melbourne, Australia

Outline 1. “Eyeball” verification - use of maps 2. QPF verification using gridpoint match-ups 3. Space-time verification of pooled data 4. Entity-based (rain “blob”) verification 5. Summary

1. “Eyeball” verification - some examples. (Figure: accumulated rain over eastern Germany and western Poland, 4-8 July 1997.)

WWRP Sydney 2000 Forecast Demonstration Project

RAINVAL - Operational verification of NWP QPFs

2. QPF verification using (grid)point match-ups
All verification statistics can be applied to spatial estimates when they are treated as a matched set of forecasts and observations at a set of individual points!
Method 1: Analyze observations onto a grid. (Figure: observed and forecast gridded fields.)

Method 2: Interpolate the model forecast to the station locations. (Figure: observed and forecast values at stations.) Q: Which verification approach is better? A: It depends!
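As a rough illustration of Method 2, the sketch below extracts the model forecast at each station using the nearest grid point; the array and function names are hypothetical, and an operational system would typically use bilinear or more sophisticated interpolation.

```python
import numpy as np

def forecast_at_stations(fcst_grid, grid_lats, grid_lons, stn_lats, stn_lons):
    """Extract a 2-D forecast field at station locations via the nearest grid point."""
    values = np.empty(len(stn_lats))
    for k, (slat, slon) in enumerate(zip(stn_lats, stn_lons)):
        i = int(np.abs(np.asarray(grid_lats) - slat).argmin())  # nearest latitude row
        j = int(np.abs(np.asarray(grid_lons) - slon).argmin())  # nearest longitude column
        values[k] = fcst_grid[i, j]
    return values
```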

Arguments in favor of the grid:
– point observations may not represent rain over the local area
– a gridded analysis of observations better represents the grid-scale values that a model predicts
– spatially uniform sampling
→ use to verify gridded forecasts
Arguments in favor of station locations:
– observations are “pure” (not smoothed or interpolated)
→ use to verify forecasts at point locations or sets of point locations
Note: verification scores improve with increasing scale!

Preparation of gridded (rain gauge) verification data:
– real time vs. non-real time
– quality control to eliminate bad data
– mapping procedure: simple gridbox average, or objective analysis (Barnes, statistical interpolation, kriging, splines, etc.)
– map observations to the model grid; for model intercomparison, map all data to a common grid
– uncertainty in gridbox values
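A minimal sketch of the simplest mapping option, a plain gridbox average of gauge reports onto the model grid; the array names and grid-edge conventions are assumptions, and a real analysis would add quality control and possibly an objective analysis step.

```python
import numpy as np

def gridbox_average(obs_vals, obs_lats, obs_lons, lat_edges, lon_edges):
    """Average gauge observations falling in each grid box; NaN where no gauges report."""
    nlat, nlon = len(lat_edges) - 1, len(lon_edges) - 1
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    for val, lat, lon in zip(obs_vals, obs_lats, obs_lons):
        i = np.searchsorted(lat_edges, lat, side="right") - 1
        j = np.searchsorted(lon_edges, lon, side="right") - 1
        if 0 <= i < nlat and 0 <= j < nlon:
            total[i, j] += val
            count[i, j] += 1
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)
```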

Continuous statistics quantify errors in forecast rain amount
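For matched forecast/observation pairs, a minimal numpy sketch of the usual continuous scores (the function name and the particular selection of scores are illustrative):

```python
import numpy as np

def continuous_stats(fcst, obs):
    """Continuous verification statistics for matched forecast/observation rain amounts."""
    fcst, obs = np.asarray(fcst, float), np.asarray(obs, float)
    err = fcst - obs
    return {
        "mean_error": err.mean(),                     # bias in rain amount
        "mae": np.abs(err).mean(),                    # mean absolute error
        "rmse": np.sqrt((err ** 2).mean()),           # root mean square error
        "correlation": np.corrcoef(fcst, obs)[0, 1],  # linear association
    }
```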

Categorical statistics quantify errors in forecast rain occurrence
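A minimal sketch of the corresponding categorical scores from a 2x2 contingency table at a chosen rain/no-rain threshold (names are illustrative; it assumes at least one forecast and one observed event so the denominators are non-zero). Repeating the calculation for thresholds from light to heavy gives curves like the equitable threat score and bias score shown on the next slide.

```python
import numpy as np

def categorical_stats(fcst, obs, threshold=1.0):
    """Rain-occurrence scores at a given threshold (e.g. mm/day) from matched pairs."""
    f = np.asarray(fcst) >= threshold
    o = np.asarray(obs) >= threshold
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    correct_negatives = np.sum(~f & ~o)
    n = hits + false_alarms + misses + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / n  # expected hits by chance
    return {
        "bias_score": (hits + false_alarms) / (hits + misses),
        "pod": hits / (hits + misses),
        "far": false_alarms / (hits + false_alarms),
        "ets": (hits - hits_random) / (hits + misses + false_alarms - hits_random),
    }
```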

Verification of QPFs from NWP models: vary the rain threshold from light to heavy. (Figures: equitable threat score and bias score as a function of threshold.)

Verification of NWP QPFs over Germany. (Figures: equitable threat score with respect to chance and with respect to persistence.)

Verification of nowcasts in the Sydney 2000 FDP. (Figure: nowcast compared with persistence.)

3. Space-time QPF verification
(a) Pool forecasts and observations in SPACE AND TIME → summary statistics

Caution: results may mask regional and/or seasonal differences. (Figure: model performance in the Australian tropics, annual vs. winter vs. summer.)

(b) Pool forecasts and observations in TIME but NOT SPACE → maps of temporal statistics. (Figure: map of bias score, June 1995 - November 1996, with no-data regions indicated.)

(c) Pool forecasts and observations in SPACE but NOT TIME → time series of spatial statistics. (Figure: observed and LAPS 24 h, 36 h, and 48 h forecast rain, 1-30 April 2001, Australian region.)
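The three pooling choices (a)-(c) differ only in which axes are averaged over. A minimal numpy sketch, assuming hypothetical forecast and observed rain arrays of shape (time, lat, lon):

```python
import numpy as np

def pooled_rmse(fcst, obs):
    """fcst, obs: rain arrays of shape (ntime, nlat, nlon); NaN marks missing data."""
    sqerr = (fcst - obs) ** 2
    rmse_all = np.sqrt(np.nanmean(sqerr))                  # (a) pool in space and time
    rmse_map = np.sqrt(np.nanmean(sqerr, axis=0))          # (b) pool in time -> map
    rmse_series = np.sqrt(np.nanmean(sqerr, axis=(1, 2)))  # (c) pool in space -> time series
    return rmse_all, rmse_map, rmse_series
```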

Limitations of QPF verification using (grid)point match-ups:
– Seemingly good verification statistics may result from compensating errors: too much rain in one part of the domain offset by too little rain in another, or interseasonal rainfall variation captured while shorter-period variation is not.
– Conservative forecasts are rewarded.
– Some rain forecasts look quite good except for the location of the system; unfortunately, traditional verification statistics severely penalize these cases.

4. Entity-based QPF verification (rain “blobs”)
Verify the properties of the forecast rain system against the properties of the observed rain system: location, rain area, and rain intensity (mean and maximum). (Figure: observed and forecast rain systems.)

Define a rain entity by a Contiguous Rain Area (CRA): a region bounded by a user-specified isohyet. Some possible choices of CRA threshold:
– 1 mm/day: ~ all rain in the system
– 5 mm/day: “important” rain
– 20 mm/day: rain center
(Figure: observed and forecast CRAs.)
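One way to identify CRAs in practice is connected-component labelling of the region where the forecast and/or observed field exceeds the chosen isohyet; the sketch below uses scipy for this, with hypothetical array names.

```python
import numpy as np
from scipy import ndimage

def find_cras(fcst, obs, isohyet=5.0):
    """Label Contiguous Rain Areas: connected regions where the forecast and/or
    observed rain exceeds the chosen isohyet (mm/day)."""
    mask = (np.asarray(fcst) >= isohyet) | (np.asarray(obs) >= isohyet)
    labels, n_cras = ndimage.label(mask)  # each CRA gets an integer label 1..n_cras
    return labels, n_cras
```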

Determining the location error: horizontally translate the QPF until the total squared error between the forecast and the analysis (observations) is minimized in the shaded region. The displacement is the vector difference between the original and final locations of the forecast. (Figure: observed and forecast fields; the arrow shows the optimum shift.)
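A brute-force version of the translation search, as a hedged sketch: it shifts the forecast in whole grid points, uses NaN to mask points outside the CRA, and lets np.roll wrap at the domain edge, which a real implementation would handle more carefully.

```python
import numpy as np

def best_shift(fcst, obs, max_shift=10):
    """Find the horizontal translation (in grid points) of the forecast that minimises
    the squared error against the observed analysis inside the CRA (NaN elsewhere)."""
    best_dy, best_dx = 0, 0
    best_mse = np.nanmean((fcst - obs) ** 2)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(fcst, (dy, dx), axis=(0, 1))  # wrap-around is a simplification
            mse = np.nanmean((shifted - obs) ** 2)
            if mse < best_mse:
                best_mse, best_dy, best_dx = mse, dy, dx
    return (best_dy, best_dx), best_mse
```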

CRA error decomposition. The total mean squared error (MSE) can be written as:
MSE_total = MSE_displacement + MSE_volume + MSE_pattern
The difference between the mean squared error before and after translation is the contribution to the total error due to displacement,
MSE_displacement = MSE_total − MSE_shifted
The error component due to volume represents the bias in mean intensity,
MSE_volume = (F̄ − X̄)²
where F̄ and X̄ are the CRA mean forecast and observed values after the shift. The pattern error accounts for differences in the fine structure of the forecast and observed fields,
MSE_pattern = MSE_shifted − MSE_volume
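The decomposition translates directly into code; a minimal sketch operating on a single CRA (arrays masked with NaN outside the CRA, and the shifted forecast taken from the translation step above):

```python
import numpy as np

def cra_error_decomposition(fcst, obs, fcst_shifted):
    """Split the CRA mean squared error into displacement, volume and pattern terms."""
    mse_total = np.nanmean((fcst - obs) ** 2)
    mse_shifted = np.nanmean((fcst_shifted - obs) ** 2)
    mse_displacement = mse_total - mse_shifted
    mse_volume = (np.nanmean(fcst_shifted) - np.nanmean(obs)) ** 2  # bias in mean intensity
    mse_pattern = mse_shifted - mse_volume
    return {"displacement": mse_displacement, "volume": mse_volume, "pattern": mse_pattern}
```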

Example: Nowcasts from Sydney 2000 FDP

Example: Australian regional NWP model

(Figures: rain area and mean rain intensity, for the regions north and south of 25°S.)

(Figures: maximum rain intensity and rain volume.)

Displacement error

Event forecast classification. The two most important aspects of a “useful” QPF:
– the location of the predicted rain must be close to the observed location
– the predicted maximum rain rate must be “in the ballpark”

Example: proposed event forecast criteria for 24 h NWP QPFs
– Good location: the forecast rain system must be within 2° lat/lon or one effective radius of the rain system, but not farther than 5° from the observed location.
– Good intensity: the maximum rain rate must be within one category of the observed maximum (using rain categories of 1-2, 2-5, 5-10, 10-25, 25-50, 50-100, 100-150, 150-200, and >200 mm/day).
Event forecast classification for Australian 24 h QPFs from the BoM regional model, July 1995 - June 1999 (2066 events).
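A sketch of how these criteria might be coded for a single rain event. The category boundaries above 50 mm/day are partly reconstructed, the reading of “2° or one effective radius” as the larger of the two (capped at 5°) is an interpretation, and the returned labels are illustrative rather than the official classification names.

```python
import numpy as np

CATEGORY_EDGES = [1, 2, 5, 10, 25, 50, 100, 150, 200]  # mm/day; upper values assumed

def intensity_category(rain):
    """Index of the rain category containing `rain` (0 = below 1 mm/day)."""
    return int(np.searchsorted(CATEGORY_EDGES, rain, side="right"))

def classify_event(displacement_deg, effective_radius_deg, fcst_max, obs_max):
    """Apply the proposed 'good location' and 'good intensity' checks to one event."""
    tolerance = min(max(2.0, effective_radius_deg), 5.0)  # 2 deg or one radius, at most 5 deg
    good_location = displacement_deg <= tolerance
    good_intensity = abs(intensity_category(fcst_max) - intensity_category(obs_max)) <= 1
    if good_location and good_intensity:
        return "hit"
    if good_location or good_intensity:
        return "partial hit"
    return "miss"
```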

Error decomposition for Australian 24 h QPFs from the BoM regional model, July 1995 - June 1999 (2066 events).

Advantages of entity-based QPF verification:
– intuitive; quantifies “eyeball” verification
– addresses location errors
– allows decomposition of the total error into contributions from location, volume, and pattern errors
– rain event forecasts can be classified as “hits”, “misses”, etc.
– does not reward conservative forecasts
Disadvantages of entity-based verification:
– more than one way to do the pattern matching (i.e., not 100% objective)
– the forecast must resemble the observations sufficiently to enable pattern matching

5. Summary
Spatial QPF success* can be qualitatively and quantitatively measured in many ways, each of which tells only part of the story.
*Note: “success” depends on the requirements of the user!
(Diagram: verification approaches arranged along objective-subjective and point-area axes, trading precision against meaning: (grid)point match-ups, maps, and entities.)