Extracting probabilistic severe weather guidance from convection-allowing model forecasts
Ryan Sobash, 4 December 2009, Convection/NWP Seminar Series



Identification of severe convection in high-resolution models
Today's state-of-the-art NWP models are run at resolutions capable of explicitly representing convection (grid spacing < 4 km). Although these models do not explicitly predict the phenomena responsible for severe reports, the potential exists to produce guidance for these hazards if a relationship exists between the "intensity" of model convection and observed severe weather reports. If this relationship is robust, high-resolution model forecasts could be used to produce automated forecast guidance for convective hazards, just as traditional guidance is produced from environmental parameters.

Previous work – Identifying supercells
Several studies have investigated mining the output of operational models to identify severe convection, primarily supercells. Elmore et al. (2002) identified supercells (using a w-zeta correlation) in an operational high-resolution ensemble to provide forecasters with guidance on storm intensity and longevity. Kain et al. (2008) identified rotating storms in real-time model output during Spring 2005 using an integrated helicity parameter (updraft helicity, UH) to obtain statistics on the number and coverage of modeled supercells. Kain et al. (2008) preferred UH over a correlation parameter (the supercell detection index, SDI); both produced similar results. (Figure: shaded regions where UH > 25 m² s⁻², from Kain et al. 2008.)

Previous work – Identifying supercells
The updraft helicity parameter is a measure of the vertical component of helicity associated with an updraft: UH = ∫ w ζ dz, where w is the vertical velocity and ζ is the vertical vorticity. The quantity is integrated between 2 km and 5 km AGL to identify rotating updrafts in the lower-to-middle troposphere. It is computed on the model grid using a mid-point approximation and is restricted to positive values (i.e., cyclonically rotating updrafts). Could it potentially identify areas of shear not associated with supercells?
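As an illustration of this mid-point computation, here is a minimal Python sketch (the array names, layout, and use of NumPy are assumptions for illustration; the actual diagnostic is computed inside the model code):

    import numpy as np

    def updraft_helicity(w, zeta, z_agl, z_bot=2000.0, z_top=5000.0):
        """Integrate w * zeta between z_bot and z_top AGL with a mid-point rule.

        w, zeta : (nz, ny, nx) vertical velocity [m/s] and vertical vorticity [1/s]
        z_agl   : (nz, ny, nx) model-level height above ground [m]
        Returns a (ny, nx) UH field [m^2/s^2], restricted to positive values.
        """
        uh = np.zeros(w.shape[1:])
        for k in range(w.shape[0] - 1):
            dz = z_agl[k + 1] - z_agl[k]
            z_mid = 0.5 * (z_agl[k] + z_agl[k + 1])
            w_mid = 0.5 * (w[k] + w[k + 1])
            zeta_mid = 0.5 * (zeta[k] + zeta[k + 1])
            # only layers whose mid-point lies in the 2-5 km AGL band contribute
            in_band = (z_mid >= z_bot) & (z_mid < z_top)
            uh += np.where(in_band, w_mid * zeta_mid * dz, 0.0)
        # keep cyclonically rotating updrafts only
        return np.maximum(uh, 0.0)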

Previous work – Spring Experiment 2008
During the 2008 NSSL/SPC Spring Experiment, severe convection was identified in a convection-allowing ensemble to produce guidance for forecasters. Along with UH, other fields associated with severe convection were identified in the model output:
> large updraft helicity
> strong low-level wind speed
> moderately strong low-level winds co-located with linear reflectivity segments
Grid points that met any of these criteria were flagged and accumulated over a 24-hour period (using the 12–36 h model forecasts). These grid points are interpreted as the locations of "surrogate" severe weather reports.

Previous work – Spring Experiment 2008
A Gaussian smoother was applied, following the procedure outlined in Brooks et al. (1998), to produce a "practically perfect" Convective Outlook given a distribution of reports. Subjective interpretations indicated that this technique routinely captured the areas of severe reports.
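As a hedged sketch of that smoother (the exact normalization is given by Brooks et al. 1998 and is not reproduced on the slide), the probability assigned to grid box i can be written as a Gaussian-weighted average of a report indicator o_j:

    p_i = \sum_j \phi_{ij} \, o_j ,
    \qquad
    \phi_{ij} = \frac{\exp\!\left(-d_{ij}^{2} / 2\sigma^{2}\right)}
                     {\sum_k \exp\!\left(-d_{ik}^{2} / 2\sigma^{2}\right)} ,

where o_j = 1 if grid box j contains a report and 0 otherwise, d_ij is the distance between grid boxes i and j, and σ is the smoothing length scale. Because the weights φ_ij sum to 1, p_i lies between 0 and 1, which is what allows the smoothed field to be read as a probability.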

Current Study
An investigation of this guidance concept was undertaken. Although an ensemble was used during SE2008, a deterministic model run was used in this work.
NSSL-WRF configuration:
> WRF-ARW V2.2
> Initialization time: 00 UTC
> Forecast length: 36 hours
> Horizontal resolution: 4 km
> Physics: MYJ boundary layer, WSM6 microphysics
(Figure: model domain.)

Current Study
Five fields were chosen to identify convection in the NSSL-WRF output:
UH: updraft helicity (computed between 2 km and 5 km) [m² s⁻²]
UU: 10 m wind speed [m s⁻¹]
RF: 1 km AGL simulated reflectivity [dBZ]
UP: max. column updraft (below 400 hPa) [m s⁻¹]
DN: max. column downdraft (below 400 hPa) [m s⁻¹]
To capture intra-hourly convective-scale variations, the maximum value of each field within each hour was recorded. Grid points where a field exceeds a severe "threshold" were flagged and are referred to as surrogate severe reports. The focus is on the 12–36 h forecasts.

Current Study
The thresholds were chosen subjectively during SE2008. To provide a more systematic examination, a range of thresholds was selected from each field's frequency distribution during SE2008, near the original subjective thresholds:
UH: 33 to 103 m² s⁻²
UU: 19 to 24 m s⁻¹
RF: 53 to 56 dBZ
UP: 20 to 27 m s⁻¹
DN: -5 to -7 m s⁻¹
The specific thresholds were based on percentiles of each field's distribution (a percentile-based selection is sketched below).
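A minimal sketch of percentile-based threshold selection and surrogate flagging (the file name and percentile values are hypothetical; only the overall approach follows the slide):

    import numpy as np

    # Hypothetical archive of hourly-maximum UH from the SE2008 forecasts,
    # flattened to one sample of grid-point values.
    uh_samples = np.load("se2008_hourly_max_uh.npy").ravel()

    # Candidate thresholds drawn from the far upper tail of the distribution
    # (the percentile choices here are illustrative only).
    candidate_thresholds = np.percentile(uh_samples, [99.90, 99.95, 99.99])

    def flag_surrogates(hourly_max_field, threshold):
        """Boolean grid of surrogate severe reports for one field and one day."""
        return hourly_max_field > threshold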

Current Study
Each day's surrogate reports are placed on an 80 km grid and smoothed using a Gaussian weighting function. Since the Gaussian weights sum to 1, the result can be interpreted as a probabilistic forecast. The final product is a surrogate severe probability forecast (SSPF). This post-processing technique was applied by Theis et al. (2005) to precipitation forecasts to introduce a simple measure of spatial uncertainty into deterministic high-resolution forecasts.
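A minimal sketch of the SSPF construction under these assumptions (aggregating the 4-km surrogate grid to 80-km boxes, then smoothing the resulting 0/1 field with a normalized Gaussian kernel via scipy.ndimage.gaussian_filter; the aggregation rule and array layout are illustrative, not the study's exact code):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def sspf(surrogate_flags, native_dx_km=4.0, coarse_dx_km=80.0, sigma_km=120.0):
        """Surrogate severe probability forecast from one day's boolean flag grid.

        surrogate_flags : (ny, nx) bool on the native 4-km grid, True wherever any
                          field exceeded its severe threshold during the 24-h period.
        """
        factor = int(coarse_dx_km // native_dx_km)          # e.g. 80 / 4 = 20
        ny_c = surrogate_flags.shape[0] // factor
        nx_c = surrogate_flags.shape[1] // factor
        # An 80-km box counts as containing a surrogate report if any 4-km
        # point inside it is flagged.
        trimmed = surrogate_flags[:ny_c * factor, :nx_c * factor]
        coarse = trimmed.reshape(ny_c, factor, nx_c, factor).any(axis=(1, 3))
        # Smooth the 0/1 field with a Gaussian whose weights sum to 1, so the
        # output stays in [0, 1] and can be read as a probability.
        return gaussian_filter(coarse.astype(float),
                               sigma=sigma_km / coarse_dx_km, mode="constant")

Re-running the last step with sigma_km set to 80, 160, or 240 reproduces the kind of smoothing-length sensitivity test examined later in the talk.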

Case: 29 May 2008
(Figures: SSPF panels for UH, UU, RF, UP, and DN alongside the observed reports, OBS.)

Research Questions
Does the SSPF provide skillful probabilistic guidance? Can the SSPF provide useful guidance to SPC forecasters?
1. What fields produce the most skillful guidance?
2. For a given field, which thresholds produce the most skillful guidance?
3. What forms of guidance are most appropriate?
4. How reliable are the probabilistic forecasts?

SSPF Verification
ROC (relative operating characteristic) curves
> assess the ability of forecasts to discriminate between different outcomes; conditioned on the observations
> plot of probability of detection (POD) vs. probability of false detection (POFD, sometimes called the false alarm rate)
> the diagonal is POD = POFD (the chance of forecasting the event equals the chance of issuing a false alarm)
> commonly summarized with the ROC curve area
Reliability diagrams
> assess the ability of forecasts to produce reliable probabilities; conditioned on the forecasts
> ideally, events forecast with 30% probability should occur 30% of the time
A skillful probabilistic forecasting system produces large ROC curve areas and is highly reliable.
(Images courtesy of the Australian Bureau of Meteorology verification website.)
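A minimal verification sketch, under the assumption that the observed "truth" is a 0/1 field of 80-km boxes containing at least one severe report during the 24-h period (not necessarily the exact procedure used in the study):

    import numpy as np

    def roc_area_and_reliability(probs, obs, prob_levels=np.arange(0.05, 1.0, 0.05)):
        """ROC curve area and reliability curve from probabilities and 0/1 observations."""
        pod, pofd = [], []
        for t in prob_levels:
            fcst_yes = probs >= t
            hits = np.sum(fcst_yes & (obs == 1))
            misses = np.sum(~fcst_yes & (obs == 1))
            false_alarms = np.sum(fcst_yes & (obs == 0))
            correct_nulls = np.sum(~fcst_yes & (obs == 0))
            pod.append(hits / max(hits + misses, 1))
            pofd.append(false_alarms / max(false_alarms + correct_nulls, 1))
        # ROC area: trapezoidal integration after adding the (0,0) and (1,1) endpoints
        x = np.concatenate(([1.0], pofd, [0.0]))[::-1]
        y = np.concatenate(([1.0], pod, [0.0]))[::-1]
        roc_area = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))
        # Reliability: observed relative frequency in each forecast-probability bin
        bins = np.arange(0.0, 1.05, 0.1)
        which_bin = np.digitize(probs, bins) - 1
        obs_freq = np.array([obs[which_bin == i].mean() if np.any(which_bin == i) else np.nan
                             for i in range(len(bins) - 1)])
        return roc_area, obs_freq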

SSPF Verification
(Figures: ROC curves and ROC curve areas for SE2008 for the SSPF based on UH, UU, RF, UP, and DN, with curves shown for decreasing thresholds.)

SSPF Verification
(Figures: SSPF reliability diagrams for SE2008 for UH, UU, RF, UP, and DN, with no-skill and climatology reference lines; curves shown for increasing thresholds.)

SSPF Verification
The largest ROC curve areas occur with the lowest thresholds for UH, UP, and DN; the ROC curve areas appear to asymptote approaching these thresholds. UH is naturally reliable at mid-range (~50 m² s⁻²) thresholds, while the other fields tend to overforecast probabilities at all thresholds. Probabilistic forecasts that have large ROC curve areas but are insufficiently reliable can be calibrated to improve reliability.
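One simple illustration of such a calibration (a sketch only, not necessarily the method intended here) is to remap each forecast probability to the observed relative frequency of its bin, estimated from a training sample; this is essentially reading the reliability diagram back onto the forecasts:

    import numpy as np

    def calibrate(probs_train, obs_train, probs_new, bins=np.arange(0.0, 1.05, 0.1)):
        """Map raw probabilities to per-bin observed relative frequencies (illustrative)."""
        which_bin = np.digitize(probs_train, bins) - 1
        freq = np.array([obs_train[which_bin == i].mean() if np.any(which_bin == i) else np.nan
                         for i in range(len(bins) - 1)])
        # Fall back to the bin center where a bin had no training samples.
        centers = 0.5 * (bins[:-1] + bins[1:])
        freq = np.where(np.isnan(freq), centers, freq)
        new_bin = np.clip(np.digitize(probs_new, bins) - 1, 0, len(freq) - 1)
        return freq[new_bin]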

SSPF Verification
SSPF forecasts were produced with a constant sigma = 120 km. Can the reliability of the forecasts be improved by changing this smoothing parameter?
(Figure: SSPF examples with sigma = 120 km, 160 km, and 240 km: "too hot", "too cold", "just right?")

SSPF Verification
(Figures: SSPF reliability diagrams for SE2008 for UH, UU, RF, UP, and DN with sigma = 80 km, 160 km, and 240 km, showing how reliability changes as sigma increases.)

SSPF Verification Summary
UH is the best predictor, as indicated by the ROC areas and reliability diagrams. The best-performing SSPF uses UH with sigma ~ 200 km and a threshold of ~35 m² s⁻². Calibration of the other fields is more challenging:
- Changes in sigma have the desired effect for some forecast probabilities, but not for others.
- There is still an overforecasting problem for UU, RF, UP, and DN.
UH is uniquely suited to identifying severe convection. Increasing sigma decreases the potential for higher probabilities; for rare-event forecasting, this may not be an issue.

Future work
Work is underway to verify forecasts over a 1.5-year period to determine whether these findings hold across all seasons and regions. Other plans include discriminating between severe weather types and applying the SSPF procedure to an ensemble of forecasts, which will help improve upon this proof of concept.

Research Questions
Does the SSPF provide skillful probabilistic guidance? Yes.
Can the SSPF provide useful guidance to SPC forecasters? Potentially. Additional calibration is needed, and it could be useful as a starting point in the analysis of model data.
(Figure: Day-1 Surrogate Convective Outlook, 29/1200Z – 30/1200Z, from the 00Z NSSL-WRF.)