Refinement and Evaluation of Automated High-Resolution Ensemble-Based Hazard Detection Guidance Tools for Transition to NWS Operations: JNTP Project Kickoff

Presentation transcript:

Refinement and Evaluation of Automated High-Resolution Ensemble-Based Hazard Detection Guidance Tools for Transition to NWS Operations: JNTP Project Kickoff Meeting

Project Organization: GSD, JNTP, MMM, EMC-MMB, WPC, HWT, AWC, OPG

Project Organization (org chart showing): our team (Isidora, Curtis, Trevor, Julie, Rebecca, Jacob, Carly); NAM-RR; Dave Novak and WPC-HMT folks; Adam Clark/Greg Stumpf; AWC; OPG; social science; NAM-RR evaluation/diagnostics; developers; FACETs project

Staffing (per year for 3 years)
- Tara – 280 hours
- Jamie – 236 hours
- Tressa – 120 hours
- John – hours
- Tatiana – need to decide what is needed
- Randy – 220 hours
- Josh – 50 hours
- Louisa – 50 hours

Goals
- Develop probabilistic guidance on a set of hazards based on HRRR/NAM-RR for the testbeds to use and subjectively evaluate: Prob QPF, Prob of Snowfall, Prob of Tornadoes/Hail/Wind, Convective Prob
- Objectively evaluate the members to help with development, and the products to give forecasters confidence
- Work with forecasters to understand how they will use the product and how best to represent probabilities for more effective communication
- Have products flowing into N-AWIPS, AWIPS, AWIPS II
- Transfer the probability system and verification system at the end of the project
PHDT = Probabilistic Hazard Detection Tool

What’s to be evaluated
- 00Z forecast uses the 18, 19, 20, 21, and 22Z HRRR runs
- All points within XX km are used to formulate probability
- Being explored by GSD: the XX-km radius, calibration, other methods for formulating probability, and adding in NAM-RR members
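The probability-generation step on this slide (pool the time-lagged HRRR runs, then count exceedances within a neighborhood) can be sketched roughly as below. This is an illustrative sketch only: the 3-km grid spacing, 40-km radius, and 2.54-mm threshold are assumed values, not the configuration GSD is actually exploring.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def neighborhood_probability(fields, threshold, radius_km, grid_km=3.0):
    """Fraction of member/neighborhood points exceeding `threshold`.

    fields : (n_members, ny, nx) array of forecast values, e.g. the
             18-22Z time-lagged HRRR runs all valid at the same time.
    """
    size = int(2 * radius_km / grid_km) + 1            # square neighborhood width
    exceed = (np.asarray(fields) > threshold).astype(float)
    # Smooth each member's binary exceedance grid over the neighborhood,
    # then average across members to get a probability in [0, 1].
    smoothed = np.stack([uniform_filter(e, size=size, mode="nearest")
                         for e in exceed])
    return smoothed.mean(axis=0)

# Five fake time-lagged members on a small grid (synthetic data)
rng = np.random.default_rng(0)
members = rng.gamma(2.0, 2.0, size=(5, 50, 50))        # mm of 6-h precip
prob = neighborhood_probability(members, threshold=2.54, radius_km=40.0)
print(prob.shape)
```

A square `uniform_filter` footprint stands in for the circular "within XX km" search; a true circular footprint would use `scipy.ndimage.generic_filter` or a disk-shaped kernel.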

First Cut at Product
- Prob > 2.54 mm in 6 hrs
- Prob > 25.4 mm in 6 hrs

Initial Areas of Emphasis

MET Ensemble and Probability Evaluation
- Ensemble characteristics (Ensemble-Stat): rank histogram, PIT, CRPS, ignorance score, spread-skill
- Probability measures (Grid-Stat and Point-Stat): Brier score + decomposition, Brier skill score, ROC and area under ROC, reliability
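As a rough illustration of the listed probability measures, a minimal Brier score with a binned Murphy-style reliability/resolution/uncertainty decomposition might look like this. MET's Grid-Stat and Point-Stat compute these internally; this standalone sketch is only an aid to interpreting the output.

```python
import numpy as np

def brier_decomposition(probs, obs, n_bins=10):
    """Brier score and a binned Murphy decomposition.

    probs : forecast probabilities in [0, 1]
    obs   : binary outcomes (1 = event occurred)
    Returns (brier, reliability, resolution, uncertainty), where
    brier ~= reliability - resolution + uncertainty.
    """
    probs, obs = np.asarray(probs, float), np.asarray(obs, float)
    brier = np.mean((probs - obs) ** 2)
    base_rate = obs.mean()
    uncertainty = base_rate * (1.0 - base_rate)
    bins = np.clip((probs * n_bins).astype(int), 0, n_bins - 1)
    reliability = resolution = 0.0
    for k in range(n_bins):
        mask = bins == k
        if mask.any():
            w = mask.mean()                       # fraction of cases in bin k
            p_bar, o_bar = probs[mask].mean(), obs[mask].mean()
            reliability += w * (p_bar - o_bar) ** 2
            resolution += w * (o_bar - base_rate) ** 2
    return brier, reliability, resolution, uncertainty

# Perfectly sharp, perfectly reliable forecasts score zero
bs, rel, res, unc = brier_decomposition([1.0, 1.0, 0.0, 0.0], [1, 1, 0, 0])
print(bs, rel, res, unc)
```

With real forecasts the binned identity holds only approximately, since forecasts within a bin are replaced by the bin-mean probability.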

QPE_06 > 12.7 mm vs. 50% Prob(APCP_06 > 12.7 mm): a good forecast with a displacement error?
- Traditional metrics: Brier score 0.07; area under ROC 0.62
- Spatial metrics: centroid distance Obj1) 200 km, Obj2) 88 km; area ratio Obj1) 0.69, Obj2) ; object PODY 0.72; object FAR 0.32
- Probability field compared with the QPE field

Practically Perfect Fcst as Obs
- Take obs (in this case storm reports) and apply a Gaussian filter to generate a probability field for the obs
- Then use this field in MODE
(Courtesy of Brooks et al. 1998)
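A minimal sketch of the practically-perfect idea: grid the point storm reports and spread them with a Gaussian filter to obtain a smooth "obs" probability field. The filter width `sigma` and the max-normalization to [0, 1] are simplifying assumptions here; the Brooks et al. (1998) approach uses a Gaussian kernel density with a physically chosen smoothing length.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def practically_perfect(report_rows, report_cols, shape, sigma=2.0):
    """Turn point storm reports into a smooth 'obs' probability field.

    report_rows/cols : grid indices of the reports (already mapped to grid)
    sigma            : Gaussian filter width in grid units (assumed value)
    """
    hits = np.zeros(shape)
    hits[report_rows, report_cols] = 1.0         # mark report locations
    field = gaussian_filter(hits, sigma=sigma)   # spread each report out
    return np.clip(field / field.max(), 0.0, 1.0)

# Three fake reports on a 40x40 grid
pp = practically_perfect([10, 12, 30], [15, 16, 5], shape=(40, 40))
print(pp.shape, pp.max())
```

The resulting field can then be fed to MODE in place of a deterministic observation, as the slide describes.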

MODE for Different Probabilities – May 11, 2013 (DTC SREF tests)
- Prob > 25% of 2.54 mm in 3 hr
- Prob > 50% of 2.54 mm in 3 hr
- Prob > 75% of 2.54 mm in 3 hr
- Object areas: forecast area, intersection area, symmetric difference (non-intersecting area)
NWS PoP = C x A, where "C" = the confidence that precipitation will occur somewhere in the forecast area and "A" = the percent of the area that will receive measurable precipitation. NWS PoP is the percent chance that rain will occur at any given point in the area.
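The slide's PoP relation and the MODE-style object areas can both be illustrated with a few lines of array arithmetic; the numbers and object shapes below are made up for illustration.

```python
import numpy as np

# NWS PoP = C x A (from the slide): 70% confidence that precipitation
# occurs somewhere, over 50% of the area -> 35% PoP at any given point.
confidence, area_fraction = 0.7, 0.5
pop = confidence * area_fraction
print(f"PoP = {pop:.0%}")

# MODE-style areas from two binary objects on a toy grid (assumed shapes)
fcst = np.zeros((20, 20), bool); fcst[5:15, 4:12] = True   # forecast object
obs  = np.zeros((20, 20), bool); obs[8:18, 8:16] = True    # observed object
intersection = (fcst & obs).sum()      # overlapping area (grid boxes)
sym_diff = (fcst ^ obs).sum()          # non-intersecting area
print(intersection, sym_diff, fcst.sum(), obs.sum())
```

MODE itself reports these areas (and centroid distance, area ratio, etc.) after matching objects; the boolean-mask arithmetic here just shows what the quantities mean.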

Ensemble MODE

Time Series Consistency Evaluations
What can be measured?
- Number of 'crossovers' (using the Wald-Wolfowitz test)
- Change in TC intensity with new initialization
- Magnitude of TC intensity revision
Current focus is on TCs, but this may apply to other 'objects' and general time series.
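The 'crossover' count can be sketched as counting sign changes between two successive forecast time series. This is a simplified stand-in for the full Wald-Wolfowitz runs test (which would also compute the expected run count and a z-statistic); the intensity values are invented for illustration.

```python
import numpy as np

def count_crossovers(series_a, series_b):
    """Number of times forecast series A crosses series B.

    Used here as a simple consistency measure between two successive
    TC-intensity forecast time series; exact ties are ignored.
    """
    diff = np.asarray(series_a, float) - np.asarray(series_b, float)
    signs = np.sign(diff[diff != 0])               # drop exact ties
    return int(np.sum(signs[1:] != signs[:-1]))    # sign changes = crossings

# Two fake TC intensity forecasts (kt) from successive initializations
run_00z = [45, 50, 60, 70, 75, 80]
run_06z = [48, 49, 62, 68, 77, 79]
print(count_crossovers(run_00z, run_06z))
```

Many crossovers between successive runs would indicate an inconsistent (flip-flopping) forecast; the same count could be applied to MODE object attributes or any general time series, as the slide suggests.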

MODE Time Domain (x, y, time)
- A great tool at high temporal resolution for looking at forecast consistency and evolution
- Planned for inclusion in the next release

What’s to be evaluated
- Evaluate ensemble members (HRRR, NAM-RR) to help assess “best members” and optimal configuration
- Employ Ensemble-Stat, Grid-Stat, and MODE on the PHDT
- Look at forecast consistency – helps the development phase
- Explore MODE-TD in out-years – but why not start now?

JNTP SOW – Year 1
- Months 1-2: Identify metrics and variables needed.
- Months 3-9: Develop the verification system on a NOAA supercomputer (e.g., Zeus or Theia) for 1-2 ensembles and 1-2 deterministic baselines. Develop initial operating capability for Ensemble-MODE evaluation of rain and snow bands.
- Months 9-12: Attend HMT-WPC. Explore use of MODE-TD on variables relevant to rain and snowband prediction. Demonstrate initial operating capability.
- Month 12: Identify enhancements needed to the verification system per user feedback.

JNTP SOW – Year 2
- Months 1-9: Enhance the MET system to include ensemble methodologies tested in Year 1 and confirm applicability to rain and snowbands.
- Months 9-12: Attend HMT-WPC. Demonstrate extended capability to WPC staff. Integrate additions to MODE and MODE-TD into the MET repository.
- Month 12: Identify final enhancements needed to the verification system per user feedback.
Year 3
- Months 1-9: Make final modifications to the verification system. Document capabilities, including interpretation of output.
- Months 9-12: Attend HMT-WPC. Demonstrate operational capability to WPC staff. Transition the verification system to WPC and provide user support.