
Ensemble-4DWX update: focus on guidance for configuring ensemble members
Tom Hopson, Jason Knievel, Yubao Liu, Gregory Roux, Wanli Wu
National Security Applications Program, Research Applications Laboratory
ATEC-4DWX IPR, April 20-21, 2010

Reminder of status in August 2009
- Improving scripting to manage computational resources
- Increased calibration stations from 26 to 34 (regional)
- Calibration hindcast increased from roughly 600 to over 1000 points
- The larger hindcast allows calibration of 2 additional lead times (36 hr, 42 hr)
- Included a new skill measure for the ensemble skill-spread relationship
- Corrected over-dispersion in the calibration process
- Automated calibration of 2-m temperature
- Testing automated calibration of 2-m dew-point temperature

Progress since August 2009
- Adding new graphics for easier DPG access to post-processed results
- Continuing to improve scripting, and exploring new computational resources, to reduce post-processing times
- Wind speed calibration under development
- Developed a tool for testing the utility of ensemble size, to optimize DPG's reduced model configuration over potential new locations
- Work continuing on scientific papers
- Cold-air damming (CAD) study

Recap of calibration and verification tools
- Ensemble calibration corrects the predicted distribution
- Calibration is needed for users capable of decision-making with probabilistic guidance, and will be needed for the foreseeable future
- Verification of different ensemble characteristics is easily completed while performing calibration

Calibration: review and recap
[Schematic: forecast PDF vs. observation, illustrating "bias" correction and "spread" or "dispersion" calibration; axes in Temperature [K] and Probability]
- Calibration (and verification) is now fully automated
- Utilizes "persistence" if available
- 34 sites in and near DPG
- Full calibration for all sites ~ once per week for each weather variable
- Lookup tables used ~ every hour (8 hrs, was 1 hr)
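As an illustration of the two corrections in the schematic, the minimal sketch below (not the operational Ensemble-4DWX method, which is quantile-regression based) shifts an ensemble by its training-period mean error and rescales its spread to match the magnitude of the ensemble-mean error. The function name and array layout are assumptions for the example.

```python
import numpy as np

def simple_bias_spread_calibration(train_fcst, train_obs, new_fcst):
    """Illustrative bias/spread correction of an ensemble forecast.

    train_fcst : (n_cases, n_members) past ensemble forecasts
    train_obs  : (n_cases,)           matching observations
    new_fcst   : (n_members,)         ensemble to calibrate
    """
    ens_mean = train_fcst.mean(axis=1)
    # "bias": average difference between ensemble mean and observations
    bias = np.mean(ens_mean - train_obs)
    # "dispersion": ratio of typical error to typical ensemble spread
    rmse = np.sqrt(np.mean((ens_mean - bias - train_obs) ** 2))
    spread = np.mean(train_fcst.std(axis=1))
    inflation = rmse / spread if spread > 0 else 1.0
    # shift the whole ensemble, then inflate/deflate it about its mean
    center = new_fcst.mean() - bias
    return center + inflation * (new_fcst - new_fcst.mean())
```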

Calibration steps (quantile regression)
[Schematic panels: climatological PDF; observations and forecasts vs. time; prior vs. "sharper" posterior forecast PDFs. Axes in Temperature [K] and Probability/°K]
Step 1: Determine the climatological quantiles (the climatological PDF).
Step 2: For each quantile, use forward step-wise cross-validation to select the best regressor set. Selection requires (a) minimizing the QR cost function and (b) a binomial distribution test at 95% confidence; if the requirements are not met, retain the climatological "prior". Regressor set: 1. reforecast ensemble, 2. ensemble mean, 3. ensemble standard deviation, 4. persistence, 5. LR quantile (not shown).
Step 3: Segregate forecasts based on ensemble dispersion and refit the Step 2 models for each dispersion range.
Final result: a "sharper" posterior PDF represented by interpolated quantiles.
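A minimal sketch of the forward step-wise selection in Step 2, assuming numpy and statsmodels are available. It uses only the pinball (QR) cost-function criterion with cross-validation and a climatological fallback, and omits the binomial significance test; the function names and candidate-regressor dictionary are illustrative, not the operational implementation.

```python
import numpy as np
from statsmodels.regression.quantile_regression import QuantReg

def pinball_loss(y, yhat, q):
    """Quantile (pinball) loss used as the QR cost function."""
    err = y - yhat
    return np.mean(np.maximum(q * err, (q - 1) * err))

def forward_stepwise_qr(y, candidates, q, n_folds=5):
    """Greedy forward selection of regressors for one quantile.

    y          : (n,) observations
    candidates : dict name -> (n,) candidate regressor (e.g. ensemble mean,
                 ensemble stdev, persistence)
    q          : target quantile in (0, 1)
    Returns the selected regressor names and the cross-validated loss,
    falling back to the climatological quantile when nothing helps.
    """
    n = len(y)
    folds = np.array_split(np.arange(n), n_folds)
    # climatological "prior": the sample quantile, no regressors
    best_loss = np.mean([pinball_loss(y[f], np.quantile(np.delete(y, f), q), q)
                         for f in folds])
    selected = []
    improved = True
    while improved:
        improved = False
        for name in (set(candidates) - set(selected)):
            cols = selected + [name]
            X = np.column_stack([np.ones(n)] + [candidates[c] for c in cols])
            loss = 0.0
            for f in folds:
                train = np.setdiff1d(np.arange(n), f)
                fit = QuantReg(y[train], X[train]).fit(q=q)
                loss += pinball_loss(y[f], X[f] @ fit.params, q) / n_folds
            if loss < best_loss:
                best_loss, best_name, improved = loss, name, True
        if improved:
            selected.append(best_name)
    return selected, best_loss
```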

3-hr dewpoint time series, Station DPG S01
[Plots: before calibration vs. after calibration]

42-hr dewpoint time series, Station DPG S01
[Plots: before calibration vs. after calibration]

PDFs: raw vs. calibrated
[Plot: blue = "raw" ensemble, black = calibrated ensemble, red = observed value]
Notice the significant change in both "bias" and dispersion of the final PDF (also notice the PDF asymmetries).

Troubled rank histograms
[Two example rank histograms: counts (0-30) vs. ensemble rank (1-10)]
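For reference, a rank histogram can be computed with a few lines of numpy. This is a generic sketch (ties between members and the observation are ignored), not the verification code used operationally.

```python
import numpy as np

def rank_histogram(ensemble, obs):
    """Count where each observation falls among the sorted ensemble members.

    ensemble : (n_cases, n_members) forecasts
    obs      : (n_cases,) verifying observations
    Returns counts over the n_members + 1 possible ranks; a flat histogram
    indicates a statistically consistent (well-dispersed) ensemble.
    """
    n_cases, n_members = ensemble.shape
    ranks = np.sum(ensemble < obs[:, None], axis=1)  # rank 0..n_members
    return np.bincount(ranks, minlength=n_members + 1)
```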

3-hr dewpoint rank histograms, Station DPG S01

42-hr dewpoint rank histograms, Station DPG S01

RMSE of ensemble members, Station DPG S01
[Plots: 3-hr and 42-hr lead times]
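The per-member RMSE shown in these plots is straightforward to compute; a minimal numpy sketch (array layout assumed) follows.

```python
import numpy as np

def member_rmse(ensemble, obs):
    """RMSE of each ensemble member against the observations.

    ensemble : (n_cases, n_members), obs : (n_cases,)
    Returns an (n_members,) array, e.g. for ranking members by accuracy.
    """
    return np.sqrt(np.mean((ensemble - obs[:, None]) ** 2, axis=0))
```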

Significant calibration regressors, Station DPG S01
[Plots: 3-hr and 42-hr lead times]

Assessing ensemble utility
Goal: provide guidance for ensemble member selection.
Caveat: results so far apply only to the surface variables T and Td; ensemble usage may differ for other variables and levels.
Approach (see the sketch after this list):
- Starting with the complete ensemble set, rank members by how often they are used in the calibration procedure, assessed over all weather variables, locations, and lead times
- Iteratively remove the least-used members from the calibration, and use verification statistics to determine the cost of the reduction
- What is the "optimal size" that balances forecast skill against computational expense?
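A schematic of the iterative reduction described above. Here `usage_counts` and `verify` are hypothetical placeholders (member usage tallied from the calibration regressor selections, and any verification statistic such as a skill score), not functions from the operational system.

```python
def prune_ensemble(usage_counts, verify, min_members=3):
    """Rank members by usage in calibration, then drop the least-used ones.

    usage_counts : dict member_name -> number of times the member was selected
                   as a regressor (over variables, stations, lead times)
    verify       : callable(list_of_members) -> skill score for a reduced set
    Returns a list of (n_members_kept, skill) pairs for plotting the
    skill vs. ensemble-size trade-off.
    """
    # most-used members first
    members = sorted(usage_counts, key=usage_counts.get, reverse=True)
    curve = []
    while len(members) >= min_members:
        curve.append((len(members), verify(members)))
        members.pop()  # drop the current least-used member
    return curve
```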

Skill scores
- A single value to summarize performance
- Reference forecast: the best naive guess, e.g. persistence or climatology
- A perfect forecast implies that the quantity can be perfectly observed
- Positively oriented: positive is good
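For completeness, the conventional skill-score definition implied here (a standard formula, not taken from the slide) is

$$ \mathrm{SS} = \frac{S - S_{\mathrm{ref}}}{S_{\mathrm{perf}} - S_{\mathrm{ref}}}, $$

where S is the forecast's score, S_ref the score of the reference forecast (persistence or climatology), and S_perf the score of a perfect forecast; SS = 1 is perfect, SS = 0 matches the reference, and SS < 0 is worse than the reference.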

Preliminary results
[Plots: Brier score referenced to persistence; RMSE referenced to persistence]
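A hedged sketch of how a Brier skill score referenced to persistence might be computed, assuming persistence is treated as a deterministic 0/1 probability taken from the previous occurrence of the event; the function names and this choice of reference are assumptions, not the project's definition.

```python
import numpy as np

def brier_score(prob, occurred):
    """Brier score: mean squared error of probability forecasts (0 = perfect)."""
    return np.mean((prob - occurred) ** 2)

def brier_skill_vs_persistence(prob, occurred, occurred_previous):
    """Brier skill score with persistence as the reference forecast.

    prob              : forecast probabilities of the event (numpy array)
    occurred          : 1 if the event occurred, else 0
    occurred_previous : event occurrence at the previous (persisted) time,
                        used as a 0/1 "persistence" probability
    Positive values mean the forecast beats persistence.
    """
    bs = brier_score(prob, occurred)
    bs_ref = brier_score(occurred_previous.astype(float), occurred)
    return 1.0 - bs / bs_ref
```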

… more theoretically
[Schematic PDFs at 42-hr lead time: uncorrelated vs. correlated ensemble members; axes in Temperature [K] and Probability/°K]
- More ensemble members usefully define the mean of the PDF if error growth is less than linear *and* the members are uncorrelated
- For perfectly-correlated ensemble members, the addition of *any* member degrades skill
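The second bullet follows from a standard result for the mean of N identically distributed member errors, assumed here (for illustration) to have variance σ² and pairwise correlation ρ:

$$ \operatorname{Var}\!\left(\frac{1}{N}\sum_{i=1}^{N} e_i\right) = \frac{\sigma^2}{N} + \frac{N-1}{N}\,\rho\,\sigma^2 = \sigma^2\,\frac{1 + (N-1)\rho}{N}. $$

For ρ = 0 the variance of the ensemble-mean error shrinks like 1/N, whereas for ρ = 1 it remains σ², so additional members add computational cost without improving the ensemble mean.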

Future plans
- Continue refining the calibration algorithm to decrease computation time (required as new weather variables are added)
- Implement calibration for wind speed and wind direction
- Continue development of the guidance tool for optimal ensemble selection for operational use
- Develop a scheme for model points without surface observations, over the whole model domain