Similar Day Ensemble Post-Processing as Applied to Wildfire Threat and Ozone Days Michael Erickson 1, Brian A. Colle 1 and Joseph J. Charney 2 1 School of Marine and Atmospheric Sciences, Stony Brook University, Stony Brook, NY 2 USDA Forest Service, East Lansing, MI

Ensemble Forecasting: Many Ensembles To Choose From. Example from 00 UTC 10/24/2010, 500 hPa 144-hr forecast. (Panels: NCEP GEFS 546 dm contour; CMC GGEM 546 dm contour.) And there are many other ensembles out there (e.g., ECMWF, NOGAPS, NCEP SREF, and UKMET, to name a few).

Ensemble Forecasting: A Few Ways to Look at Data. Example from 00 UTC 10/24/2010, GEFS 144-hr forecast. (Panels: 850 hPa temperature anomaly; NCEP SREF mean SLP and spread; sea level pressure 1012 and 1028 hPa contours; probability of precipitation; sea level pressure mean and spread.) With so much data out there, can ensemble post-processing benefit the operational forecaster by creating deterministic and probabilistic gridded forecasts for certain types of weather patterns?

Caveats. Using ensemble output directly may be misleading, since ensembles are frequently underdispersed and have large surface model biases. Although ensemble post-processing methods are growing in sophistication, the sensitivity of biases to the synoptic flow pattern is not well known. (Panels: SREF+SBU 2-m temperature bias > 298 K; diurnal mean error; SREF+SBU bias by member > 298 K.)

Questions to Be Addressed: Do model biases vary with the ambient surface weather conditions (i.e., on days with high fire threat and high ozone)? Does applying a similar day approach to ensemble post-processing improve deterministic and probabilistic forecasts? Are there any dominant atmospheric flow patterns during these anomalous events that are related to model biases? Can similar day ensemble post-processing be used to create simple, helpful, and skillful forecasts in operations?

Methods and Data. Analyzed the Stony Brook University (SBU) and NCEP Short Range Ensemble Forecast (SREF) systems for 2-m temperature and 10-m wind speed. Automated Surface Observing System (ASOS) stations over a subset of the Northeast (the verification domain) are used as verifying observations. 00 UTC SBU 13-member ensemble: 7 MM5 and 6 WRF members run at 12-km grid spacing nested within a larger 36-km domain, with a variety of ICs (GFS, NAM, NOGAPS, CMC) and microphysics, convective, and PBL schemes. 21 UTC NCEP SREF 21-member ensemble: 10 ETA, 5 RSM, 3 WRF-NMM, and 3 WRF-ARW members, with ICs perturbed using a breeding technique.

Bias Correction Methods. 1. Running-mean bias correction: determine the bias over the training period and subtract it from the forecast (Wilson et al. 2007). 2. CDF bias correction: adjust the model CDF to the observed CDF for all forecast values (Hamill and Whitaker 2005), then adjust for elevation and land use. Wind speed was bias corrected with the CDF method; temperature used the running-mean method. (Figure: CDF bias correction example, with CDFs for model and observation.)
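The two correction schemes can be sketched as follows. This is an illustrative toy, not the study's code: the function names, the 5-day training sample, and the empirical-quantile choices are assumptions of mine.

```python
import numpy as np

def running_mean_correction(forecasts, observations, new_forecast):
    """Additive (running-mean) bias correction: subtract the mean
    forecast-minus-observation error over the training period."""
    bias = np.mean(np.asarray(forecasts) - np.asarray(observations))
    return new_forecast - bias

def cdf_correction(forecasts, observations, new_forecast):
    """Quantile-mapping (CDF) correction: find the quantile of the new
    forecast in the training forecast distribution, then return the
    observed value at that same quantile."""
    f_sorted = np.sort(forecasts)
    o_sorted = np.sort(observations)
    q = np.searchsorted(f_sorted, new_forecast) / len(f_sorted)
    q = min(max(q, 0.0), 1.0)
    return float(np.quantile(o_sorted, q))

# toy training period: the model runs 2 K too warm
fcst = np.array([290.0, 292.0, 294.0, 296.0, 298.0])
obs = fcst - 2.0
corrected = running_mean_correction(fcst, obs, 295.0)
print(corrected)  # 293.0
```

The additive method shifts every forecast by one number; quantile mapping can also correct amplitude errors, which is why a bounded, skewed variable like wind speed benefits from it.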

Exploring Model Bias on Hazardous Weather Days. Explored the impact of the training period on post-processing for high fire threat and high ozone days: 1. Sequential training – used the most recent 14 consecutive days. 2. Conditional training – used the most recent 14 similar days. Analyzed daytime model output ( UTC) for ensembles initialized the day of and the day before the hazardous weather event (i.e. SBU model hours and 36-48). High fire threat classification: used the Fire Potential Index (FPI) from the Wildland Fire Assessment System (WFAS). A fire threat day must have 10% or more of the domain reach an FPI of 50 while the remainder of the domain exceeds an FPI of 25. High ozone threat classification: a high ozone day must have 10% of AIRNow stations with an Air Quality Index (AQI) > 60 ppb while the remainder of the domain exceeds 30 ppb.
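A minimal sketch of the fire-threat-day check, assuming the criterion means at least 10% of grid points at FPI ≥ 50 with every remaining point above 25. The function name, thresholds as keyword arguments, and the toy grid are hypothetical.

```python
import numpy as np

def is_high_fire_threat(fpi_grid, frac=0.10, high=50.0, low=25.0):
    """Hypothetical day classifier mirroring the slide's criterion:
    at least `frac` of the domain reaches an FPI of `high`, while the
    remainder of the domain exceeds `low`."""
    fpi = np.asarray(fpi_grid, dtype=float).ravel()
    high_mask = fpi >= high
    enough_high = high_mask.mean() >= frac   # fraction of domain at high FPI
    rest_ok = np.all(fpi[~high_mask] > low)  # remainder must exceed low FPI
    return bool(enough_high and rest_ok)

# toy 10-point "domain": 2 points at FPI 55, the rest between 26 and 40
grid = [55, 55, 30, 32, 26, 35, 40, 28, 31, 33]
print(is_high_fire_threat(grid))  # True
```

The same two-threshold pattern would apply to the ozone criterion, with AIRNow station AQI values in place of gridded FPI.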

Bias Correction Methods for Fire Threat Days – Temperature. For temperature, sequential bias correction still leaves an average ensemble mean bias of -0.8 K, which is removed when using conditional bias correction. MAE is also improved for almost every ensemble member and the ensemble mean. (Panels: ME per model and MAE per model for temperature.)

Bias Correction Methods for Fire Threat Days – Temperature > 298 K. Spatially, the negative temperature bias on high fire threat days is found at every station. Conditional bias correction removes the negative temperature bias and reduces the spread of the biases. (Panels: Raw Warm Season; Raw Fire Threat Days; Seq. Bias Cor. Fire Threat Days; Cond. Bias Cor. Fire Threat Days.)

Bias Correction Methods for Fire Threat Days – Wind Speed. High fire threat days have a smaller positive wind speed bias than the warm season average. As with temperature, conditional bias correction removes the bias and improves MAE. (Panels: ME per model and MAE per model for wind speed.)

Bias Correction Methods for High Ozone Days – Wind Speed > 2.6 m/s. Wind speed biases on high ozone days are also smaller (less positive) than the warm season average. Spatially, the CDF bias correction removes most of the bias, although not as effectively as the additive bias correction did for temperature. (Panels: Raw High Ozone Days; Raw Warm Season; Seq. Bias Cor. High Ozone Days; Cond. Bias Cor. High Ozone Days.)

Reliability Plots for Conditional Bias Correction: High Ozone Days – Wind Speed. Even after conditional bias correction, reliability plots still reveal a lack of probabilistic skill, indicative of substantial underdispersion. Therefore, additional post-processing is necessary.
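A reliability curve of the kind shown in these plots is computed by binning forecast probabilities and comparing each bin's mean forecast probability with the observed event frequency; points near the 1:1 line indicate good calibration. A rough sketch (bin count and names are illustrative):

```python
import numpy as np

def reliability_curve(prob_forecasts, outcomes, n_bins=5):
    """For each probability bin, pair the mean forecast probability
    with the observed relative frequency of the event."""
    p = np.asarray(prob_forecasts, float)
    o = np.asarray(outcomes, float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(p, edges) - 1, 0, n_bins - 1)
    curve = []
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            curve.append((float(p[mask].mean()), float(o[mask].mean())))
    return curve

# toy example: a low-probability and a high-probability forecast group,
# each perfectly calibrated by construction
probs = [0.1] * 10 + [0.9] * 10
events = [1] + [0] * 9 + [1] * 9 + [0]
print(reliability_curve(probs, events))
```

An underdispersed ensemble produces a curve flatter than the 1:1 line: high forecast probabilities verify less often than stated, and low ones more often.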

Bayesian Model Averaging (BMA). Bayesian Model Averaging (BMA; Raftery et al. 2005) calibrates ensemble forecasts by estimating: weights for each ensemble member, and the uncertainty associated with each forecast. 10 members were selected from the SBU/SREF system with a training period of 28 days: the 5 best SBU members (in terms of MAE), one for each PBL scheme, and the 5 control SREF members. The same model hours and training method used in bias correction are also used for BMA. Parameters are estimated using an MCMC method developed by Vrugt et al. (2008). (Figure: sample BMA PDF for wind speed; in the BMA-derived distribution, members have varying weights.)
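The BMA predictive density is a weighted mixture of kernels centered on the (bias-corrected) member forecasts. A minimal sketch with Gaussian kernels and made-up weights; in Raftery et al. (2005) the weights and spread are fitted from training data (here, via the MCMC method cited above), whereas this sketch only evaluates the resulting mixture:

```python
import numpy as np

def bma_pdf(x, member_forecasts, weights, sigma):
    """Evaluate a BMA-style predictive density: a weighted sum of
    Gaussian kernels centered on the member forecasts, all sharing a
    common spread parameter sigma."""
    x = np.asarray(x, float)
    dens = np.zeros_like(x)
    norm = sigma * np.sqrt(2.0 * np.pi)
    for f, w in zip(member_forecasts, weights):
        dens += w * np.exp(-0.5 * ((x - f) / sigma) ** 2) / norm
    return dens

members = [3.0, 4.0, 6.0]   # toy 10-m wind forecasts (m/s)
weights = [0.5, 0.3, 0.2]   # hypothetical BMA weights (sum to 1)
x = np.linspace(0.0, 10.0, 2001)
pdf = bma_pdf(x, members, weights, sigma=1.0)

dx = x[1] - x[0]
total = float(pdf.sum() * dx)       # ~1: the mixture is a proper density
mean = float((x * pdf).sum() * dx)  # ~0.5*3 + 0.3*4 + 0.2*6 = 3.9
print(total, mean)
```

Because the mixture widens around the weighted member forecasts, BMA adds the spread that the raw, underdispersed ensemble lacks.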

Reliability Plots for Conditional Bias Correction and BMA: High Ozone Days – Wind Speed. BMA improves the probabilistic results (i.e., reliability corresponds more closely to the 1:1 line) for wind speed on high ozone days. However, it is also important to evaluate ensemble dispersion on average.

Rank Histograms of Temperature for Hazardous Weather Days. BMA greatly reduces ensemble underdispersion, but cannot correct any lingering bias that results from using sequential training. (Panels: Sequential Training – Fire Threat Days; Conditional Training – Fire Threat Days; Sequential Training – High Ozone Days; Conditional Training – High Ozone Days.)
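A rank histogram tallies where each observation falls among the sorted ensemble members: a flat histogram suggests good dispersion, a U-shape indicates underdispersion, and a tilted shape indicates bias. A small sketch with a toy underdispersed ensemble (data values are invented):

```python
import numpy as np

def rank_histogram(ensemble_forecasts, observations):
    """Count, for each forecast case, how many sorted ensemble members
    fall below the verifying observation (rank 0..n_members)."""
    ens = np.asarray(ensemble_forecasts, float)  # shape (cases, members)
    obs = np.asarray(observations, float)        # shape (cases,)
    n_members = ens.shape[1]
    counts = np.zeros(n_members + 1, dtype=int)
    for row, ob in zip(ens, obs):
        rank = int(np.sum(np.sort(row) < ob))
        counts[rank] += 1
    return counts

# toy underdispersed ensemble: observations often fall outside its range
ens = np.array([[2.0, 3.0, 4.0]] * 4)
obs = np.array([1.0, 5.0, 2.5, 6.0])
print(rank_histogram(ens, obs))  # [1 1 0 2]
```

Here 3 of 4 observations land in the outermost bins, the U-shaped signature that BMA's added spread is meant to flatten.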

Brier Skill Scores - Conditional and Sequential BMA Referenced Against Sequential Bias Correction – Temperature. Brier Skill Scores (BSS) indicate probabilistic benefit with values greater than zero. Since BSS is usually > 0, BMA improves the ensemble regardless of training. In many cases, conditional BMA performs better than sequential BMA, with statistically significant improvement on high fire threat days. (Panels: Fire Threat Days – Temperature; High Ozone Days – Temperature.)
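The Brier score and BSS used here can be sketched as follows; the probabilities are toy numbers, and the reference forecast plays the role of the sequential bias correction:

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared error of probabilistic forecasts of a binary event
    (0 = perfect, lower is better)."""
    p = np.asarray(probs, float)
    o = np.asarray(outcomes, float)
    return float(np.mean((p - o) ** 2))

def brier_skill_score(probs, ref_probs, outcomes):
    """BSS = 1 - BS/BS_ref: positive values mean the forecast beats
    the reference forecast."""
    return 1.0 - brier_score(probs, outcomes) / brier_score(ref_probs, outcomes)

outcomes = [1, 0, 1, 1, 0]               # did the event occur?
bma = [0.9, 0.2, 0.8, 0.7, 0.1]          # sharper, calibrated forecast
seq = [0.6, 0.5, 0.5, 0.5, 0.4]          # hedged reference forecast
print(round(brier_skill_score(bma, seq, outcomes), 3))  # 0.822
```

A BSS of 0 means no improvement over the reference, and negative values mean the post-processing actually hurt, which is why the significance testing on these slides matters.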

Brier Skill Scores - Conditional and Sequential BMA Referenced Against Sequential Bias Correction – Wind Speed. Unlike with temperature, the difference between conditional and sequential BMA for wind speed is not statistically significant. (Panels: Fire Threat Days – Wind Speed; High Ozone Days – Wind Speed.)

BMA in Operational Forecasting – High Fire Threat Example: Ensemble Mean Forecast on April 25th, 2008. BMA can be used to generate a deterministic forecast spatially over the entire region. (Panels: Raw Mean Ensemble Forecast; Sequential Bias Corrected Mean Forecast; Conditional Bias Corrected Mean Forecast; Conditional BMA Mean Forecast; Observation.)

BMA in Operational Forecasting – High Fire Threat Example: Probability > 287 K on April 25th, 2008. BMA can be used to generate probabilistic forecasts for critical thresholds that are typically more accurate than bias correction or the raw ensemble. (Panels: Raw Ensemble Forecast; Sequential Bias Corrected Forecast; Conditional Bias Corrected Forecast; Conditional BMA Forecast; Observation.)

Sensitivity of Ensemble Performance to Member Selection. A key assumption when running BMA was that the 5 SBU and 5 SREF members selected were a good choice. Warm season – conditional bias correction, no BMA: this assumption is tested by comparing the 10-member SBU/SREF ensemble used previously (B10) to a randomly selected 10-member ensemble (R10) 1000 times. Hazardous weather days – conditional bias correction and BMA: the sensitivity of BMA performance to member selection is tested by rerunning BMA with 2 ensembles of 10 members each: 1. the same 5 SBU and 5 SREF members used earlier (B10-BMA); 2. 5 randomly selected SBU and 5 randomly selected SREF members (R10-BMA). The benefit of combining the SBU and SREF is tested by creating 3 ensembles from B10: 1. the 5 best SBU members (B5-SBU-BMA); 2. the 5 best SREF members (B5-SREF-BMA); 3. 2.5 randomly selected best SBU and 2.5 randomly selected best SREF members (B5-ALL-BMA).

Sensitivity of Ensemble Performance to Member Selection – Warm Season – Bias Correction (No BMA). The B10 ensemble has lower MAE (i.e., more skill) than R10, but it also has less ensemble spread (i.e., is more underdispersed) for temperature and wind speed. This underdispersion results in B10 having less probabilistic skill than R10 for wind speed. (Panels: Spread, B10 minus R10; Ensemble BSS, B10 referenced against R10; MAE, B10 minus R10; for temperature and wind speed.)

Brier Skill Scores – Comparison Between Best Ensemble and Randomly Generated Ensembles – Temperature. B10-BMA performs better probabilistically than R10-BMA. Since this was not the case for B10 versus R10 without BMA, BMA can correct for underdispersion provided the ensemble has deterministic skill. (Panels: Fire Threat Days; High Ozone Days.)

Brier Skill Scores – Benefits of Combining the SBU and SREF Ensembles – Temperature. The combined SBU and SREF ensemble (B5-ALL-BMA) performs better than the SBU (B5-SBU-BMA) and SREF (B5-SREF-BMA) ensembles separately on high fire threat days. Results for high ozone days are mixed, with no clear benefit or loss from using B5-ALL-BMA. (Panels: Fire Threat Days; High Ozone Days.)

Environmental Modes on High Fire Days – 500 hPa Height Anomaly. High fire threat days may be associated with a few consistent large-scale atmospheric flow regimes. To examine this, North American Regional Reanalysis (NARR, 32-km grid spacing) 3-hour composites were gathered on high fire threat days. Atmospheric modes on high fire threat days were captured using Empirical Orthogonal Function (EOF) analysis on 500 hPa height anomalies, and the dominant Principal Components (PCs) were correlated with the 2-m temperature model bias.
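The EOF/PC computation can be sketched as an SVD of the (time × space) anomaly matrix. Synthetic data stand in for the NARR composites, and the bias series below is fabricated purely to illustrate the PC-to-bias correlation step:

```python
import numpy as np

def eof_analysis(anomalies, n_modes=4):
    """EOFs via SVD of a (time, space) anomaly matrix: rows are daily
    height-anomaly maps (flattened). Returns the leading spatial
    patterns, PC time series, and explained-variance fractions."""
    X = np.asarray(anomalies, float)
    X = X - X.mean(axis=0)                 # remove the time mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    pcs = U[:, :n_modes] * s[:n_modes]     # PC time series
    eofs = Vt[:n_modes]                    # spatial patterns
    var_frac = (s ** 2) / np.sum(s ** 2)
    return eofs, pcs, var_frac[:n_modes]

rng = np.random.default_rng(0)
# synthetic "days x grid points" anomalies with one dominant flow mode
pattern = np.sin(np.linspace(0, np.pi, 50))
amplitude = rng.normal(size=200)
data = np.outer(amplitude, pattern) + 0.1 * rng.normal(size=(200, 50))

eofs, pcs, var_frac = eof_analysis(data, n_modes=2)

# correlate the leading PC with a synthetic 2-m temperature bias series
bias = 0.5 * amplitude + 0.1 * rng.normal(size=200)
r = np.corrcoef(pcs[:, 0], bias)[0, 1]
print(var_frac[0], abs(r))  # leading mode dominates; |r| is near 1
```

A strong PC-to-bias correlation of this kind is what would motivate conditioning the bias correction on the flow regime, as the slide suggests.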

Environmental Modes on High Fire Days – 500 hPa Height Anomaly. (Figure: 500 hPa anomaly EOFs 1-4.)

Conclusions ● High fire threat and high ozone days have cooler 2-m temperature biases and smaller 10-m wind speed biases compared to the warm season average. ● Conditional post-processing is better than sequential training at removing biases and calibrating the ensemble for high fire threat and ozone days. ● Results with the similar day approach suggest that analog post-processing could be used to create unbiased and skillful gridded operational forecasts. ● Similar day post-processing could be extended to benefit forecasts of other high impact events (Nor'easters, cold snaps, heat waves, wind energy, etc.). ● However, it is not immediately obvious how to implement a large ensemble of unique members with BMA. Combining the SBU and SREF ensembles and picking skillful members (in terms of MAE) is frequently better than randomly selecting members. ● Preliminary EOF results suggest the presence of 500 hPa flow regimes on high fire threat days that are correlated with the 2-m temperature model biases. Exploring the cause of these biases could lead to improvements in the models' parameterized physics.