Gridded OCF Probabilistic Forecasting for Australia
Shaun Cooper and Timothy Hume, Centre for Australian Weather and Climate Research
© Commonwealth of Australia 2011

Gridded OCF: Introduction
Gridded Operational Consensus Forecasting (Gridded OCF) is an operational forecast guidance system used by the Australian Bureau of Meteorology. It consists of a Poor Man's Ensemble (PME) of Numerical Weather Prediction (NWP) output from a number of international centres. The PME members are bias corrected with respect to a gridded mesoscale analysis (MSAS) covering the Australian domain (precipitation forecasts are not currently bias corrected). A range of products is then generated from the PME members, including:
- Weighted average consensus forecasts (used as deterministic guidance by weather forecasters)
- Probabilistic forecasts of particular parameters exceeding specified thresholds
This poster describes the probabilistic products produced by the Gridded OCF system. (A diagram on the poster illustrates the method used to produce probabilistic forecasts from the Gridded OCF system.)

Probabilistic Forecasts: "Vote Counting"
The "vote counting" method counts the number of PME members that satisfy a certain condition (e.g. rainfall exceeding a specified threshold) and expresses the probability as the proportion of the total number of members. These probabilities will not generally be calibrated. The method is easily applied to a range of variables, for example:
- Precipitation exceeding defined thresholds
- Wind speed exceeding critical speeds
- Temperature (cold events, heat waves and so on)
Example maps: the probability of daily precipitation exceeding 1 mm and the probability of wind speed exceeding 34 knots (gale force), both computed with the "vote counting" method as 24 hour forecasts valid at 00:00 UTC. Colours indicate percentages from 0% (dark purple) to 100% (light pink).

Calibration of Rainfall Forecasts
Probabilistic forecasts generated using the "vote counting" method are not usually calibrated; that is, the observed frequency of an event differs from the forecast probability of the event occurring. To calibrate the precipitation forecasts, the preceding year of uncalibrated forecasts covering the Australian states of Victoria and New South Wales is compared with daily rainfall analyses valid at the same time, and a look-up table is constructed to convert uncalibrated probabilities to calibrated probabilities. The calibration tables are continuously updated using the most recent year of data. A sketch of the vote-counting and look-up-table steps follows this section. Comparison figure: uncalibrated (top panel) versus calibrated (bottom panel) forecasts of the probability of daily precipitation exceeding 0.2 mm (vote counting method), 24 hour forecast valid at 00:00 UTC; colours indicate percentages from 0% (dark purple) to 100% (light pink).
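The vote-counting and look-up-table calibration steps described above are simple enough to sketch in a few lines of code. The following Python fragment is an illustrative sketch only, not the Bureau's operational code; the array shapes, bin edges and function names are assumptions made for the example.

```python
import numpy as np

def vote_count_probability(member_qpf, threshold):
    """Raw 'vote counting' probability: the fraction of PME members whose
    forecast exceeds the threshold, at every grid point.

    member_qpf: array of shape (n_members, ny, nx) of forecast rainfall (mm)
    threshold:  rainfall threshold in mm (e.g. 0.2 or 1.0)
    """
    exceed = member_qpf > threshold      # boolean "votes", one per member
    return exceed.mean(axis=0)           # proportion of members voting "yes"

def build_lookup_table(raw_probs, observed_exceed, bin_edges):
    """Build a calibration look-up table from ~1 year of past forecasts.

    raw_probs:       1-D array of historical vote-counting probabilities
    observed_exceed: 1-D boolean array, True where the analysed rainfall
                     actually exceeded the threshold
    bin_edges:       edges of the raw-probability bins (assumed here)

    Returns the observed relative frequency in each raw-probability bin,
    which serves as the calibrated probability for that bin.
    """
    n_bins = len(bin_edges) - 1
    which_bin = np.clip(np.digitize(raw_probs, bin_edges) - 1, 0, n_bins - 1)
    table = np.full(n_bins, np.nan)
    for b in range(n_bins):
        in_bin = which_bin == b
        if in_bin.any():
            table[b] = observed_exceed[in_bin].mean()
    return table

def calibrate(raw_prob, table, bin_edges):
    """Map raw vote-counting probabilities to calibrated ones via the table."""
    b = np.clip(np.digitize(raw_prob, bin_edges) - 1, 0, len(table) - 1)
    return table[b]
```

With a 4-member PME, for example, the raw probability can only take the values 0, 0.25, 0.5, 0.75 and 1, so the bin edges can simply be placed around those five values and the table holds one calibrated probability per possible vote fraction.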
Example of a daily rainfall analysis, constructed from rain gauge observations. Analyses such as these are used for calibrating probabilities derived with the "vote counting" method (see above) and also in the QPF method (described below). Colours indicate the 24 hour accumulation from 0 mm (grey) to 80 mm (pink-white); no data are available over the sea.

Probabilistic Forecasts: QPF Method
This method assigns the probability of rainfall exceeding a threshold based on a quantitative precipitation forecast (QPF): the higher the QPF, the higher the probability assigned to exceeding a defined threshold. The method is based on the work of Sloughter et al. (2007), which employs Bayesian Model Averaging, and it can also be applied to a single deterministic rainfall forecast. In our work we apply it to the PME average forecast; that is, we average the QPFs from the ensemble members and use the resulting average QPF to derive the probability that the daily precipitation will exceed defined thresholds. A sketch of one plausible way to fit such a relationship follows this section.
Example: a figure on the poster shows the relationship between the PME's average QPF and the probability of precipitation exceeding 0.2 mm over the Australian states of Victoria and New South Wales. The curve was derived by comparing one year of historical QPF forecasts with the corresponding daily rainfall analyses (taken as the "truth") for that year. A companion map shows the probability of rainfall exceeding 0.2 mm computed using the QPF method (24 hour forecast valid at 00:00 UTC; colours indicate percentages from 0% (dark purple) to 100% (light pink)).
Two advantages of the QPF method are:
- Probabilistic forecasts generated using the QPF method are calibrated.
- A continuous range of probabilities is generated, whereas the vote counting method produces only a discrete set of possible probabilities determined by the number of ensemble members.
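The poster does not give the functional form of the QPF-to-probability curve. Sloughter et al. (2007) use logistic regression on a power transform (cube root) of the forecast rainfall for their probability-of-precipitation component, so the sketch below fits a logistic curve to the ensemble-mean QPF as one plausible way to reproduce such a relationship. The variable names, the cube-root choice and the use of scikit-learn are assumptions for illustration, not the operational implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_qpf_probability_curve(mean_qpf, observed_rain, threshold=0.2):
    """Fit a curve relating ensemble-mean QPF to P(rain > threshold).

    mean_qpf:      1-D array of historical PME-average QPFs (mm)
    observed_rain: 1-D array of analysed daily rainfall (mm) at the same
                   points/times (taken as the "truth")
    """
    X = np.cbrt(mean_qpf).reshape(-1, 1)           # cube-root transform of the QPF
    y = (observed_rain > threshold).astype(int)    # 1 where the event occurred
    return LogisticRegression().fit(X, y)

def qpf_probability(model, mean_qpf):
    """Probability of exceeding the threshold for new ensemble-mean QPFs."""
    X = np.cbrt(np.asarray(mean_qpf, dtype=float)).reshape(-1, 1)
    return model.predict_proba(X)[:, 1]
```

Because such a curve is fitted directly to a year of forecast-analysis pairs, the probabilities read off it are calibrated by construction over that training period, and they vary continuously with the QPF.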
Verification
Forecasts of the probability of precipitation exceeding 0.2 mm over the Australian states of Victoria and New South Wales were prepared for the period 1 September 2009 to 29 May 2011 inclusive using both the calibrated "vote counting" and QPF methods. The probabilistic forecasts were verified against the operational daily rainfall analyses (see the panel above), and reliability and relative operating characteristic (ROC) curves were created: reliability diagrams on top and ROC diagrams below, with vote-counting results on the left and QPF-method results on the right. The diagrams show that both methods produce reasonably reliable forecasts, and the area beneath the ROC curve is approximately the same for both methods, indicating comparable skill. A minimal sketch of how such reliability and ROC statistics can be computed appears at the end of this document.

Conclusions
Two methods of generating probabilistic forecasts from a "Poor Man's Ensemble" were investigated.
- The "vote counting" method is simple and can easily be applied to a variety of meteorological parameters such as rainfall, wind speed and temperature.
- A weakness of the "vote counting" method is that the probability forecasts are "quantised": for example, if the PME has 4 members, only probabilities of 0, 25, 50, 75 and 100% are possible.
- The "vote counting" method does not usually produce calibrated probability forecasts; an extra calibration step is required.
- The QPF method generates a continuous range of probabilities, even if there is only a single member in the PME.
- Forecasts made using the QPF method are calibrated.
- However, there is not much difference in forecast skill (as shown by the ROC and reliability diagrams) between the QPF and "vote counting" methods.

Future Work
- Apply the probabilistic forecasting methods to other types of weather, such as the probability of extreme temperature events.
- Improve the calibration techniques.

References
Sloughter, J. M., Raftery, A., Gneiting, T., and Fraley, C. (2007). Probabilistic quantitative precipitation forecasting using Bayesian model averaging. Monthly Weather Review, 135:3209–3220.

Acknowledgements
Thanks to Philip Riley for advice and ideas for probabilistic rainfall forecasting. The original PME rainfall forecasts in the Bureau of Meteorology were developed by Elizabeth Ebert and Chermelle Engel.
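Supplementary sketch: the reliability and ROC statistics summarised in the Verification panel can be reproduced from archived forecast-observation pairs roughly as follows. This is an illustrative outline only; the bin count and variable names are assumptions, and it is not the verification code used for the poster.

```python
import numpy as np

def reliability_curve(probs, outcomes, n_bins=10):
    """Points for a reliability diagram: mean forecast probability versus
    observed event frequency within each forecast-probability bin."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    bins = np.clip(np.digitize(probs, edges) - 1, 0, n_bins - 1)
    fcst_mean, obs_freq = [], []
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            fcst_mean.append(probs[mask].mean())
            obs_freq.append(outcomes[mask].mean())
    return np.array(fcst_mean), np.array(obs_freq)

def roc_area(probs, outcomes):
    """Area under the ROC curve, computed as the probability that a randomly
    chosen event receives a higher forecast probability than a randomly
    chosen non-event (ties counted as one half)."""
    events = outcomes.astype(bool)
    pos, neg = probs[events], probs[~events]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties
```

Applying these two functions to each method's forecasts over the verification period gives the reliability points and ROC areas that are compared side by side in the poster's diagrams.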