Page 1 © Crown copyright 2004. Presentation to the ECMWF Forecast Product User Meeting, 16th June 2005.
Page 2. Met Office use and verification of the monthly forecasting system. Bernd Becker, Richard Graham.
Outline: the Met Office monthly forecast suite; products from the Monthly Outlook; Standard Verification System.
Page 3. Monthly Forecasting System
Coupled ocean-atmosphere integrations: a 51-member ensemble is integrated for 32 days every week.
Atmospheric component: IFS at the latest operational cycle, 29r1, with TL159L40 resolution (320 x 161 grid points).
Oceanic component: HOPE (Max Planck Institute), with a zonal resolution of 1.4 degrees and 29 vertical levels.
Coupling: OASIS (CERFACS), coupling every ocean time step (1 hour).
Perturbations: atmosphere: singular vectors plus stochastic physics; ocean: SST perturbations in the initial conditions plus wind-stress perturbations during data assimilation.
Hindcast statistics: a 5-member ensemble integrated over 32 days for each of the past 12 years, giving a 60-member hindcast ensemble; run every week.
Page 4. FORMOST global PDF data: properties of the quintile PDF.
[Figure: forecast and hindcast PDFs, showing the hindcast min/max, the quintile boundaries qb1-qb4, the forecast min/mean/max, and the quintile means qm1-qm5, defining the categories well below, below, normal, above, well above.]
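The quintile construction on this slide can be sketched in a few lines: boundaries come from the hindcast ensemble, category probabilities from the forecast ensemble. This is a minimal illustration, not the operational suite; the function names and the Gaussian toy data are mine.

```python
import numpy as np

def quintile_boundaries(hindcast):
    """qb1..qb4: the 20/40/60/80 percentiles of the pooled hindcast ensemble."""
    return np.percentile(hindcast, [20, 40, 60, 80])

def quintile_probabilities(forecast, boundaries):
    """Fraction of forecast members falling in each of the five quintile bins."""
    bins = np.concatenate(([-np.inf], boundaries, [np.inf]))
    counts, _ = np.histogram(forecast, bins=bins)
    return counts / len(forecast)

rng = np.random.default_rng(0)
hindcast = rng.normal(0.0, 1.0, 60)  # stand-in for the 60-member hindcast
forecast = rng.normal(0.8, 0.7, 51)  # stand-in for a warm-shifted 51-member forecast
qb = quintile_boundaries(hindcast)
probs = quintile_probabilities(forecast, qb)
print(probs)  # probabilities for well below .. well above
```

A warm-shifted forecast ensemble, as here, loads probability into the upper quintiles relative to the 20% climatological chance of each category.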
Page 5. FORMOST Monthly Outlook.
Page 6. Example UK 12-18 day temperature forecast for 10 climate districts: deterministic forecast (based on the most probable category or the ensemble mean), probability forecast, and verification. "…a sudden change to below or well-below average temperatures is expected…" (forecast text issued 11th Feb for the week 21-27 Feb).
Page 7. Example global-capability tercile probability forecast: Europe, days 12-18. Verification (ECMWF operations). Panels: P(above), P(average), P(below) for precipitation and mean temperature.
Page 8. [Maps: observed temperature anomaly for 24-30 Jan, with forecasts for week 1, week 2 and weeks 3&4.]
Page 9. Verification, May 2002 to April 2005
93 forecast/observation pairs of temperature and precipitation.
Verifying observations: ECMWF short-range (12-36 h) forecasts over the period.
Global forecasts: relative operating characteristics for quintile forecasts; reliability diagrams; Brier skill score decomposition.
Page 10. Relative Operating Characteristics
ROC measures the ability of the forecast to discriminate between two alternative outcomes, i.e. its resolution. It is not sensitive to bias in the forecast, so it says nothing about reliability: a biased forecast may still have good resolution and produce a good ROC curve, meaning it may be possible to improve the forecast through calibration. The ROC can thus be considered a measure of potential usefulness.
The ROC is conditioned on the observations (given that bin X occurred, what was the corresponding forecast?).
An area under the ROC curve greater than 0.5 indicates skill above climatology.
POD = H/(H+M), POFD = FA/(FA+CR).
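The ROC construction described above can be sketched as follows: sweep a probability threshold, count hits/misses/false alarms/correct rejections at each, and integrate POD against POFD. The synthetic forecast/observation data and the 11-threshold choice are illustrative assumptions, not from the verification suite.

```python
import numpy as np

def roc_points(prob, event, thresholds):
    """POD and POFD at each probability threshold.

    prob  : forecast probability of the event (e.g. P(well below normal))
    event : 1 if the event was observed, 0 otherwise
    """
    pods, pofds = [], []
    for t in thresholds:
        warn = prob >= t
        h = np.sum(warn & (event == 1))    # hits
        m = np.sum(~warn & (event == 1))   # misses
        fa = np.sum(warn & (event == 0))   # false alarms
        cr = np.sum(~warn & (event == 0))  # correct rejections
        pods.append(h / (h + m))           # POD = H/(H+M)
        pofds.append(fa / (fa + cr))       # POFD = FA/(FA+CR)
    return np.array(pods), np.array(pofds)

rng = np.random.default_rng(1)
event = rng.integers(0, 2, 500)
# a forecast with some discrimination: higher probabilities when the event occurs
prob = np.clip(0.3 * event + rng.uniform(0, 0.7, 500), 0, 1)
pod, pofd = roc_points(prob, event, np.linspace(0, 1, 11))
# trapezoidal area under the curve (points reversed so POFD is ascending)
x, y = pofd[::-1], pod[::-1]
area = float(np.sum(np.diff(x) * (y[1:] + y[:-1]) / 2))
```

An area above 0.5 for this toy forecast reflects exactly the slide's point: discrimination exists even though the forecast was built with a bias.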
Page 11. ROC score map: temperature, "well below normal" category, 19 to 32 days ahead.
Page 12. Brier Score
BS = (1/N) Sum_k (p_k - o_k)^2, the mean squared error of the probability forecast.
Decomposition: BS = Reliability - Resolution + Uncertainty.
Reliability: conditional bias; the weighted vertical distance of the reliability curve from the diagonal.
Resolution: how far the conditional observed frequencies depart from the climatological frequency. The larger the better: the forecast pdf is then more often sharper and narrower than the climatological pdf. A flat reliability curve means low resolution.
Uncertainty: the variance of the observations; for a quintile event, 0.2 x (1 - 0.2) = 0.16. Uncertainty is always >= Resolution.
Skill score: BSS = 1 - BS/BS_clim, where BS_clim = Reliability(=0) - Resolution(=0) + Uncertainty = Uncertainty.
HR = H/(H+FA).
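The decomposition on this slide can be computed directly by grouping verification pairs by their distinct forecast probability values, for which the identity BS = REL - RES + UNC holds exactly. The toy data below (a 0.2-climatology event, matching the quintile example) is mine, not from the 93-forecast sample.

```python
import numpy as np

def brier_decomposition(prob, obs):
    """Brier score and its Murphy decomposition BS = REL - RES + UNC.

    prob : forecast probabilities of the event
    obs  : 1 if the event occurred, 0 otherwise
    """
    prob, obs = np.asarray(prob, float), np.asarray(obs, float)
    n, base = len(prob), obs.mean()
    rel = res = 0.0
    for p in np.unique(prob):          # group by distinct forecast values
        mask = prob == p
        nk, ok = mask.sum(), obs[mask].mean()
        rel += nk / n * (p - ok) ** 2      # conditional bias
        res += nk / n * (ok - base) ** 2   # departure from climatology
    unc = base * (1 - base)                # variance of the observations
    bs = float(np.mean((prob - obs) ** 2))
    return bs, rel, res, unc

rng = np.random.default_rng(2)
prob = rng.choice(np.arange(0, 1.05, 0.05), size=200)  # quintile-style probabilities
obs = (rng.uniform(size=200) < 0.2).astype(int)        # event with 0.2 climatology
bs, rel, res, unc = brier_decomposition(prob, obs)
bss = 1 - bs / unc  # BS_clim = Uncertainty for a climatological forecast
```

Note how BS_clim reduces to the Uncertainty term alone, since a climatological forecast has zero reliability penalty and zero resolution.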
Page 13. Monthly verification (Europe): Tmean ROC and reliability, all seasons, days 12-18. Based on 93 operational forecasts (all seasons); verification data: ECMWF T+12 to T+36. POD = H/(H+M), HR = H/(H+FA).
Page 14. Monthly verification (Europe): Precip ROC and reliability, all seasons, days 12-18. POD = H/(H+M), POFD = FA/(FA+CR), HR = H/(H+FA).
Page 15. Conclusion
The monthly forecast model runs are produced at ECMWF; products are derived operationally at the Met Office.
The Standardised Verification System (SVS) for Long-Range Forecasts (LRF) is beginning to take shape.
Forecasts for days 19-32 are as useful as climatology.
Predictions of quintiles 1 and 5 are more skilful than those of quintiles 2 to 4.
Europe is a difficult region to predict at long range.
Weather uncertainty imposes a fundamental limit on the sharpness of probabilistic forecasts.
The Monthly Outlook is a powerful tool for providing forecast guidance up to a month ahead in many areas.
Page 16. THANK YOU! Richard Graham, Margaret Gordon, Andrew Colman.
Page 17. The Met Office. We are one of the world's leading providers of environmental and weather-related services. Our solutions and services meet the needs of many communities of interest, from the general public, government and schools to civil aviation and almost every industry sector around the world.
Page 18. Desirable attributes of scores
Equitable, without dependence on the forecast distribution:
1. Rewards for correct forecasts are inversely proportional to their event frequencies.
2. Penalties for incorrect forecasts are directly proportional to their event frequencies.
3. Random forecasts and constant forecasts have the same (zero) skill.
Note: points 2 and 3 imply that all information in the contingency table is taken into account.
Page 19. Recap: post-processing and products
Data volume reduction before transfer to the Met Office. Calculate:
1. Tercile/quintile boundaries from the hindcast ensemble.
2. Tercile/quintile populations from the forecast ensemble.
3. Maximum, mean and minimum from the forecast and from the hindcast.
4. Forecast tercile/quintile averages.
5. Averages in time for week 1, week 2 and weeks 3&4.
UK forecast (http://www.bbc.co.uk/weather/ukweather/monthly_outlook.shtml):
1. Interpolation to points representing UK climate regions.
2. Calibration with historical UK climate-region observations.
3. Interpretation of the histogram (ensemble mean, or mode in cases with large spread) to derive the deterministic forecast tercile/quintile.
4. Mapping the tercile/quintile average onto the calibration PDF to derive a deterministic forecast value.
Global forecast:
1. Tercile/quintile probabilities.
2. Calibration by overlaying tercile/quintile boundaries derived from 1989-1998 ERA40 data.
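Step 5 of the data-reduction list, averaging in time to week 1, week 2 and weeks 3&4, can be sketched as below. The exact day windows are not given on the slide, so the ranges here are illustrative assumptions (the deck elsewhere uses days 12-18 and 19-32).

```python
import numpy as np

# Assumed day windows into the 32-day integration; the precise Met Office
# definitions are not stated on the slide.
WINDOWS = {"week1": (5, 11), "week2": (12, 18), "weeks3&4": (19, 32)}

def time_average(daily, windows=WINDOWS):
    """Average daily fields (members x 32 days) over each forecast window."""
    return {name: daily[:, d0 - 1:d1].mean(axis=1)
            for name, (d0, d1) in windows.items()}

daily = np.tile(np.arange(1.0, 33.0), (51, 1))  # toy field: the day number itself
means = time_average(daily)
print(means["week2"][0])  # mean over days 12..18 -> 15.0
```

Reducing each member's 32 daily fields to three window means is what keeps the data volume transferred to the Met Office small.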
Page 20. Key Points
The Monthly Outlook is a process split between ECMWF and the Met Office.
The Monthly Outlook is operational.
The Standardised Verification System is taking shape.
Page 21. Most probable quintile
Low spread: the ensemble mean is a good best estimate; high confidence.
High spread: the ensemble mean is wrong in 80% of all cases.
High spread with a probability margin > 5%: the most probable quintile is the best estimate.
Otherwise, hedge by use of the quintile above/below.
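The decision rule on this slide can be sketched as a small function. The spread threshold, and the interpretation of "hedge" as a broader below/above-normal statement, are my illustrative assumptions; only the 5% probability margin comes from the slide.

```python
import numpy as np

def best_estimate(probs, spread, spread_thresh=1.0, delta=0.05):
    """Sketch of the slide's decision rule (spread_thresh is illustrative).

    probs  : the five quintile probabilities (indices 0..4)
    spread : ensemble standard deviation
    """
    order = np.argsort(probs)[::-1]  # quintiles, most probable first
    if spread < spread_thresh:
        # low spread: the ensemble mean is a good best estimate
        return "ensemble mean", None
    if probs[order[0]] - probs[order[1]] > delta:
        # high spread, but one quintile is clearly most probable
        return "most probable quintile", int(order[0])
    # otherwise hedge with a broader below/above-normal statement
    if probs[:2].sum() > probs[3:].sum():
        return "hedge: below normal", None
    return "hedge: above normal", None

print(best_estimate(np.array([0.1, 0.1, 0.1, 0.2, 0.5]), spread=2.0))
# -> ('most probable quintile', 4)
```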
Page 22. FORMOST verification, week 2: [chart of observed and deterministic temperature against the quintile boundaries].
Page 23. FORMOST verification, weeks 3&4: UK average temperature.
Page 24. FORMOST verification, weeks 3&4: UK average temperature.
Page 25. Monthly forecast: three-category probabilities (probability of above, of average, of below) for temperature and precipitation. Issued 9th June 2005, valid 20th to 26th June.
Page 26. Grid-point diagnostics: properties of a contingency table per grid point.
[Figure: probability (0-100%) assigned to each quintile, 1 = well below normal (wbn) to 5 = well above normal (wan), at a grid point.]
Hit: Q = Qobs and P(Q) >= Pthresh
Miss: Q = Qobs and P(Q) < Pthresh
False alarm: Q != Qobs and P(Q) >= Pthresh
Correct rejection: Q != Qobs and P(Q) < Pthresh
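The four definitions above translate directly into code. This is a minimal sketch; the probability threshold value and the toy grid-point arrays are illustrative assumptions.

```python
import numpy as np

def contingency_counts(q_fx, p_fx, q_obs, p_thresh=0.4):
    """H/M/FA/CR as defined on the slide (p_thresh value is illustrative).

    q_fx  : forecast quintile at each grid point
    p_fx  : its forecast probability
    q_obs : observed quintile
    """
    warned = p_fx >= p_thresh
    correct = q_fx == q_obs
    h = int(np.sum(correct & warned))     # hit: right quintile, probability above threshold
    m = int(np.sum(correct & ~warned))    # miss: right quintile, probability below threshold
    fa = int(np.sum(~correct & warned))   # false alarm
    cr = int(np.sum(~correct & ~warned))  # correct rejection
    return h, m, fa, cr

h, m, fa, cr = contingency_counts(np.array([1, 2, 3, 4]),
                                  np.array([0.5, 0.3, 0.5, 0.3]),
                                  np.array([1, 2, 0, 0]))
pod = h / (h + m)      # POD  = H/(H+M)
hr = h / (h + fa)      # HR   = H/(H+FA)
pofd = fa / (fa + cr)  # POFD = FA/(FA+CR)
print(h, m, fa, cr)  # -> 1 1 1 1
```

These per-grid-point counts are what feed the POD, HR and POFD scores used on the European verification slides.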
"name": "Page 26© Crown copyright 2004 Grid point diagnostics Properties of a contingency table per grid point wan 5 QUINT 4 n 3 bn 2 wbn 1 H M FA CR 100 % probability 90 80 70 60 50 40 30 20 10 0 1 3 5 8 13 20 23 1 2 3 7 8 11 13 20 29 Hit : Q =Qobs & P(Q)>= Pthresh Miss: Q =Qobs & P(Q) < Pthresh False Alarm: Q != Qobs & P(Q)>= Pthresh Correct rejection: Q !=Qobs & P(Q)= Pthresh Miss: Q =Qobs & P(Q) < Pthresh False Alarm: Q != Qobs & P(Q)>= Pthresh Correct rejection: Q !=Qobs & P(Q)
Page 27. Europe, temperature: [ROC chart, 19-day lead time, 2-week period; POD = H/(H+M), HR = H/(H+FA)].
Page 28. Europe, precipitation: [ROC chart, 19-day lead time, 2-week period; POD = H/(H+M), HR = H/(H+FA)].