Page 1 © Crown copyright 2004. Presentation to ECMWF Forecast Product User Meeting, 16th June 2005.
Page 2 Met Office use and verification of the monthly forecasting system. Bernd Becker, Richard Graham. Outline: the Met Office monthly forecast suite; products from the Monthly Outlook; Standard Verification System.
Page 3 Monthly Forecasting System
Coupled ocean-atmosphere integrations: a 51-member ensemble is integrated for 32 days every week.
Atmospheric component: IFS with the latest operational cycle 29r1, at TL159L40 resolution (320 x 161 grid).
Oceanic component: HOPE (from the Max Planck Institute), with a zonal resolution of 1.4 degrees and 29 vertical levels.
Coupling: OASIS (CERFACS), coupling every ocean time step (1 hour).
Perturbations: atmosphere - singular vectors plus stochastic physics; ocean - SST perturbations in the initial conditions plus wind stress perturbations during data assimilation.
Hindcast statistics: a 5-member ensemble integrated over 32 days for each of the past 12 years, giving a 60-member hindcast ensemble; run every week.
Page 4 FORMOST global PDF data: properties of the quintile PDF. [Figure: forecast and hindcast PDFs with quintile boundaries qb1-qb4, quintile means qm1-qm5, hindcast min/max and forecast min/mean/max; categories: well below, below, normal, above, well above.]
Page 5 FORMOST Monthly Outlook
Page 6 Example: UK day temperature forecast for 10 climate districts. Deterministic forecast (based on most probable category or ensemble mean); probability forecast; verification. "…a sudden change to below or well-below average temperatures is expected…" (forecast text issued 11th Feb for week of Feb).
Page 7 Example: global capability tercile probability forecast - Europe, days 12-18. Verification (ECMWF operations). [Maps: P(abv), P(avg), P(blw) for Precip and Tmean.]
Page 8 [Figure: observed T anomaly for Jan alongside the week 1, week 2 and weeks 3&4 forecasts.]
Page 9 Verification: May-April forecast/observation pairs of temperature and precipitation. Verifying observations: ECMWF short-range (12-36 hrs) forecasts over the period. Global forecasts: Relative Operating Characteristics for quintile forecasts; reliability diagram; Brier skill score decomposition.
Page 10 Relative Operating Characteristics
ROC measures the ability of the forecast to discriminate between two alternative outcomes, i.e. it measures resolution. It is not sensitive to bias in the forecast, so it says nothing about reliability. A biased forecast may still have good resolution and produce a good ROC curve, which means it may be possible to improve the forecast through calibration. The ROC can thus be considered a measure of potential usefulness. The ROC is conditioned on the observations (i.e., given that bin X occurred, what was the corresponding forecast?). The area under the ROC curve indicates skill above climatology when it exceeds 0.5.
POD = H/(H+M); POFD = FA/(FA+CR)
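The POD/POFD definitions above can be sketched in a few lines. This is an illustrative example with made-up counts, not the operational code; the real system evaluates these per grid point and per quintile category.

```python
# Minimal sketch of the ROC quantities defined on this slide.

def pod_pofd(hits, misses, false_alarms, correct_rejections):
    """Probability of detection and probability of false detection."""
    pod = hits / (hits + misses)                                # POD = H/(H+M)
    pofd = false_alarms / (false_alarms + correct_rejections)   # POFD = FA/(FA+CR)
    return pod, pofd

def roc_area(points):
    """Trapezoidal area under the ROC curve through (POFD, POD) points,
    closed with the (0,0) and (1,1) endpoints."""
    pts = sorted([(0.0, 0.0)] + list(points) + [(1.0, 1.0)])
    return sum((x1 - x0) * (y0 + y1) / 2.0
               for (x0, y0), (x1, y1) in zip(pts, pts[1:]))

# One probability threshold yields one (POFD, POD) point; sweeping the
# threshold traces the full ROC curve.
pod, pofd = pod_pofd(hits=30, misses=10, false_alarms=20, correct_rejections=40)
area = roc_area([(pofd, pod)])
print(pod, pofd, area)  # area above 0.5 indicates skill above climatology
```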
Page 11 [ROC map: ROC score for temperature well below normal, 19 to 32 days ahead.]
Page 12 Brier Score
BS = (1/N) Sum_k (p_k - o_k)^2, the mean squared error of the probability forecast.
Decomposition: BS = Reliability - Resolution + Uncertainty.
Reliability: conditional bias; the weighted vertical distance of the reliability curve from the diagonal.
Resolution: the difference between the conditional event frequencies and the climatological frequency. The larger (the better), the more often the forecast PDF is steeper and narrower than the climatological PDF; the flatter the reliability curve, the less resolution (a flat sharpness plot).
Uncertainty: the variance of the observations, e.g. 0.2 * (1 - 0.2) = 0.16; always >= Resolution.
Brier skill score: BSS = 1 - BS/BS_clim, where BS_clim = Uncertainty (the climatological forecast has zero Reliability and Resolution terms).
HR = H/(H+FA)
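The decomposition above (Murphy's BS = Reliability - Resolution + Uncertainty) can be checked numerically. A minimal sketch with illustrative data, binning by the distinct forecast probability values; an operational implementation would bin a continuous probability axis.

```python
# Sketch of the Brier score decomposition: BS = REL - RES + UNC.
from collections import defaultdict

def brier_decomposition(probs, outcomes):
    """probs: forecast probabilities; outcomes: 0/1 event occurrences."""
    n = len(probs)
    base_rate = sum(outcomes) / n
    bins = defaultdict(list)
    for p, o in zip(probs, outcomes):
        bins[p].append(o)                      # group by forecast value
    # Reliability: weighted distance of the reliability curve to the diagonal.
    rel = sum(len(os) * (p - sum(os) / len(os)) ** 2
              for p, os in bins.items()) / n
    # Resolution: spread of conditional event frequencies about the base rate.
    res = sum(len(os) * (sum(os) / len(os) - base_rate) ** 2
              for os in bins.values()) / n
    # Uncertainty: variance of the observations, base_rate * (1 - base_rate).
    unc = base_rate * (1.0 - base_rate)
    bs = sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / n
    return bs, rel, res, unc

probs = [0.2, 0.2, 0.8, 0.8, 0.8]
obs   = [0,   1,   1,   1,   0]
bs, rel, res, unc = brier_decomposition(probs, obs)
print(bs, rel - res + unc)   # the two values agree
```

With BS_clim = unc, the skill score follows as BSS = 1 - bs / unc.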
Page 13 Monthly verification (Europe): Tmean ROC & reliability, all seasons. Based on 93 operational forecasts (all seasons); verification data = ECMWF T. [Plots: POD = H/(H+M); HR = H/(H+FA).]
Page 14 Monthly verification (Europe): Precip ROC & reliability, all seasons. [Plots: POD = H/(H+M); POFD = FA/(FA+CR); HR = H/(H+FA).]
Page 15 Conclusion
The monthly forecast model runs are produced at ECMWF; products are derived operationally at the Met Office.
The Standardised Verification System (SVS) for Long-Range Forecasts (LRF) is beginning to take shape.
Forecasts for day are as useful as climatology.
Predictions of quintiles 1 and 5 are more skilful than those of quintiles 2 to 4.
Europe is a difficult region to predict at long range.
Weather uncertainty imposes a fundamental limit on the sharpness of probabilistic forecasts.
The Monthly Outlook is a powerful tool for providing forecast guidance up to a month ahead in many areas.
Page 16 THANK YOU! Richard Graham, Margaret Gordon, Andrew Colman
Page 17 The Met Office. We are one of the world's leading providers of environmental and weather-related services. Our solutions and services meet the needs of many communities of interest, from the general public, government and schools to civil aviation and almost every industry sector around the world.
Page 18 Desirable Attributes of Scores
An equitable score, without dependence on the forecast distribution, satisfies:
1. Rewards for correct forecasts are inversely proportional to their event frequencies.
2. Penalties for incorrect forecasts are directly proportional to their event frequencies.
3. Random forecasts and constant forecasts have the same (zero) skill.
Note: attributes 2 & 3 imply that all information in the contingency table is taken into account.
Page 19 Recap: Post-processing and Products
Data volume reduction before transfer to the Met Office - calculate:
1. Tercile/quintile boundaries from the hindcast ensemble
2. Tercile/quintile populations from the forecast ensemble
3. Maximum, mean and minimum from the forecast and from the hindcast
4. Forecast tercile/quintile averages
5. Averages in time over week 1, week 2 and weeks 3&4
UK forecast:
1. Interpolation to points representing UK climate regions
2. Calibration with historical UK climate region observations
3. Interpretation of the histogram (ensemble mean, or mode in cases with large spread) to derive the deterministic forecast tercile/quintile
4. Mapping the tercile/quintile average onto the calibration PDF to derive the deterministic forecast value
Global forecast:
1. Tercile/quintile probabilities
2. Calibration by overlaying tercile/quintile boundaries derived from 1989-1998 ERA40 data
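Steps 1 and 2 of the data-reduction chain can be sketched as follows. The sorting-based quantile rule and the member values here are illustrative assumptions; the operational interpolation scheme may differ.

```python
# Sketch: quintile boundaries from the hindcast ensemble, then quintile
# populations (category probabilities) from the forecast ensemble.

def quintile_boundaries(hindcast):
    """Four boundary values splitting the sorted hindcast into five
    equally populated bins (simple order-statistic rule)."""
    s = sorted(hindcast)
    n = len(s)
    return [s[(k * n) // 5] for k in range(1, 5)]

def quintile_populations(forecast, boundaries):
    """Fraction of forecast members falling in each of the five bins."""
    counts = [0] * 5
    for x in forecast:
        q = sum(x >= b for b in boundaries)   # bin index 0..4
        counts[q] += 1
    return [c / len(forecast) for c in counts]

hindcast = list(range(60))                # stand-in for a 60-member hindcast
bounds = quintile_boundaries(hindcast)
probs = quintile_populations([5, 20, 50, 55], bounds)
print(bounds, probs)
```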
Page 20 Key Points
Monthly Outlook production is a process split between ECMWF and the Met Office.
The Monthly Outlook is operational.
The Standardised Verification System is shaping up.
Page 21 Most probable quintile
Low spread: the ensemble mean is a good best estimate; high confidence.
High spread: the ensemble mean is wrong in 80% of all cases.
High spread with delta probability > 5%: the most probable quintile is the best estimate; hedge by using the above/below quintile.
Page 22 FORMOST verification, week 2. [Plot: deterministic temperature forecast against observed, with quintile boundaries.]
Page 23 FORMOST verification, weeks 3&4: UK average temperature. [Plot.]
Page 24 FORMOST verification, weeks 3&4: UK average temperature. [Plot.]
Page 25 Monthly forecast: 3-category probabilities (probability of above, of average, of below). Issued: 9th June 2005; valid: 20th to 26th June. [Maps: temperature and precipitation.]
Page 26 Grid point diagnostics
Properties of a contingency table per grid point. Quintile categories: 1 = well below normal (wbn), 2 = below normal (bn), 3 = normal (n), 4 = above normal, 5 = well above normal (wan); forecast probability from 0 to 100%.
Hit: Q == Qobs and P(Q) >= Pthresh
Miss: Q == Qobs and P(Q) < Pthresh
False alarm: Q != Qobs and P(Q) >= Pthresh
Correct rejection: Q != Qobs and P(Q) < Pthresh
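The four cases above reduce to a small classification rule. A minimal sketch with hypothetical (probability, forecast quintile, observed quintile) pairs; the function and variable names are illustrative, not from the operational suite.

```python
# Sketch of the per-grid-point contingency classification: for a verified
# quintile Q with forecast probability P(Q), compare against the observed
# quintile and a probability threshold.

def classify(forecast_prob, forecast_quint, observed_quint, p_thresh):
    """Return 'H', 'M', 'FA' or 'CR' for one forecast/observation pair."""
    event = forecast_quint == observed_quint
    warned = forecast_prob >= p_thresh
    if event and warned:
        return "H"    # hit
    if event and not warned:
        return "M"    # miss
    if not event and warned:
        return "FA"   # false alarm
    return "CR"       # correct rejection

# Hypothetical pairs: (P(Q), forecast quintile, observed quintile).
pairs = [(0.6, 5, 5), (0.1, 5, 5), (0.7, 5, 3), (0.2, 5, 2)]
counts = {"H": 0, "M": 0, "FA": 0, "CR": 0}
for p, q, qobs in pairs:
    counts[classify(p, q, qobs, p_thresh=0.5)] += 1
print(counts)
```

These counts feed directly into the POD, POFD and HR ratios used on the ROC and reliability slides.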
Page 27 Europe, T: 19-day lead time, 2-week period. [ROC plot: POD = H/(H+M); HR = H/(H+FA).]
Page 28 Europe, Precip: 19-day lead time, 2-week period. [ROC plot: POD = H/(H+M); HR = H/(H+FA).]