© Crown copyright Met Office Operational OpenRoad verification Presented by Robert Coulson.

© Crown copyright Met Office Operational OpenRoad verification
- Introduction
- Verification Methodology
- Temporal Verification
- Economic skill in forecasts
- Questions and Answers

© Crown copyright Met Office Operational OpenRoad Verification

OpenRoad is a weather forecasting package that helps road maintenance decision makers plan for, manage and minimise the effects of winter weather. Timely road travel is essential to economies worldwide; cold winter climates affect safety, and dangerous travelling conditions cause delays if not acted upon. The prediction and prevention of ice formation is therefore imperative to mitigate these risks. This presentation shows examples of the verification carried out for OpenRoad Classic.

The benefits of winter maintenance have been estimated at approximately eight times the cost of prevention. The UK, which experiences periods of marginal temperatures during the winter months, has historically spent an estimated £150 million per year (about £1,250 per km) on winter maintenance (SIRWEC 2006). World expenditure was put at £6 billion (€10 billion) (SIRWEC 2006), so the realised benefit is clearly material. There is thus a strong commercial incentive for customers to use accurate weather forecasts in order to save money, and strict forecast accuracy criteria are agreed in customer contracts.

Site-specific verification: more than 300 sites are verified. Observations from roadside sensors are used as truth against which the forecasts are verified. A large sample of matching forecast/observation pairs is collected over individual months, and over the whole winter season, for each site to ensure the validation is reliable.

© Crown copyright Met Office Verification Methodology

Categorical verification of road surface frosts (2x2 table):

                       Observed Frost    No Observed Frost
   Forecast Frost      Hit (a)           False Alarm (b)
   No Forecast Frost   Miss (c)          Correct Rejection (d)

The critical event is a road surface temperature less than or equal to zero degrees Celsius, at which ice can form. The observed and forecast road surface temperatures between 12 UTC and 12 UTC the next day are collated, and a single minimum forecast/observed temperature pair (Ob, Fc) is recorded and stored for this period.
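The categorical frost verification above can be sketched in a few lines of Python. This is an illustration, not the operational OpenRoad code; the function name and sample temperature pairs are invented for the example.

```python
def contingency_counts(pairs, threshold=0.0):
    """Count hits, false alarms, misses and correct rejections from
    (observed_min, forecast_min) road surface temperature pairs.
    The critical event is a minimum temperature <= threshold (frost)."""
    a = b = c = d = 0
    for ob, fc in pairs:
        obs_frost = ob <= threshold
        fc_frost = fc <= threshold
        if fc_frost and obs_frost:
            a += 1          # hit: frost forecast and observed
        elif fc_frost and not obs_frost:
            b += 1          # false alarm: frost forecast, none observed
        elif not fc_frost and obs_frost:
            c += 1          # miss: frost observed, none forecast
        else:
            d += 1          # correct rejection
    return a, b, c, d

# Illustrative nightly minimum (observed, forecast) pairs in deg C
pairs = [(-1.2, -0.5), (0.4, -0.1), (1.5, 2.0), (-0.3, 0.6)]
a, b, c, d = contingency_counts(pairs)
hit_rate = a / (a + c)            # fraction of observed frosts forecast
false_alarm_rate = b / (b + d)    # fraction of non-frost nights falsely warned
```

Each night contributes exactly one pair, matching the single (Ob, Fc) minimum recorded per 12 UTC to 12 UTC period.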

© Crown copyright Met Office Communication to end users

2x2 table:

                       Observed Frost    No Observed Frost
   Forecast Frost      Hit (a)           False Alarm (b)
   No Forecast Frost   Miss (c)          Correct Rejection (d)

© Crown copyright Met Office Temporal Verification Experiments have shown that four times more salt is required to melt snow and ice than to prevent its initial formation on road surfaces. Conversely, if the treating agent is applied to the road surface too soon, traffic and precipitation may remove it. The predicted time of road surface temperature threshold crossing is critical for effective and economical maintenance of transport networks.

© Crown copyright Met Office N by M contingency table The road surface sensors allow the actual state of the road surface to be deduced, so a time-series profile of observed road state can be verified against the forecast road state. There is uncertainty in matching observations to forecasts, because the road surfaces are being treated live during the verification period.

© Crown copyright Met Office Economic skill values for forecasts

Relative economic value

   Action taken?    Frost Observed    No Frost Observed
   Treated          Cost1 incurred    Cost2 incurred
   NOT treated      Loss incurred     Null cost

Based on the paper by D.S. Richardson, "Skill and relative economic value of the ECMWF ensemble prediction system", Q.J.R. Meteorol. Soc. (2000), 126. This defines the relative value V of a forecast system using

   V = [ME(climate) - ME(forecast)] / [ME(climate) - ME(perfect)]

where ME(climate) is the mean expense of using climate as a forecast, ME(forecast) the mean expense of using the forecasts, and ME(perfect) the mean expense of perfect forecasts. The cost-loss ratio is (cost of taking the action)/(potential loss protected by the action); often the potential loss protected by the action is taken to equal the total loss that would otherwise have occurred.

V = 1 for perfect forecasts, and the user obtains benefit from the forecasts whenever V > 0. If perfect knowledge of the future weather would save the user an amount S (over the use of purely climatological information), then a forecast system with relative value V saves the user 100V% of S.

The maximum value of V is given by the Kuipers performance index = Hit Rate - False Alarm Rate, also known as the Peirce skill score. It is an equitable skill score, meaning that constant forecasts and random forecasts both score zero, and it has no dependence on the sample climate. The score ranges from -1 to +1, with values closer to 1 indicating better forecasts.
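The relative value and Peirce skill score above can be sketched directly from the 2x2 counts. This is a minimal illustration under the standard cost/loss model (treat at cost C, or risk loss L); function names and the example counts are invented, and the climate baseline is taken as the cheaper of always treating and never treating, as in Richardson (2000).

```python
def relative_value(a, b, c, d, cost, loss):
    """Relative economic value V, per unit event, from 2x2 counts and a
    user's cost/loss pair. Illustrative sketch, not operational code."""
    n = a + b + c + d
    s = (a + c) / n                      # climatological frost frequency
    me_climate = min(cost, s * loss)     # cheaper of always/never treating
    me_forecast = (a + b) / n * cost + (c / n) * loss
    me_perfect = s * cost                # treat only when frost occurs
    return (me_climate - me_forecast) / (me_climate - me_perfect)

def peirce_skill_score(a, b, c, d):
    """Kuipers performance index / Peirce skill score = H - F."""
    return a / (a + c) - b / (b + d)

# Example: 100 nights, cost of salting 1 unit, loss if unprotected 8 units
v = relative_value(45, 5, 5, 45, cost=1, loss=8)
kss = peirce_skill_score(45, 5, 5, 45)
```

With these example counts the forecasts have positive value (V > 0), so the user benefits relative to the climatological strategy.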

© Crown copyright Met Office Economic skill values for forecasts

Marginal night analysis

For marginal nights, the mean expense from climatology is replaced with the expense of salting every marginal night, using a cost-to-loss ratio (Thornes 1999), since the benefits of winter maintenance have been estimated at about eight times the cost.

The Value Index is based on the one used by Thornes and Stephenson (2001):

   ValueIndex = {E(S) - E(A)} / {E(S) - E(P)}

where E(S) is the expense of salting every marginal night, E(A) the expense of salting according to the actual forecasts, and E(P) the expense with perfect forecasts.

The cost:loss ratio is the cost of taking action (i.e. salting the roads) as a fraction of the loss incurred if no action is taken (i.e. the roads are not salted and accidents and delays occur), taking into account the savings made by not salting. It is assumed that 0 < cost:loss < 1.

   ValueIndex(AlwaysProtect) = {[(c+d) - (c/(cost:loss))] / [b+d]}

Cost-loss models are evaluated as a consequence of this verification.
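The Value Index above can be computed from the same 2x2 counts over marginal nights. A minimal sketch, assuming each contingency cell contributes its expense at the per-night cost/loss given; the function name and example figures are illustrative, not the Thornes and Stephenson implementation.

```python
def value_index(a, b, c, d, cost, loss):
    """Marginal-night Value Index, ValueIndex = {E(S)-E(A)}/{E(S)-E(P)}:
    savings of the actual forecasts relative to salting every marginal
    night, scaled by the savings a perfect forecast would achieve."""
    n = a + b + c + d
    e_s = n * cost                    # E(S): salt every marginal night
    e_a = (a + b) * cost + c * loss   # E(A): salt when frost is forecast
    e_p = (a + c) * cost              # E(P): salt only when frost occurs
    return (e_s - e_a) / (e_s - e_p)

# Example: 100 marginal nights, cost 1 unit per salting, loss 8 units
vi = value_index(45, 5, 5, 45, cost=1, loss=8)
```

A ValueIndex of 1 means the forecasts save as much as perfect foresight would; 0 means they save nothing over salting every marginal night.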

© Crown copyright Met Office Questions and answers