Translating verification experience from meteorology to space weather
Suzy Bingham, ESWW splinter session, Thursday 20th Nov.

Current space weather validation, verification & metrics
Validation: plans to work with CCMC on TEC validation.
Verification: plan to use experience with terrestrial methods to verify both models & forecaster warnings.
Application metrics/KPIs: terrestrial infrastructure has been extended to monitor space weather systems; plan to verify for stakeholders.
Met Office forecaster webpages: Enlil & REFM.
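A minimal Python sketch of what such TEC validation statistics might look like (the metric choice, function name and values are illustrative; the actual CCMC challenge protocol is not specified here):

    import numpy as np

    def tec_validation_stats(model_tec, observed_tec):
        """Basic validation statistics for total electron content (TEC).

        model_tec, observed_tec: 1-D arrays of co-located TEC values (TECU).
        Returns bias, RMSE and correlation; a hypothetical metric choice.
        """
        model_tec = np.asarray(model_tec, dtype=float)
        observed_tec = np.asarray(observed_tec, dtype=float)
        bias = np.mean(model_tec - observed_tec)
        rmse = np.sqrt(np.mean((model_tec - observed_tec) ** 2))
        corr = np.corrcoef(model_tec, observed_tec)[0, 1]
        return bias, rmse, corr

    # Made-up values for a European longitudinal-section case study:
    bias, rmse, corr = tec_validation_stats([12.1, 15.3, 18.0], [11.5, 16.0, 17.2])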

Verification of forecasts
Why verify: to understand quality/accuracy, to provide evidence for stakeholders, to give forecasters feedback, to drive further improvement, and to compare models/methods.
Space weather forecasters produce guidance twice daily. These forecasts include probability forecasts for geomagnetic storms, X-ray flares, high-energy protons & high-energy electrons.
[Example probability forecasts]
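A standard score for probability forecasts like these is the Brier score. The slide does not say which scores will be used for space weather; as an illustrative Python sketch:

    def brier_score(probabilities, outcomes):
        """Mean squared error of probability forecasts against binary outcomes.

        probabilities: forecast probabilities in [0, 1], e.g. P(X-ray flare).
        outcomes: 1 if the event occurred, 0 otherwise.
        Lower is better; 0 is a perfect score.
        """
        return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / len(probabilities)

    # e.g. four daily geomagnetic storm probabilities and what was observed:
    score = brier_score([0.1, 0.6, 0.3, 0.8], [0, 1, 0, 1])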

Verification of forecasts
Metrics/performance indicators used in terrestrial weather forecasting verification for stakeholders (with examples):
1. Severe weather warning accuracy (scored on impact level, area covered & validity time)
2. Forecast accuracy (e.g. daily minimum temperature accuracy should be within ±2°C)
3. Public value (“How useful are forecasts these days?”)
4. Public reach (“Have you seen any weather warnings in the last few days?”)
5. Service quality (timeliness scores for model delivery)
6. Emergency responder value (“How satisfied are you with the service?”)
7. Responder reach (availability of the Hazard Manager application to the emergency responder community)
8. National capability (e.g. 95% of lightning location data messages should be available within a certain time)
9. Milestone achievement (e.g. develop a national electronic weather data archive)
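As an illustration of metric 2 above, the fraction of forecasts within a ±2°C tolerance could be computed along these lines (data and function name are hypothetical; the operational definition may differ):

    def within_tolerance_rate(forecasts, observations, tolerance=2.0):
        """Fraction of forecasts within +/- tolerance of the observation."""
        hits = sum(1 for f, o in zip(forecasts, observations) if abs(f - o) <= tolerance)
        return hits / len(forecasts)

    # e.g. a week of daily minimum temperature forecasts vs observations (deg C):
    accuracy = within_tolerance_rate([3.0, 1.5, -0.5, 2.2, 4.1, 0.0, 1.0],
                                     [2.1, 3.9, -0.2, 2.0, 4.5, 2.5, 1.2])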

Verification of warnings: introducing the warnings verification system.
[Schematic: along the warning timeline (issue time, warning period, end time + late-hit period) and against the event and low event thresholds, outcomes are classed as HIT, EARLY HIT, LATE HIT, LOW HIT, EARLY LOW HIT, LATE LOW HIT, FALSE ALARM, MISS or NON-EVENT.]
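A minimal sketch of the classification logic this scheme implies, assuming a single warning with an issue time, a validity window, two thresholds and a late-hit grace period (all names and rules here are illustrative, not the operational implementation):

    def classify_warning(event, warning, late_hit_period=3.0):
        """Classify one warning/event pair, loosely following the scheme above.

        event: (time, magnitude) tuple, or None if nothing was observed.
        warning: dict with 'issue', 'start', 'end' times plus 'event_threshold'
                 and 'low_threshold' magnitudes, or None if no warning was issued.
        Times are in arbitrary consistent units.
        """
        if warning is None:
            # No thresholds available without a warning; treat any event as a miss.
            return "MISS" if event is not None else "NON-EVENT"
        if event is None:
            return "FALSE ALARM"
        time, magnitude = event
        if magnitude < warning["low_threshold"]:
            return "FALSE ALARM"  # warned, but no qualifying event occurred
        prefix = ""
        if warning["issue"] <= time < warning["start"]:
            prefix = "EARLY "     # event between issue time and validity start
        elif warning["end"] < time <= warning["end"] + late_hit_period:
            prefix = "LATE "      # event within the late-hit grace period
        elif not (warning["start"] <= time <= warning["end"]):
            return "FALSE ALARM"  # event fell entirely outside the warning
        low = "LOW " if magnitude < warning["event_threshold"] else ""
        return prefix + low + "HIT"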

Gale warning verification: St Jude's Storm, October 2013. A hit!

Performance: ROC plot
[ROC plot: performance of forecaster-issued Heavy Rainfall Alerts in 2012.]
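A ROC plot shows hit rate against false alarm rate as the alert probability threshold varies. A minimal sketch of how the underlying points could be computed (the threshold values are assumptions):

    def roc_points(probabilities, outcomes, thresholds=(0.2, 0.4, 0.6, 0.8)):
        """Hit rate (POD) and false alarm rate (POFD) at each probability threshold."""
        points = []
        for t in thresholds:
            hits = misses = false_alarms = correct_rejections = 0
            for p, o in zip(probabilities, outcomes):
                warned = p >= t
                if warned and o:
                    hits += 1
                elif warned:
                    false_alarms += 1
                elif o:
                    misses += 1
                else:
                    correct_rejections += 1
            pod = hits / (hits + misses) if hits + misses else 0.0
            pofd = (false_alarms / (false_alarms + correct_rejections)
                    if false_alarms + correct_rejections else 0.0)
            points.append((pofd, pod))  # one (x, y) point per threshold
        return points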

Application metrics: timely, reliable, robust
[Timeliness plot for the Global Model.]
Server monitoring verifies that the system is robust/reliable.
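One plausible timeliness measure of the kind plotted here is the fraction of model runs delivered within an agreed deadline (the deadline and inputs below are assumptions):

    def timeliness(delivery_minutes, deadline_minutes=240):
        """Percentage of runs delivered within the deadline after analysis time."""
        on_time = sum(1 for m in delivery_minutes if m <= deadline_minutes)
        return 100.0 * on_time / len(delivery_minutes)

    # e.g. minutes after T=0 at which each run's products arrived:
    pct = timeliness([190, 230, 250, 200, 220])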

Data latency
SOHO: EIT (4 channels), LASCO (C2, C3). LASCO C2 & C3: 12-min cadence, ~1-2 hrs latency (but data gaps, sometimes up to 5 hrs).
SDO: AIA (12 channels), HMI magnetogram.
STEREO A & B: COR1, COR2, EUVI. Cadence: COR1 1 hr, COR2 15 min, EUVI 10 min; latency 1-3 hrs (but currently very big data gaps/no data).
ENLIL: run every 2 hours; run completed / graphics produced around 4 hrs after model analysis time (i.e. T=0).
WMO coronagraph image requirements: OSCAR webpage.
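Given matched observation and arrival times, latency and gap statistics like those above could be monitored with something along these lines (inputs and cadence values are illustrative, not real feed data):

    def latency_and_gaps(obs_times, receipt_times, expected_cadence_min):
        """Mean latency (minutes) and any gaps exceeding twice the expected cadence.

        obs_times / receipt_times: matched lists of observation and arrival times,
        in minutes since some epoch, sorted by observation time.
        """
        latencies = [r - o for o, r in zip(obs_times, receipt_times)]
        gaps = [b - a for a, b in zip(obs_times, obs_times[1:])
                if b - a > 2 * expected_cadence_min]
        return sum(latencies) / len(latencies), gaps

    # e.g. a LASCO C2-like feed: 12-min cadence, ~90-min latency, one long gap
    mean_latency, gaps = latency_and_gaps([0, 12, 24, 300], [90, 100, 115, 390], 12)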

Met Office Business Performance Measures (BPMs)
The Met Office BPMs are: Forecast Accuracy, Growth, Reach, Customer & Service Delivery, Efficiency & Sustainability Excellence.
Verification underpins the Forecast Accuracy BPM, which is set by government. This BPM is to improve:
1. Global NWP Index
2. UK NWP Index
3. Public Forecasts
4. Customer Forecasts
The Global NWP Index is compiled from: mean sea-level pressure, 500 hPa height, 850 hPa wind, 250 hPa wind.
[Plot: increase in the Global NWP Index, showing an improvement in Global model accuracy.]
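The slide does not give the NWP Index formula; purely as an illustration, a composite index of this kind could weight each component field's skill against a baseline, along these lines:

    def nwp_index(component_skill, baseline_skill, weights=None):
        """Illustrative composite index: weighted mean skill relative to a baseline.

        component_skill / baseline_skill: dicts keyed by field, e.g. 'mslp',
        '500 hPa height', '850 hPa wind', '250 hPa wind'.
        NOT the Met Office formula; a hypothetical construction only.
        """
        fields = list(component_skill)
        if weights is None:
            weights = {f: 1.0 for f in fields}
        total = sum(weights[f] for f in fields)
        return 100.0 * sum(weights[f] * component_skill[f] / baseline_skill[f]
                           for f in fields) / total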

Summary
The Met Office is currently applying some validation, verification & metrics to space weather models/services (e.g. verifying the reliability of the service).
The Met Office is planning to take part in the CCMC TEC challenge, with initial longitudinal-section case studies over Europe.
The Met Office is planning to adapt terrestrial verification methods to space weather so that forecasters can understand warning accuracy (e.g. a flexible verification system).
The Met Office would like to provide simple metrics to stakeholders.

Useful links
WMO space weather observation requirements, Observing Systems Capability Analysis & Review Tool (OSCAR): http://www.wmo-sat.info/oscar/applicationareas/view/25
Forecast verification, Centre for Australian Weather & Climate Research: …urses/msgcrs/crsindex.htm
NOAA verification metrics glossary:

Questions and answers

Verification of forecasts
Metrics/performance indicators used in terrestrial weather forecasting verification for stakeholders:
1. Severe weather warning accuracy
2. Forecast accuracy
3. Public value
4. Public reach
5. Service quality
6. Emergency responder value
7. Responder reach
8. National capability
9. Milestone achievement
Scoring for quality of severe weather warnings:
0-2  Very Poor           Warning was missed or gave very poor guidance to the customer, perhaps being classed as a “false alarm”
3-4  Poor Guidance       Although a warning was issued, it gave poor guidance to the customer
5-7  Good Guidance       A warning was issued which gave generally good guidance to the customer
8-9  Excellent Guidance  The warning issued gave excellent guidance to the customer

Performance: reliability
[Plot: reliability of forecaster-issued Heavy Rainfall Alerts in 2012.]
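A reliability plot compares forecast probability with how often the event actually occurred in each probability bin; perfectly reliable forecasts lie on the diagonal. A minimal sketch of the binning:

    def reliability_curve(probabilities, outcomes, n_bins=5):
        """Mean forecast probability vs observed event frequency per bin."""
        bins = [[] for _ in range(n_bins)]
        for p, o in zip(probabilities, outcomes):
            index = min(int(p * n_bins), n_bins - 1)  # keep p == 1.0 in the top bin
            bins[index].append((p, o))
        curve = []
        for members in bins:
            if members:
                mean_p = sum(p for p, _ in members) / len(members)
                obs_freq = sum(o for _, o in members) / len(members)
                curve.append((mean_p, obs_freq))
        return curve  # perfect reliability: mean_p == obs_freq in every bin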

Met Office Business Performance Measures (BPMs)
The Met Office BPMs are: Forecast Accuracy, Growth, Reach, Customer & Service Delivery, Efficiency & Sustainability Excellence.
Verification underpins the Forecast Accuracy BPM, which is set by government. For FY13/14 this BPM was to improve:
1. Global NWP Index, with a stretch target
2. UK NWP Index, with a stretch target
3. Public Forecasts: 12/17 forecast targets met, with a stretch of 17/17
4. Customer Forecasts: 2/3 forecast targets met, with a stretch of 3/3
The Global NWP Index is compiled from: mean sea-level pressure, 500 hPa height, 850 hPa wind, 250 hPa wind.
[Plot: increase in the Global NWP Index, showing an improvement in Global model accuracy.]