Semi-Automating Forecasts for Canadian Airports in the Great Lakes Area
By George A. Isaac, with contributions from Monika Bailey, Faisal S. Boudala, Stewart G. Cober, Robert W. Crawford, Bjarne Hansen, Ivan Heckman, Laura X. Huang, Alister Ling, and Janti Reid
Cloud Physics and Severe Weather Research Section, Environment Canada
Great Lakes Operational Meteorology Workshop 2013 Webinar – May 14, 2013

Acknowledgements
Funds from Transport Canada, the Search and Rescue New Initiatives Fund, NAV CANADA, and Environment Canada.
Thanks also to operations and research colleagues at CMC/RPN, at CMAC-East (e.g. Stephen Kerr, Gilles Simard) and CMAC-West (e.g. Tim Guezen, Bruno Larochelle), and others within our Section (e.g. Bill Burrows).

Canadian Airport Nowcasting (CAN-Now)
Improve short-term forecasts (0-6 hour), or nowcasts, of severe weather at airports.
Develop a forecast system that combines routinely gathered information (radar, satellite, surface-based data, pilot reports), numerical weather prediction model output, and a limited suite of specialized sensors placed at the airport.
Issue forecast/nowcast products at 1-15 min resolution for most variables.
Test this system, and its associated information delivery system, within an operational airport environment (e.g. Toronto and Vancouver International Airports).

Isaac, G.A., Bailey, M., Boudala, F.S., Cober, S.G., Crawford, R.W., Donaldson, N., Gultepe, I., Hansen, B., Heckman, I., Huang, L.X., Ling, A., Mailhot, J., Milbrandt, J.A., Reid, J., and Fournier, M. (2012), The Canadian airport nowcasting system (CAN-Now). Accepted to Meteorological Applications.

Algorithm Development
Visibility/Fog … RVR
Ceiling
Blowing Snow
Turbulence
Winds/Gusts/Shear
Icing
Precipitation Type
Precipitation Intensity
Lightning/Convective Storms
Real-Time Verification

Main equipment at Pearson, located at the old Test and Evaluation site near the existing Met compound.

Pearson Instrument Site
21 instrument bases with power and data feeds, 10 m apart, with rows 15 m apart. Also on site: GTAA anemometer, NAV CANADA 78D anemometer, Meteorological Observation Building, and power distribution box.
1. Present Weather Sensor (Vaisala FD12P)
2. Spare
3. Camera
4. Present Weather Sensor (Parsivel)
5. 3D Ultrasonic Wind Sensor (removed)
6. Microwave Profiling Radiometer (Radiometrics)
7. Precipitation Occurrence Sensor (POSS)
8. Icing Detector (Rosemount)
9. Precipitation Gauge (Belfort) with Nipher Shield, ultrasonic snow depth
10. Hotplate (Yankee – removed)
11. Tipping Bucket Rain Gauge TB3
12. Precipitation Switch
13. Spinning arm, liquid/total water content probe (proposed)
14. 10 m Tower, 2D ultrasonic wind sensor
15. Ceilometer (Vaisala CT25K)
16. Vertically Pointing 3 cm Radar (McGill)
17. Hotplate Precipitation Meter (Yankee)
18. Temperature, humidity, pressure, solar radiation
19. Precipitation Gauge (Geonor) with Nipher Shield
20. Spare
21. 10 m Tower
(Spare = proposed or removed equipment)

CAN-Now Situation Chart

Thresholds as applied on the Situation Chart

Crosswinds:
Dry RWY (precipitation rate ≤ 0.2 mm/h and visibility ≥ 1 SM):
  x-wind (knots) < 15: GREEN
  15 ≤ x-wind (knots) < 20: YELLOW
  20 ≤ x-wind (knots) < 25: ORANGE
  x-wind (knots) ≥ 25: RED (NOT PERMITTED)
Wet RWY (precipitation rate > 0.2 mm/h or visibility < 1 SM):
  x-wind (knots) < 5: GREEN
  5 ≤ x-wind (knots) < 10: YELLOW
  10 ≤ x-wind (knots) < 15: ORANGE
  x-wind (knots) ≥ 15: RED (NOT PERMITTED)

Visibility:
  vis (SM) ≥ 6: GREEN (VFR)
  3 ≤ vis (SM) < 6: BLUE (MVFR)
  ½ ≤ vis (SM) < 3: YELLOW (IFR)
  ¼ ≤ vis (SM) < ½: ORANGE (BLO ALTERNATE)
  vis (SM) < ¼: RED (BLO LANDING)
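These colour assignments are straightforward threshold lookups. A minimal Python sketch of the crosswind and visibility rules above (function names and structure are illustrative, not taken from the CAN-Now software):

```python
def crosswind_colour(xwind_kt, precip_rate_mmh, vis_sm):
    """Map crosswind to a Situation Chart colour for a dry or wet runway."""
    dry = precip_rate_mmh <= 0.2 and vis_sm >= 1.0
    # Threshold sets differ by 10 kt between dry and wet runways.
    green, yellow, orange = (15, 20, 25) if dry else (5, 10, 15)
    if xwind_kt < green:
        return "GREEN"
    if xwind_kt < yellow:
        return "YELLOW"
    if xwind_kt < orange:
        return "ORANGE"
    return "RED"           # not permitted

def visibility_colour(vis_sm):
    """Map visibility (statute miles) to a Situation Chart colour."""
    if vis_sm >= 6:
        return "GREEN"     # VFR
    if vis_sm >= 3:
        return "BLUE"      # MVFR
    if vis_sm >= 0.5:
        return "YELLOW"    # IFR
    if vis_sm >= 0.25:
        return "ORANGE"    # below alternate limits
    return "RED"           # below landing limits

print(crosswind_colour(18, 0.0, 10))   # YELLOW (dry runway)
print(visibility_colour(0.75))         # YELLOW (IFR)
```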

Ceiling:
  ceiling (ft) ≥ 2500: GREEN (VFR)
  1000 ≤ ceiling (ft) < 2500: BLUE (MVFR)
  400 ≤ ceiling (ft) < 1000: YELLOW (IFR)
  150 ≤ ceiling (ft) < 400: ORANGE (BLO ALTERNATE)
  ceiling (ft) < 150: RED (BLO LANDING)

Shear & Turbulence:
  momentum flux FQ (Pa) < 0.75: GREEN (LGT)
  0.75 ≤ momentum flux FQ (Pa) < 1.5: YELLOW (MOD)
  momentum flux FQ (Pa) ≥ 1.5: RED (SEV)
  eddy dissipation rate EDR (m^(2/3)/s) < 0.3: GREEN (LGT)
  0.3 ≤ EDR (m^(2/3)/s) < 0.5: YELLOW (MOD)
  EDR (m^(2/3)/s) ≥ 0.5: RED (SEV)
Wind shear is flagged RED if the wind speed (relative to the surface wind direction) exceeds any of the following:
  level[2] (~125 m / 410 ft) − level[0] ≥ 25 kt
  level[4] (~325 m / 1060 ft) − level[0] ≥ 40 kt
  level[5] (~440 m / 1440 ft) − level[0] ≥ 50 kt

Precipitation:
  rate (mm/h) > 7.5: RED (HEAVY)
  2.5 < rate (mm/h) ≤ 7.5: ORANGE (MODERATE)
  0.2 < rate (mm/h) ≤ 2.5: YELLOW (LIGHT)
  0 < rate (mm/h) ≤ 0.2: GREEN (TRACE)
  rate (mm/h) = 0: GREEN (NO PRECIP)

TSTM & LTNG:
  Lightning distance ≤ 6 SM: RED (TS)
  Lightning distance ≤ 10 SM: ORANGE (VCTS)
  Lightning distance ≤ 30 SM: YELLOW (LTNG DIST)
  Lightning within area (> 30 SM): YELLOW
  Lightning forecast map received: GREEN (NO LTNG FCST)

Icing:
  TWC < 0.1 g/m³ or TT ≥ 0°C: GREEN
  TWC ≥ 0.1 g/m³ where TT < 0°C: YELLOW (POTENTIAL ICING)

CAT-level:
  RVR (ft) < 600: RED (NOT PERMITTED)
  600 ≤ RVR (ft) < 1200 -or- ceiling (ft) < 100: RED (CAT IIIa)
  1200 ≤ RVR (ft) < 2600 -or- 100 ≤ ceiling (ft) < 200: ORANGE (CAT II)
  2600 ft ≤ RVR < 3 SM -or- 200 ≤ ceiling (ft) < 1000: YELLOW (CAT I)
  3 ≤ RVR (SM) < 6 -or- 1000 ≤ ceiling (ft) < 2500: BLUE (MVFR)
  RVR (SM) ≥ 6 -and- ceiling (ft) ≥ 2500: GREEN (VFR)

RWY Condition:
  precipitation rate (mm/h) > 0.2: ORANGE (possible WET runway)
  precipitation rate (mm/h) ≤ 0.2: YELLOW (possible DRY runway)

Wx Only AAR:
  Cell colour is based on meteorological conditions – same as CAT-level.
  The meteorologically-limited theoretical maximum AAR is determined from a look-up table of documented AAR values based on runway configuration and meteorological conditions (CAT-level). Runway configuration is determined solely from crosswind thresholds for maximum potential capacity.
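The weather-only AAR step is essentially a table lookup keyed on runway configuration and CAT-level. A toy sketch of that idea follows; the configuration names and AAR numbers are placeholders for illustration, not the documented GTAA values referenced on the slide:

```python
# Illustrative only: these arrival rates and configuration names are
# placeholders, not the real documented AAR values for Pearson.
AAR_TABLE = {
    # (runway_configuration, cat_level): arrivals per hour
    ("dual_parallel", "VFR"): 80,
    ("dual_parallel", "CAT I"): 64,
    ("dual_parallel", "CAT II"): 52,
    ("dual_parallel", "CAT IIIa"): 40,
    ("single_runway", "VFR"): 40,
    ("single_runway", "CAT I"): 32,
    ("single_runway", "CAT II"): 26,
    ("single_runway", "CAT IIIa"): 20,
}

def wx_only_aar(runway_configuration, cat_level):
    """Meteorologically-limited theoretical maximum AAR, or None if the
    combination is not permitted or not in the table."""
    return AAR_TABLE.get((runway_configuration, cat_level))

print(wx_only_aar("dual_parallel", "CAT II"))   # 52 (placeholder value)
```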

Thanks to Bill Burrows

Web Site
A Web site has been created at: ontario.int.ec.gc.ca/cannow/cyyz/wx/index_e.php?airport=1
The site is accessible externally only with a user name and password. The site is currently active in a research mode to obtain feedback.

Conditions Change Rapidly

Mean absolute error for continuous variables at CYYZ. CLI refers to the error if a climate average were used as the predictor.

Mean absolute error in wind direction at CYYZ, calculated first with all the data and then with wind speeds less than 5 knots removed.
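One subtlety in this statistic is that wind-direction errors must be taken the short way around the compass, and cases with very light winds (where direction is poorly defined) are screened out. A small sketch under those assumptions (not the actual CAN-Now verification code):

```python
def wind_direction_mae(fcst_deg, obs_deg, obs_speed_kt, min_speed_kt=0.0):
    """Mean absolute wind-direction error using the shorter way around the
    circle, optionally ignoring cases with observed speed below min_speed_kt."""
    errs = []
    for f, o, spd in zip(fcst_deg, obs_deg, obs_speed_kt):
        if spd < min_speed_kt:
            continue                      # direction ill-defined in light winds
        diff = abs(f - o) % 360.0
        errs.append(min(diff, 360.0 - diff))
    return sum(errs) / len(errs) if errs else float("nan")

fcst = [350, 180, 90]
obs  = [10, 170, 100]
spd  = [12, 3, 8]
print(wind_direction_mae(fcst, obs, spd))                  # all data
print(wind_direction_mae(fcst, obs, spd, min_speed_kt=5))  # light winds removed
```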

[Figure: forecast skill versus lead time for nowcasting, NWP models, and the theoretical limit, from Golding (1998), Meteorol. Appl., 5, 1-16.]
The main idea behind nowcasting is that extrapolation of observations, by simple or sophisticated means, shows better skill than numerical forecast models in the short term. For precipitation, nowcasting techniques are usually better for 6 hours or more.

Nowcasting Techniques Which Combine Model(s) and Observations
Adaptive Blending of Observations and Models (ABOM): the forecast at lead time p is built from the current observation plus the change predicted by the model, or the change predicted by the observation trend.
INTW combines predictions from several NWP models by weighting them based on past performance (previous 6 hours) and applying a bias correction using the most recent observation. (SNOW-V10 used GEM at 1, 2.5 and 15 km.)
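A minimal sketch of the INTW idea as described above: each model is bias-corrected so that it starts from the latest observation, and the models are then weighted by their recent performance. The inverse-error weighting used here is an assumption for illustration, not necessarily the exact SNOW-V10/CAN-Now formulation:

```python
def intw_nowcast(model_forecasts, recent_errors, latest_obs, latest_model_values):
    """Blend several NWP model forecasts for one variable at one lead time.

    model_forecasts     -- {model: forecast value at the lead time of interest}
    recent_errors       -- {model: mean absolute error over the last ~6 h}
    latest_obs          -- most recent observation at the site
    latest_model_values -- {model: model value valid at the observation time}
    """
    # Bias-correct each model: apply the model-predicted change to the
    # current observation.
    corrected = {m: latest_obs + (model_forecasts[m] - latest_model_values[m])
                 for m in model_forecasts}
    # Weight models by the inverse of their recent error (better recent
    # performance -> larger weight); the exact weighting law is assumed here.
    weights = {m: 1.0 / max(recent_errors[m], 1e-6) for m in model_forecasts}
    total = sum(weights.values())
    return sum(weights[m] * corrected[m] for m in corrected) / total

# Example: 2 m temperature nowcast blended from three GEM configurations
forecast = intw_nowcast(
    model_forecasts={"GEM-1km": -3.5, "GEM-2.5km": -2.8, "GEM-15km": -4.0},
    recent_errors={"GEM-1km": 0.8, "GEM-2.5km": 1.2, "GEM-15km": 2.0},
    latest_obs=-2.0,
    latest_model_values={"GEM-1km": -2.6, "GEM-2.5km": -2.2, "GEM-15km": -3.1},
)
print(round(forecast, 2))
```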

Related Papers
Isaac, G.A., P. Joe, J. Mailhot, M. Bailey, S. Bélair, F.S. Boudala, M. Brugman, E. Campos, R.L. Carpenter Jr., R.W. Crawford, S.G. Cober, B. Denis, C. Doyle, H.D. Reeves, I. Gultepe, T. Haiden, I. Heckman, L.X. Huang, J.A. Milbrandt, R. Mo, R.M. Rasmussen, T. Smith, R.E. Stewart, D. Wang and L.J. Wilson, 2012b: Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10): A World Weather Research Programme project. Pure and Applied Geophysics.
Bailey, M.E., G.A. Isaac, I. Gultepe, I. Heckman and J. Reid, 2012: Adaptive Blending of Model and Observations for Automated Short Range Forecasting: Examples from the Vancouver 2010 Olympic and Paralympic Winter Games. Pure and Applied Geophysics.
Huang, L.X., G.A. Isaac, and G. Sheng, 2012: Integrating NWP Forecasts and Observation Data to Improve Nowcasting Accuracy. Weather and Forecasting, 27.
Huang, L.X., G.A. Isaac, and G. Sheng, 2012: A New Integrated Weighted Model in SNOW-V10: Verification of Continuous Variables. Pure and Applied Geophysics.
Huang, L.X., G.A. Isaac, and G. Sheng, 2012: A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables. Pure and Applied Geophysics.

NWP Model with Minimum MAE in CAN-Now for Winter Dec 1/09 – Mar 31/10 and Summer June 1/10 to Aug 31/10 Periods Based on First 6 Hours of Forecast

Winter period – Dec. 1, 2009 to Mar. 31, 2010
Summer period – June 1 to August 31, 2010

Time (h) for Model to Beat Persistence
[Table: for the winter and summer periods, the forecast hour at which each model (LAM, REG, RUC, INTW) beats persistence for TEMP, RH, WS and GUST at CYYZ and CYVR; "no" indicates the model does not beat persistence.]
Huang, L.X., G.A. Isaac and G. Sheng, 2012: Integrating NWP Forecasts and Observation Data to Improve Nowcasting Accuracy, Weather and Forecasting, 27.
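The "time to beat persistence" entries can be read as the first lead time at which a model's mean absolute error drops below that of persistence. A rough sketch of that calculation with hypothetical MAE curves (the real comparison is done at finer time resolution):

```python
def hours_to_beat_persistence(model_mae_by_hour, persistence_mae_by_hour):
    """Return the first forecast hour at which the model MAE is lower than the
    persistence MAE, or None ("no") if it never is within the range given.

    Both arguments are lists indexed by lead time in hours (index 0 = 0 h).
    """
    for hour, (m, p) in enumerate(zip(model_mae_by_hour, persistence_mae_by_hour)):
        if m < p:
            return hour
    return None

# Hypothetical MAE curves (degrees C) out to 6 h for one model and persistence
model_mae   = [1.6, 1.5, 1.4, 1.3, 1.3, 1.2, 1.2]
persist_mae = [0.0, 0.9, 1.2, 1.4, 1.6, 1.8, 2.0]
print(hours_to_beat_persistence(model_mae, persist_mae))   # 3
```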

Mean absolute error (MAE) in temperature and RH at CYYZ for the winter of 2009/10, as a function of forecast lead time averaged over the whole season. ABOM REG and ABOM LAM temperature and relative humidity forecasts are compared to the raw model output and persistence.

Categories Being Used in CAN-Now Analysis

Heidke Skill Score: Multiple Categories
Using a K × K contingency table in which N(F_i) is the total number of forecasts in category i, N(O_j) is the total number of observations in category j, N(F_i, O_i) is the number of correct forecasts in category i, and N is the total number of forecast–observation pairs, calculate:

HSS = [ (1/N) Σ_i N(F_i, O_i) − (1/N²) Σ_i N(F_i) N(O_i) ] / [ 1 − (1/N²) Σ_i N(F_i) N(O_i) ]
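A short sketch computing the multi-category HSS directly from such a contingency table (array layout assumed: rows are forecast categories, columns are observed categories):

```python
def heidke_skill_score(table):
    """Multi-category Heidke Skill Score.

    table[i][j] = number of cases forecast in category i and observed in
    category j, for i, j = 0..K-1.
    """
    K = len(table)
    N = sum(sum(row) for row in table)
    hits = sum(table[i][i] for i in range(K)) / N                    # correct forecasts
    n_f = [sum(table[i][j] for j in range(K)) for i in range(K)]     # N(F_i)
    n_o = [sum(table[i][j] for i in range(K)) for j in range(K)]     # N(O_j)
    expected = sum(n_f[i] * n_o[i] for i in range(K)) / (N * N)      # chance agreement
    return (hits - expected) / (1.0 - expected)

# 3-category example
table = [[30,  5,  2],
         [ 6, 20,  4],
         [ 1,  3, 10]]
print(round(heidke_skill_score(table), 3))
```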

The HSS and ACC scores for the relaxed set of criteria.

Summary
Progress is being made in forecasting aviation-related variables using numerical model output and nowcast schemes. We already have a system which uses climatology (WIND III).
RH predictions are poor, barely beating climatology (this impacts visibility forecasts).
Visibility forecasts are poor from a statistical point of view (they also require snow and rain rates).
Cloud base forecasts, although showing some skill, could be improved with better model resolution in the boundary layer.
Wind direction is either poorly forecast or poorly measured.
There are many difficulties in measuring parameters, especially precipitation amount and type.
Overall statistical scores do not tell the complete story; emphasis is needed on high impact events.
Selection of the model point that best represents the site is a critical process.

Summary (continued)
Weather changes rapidly, especially in complex terrain, and it is necessary to get good measurements at high time resolution; CAN-Now and SNOW-V10 attempted to get measurements at 1 min resolution where possible.
Because of the rapidly changing nature of the weather, forecasts must also be issued at high time resolution.
Verification of mesoscale forecasts and nowcasts must be done with data of appropriate time and space resolution; data collected on an hourly basis are not sufficient.
Nowcast schemes which blend NWP models and observations at a site outperform individual NWP models and persistence after 1-2 hours.

Summary (continued)
We are currently using these products to develop a First Guess TAF (FGT).
The FGT system is being tested at the Aviation Weather Centres (CMAC-East and CMAC-West) and is showing considerable promise, especially for VFR conditions.
A recent IRP (last week) suggested many things that need addressing, including verification of the FGT and comparison with what forecasters are now producing.
The algorithms definitely need some improvement (e.g. low cloud is often predicted in the Arctic under cold, clear-sky conditions, and there are issues with precipitation type).

Questions?