VERIFICATION OF NDFD GRIDDED FORECASTS USING ADAS
John Horel 1, David Myrick 1, Bradley Colman 2, Mark Jackson 3
1 NOAA Cooperative Institute for Regional Prediction


VERIFICATION OF NDFD GRIDDED FORECASTS USING ADAS
John Horel 1, David Myrick 1, Bradley Colman 2, Mark Jackson 3
1 NOAA Cooperative Institute for Regional Prediction
2 National Weather Service, Seattle
3 National Weather Service, Salt Lake City

Objective: Verify winter season NDFD gridded forecasts of temperature, dew point temperature, and wind speed over the western United States

Validation of NDFD Forecast Grids

- Developing an effective gridded verification scheme is critical to identifying the capabilities and deficiencies of the IFPS forecast process (SOO White Paper 2003)
- National efforts led by MDL to verify NDFD forecasts are underway
- Objective: Evaluate and improve techniques required to verify NDFD grids
- Method: Compare NDFD forecasts to analyses created at the Cooperative Institute for Regional Prediction (CIRP) at the University of Utah using the Advanced Regional Prediction System Data Assimilation System (ADAS)
- Period examined: 00Z NDFD forecasts from 12 November 2003 – 29 February 2004; verifying analyses from 17 November 2003 – March 2004
- Many complementary validation strategies:
  - Forecasts available from NDFD for a particular grid box are intended to be representative of conditions throughout that 5 x 5 km2 area
  - Interpolate gridded forecasts to observing sites
  - Compare gridded forecasts to gridded analyses based upon observations
  - Verify gridded forecasts only where confidence in the analysis is high

MesoWest and ROMAN

- MesoWest: Cooperative sharing of current weather information around the nation
  - Real-time and retrospective access to weather information through a state-of-the-art database (edu/mesowest)
- ROMAN: Real-Time Observation Monitor and Analysis Network
  - Provides real-time weather data around the nation to meteorologists and land managers for fire weather applications

2003 Fire Locations (Red); ROMAN Stations (Grey)
Fire locations provided by the Remote Sensing Applications Center from MODIS imagery

Documentation

- MesoWest: Horel et al. (2002), Bull. Amer. Meteor. Soc., February 2002
- ROMAN:
  - Horel et al. (2004), submitted to International Journal of Wildland Fire, January 2004
  - Horel et al. (2004), IIPS Conference
- ADAS:
  - Myrick and Horel (2004), submitted to Wea. Forecasting
  - Lazarus et al. (2002), Wea. Forecasting

Are All Observations Equally Bad?

- All measurements have errors (random and systematic)
- Errors arise from many factors:
  - Siting (obstacles, surface characteristics)
  - Exposure to environmental conditions (e.g., temperature sensor heating/cooling by radiation, conduction, or reflection)
  - Sampling strategies
  - Maintenance standards
  - Metadata errors (incorrect location, elevation)

[Photo: station SNZ]

Are All Observations Equally Good?

- Why was the sensor installed?
  - Observing needs and sampling strategies vary (air quality, fire weather, road weather)
  - Station siting results from pragmatic tradeoffs: power, communication, obstacles, access
- Use common sense
  - Wind at a sensor in the base of a mountain pass will likely blow from only two directions
  - Errors depend upon conditions (e.g., temperature spikes are common with calm winds)
- Use available metadata
  - Topography
  - Land use, soil, and vegetation type
  - Photos
- Monitor quality control information
  - Basic consistency checks
  - Comparison to other stations

[Photo: station UT9]
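The basic consistency checks above can be sketched as a range check plus a "buddy" comparison against nearby stations. This is an illustrative sketch, not the operational MesoWest/ADAS quality control; the thresholds and the `range_check`/`buddy_check` names are assumptions.

```python
def range_check(value, lo=-60.0, hi=60.0):
    """Flag a temperature (deg C) that falls outside a plausible physical range."""
    return lo <= value <= hi

def buddy_check(value, neighbor_values, max_diff=10.0):
    """Flag an observation that departs too far from the median of nearby stations."""
    if not neighbor_values:
        return True  # no neighbors available: cannot reject
    srt = sorted(neighbor_values)
    n = len(srt)
    median = srt[n // 2] if n % 2 else 0.5 * (srt[n // 2 - 1] + srt[n // 2])
    return abs(value - median) <= max_diff

# A 25 degC reading is physically plausible, but fails the buddy check
# when surrounding stations report near 0 degC
print(range_check(25.0))                    # True
print(buddy_check(25.0, [0.5, -1.0, 1.2]))  # False
```

A real scheme would also vary `max_diff` with conditions (e.g., relaxing it in complex terrain, where large station-to-station contrasts are legitimate).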

ADAS: ARPS Data Assimilation System

- ADAS is run in near-real time to create analyses of temperature, relative humidity, and wind over the western U.S. (Lazarus et al. 2002, WAF)
- Analyses on the NWS GFE grid at 2.5, 5, and 10 km spacing in the West
- Test runs made for the lower-48-state NDFD grid at 5 km spacing
- Typically > 2000 surface temperature and wind observations available via MesoWest for analysis (5500 for the lower 48)
- The 20 km Rapid Update Cycle (RUC; Benjamin et al. 2002) is used for the background field
- Background and terrain fields help to build spatial and temporal consistency in the surface fields
- Efficiency of the ADAS code has been improved significantly
- Anisotropic weighting for terrain and coasts added (Myrick et al. 2004)
- Current ADAS analyses are a compromise solution; they suffer from fundamental problems due to the nature of the optimum interpolation approach
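The core idea of an observation-corrected background can be illustrated with a single-pass objective analysis: start from the background (here, RUC) value and add a distance-weighted sum of observation increments. This is a simplified sketch, not the actual multi-pass ADAS scheme; the Gaussian weight, the 50 km length scale, and the function name are assumptions for illustration.

```python
import math

def analyze_point(x, y, background, obs, length_scale=50.0):
    """One-pass objective analysis at (x, y): background plus a
    distance-weighted average of observation increments, where each
    increment is (observed value - background value at the obs site)."""
    num = 0.0
    den = 1.0  # implicit unit weight on the background keeps the analysis bounded
    for (xo, yo, value, bg_at_obs) in obs:
        r2 = (x - xo) ** 2 + (y - yo) ** 2
        w = math.exp(-r2 / (2.0 * length_scale ** 2))
        num += w * (value - bg_at_obs)
        den += w
    return background + num / den

# A lone warm observation (8 degC, background 5 degC) pulls a nearby
# analysis point partway toward it; a distant point stays at the background
print(analyze_point(0.0, 0.0, 5.0, [(10.0, 0.0, 8.0, 5.0)]))
print(analyze_point(1000.0, 0.0, 5.0, [(10.0, 0.0, 8.0, 5.0)]))
```

The sensitivity to `length_scale` in this toy version mirrors the decorrelation length-scale sensitivity listed under the ADAS limitations.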

ADAS Limitations

- Analysis depends strongly upon the background field
- Hour-to-hour consistency comes only through the background field
- Analysis is sensitive to the choice of background error decorrelation length scale
- Wind field not adjusted to local terrain
- Anisotropic weighting only partially implemented
- Manual effort required to maintain the station blacklist
- Difficult to assess the quality of the analysis independently: the analysis can be constrained to match observations, which typically leads to spurious analyses in data-sparse regions

How “Good” Are the Analysis Grids? Relative to MesoWest Observations in the West

Temperature (°C): 17 Nov. 2003 – Mar. 2004

        RUC-0Z   RUC-12Z   ADAS-0Z   ADAS-12Z
Bias
MAE
RMS

[Table values not preserved in the transcript]

How “Good” Are the Analysis Grids? Relative to MesoWest Observations in the West

Wind Speed (m/s): 17 Nov. 2003 – Mar. 2004

        RUC-0Z   RUC-12Z   ADAS-0Z   ADAS-12Z
Bias
MAE
RMS

[Table values not preserved in the transcript]
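The Bias, MAE, and RMS rows in the two tables above are the standard verification statistics over matched forecast-observation pairs; a minimal sketch of their definitions:

```python
import math

def verification_stats(forecasts, observations):
    """Return (bias, MAE, RMS) of forecast-minus-observation errors:
    bias = mean error, MAE = mean absolute error, RMS = root-mean-square error."""
    errors = [f - o for f, o in zip(forecasts, observations)]
    n = len(errors)
    bias = sum(errors) / n
    mae = sum(abs(e) for e in errors) / n
    rms = math.sqrt(sum(e * e for e in errors) / n)
    return bias, mae, rms

# Toy example: errors of +2, -1, +3 degC
bias, mae, rms = verification_stats([2.0, -1.0, 3.0], [0.0, 0.0, 0.0])
print(bias, mae, rms)  # 1.333..., 2.0, ~2.16
```

Note how a compensating cold error shrinks the bias but not the MAE or RMS, which is why all three rows appear in the tables.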

Arctic Outbreak: November 2003
[Figures: NDFD 48 h forecast vs. ADAS analysis]

Upper-Level Ridging and Surface Cold Pools: 13 January 2004
[Figures: NDFD 48 h forecast vs. ADAS analysis]

Validation of NDFD Forecasts at “Points”

- NDFD forecasts are intended to be representative of a 5 x 5 km2 grid box
- Compare NDFD forecasts at the gridpoint adjacent (lower/left) to observations: inconsistent, but avoids errors in complex terrain introduced by additional bilinear interpolation to the observation location
- Compare NDFD forecasts to ADAS and RUC verification grids at the same sample of gridpoints: no interpolation
- All observation points have equal weight
- Since observations are distributed unequally, not all regions receive equal weight
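The "gridpoint adjacent (lower/left)" matching above amounts to truncating the observation coordinates to grid indices rather than interpolating. A sketch, assuming a regular grid with uniform spacing (the function name and grid layout are illustrative):

```python
def lower_left_value(grid, x0, y0, dx, x_obs, y_obs):
    """Return the grid value at the gridpoint lower/left of an observation.
    Truncating the indices avoids bilinear interpolation, which can blend
    gridpoints at very different elevations in complex terrain."""
    i = int((x_obs - x0) // dx)  # column index of the lower/left gridpoint
    j = int((y_obs - y0) // dx)  # row index (uniform spacing assumed)
    return grid[j][i]

# 2x2 grid of temperatures (K) with 5 km spacing; an observation at
# (7.3 km, 2.1 km) maps to the gridpoint at column 1, row 0
grid = [[270.0, 271.5],
        [269.0, 272.0]]
print(lower_left_value(grid, 0.0, 0.0, 5.0, 7.3, 2.1))  # 271.5
```

Comparing NDFD, ADAS, and RUC at this same fixed set of gridpoints removes interpolation as a source of disagreement between the verification methods.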

Verification at ~2500 Obs. Locations in the West

- Verification of NDFD relative to observations or ADAS is similar
- RUC is too warm at 12Z, which leads to large bias and RMS

Verification at ~2000 Obs. Locations

- Smaller RMS relative to ADAS since NDFD is evaluated at the same grid points
- NDFD winds are too strong, as are RUC winds

Where Do We Have Greater Confidence in the ADAS Analysis?

- White regions: no observations close enough to adjust the RUC background
- Varies diurnally, from day to day, and between variables
- ADAS confidence regions defined where the total weight > 0.25
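The confidence regions above can be approximated by summing, at each gridpoint, the weights contributed by nearby observations and masking where the total falls below 0.25. The 0.25 threshold follows the slide; the Gaussian weight, 50 km length scale, and function name are illustrative assumptions.

```python
import math

def confidence_mask(grid_points, obs_points, length_scale=50.0, threshold=0.25):
    """True where the summed observation weight exceeds the threshold,
    i.e., where at least one observation meaningfully adjusts the background."""
    mask = []
    for (xg, yg) in grid_points:
        total = sum(
            math.exp(-((xg - xo) ** 2 + (yg - yo) ** 2) / (2.0 * length_scale ** 2))
            for (xo, yo) in obs_points
        )
        mask.append(total > threshold)
    return mask

# A gridpoint 10 km from the lone observation is "confident";
# one 500 km away is not (it would appear white on the map)
print(confidence_mask([(0.0, 0.0), (500.0, 0.0)], [(10.0, 0.0)]))  # [True, False]
```

Because the observation network changes diurnally and the weighting differs by variable, the resulting mask varies with time and parameter, as the slide notes.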

Gridded Validation of NDFD Forecasts

- RUC downscaled to the NDFD grid using NDFD terrain
- ADAS analysis performed on the NDFD grid
- Statistics based upon areas with sufficient observations to have “confidence” in the analysis, denoted “ADAS_C”

Average 00Z Temperature: DJF
[Figure: NDFD 48 h forecast]

48 h Forecast Temperature Bias (NDFD – Analysis): DJF
[Figures: NDFD–RUC and NDFD–ADAS]

48 h Forecast Temperature RMS Difference (NDFD – Analysis): 00Z, 18 Nov. – 23 Dec. 2003
[Figures: RUC and ADAS]

Average 00Z Dewpoint and Wind Speed: DJF
[Figures: Dewpoint and Wind Speed]

48 h Forecast RMS Difference (NDFD – Analysis): DJF
[Figures: Dewpoint and Wind Speed]

Bias and RMS for Temperature as a Function of Forecast Length: DJF

- No difference when verification is limited to areas with higher confidence in the ADAS analysis

Bias and RMS for Dewpoint Temperature as a Function of Forecast Length: DJF

- Lower confidence in the analysis of dewpoint temperature

Bias and RMS for Wind Speed as a Function of Forecast Length: DJF

- NDFD has a higher speed bias in regions with observations

Arctic Outbreak: November 2003
[Figures: NDFD 48 h forecast vs. ADAS analysis; NDFD and ADAS DJF seasonal means removed]

Surface Cold Pool Event: 13 January 2004
[Figures: NDFD 48 h forecast vs. ADAS analysis; NDFD and ADAS DJF seasonal means removed]

[Figure: solid = ADAS, dashed = ADAS_C]

[Figure: solid = ADAS, dashed = ADAS_C]

DJF Anomaly Pattern Correlations
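The anomaly pattern correlations shown here are computed after removing the DJF seasonal-mean field from both the forecast and the verifying analysis. A minimal sketch of the centered pattern correlation between two anomaly fields (flattened to 1-D lists of gridpoint values for simplicity):

```python
import math

def anomaly_correlation(forecast, analysis, climatology):
    """Centered pattern correlation of forecast and analysis anomalies,
    where each anomaly is the field minus the seasonal-mean field."""
    fa = [f - c for f, c in zip(forecast, climatology)]
    aa = [a - c for a, c in zip(analysis, climatology)]
    fm = sum(fa) / len(fa)
    am = sum(aa) / len(aa)
    fa = [v - fm for v in fa]  # remove the domain-mean anomaly
    aa = [v - am for v in aa]
    cov = sum(x * y for x, y in zip(fa, aa))
    norm = math.sqrt(sum(x * x for x in fa) * sum(y * y for y in aa))
    return cov / norm

# Identical anomaly patterns correlate perfectly
print(anomaly_correlation([1.0, 3.0, 2.0], [1.0, 3.0, 2.0], [0.0, 0.0, 0.0]))  # 1.0
```

Removing the seasonal mean is what lets this score measure skill at the synoptic and mesoscale variations, rather than rewarding a forecast for reproducing climatology.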

Summary

- At the present time, verification of NDFD forecasts is relatively insensitive to methodology: the errors of the NDFD forecasts are much larger than the uncertainty in the verification data sets
  - Differences between analyses (e.g., RUC vs. ADAS), and between analyses and observations, are much smaller than differences between NDFD forecast grids and analyses or between NDFD forecast grids and observations
  - The difference between the ADAS temperature analysis on the 5 km grid and station observations is of order 1.5–2 °C
  - The difference between the NDFD temperature forecast and the ADAS temperature analysis is of order 3–6 °C
- Systematic NDFD forecast errors are evident that may be correctable at WFOs and through improved coordination between WFOs
- Skill of the NDFD forecast grids, when the seasonal average is removed to focus on synoptic and mesoscale variation, depends strongly on the parameter and the synoptic situation:
  - Anomaly pattern correlations between NDFD and ADAS temperature grids over the western United States suggest forecasts are most skillful out to 72 h
  - Dew point temperature skill is evident out to 48 h, and wind speed out to 36 h
  - Little difference in NDFD skill when evaluated over areas where analysis confidence is higher
  - Some strongly forced synoptic situations are well forecast over the West as a whole
  - Persistence forecasts were hard to beat during cold pool events
- Specific issues for NDFD validation in complex terrain:
  - Scales of physical processes
  - Analysis methodology
  - Validation techniques

Issues for NDFD Validation in Complex Terrain

- Physical processes:
  - Horizontal spatial scales of severe weather phenomena in complex terrain are often local and not sampled by the 5 km NDFD grid
  - Vertical decoupling of the surface wind from the ambient flow at night is difficult to forecast. Which is better guidance: matching locally light surface winds, or focusing on the synoptic-scale forcing?

Issues for NDFD Validation in Complex Terrain

- Analysis methodology:
  - The analysis of record will require continuous assimilation of surface observations as well as other data resources (radar, satellite, etc.)
  - Considerable effort is required to quality control observations (surface station siting issues, radar terrain clutter problems, etc.)
  - Quality control of precipitation data is particularly difficult
  - The NWP model used to drive the assimilation must resolve terrain without smoothing at the highest possible resolution (2.5 km)
  - NCEP is proposing to provide an analysis of record for such applications

Issues for NDFD Validation in Complex Terrain

- Validation technique:
  - Upscaling of WFO grids to the NDFD grid introduces sampling errors in complex terrain
  - Which fields are verified?
    - Max/min temperature vs. hourly temperature?
    - Max/min spikes
    - Fitting of a sinusoidal curve to max/min temperature to generate hourly temperature grids
    - Instantaneous or time-averaged temperature observations vs. max/min
  - Objectively identify regions where forecaster skill is limited by sparse data

Ongoing and Future Work

- Submit paper on the ADAS evaluation of NDFD grids
- Make available simplified ADAS code suitable for use at WFOs in GFE
- Develop a variational constraint that adjusts winds to local terrain
- Improve anisotropic weighting
- Collaborate with MDL and NCEP on applications of MesoWest observations and ADAS
- Meeting on action plan for the analysis of record: June 29-30