VERIFICATION OF NDFD GRIDDED FORECASTS IN THE WESTERN UNITED STATES John Horel 1, David Myrick 1, Bradley Colman 2, Mark Jackson 3 1 NOAA Cooperative Institute.


VERIFICATION OF NDFD GRIDDED FORECASTS IN THE WESTERN UNITED STATES
John Horel 1, David Myrick 1, Bradley Colman 2, Mark Jackson 3
1 NOAA Cooperative Institute for Regional Prediction, 2 National Weather Service, Seattle, 3 National Weather Service, Salt Lake City
Objective: Verify a six-week sample of NDFD gridded forecasts of temperature, dew point temperature, and wind speed over the western United States (12 November – 24 December 2003)

IFPS and NDFD
- The NWS has undergone a major change in the procedures used to generate and distribute forecasts
- The Interactive Forecast Preparation System (IFPS; Ruth 2002) is used to create experimental high-resolution gridded forecasts of many weather elements
- Forecast grids at resolutions of 1.25, 2.5, or 5 km are produced at each NWS Weather Forecast Office (WFO) and cover its County Warning Area (CWA)
- CWA grids are combined into the National Digital Forecast Database (NDFD; Glahn and Ruth 2003) at 5-km resolution
- NDFD elements include: temperature, dew point, wind speed, sky cover, maximum and minimum temperature, probability of precipitation, and weather
- Available at temporal intervals as fine as hourly, with lead times up to 7 days
- Products can be:
  - viewed graphically
  - downloaded by customers and partners
  - linked to formatting software to produce traditional NWS text products
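Combining finer WFO grids into the 5-km NDFD mosaic implies an upscaling step. A minimal sketch of block-averaging a 2.5-km field onto a grid with twice the spacing follows; the function name and simple 2x2 averaging are illustrative assumptions, not the operational NDFD mosaicking method:

```python
import numpy as np

def upscale_2x(grid):
    """Block-average a fine grid (e.g. 2.5 km) onto a grid with twice the
    spacing (e.g. 5 km) by averaging each 2x2 block; dimensions must be even."""
    ny, nx = grid.shape
    return grid.reshape(ny // 2, 2, nx // 2, 2).mean(axis=(1, 3))

# Example: a 4x4 field reduced to 2x2
coarse = upscale_2x(np.arange(16, dtype=float).reshape(4, 4))
```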

Validation of NDFD Forecast Grids
- Developing an effective gridded verification scheme is critical to identifying the capabilities and deficiencies of the IFPS forecast process (SOO White Paper 2003)
- National efforts led by MDL to verify NDFD forecasts are underway
- Forecasts available from NDFD for a particular grid box are intended to be representative of conditions throughout that area (a 5 × 5 km region)
- Many complementary validation strategies:
  - Interpolate gridded forecasts to observing sites
  - Compare gridded forecasts to a gridded analysis based upon observations
- Objective of this preliminary study:
  - Compare NDFD forecasts to analyses created at the Cooperative Institute for Regional Prediction (CIRP) at the University of Utah using the Advanced Regional Prediction System Data Assimilation System (ADAS)
  - Period examined: 12 November – 24 December 2003
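The first validation strategy above, interpolating gridded forecasts to observing sites, can be sketched with simple bilinear interpolation. This is a minimal illustration; the function name and the use of grid-relative station coordinates are assumptions, not the method actually used in the study:

```python
import numpy as np

def interpolate_to_station(grid, x, y):
    """Bilinearly interpolate a 2-D forecast grid to a station at fractional
    grid coordinates (x across columns, y across rows)."""
    i0, j0 = int(np.floor(y)), int(np.floor(x))
    dy, dx = y - i0, x - j0
    return ((1 - dy) * (1 - dx) * grid[i0, j0]
            + (1 - dy) * dx * grid[i0, j0 + 1]
            + dy * (1 - dx) * grid[i0 + 1, j0]
            + dy * dx * grid[i0 + 1, j0 + 1])

# Example: a 4x4 field increasing by 1 per grid step in x; station midway
# between columns 1 and 2 on row 2
grid = np.arange(16, dtype=float).reshape(4, 4)
val = interpolate_to_station(grid, 1.5, 2.0)
```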

ADAS: ARPS Data Assimilation System
- ADAS is run in near-real time to create analyses of temperature, relative humidity, and wind over the western U.S. (Lazarus et al. WAF)
- Analyses are produced on the NWS GFE grid at 2.5, 5, and 10 km spacing
- Typically > 2000 surface temperature and wind observations are available via MesoWest for each analysis
- The 20-km Rapid Update Cycle (RUC; Benjamin et al. 2002) is used for the background field
- Background and terrain fields help to build spatial and temporal consistency in the surface fields
- Current ADAS analyses are a compromise solution; they suffer from fundamental problems due to the nature of the optimum interpolation approach
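The idea of correcting a model background field toward surface observations can be illustrated with a generic Cressman-style successive-correction pass. This is a sketch of the general technique only, not the actual ADAS/Bratseth scheme; all names and the weight function are illustrative assumptions:

```python
import numpy as np

def cressman_pass(background, xg, yg, obs_x, obs_y, obs_val, radius):
    """One Cressman-style successive-correction pass: nudge each grid point
    toward nearby observation increments (obs minus background), with
    distance-dependent weights that vanish beyond the influence radius."""
    num = np.zeros_like(background)
    den = np.zeros_like(background)
    for ox, oy, ov in zip(obs_x, obs_y, obs_val):
        i, j = int(round(oy)), int(round(ox))   # nearest grid point to the obs
        innov = ov - background[i, j]           # observation increment
        d2 = (xg - ox) ** 2 + (yg - oy) ** 2
        w = np.maximum((radius**2 - d2) / (radius**2 + d2), 0.0)
        num += w * innov
        den += w
    return background + np.where(den > 0, num / np.maximum(den, 1e-12), 0.0)

# Example: constant 10 C background, a single 12 C observation at the center
yg, xg = np.mgrid[0:5, 0:5].astype(float)
background = np.full((5, 5), 10.0)
analysis = cressman_pass(background, xg, yg, [2.0], [2.0], [12.0], radius=2.0)
```

Points within the influence radius are drawn toward the observation, while points beyond it retain the background value.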

MesoWest
- Cooperative sharing of current weather information around the nation
- Real-time and retrospective access to weather information through a state-of-the-art database
- edu/mesowest
- Horel et al. (2002) Bull. Amer. Meteor. Soc.

Arctic Outbreak: November 2003
NDFD 48 h forecast | ADAS analysis

Average 00Z Temperature: 18 Nov. – 23 Dec.
NDFD Forecast | ADAS Analysis
RMS difference RUC2 – OBS: 2.7 C (00z), 4.0 C (12z)
RMS difference ADAS – OBS: 1.7 C (00z), 2.4 C (12z)

48 h Forecast Bias (NDFD – ADAS), 00z, 18 Nov. – 23 Dec. 2003

Average RMS Differences between NDFD Forecasts and ADAS Grids over the Western United States
NDFD forecasts issued 00z. Period: 12 Nov. – 24 Dec. 2003. Valid at 00z.
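The bias and RMS difference statistics used throughout these comparisons reduce to simple grid arithmetic. A minimal sketch (the function name and the toy 2x2 grids are illustrative assumptions):

```python
import numpy as np

def grid_stats(forecast, analysis):
    """Bias (mean difference) and RMS difference between a forecast grid
    and the verifying analysis grid."""
    diff = forecast - analysis
    return diff.mean(), np.sqrt((diff ** 2).mean())

# Tiny illustration on 2x2 grids
fcst = np.array([[1.0, 2.0], [3.0, 4.0]])
anal = np.array([[0.0, 2.0], [3.0, 8.0]])
bias, rms = grid_stats(fcst, anal)
```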

Arctic Outbreak: November 2003
NDFD 48 h forecast | ADAS analysis (NDFD and ADAS sample means removed)

Comparison of daily temperature anomaly maps (Nov. – Dec.): temperature spatial anomaly pattern correlation as a function of NDFD forecast length during 12 Nov. – 24 Dec. 2003. Anomaly relative to the sample average for NDFD and ADAS.

Temperature spatial anomaly pattern correlation as a function of NDFD forecast length, averaged over 12 Nov. – 24 Dec. 2003. Anomaly relative to the sample average for NDFD and ADAS.
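The anomaly pattern correlation described above removes each dataset's own sample-mean grid before correlating the daily fields spatially. A minimal sketch with synthetic data (function name and array shapes are illustrative assumptions):

```python
import numpy as np

def anomaly_pattern_correlation(fcst_series, anal_series):
    """Daily spatial anomaly pattern correlation between forecast and
    analysis grids; anomalies are taken relative to each dataset's own
    sample-mean grid, as in the comparison described above."""
    fa = fcst_series - fcst_series.mean(axis=0)   # remove forecast sample mean
    aa = anal_series - anal_series.mean(axis=0)   # remove analysis sample mean
    return np.array([np.corrcoef(f.ravel(), a.ravel())[0, 1]
                     for f, a in zip(fa, aa)])

# Illustration with synthetic grids: identical fields correlate perfectly
rng = np.random.default_rng(0)
anal_series = rng.normal(size=(3, 4, 4))   # 3 "days" of 4x4 analysis grids
corrs = anomaly_pattern_correlation(anal_series.copy(), anal_series)
```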

Summary
- Assimilation of surface data is critical for generating and verifying gridded forecasts of surface parameters
- MDL is using the RUC for national NDFD validation and is exploring use of ADAS in the West
- Differences between ADAS analysis and NDFD forecast grids result from a combination of analysis and forecast errors
- Differences between the ADAS temperature analysis on the 5-km grid and station observations are on the order of 2 C (RMS differences of 1.7 C at 00z and 2.4 C at 12z)
- Differences between NDFD temperature forecasts and the ADAS temperature analysis are on the order of 3-5 C; this may reflect an upper bound on forecast error, since the ADAS analysis contains biases
- Anomaly pattern correlations between NDFD and ADAS temperature grids over the western United States suggest forecasts are most skillful out to 48 h
- Major issue for NDFD validation: the true state of the atmosphere is unknown
- Specific issues for NDFD validation in complex terrain:
  - Scales of physical processes
  - Analysis methodology
  - Validation techniques

Issues for NDFD Validation in Complex Terrain
- Analysis methodology:
  - The analysis of record will require continuous assimilation of surface observations, as well as other data resources (radar, satellite, etc.)
  - Considerable effort is required to quality control observations (surface station siting issues, radar terrain clutter problems, etc.)
  - Quality control of precipitation data is particularly difficult
  - The NWP model used to drive the assimilation must resolve terrain without smoothing at the highest possible resolution (2.5 km)
  - NCEP is proposing to provide an analysis of record for such applications

Issues for NDFD Validation in Complex Terrain
- Validation technique:
  - Upscaling of WFO grids to the NDFD grid introduces sampling errors in complex terrain
  - Which fields are verified?
    - Max/min T vs. hourly temperature?
    - Max/min spikes
    - Fitting of a sinusoidal curve to max/min T to generate hourly T grids
    - Instantaneous/time-averaged temperature obs vs. max/min
  - Objectively identify regions where forecaster skill is limited by sparse data
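The sinusoidal-curve idea above can be sketched as a single cosine passing through the max and min temperatures. This is a toy illustration; the function name and the assumed hour of maximum temperature (15 local time) are assumptions, not the operational fitting procedure:

```python
import numpy as np

def hourly_from_maxmin(tmax, tmin, hour_of_max=15):
    """Generate 24 hourly temperatures from max/min T using one cosine
    whose peak falls at hour_of_max (assumed 15 local time here)."""
    hours = np.arange(24)
    mean = 0.5 * (tmax + tmin)
    amp = 0.5 * (tmax - tmin)
    return mean + amp * np.cos(2.0 * np.pi * (hours - hour_of_max) / 24.0)

# Example: max 20 C, min 10 C
t = hourly_from_maxmin(20.0, 10.0)
```

The curve reaches the maximum at the assumed afternoon hour and the minimum twelve hours earlier, which is one simple way to spread max/min forecasts into hourly grids.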

Related Presentations
- Monday poster session: David Myrick, "A Modification to the Bratseth Method of Successive Corrections for Complex Terrain"
- Tuesday 8:30, Room 3A: Mike Splitt, "Geospatial Uncertainty Analysis and Gridded Forecast Verification"

Average RMS Differences between NDFD Forecasts and ADAS Grids over the Western United States
NDFD forecasts issued 00z. Period: 12 Nov. – 20 Dec. 2003. Valid at 00z and 12z.

48 h Forecast RMS Difference (NDFD – ADAS), 00z, 18 Nov. – 23 Dec. 2003