Application of MET for the Verification of NWP Cloud and Precipitation Products using A-Train Satellite Observations
Paul A. Kucera, Courtney Weeks, Barbara Brown, and Randy Bullock
National Center for Atmospheric Research / Research Applications Laboratory, Boulder, Colorado, USA
IPWG, October 2010

Outline
– Introduction
– MET Overview
– Motivation
– Example Comparisons
– Future Work

Introduction
We recently started a project to demonstrate the application of the NCAR Model Evaluation Tools (MET) for forecast evaluation using A-Train observations (currently focused on CloudSat).
The goal of the project is to create a common toolkit for integrating satellite observations into a framework that permits meaningful comparisons with numerical model output:
– Evaluate forecast products and features to determine when and why forecasts are sometimes deficient
– Develop tools that extend easily to precipitation validation of satellite products

MET Overview
MET was originally developed to support the Developmental Testbed Center (DTC) at NCAR and is a community resource:
– It is a stand-alone software package associated with the Weather Research and Forecasting (WRF) system.
– Available at:

MET Overview
MET is designed to evaluate a variety of datasets (e.g., rain gauge, radar, satellite, NWP products). The following statistical tools are available:
– Grid-to-point validation (e.g., NWP or satellite precipitation products against rain gauges)
– Grid-to-grid validation (e.g., NWP precipitation against satellite or radar precipitation products)
– Advanced spatial validation techniques that compare precipitation “features” in gridded fields
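
To make the grid-to-point idea concrete, here is a minimal Python sketch; it is not MET's implementation (MET handles this through its Point-Stat tool and configuration files), it assumes a regular grid with 1-D coordinate vectors, and all names are hypothetical:

```python
import numpy as np

def grid_to_point(field, lats, lons, gauge_lats, gauge_lons):
    """Sample a 2-D gridded field (nlat x nlon) at gauge locations,
    nearest-neighbor style, returning one value per gauge."""
    i = np.abs(lats[:, None] - gauge_lats[None, :]).argmin(axis=0)
    j = np.abs(lons[:, None] - gauge_lons[None, :]).argmin(axis=0)
    return field[i, j]  # matched pairs then feed the statistics below
```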

Motivation
Most evaluation studies rely on the use of standard measures to quantify the quality of forecasts or observed fields:
– Mean error
– Bias
– Mean absolute error
– Root mean squared error
– ...
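
As a reference for the measures listed above, a minimal Python sketch computing them for matched forecast/observation pairs (an illustration, not MET code):

```python
import numpy as np

def standard_scores(fcst, obs):
    """Traditional verification measures for matched pairs (1-D arrays)."""
    err = fcst - obs
    return {
        "mean_error": err.mean(),               # additive bias
        "mult_bias": fcst.mean() / obs.mean(),  # multiplicative bias
        "mae": np.abs(err).mean(),              # mean absolute error
        "rmse": np.sqrt((err ** 2).mean()),     # root mean squared error
    }
```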

Motivation
Traditional statistics are often unable to account for spatial or temporal errors:
– Displacement in time and/or space
– Location
– Intensity
– Orientation errors
There have been recent efforts to develop spatial evaluation techniques.

Motivation
Several classes of spatial methods are currently being developed:
– Neighborhood
– Scale separation
– Field deformation
– Feature-based
We have been developing a feature-based method called MODE.
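
For a flavor of the neighborhood class, a minimal Fractions Skill Score (FSS) sketch in Python; this is illustrative only and not how MET implements its neighborhood methods:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fss(fcst, obs, threshold, window):
    """Fractions Skill Score: compare fractional event coverage in
    window x window neighborhoods instead of point-by-point matches."""
    bf = (fcst >= threshold).astype(float)
    bo = (obs >= threshold).astype(float)
    ff = uniform_filter(bf, size=window)  # forecast event fraction per window
    fo = uniform_filter(bo, size=window)  # observed event fraction per window
    num = np.mean((ff - fo) ** 2)
    den = np.mean(ff ** 2) + np.mean(fo ** 2)
    return 1.0 - num / den if den > 0 else np.nan  # 1 = perfect
```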

Method for Object-based Diagnostic Evaluation (MODE)
Example: 24-h precipitation forecast vs. precipitation analysis. The MODE results indicate the forecast is:
– Slightly displaced (centroid distance)
– Too intense (median intensity)
– A little too large (ratio of areas)
In contrast, the traditional scores are: POD = 0.40, FAR = 0.56, CSI = 0.27.
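
The traditional scores quoted above come from a 2x2 contingency table; a minimal sketch of their definitions (illustrative, not MET code):

```python
import numpy as np

def categorical_scores(fcst_yes, obs_yes):
    """POD, FAR, CSI from boolean event masks (e.g., precip >= threshold).
    Assumes at least one event is forecast and one is observed."""
    hits = np.sum(fcst_yes & obs_yes)
    misses = np.sum(~fcst_yes & obs_yes)
    false_alarms = np.sum(fcst_yes & ~obs_yes)
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi
```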

Example CloudSat/NWP Comparisons
We performed our comparison using the RUC (Rapid Update Cycle) cloud-top height and derived reflectivity products at a spatial resolution of 20 km over the continental US.
Performed a seasonal correlation study for different cloud types.

Example: 06 September

Standard Statistics
Cloud height distributions along the track

Standard Statistics
Forecast mean and standard deviation with 95% confidence intervals (0800 UTC)
– Different weighting methods

Standard Statistics
Correlation (Pearson, Kendall, Spearman), multiplicative bias, mean error, mean squared error, root mean squared error, bias-corrected mean squared error, and mean absolute error with 95% confidence intervals (bootstrap method), at 0800 UTC
– Different weighting methods
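
A minimal sketch of the percentile-bootstrap confidence intervals mentioned above, here for a correlation statistic (illustrative Python, not MET's implementation; the resampling count and seed are arbitrary choices):

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr, kendalltau

def bootstrap_ci(fcst, obs, stat=lambda f, o: pearsonr(f, o)[0],
                 n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for a verification statistic on matched pairs.
    Swap in spearmanr or kendalltau via the `stat` argument."""
    rng = np.random.default_rng(seed)
    n = len(fcst)
    samples = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)  # resample matched pairs with replacement
        samples.append(stat(fcst[idx], obs[idx]))
    lo, hi = np.percentile(samples, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi
```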

Object-based Verification
MODE is being modified to search model fields in the vertical plane, spatially and temporally, to find matched objects:
– parallel to track, cross track, and in time
The next slides show examples of the search routines being implemented in MODE.
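
A toy Python sketch of the search idea, scoring how well thresholded model and observed "curtains" (height x along-track distance) overlap under trial along-track shifts; the real search lives inside MODE, and the threshold here is a hypothetical reflectivity value:

```python
import numpy as np

def best_along_track_shift(model, obs, max_shift=10, threshold=-20.0):
    """Find the along-track shift (in grid columns) that maximizes the
    Jaccard overlap of thresholded model and observed curtains."""
    obs_mask = obs >= threshold
    best_shift, best_score = 0, -1.0
    for s in range(-max_shift, max_shift + 1):
        # np.roll wraps around; a real search would pad the edges instead
        model_mask = np.roll(model, s, axis=1) >= threshold
        union = np.sum(model_mask | obs_mask)
        score = np.sum(model_mask & obs_mask) / union if union else 0.0
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift, best_score
```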

Object-based Comparison: Along Track

Object-based Comparison: Parallel to Track (Westward)

Object-based Comparison: Parallel to Track (Eastward)

Object-based Comparison: Cross Track

Object-based Comparison: -1 hour (0700 UTC)

Object-based Comparison: +1 hour (0900 UTC)

Spatial/Seasonal Comparison
Spatial correlation along track for different CloudSat cloud types and for different seasons
Combine with other datasets (e.g., MODIS) to better understand spatial variability off track
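
A minimal sketch of the kind of stratified correlation described above, assuming matched along-track samples collected in a table; the column names are hypothetical:

```python
import pandas as pd

def seasonal_correlation(df):
    """Along-track correlation of model vs. CloudSat cloud-top height,
    stratified by season and CloudSat cloud type.
    Expects columns: season, cloud_type, model_cth, cloudsat_cth."""
    return (df.groupby(["season", "cloud_type"])
              .apply(lambda g: g["model_cth"].corr(g["cloudsat_cth"])))
```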

Future Work
Future updates to MET will include:
– Finish implementing read and remap tools for A-Train products in MET
– Add object-based verification in the vertical plane to MODE
– Add methods for verifying in time
– Improve methods for verifying cloud and precipitation properties
– Implement the METviewer database and display system

Future Work
A-Train comparisons:
– Reflectivity profiles, cloud-top height, cloud base, precipitation, cloud type, cloud phase, etc.
Currently developing a database of case studies for comparison:
– Tropical storms, multilayer clouds, complex terrain
The toolkit is easily extended to other satellite datasets such as HRPP, TRMM, and GPM.

Acknowledgments
This study is supported by NASA Project NNX09AN79G.
We used QuickBeam, developed at CSU, for some of the initial analysis:
– Haynes, J. M., R. T. Marchand, Z. Luo, A. Bodas-Salcedo, and G. L. Stephens, 2007: A multi-purpose radar simulation package: QuickBeam. Bull. Amer. Meteor. Soc., 88.

Thank You