Program to Evaluate High Resolution Precipitation Products (PEHRPP): An Update
Matt Sapiano, P. Arkin, J. Janowiak, D. Vila, Univ. of Maryland/ESSIC


1 Program to Evaluate High Resolution Precipitation Products (PEHRPP): An Update
Matt Sapiano, P. Arkin, J. Janowiak, D. Vila, Univ. of Maryland/ESSIC, College Park, MD
Joe Turk, Naval Research Laboratory, Monterey, CA (Presenter)
E. Ebert, Bureau of Meteorology, Melbourne, Australia

2 Outline
Brief explanation of PEHRPP
PEHRPP activities
Workshop highlights
Recommendations to IPWG

3 What is PEHRPP?
A collaborative effort to understand the capabilities and characteristics of High Resolution Precipitation Products
High Resolution = daily/sub-daily; <1 degree
Sponsored by IPWG with broad voluntary participation
Capitalizing on existing research and operational activities/datasets
Providing a link between the observational and application communities

4 PEHRPP Strategy
PEHRPP is designed to exploit four kinds of validation opportunities:
1. Networks based on national or regional operational rain gauge or radar networks
2. High-quality time series from ongoing research programs (GEWEX CEOP, TAO/TRITON buoy gauges, Ethiopia, Sao Paulo)
3. Field program data sets (NAME, BALTEX)
4. Coherent global-scale variability as depicted by the various data sets (the big picture)

5 Real-time radar/gauge comparisons (slide courtesy of C. Kidd, with additions) (See Ebert et al., BAMS, 2007)

6 Guangdong Validation Site: Jianyin Liang, CMA, with Pingping Xie, NOAA
April–June 2005 period of initial data (394 hourly real-time gauges)
[Figure: Guangdong seasonal mean bias for Gauge, CMORPH, 3B42RT, 3B42, and MWCOMB]

7 Sub-daily, high quality time-series
Comparison against sub-daily TAO/TRITON buoys and US SGP sites:
– Correlations are generally high
– Under-estimates over ocean
– Over-estimates over US in summer in absence of gauge correction
– Models included and perform OK at daily scales, less well at sub-daily
Sapiano and Arkin, J. Hydrometeorology, 2008 (in press)
[Figure: % bias]
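Statistics like the correlation and percent bias used in the comparison above can be computed as in this minimal sketch; the series and values are hypothetical illustrations, not PEHRPP data:

```python
import numpy as np

def percent_bias(est, ref):
    """Percent bias: difference of totals as a percentage of the reference
    total. Negative values mean the estimate under-reports precipitation."""
    return 100.0 * (est.sum() - ref.sum()) / ref.sum()

def correlation(est, ref):
    """Pearson correlation between estimated and reference series."""
    return float(np.corrcoef(est, ref)[0, 1])

# Hypothetical 3-hourly accumulations (mm) at an ocean buoy site
ref = np.array([0.0, 1.2, 3.5, 0.0, 0.4, 2.1, 0.0, 5.0])
est = np.array([0.1, 1.0, 2.8, 0.0, 0.6, 1.5, 0.2, 4.1])

print(percent_bias(est, ref))   # negative here: an under-estimate
print(correlation(est, ref))
```

A negative percent bias combined with a high correlation matches the pattern reported for the buoy comparisons: timing is captured well while amounts are under-estimated.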

8 North American Monsoon Experiment Precipitation Daily Evolution: NERN vs Satellite over NAME Domain (Nesbitt)

9 First PEHRPP Workshop
Hosted by the IPWG, 3–5 December 2007, WMO, Geneva
40 attendees from 12 countries
Presentations and working group reports on applications, validation and error metrics

10 Presentations and discussions
Talks: Precipitation Products (5), Regional Validation (10), Applications (9), Error Metrics (6)
Advanced blending methodologies
– Use of Kalman filter
Updates on 8 validation sites
– Northern Europe, Southern Europe, Japan, Brazil, Australia, Mozambique, Continental US, Western Africa
Use of HRPPs by users of hydrological and mesoscale forecast models
Improved and relevant error metrics
– Focused on user requirements
Summary due to appear in December BAMS (Turk et al.)
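The Kalman-filter blending noted above weights each input by the inverse of its error variance; the scalar sketch below illustrates that idea with hypothetical rain rates and variances (it is not the operational CMORPH algorithm):

```python
def kalman_blend(x_prior, var_prior, obs, var_obs):
    """One scalar Kalman update: blend a prior rain-rate estimate with a
    new observation, weighting by inverse error variance."""
    gain = var_prior / (var_prior + var_obs)     # weight given to the observation
    x_post = x_prior + gain * (obs - x_prior)    # variance-weighted blend
    var_post = (1.0 - gain) * var_prior          # posterior uncertainty shrinks
    return x_post, var_post

# Hypothetical: an aged IR-based rate (larger variance) updated by a fresh
# passive-microwave overpass (smaller variance), in mm/h
x_post, var_post = kalman_blend(x_prior=4.0, var_prior=2.0, obs=2.0, var_obs=0.5)
print(x_post, var_post)   # pulled toward the more certain PMW value
```

The posterior variance is always smaller than either input variance, which is why blending more observations steadily tightens the estimate.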

11 Key recommendations
Multiple recommendations were made, the key ones being:
1. Several high resolution precipitation products exhibit useful skill, but clear superiority of any one is not yet evident; continuing activities are useful to this end
2. IPWG should establish a continuing effort to conduct, facilitate and coordinate validation and evaluation of such products
3. A concerted validation/intercomparison campaign, covering multiple climatic regimes and seasons, should be designed and conducted

12 Discussion
PEHRPP has become a useful framework for validation activities on high resolution data
– Not all elements have been addressed
– Synthesis of results is still lacking, hence the need for a concerted campaign
New leadership is required for activities to continue
– Mandate needs to come from IPWG
Currently working on a joint proposal to WGNE to collaborate by including more model precipitation in PEHRPP
– Ebert, Huffman, Kidd, Sapiano and others

13 Working Group Key Recommendations: VALIDATION
Recommendation 1: Recommend an intercomparison project (similar to PIP, AIP) for the evaluation of HRPPs. Products should aim for a standard of three-hourly, 0.25 degree resolution with global coverage, with validation done at the regional scale. Details of the intercomparison (locations, temporal scale, etc.) will be charged to an intercomparison working group in association with the GPM working group to maximize the impact of such a comparison. The intercomparison should be completed in the next 24 to 36 months.
Historical Background
Precipitation Intercomparison Program (PIP), sponsored by NASA's WetNet Project
– PIP-1: First assessment of SSM/I precipitation algorithms on a global scale (Aug–Nov 1987).
– PIP-2: Examined SSM/I precipitation algorithms on a case basis for multiple years, seasons, and meteorological events (Jul 1987–Feb 1993).
– PIP-3: Examined global scale precipitation algorithms over an entire year (1992).
Algorithm Intercomparison Program (AIP), sponsored by the Global Precipitation Climatology Project
– AIP-1: Japan and surrounding region during Jun–Aug 1989, covering frontal and tropical convective rainfall.
– AIP-2: Western Europe during Feb–Apr 1991 with rainfall and snowfall over both land and sea regions.
– AIP-3: Tropical Pacific Ocean region (1°N–4°S, 153–158°E) during Nov 1992–Feb 1993.
Kidd, 2001: Satellite Rainfall Climatology: A Review, Int. J. Climatol.

14 Working Group Key Recommendations: VALIDATION
Recommendation 2: Recommend that the outputs of the current and future validation efforts be better utilized: a working group should be formed under IPWG as a PEHRPP activity, and should report by the next IPWG meeting (October 2008). The co-chairs should be a product developer and a validation site developer.
Recommendation 3: Recommend the use of existing HRPPs in hydrological impact studies, such as the EUMETSAT H-SAF and HydroMet testbeds in the US, to assess the usefulness of the HRPP products in hydrological models.
Recommendation 4: Recommend that we include and/or encourage the development of high-latitude sites such as BALTEX, LOFZY, high-latitude maritime radar sites, and/or the Canadian sites.
Recommendation 5: Recommend that countries or weather institutions with high quality ground validation datasets actively participate in IPWG-sponsored validation activities.

15 Working Group Key Recommendations: APPLICATIONS
Recommendation 1: Product developers should be encouraged to formulate and produce error estimates for their products, by:
– Engaging end users: IPWG should investigate the forms of error information required for applications
– Engaging other product developers
Since full error estimates will take time to obtain, developers should be encouraged to make other information available, such as the main source of data (e.g., SSM/I F-13 GPROF V6) and the latency of PMW data (time since last MW overpass)

16 Working Group Key Recommendations: APPLICATIONS
Recommendation 2: PEHRPP/IPWG should make satellite organizations aware that PMW data are useful for a broad range of applications and that these applications would benefit from more data, faster data delivery and the maintenance of all existing data streams.
Recommendation 3: Product developers should be encouraged to pursue other assimilation and/or downscaling methodologies which exploit all available information (satellites, NWP, gauges, lightning estimates), particularly those optimized for specific applications.

17 Working Group Key Recommendations: ERROR METRICS
There is a general feeling that the current understanding of HRPP quality/certainty/errors suffers from a lack of adequate error metrics that are pertinent to users and well understood.
Long-term recommendations:
– Physically based error characterization of retrievals (a key element of GPM)
– A consistent set of "basic" metrics
– A comprehensive quantitative error model that allows users to specify time and space scales, supply the space-time… coefficients associated with a precipitation data set, and obtain estimated RMS error (diagnostic) or create synthetic precipitation fields (prognostic)
– Work towards an assimilation-like method for combinations
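The scale dependence such an error model would capture can be demonstrated with synthetic data: averaging an estimate to coarser time scales damps random error, so the estimated RMS error falls as the user's chosen scale coarsens. A sketch under those assumptions (synthetic series, not a fitted HRPP error model):

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.gamma(shape=0.5, scale=2.0, size=240)   # synthetic hourly rain (mm)
est = truth + rng.normal(0.0, 1.0, size=240)        # estimate with random error

def rmse_at_scale(est, ref, window):
    """RMSE after averaging both series over non-overlapping windows of
    `window` hours (window=1 leaves the hourly series unchanged)."""
    n = (len(ref) // window) * window
    e = est[:n].reshape(-1, window).mean(axis=1)
    r = ref[:n].reshape(-1, window).mean(axis=1)
    return float(np.sqrt(np.mean((e - r) ** 2)))

hourly = rmse_at_scale(est, truth, 1)    # full random error
daily = rmse_at_scale(est, truth, 24)    # error reduced by time-averaging
```

For purely random error, averaging over N independent hours shrinks the RMS error by roughly 1/sqrt(N); correlated errors decay more slowly, which is exactly the behaviour a space-time error model must encode.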

18 Working Group Key Recommendations: ERROR METRICS
Short-term recommendations:
– Develop a standing working group on error metrics
– Agree on a short list of error metrics, each with confidence intervals:
  - "traditional" metrics that give insight at the scales of interest
  - other metrics suggested by the long-term vision
  - fuzzy validation framework
  - WWRP/WGNE Joint Working Group on Verification list of metrics
  - diagnostics (PDFs, conditional statistics, …)
  - examine using transformed data in metrics
– Test the practicality of these metrics for producers and their utility for users:
  - Inter-satellite errors (Joyce/NOAA subsetted gridded (30-min, 0.25°) precipitation data sets from ~15 satellites/sensors)
  - Characterizing errors by regime
  - Establishing some minimum set of space/time correlations that are needed
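As an illustration of attaching confidence intervals to such metrics, the sketch below computes the probability of detection (POD) for a rain/no-rain threshold and a percentile-bootstrap 95% interval; the data, threshold, and replicate count are all assumptions for illustration:

```python
import numpy as np

def pod(est, ref, thresh=0.1):
    """Probability of detection: fraction of observed events (ref >= thresh)
    that the estimate also flags as events."""
    events = np.sum(ref >= thresh)
    if events == 0:
        return np.nan                    # undefined when no events observed
    hits = np.sum((ref >= thresh) & (est >= thresh))
    return hits / events

def bootstrap_ci(est, ref, stat, n_boot=1000, seed=0):
    """Percentile-bootstrap 95% confidence interval for a verification stat."""
    rng = np.random.default_rng(seed)
    vals = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(ref), size=len(ref))   # resample pairs
        vals.append(stat(est[idx], ref[idx]))
    return np.nanpercentile(vals, [2.5, 97.5])

# Hypothetical 3-hourly accumulations (mm) at one grid box
ref = np.array([0.0, 0.5, 2.0, 0.0, 1.1, 0.0, 3.2, 0.4, 0.0, 0.9])
est = np.array([0.0, 0.6, 0.0, 0.0, 1.4, 0.2, 2.5, 0.0, 0.0, 1.2])

lo, hi = bootstrap_ci(est, ref, pod)
print(pod(est, ref), (lo, hi))   # point estimate with its 95% interval
```

Resampling matched estimate/reference pairs preserves their joint structure, which is what makes the interval meaningful for a categorical score like POD.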