Precipitation Validation


Precipitation Validation
Hydrology Training Workshop, University of Hamburg
Chris Kidd (…and many others…)

Hydrology Training Workshop: University of Hamburg, 12-14 October 2010

Overview
- Precipitation characteristics
- Surface measurements: gauges, radar
- Validation case study: the European IPWG site
- Experiences: other analyses
- Results: statistical dependency
- Conclusions

Why? Essentially, to improve the estimates. (Images: 2008 floods; 2009 floods)

UK Midlands: 20 July 2007

Precipitation Characteristics
- The 'modal' instantaneous precipitation value is zero
- Rain intensities are skewed towards zero: at middle to high latitudes, heavily so
- Spatial/temporal accumulations will 'normalise' the data
- 1 mm of rain ≡ 1 l m⁻² or 1 kg m⁻² (i.e. 1000 t km⁻²)
(Figures: occurrence; accumulation)
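The skewed, zero-inflated nature of instantaneous rain rates, and the normalising effect of accumulation, can be illustrated with synthetic data. The 90% dry fraction and the exponential wet intensities below are illustrative assumptions, not measured values:

```python
# Instantaneous rain rates are zero-inflated and heavily right-skewed;
# averaging many intervals together (accumulation) pulls the distribution
# towards symmetry (central limit theorem). Purely synthetic illustration.
import numpy as np

rng = np.random.default_rng(42)

def skewness(x):
    """Sample skewness: third standardised central moment."""
    x = np.asarray(x, dtype=float)
    return np.mean((x - x.mean()) ** 3) / x.std() ** 3

# 10,000 "days" of 48 half-hourly rain rates: ~90% of intervals dry,
# wet intervals drawn from an exponential distribution (mm/h).
wet = rng.random((10_000, 48)) < 0.1
rates = rng.exponential(2.0, size=(10_000, 48)) * wet

print(f"instantaneous skewness: {skewness(rates.ravel()):.2f}")
print(f"daily-mean skewness:    {skewness(rates.mean(axis=1)):.2f}")
```

The daily means are far less skewed than the raw rates, which is why validation statistics computed on accumulations behave so much better than those on instantaneous fields.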

Surface measurement
- Clee Hill radars (C-band vs ATC)
- Micro rain radar (MRR)
- ARG100 gauge (0.2 mm/tip)
- Young's gauge (0.1 mm/tip)

Conventional measurements
Gauge data (rain/snow):
- Simple measurements of accumulations
- Quantitative sampling (tipping-bucket gauges, etc.)
- But: point measurements, under-catch errors, etc.
Radar systems:
- Backscatter from hydrometeors (rain/snow/hail)
- Spatial measurements
- Potential to discriminate between precipitation types
- But: range effects, anomalous-propagation errors, Z-R relationships…
Precipitation is highly variable both temporally and spatially: measurements need to be representative
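The imprecision of the backscatter-to-rainfall (Z-R) relationship can be sketched numerically. This uses the classic Marshall-Palmer relation Z = 200 R^1.6, chosen here purely for illustration; it is not necessarily the relation used by any radar network mentioned in these slides:

```python
# Convert radar reflectivity (dBZ) to rain rate via an empirical Z-R relation.
# Marshall-Palmer coefficients (a=200, b=1.6) are the classic stratiform
# values; operational radars use different (a, b) pairs, which is one source
# of the imprecise backscatter-to-rainfall relationship noted above.

def zr_rain_rate(dbz, a=200.0, b=1.6):
    """Rain rate in mm/h from reflectivity in dBZ, using Z = a * R**b."""
    z = 10.0 ** (dbz / 10.0)        # dBZ -> linear reflectivity (mm^6 m^-3)
    return (z / a) ** (1.0 / b)     # invert Z = a * R^b

# Sensitivity: the same 40 dBZ echo maps to different rain rates depending
# on the assumed relation (the second pair is an illustrative alternative).
for a, b in [(200, 1.6), (300, 1.4)]:
    print(f"a={a}, b={b}: 40 dBZ -> {zr_rain_rate(40.0, a, b):.1f} mm/h")
```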

Conventional observations
- Rain-gauge network (~20,000 gauges): confined mainly to land areas, leaving large areas of ocean with no data
- All gauges placed together would measure an area no larger than the centre circle of a football field
- Radar largely duplicates rain-gauge coverage
Precipitation is highly variable both temporally and spatially: measurements need to be representative

Variance explained by the nearest station (Jürgen Grieser). Variance is based upon monthly data: shorter periods mean lower explained variance.

What is truth? Co-location experiment: 8 gauges / 4 MRRs

1st gauge…

2nd gauge…

2 more gauges…

All gauges…

…plus the MRR

Radar vs gauge measurements
- Cumulative rainfall: radar vs gauge agreement is reasonable, but not quite 1:1
- 10 June 2009: 40 mm in 30 minutes (MRR 24.1 GHz; tipping-bucket gauge)
- Tipping-bucket gauges provide quantised measurements (0.1 or 0.2 mm/tip)
- MRR critical for light rainfall
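The quantised nature of tipping-bucket measurements can be sketched with a minimal model (real gauges also suffer under-catch and evaporation losses, and an integer-count implementation would be more robust than this float sketch):

```python
# Simulate how a tipping-bucket gauge quantises rainfall: water accumulates
# in the bucket, and each time it reaches the tip size (0.2 mm here, as for
# the ARG100) one tip is recorded and the bucket keeps the remainder.
# Light, steady rain therefore appears as sparse, discrete tips.

def tipping_bucket(rain_mm_per_step, tip_mm=0.2):
    """Number of bucket tips recorded at each time step."""
    tips, bucket = [], 0.0
    for r in rain_mm_per_step:
        bucket = round(bucket + r, 6)          # round to guard against float drift
        n = int(bucket // tip_mm)              # whole tips completed this step
        bucket = round(bucket - n * tip_mm, 6)
        tips.append(n)
    return tips

# 0.05 mm of steady drizzle per step: a tip is only recorded every 4th step.
series = [0.05] * 12
tips = tipping_bucket(series)
print(tips)   # -> [0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1]
print(f"{sum(tips) * 0.2:.1f} mm recorded vs {sum(series):.1f} mm fallen")
```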

Research radars: Clee Hill ATC radar and C-band; Chilbolton C-band; University of Helsinki C-band

National network radars: Doppler, dual-polarised; 100/210 km range

Radar vs gauge data: radar (daily integrated) vs gauge data

Helsinki Testbed (FMI, Helsinki)
Cold season: surface issues, and mixed-phase precipitation reaching the surface
Map legend:
- Circles: 4 operational Doppler weather radars (FMI & EMHI); 1 dual-polarisation radar plus 1 vertically pointing C-band research radar (Vaisala & UH); 2 vertically pointing POSS radars
- Dots: 80 gauges
- Large diamonds: FD12P optical scatterometers
- Triangles: ultrasonic snow-depth sensors
- Squares: weighing gauges

Ground validation - IPWG synergies (after Turk & Arkin, BAMS 2008; GV = Ground Validation)

Criteria | GV program | IPWG
Type of validation | Priority on physical, also statistical | Has focused on descriptive and statistical
Source of validation data | Arranged for and collected by principal investigators | Not requested; IPWG participants free to contribute
Source of observational data | Specific satellite-based products | Participants provide products directly to validation groups
Types of validation data | Gauge, radars and specialist instrumentation, diverse in specific locations | Conventional gauge and/or radar networks, usually part of a national network
Types of observational data | Single-sensor, instantaneous, full-resolution datasets | Blended satellite-sensor products, time/area averaged

Both approaches are complementary

Summary: surface measurements
Representativeness of surface measurements:
- Over land: generally good, but variable
- Over oceans: virtually non-existent
Measurement issues:
- Physical collection interferes with the measurement (e.g. wind effects, frozen precipitation, etc.)
- Radar: imprecise backscatter-to-rainfall relationship (also clutter, range effects, bright band, etc.)
Satellites offer consistent, regular measurements, global coverage and real-time delivery of data

Satellite Data sets

Observation availability

Spectral region | Availability | Cycle (current) | Res.*
Visible | Since start of satellite era | Geostationary, 15/30 min; polar orbiters, 6-hourly | 250 m+
Infrared | Shortly after start of satellite era; ~calibrated since 1979 | As visible | 1 km+
Passive microwave | Experimental 1972/1975; uncalibrated since 1978; calibrated since 1987 | Polar orbiters, plus low-Earth orbiter (TMI) | 4 km+
Active microwave (radar) | 13.8 GHz since 1997; 94 GHz since 2006 | Low-Earth orbiter (PR); polar orbiter (CloudSat) | 4 km / 1.5 km

* Resolutions vary greatly with scan angle, frequency, sensor, etc.

Satellite observational scales
- Observations are made nominally at 1 km / 15 min; estimates are possible at 1 km / 1 min, but inaccurate
- Precipitation products are generally available at 0.25° daily, or 0.25° 3-hourly
- Accuracy of satellite precipitation estimates improves with temporal/spatial averaging
(Diagram: 1-25 km scales; Earth-resources satellites (Ikonos, Landsat, SPOT, MODIS) vs precipitation systems; Vis/IR vs MW; GEO rapid-scan Vis/IR every 15 minutes vs LEO MW every 3 hours)

LEO vs GEO satellite observations (movie): the contrast between the two orbit types can be seen here, with low-Earth-orbit sensors (SSM/I and TRMM) against geostationary sensors (Meteosat/MSG).

Observations to Products
- Observations: visible, infrared, passive MW, active MW, plus model outputs
- Retrievals: data inputs at time/space resolutions from instantaneous, full resolution up to monthly/seasonal, climate resolution
- Products: climatology, agriculture/crops, meteorology, hydrology

Global precipitation data sets: many different products at different spatial/temporal resolutions… and formats!

Setting up a validation site

IPWG European validation
- Radar used as 'ground truth': a composite of radars over the UK, France, Germany, Belgium and the Netherlands
- Nominal 5 km resolution, equal-area polar-stereographic projection
- Data and product ingest in near real time
- Statistical and graphical output (SGI/IRIX; f77/netpbm)

Processing setup: perceived requirements vs implementation
- Daily inter-comparison → 00Z-24Z (also -06, -09, -12Z)
- 0.25° resolution → 25 km resolution
- Real-time → near real-time, dependent upon product
- Validation data → radar data (gauges being added later)
- Automatic → quasi-automatic (not 'operational')
- Many products → limited number of products

Processing schedule
- 01Z: global IR
- 02Z: SSM/I data; GPI, FDA, ECMWF
- 03Z: European radar data; PMIR
- 04Z: 3B4x
- 05Z: CICS data; statistics at 20 km
- 22Z: EUMETSAT MPE; web pages

Processing system
- Initial setup: set dates; clean out old/decayed data
- Remapping of data: to regional grid or 5 km PSG projection
- Acquiring data: search existing data; list missing data; create .netrc file; ftp from data sources
- Results generation: statistical analysis; graphical output
- Web pages: generate HTML files; copy to server

Processing checks (flow)
- set d0 = today; for each day (d0 … d0-31)
- for each product and day: remap to PSG using LUTs; standardise format and filename
- for each day (d0 … d0-31) and each product (p1-pn): if the product for that day does not exist, add it to the .netrc file
- for each data source (s0-sn): ftp the data source (4 KB macro limit)
- for each product and day: generate statistics; generate plots
- for each product and day: generate HTML files

Processing checks: automated systems they are NOT!
- Date-list setup (d0 = today, last 31 days): usually okay, sometimes needs tweaking
- Remapping/standardising prepares products into a common format: usually okay…
- The existence check is okay when results are simply missing, but not when the data are bad
- FTP runs several times (4 KB buffer limit on macros)
- Output generation is okay if there is rain…
- Raw HTML generation: occasional issues with the server
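The existence check at the heart of the ingest loop might be sketched as below; the product list, filename pattern and directory are hypothetical stand-ins, not the actual system's conventions:

```python
# Sketch of the quasi-automatic ingest check: for each of the last N days
# and each product, see whether the standardised daily file already exists
# locally and queue any missing ones for retrieval.
from datetime import date, timedelta
from pathlib import Path

def missing_files(data_dir, products, today, days_back=31):
    """List (product, date) pairs whose standardised daily file is absent."""
    missing = []
    for back in range(days_back):
        d = today - timedelta(days=back)
        for p in products:
            f = Path(data_dir) / f"{p}_{d:%Y%m%d}.dat"   # assumed name pattern
            if not f.exists():
                missing.append((p, d))
    return missing

# Anything returned here would be queued for FTP retrieval (in the original
# system, via a .netrc-driven ftp macro run in chunks to respect the 4 KB
# macro limit).
queue = missing_files("/data/validation", ["3B42RT", "CMORPH", "PERSIANN"],
                      date(2010, 10, 14), days_back=7)
print(f"{len(queue)} product-days to fetch")
```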


"Standard" layout
- Validation data | precipitation product
- Occurrence comparison | accumulation comparison
- Contingency tables: PoD / FAR / HSS
- Scatter-plot; descriptive statistics; cumulative distribution
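The contingency-table scores named above (PoD, FAR, HSS) follow standard verification definitions and can be computed directly from the 2x2 rain/no-rain table:

```python
# Categorical scores from a 2x2 rain/no-rain contingency table:
# hits (both rain), false alarms (estimate only), misses (reference only),
# correct negatives (neither).

def categorical_scores(hits, false_alarms, misses, correct_negatives):
    h, f, m, c = hits, false_alarms, misses, correct_negatives
    n = h + f + m + c
    pod = h / (h + m)                  # Probability of Detection
    far = f / (h + f)                  # False Alarm Ratio
    # Heidke Skill Score: fraction correct relative to random chance
    expected = ((h + m) * (h + f) + (c + m) * (c + f)) / n
    hss = (h + c - expected) / (n - expected)
    return pod, far, hss

# Illustrative counts only (not from any slide in this talk):
pod, far, hss = categorical_scores(hits=50, false_alarms=20,
                                   misses=30, correct_negatives=900)
print(f"PoD={pod:.3f}  FAR={far:.3f}  HSS={hss:.3f}")
```

Note the large correct-negative count: with mostly-dry fields, scores that ignore correct negatives (PoD, FAR) and those that include them (HSS) can tell quite different stories.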

PMIR results: Europe, 2009-01-11

PMIR results: Australia, 2008-12-25

Results: snow problems

Results: rain extent

IPWG inter-comparison regions: near-real-time inter-comparison of model and satellite estimates vs radar/gauge. IPWG: International Precipitation Working Group (WMO/CGMS)

Monthly and seasonal diagnostic validation summaries. CICS: Cooperative Institute for Climate Studies

Validation resolution
- At full resolution the correlation of estimated rain is low; averaging over time and space improves the picture
- Fine-scale data are generated so users can decide on the averaging strategy
(Figure: correlation vs validation resolution, from 3-hour through day and 5-day to month; VAR vs HQ (mm/hr), Feb. 2002, 30°N-S; Huffman 2/10)

Resolution vs statistical performance: performance can be improved just by smoothing the data!
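That smoothing alone improves the apparent skill can be demonstrated with a purely synthetic example: block-averaging damps the uncorrelated noise in an "estimate" while preserving the spatially correlated "truth", so the correlation rises with averaging scale. Both fields here are invented for illustration:

```python
# Block-averaging a noisy estimate against a spatially correlated truth:
# the correlation improves with the averaging scale even though no new
# information has been added.
import numpy as np

rng = np.random.default_rng(0)

def block_mean(field, k):
    """Average a square 2-D field over non-overlapping k x k blocks."""
    n = field.shape[0] // k * k
    f = field[:n, :n]
    return f.reshape(n // k, k, n // k, k).mean(axis=(1, 3))

def corr(a, b):
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

# "Truth": spatially correlated rain field (a coarse field blown up 8x).
truth = np.kron(rng.gamma(2.0, 1.0, size=(32, 32)), np.ones((8, 8)))
# "Estimate": truth plus uncorrelated noise of comparable magnitude.
estimate = truth + rng.normal(0.0, truth.std(), size=truth.shape)

for k in (1, 2, 4, 8):
    r = corr(block_mean(truth, k), block_mean(estimate, k))
    print(f"{k}x{k} averaging: r = {r:.2f}")
```

This is exactly why quoted correlations are meaningless without the space/time resolution at which they were computed.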

Validation through Hydrology
Applications are resolution-critical (Anagnostou & Hossain: satellite error propagation in flood simulation)
- Study region: a cascade of mountainous basins in northern Italy: Bacchiglione (1200 km²), Posina (116 km²); a flood event from 1996
- Synthetic simulations using SREM2D, with parameters calibrated for two satellite products: PMIR (4 km/30 min) and 3B42RT/3B42 V6 (0.25°/3 hr)
- Error statistics in rainfall and runoff presented for different basin scales, contrasting the two resolutions
- Satellite resolution is critical for basin scales below 200 km²; these results are model- and basin-topography-dependent. Would similar scale effects on error propagation appear with a less complex hydrologic model or less complex terrain?
(Figure: error propagation at 0.5, 1, 2, 4, 8 and 16 km grids; high 57.9, low 1.6)

Instantaneous analysis
- AMSR precipitation product (v10) vs instantaneous radar (3 × 5-minute scans averaged to 15 minutes)
- 5 km resolution, averaged to 50 × 50 km
- Regions of interest: North Sea, Atlantic, France, Germany, UK
- Period: January 2005 - September 2009

Mean rainfall (mm/d), 2005-2009: radar vs AMSR (colour scale 0-8 mm/d)

Mean rain rate (mm/h) by year/month: overall, the current AMSR rain product underestimates rainfall

Regional breakdown: ratios
- NSea = 0.8370
- UK = 0.3424
- Germany = 0.4271
- France = 0.3956
- Atlantic = 0.8033

Current high-resolution studies
- Inter-comparison of 3-hourly, 0.25° precipitation estimates over the UK/NW Europe
- Five mainstream PEHRPP algorithms: CMORPH, CPCMMW, NRLGEO, PERSIANN, 3B42RT (v5)
- Surface reference data sets: 3-hourly gauges, interpolated gauges, and the NIMROD European radar composite
- Period of study: March 2005 - February 2009 (four years, split into seasons; ~2880 samples per 0.25° grid box)

3-hourly / 0.25° data availability: radar, gauges, PERSIANN, NRLBLD, HYDROE, CPCMW, CMORPH, 3B42v5

Precipitation totals (2005-2009, by season: DJF, MAM, JJA, SON): radar, PERSIANN, NRLBLD, CPCMMW, CMORPH, 3B42v5 (colour scale 10-1500 mm/season)

3-hourly radar vs product (2007): correlation and ratio maps for 3B42RT, CMORPH, CPCMMW, NRLBLD, PERSIANN and ECMWF (correlation scale -1.0 to +1.0; ratio scale 0.0 to 10.0)

3-hourly, 0.25° summary
- Correlations are generally good, although with seasonal variations: CMORPH produces the highest correlations in JJA, ECMWF the highest in DJF
- Quantification of precipitation is poor: typically <50% of the 'true' rainfall identified by radar and gauges
- Temporal matching of product and surface data sets matters (3B42 product)
- Obvious radar inadequacies, but spatially useful
- Issues with light precipitation: the 'missing bit'

Factors to consider…
Processing issues:
- Grid box vs grid point
- Instantaneous vs accumulation (i.e. ±1.5 hours, or 00-03)
- Data resolutions (temporal and spatial)
- Data units (storage resolution vs retrieval resolution)
- Formats (I*2, I*4, R*4) and units (mm h⁻¹, mm d⁻¹, kg d⁻¹)
- Filename and date/time conventions (end, start, period)
- W-E (180°E/0°E and E-W) and N-S (or S-N) layout
Statistical analysis is dependent upon the rainfall: intensity, extent and patterns; temporal resolution; spatial resolution. All of these are inter-related and pose a multi-dimensional problem that cannot currently be adequately resolved.
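Two of the pitfalls above, unit conversion and storage resolution, can be sketched as follows. The 0.01 mm/h scale factor for I*2 storage is an illustrative assumption, not any specific product's convention:

```python
# Unit conversion between rain rate (mm/h) and daily accumulation (mm/d),
# and the precision lost when rates are stored as scaled 16-bit integers
# (I*2). The scale factor here is an assumed example value.
import numpy as np

SCALE = 0.01                                          # 1 count = 0.01 mm/h (assumed)

rate_mmh = np.array([0.004, 0.25, 3.333, 25.0])       # retrieved rain rates
stored = np.round(rate_mmh / SCALE).astype(np.int16)  # as written to file (I*2)
decoded = stored.astype(float) * SCALE                # as read back

print("stored counts: ", stored)
print("decoded (mm/h):", decoded)        # the 0.004 mm/h value has vanished to 0
print("daily (mm/d):  ", decoded * 24.0) # rate x 24 h, if the rate held all day
```

The drizzle rate disappears entirely on the round trip: exactly the kind of light-precipitation loss flagged in the 3-hourly summary above.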

Statistics: blame it on the weather!
- Type of cloud/rain
- Movement: is the movement perpendicular to, or along, the rain band?
- Intensity: what is the range of values within the rain area?
- Sensor field of view
- Size/variability: what is the size and variability of the rain area(s)?
Statistical success has as much to do with the meteorology as with the algorithm's ability…

Precipitation Validation: Summary
- What are your key requirements, to make the best of your own limited resources?
- What are the requirements of your user community?
- What are the requirements of the algorithm/product providers?
- What sources of data are available to you, both satellite products and surface data?
- Should you go beyond basic daily regional comparisons (instantaneous to seasonal)?

Contacts
Chris Kidd: C.Kidd@bham.ac.uk, University of Birmingham (from 01/01/11 try Chris.Kidd@nasa.gov)
International Precipitation Working Group: http://www.isac.cnr.it/~ipwg