Precipitation Validation


1 Precipitation Validation
Hydrology Training Workshop, University of Hamburg. Chris Kidd …and many others…

2 Overview
Precipitation characteristics
Surface measurements: gauges, radar
Validation case study: the European IPWG site
Experiences – other analyses
Results – statistical dependency
Conclusions

3 Why? – essentially to improve estimates
(Images: 2008 floods; 2009 floods)

4 UK Midlands: 20 July 2007

5 Precipitation Characteristics
The ‘modal’ instantaneous precipitation value is zero. Rain intensities are skewed towards zero: at middle to high latitudes, heavily so! Spatial/temporal accumulation will ‘normalise’ the data (see the sketch below). 1 mm of rain ≡ 1 litre per m², or 1 kg per m² (i.e. 1000 t per km²). (Figures: occurrence and accumulation distributions.)
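To see why accumulation ‘normalises’ the skewed distribution, here is an illustrative Python sketch with synthetic, exponentially distributed rain rates (all numbers invented, not workshop data):

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(42)

# Synthetic instantaneous rain rates (mm/h): mostly zero, with an
# exponential tail -- a crude stand-in for mid-latitude rainfall.
n_days, per_day = 365, 48          # 48 half-hourly samples per day
raining = rng.random((n_days, per_day)) < 0.1
rates = np.where(raining, rng.exponential(2.0, (n_days, per_day)), 0.0)

daily = rates.sum(axis=1) * 0.5    # mm/day (each sample spans 0.5 h)

print(f"skewness, instantaneous: {skew(rates.ravel()):.2f}")
print(f"skewness, daily totals:  {skew(daily):.2f}")
# The daily accumulations are far less skewed than the raw rates.
```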

6 Surface measurement
(Images: Clee Hill radars, C-band vs ATC; micro rain radar; ARG100 gauge, 0.2 mm/tip; Young's gauge, 0.1 mm/tip)

7 Conventional measurements
Gauge data (rain/snow): simple measurements of accumulations; quantitative sampling (tipping-bucket gauges etc.); but point measurements, under-catch errors, etc.
Radar systems: backscatter from hydrometeors (rain/snow/hail); spatial measurements; potential to discriminate between precipitation types; but range effects, anomalous-propagation errors, Z-R relationships (see the sketch below)…
Precipitation is highly variable both temporally and spatially: measurements need to be representative.
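To illustrate the Z-R step mentioned in the radar bullet, a minimal Python sketch using the classic Marshall-Palmer relationship Z = 200 R^1.6 (the coefficients vary with rain type, which is exactly the uncertainty flagged above):

```python
import numpy as np

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert Z = a * R**b for rain rate R (mm/h).

    dbz : reflectivity in dBZ; a, b : Z-R coefficients
    (Marshall-Palmer defaults; other rain types need other values).
    """
    z = 10.0 ** (dbz / 10.0)       # dBZ -> linear reflectivity (mm^6 m^-3)
    return (z / a) ** (1.0 / b)

for dbz in (20, 30, 40, 50):
    print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):6.1f} mm/h")
```

With these defaults, 30 dBZ maps to roughly 2.7 mm/h and 40 dBZ to about 11.5 mm/h; choosing the wrong a, b pair shifts all retrieved rates.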

8 Conventional Observations
Radar duplicates rain-gauge coverage. The rain-gauge network (~20,000 gauges) is confined mainly to land areas, leaving large areas of ocean with no data; all gauges, if placed together, would measure an area no larger than the centre circle of a football field. Precipitation is highly variable both temporally and spatially: measurements need to be representative.

9 Variance explained by nearest station
(Figure: Jürgen Grieser.) Variance based upon monthly data: shorter periods give lower explained variance.

10 What is truth? Co-located 8 gauges / 4 MRRs

11 1st gauge…

12 2nd gauge…

13 2 more gauges

14 All gauges

15 plus the MRR…

16 Radar vs gauge measurements
Cumulative rainfall: radar vs gauge agreement is reasonable, but not quite 1:1. 10 June 2009: 40 mm in 30 mins. Instruments: MRR (24.1 GHz) and tipping-bucket gauge. Tipping-bucket gauges provide quantised measurements (0.1 or 0.2 mm/tip); the MRR is critical for light rainfall (see the sketch below).
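To make the quantisation point concrete, a minimal Python sketch on synthetic drizzle (the 0.2 mm/tip size matches the ARG100 gauge mentioned earlier; the rain series itself is invented):

```python
import numpy as np

TIP_SIZE = 0.2                      # mm per tip (ARG100-style gauge)

# Synthetic 'true' accumulation over 3 hours of light drizzle (mm).
minutes = np.arange(180)
true_accum = np.cumsum(np.full(180, 0.005))   # 0.3 mm/h drizzle

# The gauge only reports whole tips: floor the accumulation to tips.
tips = np.floor(true_accum / TIP_SIZE)
gauge_accum = tips * TIP_SIZE

print(f"true total:  {true_accum[-1]:.2f} mm")
print(f"gauge total: {gauge_accum[-1]:.2f} mm ({int(tips[-1])} tips)")
# 0.9 mm of drizzle registers as only 4 tips (0.8 mm), and the first
# tip arrives 40 minutes after the rain starts -- hence the value of
# the MRR for light rainfall.
```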

17 (Images) Clee Hill ATC radar and C-band; Chilbolton C-band; University of Helsinki C-band

18 National network radars: Doppler, dual-polarised; 100/210 km range

19

20 Radar vs gauge data
(Images: radar, daily integrated; gauge data.)

21 Helsinki Testbed (FMI Helsinki)
Cold season: surface issues and mixed-phase precipitation reaching the surface. Circles: 4 operational Doppler weather radars (FMI & EMHI), 1 dual-pol radar plus 1 vertically pointing C-band radar for research (Vaisala & UH), and 2 vertically pointing POSS radars. Dots: 80 gauges. Big diamonds: FD12P optical scatterometers. Triangles: ultrasonic snow-depth sensors. Squares: weighing gauges.

22 Ground validation - IPWG synergies
GV = Ground Validation; after Turk & Arkin, BAMS 2008. Criteria, GV program vs IPWG:
- Type of validation: GV – priority on physical, also statistical. IPWG – has focused on descriptive and statistical.
- Source of validation data: GV – arranged for and collected by principal investigators. IPWG – doesn't request it; IPWG participants are free to contribute.
- Source of observational data: GV – specific satellite-based products. IPWG – participants provide products directly to the validation groups.
- Types of validation data: GV – gauges, radars and specialist instrumentation, diverse, in specific locations. IPWG – conventional gauge and/or radar networks, usually part of a national network.
- Types of observational data: GV – single-sensor, instantaneous, full-resolution datasets. IPWG – blended satellite-sensor products, time/area averaged.
Both approaches are complementary.

23 Summary: surface measurements
Representativeness of surface measurements: over land generally good, but variable; over oceans virtually non-existent. Measurement issues: physical collection interferes with the measurement (e.g. wind effects, frozen precipitation, etc.); radar has an imprecise backscatter-to-rainfall relationship (plus clutter, range effects, bright band, etc.). Satellites offer consistent, regular measurements, global coverage and real-time delivery of data.

24 Satellite Data sets

25 Observation availability
Spectral region / availability / cycle (current) / resolution*:
- Visible: since the start of the satellite era; geostationary 15/30 mins, polar orbiters 6-hourly; 250 m+.
- Infrared: shortly after the start of the satellite era, ~calibrated since 1979; 1 km+.
- Passive microwave: experimental 1972/1975, uncalibrated since 1978, calibrated since 1987; polar orbiters + low-Earth orbiter (TMI); 4 km+.
- Active microwave (radar): 13.8 GHz since 1997 (low-Earth orbiter, PR); 94 GHz since 2006 (polar orbiter, CloudSat); 4 km and 1.5 km respectively.
* Resolutions vary greatly with scan angle, frequency, sensor, etc.

26 Satellite observational scales
Observations are made nominally at 1 km / 15 min; estimates are possible at 1 km / 1 min but inaccurate. Precipitation products are generally available at 0.25 degree daily or 0.25 degree 3-hourly. The accuracy of satellite precipitation estimates improves with temporal/spatial averaging. (Figure: observation scales – Earth-resources satellites (Ikonos, Landsat, SPOT, MODIS) at ~1 km, LEO vis/IR/MW at ~3 hours, GEO rapid-scan vis/IR at ~15 minutes / ~25 km – against the scales of precipitation systems.)

27 LEO vs GEO satellite observations
(Movie) The contrast between the two orbit types can be seen here: low-Earth-orbit sensors (SSM/I and TRMM) versus geostationary sensors (Meteosat/MSG).

28 Observations to Products
(Diagram: Observations → Retrievals → Products. Data inputs: visible, infrared, passive MW, active MW, model outputs. Resolutions in time/space run from instantaneous/full resolution to monthly/seasonal climate resolution. End uses: climatology, agriculture/crops, meteorology, hydrology.)

29 Global precipitation data sets
Many different products at different spatial/temporal resolutions … and formats!

30 Setting up a validation site

31 IPWG European validation
Radar used as 'ground truth': a composite of radars over the UK, France, Germany, Belgium and the Netherlands; nominal 5 km resolution; equal-area polar-stereographic projection (see the remapping sketch below). Data and product ingest in near real-time. Statistical and graphical output (SGI/Irix; f77/netpbm).
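An illustrative sketch (not the original f77/Irix code) of remapping lat/lon observations onto a 5 km polar-stereographic grid with the pyproj library; the true-scale latitude and the grid origin below are assumed values, not the IPWG site's actual parameters:

```python
import numpy as np
from pyproj import Proj

# Polar-stereographic projection; lat_ts (true-scale latitude) and the
# grid origin are assumptions for illustration only.
psg = Proj(proj="stere", lat_0=90, lat_ts=60, lon_0=0, ellps="WGS84")

GRID_RES = 5_000.0                   # 5 km cells
X0, Y0 = -3_500_000.0, -6_000_000.0  # assumed lower-left corner (m)

def latlon_to_cell(lat, lon):
    """Map lat/lon (degrees) to integer (col, row) grid indices."""
    x, y = psg(lon, lat)             # pyproj takes lon, lat order
    return int((x - X0) // GRID_RES), int((y - Y0) // GRID_RES)

print(latlon_to_cell(52.5, 0.0))     # e.g. a point over the UK
```

In practice the per-product look-up tables (LUTs) mentioned later simply cache this mapping for every source pixel.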

32 Processing setup
Perceived requirements versus what was implemented:
- daily inter-comparison → 00Z-24Z (also -06, -09, -12Z)
- 0.25 degree resolution → 25 km resolution
- real-time → near real-time, dependent upon product
- validation data → radar data (gauges being added later)
- automatic → quasi-automatic (not ‘operational’)
- many products → limited number of products

33 Processing schedule
01Z: global IR. 02Z: SSM/I data; GPI; FDA; ECMWF. 03Z: European radar data; PMIR. 04Z: 3B4x. 05Z: CICS data. Statistics (at 20 km). 22Z: EUMETSAT MPE; web pages.

34 Processing system
Initial setup: setting of dates; cleaning out old/decayed data. Remapping of data: to the regional grid or the 5 km PSG projection. Acquiring data: searching existing data; listing missing data; creation of the .netrc file; ftp from data sources. Results generation: statistical analysis; graphical output. Web pages: generate HTML files; copy to the server.

35 Processing checks
set d0 = today
foreach day (d0 … d0-31): dn = dn+1
foreach product & day:
    remap to PSG using LUTs & standardise format
    standardise filename
foreach day (d0 … d0-31):
    foreach product (p1 … pn):
        if (product for day) does not exist: add to .netrc file
foreach datasource (s0 … sn): ftp datasource (4K)
foreach product & day: generate statistics; generate plots
foreach product & day: generate HTML files

36 Processing checks – automated systems they are NOT!
- Date setup (set d0 = today; foreach day d0 … d0-31): sets up the list of past dates/days. Usually okay; sometimes needs tweaking.
- Remapping and filename standardisation: prepares products into a common format. Usually okay…
- Existence check (if a product for a day is missing, add it to the .netrc file): okay if there are no results, but not if the data are bad.
- FTP from each data source: runs several times, owing to a 4K buffer limit on macros.
- Statistics and plot generation: okay if there is rain…
- HTML generation (raw HTML): occasional issues with the server.
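Below is a hedged re-sketch of that acquisition loop in Python rather than the original Irix shell scripts; the product names, FTP hosts, directory layout and file naming are all invented placeholders (the real system drove ftp via .netrc macro files):

```python
import datetime as dt
from ftplib import FTP
from pathlib import Path

PRODUCTS = ["pmir", "3b42rt", "mpe"]      # placeholder product ids
SOURCES = {"pmir": "ftp.example.org"}      # placeholder FTP hosts
DATA_DIR = Path("data")

def wanted_days(n_days=31):
    """Yield today back through today-31, mirroring d0 ... d0-31."""
    d0 = dt.date.today()
    for n in range(n_days + 1):
        yield d0 - dt.timedelta(days=n)

def missing_files():
    """List (product, day) pairs whose remapped file is absent."""
    for day in wanted_days():
        for prod in PRODUCTS:
            f = DATA_DIR / f"{prod}_{day:%Y%m%d}.psg"   # assumed naming
            if not f.exists():
                yield prod, day

def fetch(prod, day):
    """Fetch one day's raw file for one product (anonymous FTP sketch)."""
    with FTP(SOURCES[prod]) as ftp:
        ftp.login()                        # anonymous login
        out = DATA_DIR / f"{prod}_{day:%Y%m%d}.raw"
        with out.open("wb") as fh:
            ftp.retrbinary(f"RETR {prod}_{day:%Y%m%d}", fh.write)

for prod, day in missing_files():
    if prod in SOURCES:
        fetch(prod, day)
```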

37

38

39 “Standard” layout
(Web-page layout: validation data and precipitation product panels; occurrence comparison; accumulation comparison; contingency tables; PoD/FAR/HSS; scatter-plot; descriptive statistics; cumulative distribution.)
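Since the layout quotes PoD/FAR/HSS, here is a minimal Python sketch of those standard contingency-table scores (the counts in the example are invented):

```python
def contingency_scores(hits, misses, false_alarms, correct_negs):
    """POD, FAR and HSS from a 2x2 rain/no-rain contingency table."""
    n = hits + misses + false_alarms + correct_negs
    pod = hits / (hits + misses)                 # fraction of rain detected
    far = false_alarms / (hits + false_alarms)   # fraction of alarms wrong
    # Heidke skill score: accuracy relative to random chance.
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_negs + misses) * (correct_negs + false_alarms)) / n
    hss = (hits + correct_negs - expected) / (n - expected)
    return pod, far, hss

pod, far, hss = contingency_scores(hits=40, misses=10,
                                   false_alarms=20, correct_negs=130)
print(f"POD={pod:.2f}  FAR={far:.2f}  HSS={hss:.2f}")
```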

40 PMIR results: Europe 2009-01-11

41 PMIR results: Australia 2008-12-25

42 Results: Snow problems

43 Results: rain extent

44 IPWG Inter-comparison regions
Near real-time inter-comparison of model and satellite estimates against radar/gauge. IPWG – International Precipitation Working Group (WMO/CGMS).

45 Monthly and seasonal validation
Monthly and seasonal diagnostic validation summaries. CICS – Cooperative Institute for Climate Studies.

46 Validation resolution
At full resolution the correlation of estimated rain is low; averaging over time and space improves the picture. Fine-scale data are generated so that users can decide on their own averaging strategy. (Figure: correlation of VAR vs HQ (mm/hr), Feb, °N-S, at monthly, 5-day, daily and 3-hourly resolutions; Huffman 2/10.)

47 Resolution vs Statistical Performance
Performance can be improved just by smoothing the data!
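A small synthetic Python demonstration of that effect (purely illustrative; the ‘truth’ and the noise are invented): point-by-point correlation of a noisy estimate is poor, but block-averaging both fields raises it without adding any information:

```python
import numpy as np

rng = np.random.default_rng(0)

# 'Truth': a smooth field; 'estimate': truth plus heavy pixel noise.
x = np.linspace(0, 8 * np.pi, 4096)
truth = np.clip(np.sin(x) + 0.3 * np.sin(3 * x), 0, None)
estimate = truth + rng.normal(0, 1.0, truth.size)

def block_mean(a, k):
    """Average over non-overlapping blocks of length k."""
    return a[: a.size // k * k].reshape(-1, k).mean(axis=1)

for k in (1, 4, 16, 64):
    r = np.corrcoef(block_mean(truth, k), block_mean(estimate, k))[0, 1]
    print(f"block size {k:3d}: correlation {r:.2f}")
# Correlation climbs with block size purely because smoothing
# suppresses the noise, not because the estimate got better.
```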

48 Validation through hydrology: applications are resolution-critical
Anagnostou & Hossain – satellite error propagation in flood simulation. The study region is a cascade of mountainous basins in northern Italy: Bacchiglione (1200 km²) and Posina (116 km²), using a flood event from 1996. Synthetic simulations with SREM2D were run with parameters calibrated for two satellite products: the Chris Kidd (PMIR) 4 km/1 hr product and the 3B42 (V6) 25 km/3 hr product. Error statistics in rainfall and runoff were presented for different basin scales, contrasting the two resolutions. Satellite resolution is critical for basin scales below 200 km²; these results are model- and basin-topography-dependent. Would similar scale effects on error propagation appear with a less complex hydrologic model or less complex terrain? (Figure: PMIR 4 km/30 min vs 3B42RT 1°/3 hr; grid scales 0.5, 1, 2, 4, 8 and 16 km; elevation high 57.9, low 1.6.)

49 Instantaneous analysis
AMSR precipitation product (v10) vs instantaneous radar (3×5 scans averaged to 15 mins), 5 km resolution averaged to 50×50 km. Regions of interest: North Sea, Atlantic, France, Germany, UK. January – September 2009.

50 Mean rainfall (mm/d)
(Maps: radar vs AMSR rainfall, mm/d.)

51 Mean rain rate (mm/h)
(Time series: mean rain rate, mm/h, by year/month.) Overall, the current AMSR rain product underestimates rainfall.

52 Regional breakdown: ratios
NSea = …; UK = …; Germany = …; France = …; Atlantic = … (ratio values shown in the slide graphic).

53 Current High-resolution Studies
Inter-comparison of 3-hourly, 0.25 degree precipitation estimates over the UK/NW Europe. Five mainstream PEHRPP algorithms: CMORPH, CPCMMW, NRLGEO, PERSIANN and 3B42RT (v5). Surface reference data sets: 3-hourly gauge, interpolated gauge and NIMROD European radar. Period of study: March 2005 – February 2009 (4 years, split into seasons; ~2880 samples per 0.25° grid box).

54 3-hourly/0.25 degree data availability
(Maps of 3-hourly/0.25 degree data availability: radar, gauges, PERSIANN, NRLBLD, HYDROE, CPCMW, CMORPH, 3B42v5.)

55 Precipitation Totals (2005-2009 by season)
(Seasonal-total maps, DJF/MAM/JJA/SON, for radar, PERSIANN, NRLBLD, CPCMMW, CMORPH and 3B42v5; colour scale 10-1500 mm/season.)

56 3-hourly radar vs product (2007)
(Maps, 2007: correlation (colour scale 1.0 down to -1.0) and ratio (colour scale 10.0 down to 0.0) of 3-hourly radar against 3B42RT, CMORPH, CPCMMW, NRLBLD, PERSIANN and ECMWF.)

57 3-hourly 0.25 degree summary
Correlations are generally good, although with seasonal variations: CMORPH produces the highest correlations in JJA, ECMWF the highest in DJF. Quantification of precipitation is poor: typically <50% of the 'true' rainfall identified by radar and gauge. Temporal matching of the product and surface data sets matters (3B42 product). The radar has obvious inadequacies, but is spatially useful. Issues remain with light precipitation – the 'missing bit'.

58 Factors to consider…
Processing issues:
- grid box vs grid point
- instantaneous vs accumulation (i.e. ±1.5 hours or 00-03)
- data resolutions (temporal and spatial)
- data units (storage resolution vs retrieval resolution)
- formats (I*2, I*4, R*4) and units (mm/h, mm/d, kg/d)
- filename and date/time conventions (end, start, period)
- W-E (180°E/0°E and E-W) and N-S (or S-N) layout
Statistical analysis is dependent upon the rainfall (intensity, extent and patterns), the temporal resolution and the spatial resolution. All of these are inter-related and pose a multi-dimensional problem that cannot currently be adequately resolved. (A sketch of one such pitfall follows below.)
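As one concrete illustration of the unit and timestamp traps in the list above, a hedged Python sketch (the labelling conventions are invented for illustration, not taken from any particular product):

```python
import datetime as dt

def accumulation_mm(mean_rate_mm_per_h, period_hours=3.0):
    """Mean rate in mm/h over a period -> accumulated mm."""
    return mean_rate_mm_per_h * period_hours

# A product timestamped 06Z may mean 03-06Z (end-labelled) or
# 06-09Z (start-labelled); mixing the two shifts everything by 3 h.
stamp = dt.datetime(2009, 6, 10, 6, 0)
end_labelled = (stamp - dt.timedelta(hours=3), stamp)
start_labelled = (stamp, stamp + dt.timedelta(hours=3))

print(f"{accumulation_mm(1.2):.1f} mm from a 1.2 mm/h 3-hour mean")
print("end-labelled period:  ", end_labelled)
print("start-labelled period:", start_labelled)
```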

59 Statistics: blame it on the weather!
Statistical success has as much to do with the meteorology as with the algorithm's ability:
- Type of cloud/rain.
- Movement: is the motion perpendicular to, or along, the rain band?
- Intensity: what is the range of values within the rain area?
- Sensor field of view.
- Size/variability: what are the size and variability of the rain area(s)?

60 Precipitation Validation Summary
What are your key requirements, to make the best use of your own limited resources? What are the requirements of your user community? What are the requirements of the algorithm/product providers? What sources of data are available to you – both satellite products and surface data? Should you go beyond basic daily regional comparisons (instantaneous to seasonal)?

61 Contacts
Chris Kidd: University of Birmingham (from 01/01/11 try …). International Precipitation Working Group main web page: …

