Slide 1: Calibration/Validation and Generating Sea-Surface Temperature Climate Data Records: An approach using ship-board radiometry
Peter Minnett¹, Gary Corlett² & ISSI Team
¹ Rosenstiel School of Marine and Atmospheric Science, University of Miami
² Department of Physics and Astronomy, University of Leicester
Slide 2: ISSI Team
Scientists who have attended the ISSI International Teams in Space Science on the "Generation of Climate Data Records of Sea-Surface Temperature from current and future satellite radiometers" include:
- Dr Peter Minnett (Team Leader), University of Miami, USA
- Dr Gary Corlett (Co-leader), University of Leicester, UK
- Dr Sandra Castro, University of Colorado, USA
- Dr Craig Donlon, ESA-ESTEC, NL
- Dr Lei Guan, Ocean University of China, CN
- Dr Andrew Jessup, University of Washington, USA
- Dr Tim Nightingale, Rutherford Appleton Laboratory, UK
- Dr Anne O'Carroll, EUMETSAT, DE
- Dr Theo Theocharous, National Physical Laboratory, UK
- Dr Gary Wick, NOAA ESRL, USA
- Dr Werenfrid Wimmer, University of Southampton, UK
- Dr Chris Wilson, NASA Jet Propulsion Laboratory, USA
Slide 3: Introduction
- Ship-board radiometers are repeatedly calibrated using SI-traceable facilities.
- This requires an "unbroken chain" of comparisons from at-sea measurements to SI standards held in National Metrology Institutes.
- This is an important prerequisite for generating a Climate Data Record of Sea-Surface Temperature.
- Repeated calibration and characterization of the ship radiometers allows estimation of uncertainties in the SST validation system.
Slide 4: Calibrating satellite radiometers
Slide 5: Calibrating ship-board radiometers
Slide 6: Transfer SI-traceability to satellite retrievals
CDAF = GHRSST Climate Data Assessment Framework
Slide 7: Example: Non-SI validation
Slide 8: Example: SI validation (1)
Slide 9: Example: SI validation (2)
Slide 10: Match-up statistics
Mean difference (standard deviation) of satellite minus in situ SST, in K. N2/N3: nadir-view two-/three-channel retrievals; D2/D3: dual-view two-/three-channel retrievals (three-channel retrievals are available at night only).

Reference    Period  N        N2            N3            D2            D3
Drifters     Day     251213   -0.09 (0.34)                -0.09 (0.25)
Drifters     Night   317080   -0.21 (0.38)  -0.17 (0.20)  -0.17 (0.29)  -0.16 (0.26)
GTMBA        Day      10439   -0.01 (0.41)                -0.07 (0.24)
GTMBA        Night     13806  -0.19 (0.46)  -0.16 (0.18)  -0.18 (0.28)  -0.17 (0.24)
Radiometers  Day        7195  -0.01 (0.33)                -0.02 (0.29)
Radiometers  Night     10989  -0.04 (0.36)  +0.02 (0.21)  -0.02 (0.28)  -0.01 (0.24)
Argo         Day        2788  -0.03 (0.40)                -0.04 (0.26)
Argo         Night      2468  -0.20 (0.39)  -0.16 (0.17)  -0.17 (0.28)  -0.14 (0.22)
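The bias and scatter entries in these match-up statistics are the mean and standard deviation of the satellite-minus-in situ SST differences for each subset. A minimal sketch of that computation, using hypothetical match-up values rather than the real dataset:

```python
import numpy as np

def matchup_stats(sat_sst, insitu_sst):
    """Mean and sample standard deviation (K) of the
    satellite-minus-in situ SST differences for one subset."""
    diff = np.asarray(sat_sst) - np.asarray(insitu_sst)
    return diff.mean(), diff.std(ddof=1)

# Hypothetical night-time drifter match-ups (SST in K)
sat = np.array([288.1, 290.4, 285.2, 293.0])
drifter = np.array([288.3, 290.6, 285.5, 293.1])
bias, sd = matchup_stats(sat, drifter)
```

In practice each table cell would be produced by applying this to the match-ups selected for one reference type, day/night period, and retrieval type.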
Slide 11: Advances in Cal/Val and Future Challenges
Slide 12: Cal/Val challenges
- Pre-launch calibration is essential: not to be compromised for schedule.
- Post-launch on-board and vicarious calibration: cross-platform comparisons; SI traceability.
- Improve quality and coverage of surface measurements: next-generation drifters.
- Dealing with improvements in retrieval methods: validating uncertainties; algorithm sensitivity.
- Feature resolution: validating gradients, for which radiometers are ideal.
- Information flow: at all levels; two-way dialogue.
Slide 13: Advances in SST retrievals

Generation  Retrieval                          Cloud detection                  Uncertainty model
1st         Empirical regression to buoys      Empirical screening thresholds   —
2nd         Regression to RT modelling         Fixed RT screening thresholds    Empirical (SSES)
3rd         Optimal estimation of SST & TCWV   Probabilistic / dynamic RT       Uncertainty model

Requires methods to validate uncertainties.
Slide 14: Statistical Methods
Slide 15: ESA SST CCI Analysis long-term product version 1.0 versus drifters
Slide 16: Validating Uncertainties
Slide 17: Components of the validation uncertainty budget
- Satellite (σ1): varies pixel by pixel.
- Reference (σ2): generally unknown; can be provided at "pixel" level for radiometers. Estimated at O(0.1 K) for GTMBA moorings and O(0.2 K) for drifters; negligible for Argo.
- Procedural, spatial – surface (σ3): systematic for a single match-up; pseudo-random for a large dataset. Estimated for AATSR (1 km, 1 hour) to be 0.1 K on average. Can be reduced through pixel averaging (e.g. sampling 11 × 11 pixels instead of 1 × 1). Includes uncertainty in geolocation, which may remain systematic even for large numbers of match-ups.
- Procedural, spatial – depth (σ4): systematic for a single match-up at a different depth; pseudo-random for a large dataset at different depths (with a diurnal and skin model).
- Procedural, temporal (σ5): systematic for a single match-up; may be reduced for a large dataset if the match-up window is small enough. Can be reduced with a diurnal and skin model.
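For independent components, these terms combine in quadrature into a total validation uncertainty. A minimal sketch; the numerical values below are illustrative placeholders, not figures from the deck (only the 0.2 K drifter and 0.1 K spatial estimates are quoted above):

```python
import math

def combined_uncertainty(components):
    """Combine independent standard uncertainties (K) in quadrature."""
    return math.sqrt(sum(c * c for c in components))

# Hypothetical single match-up against a drifter:
# sigma1 (satellite pixel) = 0.25 K, sigma2 (drifter) = 0.20 K,
# sigma3 (spatial) = 0.10 K, sigma4 (depth) = 0.05 K, sigma5 (temporal) = 0.05 K
total = combined_uncertainty([0.25, 0.20, 0.10, 0.05, 0.05])
```

For a large dataset, the slide's point is that σ3–σ5 become pseudo-random and shrink with averaging, leaving σ1 and σ2 as the dominant terms.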
Slide 18: Example for drifters
- Use a single-value uncertainty of 0.2 K for σ2; this is the average value for the entire dataset.
- Use a large number of match-ups, area averaging, and a diurnal/skin model to "randomise" σ3 and σ4; their contribution is reduced to << 0.1 K.
- Use a diurnal and skin model to reduce σ5 to << 0.1 K.
- Take σ1 from the satellite product.
Slide 19: Uncertainty validation
Assuming σ3, σ4 and σ5 do not contribute significantly, the uncertainty budget can be approximated as the quadrature sum of the satellite and reference terms:

σ_expected ≈ √(σ1² + σ2²)

- Tends to the drifter uncertainty (0.2 K) at low satellite uncertainties.
- Tends to the satellite uncertainty at high satellite uncertainties.
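One common way to test quoted uncertainties is to bin match-ups by the satellite-reported uncertainty and compare the observed spread of differences in each bin with √(σ1² + σ2²). A sketch under those assumptions; the function name, bin edges, and synthetic data are all illustrative:

```python
import numpy as np

DRIFTER_SIGMA = 0.2  # assumed single-value drifter uncertainty (K)

def validate_uncertainties(diff, sigma_sat, bin_edges):
    """Bin match-up differences by quoted satellite uncertainty and
    compare the observed spread with the expected sqrt(s1**2 + s2**2)."""
    idx = np.digitize(sigma_sat, bin_edges)
    results = []
    for b in range(1, len(bin_edges)):
        sel = idx == b
        if sel.sum() < 2:
            continue
        observed = np.std(diff[sel], ddof=1)
        expected = np.sqrt(np.mean(sigma_sat[sel]) ** 2 + DRIFTER_SIGMA ** 2)
        results.append((observed, expected))
    return results

# Synthetic match-ups whose scatter follows the quoted uncertainties
rng = np.random.default_rng(0)
sigma_sat = rng.uniform(0.1, 0.5, 2000)
diff = rng.normal(0.0, np.sqrt(sigma_sat**2 + DRIFTER_SIGMA**2))
results = validate_uncertainties(diff, sigma_sat, np.array([0.1, 0.3, 0.5]))
```

If the product's uncertainty model is realistic, the observed and expected values agree in every bin, as they do here by construction.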
Slide 20: Example: Uncertainty validation
Slide 21: Stability
Slide 22: Why measurement uncertainties are essential
- A poorly characterised reference leads to an apparently unstable time series of discrepancies within the quoted uncertainties.
- A well characterised reference confirms a stable time series of discrepancies within the quoted uncertainties.
Slide 24: Future Measurement Requirements
Slide 25: Requirements
- Argo: near-surface profiles; high temporal sampling.
- Drifters: improved reporting of location and time (are you getting these data?); pre-deployment calibration (with the DBCP), with $350k requested for an evaluation deployment.
- Radiometers: ongoing development of the next generation.
Slide 26: Summary
- Pre- and post-launch calibration is essential, supported by vicarious methods.
- SI traceability is maintained through ship-borne radiometer deployments; this is mandatory for CDR generation.
- Next-generation retrievals require improved methods, in particular uncertainty validation.
- Better interaction with users is needed, including feedback on data quality.
Slide 27: But…
- Life would be easier if data were provided appropriate to the measurement method: IR radiometers should provide skin SST; trust the physics.
- All products should come with associated uncertainties, consistently developed.
- Funding for international projects is difficult: IR intercomparison; drifter calibration project.
- Users: please use the uncertainty information provided with the data, and tell us what is wrong with it!