
Slide 1: SEVIRI-IASI Inter-calibration Uncertainty Evaluation
Tim Hewison
22 March 2011: GSICS GRWG & GDWG Meeting, Daejeon, Korea

Slide 2: Introduction
- Follows guidelines provided by QA4EO, based on the Guide to the Expression of Uncertainty in Measurement (GUM)
- To be read in conjunction with the Algorithm Theoretical Basis Document (ATBD)
- Uncertainties provide Quality Indicators for the inter-calibration products
- Each process of the ATBD is considered:
  - Uncertainties evaluated for key variables due to random and systematic effects
  - Combined to produce an error budget (Type B evaluation of combined uncertainty)
- Used to make recommendations for ATBD adjustments, to produce more consistent uncertainty estimates

Slide 3: Method – Introduction
- The ATBD uses weighted linear regression to compare collocated radiances from the monitored and reference instruments
  - Weightings based on the spatial variance of the radiances + radiometric noise
- The regression propagates these variances to estimate the uncertainty on the corrected radiance
  - But these are only 2 of the processes introducing uncertainty into the final product
- Full dynamic error propagation of all processes could be prohibitive
- This analysis reviews uncertainties based on a measurement model of the processes for case studies, which are assumed to be typical
- IASI is defined as the inter-calibration reference (= truth) => IASI errors should not contribute to the uncertainty of the products
  - But some are included here, to illustrate their magnitude
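As an illustration only (not the ATBD implementation), a weighted linear fit of collocated radiances, with weights taken as the inverse of the spatial variance plus radiometric noise, and the coefficient covariance propagated to the corrected radiance, might be sketched as follows; all numbers here are assumed examples:

```python
import numpy as np

# Hypothetical collocations: x = monitored radiances, y = reference radiances.
rng = np.random.default_rng(0)
x = np.linspace(20.0, 100.0, 50)
y = 1.02 * x - 0.5 + rng.normal(0.0, 0.3, x.size)

# Per-collocation variance: spatial variance of the scene plus radiometric noise.
var_spatial = np.full(x.size, 0.2**2)
var_noise = 0.1**2
w = 1.0 / (var_spatial + var_noise)          # weights = 1 / variance

# Weighted linear fit y = a*x + b, keeping the coefficient covariance.
A = np.vstack([x, np.ones_like(x)]).T
W = np.diag(w)
cov = np.linalg.inv(A.T @ W @ A)             # covariance of (a, b)
a, b = cov @ A.T @ W @ y

# Propagate the coefficient covariance to the corrected radiance at scene radiances L.
L = np.array([30.0, 60.0, 90.0])
AL = np.vstack([L, np.ones_like(L)]).T
u_corr = np.sqrt(np.einsum("ij,jk,ik->i", AL, cov, AL))
print(a, b, u_corr)
```

This captures only the two uncertainty sources the regression itself propagates; the point of the slides that follow is that the other processes must be added.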

Slide 4: Method – Systematic Errors
- Collocation criteria are designed to minimise systematic errors by ensuring samples are systematically distributed
- But in reality small residual differences remain
- Sampling differences introduce radiance errors in each collocation
- These introduce systematic errors in the end product, according to its sensitivity to each variable, which is estimated from statistics of the case studies
- Use the actual sampling distribution, or assume a uniform distribution between the threshold limits

Slide 5: Method – Random Errors
- Similarly, different processes introduce random errors in each collocated radiance
- Their magnitude is estimated from the typical range of each variable and the sensitivity of the radiances to perturbations of each variable
- The regression process used to generate the GSICS Correction coefficients reduces the impact of random errors on each collocation
- Repeat the regression many times for randomly perturbed datasets to estimate the uncertainty on the corrected radiances
  - A Monte Carlo-like approach

Slide 6: Method – Combining Errors
- The method may be refined for dominant processes.

Slide 7: General Methodology
For each process:
- Estimate the typical differences in the sampling variables, Δx, between the monitored and reference instruments
- Estimate the sensitivity of the radiances to perturbations in x: ∂L/∂x, where L_i is the radiance of each collocation, i
- Uncertainty on L_i due to process j: u_j(L_i) = (∂L/∂x_j)·Δx_j
- Regression of the collocated radiances => GSICS Correction, g(L)
- Perturb the observed radiances L_i by u_j(L_i)
- The recalculated regression gives a modified function, g̃_j(L)
- This gives different corrected radiances, L̃_j = g̃_j(L)
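The first two steps amount to a simple product per process; a minimal numeric sketch, using assumed example values (not numbers from the ATBD):

```python
import numpy as np

# Uncertainty on each collocated radiance due to process j:
#   u_j(L_i) = (dL/dx_j) * dx_j
# Sensitivities and sampling differences below are illustrative assumptions.
sensitivities = np.array([0.54, 1.20, 0.05])    # dL/dx_j for three example processes
differences   = np.array([0.0083, 0.002, 0.1])  # typical sampling differences dx_j
u = sensitivities * differences                  # per-process radiance uncertainties
print(u)
```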

Slide 8: Methodology for Systematic Errors
For processes introducing systematic errors:
- Each collocated radiance is perturbed by u_j(L_i)
- The recalculated regression gives a modified function, g̃_j(L)
- Evaluate for a range of scene radiances
- Compare to the unmodified function, g(L), to estimate the uncertainty on the corrected radiance due to the systematic errors introduced by process j: u_j(L̃) = g̃_j(L) − g(L)
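The procedure above can be sketched as follows; this is an illustration under assumed numbers, not the ATBD code. All collocated radiances are shifted by the same systematic offset, the regression is refitted, and the difference between the modified and unmodified fits over a range of scene radiances is the systematic uncertainty:

```python
import numpy as np

# Synthetic collocations standing in for real monitored/reference radiances.
rng = np.random.default_rng(1)
x = np.linspace(20.0, 100.0, 200)                   # monitored radiances
y = 1.01 * x + 0.2 + rng.normal(0.0, 0.3, x.size)   # reference radiances

g = np.polyfit(x, y, 1)               # unmodified GSICS-style correction g(L)
u_j = 0.05                            # assumed systematic perturbation [radiance units]
g_mod = np.polyfit(x, y + u_j, 1)     # regression of the perturbed radiances

L_scene = np.array([25.0, 50.0, 75.0, 100.0])       # evaluate over scene radiances
u_sys = np.polyval(g_mod, L_scene) - np.polyval(g, L_scene)
print(u_sys)
```

Because a constant offset lies in the column space of the linear fit, the difference here simply reproduces the offset; for perturbations that vary with radiance, the difference would vary across the scene radiances.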

Slide 9: Temporal Mismatch
- A uniform distribution over ±t_max = 300 s with n ≈ 30000 collocations gives a mean time difference Δt = 2·t_max/√(3n) ≈ 2 s
- But the mean difference in sampling time of the collocations is Δt = 30 s, due to deficiencies in the orbital selection
- Calculate the sensitivity from the mean rate of change of the radiances from a time series of observations
- Much worse using 09:30 overpasses!
[Figure: time series of the mean rate of change of radiances calculated from Meteosat-9 observations on 2009-09-20 over (30°W–30°E)×(30°S–30°N); 1 mW/m²/sr/cm⁻¹/hr ≈ 1 K/hr]
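A quick check of the slide's arithmetic, using the formula as reconstructed here (the √(3n) form is an assumption recovered from the quoted result):

```python
import math

# Mean time difference of n collocations drawn uniformly over +/- t_max.
t_max = 300.0   # collocation time threshold [s]
n = 30000       # number of collocations
dt_mean = 2.0 * t_max / math.sqrt(3.0 * n)
print(dt_mean)  # -> 2.0 s, as quoted on the slide
```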

Slide 10: Summary of Systematic Error Perturbations and Sensitivities
Table 1 summarises the magnitude of the typical perturbations, Δx_j, of the processes introducing systematic errors in the collocated radiances, and the sensitivity of the 8 infrared channels of SEVIRI to these perturbations, dL_j/dx.
[Table 1: Summary of Systematic Error Perturbations and Sensitivities]

Slide 11: Combining all Systematic Errors
- All uncertainties due to systematic processes are added in quadrature: u_sys(L̃) = √(Σ_j u_j(L̃)²)
- Systematic mismatches in time and space dominate the total systematic uncertainty, due to finite gradients
- But IR3.9 is dominated by the spectral correction to compensate for IASI's incomplete coverage
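The quadrature sum itself is one line; a minimal sketch with assumed example per-process uncertainties:

```python
import math

# Combine per-process systematic uncertainties in quadrature:
#   u_total = sqrt(sum_j u_j**2)
u_j = [0.03, 0.04, 0.01]   # example per-process uncertainties [radiance units]
u_total = math.sqrt(sum(u * u for u in u_j))
print(u_total)
```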

Slide 12: Methodology for Random Errors
For processes introducing random errors:
- Each collocated radiance is perturbed by u_j(L_i) = z_i·(∂L/∂x_j)·Δx_r, where z_i is a random number drawn from a distribution consistent with the characteristic difference Δx_r
- Repeating the regression n_k times gives modified functions, g̃_j,k(L)
- Each evaluation is used to calculate corrected radiances: L̃_j,k = g̃_j,k(L)
- The standard deviation of L̃_j,k over the Monte Carlo ensemble provides an estimate of the uncertainty on the corrected radiances due to each random process, j: u_j(L̃) = σ_k[g̃_j,k(L)]
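The Monte Carlo loop above can be sketched as follows; this is an illustration under assumed numbers, not the ATBD code. Each ensemble member perturbs the collocated radiances with independent noise, refits the regression, and evaluates the corrected radiances; the ensemble standard deviation is the random uncertainty:

```python
import numpy as np

# Synthetic collocations standing in for real monitored/reference radiances.
rng = np.random.default_rng(2)
x = np.linspace(20.0, 100.0, 200)                   # monitored radiances
y = 1.01 * x + 0.2 + rng.normal(0.0, 0.3, x.size)   # reference radiances

u_rand = 0.3    # assumed per-collocation random perturbation [radiance units]
n_k = 500       # size of the Monte Carlo ensemble
L_scene = np.array([25.0, 50.0, 75.0, 100.0])

corrected = np.empty((n_k, L_scene.size))
for k in range(n_k):
    y_pert = y + rng.normal(0.0, u_rand, y.size)    # perturbed dataset k
    g_k = np.polyfit(x, y_pert, 1)                  # refitted correction g_k(L)
    corrected[k] = np.polyval(g_k, L_scene)

u_j = corrected.std(axis=0)   # uncertainty on the corrected radiance per scene radiance
print(u_j)
```

Note how the regression averages down the per-collocation noise: the ensemble spread of the corrected radiances is much smaller than the 0.3 perturbation applied to each collocation.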

Slide 13: Temporal Variability
- A uniform distribution over ±t_max = 300 s is equivalent to an r.m.s. difference Δt = t_max/√3 ≈ 173 s
- The temporal variability of typical SEVIRI images is evaluated as the RMSD between radiances sampled over different time intervals
- Calculate the sensitivity from the RMSD between radiances sampled at 21:30 and 21:45 over the target area
[Figure: r.m.s. differences in Meteosat-8 10.8 μm brightness temperatures with time interval, from Rapid Scanning Meteosat data (red diamonds), and with spatial separation in the North–South direction (black pluses) and West–East direction (black stars)]
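A quick check of the equivalence quoted above: the standard deviation of a uniform distribution over ±t_max is t_max/√3, which is the r.m.s. time difference.

```python
import math

t_max = 300.0                     # collocation time threshold [s]
dt_rms = t_max / math.sqrt(3.0)
print(dt_rms)                     # -> ~173.2 s, as quoted on the slide
```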

Slide 14: Summary of Random Error Perturbations and Sensitivities
Table 3 summarises the characteristic differences, Δx_r,j, of the processes introducing random errors in the collocated radiances, and the sensitivity of the 8 infrared channels of SEVIRI to these perturbations, dL_j/dx.
[Table 3: Summary of Random Error Perturbations and Sensitivities]

Slide 15: Combining all Random Errors
- All uncertainties due to random processes are added in quadrature: u_rand(L̃) = √(Σ_j u_j(L̃)²)
- Random variability in space and time dominates the total random uncertainty for all channels
  - A 300 s threshold matches 3 km well
- Other terms are negligible => the geometric collocation threshold could be relaxed!

Slide 16: Compare with Quoted Uncertainty
- This total random uncertainty is 1–4x larger than the quoted values
  => the ATBD does not include important random processes
- The time series of Standard Biases shows higher variability
  => this implies there are real instrument calibration changes

Slide 17: Combining Systematic and Random Errors
- Total uncertainties due to random and systematic processes: u(L̃) = √(u_rand(L̃)² + u_sys(L̃)²)
- Random components dominate the total error in most conditions
- Uncertainties increase rapidly for low Tb (fewer collocations)
- Errors are much lower in the WV channels

Slide 18: Recommendations
- The geometric collocation criteria could be relaxed by a factor of 10
  - This would give more collocations and reduce the random error
- The ATBD should be revised to account for correlations when estimating the uncertainty on the GSICS Correction
  - Or inflate the uncertainty from the regression by a factor of ~2
- The analysis assumes the published SRFs are correctly interpreted
  - Misinterpretation would dominate the systematic errors
  - Clear guidance is needed on the application of published SRFs
- This analysis should be repeated for all GSICS Products!
