1 Annual GSICS Calibration Report for {Agency} – action proposal for 2019 GRWG/GDWG Annual Meeting
Masaya Takahashi (JMA), Dohyeong Kim (KMA)

2 Background (1/2)
CGMS member agencies have been reporting their spacecraft status to CGMS using a common format (a text-format document).
Reports for CGMS on instrument performance, based on GSICS inter-calibration approaches, could be produced in the same way.

3 Background (2/2)
In 2017, GSICS-EP asked GSICS member agencies to provide instrument performance reports validated by GSICS inter-calibration methods, with benefits for both GSICS and satellite data user communities.
The draft template was prepared through the 2018 Annual Meeting and gsics-dev. Great thanks to all who contributed to the discussion and provided report examples!
The GDWG Chair reported the template (example) at GSICS-EP-19 in June 2018. EP's point: information on scene-dependent biases (if they exist) and on the accuracy of reference instruments from Double Difference approaches would be useful, as well as time series charts of monitored instruments' biases.
Reference: EP-18.A03. Action description: GDWG and GRWG to develop an approach for an Annual GSICS report on the State of the Observing System with respect to Instrument Performance and Inter-comparisons with GSICS Reference Instruments. Actionee: GDWG and GRWG. Due date: EP-19 (Jun 2018).

4 Concepts of Annual Report – Template as of 2018-10-31
Target audience: satellite data users + satellite operators (incl. non-calibration experts)
Keeping report contents SIMPLE to keep GSICS members' effort to a minimum
Satellite/instrument events relevant to calibration: 1 page per agency
Radiometric calibration performance for MONITORED instruments: 1-2 pages per instrument, using GSICS inter-calibration approaches; detailed information should be referred to the Calibration Landing Page
Adopting existing formats where possible: the color chart used on the NOAA satellite status webpage, and visualization tools (e.g. GSICS Plotting Tool)
Geometric/spectral calibration and GSICS reference instruments' performance: to be considered in future (2nd step), even though GSICS-EP asked to show reference instruments' performance using the double difference approach for GEO instruments

5 Discussion
Proposed action: GSICS member agencies operating geostationary satellites to report their instruments' radiometric calibration performance and relevant calibration events in the Agency Report of the 2019 GRWG/GDWG Annual Meeting.
Discussion points for the report template:
Number of slides: 1 slide for GEO instruments' status and calibration-relevant events? 1-2 slides for each instrument?
Stats to be reported: a table of calibration bias/uncertainty for the last year (or a snapshot for a certain month of the last year)? A figure showing the time series of bias/uncertainty from the start of operations, to show calibration trends?
Reference instruments' validation (by double difference): to be left for future reports by the agencies responsible for them?

6 {Year} Annual GSICS Calibration Report
Report Template - Proposal, 05 April 2019
{Name}, {Agency}

7 Report Template - Proposal: {Year} Satellite/Instrument Summary
Please add hyperlinks over instrument names which navigate to the agency's Calibration Landing Page.
Satellite (status) | Location | Launch date | EO instruments (operation start date)
{Satellite#1} ({Status}) | {SSP} | YYYY-MM-DD | {Instrument#1} (YYYY-MM-DD)
{Satellite#2} ({Status}) | {SSP} | YYYY-MM-DD | {Instrument#2} (YYYY-MM-DD)
Major calibration-relevant events in {Year}:
YYYY-MM-DD: XXXXXXXX
YYYY-MM-DD: YYYYYYYY
Legends for satellite/instrument status
Satellite Status: Op = Operational, RSS = Rapid Scanning Service, P = Pre-operational, B = Back-up (secondary), L = Limited availability
Instrument Status: Operational (or capable of); Operational with limitations (or Standby); Operational with Degraded Performance; Not Operational; Functional, Turned Off

8 Report Template for VIS/NIR - Proposal
Summary Stats of {Satellite}/{Instrument} VIS/NIR Calibration Performance in {Year} (all uncertainties are k=1)
Reference: {Satellite}/{Instrument}
Table columns: Channel Name (μm), i.e. CH XX (XXX), plus Units
Table rows: {Method#1} Mean Bias (unc.), {Method#1} Annual Drift (unc.), {Method#2} Mean Bias (unc.), {Method#2} Annual Drift (unc.)
Stats derived from {Satellite}/{Instrument} GSICS Re-Analysis Correction ({Link to ATBD})
{Description of inter-calibration method #1}
{Description of inter-calibration method #2}
{Notes on validation results (if any)}
Figure: time series of gains derived for {Satellite}/{Instrument} CH XXX w.r.t. {Reference}; y-axis: Monitored Instrument Gain [W/m2/sr/μm]
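As an illustration of how the Mean Bias and Annual Drift entries (with k=1 uncertainties) could be filled, a minimal Python/NumPy sketch is given below. It assumes a time series of daily relative biases of the monitored instrument against the reference has already been produced by the chosen inter-calibration method; the function and variable names are hypothetical and not part of any GSICS tool.

```python
# Minimal sketch (hypothetical names, not a GSICS tool): Mean Bias and Annual
# Drift with k=1 uncertainties from a time series of relative biases [%].
import numpy as np

def summary_stats(days_since_start, bias_percent):
    """days_since_start: observation times in days;
    bias_percent: (monitored / reference - 1) * 100 for each observation."""
    bias = np.asarray(bias_percent, dtype=float)
    t_years = np.asarray(days_since_start, dtype=float) / 365.25

    mean_bias = bias.mean()
    mean_bias_unc = bias.std(ddof=1) / np.sqrt(bias.size)   # k=1 standard error

    # Annual Drift as the slope of an ordinary least-squares fit, in % per year
    coeffs, cov = np.polyfit(t_years, bias, deg=1, cov=True)
    drift, drift_unc = coeffs[0], np.sqrt(cov[0, 0])        # slope and k=1 unc.
    return mean_bias, mean_bias_unc, drift, drift_unc

# Synthetic example (not real instrument data): -0.5 %/yr drift plus noise
t = np.arange(0, 730, 5)
rng = np.random.default_rng(0)
bias = 2.0 - 0.5 * t / 365.25 + rng.normal(0.0, 0.3, t.size)
print(summary_stats(t, bias))
```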

9 Report Template for IR - Proposal
Summary Stats of {Satellite}/{Instrument} IR Calibration Performance in {Year} (all uncertainties are k=1)
Table columns: Channel Name (μm), i.e. CH XX (XXX)
Table rows: Std. Rad. as Tb (K); {Reference#1} Mean Bias (K), Stdv. of Bias (K); {Reference#2} Mean Bias (K), Stdv. of Bias (K)
Stats derived from {Satellite}/{Instrument} GSICS Re-Analysis Correction ({Link to ATBD})
Standard Radiance: typical scene defined by GSICS for easy inter-comparison of sensors' inter-calibration biases
{Notes on validation results (if any)}
Figure: time series of {Satellite}/{Instrument} Tb biases w.r.t. {Reference} at the standard radiance
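For non-specialist readers, the sketch below shows how a small radiance bias evaluated at the GSICS standard radiance translates into the Tb bias (in K) reported in the table. It uses a monochromatic Planck approximation at an assumed channel wavenumber and an assumed bias value; an operational implementation would use the sensor's spectral response and the bias from the GSICS Re-Analysis Correction.

```python
# Hedged illustration (monochromatic approximation, assumed numbers): express
# a radiance bias at the standard radiance as a brightness-temperature bias.
import numpy as np

C1 = 1.191042e-5   # mW/(m^2 sr cm^-4), first radiation constant for radiance
C2 = 1.4387752     # K cm, second radiation constant

def planck(wavenumber_cm1, tb_k):
    """Monochromatic Planck radiance [mW/(m^2 sr cm^-1)]."""
    return C1 * wavenumber_cm1**3 / (np.exp(C2 * wavenumber_cm1 / tb_k) - 1.0)

def brightness_temperature(wavenumber_cm1, radiance):
    """Inverse Planck: brightness temperature [K] from radiance."""
    return C2 * wavenumber_cm1 / np.log(1.0 + C1 * wavenumber_cm1**3 / radiance)

# Assumed example: window channel near 10.8 um (~926 cm^-1), standard scene Tb 286 K
wn = 926.0
std_rad = planck(wn, 286.0)     # standard radiance for this channel
rad_bias = 0.001 * std_rad      # assume a +0.1 % radiance bias
tb_bias = brightness_temperature(wn, std_rad + rad_bias) - 286.0
print(f"Tb bias at the standard radiance: {tb_bias:+.3f} K")
```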

10 The following slides are report examples for GEO imagers – thanks for your cooperation
EUMETSAT (Tim Hewison and Sebastien Wagner)
NOAA (Fred Wu and Fangfang Yu)
JMA (Masaya Takahashi and Yusuke Yogo)
KMA (Dohyeong Kim and Minju Gu)
ROSHYDROMET (Alexey Rublev and Alexander Uspensky)

11 2017 Annual GSICS Calibration Report for EUMETSAT
Example#1, 05 April 2019
Tim Hewison, Sebastien Wagner (EUMETSAT)

12 Satellite/Instrument Summary - GEO
Hyperlinks on instrument names navigate to the Calibration Landing Page.
Satellite (status) | Location | EO instruments
Meteosat-7 (B) | 57 E | MVIRI
Meteosat-8 (Op) | 41.5 E | SEVIRI, GERB
Meteosat-9 (RSS) | 9.5 E | SEVIRI, GERB
Meteosat-10 | 0.0 E | SEVIRI, GERB
Meteosat-11 | 3.4 W | SEVIRI, GERB
Satellite Status (as used in the CGMS Working Paper): Op = Operational, P = Pre-operational, B = Back-up (secondary), L = Limited availability, RSS = Rapid Scanning Service
Instrument Status (color code used on the NOAA/OSPO GOES/POES status webpages): Operational (or capable of); Operational with limitations (or Standby); Operational with Degraded Performance; Not Operational; Functional, Turned Off
Major calibration-relevant events in 2017:
Meteosat-8 took over from Meteosat-7 as prime spacecraft for the IODC Service; Meteosat-7 was disseminated in parallel until Meteosat-7 was re-orbited.
Correction to the geo-referencing offset in the SEVIRI Level 1.5 image.

13 Calibration Performance: Meteosat/SEVIRI Visible Bands
Summary Statistics of Meteosat/SEVIRI VIS/NIR Calibration Performance (all uncertainties are k=1)
Channels: VIS0.6, VIS0.8, NIR1.6, HRV
Meteosat-8:
Mean Bias (DCC-SSCC) [%]: +13.5 ± 5.6
Annual Drift (DCC) [%/yr]: -0.08 ± 0.13
Annual Drift (SSCC) [%/yr]: -0.56 ± 0.01, -0.53 ± 0.03, -0.03 ± 0.03, -0.51 ± 0.04
Annual Drift (LCS) [%/yr]: -0.54 ± 0.01, -0.52 ± 0.01, -0.02 ± 0.02, -0.53 ± 0.02
Meteosat-10:
Mean Bias (DCC-SSCC) [%]: +10.4 ± 1.4
Annual Drift (DCC) [%/yr]: -0.54 ± 0.03
Annual Drift (SSCC) [%/yr]: -0.75 ± 0.20, -0.63 ± 0.14, -0.02 ± 0.13, -0.76 ± 0.15
Annual Drift (LCS) [%/yr]: -0.88 ± 0.03, -0.77 ± 0.03, -0.23 ± 0.10, -0.99 ± 0.08
Mean Bias: mean and SD of the difference of DCC from the operational (SSCC) calibration for all 2017 pentad results
Annual Drift: mean gain drift and uncertainty calculated from DCC and the operational (SSCC) calibration over each satellite's lifetime
DCC: demonstration GSICS Deep Convective Cloud inter-calibration, with respect to Aqua/MODIS
SSCC: SEVIRI Solar Channel Calibration system = vicarious calibration used operationally for SEVIRI
LCS: Lunar Calibration System, based on the GIRO
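The slide's own note defines Mean Bias (DCC-SSCC) as the mean and SD of the difference of the DCC results from the operational (SSCC) calibration over all 2017 pentads. Below is a small sketch of that statistic with hypothetical variable names and synthetic gain values, not EUMETSAT's actual processing chain.

```python
# Sketch with synthetic numbers (not EUMETSAT's processing chain): mean and SD
# of the percent difference of DCC-derived gains from operational SSCC gains,
# evaluated over the pentad (5-day) results of one year.
import numpy as np

def dcc_vs_sscc_bias(dcc_gain, sscc_gain):
    """Mean and SD (k=1) of the percent difference of DCC from SSCC gains."""
    diff_pct = (np.asarray(dcc_gain) / np.asarray(sscc_gain) - 1.0) * 100.0
    return diff_pct.mean(), diff_pct.std(ddof=1)

# 73 synthetic pentads: DCC gains roughly 13.5 % above SSCC with 5.6 % scatter
rng = np.random.default_rng(4)
sscc = np.full(73, 0.021)                           # arbitrary gain units
dcc = sscc * (1.135 + rng.normal(0.0, 0.056, 73))
print(dcc_vs_sscc_bias(dcc, sscc))
```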

14 Calibration Performance: Meteosat/SEVIRI Visible Bands
Figure: time series of gains derived for the Meteosat-8 (MSG1) and Meteosat-10 (MSG3) VIS0.6 channel, SSCC (red) and DCC (green)
Mean Bias: mean and SD of the difference of DCC from the operational (SSCC) calibration for all 2017 pentad results
Annual Drift: mean gain drift and uncertainty calculated from DCC and the operational (SSCC) calibration over each satellite's lifetime
DCC: demonstration GSICS Deep Convective Cloud inter-calibration, with respect to Aqua/MODIS
SSCC: SEVIRI Solar Channel Calibration system = vicarious calibration used operationally for SEVIRI

15 Calibration Performance: Meteosat-8/SEVIRI IR Bands
Summary Statistics of Meteosat-8/SEVIRI IR Calibration Performance in 2017 (all uncertainties are k=1)
Channels: IR3.9, IR6.2, IR7.3, IR8.7, IR9.7, IR10.8, IR12.0, IR13.4
Standard Radiance as Tb (K): 284, 236, 255, 261, 286, 285, 267
Mean Bias (K): +0.57, -0.16, +0.38, +0.01, -0.08, +0.04, +0.35
Standard Deviation of Bias (K): 0.03, 0.05, 0.08, 0.07, 0.04, 0.40
Mean Drift Rate of Bias (K/yr): -0.07, -0.10, -0.22, -0.04, -0.14, -0.05
The statistics are derived from the Meteosat-8/SEVIRI Operational GSICS Re-Analysis Correction vs. Metop-A/IASI
Biases are defined for the Standard Radiance: a typical scene for easy inter-comparison of sensors' inter-calibration biases
Decontaminations introduce calibration jumps, most obvious in the IR13.4 channel due to ice contamination
Figure: time series of Meteosat-8/SEVIRI Tb biases w.r.t. Metop-A/IASI at the standard radiance

16 Calibration Performance: Meteosat-9/SEVIRI IR Bands
Summary Statistics of Meteosat-9/SEVIRI IR Calibration Performance in 2017 (all uncertainties are k=1)
Channels: IR3.9, IR6.2, IR7.3, IR8.7, IR9.7, IR10.8, IR12.0, IR13.4
Standard Radiance as Tb (K): 284, 236, 255, 261, 286, 285, 267
Mean Bias (K): +0.36, -0.13, +0.09, +0.01, -0.03, +0.03, -0.01, -1.41
Standard Deviation of Bias (K): 0.08, 0.12, 0.15, 0.11, 0.18, 0.13, 0.16
Mean Drift Rate of Bias (K/yr): -0.23, -0.34, -0.39, -0.08, -0.44, -0.21, -0.46
The statistics are derived from the Meteosat-9/SEVIRI Operational GSICS Re-Analysis Correction vs. Metop-A/IASI
Biases are defined for the Standard Radiance: a typical scene for easy inter-comparison of sensors' inter-calibration biases
Meteosat-9 operated in the Rapid Scan Service during most of this period, which increases the bias uncertainties
Figure: time series of Meteosat-9/SEVIRI Tb biases w.r.t. Metop-A/IASI at the standard radiance

17 2017 Annual GSICS Calibration Report for NOAA
Example#2, 05 April 2019
Fred Wu, Fangfang Yu (NOAA)

18 Calibration Performance: GOES-16/ABI Infrared Bands
Summary Statistics of GOES-16/ABI IR Calibration Performance in December 2017 (all uncertainties are k=1)
Channels (central wavelength in μm): BAND07 (3.9), BAND08 (6.2), BAND09 (6.9), BAND10 (7.3), BAND11 (8.6), BAND12 (9.6), BAND13 (10.4), BAND14 (11.2), BAND15 (12.4), BAND16 (13.3)
Std. Scene Tb (K): 286.0, 234.5, 244.0, 254.5, 284.0, 259.5, 283.5, 269.5
Metop-B/IASI Bias at Std. Scene (K): -0.167, -0.196, -0.218, -0.170, -0.204, -0.227, -0.210, -0.141, -0.153, -0.294
Metop-B/IASI Stdv. of Bias (K): 0.120, 0.082, 0.093, 0.108, 0.147, 0.110, 0.160, 0.165, 0.169
S-NPP/CrIS Bias at Std. Scene (K): -, -0.259, -0.202, -0.160, -0.176, -0.282
S-NPP/CrIS Stdv. of Bias (K): 0.045, 0.052, 0.047, 0.073, 0.094
The uncertainties and statistics are calculated following the GSICS standard GEO-LEO IR inter-calibration algorithm
GOES-16 ABI IR calibration is very stable, with mean Tb biases to CrIS/IASI of less than 0.3 K
No significant scene-dependent Tb bias to the reference instruments for any of the IR channels
The GOES-16 ABI post-launch test started in January, and the instrument became operational on 18 December; L1B data have been available to the public since the provisional maturity declaration on 1 June 2017
Stable reference and monitored instruments allow calibration events to be quickly detected and identified (e.g. Metop-B/IASI and GOES-16 ABI ground updates) and the algorithm to be validated (e.g. the ABI calibration algorithm update in October 2017)
Figures: time series of GOES-16 ABI daily mean Tb bias to S-NPP/CrIS and Metop-B/IASI for ABI B12, with the Metop-B/IASI and GOES-16 ABI calibration algorithm updates marked; scene-dependent Tb bias to S-NPP/CrIS for ABI B16; scene-dependent Tb bias to Metop-B/IASI for ABI B16
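The slide notes that no significant scene-dependent Tb bias is seen against the reference instruments. Below is a minimal sketch of one way such a check could be made, regressing collocated GEO-minus-reference Tb differences against the reference scene Tb; the inputs and names are hypothetical and synthetic, not NOAA's actual diagnostics.

```python
# Hypothetical sketch (synthetic inputs, not the operational GEO-LEO IR code):
# test for scene-dependent Tb bias by regressing the GEO-minus-reference
# brightness-temperature difference against the reference scene Tb.
import numpy as np

def scene_dependence(ref_scene_tb_k, geo_minus_ref_k):
    """Slope [K/K] and intercept [K] of bias vs. scene Tb; a slope consistent
    with zero suggests no significant scene dependence."""
    slope, intercept = np.polyfit(ref_scene_tb_k, geo_minus_ref_k, deg=1)
    return slope, intercept

# Synthetic collocations: a flat -0.2 K bias with 0.15 K noise, 200-300 K scenes
rng = np.random.default_rng(1)
scene = rng.uniform(200.0, 300.0, 5000)
bias = -0.2 + rng.normal(0.0, 0.15, scene.size)
print(scene_dependence(scene, bias))
```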

19 2017 Annual GSICS Calibration Report for JMA
Example#3, 05 April 2019
Masaya Takahashi, Yusuke Yogo (Japan Meteorological Agency)

20 Satellite/Instrument Summary - GEO
Hyperlinks on instrument names navigate to the Calibration Landing Page.
Satellite (status) | Location | EO instruments
Himawari-8 (Op) | 140.7 E | AHI
Himawari-9 (B) | | AHI
Major calibration-relevant events in 2017:
Himawari-9 was put into in-orbit standby as backup for Himawari-8
Disclosure of the sensitivity trends (calibration gain) of the Himawari-8/AHI VNIR bands
Update of the ground processing system, incl. reduction of banding and stripe noise in the Himawari-8/AHI VNIR bands
Satellite Status: Op = Operational, P = Pre-operational, B = Back-up (secondary), L = Limited availability
Instrument Status: Operational (or capable of); Operational with limitations (or Standby); Operational with Degraded Performance; Not Operational; Functional, Turned Off

21 Calibration Performance: Himawari-8/AHI Visible/Near-Infrared Bands
Summary Statistics of Himawari-8/AHI VNIR Calibration Performance (all uncertainties are k=1)
Channels (central wavelength in μm): BAND01 (0.47), BAND02 (0.51), BAND03 (0.64), BAND04 (0.86), BAND05 (1.6), BAND06 (2.3)
Ray-matching w/ S-NPP/VIIRS:
Mean Bias [%]: -2.2 ± 0.8, -3.1 ± 0.8, +2.9 ± 0.8, +0.8 ± 0.7, +6.3 ± 0.8, -4.9 ± 0.9
Annual Drift [%/yr]: -0.53 ± 0.15, -0.52 ± 0.12, -0.79 ± 0.09, -0.58 ± 0.06, -0.06 ± 0.13, -0.18 ± 0.11
Vicarious calibration using Aqua/MODIS + RTM:
Mean Bias [%]: -1.0 ± 1.6, -1.7 ± 1.8, +1.9 ± 1.8, +2.8 ± 1.6, +4.3 ± 1.0, -3.5 ± 0.9
Annual Drift [%/yr]: -0.72 ± 0.17, -0.92 ± 0.18, -1.26 ± 0.17, -1.20 ± 0.17, -0.39 ± 0.12, -0.24 ± 0.19
Mean Bias: monthly average and standard deviation of the daily results in January 2017
Annual Drift: calculated using the Mean Bias from July 2015 to December 2017
Ray-matching: Spectral Band Adjustment Factors from the NASA Langley website compensate for spectral differences (ATBD)
Vicarious calibration uses optical parameters retrieved from Aqua/MODIS Collection 6 L1B (reference)
Figure: trend of the ratio of observation to reference computed using the ray-matching / vicarious calibration approaches for BAND01 (0.47 μm), BAND03 (0.64 μm) and BAND05 (1.6 μm)
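The ray-matching entries rely on a Spectral Band Adjustment Factor (SBAF) to account for the spectral difference between the AHI band and the VIIRS reference band. Below is a hedged sketch of how such a comparison could look once collocated, ray-matched reflectances are available; the SBAF value, the linear-scaling form and all names are illustrative assumptions, not the NASA Langley tool or JMA's processing.

```python
# Illustrative sketch (assumed linear SBAF and synthetic data): percent bias of
# a GEO band against a spectrally adjusted LEO reference from ray-matched pairs.
import numpy as np

def ray_matching_bias(geo_reflectance, leo_reflectance, sbaf):
    """Mean percent bias and SD of the GEO band vs. the SBAF-adjusted reference."""
    adjusted_ref = sbaf * np.asarray(leo_reflectance, dtype=float)
    ratio = np.asarray(geo_reflectance, dtype=float) / adjusted_ref
    return (ratio.mean() - 1.0) * 100.0, ratio.std(ddof=1) * 100.0

# Synthetic ray-matched collocations: GEO band reading about 2 % low
rng = np.random.default_rng(2)
ref = rng.uniform(0.2, 0.9, 1000)                           # reference reflectances
geo = 0.98 * 1.01 * ref + rng.normal(0.0, 0.005, ref.size)  # 1.01 = assumed SBAF
print(ray_matching_bias(geo, ref, sbaf=1.01))
```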

22 Calibration Performance: Himawari-8/AHI Infrared Bands
Summary Statistics of Himawari-8/AHI IR Calibration Performance in 2017 (all uncertainties are k=1)
Channels (central wavelength in μm): BAND07 (3.9), BAND08 (6.2), BAND09 (6.9), BAND10 (7.3), BAND11 (8.6), BAND12 (9.6), BAND13 (10.4), BAND14 (11.2), BAND15 (12.4), BAND16 (13.3)
Std. Radiance as Tb (K): 286.0, 234.6, 243.9, 254.6, 283.8, 259.5, 286.2, 286.1, 269.7
Metop-A/IASI Mean Bias (K): -0.11, -0.173, -0.212, -0.129, -0.05, -0.216, 0.036, 0.045, -0.04, 0.078
Metop-A/IASI Stdv. of Bias (K): 0.008, 0.012, 0.009, 0.014, 0.017, 0.018, 0.019, 0.015
S-NPP/CrIS Mean Bias (K): -0.07, -0.16, -0.24, -0.15, N/A, -0.23, -0.02, -0.01, 0.03
S-NPP/CrIS Stdv. of Bias (K): 0.039, 0.011, 0.026, 0.013, 0.010, 0.005
The statistics are derived from the Himawari-8/AHI GSICS Re-Analysis Correction (ATBD)
Standard Radiance: typical scene defined by GSICS for easy inter-comparison of sensors' inter-calibration biases
Figures: time series of Himawari-8/AHI Tb biases w.r.t. Metop-A/IASI at the standard radiance; double difference CrIS – IASI-A in 2017 [K], computed as (AHI-8 – IASI-A) – (AHI-8 – CrIS), as a function of Tb [K]
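The second figure uses the double difference quoted on the slide, (AHI-8 – IASI-A) – (AHI-8 – CrIS), in which the AHI biases cancel and what remains is an estimate of CrIS – IASI-A. Below is a minimal sketch of that calculation with synthetic daily biases; the array names and numbers are hypothetical.

```python
# Sketch of the double difference on the slide: (AHI - IASI) - (AHI - CrIS)
# equals CrIS - IASI, so the monitored instrument's own bias cancels to first
# order. Inputs are synthetic daily biases, not real AHI results.
import numpy as np

def double_difference(geo_minus_iasi_k, geo_minus_cris_k):
    """Mean CrIS - IASI difference [K] and its k=1 standard error."""
    dd = np.asarray(geo_minus_iasi_k) - np.asarray(geo_minus_cris_k)
    return dd.mean(), dd.std(ddof=1) / np.sqrt(dd.size)

# Synthetic example: CrIS assumed to read 0.05 K warmer than IASI
rng = np.random.default_rng(3)
geo_minus_iasi = -0.20 + rng.normal(0.0, 0.02, 365)
geo_minus_cris = -0.25 + rng.normal(0.0, 0.02, 365)
print(double_difference(geo_minus_iasi, geo_minus_cris))
```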

23 2017 Annual GSICS Calibration Report for KMA
Example#4, 05 April 2019
Dohyeong Kim, Minju Gu (Korea Meteorological Administration)

24 Calibration Performance: COMS/MI Infrared Bands
Summary Statistics of COMS/MI IR Calibration Performance in 2017 (all uncertainties are k=1)
Channels: IR3.8, IR6.8, IR10.8, IR12.0
Std Rad as Tb (K): 286, 238, 285
References MetOp-A/IASI and MetOp-B/IASI:
Mean Bias (K): 0.16, -0.02, 0.12, 0.004, 0.15, -0.06, 0.11
Stdv of Bias (K): 0.03, 0.01, 0.05, 0.02, 0.04
Mean Drift Rate of Bias (K/yr): -0.14, -, -0.12, -0.01, -0.15, -0.04
References S-NPP/CrIS and Aqua/AIRS:
Mean Bias (K): -, -0.23, -0.03, -0.02, -0.19, -0.30, 0.02
Stdv of Bias (K): 0.01, 0.14, 0.08, 0.07, 0.09
Mean Drift Rate of Bias (K/yr): +0.03, +0.19, -0.005, -0.007
The statistics are derived from the COMS/MI Operational GSICS Re-Analysis Correction vs. Metop-A/IASI, Metop-B/IASI, Aqua/AIRS and S-NPP/CrIS
Biases are defined for the Standard Radiance: a typical scene for easy inter-comparison of sensors' inter-calibration biases
Operation of MI with the WV SRF shifted by 3.5 cm-1 started on 5 December 2017

25 2017 Annual GSICS Calibration Report for ROSHYDROMET
Example#5, 05 April 2019
Alexey Rublev, Alexander Uspensky (ROSHYDROMET)

26 Satellite/Instrument Summary - GEO
Satellite (status) | Location | EO instruments
Elektro-L No.2 (Op) | 76E | MSU-GS
Satellite Status: Op = Operational, P = Pre-operational, B = Back-up (secondary), L = Limited availability
Instrument Status: Operational (or capable of); Operational with limitations (or Standby); Operational with Degraded Performance; Not Operational; Functional, Turned Off
The MSU-GS instrument is functional with limitations (the 12 μm channel is absent). Absolute calibration is ongoing.

27 Calibration Performance of MSU-GS: Visible/Near-Infrared Channels
Inter-calibration of the MSU-GS shortwave channels versus S-NPP/VIIRS
Comparison above deep convective clouds over the Indian Ocean, within ±20° North/South and East/West of the Elektro-L No.2 sub-satellite point (76E)
Calculated SBAFs between MSU-GS and VIIRS are ~1.0 for each pair of channels
Monthly regression coefficients (k1, k2, k3, each with its standard error*) and number of cases:
April: 0.959 ± 0.021, 1.091 ± 0.038, 1.096 ± 0.031, 384 cases
May: 0.995 ± 0.009, 1.082 ± 0.010, 1.126 ± 0.012, 383 cases
June: 0.980 ± 0.095, 1.049 ± 0.091, 1.076 ± 0.076, 227 cases
July: 0.978 ± 0.036, 1.052 ± 0.040, 1.088 ± 0.035, 651 cases
August: 0.974 ± 0.118, 1.066 ± 0.120, 1.086, 116 cases
September: 0.987 ± 0.037, 1.116 ± 0.030, 554 cases
Weighted mean: 0.989 ± 0.008, 1.079, 1.118, 2315 cases
*Std error: standard regression error
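The "Weighted mean" row can be reproduced quite closely by inverse-variance weighting of the monthly coefficients with their standard regression errors. The sketch below uses the k1 values from the table; the weighting scheme is an assumption for illustration, not necessarily the exact one used by ROSHYDROMET.

```python
# Sketch (assumed inverse-variance weighting): combine the monthly k1 values
# from the table above into a weighted mean with a k=1 uncertainty.
import numpy as np

def weighted_mean(coeffs, std_errors):
    """Inverse-variance weighted mean and its k=1 uncertainty."""
    c = np.asarray(coeffs, dtype=float)
    w = 1.0 / np.asarray(std_errors, dtype=float) ** 2
    return np.sum(w * c) / np.sum(w), np.sqrt(1.0 / np.sum(w))

# Monthly k1 and standard errors, April to September (from the table above)
k1 = [0.959, 0.995, 0.980, 0.978, 0.974, 0.987]
se = [0.021, 0.009, 0.095, 0.036, 0.118, 0.037]
print(weighted_mean(k1, se))   # approximately 0.989 +/- 0.008, as reported
```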

