The SST Quality Monitor (SQUAM). 1st GHRSST Int'l User's Symposium, 28-29 May 2009, Santa Rosa, CA. Alexander "Sasha" Ignatov


Slide 1: The SST Quality Monitor (SQUAM)
1st GHRSST Int'l User's Symposium, 28-29 May 2009, Santa Rosa, CA
Alexander "Sasha" Ignatov*, Prasanjit Dash*, John Sapper**, and Yury Kihai*
NOAA/NESDIS; *Center for Satellite Applications & Research (STAR); **Office of Satellite Data Processing & Distribution (OSDPD)

Slide 2: NESDIS Operational AVHRR SST Products and SQUAM Objectives
- Heritage Main Unit Task (MUT): operational to present (McClain et al., 1985; Walton et al., 1998)
- New Advanced Clear-Sky Processor for Oceans (ACSPO): May 2008 – present
SST Quality Monitor (SQUAM):
- Evaluate MUT and ACSPO SST products in near-real time for self-, cross-platform, and cross-product consistency
- Identify product anomalies and help diagnose their causes (e.g., sensor malfunction, cloud mask, or SST algorithm)

Slide 3: AVHRR SST, MetOp-A GAC, 3 January 2008 (Daytime)
- Heritage MUT SST product: native pixel resolution 8 km; 6.6×10^4 SST pixels; 8.3% of ocean covered at 1 day × 0.3° resolution
- ACSPO SST product: native pixel resolution 4 km; 2.1×10^6 SST pixels (MUT × 30); 32.7% of ocean covered at 1 day × 0.3° resolution
ACSPO is superior to MUT in coverage. But how do we evaluate SST performance? SST imagery is often inspected visually for quality and artifacts. However, the large-scale SST background dominates, and it is not easy to discern "signal" from "noise".
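Coverage figures like those above can be obtained by binning retrieved pixels onto the 1 day × 0.3° grid and counting populated ocean cells. A minimal sketch, assuming 1-D pixel latitude/longitude arrays and a known count of ocean grid cells; the function name and arguments are illustrative, not SQUAM's actual code:

```python
import numpy as np

def fraction_ocean_covered(lat, lon, ocean_cells_total, cell_deg=0.3):
    """Fraction of ocean grid cells that contain at least one SST pixel.

    lat, lon          : 1-D arrays of retrieved-pixel coordinates (degrees)
    ocean_cells_total : number of grid cells classified as ocean
    """
    # Map each pixel to a (row, col) grid-cell index
    i = np.floor((np.asarray(lat) + 90.0) / cell_deg).astype(np.int64)
    j = np.floor((np.asarray(lon) + 180.0) / cell_deg).astype(np.int64)
    ncols = int(round(360.0 / cell_deg))
    # Distinct cells holding at least one pixel
    covered = np.unique(i * ncols + j).size
    return covered / ocean_cells_total
```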

Slide 4: Removing the large-scale SST background (daily Reynolds) emphasizes "noise"
[Maps: Heritage MUT SST product; ACSPO SST product]
Mapping deviations from a global reference field (e.g., Reynolds daily 0.25°) reveals artifacts in the product (e.g., cold stripes at AVHRR swath edges).
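Removing the background amounts to subtracting the reference field, sampled at each pixel location, from the retrieved SST. A sketch using nearest-cell lookup into a hypothetical regular 0.25° Reynolds-like grid (names and grid layout are assumptions for illustration):

```python
import numpy as np

def sst_anomaly(sst, lat, lon, ref_grid, cell_deg=0.25):
    """Deviation of pixel SSTs from a global reference field.

    sst, lat, lon : 1-D arrays of pixel SSTs (K) and coordinates (degrees)
    ref_grid      : 2-D reference SST field (K), shape (180/cell, 360/cell),
                    row 0 at -90 deg latitude, column 0 at -180 deg longitude
    """
    i = np.clip(((np.asarray(lat) + 90.0) / cell_deg).astype(int),
                0, ref_grid.shape[0] - 1)
    j = np.clip(((np.asarray(lon) + 180.0) / cell_deg).astype(int),
                0, ref_grid.shape[1] - 1)
    # Large-scale background removed; small-scale 'noise' and artifacts remain
    return np.asarray(sst) - ref_grid[i, j]
```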

Slide 5: Quantitatively, satellite SSTs are validated against in situ SSTs. However, in situ SSTs have limitations:
- They are sparse and geographically biased (they cover the retrieval domain incompletely and non-uniformly)
- They have non-uniform and suboptimal quality (often comparable to, or worse than, satellite SSTs)
- They are not available in near-real time in sufficient numbers to cover the full geographical domain and retrieval space

Slide 6: Try using global reference fields for quantitative evaluation of satellite SST
- Satellite and reference SSTs are subject to near-Gaussian errors:
  T_SAT = T_TRUE + ε_SAT;  ε_SAT ~ N(μ_sat, σ_sat²)
  T_REF = T_TRUE + ε_REF;  ε_REF ~ N(μ_ref, σ_ref²)
  where the μ's and σ's are global means and standard deviations of the ε's.
- The residual is distributed near-normally:
  ΔT = T_SAT - T_REF = ε_SAT - ε_REF;  ΔT ~ N(μ_ΔT, σ_ΔT²)
  where μ_ΔT = μ_sat - μ_ref and σ_ΔT² = σ_sat² + σ_ref² (if ε_SAT and ε_REF are independent).
- If T_REF = T_insitu, this is customary validation. If (μ_ref, σ_ref) are comparable to (μ_insitu, σ_insitu), and if ε_SAT and ε_REF are not too strongly correlated, then T_REF can be used to monitor T_SAT.
- None of the current T_REF fields (Reynolds, OSTIA, RTG, ODYSSEA) resolve the diurnal cycle.
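The error model above can be sanity-checked numerically: the unknown T_TRUE cancels in the residual, whose mean and variance follow the stated relations. A sketch with made-up μ's and σ's, assuming independent Gaussian errors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
t_true = rng.uniform(271.0, 303.0, n)      # 'true' SSTs, K (illustrative range)

mu_sat, sig_sat = 0.1, 0.4                 # hypothetical satellite error moments
mu_ref, sig_ref = 0.0, 0.3                 # hypothetical reference error moments
t_sat = t_true + rng.normal(mu_sat, sig_sat, n)
t_ref = t_true + rng.normal(mu_ref, sig_ref, n)

dt = t_sat - t_ref                         # T_TRUE cancels in the residual
# mean(dt) ~ mu_sat - mu_ref = 0.1 K
# std(dt)  ~ sqrt(sig_sat**2 + sig_ref**2) = 0.5 K
print(dt.mean(), dt.std())
```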

Slide 7: Global Histograms of T_SAT - T_REF (Nighttime MUT)

Slide 8: Histogram of SST residual; Reference SST: in situ
30 days of data: ~6,500 match-ups with in situ SST. Median = K; Robust Standard Deviation = 0.27 K

Slide 9: Histogram of SST residual; Reference SST: OSTIA
8 days of data: ~483,500 match-ups with OSTIA SST. Median = 0.00 K; Robust Standard Deviation = 0.30 K

Slide 10: Histogram of SST residual; Reference SST: daily Reynolds
8 days of data: ~483,700 match-ups with daily Reynolds SST. Median = K; Robust Standard Deviation = 0.42 K

Slide 11: Preliminary observations from global histogram analyses
- Global histograms of T_SAT - T_REF are close to Gaussian in shape against all T_REF, including T_insitu
- A Gaussian distribution is characterized by location and scale. Here, robust validation statistics are used: the Median and the Robust Standard Deviation (RSD)
- For some T_REF's (OSTIA), validation statistics are closer to those against T_insitu than for others (Reynolds). This preliminary observation is further verified on time series
** For more histograms (ACSPO vs. MUT, day and night, other platforms and reference SSTs), see the SQUAM page
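Robust location and scale can be computed as below. The slide does not define RSD exactly, so this sketch assumes the common scaled median absolute deviation (1.4826 × MAD, which equals σ for a pure Gaussian):

```python
import numpy as np

def robust_stats(residuals):
    """Location and scale of SST residuals, insensitive to outliers
    (e.g., residual cloud leakage in the tails of the histogram)."""
    residuals = np.asarray(residuals)
    med = np.median(residuals)
    rsd = 1.4826 * np.median(np.abs(residuals - med))   # scaled MAD
    return med, rsd
```

For a Gaussian sample the RSD converges to the ordinary standard deviation, while a handful of gross outliers leaves it nearly unchanged, which is why it is preferred over the plain standard deviation here.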

Slide 12: Nighttime Time Series: Global Median Biases of (T_SAT - T_REF)

Slide 13: Global Median Biases, T_SAT - T_insitu
- Each data point = 1 month of match-ups with in situ SST
- Median bias from 0 to K, except for NOAA-16 (sensor problems)
- MetOp-A and NOAA-17 fly close orbits but show a cross-platform bias of ~0.10 K

Slide 14: Global Median Biases, T_SAT - T_OSTIA
- Each data point: 1 week of match-ups with OSTIA SST
- Patterns reproducible yet crisper (finer temporal resolution)
- NOAA-18 more consistent with NOAA-17 than in validation against in situ
- OSTIA artifacts observed in the early period ( )

Slide 15: Global Median Biases, T_SAT - T_Reynolds
- Each data point: 1 week of match-ups with Reynolds SST
- Patterns reproducible but noisier than against OSTIA
- NOAA-18 is more consistent with NOAA-17 than in validation against in situ
- Artifacts in Reynolds SST differ from those in OSTIA

Slide 16: Observations from time series of global median biases
- Number of match-ups: two orders of magnitude larger against T_REF than against T_insitu
- Major trends and anomalies in T_SAT: captured well in time series against all T_REF, and crisper than against T_insitu
- Noise: some T_REF (Reynolds) are "noisier" than others (OSTIA)
- Artifacts: different artifacts are seen in different T_REF
** For more analyses (ACSPO vs. MUT, and other reference SSTs), see the SQUAM page
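The time series of the preceding slides are just the robust statistics evaluated per time bin (1 month against in situ, 1 week against the reference fields). A sketch that bins match-ups by week; the arrays of match-up times and residuals are hypothetical inputs, not SQUAM's actual data structures:

```python
import numpy as np

def weekly_median_bias(days, residuals):
    """Median of (T_SAT - T_REF) per 7-day bin.

    days      : 1-D array of match-up times, in days since some epoch
    residuals : 1-D array of T_SAT - T_REF (K), same length
    Returns (week_index, median_bias), one entry per populated week.
    """
    days = np.asarray(days)
    residuals = np.asarray(residuals)
    week = (days // 7).astype(int)
    weeks = np.unique(week)
    medians = np.array([np.median(residuals[week == w]) for w in weeks])
    return weeks, medians
```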

Slide 17: Time Series: Global RSDs of (T_SAT - T_REF)

Slide 18: Robust Standard Deviations, T_SAT - T_insitu
- On average, validation against in situ SST shows RSD ~0.3 K
- NOAA-16: anomalous noise in recent years (sensor problems)
- NOAA-18 (flies ~2 am) shows slightly better RSD than MetOp-A and NOAA-17 (fly ~10 pm)

Slide 19: Robust Standard Deviations, T_SAT - T_OSTIA
- On average, RSD ~ K
- NOAA-16: the same anomalies as against in situ SST, but crisper
- NOAA-18: slightly better RSD than for MetOp-A and NOAA-17, consistent with in situ validation

Slide 20: Robust Standard Deviations, T_SAT - T_Reynolds
- Before Jan 2006: RSD ~0.5 K; after Jan 2006: RSD ~0.4 K
- The change is likely due to the Jan 2006 switch from Pathfinder to NAVOCEANO SEATEMP as input to the Reynolds SST

Slide 21: Observations from time series of global RSDs
- Global RSDs are slightly larger against T_REF than against T_insitu
- For some T_REF (Reynolds), RSD is larger than for others (OSTIA)
- Nevertheless, all RSDs accurately track the performance of T_SAT
- Different artifacts are seen in different T_REF
** For more analyses (ACSPO vs. MUT, and other reference SSTs), see the SQUAM page

Slide 22: Summary and Future Work
- Validation against global reference fields is currently employed in SQUAM to monitor two NESDIS operational AVHRR SST products in near-real time
- It helps quickly uncover SST product anomalies, diagnose their root causes (SST algorithm, cloud mask, or sensor performance), and leads to corrections
- Work is underway to reconcile AVHRR and reference SSTs:
  - Improve AVHRR sensor calibration
  - Adjust T_REF for the diurnal cycle (e.g., Kennedy et al., 2007)
  - Improve the SST product (cloud screening, SST algorithms)
  - Provide feedback to T_REF producers
  The objective is to have a single "benchmark" SST in the NPOESS era
- Add NOAA-19, and eventually MetOp-B, -C and VIIRS, to SQUAM
- We are open to integration with GHRSST and to collaboration (to test other satellite and reference SSTs, diurnal correction, ...)

Slide 23: NESDIS NRT SST analyses on the web
- SQUAM page: real-time maps, histograms, time series (including double differences), and dependencies
- CALVAL page: Cal/Val of MUT and ACSPO data against in situ SST (currently password-protected, but will be open in 2-3 months)
- MICROS page (Monitoring of IR Clear-sky Radiances over Oceans for SST): validation of SST radiances against RTM calculations with Reynolds SST and NCEP GFS input