
1 Slide 1 Model Assimilation of Satellite Observations and Retrievals. Andrew Collard (IMSG@NOAA/NCEP/EMC)

2 Slide 2 Talk Outline
- Introduction
- The Global Model Cycle
- Data Assimilation
- Evaluating Impact
- Data Impact Results
- GOES-R and JPSS
- Summary

3 Slide 3 Introduction
This talk summarizes how we make use of satellite data in operational forecast models. We will discuss how satellite observations and derived products are used to produce our best estimate of the current state of the atmosphere. This can be considered a data fusion process, where the information from current and historical observations is combined in an optimal manner.

4 Slide 4 The Global Model Cycle

5 Slide 5

6 Slide 6

7 Slide 7 Data Assimilation

8 Slide 8 Data Assimilation
Data assimilation for NWP updates the atmospheric state that is used as the starting point for the forecast run. This state is the analysis. It is a retrieval of the entire atmospheric state using multiple data sources simultaneously; it is therefore also an efficient data fusion process.
The a priori or first-guess data used in the data assimilation process is a short-range forecast from a previous cycle.
- The first guess, and therefore the analysis, contains information from all previous observations, propagated forward using the atmospheric model.

9 Slide 9 Atmospheric Analysis Problem
J = J_b + J_o + J_c
J = (x - x_b)^T B_x^{-1} (x - x_b) + (y - K(x))^T (E + F)^{-1} (y - K(x)) + J_c
J = fit to background + fit to observations + constraints
- x = analysis
- x_b = background (usually a short-range forecast from the previous cycle)
- B_x = background error covariance
- K = forward model (nonlinear)
- y = observations
- E + F = R = instrument error + representativeness error
- J_c = constraint term
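To make the cost function concrete, here is a minimal, illustrative sketch in NumPy (not the operational GSI code). All names and dimensions (nx, ny, B, R, H) are invented for illustration, and a linear H stands in for the generally nonlinear forward model K, so the analysis has a closed form.

```python
# Toy 3D-Var sketch: J = (x - x_b)^T B^{-1} (x - x_b) + (y - Hx)^T R^{-1} (y - Hx)
# A linear H stands in for the (generally nonlinear) forward model K;
# R = E + F lumps instrument and representativeness error together.
import numpy as np

rng = np.random.default_rng(0)
nx, ny = 5, 3                         # toy state and observation dimensions

B = 0.5 * np.eye(nx)                  # background error covariance B_x
R = 0.2 * np.eye(ny)                  # observation error covariance (E + F)
H = rng.standard_normal((ny, nx))     # linear stand-in for K
x_b = rng.standard_normal(nx)         # background (short-range forecast)
y = H @ rng.standard_normal(nx)       # synthetic observations

B_inv, R_inv = np.linalg.inv(B), np.linalg.inv(R)

def J(x):
    """Fit to background plus fit to observations (no extra constraint term)."""
    dxb, dy = x - x_b, y - H @ x
    return dxb @ B_inv @ dxb + dy @ R_inv @ dy

# For a linear H the minimizer is available in closed form; operational
# systems instead minimize J iteratively (e.g. with preconditioned conjugate
# gradients), because these matrices are far too large to form explicitly.
A = B_inv + H.T @ R_inv @ H           # analysis precision (half the Hessian)
x_a = x_b + np.linalg.solve(A, H.T @ R_inv @ (y - H @ x_b))
print(f"J(x_b) = {J(x_b):.3f}, J(x_a) = {J(x_a):.3f}")  # J decreases
```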

10 Slide 10 Atmospheric Analysis Problem (same cost-function definitions as Slide 9)

11 Slide 11 The ensembles are run to give the data assimilation system flow-dependent background errors.
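As a rough illustration of how an ensemble supplies flow-dependent background errors, the sketch below forms a sample covariance from ensemble perturbations and blends it with a static B, as hybrid systems do. The 80-member size matches the hybrid configuration mentioned on Slide 44; the state size and blending weight are invented for illustration.

```python
# Sketch: flow-dependent B estimated from an ensemble of forecasts,
# then blended with a static (climatological) covariance as in hybrid DA.
import numpy as np

rng = np.random.default_rng(1)
n_state, n_ens = 10, 80                            # toy state; 80-member ensemble

ensemble = rng.standard_normal((n_state, n_ens))   # stand-in forecast ensemble
mean = ensemble.mean(axis=1, keepdims=True)
Xp = (ensemble - mean) / np.sqrt(n_ens - 1)        # normalized perturbations

B_ens = Xp @ Xp.T                                  # sample ("ensemble") covariance
B_static = np.eye(n_state)                         # static climatological B (toy)
beta = 0.5                                         # blending weight (illustrative)
B_hybrid = beta * B_ens + (1.0 - beta) * B_static
```

In practice the ensemble covariance is also localized to suppress spurious long-range correlations arising from the finite sample size.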

12 Slide 12 Atmospheric Analysis Problem (same cost-function definitions as Slide 9)

13 Slide 13 Radiance Data Assimilated at NCEP
- GOES-15 Sounder: Channels 1-15
- Meteosat-10 SEVIRI: Channels 5-6
- AMSU-A: NOAA-15 Channels 1-10, 12-13, 15; NOAA-18 Channels 1-8, 10-13, 15; NOAA-19 Channels 1-7, 9-13, 15; METOP-A Channels 1-6, 8-13, 15; METOP-B Channels 1-13, 15; AQUA Channels 6, 8-13
- MHS: NOAA-18 Channels 1-5; NOAA-19 Channels 1-5; METOP-A Channels 1-5; METOP-B Channels 1-5
- SSMIS: DMSP-17 Channels 1-7, 24; DMSP-18 Channels 1-7, 24
- HIRS: METOP-A Channels 2-15
- AIRS: AQUA, 120 channels
- IASI: METOP-A, 165 channels; METOP-B, 165 channels
- CrIS: SNPP, 84 channels

14 Slide 14 Derived Products Assimilated at NCEP
Atmospheric Motion Vectors
- MODIS Aqua and Terra
- GOES-E, GOES-W
- MSG
- MTSAT
- VIIRS (pre-operational)
Ozone
- NOAA-19 SBUV profiles
- OMI total column
GPSRO

15 Slide 15 Global Satellite Observing System

16 Slide 16 Data assimilation requires:
- Accurate and fast observation operators (for radiances we use CRTM)
- Quality control, to handle situations where the observation cannot be used
- Bias correction (a sketch follows this list)
- Data quality monitoring
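The bias-correction step is typically a regression of O-B (observation minus background) departures onto a small set of air-mass and instrument predictors. The sketch below is illustrative only: the predictors and coefficients are invented, not the operational set.

```python
# Hedged sketch of predictor-based radiance bias correction: fit O-B
# departures to predictors by least squares, then subtract the fitted bias.
import numpy as np

rng = np.random.default_rng(2)
n_obs = 1000

# Illustrative predictors: constant offset, scan angle, thickness proxy.
predictors = np.column_stack([
    np.ones(n_obs),
    rng.uniform(-50.0, 50.0, n_obs),    # scan angle (degrees)
    rng.standard_normal(n_obs),         # layer-thickness anomaly (proxy)
])
true_coeffs = np.array([0.4, 0.01, 0.2])        # invented "true" bias
omb = predictors @ true_coeffs + 0.1 * rng.standard_normal(n_obs)

coeffs, *_ = np.linalg.lstsq(predictors, omb, rcond=None)
corrected = omb - predictors @ coeffs
print(f"mean O-B before: {omb.mean():+.3f}, after: {corrected.mean():+.3f}")
```

Operationally the coefficients are updated within the assimilation itself (variational bias correction) rather than by a one-off offline fit.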

17 Slide 17 Application of NWP Bias Correction for SSMIS F18
[Figure: O-B departures (K) versus latitude for ascending and descending nodes, globally and per node, before and after bias correction, using Met Office SSMIS bias-correction predictors.]

18 Slide 18 Quality Monitoring of Satellite Data
[Figure: time series for AIRS Channel 453 showing an increase in the standard deviation of fits to the guess on 26 March 2007.]

19 Slide 19 Data assimilation is statistical
A significant positive impact at day 8 still has a number of cases with negative impact.

20 Slide 20 Data assimilation is statistical (2)
Histogram of the individual impact of each NOAA-19 AMSU-A Ch 8 field-of-view.

21 Slide 21 Verification

22 Slide 22 Verification
Obviously the ultimate purpose of assimilating any kind of observation into NWP models is to improve the model forecast. Therefore robust methods for determining the impact of the observations are required, and operational centers typically have a wide range of tools available for this.
The initial aim in any data assimilation system is to use the observations in such a way as to improve the accuracy of the analysis. Sometimes these improvements do not translate directly into increases in forecast skill, due to the characteristics of the forecast model itself.
In general this presentation considers aircraft observations of temperature, humidity, and wind vector together.
But first, we need to define what we can use as the "truth" when verifying our models...

23 Slide 23 What is truth? Observations?
Normally this means radiosondes (but surface observations and aircraft are also used). This means the statistics are biased towards densely populated regions in the northern temperate latitudes. We should also (but usually don't) take account of the errors in the observations themselves.
Potentially we could also use satellite observations (e.g., radiances) for verification, as this would give more global sampling. However, comparisons in radiance space would be less intuitive.

24 Slide 24 What is truth? Analyses?
The analysis should be the best estimate of the atmospheric state, combining the information from the observations in the current and (through the forecast model) previous model cycles. Given that the analysis does not suffer from the spatial sampling issues of conventional observations, it seems to be the ideal "truth".
However, for forecast ranges of approximately three days or less, it is found that forecast skill (and even the sign of that skill) is highly dependent on the verifying analysis (control, test, or independent) used. Also, for certain types of changes, additional structure added to the analysis fields can be penalized by the usual forecast skill measures, as it is easier to obtain a good fit to smooth rather than complex fields.

25 Slide 25 What is truth?
In practice, both observations and analyses are used in verifying forecast skill.

26 Slide 26 Data Denial
Data denial or Observing System Experiments (OSEs) are simply a way of investigating the impact of an observation or change by running full forecast experiments with and without the element to be tested.
OSEs are expensive to run, particularly at full operational resolution, and they need to be run for many forecast cycles (60 days is typical for global forecast systems) before statistically significant results are obtained. Individual case studies are generally not trusted as a way of demonstrating forecast impact because of the dominance of statistical fluctuations.
Forecast impact scores are generally presented with error bars indicating statistical significance (a sketch of such a test follows). Scores are normally given in terms of differences between forecasts and "truth", expressed as RMS error or anomaly correlation coefficients (see next slide).
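As a sketch of how such error bars might be derived (synthetic numbers, and ignoring the serial correlation between consecutive forecasts that operational verification must account for):

```python
# Paired comparison of daily AC scores from a control run and a data-denial
# run over a 60-day OSE, with a paired t-test for significance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_days = 60                                            # typical OSE length
control = 0.85 + 0.03 * rng.standard_normal(n_days)    # synthetic daily ACs
denial = control - 0.01 + 0.02 * rng.standard_normal(n_days)

diff = control - denial
t_stat, p_value = stats.ttest_rel(control, denial)
print(f"mean AC difference: {diff.mean():+.4f} (p = {p_value:.3f})")
```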

27 Slide 27 Anomaly correlation coefficients
Forecast skill is often given in terms of anomaly correlation coefficients (ACC scores). This is a measure of the correlation between the forecast (f) and the analysis (a) after removing the climatological average field (c) to mitigate the seasonal signal. Scores of 100% are perfect forecasts; 60% is considered the lower limit for useful skill.
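A common form of the ACC, consistent with the definitions above (operational scores may additionally apply area weighting and subtract the mean anomaly), is:

```latex
\mathrm{ACC} \;=\;
\frac{\sum \left(f - c\right)\left(a - c\right)}
     {\sqrt{\sum \left(f - c\right)^{2} \, \sum \left(a - c\right)^{2}}}
```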

28 Slide 28 Forecast Sensitivity to Observations (FSO)
[Schematic: the background x_b at t = -6 hrs is propagated through the 6-hr assimilation window; observations produce the analysis x_a at t = 0. The 30-hr forecast from the background (x_g) and the 24-hr forecast from the analysis (x_f) are both verified against the "truth" x_t at t = 24 hrs, with errors e_g and e_f respectively.]
Observations move the forecast from the background trajectory to the trajectory starting from the new analysis. In this context, "OBSERVATION IMPACT" is the effect of observations on the difference in forecast error norms C(e_f - e_g).
Langland and Baker (Tellus, 2004), Gelaro et al. (2007), Morneau et al. (2006)

29 Slide 29 Forecast Sensitivity to Observations (FSO)
[Same schematic as Slide 28, with the key relation:]
(e_f - e_g) = M K δy
- δy = observation innovations (y - H(x_b))
- K = Kalman gain (transforms observation innovations to analysis increments)
- M = forecast model (transforms analysis to forecast)
Langland and Baker (Tellus, 2004), Gelaro et al. (2007), Morneau et al. (2006)

30 Slide 30 Forecast Sensitivity to Observations (FSO)
We want the sensitivity of the forecast error to the observation increments, so we apply the adjoint operators to C(e_f - e_g) = C M K δy:
δe_{f-g} = ½ [M K δy]^T C (e_f - e_g) = ½ δy^T K^T M^T C (e_f - e_g)

31 Slide 31 Forecast Sensitivity to Observations (FSO)
δe_{f-g} = ½ [M K δy]^T C (e_f - e_g) = ½ δy^T K^T M^T C (e_f - e_g)
- K^T: adjoint of the data assimilation scheme
- M^T: adjoint of the linearized forecast model*
Usually these both require approximations to be made (including linearity), and so this method is limited to forecast ranges of less than ~48 hours.
*Already required if running 4DVar
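A toy numerical illustration of this adjoint-based estimate follows. Tiny random matrices stand in for K, M, and C; real systems never form these matrices explicitly, they apply the adjoint operators.

```python
# Per-observation FSO impact: de = 1/2 * dy^T K^T M^T C (e_f - e_g).
# Element i of `impact` attributes part of the forecast-error change to
# observation i; negative values mean that ob reduced forecast error.
import numpy as np

rng = np.random.default_rng(4)
n_state, n_obs = 8, 5
K = rng.standard_normal((n_state, n_obs))     # Kalman gain (toy)
M = rng.standard_normal((n_state, n_state))   # tangent-linear model (toy)
C = np.eye(n_state)                           # error norm (e.g. energy norm)
dy = rng.standard_normal(n_obs)               # innovations y - H(x_b)
e_f = rng.standard_normal(n_state)            # error of forecast from analysis
e_g = rng.standard_normal(n_state)            # error of forecast from background

impact = 0.5 * dy * (K.T @ M.T @ C @ (e_f - e_g))
print("per-ob impacts:", impact, " total:", impact.sum())
```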

32 Slide 32 Forecast Sensitivity to Observations (FSO)

33 Slide 33 Advantages and Disadvantages of FSO
Advantages
- Can infer the impact of observations to whatever level of detail is required (e.g. ob by ob, channel by channel) without having to re-run the full system repeatedly.
- Useful for determining the relative impact of observations and for quality control of bad observations.
- Allows the impact of observations on the forecast to be monitored on a daily basis.
Disadvantages
- Limited to short-range forecasts, so there is sensitivity to the accuracy of the verifying analysis.
- Impact is always in the context of the total observing system as used: forecast impacts of an observation type may change as other observations are added or removed.

34 Slide 34 Data denial experiments
At NCEP we are still developing an FSO capability based on the EnKF methodology, so we will present an overview of observation impact based on the data denial method.

35 Slide 35 Data Denial Experiments (Jim Jung and Lars Peter Riishojgaard)

36 Slide 36 Background
- NCEP operational GDAS/GFS, May 2011 version
- T574L64 operational resolution
- Two seasons: Aug-Sep 2010 and Dec 2010-Jan 2011
- Cycled experiments; 7-day forecast at 00Z
- Control late analysis (GDAS) used for verification
- Not run on the NCEP operations computer

37 Slide 37 500 hPa Anomaly Correlations, 15 Aug – 30 Sep 2010: No Satellite / No Conventional Data (Northern Hemisphere and Southern Hemisphere)

38 Slide 38 500 hPa Anomaly Correlations, 15 Aug – 30 Sep 2010: No AMSU-A / No MHS (Northern Hemisphere and Southern Hemisphere)

39 Slide 39 500 hPa Anomaly Correlations, 15 Aug – 30 Sep 2010: No GPS-RO / No AMV (Northern Hemisphere and Southern Hemisphere)

40 Slide 40 500 hPa Anomaly Correlations, 15 Aug – 30 Sep 2010: No Rawinsondes / No Aircraft (Northern Hemisphere and Southern Hemisphere)

41 Slide 41 500 hPa Anomaly Correlations, 15 Aug – 30 Sep 2010: No Hyperspectral Infrared (Northern Hemisphere and Southern Hemisphere)

42 Slide 42 500 hPa, Day 5, Instrument Average AC scores

43 Slide 43 Data Addition Experiments (Jim Jung and Mitch Goldberg)

44 Slide 44 Background
- Pre-2015 version of the GDAS/GFS: hybrid (80 ensembles, T254); T670 semi-Lagrangian
- Winter and summer seasons, 2014
- Baseline is conventional data and GPS-RO
- Add single instruments: ATMS (SNPP), CrIS (SNPP); AMSU-A, MHS (Metop-B), IASI (Metop-A); SSMIS (F18), AIRS (Aqua)
- Verified against the late control analysis with all operational data

45 Slide 45 250 hPa Northern Hemisphere AC scores for 20140801 – 20140831 00Z

46 Slide 46 250 hPa Southern Hemisphere AC scores for 20140801 – 20140831 00Z

47 Slide 47 500 hPa Northern Hemisphere AC scores for 20140801 – 20140831 00Z

48 Slide 48 500 hPa Southern Hemisphere AC scores for 20140801 – 20140831 00Z

49 Slide 49 1000 hPa Northern Hemisphere AC scores for 20140801 – 20140831 00Z

50 Slide 50 1000 hPa Southern Hemisphere AC scores for 20140801 – 20140831 00Z

51 Slide 51 200 hPa Tropical Vector Wind RMSE for 20140801 – 20140831 00Z

52 Slide 52 850 hPa Tropical Vector Wind RMSE for 20140801 – 20140831 00Z

53 Slide 53 JPSS and GOES-R

54 Slide 54 So how will JPSS/GOES-R fit in?
- ATMS and CrIS will be assimilated closely following our experience with the NPP instruments.
  - We will continue to move towards greater use of cloudy radiances.
  - We will make more aggressive use of CrIS channels, both in terms of observation error and the number used.
- VIIRS will be used primarily through the winds and SST analysis.
- OMPS ozone.
- GOES-R will provide an initial impact through the AMVs.
- There are significant challenges still with using the full GOES-R radiance data stream, so averaged products will be useful.

55 Slide 55 Summary

56 Slide 56 Summary
- Data assimilation is an efficient form of data fusion.
- The global observing system combined with the data assimilation system is very robust: individual data types have an incremental impact on forecast skill, but the combined effect is very large.
- Data addition experiments can give greater insight into the impact of each individual data type.
- We can expect rapid implementation (dependent on model upgrade implementation cycles) of CrIS, ATMS, VIIRS AMVs, and GOES-R AMVs based on current experience with existing data.

57 Slide 57

