Review of "Covariance Localization" in Ensemble Filters
Tom Hamill, NOAA Earth System Research Laboratory, Boulder, CO

Canonical ensemble Kalman filter update equations. H = (possibly nonlinear) operator from model to observation space; x = state vector (subscript i for the ith member). Notes: (1) An ensemble of parallel data assimilation cycles is conducted, here assimilating perturbed observations. (2) Background-error covariances are estimated from the ensemble.
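The slide's equations themselves are not reproduced in this transcript; for reference, the standard perturbed-observation EnKF update consistent with the notes above can be written as

  x_i^a = x_i^b + K [ y + e_i - H(x_i^b) ],   e_i ~ N(0, R),   i = 1, ..., n
  K = P^b H^T ( H P^b H^T + R )^{-1}
  P^b ~ (1/(n-1)) sum_i (x_i^b - xbar^b)(x_i^b - xbar^b)^T

where superscripts b and a denote background and analysis, y is the observation vector with error covariance R, and, for nonlinear H, the products P^b H^T and H P^b H^T are computed directly from the ensemble perturbations of x and H(x).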

Example of covariance localization. Background-error correlations estimated from 25 members of a 200-member ensemble exhibit a large amount of structure that does not appear to have any physical meaning. Without correction, an observation at the dotted location would produce increments across the globe. The proposed solution is element-wise multiplication of the ensemble estimate (a) with a smooth correlation function (c) to produce (d), which now resembles the large-ensemble estimate (b). This has been dubbed "covariance localization." (Figure from Hamill, Chapter 6 of "Predictability of Weather and Climate"; the dot marks the observation location.)

How covariance localization is commonly implemented in the EnKF. The localization function is typically a compactly supported function, similar in shape to a Gaussian and equal to 1.0 at the observation location, applied to the ensemble covariances via a Schur (element-wise) product. See Gaspari and Cohn (1999, QJRMS, 125), eq. 4.10.
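For concreteness, here is a minimal Python sketch of this implementation (not from the slides): a Gaspari-Cohn taper following eq. 4.10 of the cited paper, and a Kalman gain built from Schur-product-localized ensemble covariances, assuming a linear observation operator H and precomputed distance arrays.

```python
import numpy as np

def gaspari_cohn(dist, c):
    """Gaspari and Cohn (1999, eq. 4.10) fifth-order, compactly supported taper.
    dist: array of separation distances; c: half-width (taper reaches 0 at 2c)."""
    r = np.abs(np.asarray(dist, dtype=float)) / c
    w = np.zeros_like(r)
    inner = r <= 1.0
    outer = (r > 1.0) & (r < 2.0)
    ri, ro = r[inner], r[outer]
    w[inner] = -0.25 * ri**5 + 0.5 * ri**4 + 0.625 * ri**3 - (5.0 / 3.0) * ri**2 + 1.0
    w[outer] = ((1.0 / 12.0) * ro**5 - 0.5 * ro**4 + 0.625 * ro**3
                + (5.0 / 3.0) * ro**2 - 5.0 * ro + 4.0 - (2.0 / 3.0) / ro)
    return w

def localized_gain(Xb, H, R, dist_xy, dist_yy, c):
    """Kalman gain with Schur-product (element-wise) localization.
    Xb: (nstate, nens) background ensemble; H: (nobs, nstate) linear obs operator;
    R: (nobs, nobs) obs-error covariance; dist_xy: (nstate, nobs) and
    dist_yy: (nobs, nobs) distance arrays; c: localization half-width."""
    nens = Xb.shape[1]
    Xp = Xb - Xb.mean(axis=1, keepdims=True)       # ensemble perturbations
    HXp = H @ Xp                                   # perturbations in observation space
    PbHt = Xp @ HXp.T / (nens - 1)                 # ensemble estimate of P^b H^T
    HPbHt = HXp @ HXp.T / (nens - 1)               # ensemble estimate of H P^b H^T
    PbHt_loc = gaspari_cohn(dist_xy, c) * PbHt     # Schur product with the taper
    HPbHt_loc = gaspari_cohn(dist_yy, c) * HPbHt
    return PbHt_loc @ np.linalg.inv(HPbHt_loc + R)
```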

Why localization? Relative sampling error depends on ensemble size and correlation. Consider a model with two grid points, a known true background-error covariance, an inaccurate ensemble estimate of that covariance, and an observation at grid point 1 with known error variance. How do the errors change with correlation and ensemble size? The relative error is large when the correlation and/or the ensemble size is small. Ref: Hamill et al., 2001, MWR, 129.
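A small Monte Carlo sketch of the same point (illustrative only, not the slide's calculation), assuming a two-point state with unit variances and correlation rho, and a unit-variance observation at point 1; it estimates the relative error in the ensemble-derived gain used to update point 2.

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_gain_error(rho, nens, ntrials=2000, obs_var=1.0):
    """Relative sampling error of the gain that updates grid point 2 from an
    observation at grid point 1, for a two-point state with correlation rho."""
    true_cov = np.array([[1.0, rho], [rho, 1.0]])
    k_true = rho / (1.0 + obs_var)                 # K_2 = cov(x2, x1) / (var(x1) + R)
    errs = []
    for _ in range(ntrials):
        ens = rng.multivariate_normal(np.zeros(2), true_cov, size=nens)
        c = np.cov(ens, rowvar=False)              # sample background-error covariance
        k_hat = c[1, 0] / (c[0, 0] + obs_var)      # ensemble-estimated gain
        errs.append(abs(k_hat - k_true))
    return np.mean(errs) / abs(k_true)             # error relative to the true gain

for nens in (10, 25, 100):
    for rho in (0.2, 0.5, 0.9):
        print(nens, rho, round(relative_gain_error(rho, nens), 2))
```

The printed errors grow as the ensemble shrinks and as the correlation weakens, which is the slide's argument for damping remote (weakly correlated) covariances.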

Spreads and errors as a function of localization length scale. (1) Small ensembles minimize error with a short length scale, large ensembles with a longer length scale; that is, less localization is needed for a large ensemble. (2) The longer the localization length scale, the less spread in the posterior ensemble. Ref: Houtekamer and Mitchell, 2001, MWR, 129.

Localization adds effective degrees of freedom to the background-error covariance model. Experiments with a toy two-layer primitive-equation model: with few members, the eigenvalue spectrum of the background-error covariance is very steep, and an n-member ensemble provides only n-1 degrees of freedom; with more members, the spectrum is less steep. Covariance localization increases the effective degrees of freedom and flattens the eigenvalue spectrum, so there are more ways the ensemble can adjust to the data. At the end of the data assimilation, though, the posterior ensemble still has only n-1 degrees of freedom. Ref: Hamill et al., 2001, MWR, 129.
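A quick numerical illustration of the degrees-of-freedom argument (not from the slides), assuming a 40-variable state, a 10-member ensemble, and the gaspari_cohn taper from the earlier sketch.

```python
import numpy as np

# (reuses gaspari_cohn() from the sketch above)
nstate, nens = 40, 10
rng = np.random.default_rng(1)
Xp = rng.standard_normal((nstate, nens))
Xp -= Xp.mean(axis=1, keepdims=True)             # remove the ensemble mean
Pb = Xp @ Xp.T / (nens - 1)                      # raw sample covariance

grid = np.arange(nstate, dtype=float)
rho = gaspari_cohn(np.abs(grid[:, None] - grid[None, :]), c=5.0)
Pb_loc = rho * Pb                                # Schur-product localization

print(np.linalg.matrix_rank(Pb))                 # at most nens - 1 (= 9 here)
print(np.linalg.matrix_rank(Pb_loc))             # larger: more effective degrees of freedom
```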

Increasing the localization scale increases CPU run time. Ensemble filters commonly process the observations serially. In the figure, the "+" signs are grid points; the red dot is the observation location; the blue disc is the area spanned by the localization function; and the dashed line is the box enclosing all grid points that must be updated given this localization scale.

Covariance localization introduces imbalances. Scaled imbalance (imbalance / increment magnitude) for the three gravest vertical modes, computed from a pair of 32-member ensembles; imbalance is measured by comparing a digitally filtered state to the unfiltered state. The scaled imbalance is larger at shorter correlation length scales. The imbalance may limit the growth of spread, making the background-error covariance model estimated from the ensemble less accurate. Ref: Mitchell and Houtekamer, 2002, MWR, 130. See also J. Kepert's talk and A. Lorenc, 2003, QJRMS, 129.

Synthesis: "Localization scale is a tuning parameter that trades off computational effort, sampling error, and imbalance error." (Ref: Zhou et al., 2008, MWR, 136.)

Some alternatives to standard covariance localization

Localization by inversely scaling observation-error covariances. Consider grid points x1, x2, x3 and an observation at x2 with H = I and error variance R. Something akin to covariance localization can be obtained by inversely scaling the observation-error variance with distance rather than modifying the background-error covariances. Pioneered in the LETKF. Ref: Hunt et al., 2007, Physica D, 230.
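In this approach (often called R-localization), each observation's error variance is inflated with distance from the grid point being updated; a minimal sketch of the idea, with notation that is mine rather than the slide's:

  R_loc(d) = R / rho(d),

where rho(d) is a localization weight in (0, 1] (e.g., the Gaspari-Cohn taper) that decreases toward 0 with the distance d between the observation and the analysis grid point, so remote observations are effectively assigned near-infinite error variance and exert negligible influence.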

"Hierarchical" ensemble filter. Suppose there are m groups of n-member ensembles. This gives m estimates of the background-error covariance relationship between the observation location and a model grid point (*). The Kalman filter update is effectively a linear regression: compute the increment in a state variable x given the increment in the observed variable y_o. Hence m sample values of the regression coefficient beta (i.e., of the gain K at *) are available. Assume the correct beta is a random draw from the same distribution the m samples were drawn from. A "regression confidence factor" alpha is chosen to minimize the expected error of the scaled regression, giving an "on-the-fly" moderation of covariances: replace beta with alpha*beta. Ref: Jeff Anderson, 2007, Physica D, 230.
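One way such a factor could be computed from the m group estimates is sketched below; this is an illustration of the idea under the stated assumption that the true coefficient is a draw from the same distribution as the group samples, not necessarily Anderson's exact formula.

```python
import numpy as np

def regression_confidence_factor(betas):
    """Given m group estimates of a regression (gain) coefficient, return the
    scalar alpha minimizing sum over i != j of (alpha*beta_i - beta_j)**2,
    i.e., treating the 'true' coefficient as another draw from the same
    distribution as the group estimates."""
    b = np.asarray(betas, dtype=float)
    m = b.size
    num = b.sum() ** 2 - np.sum(b ** 2)       # sum over i != j of beta_i * beta_j
    den = (m - 1) * np.sum(b ** 2)            # sum over i != j of beta_i**2
    return max(0.0, num / den)                # clip negative values to zero

print(regression_confidence_factor([0.9, 1.0, 1.1, 0.95]))   # small spread: alpha near 1
print(regression_confidence_factor([0.8, -0.6, 1.4, -0.2]))  # large spread: alpha near 0
```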

Anderson's hierarchical ensemble filter (continued). Hovmoller diagram of the regression confidence factor for Lorenz '96 model assimilations, for observations at the red line, using a 16-group ensemble with 14 members per group. The minimizing alpha is near 0 when the spread of beta is large relative to its mean, and near 1 when the spread is small relative to the mean. Is the variability with time a feature or a bug? (Colors: regression confidence factor.)

Example of where the regression confidence factor may induce unrealistic increments. White lines: background sea-level pressure. Colors: background precipitable water. The observation is 1 hPa less than the background at the dot. Red contours: precipitable-water analysis increment due to the SLP observation (dashed contours are negative). Gaspari-Cohn localization is used here.

Example of where the regression confidence factor may induce unrealistic increments. Imagine the gain calculation along the black line, and how it might vary if several parallel ensemble filters were run.

Gain calculations along the cross-section (axes: K (gain) vs. distance, left to right). Here, the spread in gains is small relative to the magnitude of the gain; hence alpha is approximately 1.

Gain calculations along the cross-section (axes: K (gain) vs. distance, left to right). Here, the spread in gains is large relative to the magnitude of the gain; hence alpha is approximately 0.

Gain calculations along the cross-section (axes: K (gain) vs. distance, left to right). Here, the spread in gains is large relative to the magnitude of the gain; hence alpha is approximately 0. A second panel shows the resulting regression confidence factor, between 0 and 1, as a function of distance along the cross-section.

Gain calculations along the cross-section (axes: alpha*K (moderated gain) vs. distance, left to right). The product of alpha and K now produces a larger region in the middle with nearly zero gain, and hence near-zero increment. This may be, in some circumstances, non-physical.

Other localization ideas. Multiscale filter: replaces, at each update time, the sample covariance with a multiscale tree composed of nodes distributed over a relatively small number of discrete scales. – Ref: Zhou et al., 2008, MWR, 136.

Other localization ideas. Spectral localization: filter covariances after transformation to spectral space, or combine spectral and grid-point localization. – Ref: Buehner and Charron, 2007, QJRMS, 133.

"Successive Covariance Localization" (Fuqing Zhang, Penn State)
(1) Adjust the ensemble to a regular network of observations using a broad localization length scale.
– Where observations are sparse, use of a short localization length scale would result in imbalances, "holes" in the analysis-error spread, and increased analysis errors.
(2) Adjust the modified ensemble further to the high-density observations using a short localization length scale.
– Where observations are dense, use of a long localization length scale would smooth out mesoscale detail, trust the ensemble covariance model too much, and strip too much variance from the ensemble.
An effective compromise that works across both data-dense and data-sparse regions? (A schematic sketch follows.)
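A schematic sketch of the two-pass procedure, with hypothetical function and argument names; enkf_update stands for any serial EnKF analysis step that accepts a localization length scale.

```python
import numpy as np

def successive_covariance_localization(ens, obs_vals, obs_locs, obs_err,
                                       coarse_spacing, broad_scale, short_scale,
                                       enkf_update):
    """Schematic two-pass update: a coarse, regular subset of the observations is
    assimilated with a broad localization scale, then the remaining dense
    observations with a short scale. enkf_update(ens, vals, locs, err, loc_scale)
    is a hypothetical EnKF analysis step."""
    # Thin the full network to a regular coarse subset (every coarse_spacing-th ob).
    coarse = np.arange(0, len(obs_vals), coarse_spacing)
    dense = np.setdiff1d(np.arange(len(obs_vals)), coarse)
    # Pass 1: coarse network, broad localization length scale.
    ens = enkf_update(ens, obs_vals[coarse], obs_locs[coarse], obs_err[coarse], broad_scale)
    # Pass 2: remaining high-density observations, short localization length scale.
    ens = enkf_update(ens, obs_vals[dense], obs_locs[dense], obs_err[dense], short_scale)
    return ens
```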

Change model variables? Localization in (z, psi, chi) space (streamfunction, velocity potential) instead of (z, u, v)? – Ref: J. Kepert, 2006, tinyurl.com/4j539t. – Figure to right: errors for normal localization, for localization in (z, psi, chi) space, and for localization that additionally zeroes out selected cross-variable covariances.

Ensemble Correlations Raised to a Power (ECO-RAP) Craig Bishop will present this idea in more detail in the next talk.

Conclusions. Some form of covariance localization is a practical necessity for EnKF NWP applications. Because of its drawbacks (e.g., the imbalances it introduces), there is an active search for improved localization algorithms. There are many interesting alternatives, but the community has not yet united behind one obviously superior approach; the alternatives tend to have their own sets of strengths and weaknesses.