Quality of Model and Error Analysis in Variational Data Assimilation. François-Xavier LE DIMET (Université Joseph Fourier + INRIA, Projet IDOPT, Grenoble, France) and Victor SHUTYAEV (Institute of Numerical Mathematics, Russian Academy of Sciences).

Prediction: what information is necessary?
- Model: laws of conservation (mass, energy), laws of behaviour, parametrization of physical processes
- Observations: in situ and/or remote
- Statistics
- Images

Forecast
- Produced by the integration of the model from an initial condition.
- Problem: how to link together heterogeneous sources of information.
- Heterogeneity in: nature, quality, density.

Basic Problem
- U and V are the control variables, V being an error on the model.
- J is the cost function.
- U* and V* minimize J.
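The equations of this slide are not reproduced in the transcript. As a minimal sketch, assuming a generic weak-constraint formulation consistent with the notation above (the model operator F, observation operator C, background U_b and weights α, β are assumptions, not taken from the slides):

$$
\begin{aligned}
&\frac{\partial X}{\partial t} = F(X) + V, \qquad X(0) = U,\\
&J(U,V) = \frac{1}{2}\int_0^T \|CX - X_{\mathrm{obs}}\|^2\,dt
        + \frac{\alpha}{2}\|U - U_b\|^2
        + \frac{\beta}{2}\int_0^T \|V\|^2\,dt,\\
&(U^*,V^*) = \arg\min_{U,V}\, J(U,V).
\end{aligned}
$$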

Optimality System
- P is the adjoint variable.
- The gradients are computed by solving the adjoint model; an optimization method is then applied.
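Again the equations were on the slide itself. A minimal sketch of the corresponding adjoint system and gradients, under the same assumed notation as in the sketch above:

$$
\begin{aligned}
&-\frac{\partial P}{\partial t} = \left[\frac{\partial F}{\partial X}\right]^{*} P
      - C^{*}\bigl(CX - X_{\mathrm{obs}}\bigr), \qquad P(T) = 0,\\
&\nabla_U J = \alpha\,(U - U_b) - P(0), \qquad
 \nabla_V J = \beta\,V - P .
\end{aligned}
$$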

Errors
- On the model: physical approximations (e.g. parametrization of subgrid processes), numerical discretization, numerical algorithms (stopping criteria for iterative methods).
- On the observations: physical measurement, sampling. Some "pseudo-observations", obtained from remote sensing, are themselves the result of solving an inverse problem.

Sensitivity of the initial condition with respect to errors on the model and on the observations
- The prediction is highly dependent on the initial condition.
- Models have errors.
- Observations have errors.
- What is the sensitivity of the initial condition to these errors?

Optimality System: including errors on the model and on the observations

Second-order adjoint
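The second-order adjoint equations appeared only as images on the slide. As a hedged sketch (not the authors' exact formulation), the Hessian-vector product of J in a direction (δU, δV) can be obtained by solving a tangent linear model and a second-order adjoint model:

$$
\begin{aligned}
&\frac{\partial\,\delta X}{\partial t} = \frac{\partial F}{\partial X}\,\delta X + \delta V, \qquad \delta X(0) = \delta U,\\
&-\frac{\partial Q}{\partial t} = \left[\frac{\partial F}{\partial X}\right]^{*} Q
   + \left[\frac{\partial^2 F}{\partial X^2}\,\delta X\right]^{*} P
   - C^{*}C\,\delta X, \qquad Q(T) = 0,\\
&\mathcal{H}\,(\delta U,\delta V) = \bigl(\alpha\,\delta U - Q(0),\;\; \beta\,\delta V - Q\bigr).
\end{aligned}
$$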

Models and Data
- Is it necessary to improve a model if the data are not changed?
- For a given model, what is the "best" set of data?
- What is the adequacy between models and data?

A simple numerical experiment
- Burgers' equation with homogeneous boundary conditions.
- The exact solution is known.
- The observations are without error.
- Numerical solutions are computed with different discretizations.
- The assimilation is performed between t = 0 and t = 1.
- Then the flow is predicted at t = 2.
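The slides do not give the discretization; the following is a minimal sketch of the kind of forward model involved, assuming a simple explicit finite-difference scheme and placeholder parameters (grid size, viscosity, initial condition), not the authors' code:

```python
import numpy as np

def burgers_forward(u0, nu, dx, dt, nsteps):
    """Explicit finite-difference solver for the viscous Burgers equation
    u_t + u u_x = nu u_xx with homogeneous Dirichlet boundary conditions."""
    u = u0.copy()
    for _ in range(nsteps):
        un = u.copy()
        # centred advection + centred diffusion on interior points
        u[1:-1] = (un[1:-1]
                   - dt * un[1:-1] * (un[2:] - un[:-2]) / (2.0 * dx)
                   + nu * dt * (un[2:] - 2.0 * un[1:-1] + un[:-2]) / dx**2)
        u[0] = u[-1] = 0.0   # homogeneous boundary conditions
    return u

# Hypothetical setup: assimilate on [0, 1], then predict at t = 2
nx, nu_visc = 101, 0.01
x = np.linspace(0.0, 1.0, nx)
dx, dt = x[1] - x[0], 2e-4               # dt small enough for stability
u0 = np.sin(np.pi * x)                   # placeholder initial condition
u_t1 = burgers_forward(u0, nu_visc, dx, dt, round(1.0 / dt))   # end of assimilation window
u_t2 = burgers_forward(u_t1, nu_visc, dx, dt, round(1.0 / dt)) # prediction at t = 2
```

Repeating the same assimilation with coarser grids (larger dx) introduces the model error discussed on the next slide while the observations stay unchanged.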

Partial Conclusion
- The error in the model is introduced through the discretization.
- The observations remain the same whatever the discretization.
- This shows that the forecast can be degraded even when the model is upgraded.
- Only the quality of the optimality system (O.S.) makes sense.

Remark 1
- How to improve the link between data and models?
- C is the operator mapping the space of the state variable into the space of observations.
- We considered the linear case.

Remark 2: ensemble prediction
- To estimate the impact of uncertainties on the prediction, several predictions are performed with perturbed initial conditions.
- But the initial condition is an artefact: there is no natural error on it. The error comes from the data, through the data assimilation process.
- If the errors on the data are Gaussian, what about the initial condition?

Because data assimilation is a nonlinear process, the initial condition is no longer Gaussian.

Control of the error

Choice of the basis

Remark
- The model has several sources of errors.
- The discretization error may depend on the second derivative: it can be identified in a basis of the first eigenvectors of the Laplacian.
- The systematic error may be estimated using the eigenvectors of the correlation matrix.
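For the one-dimensional Burgers experiment with homogeneous boundary conditions, this Laplacian basis is explicit (a standard result, stated here assuming the spatial domain is [0, 1]):

$$
-\frac{d^2\phi_k}{dx^2} = \lambda_k\,\phi_k,\quad \phi_k(0)=\phi_k(1)=0
\;\;\Longrightarrow\;\;
\phi_k(x)=\sqrt{2}\,\sin(k\pi x),\quad \lambda_k=(k\pi)^2,\quad k=1,2,\dots
$$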

Numerical experiment
- With Burgers' equation.
- The Laplacian and the covariance matrix have been considered separately, then jointly.
- The number of vectors considered in the correction term varies.

With the eigenvectors of the Laplacian

Model error estimation
- controlled system (the model)
- cost function
- optimality conditions
- adjoint system (to calculate the gradient)

Reduction of the size of the controlled problem: change the space bases. Suppose {φ_i} is a basis of the phase space and {ψ_j(t)} are time-dependent basis functions on [0, T]; the model error is expanded on this product basis, so the control variables become the expansion coefficients and the size of the control space is reduced.
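The expansion itself appeared as an equation on the slide. A sketch in the generic notation introduced above (the symbols φ_i, ψ_j, a_{ij} and the sizes m, n are placeholders):

$$
V(x,t)\;\approx\;\sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij}\,\phi_i(x)\,\psi_j(t),
\qquad \text{control variables } \{a_{ij}\}, \quad \text{control-space size } m\times n .
$$

The sizes quoted in the experiments below (1083 × 48, 380 × 48, 380 × 8) appear consistent with such a (spatial vectors) × (time functions) factorization.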

Optimality conditions for the estimation of model errors after size reduction: if P is the solution of the adjoint system, we search for the optimal values of the expansion coefficients that minimize J.
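A sketch of the resulting gradient, under the same assumed weak-constraint formulation as above and assuming orthonormal bases (the weight β is an assumption):

$$
\frac{\partial J}{\partial a_{ij}}
= \beta\,a_{ij} \;-\; \int_0^T \psi_j(t)\,\bigl\langle \phi_i,\;P(\cdot,t)\bigr\rangle\,dt ,
\qquad \text{with optimality condition } \frac{\partial J}{\partial a_{ij}} = 0 .
$$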

Problem: how to choose the spatial basis?
- Consider the directions of fastest error propagation.
- Define an amplification factor and choose the basis as the leading eigenvectors of the corresponding operator.
- These eigenvectors are computed with the Lanczos algorithm.
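The amplification operator is not reproduced in the transcript. A minimal sketch of the computation described, assuming the operator is the normal operator MᵀM of a tangent linear propagator M available as a matrix-vector product (all names and sizes below are placeholders):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

n = 21 * 21                                     # hypothetical (coarse-grid) state size
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n)) / np.sqrt(n)    # stand-in for the tangent linear model

def amplification_matvec(v):
    """Apply M^T M v; its leading eigenvectors are the directions of
    largest amplification (fastest error growth)."""
    return M.T @ (M @ v)

A = LinearOperator((n, n), matvec=amplification_matvec)
# eigsh relies on ARPACK's Lanczos iterations for symmetric operators
vals, vecs = eigsh(A, k=10, which='LM')
basis = vecs[:, ::-1]                           # leading eigenvectors first
```

In practice M would not be formed explicitly: each matvec would run the tangent linear model once and its adjoint once.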

Numerical experiments with another basis
- "Correct" model: fine discretization, a domain with 41 × 41 grid points.
- Simulated observations: taken from the results of the "correct" model.
- "Incorrect" model: coarse discretization, a domain with 21 × 21 grid points.

The difference of the potential field between the two models after 8 hours of integration.

Experiments without size reduction (1083 × 48): the discrepancy between the models at the end of the integration, before and after optimization.

Experiments with size reduction (380 × 48): the discrepancy between the models at the end of the integration, before and after optimization.

Experiments with size reduction (380 × 8): the discrepancy between the models at the end of the integration, before and after optimization.

Conclusion
- For data assimilation, controlling the model error is a significant improvement.
- In terms of software development, it is cheap.
- In terms of computational cost, it could be expensive.
- It is a powerful tool for the analysis and identification of errors.