
Yalchin Efendiev Texas A&M University



Presentation on theme: "Yalchin Efendiev Texas A&M University"— Presentation transcript:

1 Yalchin Efendiev Texas A&M University
WEP 2010. Multiscale modeling and data assimilation: mathematical and algorithmic aspects. Yalchin Efendiev, Texas A&M University.

2 Multiscale models
What? Techniques that allow moving from one scale to another. Why? (1) Many fine-scale models are prohibitively expensive to simulate; (2) the quantities of interest or observables are coarse-scale variables. What we have done so far: a unified theory of multiscale modeling for spatial fields (images, engineering applications) by defining appropriate distance functions. What we will do: brief review; multiscale theory for problems with time scales; presence of uncertainties; a general concept.

3 Comparing fine and coarse?
(Figure: fine-scale field x vs. coarse-scale field y.) Similarity: d - d* ~ ||x - y||.

4 Coarsening

5 Multiscale model in porous media app.
Parallel computation is important in these applications

6 Coarsening across ensemble
Assume there are multiple images and we would like to coarsen the whole ensemble. Examples: porous media, face detection,… Clustering of realizations.
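The "clustering of realizations" idea can be sketched with a plain k-means over flattened fields. This is an illustrative sketch only; the function name, the Euclidean distance, and the parameters are assumptions, not the specific method of the talk:

```python
import numpy as np

def kmeans_cluster(realizations, n_clusters=3, n_iter=50, seed=0):
    """Cluster an ensemble of fields (each flattened to a vector)
    with plain k-means; coarsening then works per cluster."""
    rng = np.random.default_rng(seed)
    X = np.asarray([r.ravel() for r in realizations], dtype=float)
    # Initialize centers by picking random ensemble members.
    centers = X[rng.choice(len(X), size=n_clusters, replace=False)]
    for _ in range(n_iter):
        # Assign each realization to its nearest center (Euclidean distance).
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # Update each center as the mean of its cluster; keep the old
        # center if a cluster becomes empty.
        for j in range(n_clusters):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

In practice a distance function adapted to the fields (as in the unified theory the slides mention) would replace the plain Euclidean norm.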

7 Coarsening of ODEs. Time scales.

8 ODEs averaging

9 ODEs

10 Particle density. General strategy.

11 Data assimilation techniques
Data come at different scales with associated precisions. Assume we have data from different sources and denote them by D1,…,DN. Examples: permeability measurements near wells, production data, pressure-transient data, tracer data, seismic data, geological data,….

12 Data assimilation
Consider the forward problem A(y(x), k(x)) = 0, where k(x) are the input parameters, y(x) is the solution, and A is a nonlinear operator. Given noisy observations, possibly at different scales (e.g., y(x0), y1(x), …), our goal is to estimate k(x). Important issues: sparsity of "useful" data (non-uniqueness); data scales; noisy data; computational time (expensive forward problems).

13 Example
Example 1. Images. The input parameter is the image; observed data can be scanned images, some pixel values, … The relation between input and observed data is through nonlinear equations. Example 2. Porous media. Input parameters are the permeability k(x), relative permeabilities, …; observed data are production data, … The relation is through nonlinear PDEs.

14 Data assimilation/inverse problem
Data assimilation vs. inverse problems. Inverse problem: find k given observations. Objective functions. Example: F = Ak. Non-uniqueness. Penalization allows the solution to be determined uniquely. Disadvantages: point estimates; the relation between uncertainties in the data and in the output is missing; uncertainties in the penalization terms cannot be incorporated easily. Bayesian inversion incorporates measurement errors and probabilistic prior information and sets up a posterior distribution.
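For the linear example F = Ak, Tikhonov-style penalization restores uniqueness: minimizing ||Ak - d||^2 + alpha ||k||^2 leads to the normal equations (A^T A + alpha I) k = A^T d, which are solvable even when A^T A is singular. A minimal sketch (the function name and the value of alpha are illustrative assumptions):

```python
import numpy as np

def tikhonov_solve(A, d, alpha=1e-2):
    """Minimize ||A k - d||^2 + alpha ||k||^2.  The penalty term makes
    (A^T A + alpha I) positive definite, so the otherwise non-unique
    least-squares problem has a unique solution."""
    A = np.asarray(A, dtype=float)
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n),
                           A.T @ np.asarray(d, dtype=float))
```

This illustrates the slide's disadvantage as well: the output is a single point estimate, with no quantification of how data uncertainty propagates to k.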

15 Inverse problem with Bayesian approaches

16 Example

17 Example

18 Gaussian spatial fields
Example: assume d is a random field described by its two-point covariance function C(x, y) = E[d(x) d(y)].
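A field with a prescribed two-point covariance can be sampled by factoring the covariance matrix. The sketch below assumes, purely for illustration, an exponential covariance C(x, y) = sigma^2 exp(-|x - y| / l) on a 1-D grid; the function name and parameters are not from the talk:

```python
import numpy as np

def sample_gaussian_field(n=64, length_scale=0.2, sigma=1.0, seed=0):
    """Draw a realization of a zero-mean Gaussian field on [0, 1] whose
    two-point covariance is C(x, y) = sigma^2 exp(-|x - y| / l)."""
    x = np.linspace(0.0, 1.0, n)
    C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / length_scale)
    # Factor C = L L^T; then d = L z with z ~ N(0, I) has E[d d^T] = C.
    L = np.linalg.cholesky(C + 1e-10 * np.eye(n))  # small jitter for stability
    z = np.random.default_rng(seed).standard_normal(n)
    return x, L @ z
```

For large grids one would replace the dense Cholesky factorization with a Karhunen-Loeve (truncated eigen-) expansion, which also provides a natural coarsening of the random field.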

19 A coarsening for a random field

20 Data Integration application
Prior PDF for Reservoir Model (m = k) → Likelihood Function (via Flow Simulation) → Posterior PDF for Reservoir Model

21 Production data for realizations

22 Multiscale data assimilation
P(k|D) ∝ P(D|k) P(k)
P(k1,…,kM | D1,…,DN) ∝ P(D1,…,DN | k1,…,kM) P(k1,…,kM)
If there are not sufficient data to estimate k1, k1 is sampled from the prior distribution.
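The relation P(k|D) ∝ P(D|k) P(k) can be made concrete by evaluating an unnormalized posterior on a grid of parameter values. The Gaussian measurement-error likelihood and the toy forward map are assumptions for illustration:

```python
import numpy as np

def posterior_on_grid(k_grid, data, forward, prior_pdf, noise_std=0.1):
    """Evaluate the unnormalized posterior P(k|D) ∝ P(D|k) P(k) on a
    uniform 1-D grid, assuming Gaussian measurement errors, then
    normalize numerically so the result integrates to one."""
    post = np.array([np.exp(-np.sum((forward(k) - data) ** 2)
                            / (2.0 * noise_std**2)) * prior_pdf(k)
                     for k in k_grid])
    dk = k_grid[1] - k_grid[0]          # uniform grid assumed
    return post / (post.sum() * dk)
```

Grid evaluation only works in very low dimension; for the multi-parameter, multi-data posteriors P(k1,…,kM | D1,…,DN) above, sampling methods such as MCMC are needed instead.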

23 Prior modeling
Identifying and representing features such as facies, and identifying textures and representing them using variogram-based permeability fields.

24 Priors. Feature based, texture based.

25 Sampling. The previous discussion concerned how to set up a posterior distribution. Once it is set up, our goal is to draw valid samples from it. Many approaches exist for sampling. One general tool for sampling from complicated probability distributions with an unknown normalizing constant is Markov chain Monte Carlo (MCMC). The main idea of MCMC is to construct a Markov chain whose stationary distribution is the posterior.
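The MCMC idea above can be sketched as a random-walk Metropolis sampler, which needs only the unnormalized log-posterior; the step size and chain length below are illustrative choices:

```python
import numpy as np

def metropolis(log_post, k0, n_steps=5000, step=0.5, seed=0):
    """Random-walk Metropolis: builds a Markov chain whose stationary
    distribution is proportional to exp(log_post), so the normalizing
    constant of the posterior is never needed."""
    rng = np.random.default_rng(seed)
    chain = [np.asarray(k0, dtype=float)]
    lp = log_post(chain[0])
    for _ in range(n_steps):
        # Propose a Gaussian random-walk step from the current state.
        prop = chain[-1] + step * rng.standard_normal(chain[-1].shape)
        lp_prop = log_post(prop)
        # Accept with probability min(1, post(prop) / post(current)).
        if np.log(rng.random()) < lp_prop - lp:
            chain.append(prop)
            lp = lp_prop
        else:
            chain.append(chain[-1])
    return np.array(chain)
```

For expensive forward problems, each log-posterior evaluation requires a full flow simulation, which is why multiscale (coarse-model) approximations inside the MCMC loop are attractive.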

