Yalchin Efendiev Texas A&M University

Introduction to Monte Carlo Markov chain (MCMC) methods

WEP 2010. Multiscale modeling and data assimilation: mathematical and algorithmic aspects. Yalchin Efendiev, Texas A&M University.

Multiscale models. What? Techniques that allow moving from one scale to another. Why? (1) Many fine-scale models are prohibitively expensive to simulate; (2) the quantities of interest or observables are coarse-scale variables. What we have done so far: a unified theory of multiscale modeling for spatial fields (images, engineering applications) based on defining appropriate distance functions. What we will do: a brief review; multiscale theory for problems with time scales; the presence of uncertainties; a general concept.

Comparing fine and coarse? Similarity: d − d* ~ ||x − y||.

Coarsening

Multiscale models in porous media applications. Parallel computation is important in these applications.

Coarsening across the ensemble. Assume there are multiple images and we would like to coarsen the whole ensemble. Examples: porous media, face detection, … Clustering of realizations.
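The clustering of realizations mentioned above can be sketched with a simple k-means grouping of an ensemble of spatial fields. This is a minimal illustration, not the presentation's actual method: the synthetic "permeability" ensemble, grid size, and number of clusters are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: 60 "permeability" realizations on a 10x10 grid,
# flattened to vectors; two distinct facies-like populations.
a = rng.normal(0.0, 1.0, size=(30, 100))
b = rng.normal(3.0, 1.0, size=(30, 100))
X = np.vstack([a, b])

def kmeans(X, k, iters=50, seed=1):
    """Plain Lloyd's k-means: assign each realization to the nearest
    center, then recompute centers as cluster means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dist.argmin(1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels, centers

# Coarsen the ensemble: 60 realizations -> 2 representative clusters.
labels, centers = kmeans(X, k=2)
```

Each cluster center can then serve as a coarse representative of its group of realizations, reducing the number of forward simulations needed.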

Coarsening of ODEs. Time scales.

ODEs averaging
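As a minimal sketch of the averaging idea for ODEs with separated time scales (the slide itself is a figure), consider a slow variable driven by a fast phase; averaging the fast oscillation out leaves an effective slow equation. The specific system x' = −x + sin(y), y' = 1/ε is a hypothetical textbook-style example, not taken from the talk.

```python
import numpy as np

def full_system(x0, y0, eps=1e-3, T=2.0, dt=1e-5):
    """Forward-Euler integration of the fast-slow system
       x' = -x + sin(y)   (slow variable)
       y' = 1/eps         (fast phase)."""
    x, y = x0, y0
    for _ in range(int(T / dt)):
        x += dt * (-x + np.sin(y))
        y += dt / eps
    return x

def averaged_system(x0, T=2.0):
    # sin(y) averages to zero over the fast phase, so the
    # effective (coarse) equation is x' = -x, solved exactly here.
    return x0 * np.exp(-T)
```

For small ε the full and averaged trajectories agree to O(ε), so the coarse model can replace the expensive fine-scale integration.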

ODEs

Particle density. General strategy.

Data assimilation techniques. Data come at different scales with associated precisions. Assume we have data from different sources, denoted D1, …, DN. Examples: permeability measurements near wells, production data, pressure-transient data, tracer data, seismic data, geological data, …
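One common way to combine sources with different precisions, sketched here under a Gaussian-noise assumption (the function name and interface are illustrative, not from the talk), is to sum per-source log-likelihoods, each weighted by its own noise level:

```python
import numpy as np

def combined_log_likelihood(d_obs, d_sim, sigma):
    """Sum of independent Gaussian log-likelihoods (up to constants)
    over N data sources. d_obs[i], d_sim[i]: observed and simulated
    data for source i; sigma[i]: that source's noise std (precision
    = 1/sigma[i]**2, so precise sources weigh more)."""
    total = 0.0
    for obs, sim, s in zip(d_obs, d_sim, sigma):
        r = np.asarray(obs, dtype=float) - np.asarray(sim, dtype=float)
        total += -0.5 * np.dot(r, r) / s**2
    return total
```

For instance, well data with small sigma dominates the fit, while coarse seismic data with large sigma contributes a weaker constraint.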

Data assimilation. Consider the forward problem A(y(x), k(x)) = 0, where k(x) are the input parameters, y(x) is the solution, and A is a nonlinear operator. Given noisy observations, possibly at different scales (e.g., y(x0), y1(x), …), our goal is to estimate k(x). Important issues: sparsity of "useful" data (non-uniqueness); data scales; noisy data; computational time (expensive forward problems).

Examples. Example 1: images. The input parameter is an image; observed data can be scanned images, some pixel values, … The relation between input and observations is through nonlinear equations. Example 2: porous media. Input parameters are the permeability k(x), relative permeabilities, … Observations are production data, … The relation is through nonlinear PDEs.

Data assimilation/inverse problem. Data assimilation vs. inverse problems. Inverse problem: find k given observations. Objective functions. Example: F = Ak. Non-uniqueness: penalization allows the solution to be determined uniquely. Disadvantages: point estimates; the relation between uncertainties in the data and in the output is missing; uncertainties cannot easily be incorporated into the penalization terms. Bayesian inversion incorporates measurement errors and probabilistic prior information and sets up a posterior distribution.
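The penalization idea for the linear example F = Ak can be sketched as Tikhonov-regularized least squares: with fewer observations than unknowns, plain least squares is non-unique, and the penalty ||Ak − F||² + α||k||² selects one solution. The matrix sizes and α below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Underdetermined linear forward model: 5 observations, 20 unknowns,
# so k cannot be recovered uniquely from F alone.
A = rng.normal(size=(5, 20))
k_true = rng.normal(size=20)
F = A @ k_true          # noise-free data for this sketch

# Tikhonov penalization: minimize ||A k - F||^2 + alpha ||k||^2,
# whose normal equations are (A^T A + alpha I) k = A^T F.
alpha = 1e-2
k_hat = np.linalg.solve(A.T @ A + alpha * np.eye(20), A.T @ F)
```

Note that k_hat reproduces the data F closely but generally differs from k_true, which is exactly the point-estimate disadvantage the slide mentions: the penalization picks one of many data-consistent solutions and says nothing about the remaining uncertainty.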

Inverse problem with Bayesian approaches

Example

Example

Gaussian spatial fields. Example: assume d is a random field described by the two-point covariance function C(x, y) = E[d(x) d(y)].
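A field with a prescribed two-point covariance can be sampled by factoring the covariance matrix. The sketch below uses a 1-D grid and an exponential covariance with a hypothetical correlation length L; neither choice comes from the talk.

```python
import numpy as np

# Sample a 1-D Gaussian random field d(x) on a grid from the
# two-point exponential covariance C(x, y) = exp(-|x - y| / L).
n, L = 100, 0.2
x = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / L)

# Cholesky factor C = L_c L_c^T; small jitter guards against
# numerical loss of positive definiteness.
Lc = np.linalg.cholesky(C + 1e-10 * np.eye(n))

# d = L_c z with z ~ N(0, I) has covariance E[d d^T] = C.
d = Lc @ np.random.default_rng(0).standard_normal(n)
```

Drawing many such realizations gives an ensemble consistent with the prior covariance, which is the typical starting point for the coarsening and data-integration steps that follow.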

A coarsening for a random field

Data integration application. Prior PDF for the reservoir model (m = k) → likelihood function (flow simulation) → posterior PDF for the reservoir model.

Production data for realizations

Multiscale data assimilation. P(k|D) ∝ P(D|k) P(k); P(k1, …, kM | D1, …, DN) ∝ P(D1, …, DN | k1, …, kM) P(k1, …, kM). If there is not sufficient data to estimate k1, k1 is sampled from the prior distribution.

Prior modeling. Identifying and representing features such as facies; identifying textures and representing them using variogram-based permeability fields.

Priors: feature-based, texture-based.

Sampling. The previous discussions concerned how to set up a posterior distribution. Once it is set up, our goal is to draw valid samples from it. Many approaches exist for sampling. One general tool for sampling from complicated probability distributions with an unknown normalizing constant is Markov chain Monte Carlo (MCMC). The main idea of MCMC is to construct a Markov chain whose steady-state distribution is the posterior.
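The MCMC idea above can be sketched with a random-walk Metropolis-Hastings chain. The target here is a standard normal standing in for the posterior P(k|D); the step size and chain length are arbitrary illustrative choices.

```python
import numpy as np

def log_target(k):
    # Unnormalized log-posterior; a standard normal stands in for
    # log P(D|k) + log P(k). Only ratios are needed, so the
    # normalizing constant never appears.
    return -0.5 * k**2

def metropolis(n_samples=20000, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: propose a Gaussian step,
    accept with probability min(1, target ratio)."""
    rng = np.random.default_rng(seed)
    k = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        k_prop = k + step * rng.standard_normal()
        if np.log(rng.random()) < log_target(k_prop) - log_target(k):
            k = k_prop          # accept; otherwise keep current state
        samples[i] = k
    return samples

samples = metropolis()
```

Because the symmetric proposal cancels and the normalizing constant drops out of the acceptance ratio, only the unnormalized posterior is ever evaluated; in the reservoir setting each such evaluation is one flow simulation, which is what makes the cost-reduction strategies in the talk relevant.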