Image Analysis and Markov Random Fields (MRFs) Quanren Xiong

Statistical models Some image structures are not deterministic and are best characterized by their statistical properties. For example, textures can be represented by their first- and second-order statistics. Images are often distorted by statistical noise. To restore the true image, images are often treated as realizations of a random process.

Uses of Markov Random Fields MRFs are a kind of statistical model. They can be used to model spatial constraints such as:
–smoothness of image regions
–spatial regularity of textures in a small region
–depth continuity in stereo reconstruction

What are MRFs Neighbors and cliques Let S be a set of locations; for simplicity, assume S is a grid: S = { (i, j) | i, j are integers }. The neighbors of s ∈ S are defined as ∂((i, j)) = { (k, l) | 0 < (k - i)² + (l - j)² < constant }. A subset C of S is a clique if any two distinct elements of C are neighbors. The set of all cliques of S is denoted by Ω.
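
As a small illustration (not from the slides), the sketch below computes the neighborhood of a grid site and checks the clique condition; the function names and the squared-radius parameter are assumptions, with radius_sq=1 giving the 4-neighborhood and radius_sq=2 the 8-neighborhood.

# Minimal sketch: neighborhoods and cliques on a finite H x W grid.
def neighbors(s, shape, radius_sq=1):
    """Sites (k, l) with 0 < (k-i)^2 + (l-j)^2 <= radius_sq, inside the grid."""
    i, j = s
    H, W = shape
    out = []
    for k in range(max(0, i - 1), min(H, i + 2)):
        for l in range(max(0, j - 1), min(W, j + 2)):
            d2 = (k - i) ** 2 + (l - j) ** 2
            if 0 < d2 <= radius_sq:
                out.append((k, l))
    return out

def is_clique(C, shape, radius_sq=1):
    """C is a clique if any two distinct elements of C are neighbors."""
    return all(t in neighbors(s, shape, radius_sq)
               for s in C for t in C if s != t)

print(neighbors((2, 3), (5, 5)))             # [(1, 3), (2, 2), (2, 4), (3, 3)]
print(is_clique([(2, 3), (2, 4)], (5, 5)))   # True: a horizontal pair
print(is_clique([(2, 3), (3, 4)], (5, 5)))   # False: diagonal sites are not 4-neighbors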

Examples of neighborhood 4-neighborhood and its cliques: (figure)

Examples of neighborhood 8-neighborhood and its cliques: (figure)

Random fields The random vector X on S is called a random field and is assumed to have density p(x). Images as random fields: if the vector X represents the intensity values of an image, then its component X_s is the intensity value at location s = (i, j). (Figure: a 640x480 image grid S and the corresponding random field X.)

Markov Random Fields If p(x) of a random field fulfills the so-called Markov condition with respect to a neighborhood system, it is called a Markov Random Field. That is, the value X_s at location s depends only on its neighbors: p(x_s | x_r, r ≠ s) = p(x_s | x_r, r ∈ ∂(s)).

MRFs versus Markov Chains MRFs replace the temporal dependency of Markov chains with spatial dependency.

Markov Random Fields p(x) can also be factorized over cliques due to its Markov properties, i.e. p(x) = Π_{C ∈ Ω} Ψ_C(x), where Ψ_C is a function of x determined by the clique C (it depends on x only through the components indexed by C).

Markov Random Fields MRFs are equivalent to Gibbs fields, and p(x) has the form p(x) = exp(-H(x)) / Z, where Z = Σ_x exp(-H(x)). H(x) is called the energy function. The summation in the denominator is over all possible configurations on S, in our case over all possible images. For 256 grey values on a 640x480 grid it has 256^(640x480) terms, so Z is impractical to evaluate and p(x) is only known up to a constant.
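
As a tiny worked illustration (not from the slides), the brute-force computation below evaluates Z exactly for a very small binary field with an assumed nearest-neighbor smoothness energy; the point is that the sum has |labels|^|S| terms, which is why Z is never computed this way for real images.

import itertools
import numpy as np

# Assumed energy: H(x) = -beta * sum over horizontally and vertically adjacent
# pairs of x_s * x_t. Z is the sum of exp(-H(x)) over every configuration.
def exact_Z(H, W, labels=(-1, 1), beta=1.0):
    sites = [(i, j) for i in range(H) for j in range(W)]
    Z = 0.0
    for config in itertools.product(labels, repeat=len(sites)):  # |labels|**(H*W) terms
        x = dict(zip(sites, config))
        energy = -beta * sum(x[(i, j)] * x[(i + 1, j)]
                             for i in range(H - 1) for j in range(W))
        energy -= beta * sum(x[(i, j)] * x[(i, j + 1)]
                             for i in range(H) for j in range(W - 1))
        Z += np.exp(-energy)
    return Z

print(exact_Z(3, 3))   # 2**9 = 512 terms; a 640x480 image with 256 grey levels would need 256**(640*480)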

Local Characteristics of MRFs For every subset I of S we have p(y_I | x_{S\I}) = exp(-H(y_I, x_{S\I})) / Z_I with Z_I = Σ_{y_I} exp(-H(y_I, x_{S\I})), where S\I denotes the complement of I and (y_I, x_{S\I}) is the configuration that equals y on I and x elsewhere. If I is a small set, then since the configuration only changes over I, Z_I can be evaluated in reasonable time, so p(y_I | x_{S\I}) is known.
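
To make the local characteristic concrete, here is a hedged sketch for a single site s (the pairwise potential V, the neighbor function nbrs, and the label set are assumptions, not the slides' notation): the normalizing constant only runs over the possible values of x_s, so it is cheap to compute.

import numpy as np

def local_conditional(x, s, labels, V, nbrs):
    """p(x_s = v | rest of the field) for each candidate label v, for a pairwise
    energy H(x) = sum over neighboring pairs {s, t} of V(x_s, x_t)."""
    energies = np.array([sum(V(v, x[t]) for t in nbrs(s)) for v in labels])
    p = np.exp(-(energies - energies.min()))   # shift by the minimum for numerical stability
    return p / p.sum()

# Example (hypothetical): binary labels (-1, +1) and a smoothing potential
# V = lambda a, b: -1.0 * a * b, which favors agreement with the neighbors.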

Using MRFs in Image Analysis In image analysis, p(x) is often the posterior probability of a Bayesian inference, that is, p(x) = p(x | y_0). For example, y_0 may be the observed image with noise, and we want to compute the estimate x_0* of the true image x_0 based on p(x) = p(x | y_0).

Using MRFs in Image Analysis (Figure: an MRF model, either learned or known, is sampled to produce images X_0.)

Difficulties in computing x_0* A way to compute the estimate x_0* is to let x_0* = E[X] = ∫ x p(x | y_0) dx. But p(x | y_0) is only known up to the constant Z, so how can the above integration be done?

Monte Carlo integration One solution is to construct a Markov chain having p(x) as its limiting distribution. If the Markov chain starts at state X_0 and goes through states X_1, X_2, X_3, ..., X_t, ..., then E_p[X] can be approximated by the average (X_{m+1} + X_{m+2} + ... + X_t) / (t - m), where m is a sufficiently long burn-in time. X_{m+1}, X_{m+2}, ... can be considered as samples drawn from p(x).
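
A minimal sketch of this ergodic average, assuming some transition kernel step(x) whose limiting distribution is p(x) (the kernel itself, e.g. a Gibbs sampler, is defined on the following slides):

import numpy as np

def mcmc_mean(x0, step, n_steps=10000, burn_in=1000):
    """Approximate E_p[X] by averaging the chain states X_{m+1}, ..., X_t."""
    x = x0.copy()
    total = np.zeros_like(x0, dtype=float)
    kept = 0
    for t in range(n_steps):
        x = step(x)            # X_t -> X_{t+1}
        if t >= burn_in:       # discard the first m states (burn-in)
            total += x
            kept += 1
    return total / kept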

Gibbs Sampler Because X is a high-dimensional vector (for a 640x480 image its dimension is 640x480 = 307,200), it is not practical to update all components of X_t to X_{t+1} in one iteration. One version of the Metropolis-Hastings algorithm, called the Gibbs sampler, builds the Markov chain by updating only a single component of X_t in each iteration.

Gibbs Sampler Algorithm Let the vector X have k components, X = (X_1, X_2, ..., X_k), and suppose it is presently in state X_t = (x_1, x_2, ..., x_k). An index i, equally likely to be any of 1, ..., k, is chosen. A random variable w with density P{ w = x } = P{ X_i = x | X_j = x_j, j ≠ i } is generated. If w = x, the updated state is X_{t+1} = (x_1, x_2, ..., x_{i-1}, x, x_{i+1}, ..., x_k).
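
A sketch of one random-scan update as described above; the helper conditional, which returns the discrete distribution P{ X_i = v | X_j = x_j, j ≠ i } for every label v, is an assumed argument (for an MRF it is just the local characteristic and depends only on the neighbors of i).

import numpy as np

rng = np.random.default_rng(0)

def gibbs_step(x, labels, conditional):
    """One Gibbs iteration: resample a single, uniformly chosen component of x."""
    i = rng.integers(len(x))                 # index equally likely to be any component (0-based here)
    probs = conditional(x, i, labels)        # P{X_i = v | X_j = x_j, j != i} for each label v
    x = x.copy()
    x[i] = rng.choice(labels, p=probs)       # draw w from that conditional and set x_i = w
    return x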

Two aspects of using MRFs
1. Find an appropriate model class, i.e. the general form of H(x).
2. Identify suitable parameters of H(x) from observed samples of X. This is the most difficult part of applying MRFs.

An Example Suppose we want to restore a binary (+1/-1) image corrupted by added salt-and-pepper noise. The Ising model is chosen.
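
A hedged sketch of this restoration (all specifics here are assumptions rather than the slides' actual setup): the prior is the Ising energy -beta * sum over neighboring pairs of x_s * x_t, the flip noise is handled by a data term -eta * sum over s of x_s * y0_s, and the Gibbs sampler sweeps the image resampling one pixel at a time from its local conditional.

import numpy as np

rng = np.random.default_rng(0)

def gibbs_denoise(y0, beta=1.0, eta=1.5, sweeps=20):
    """Sample from the assumed posterior energy
    H(x) = -beta * sum_<s,t> x_s x_t - eta * sum_s x_s y0_s
    by Gibbs sampling; y0 is the observed +1/-1 image."""
    x = y0.copy()
    H, W = x.shape
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                nb = 0
                if i > 0:     nb += x[i - 1, j]
                if i < H - 1: nb += x[i + 1, j]
                if j > 0:     nb += x[i, j - 1]
                if j < W - 1: nb += x[i, j + 1]
                # local conditional for a +1/-1 pixel is a logistic of the local field
                field = 2.0 * (beta * nb + eta * y0[i, j])
                p_plus = 1.0 / (1.0 + np.exp(-field))
                x[i, j] = 1 if rng.random() < p_plus else -1
    return x

# Usage (hypothetical): x_hat = gibbs_denoise(noisy_pm1_image)
# Averaging x over further sweeps and thresholding at 0 approximates the estimate x_0* = E[X].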

Open Issues / Discussion
Code development:
–What should our MRF library look like?
Challenges: build an MRF model from image samples and then generate new images using the Gibbs sampler.
–Need a way to determine the parameters in H(x) based on image samples.

References
1. Ross Kindermann and J. Laurie Snell, Markov Random Fields and Their Applications.
2. Gerhard Winkler, Image Analysis, Random Fields and Markov Chain Monte Carlo Methods. Springer.
3. W. R. Gilks, Markov Chain Monte Carlo in Practice. Chapman & Hall/CRC.
4. Sheldon M. Ross, Introduction to Probability Models. Academic Press, 2003.