Image Analysis and Markov Random Fields (MRFs) Quanren Xiong.

1 Image Analysis and Markov Random Fields (MRFs) Quanren Xiong

2 Statistical models Some image structures are not deterministic and are best characterized by their statistical properties. For example, textures can be represented by their first- and second-order statistics. Images are often corrupted by statistical noise; to restore the true image, images are often treated as realizations of a random process.

3 Uses of Markov Random Fields MRFs are a kind of statistical model. They can be used to model spatial constraints: –smoothness of image regions –spatial regularity of textures in a small region –depth continuity in stereo reconstruction

4 What are MRFs Neighbors and cliques. Let S be a set of locations; for simplicity, assume S is a grid: S = { (i, j) | i, j are integers }. The neighbors of s ∈ S are defined as ∂((i, j)) = { (k, l) | 0 < (k − i)² + (l − j)² < c } for some constant c. A subset C of S is a clique if any two distinct elements of C are neighbors. The set of all cliques of S is denoted by Ω.
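The neighborhood definition above can be sketched directly (a minimal illustration; the function name and the way the constant c is passed are assumptions, not from the slides):

```python
def neighbors(s, c):
    """Return ∂(s) = {(k, l) | 0 < (k-i)^2 + (l-j)^2 < c} on the integer grid."""
    i, j = s
    r = int(c ** 0.5) + 1          # search radius that surely covers the disc
    return [(k, l)
            for k in range(i - r, i + r + 1)
            for l in range(j - r, j + r + 1)
            if 0 < (k - i) ** 2 + (l - j) ** 2 < c]

# c = 2 yields the 4-neighborhood, c = 3 the 8-neighborhood.
```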

5 Examples of neighborhood 4-neighborhood cliques:

6 Examples of neighborhood 8-neighborhood cliques:

7 Random fields The random vector X on S is called a random field and is assumed to have density p(x). Images as random fields: if the vector X represents the intensity values of an image, then its component X_s is the intensity value at location s = (i, j). For a 640x480 image, S is the 640x480 grid of pixel locations.

8 Markov Random Fields If p(x) of a random field fulfills the so-called Markov condition with respect to a neighborhood system, it is called a Markov Random Field. That is, the value X_s at location s depends on the rest of the field only through its neighbors: p(x_s | x_{S\{s}}) = p(x_s | x_{∂s}).

9 MRFs versus Markov Chains MRFs replace temporal dependency of Markov chains with spatial dependency.

10 Markov Random fields p(x) can also be factorized over cliques due to its Markov properties: p(x) = ∏_{C∈Ω} Ψ_C(x), where each Ψ_C is a function of x determined only by the components in clique C.

11 Markov Random Fields MRFs are equivalent to Gibbs fields, and p(x) has the following form: p(x) = exp(−H(x)) / Z, with Z = Σ_x exp(−H(x)). H(x) is called the energy function. The summation in Z is over all possible configurations on S; in our case, over all possible images. For 256 grey values and a 640x480 grid, Z has 256^(640x480) terms. Z is impractical to evaluate, so p(x) is known only up to a constant.
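The contrast between evaluating H(x) and evaluating Z can be made concrete (a minimal sketch with assumed names, using an Ising-style pair-clique energy on a small binary grid):

```python
import math

def energy(x, beta=1.0):
    """H(x) = -beta * sum of x_s * x_t over 4-neighborhood pair cliques."""
    n = len(x)
    h = 0.0
    for i in range(n):
        for j in range(n):
            if i + 1 < n:
                h -= beta * x[i][j] * x[i + 1][j]    # vertical pair clique
            if j + 1 < n:
                h -= beta * x[i][j] * x[i][j + 1]    # horizontal pair clique
    return h

def unnormalized_p(x, beta=1.0):
    """exp(-H(x)) is cheap; dividing by Z would need a sum over 2^(n*n) grids."""
    return math.exp(-energy(x, beta))
```

Even for this binary toy, Z already has 2^(n·n) terms, which is why all inference below works with p(x) only up to the constant.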

12 Local Characteristics of MRFs For every subset I ⊂ S we have p(y_I | x_{S\I}) = exp(−H(y_I, x_{S\I})) / Z_I, where S\I denotes the complement of I in S. If I is a small set, then since x varies only over I, Z_I can be evaluated in reasonable time, so p(y_I | x_{S\I}) is known.

13 Using MRFs in Image Analysis In image analysis, p(x) is often the posterior probability of a Bayesian inference; that is, p(x) = p(x | y_0). For example, y_0 may be the observed noisy image, and we want to compute an estimate x_0* of the true image x_0 based on p(x) = p(x | y_0).

14 Using MRFs in Image Analysis (Diagram: samples X_0 are drawn from the MRF model, which is either learned or known.)

15 Difficulties in computing x_0* A way to compute the estimate x_0* is the posterior mean, x_0* = E(X | y_0) = ∫ x p(x | y_0) dx. But p(x | y_0) is only known up to the constant Z, so how can this integration be carried out?

16 Monte Carlo integration One solution is to construct a Markov chain having p(x) as its limiting distribution. If the Markov chain starts at state X_0 and passes through states X_1, X_2, X_3, …, X_t, …, then E_p(X) can be approximated by the average of X_{m+1}, …, X_n, where m is a sufficiently long burn-in time: X_{m+1}, X_{m+2}, … can be considered samples drawn from p(x).
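A minimal sketch of this idea (the one-dimensional target and all names here are illustrative assumptions, not from the slides): a Metropolis chain needs only ratios of p, so the unknown Z cancels, and the average after burn-in estimates E_p(X).

```python
import math
import random

def metropolis_chain(n_steps, step=1.0, x0=0.0, seed=0):
    """Random-walk Metropolis chain targeting p(x) proportional to exp(-x^2/2)."""
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(n_steps):
        prop = x + rng.uniform(-step, step)
        # Accept with prob min(1, p(prop)/p(x)); Z cancels in the ratio.
        if rng.random() < math.exp((x * x - prop * prop) / 2.0):
            x = prop
        chain.append(x)
    return chain

def ergodic_average(chain, burn_in):
    """Approximate E_p(X) by averaging the states after a burn-in of m steps."""
    post = chain[burn_in:]
    return sum(post) / len(post)
```

For the image case the same recipe applies; the chain is then built by the Gibbs sampler described on the next slides.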

17 Gibbs Sampler Because X is a high-dimensional vector (for a 640x480 image its dimension is 640x480), it is not practical to update all components of X_t to X_{t+1} in one iteration. One version of the Metropolis–Hastings algorithm, called the Gibbs sampler, builds the Markov chain by updating only a single component of X_t per iteration.

18 Gibbs Sampler Algorithm Let the vector X have k components, X = (X_1, X_2, …, X_k), and suppose the chain is presently in state X_t = (x_1, x_2, …, x_k). 1. An index that is equally likely to be any of 1, …, k is chosen, say index i. 2. A random variable w with density P{w = x} = P{X_i = x | X_j = x_j, j ≠ i} is generated. 3. If w = x, the updated state is X_{t+1} = (x_1, x_2, …, x_{i−1}, x, x_{i+1}, …, x_k).
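For a binary MRF the local conditional in step 2 has a closed form, so one update can be sketched directly (an illustrative Ising prior with 4-neighborhoods; the names and the coupling parameter beta are assumptions):

```python
import math
import random

def gibbs_step(x, beta, rng):
    """One single-site Gibbs update for a +1/-1 Ising field on an n x n grid."""
    n = len(x)
    i, j = rng.randrange(n), rng.randrange(n)       # index chosen uniformly
    # Sum of 4-neighborhood values (boundary sites simply have fewer neighbors).
    s = sum(x[k][l] for k, l in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
            if 0 <= k < n and 0 <= l < n)
    # P{X_ij = +1 | neighbors} = exp(beta*s) / (exp(beta*s) + exp(-beta*s)).
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
    x[i][j] = 1 if rng.random() < p_plus else -1

rng = random.Random(0)
x = [[rng.choice([-1, 1]) for _ in range(16)] for _ in range(16)]
for _ in range(5000):           # burn-in iterations
    gibbs_step(x, beta=0.7, rng=rng)
```

Only the unnormalized local conditional is needed, which is exactly the local characteristic of slide 12.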

19 Two aspects of using MRFs 1. Find an appropriate model class, i.e. the general form of H(x). 2. Identify suitable parameters of H(x) from observed samples of X. This is the most difficult part in applying MRFs.

20 An Example Suppose we want to restore a binary (+1/−1) image corrupted by added salt-and-pepper noise. The Ising model is chosen as the prior.
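The restoration can be sketched end to end with the Gibbs sampler (a hypothetical implementation: the posterior form p(x | y_0) proportional to exp(beta·Σ_{s~t} x_s x_t + h·Σ_s x_s y_0s), with coupling beta and data weight h, is an assumed Ising posterior, not taken from the slides):

```python
import math
import random

def denoise(y0, beta=1.0, h=1.0, sweeps=30, seed=0):
    """Gibbs-sample the Ising posterior p(x | y0), starting from the noisy image."""
    rng = random.Random(seed)
    n = len(y0)
    x = [row[:] for row in y0]
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                s = sum(x[k][l]
                        for k, l in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= k < n and 0 <= l < n)
                # Local field combines the smoothness prior and the data term.
                field = beta * s + h * y0[i][j]
                p_plus = 1.0 / (1.0 + math.exp(-2.0 * field))
                x[i][j] = 1 if rng.random() < p_plus else -1
    return x
```

Sweeping the grid, the prior term drives isolated noise pixels toward agreement with their neighbors, while the data term h keeps x close to the observation y_0.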

21 Open Issues / Discussion Code Development –What should our MRF library look like? Challenges: build an MRF model from image samples and then generate new images using the Gibbs sampler. –Need a way to determine the parameters of H(x) based on image samples.

22 References 1. Ross Kindermann and J. Laurie Snell, Markov Random Fields and Their Applications, AMS, 1980. http://www.ams.org/online_bks/conm1/ 2. Gerhard Winkler, Image Analysis, Random Fields and Markov Chain Monte Carlo Methods, Springer, 2003. 3. W. R. Gilks, Markov Chain Monte Carlo in Practice, Chapman & Hall/CRC, 1998. 4. Sheldon M. Ross, Introduction to Probability Models, Academic Press, 2003.

