
1 Image Analysis, Random Fields and Dynamic MCMC By Marc Sobel

2 A field!!!!!!!!

3 Random Fields  Random fields (RFs) consist of collections of points P = {p} and neighborhoods {N_p} of points (a neighborhood N_p does not contain p). The field imposes 'label' values f = {f[p]} on the points. We use the notation f[S] for the label values imposed on a set S. Random fields have one central property, closely related to the Markov property: the label at a point depends on the rest of the field only through its neighborhood,
$$P\big(f[p] \mid f[P \setminus \{p\}]\big) = P\big(f[p] \mid f[N_p]\big).$$
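A minimal sketch of these ingredients on a 2-d grid; the 4-neighborhood choice and the names below are illustrative assumptions, not taken from the slides:

```python
import random

def neighbors(p, rows, cols):
    """4-nearest-neighborhood N_p of grid point p = (i, j); p itself is excluded."""
    i, j = p
    cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for (a, b) in cand if 0 <= a < rows and 0 <= b < cols]

# Labels f[p] in {-1, +1} on a small grid; f[S] is then {q: f[q] for q in S}.
rows = cols = 5
f = {(i, j): random.choice([-1, 1]) for i in range(rows) for j in range(cols)}
```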

4 Reasoning: Hammersley-Clifford Theorem  Under a positivity assumption, and assuming the points can be enumerated p_1, …, p_N, Brook's lemma gives, for any two configurations f and g,
$$\frac{P(f)}{P(g)} = \prod_{i=1}^{N} \frac{P\big(f[p_i] \mid f[p_1], \dots, f[p_{i-1}], g[p_{i+1}], \dots, g[p_N]\big)}{P\big(g[p_i] \mid f[p_1], \dots, f[p_{i-1}], g[p_{i+1}], \dots, g[p_N]\big)}$$
(the distribution can be generated from these conditionals).

5 Gibbs Random Field  Gibbs random fields are characterized by
$$P(f) = \frac{1}{Z} \exp\Big\{ -\frac{1}{T} \sum_{c \in \mathcal{C}} V_c(f) \Big\},$$
where the $c \in \mathcal{C}$ are cliques and Z is the normalizing constant. Cliques are contained in neighborhoods: $c \subset \{p\} \cup N_p$ for every $p \in c$.  For example, if the cliques c are all neighboring pairs, we could put $V_c(f) = v\big(f[p], f[p']\big)$ for $c = \{p, p'\}$.

6 Gibbs = Markov!!!!!!  Under Gibbs, conditioning is on neighborhoods:
$$P\big(f[p] \mid f[P \setminus \{p\}]\big) = \frac{\exp\big\{ -\frac{1}{T} \sum_{c \ni p} V_c(f) \big\}}{\sum_{f'[p]} \exp\big\{ -\frac{1}{T} \sum_{c \ni p} V_c(f') \big\}}.$$
The term $\exp\big\{ -\frac{1}{T} \sum_{c \not\ni p} V_c(f) \big\}$, coming from cliques that do not contain p, cancels in numerator and denominator, giving the result: the right-hand side depends on f only through $f[N_p]$.

7 Examples of Random Fields  Automodels: all cliques have one or two members.  Autobinomial models: how to build a k-color map. Labels are 0, 1, …, k; neighborhoods are of size M; the conditional distribution of each label given its neighborhood is binomial.  Autologistic model: the model which imposes energy +1 when contiguous elements are different and −1 otherwise.

8 A Metropolis-Hastings update for autologistic field models  1) Propose a flip at a randomly selected point p.  2) The move (acceptance) probability is
$$\alpha = \min\Big\{ 1,\; \exp\big( -\tfrac{1}{T} \big[ U(f') - U(f) \big] \big) \Big\},$$
where f' agrees with f except that the label at p is flipped.
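A minimal sketch of this update for ±1 labels, assuming the pairwise energy $U(f) = -\beta \sum f[p]\, f[p']$ and reusing the illustrative neighbors() helper above; only the cliques containing p enter the energy change:

```python
import math, random

def mh_flip(f, p, beta, T, rows, cols):
    """One Metropolis-Hastings step: propose flipping f[p] and accept
    with probability min(1, exp(-[U(f') - U(f)] / T))."""
    s = sum(f[q] for q in neighbors(p, rows, cols))  # neighbor sum at p
    dU = 2.0 * beta * f[p] * s                       # U(f') - U(f) for the flip
    if dU <= 0 or random.random() < math.exp(-dU / T):
        f[p] = -f[p]
```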

9 The 1-d autologistic  The 1-d autologistic energy is $U(f) = -\beta \sum_i f[i]\, f[i+1]$ with labels $f[i] \in \{-1, +1\}$.  The effect of the prior is to smooth out the results.

10 The 2-d autologistic or Ising Model  In a 2-d setting we update using the full conditional
$$P\big(f[p] = x \mid f[N_p]\big) \propto \exp\Big\{ \frac{\beta}{T}\, x \sum_{p' \in N_p} f[p'] \Big\}, \qquad x \in \{-1, +1\}.$$
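A sketch of sampling from this conditional directly (a Gibbs / heat-bath step rather than an MH flip), again reusing the illustrative neighbors() helper:

```python
import math, random

def gibbs_update(f, p, beta, T, rows, cols):
    """Draw f[p] from its full conditional P(f[p] = +1 | f[N_p])."""
    s = sum(f[q] for q in neighbors(p, rows, cols))
    x = 2.0 * beta * s / T                                    # log-odds of +1
    p_plus = 1.0 / (1.0 + math.exp(-x)) if x > -700 else 0.0  # overflow guard
    f[p] = 1 if random.random() < p_plus else -1
```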

11 Example: The Ising Model. Each rectangle below is a field configuration f: black = +1 and white = −1. Color results from multiple label values.

12 Extensions: Segmentation  Start with a map Y (over a 2-d grid).  Assume we would like to distinguish which points in the map are important and which are background.  Devise an Ising field model prior which captures the important points of the map and downweights the others, e.g. an energy with an interaction term and a data term,
$$U(f) = -\beta \sum_{\{p, p'\}} f[p]\, f[p'] \;-\; h \sum_{p} Y[p]\, f[p].$$

13 Extensions (concluded)  So the potential being minimized contains a 'magnetic field' term (the first term) and an 'external field' term (the second term).  Other extensions are to line processes, image reconstruction, and texture representation.
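A sketch of a single-site MH update under such a two-term segmentation energy; the couplings beta (interaction) and h (external field) are assumed parameters, and neighbors() is the helper above:

```python
import math, random

def mh_flip_segmentation(f, y, p, beta, h, T, rows, cols):
    """MH flip under U(f) = -beta * sum f[p]f[p'] - h * sum y[p]f[p]:
    the data value y[p] acts as an external field pulling f[p] toward its sign."""
    s = sum(f[q] for q in neighbors(p, rows, cols))
    dU = 2.0 * f[p] * (beta * s + h * y[p])  # energy change for flipping f[p]
    if dU <= 0 or random.random() < math.exp(-dU / T):
        f[p] = -f[p]
```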

14 Random Field Priors: The Ising model or autologistic model (Metropolis-Hastings updates). Temperature = 5, at time t = 10000. Note the presence of more 'islands'.

15 Random Field Priors: The Ising model (Metropolis-Hastings updates). Temperature = 0.005. Note the presence of fewer islands.
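A sketch of how such configurations can be produced: sweep the mh_flip update over the grid at a fixed temperature (the grid size, beta, and sweep counts below are illustrative):

```python
import random

def run_ising(rows, cols, beta, T, sweeps):
    """Run single-site MH sweeps from a random initial field and return it."""
    f = {(i, j): random.choice([-1, 1]) for i in range(rows) for j in range(cols)}
    for _ in range(sweeps):
        for p in list(f):
            mh_flip(f, p, beta, T, rows, cols)
    return f

hot  = run_ising(64, 64, beta=1.0, T=5.0,   sweeps=200)  # more 'islands'
cold = run_ising(64, 64, beta=1.0, T=0.005, sweeps=200)  # fewer islands
```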

16 Generalized Ising models: Mean Field Equation  The energy is a clique-potential sum $U(f) = \sum_c V_c(f)$ generalizing the pairwise Ising energy.  What is the impact of this prior? Use mean field equations to get the closest possible prior (in KL divergence) which makes the field points mutually independent.

17 Generalized Field: Note the Swiss cheese aspect. Temperature = 10.

18 Mean Field Equation  The mean field approximation minimizes the KL divergence $KL(Q \,\|\, P)$ over distributions Q which make the points mutually independent. For a pairwise ±1 field, writing $m_p = E_Q\, f[p]$, the mean field equation is the fixed point condition
$$m_p = \tanh\Big( \frac{\beta}{T} \sum_{p' \in N_p} m_{p'} \Big).$$
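A sketch of solving this fixed point by simple iteration for the pairwise ±1 case, reusing neighbors(); the initial guess and sweep count are arbitrary assumptions:

```python
import math

def mean_field(beta, T, rows, cols, iters=200):
    """Iterate m_p = tanh((beta/T) * sum_{p' in N_p} m_{p'}) to a fixed point;
    the independent Q with E_Q f[p] = m_p is the mean field approximation."""
    m = {(i, j): 0.1 for i in range(rows) for j in range(cols)}  # initial guess
    for _ in range(iters):
        for p in list(m):
            s = sum(m[q] for q in neighbors(p, rows, cols))
            m[p] = math.tanh(beta * s / T)
    return m
```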

19 Mean Field Approximation to the General Ising Field at temperature T = 10. We simulate from the mean field prior. We retain the Swiss cheese but lose the islands.

20 Gaussian Process  Autonormal models: if the labels are real numbers (i.e., we are trying to build a picture with many different grey levels), the conditionals are taken to be Gaussian:
$$f[p] \mid f[N_p] \sim N\Big( \mu_p + \sum_{p' \in N_p} \beta_{p, p'} \big( f[p'] - \mu_{p'} \big),\; \sigma^2 \Big).$$

21 Gaussian Processes  For Gaussian processes (taking μ = 0), the covariance satisfies $\mathrm{Cov}(f[p], f[p']) = \sum_{p''} \beta_{p, p''}\, \mathrm{Cov}(f[p''], f[p']) + \sigma^2 \delta_{p, p'}$. This gives the Yule-Walker equation $\mathrm{COV} = B \cdot \mathrm{COV} + \sigma^2 I$, or $\mathrm{COV}^{-1} = (I - B)/\sigma^2$, where B collects the coefficients $\beta_{p, p'}$. So the likelihood is given by the Gaussian density with this covariance.
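A sketch of assembling this precision matrix for a 1-d chain with a single nearest-neighbor coefficient beta (an illustrative special case of the $\beta_{p,p'}$):

```python
import numpy as np

def autonormal_precision(n, beta, sigma2):
    """Precision matrix COV^{-1} = (I - B) / sigma2 for a 1-d autonormal field.
    Keeping |beta| < 1/2 ensures I - B is positive definite on the chain."""
    B = np.zeros((n, n))
    for i in range(n):
        if i > 0:
            B[i, i - 1] = beta
        if i < n - 1:
            B[i, i + 1] = beta
    return (np.eye(n) - B) / sigma2
```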

22 Gaussian Processes  The likelihood is Gaussian with mean μ and inverse covariance matrix $(I - B)/\sigma^2$.  Example: assume a likelihood centered at i + j at grid point (i, j), and assume a Gaussian process prior.

23 Posterior Distribution for the Gaussian Model
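A standard conjugate-Gaussian form for this posterior, assuming data $y \mid \mu \sim N(\mu, \sigma^2 I)$ and the autonormal prior $\mu \sim N(0, \tau^2 (I - B)^{-1})$ (the exact notation is an assumption here): the posterior precision is the sum of the prior and likelihood precisions,
$$\mu \mid y \sim N\Big( \Sigma_{\mathrm{post}}\, \frac{y}{\sigma^2},\; \Sigma_{\mathrm{post}} \Big), \qquad \Sigma_{\mathrm{post}}^{-1} = \frac{I - B}{\tau^2} + \frac{I}{\sigma^2}.$$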

24 Gaussian field at time t = 20,000 with conditional prior variance = 0.01. The mesh is over a realization of μ. Note how smooth the mesh is.

25 Maximum A Posteriori (MAP) Estimates
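For reference, with a Gaussian likelihood $y \mid f \sim N(f, \sigma^2 I)$ and a Gibbs prior $\pi(f) \propto e^{-U(f)}$ (assumptions matching the models above), the MAP estimate maximizes the posterior, equivalently minimizes the posterior energy:
$$\hat f_{\mathrm{MAP}} = \arg\max_f P(f \mid y) = \arg\min_f \Big\{ U(f) + \frac{\| y - f \|^2}{2\sigma^2} \Big\}.$$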

26 MAP Estimator with prior variance = 0.5

27 Maximum A Posteriori Estimate with prior variance = 0.01

28 Smoothness Priors  Suppose we observe data with one of the priors
$$\pi_1(f) \propto \exp\Big\{ -\frac{1}{2\sigma_0^2} \sum_i \big( f[i+1] - f[i] \big)^2 \Big\}, \qquad \pi_2(f) \propto \exp\Big\{ -\frac{1}{2\sigma_0^2} \sum_i \big( f[i+1] - 2 f[i] + f[i-1] \big)^2 \Big\}.$$

29 Smoothness priors  The smoothness prior $\pi_1$ has the effect of imposing a small 'derivative' on the field.  The smoothness prior $\pi_2$ has the effect of imposing a small curvature on the field.

30 Smoothness Priors  Smoothness priors have the same kind of impact as choosing a function which minimizes a penalized 'loss' such as
$$\sum_i \frac{(y_i - f_i)^2}{2\sigma^2} \;+\; \frac{\lambda}{2} \sum_i (f_{i+1} - f_i)^2.$$
Assume the likelihood $y_i \sim N(f_i, \sigma^2)$, independently over i.
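A sketch of computing the minimizer under the first-difference penalty: the objective is quadratic, so the MAP estimate solves a sparse linear system (lam and sigma2 are assumed parameters):

```python
import numpy as np

def smooth_map(y, sigma2, lam):
    """Minimize sum (y_i - f_i)^2 / (2*sigma2) + (lam/2) * sum (f_{i+1} - f_i)^2.
    Setting the gradient to zero gives (I/sigma2 + lam*L) f = y/sigma2,
    where L is the chain graph Laplacian."""
    n = len(y)
    L = np.zeros((n, n))
    for i in range(n - 1):
        L[i, i] += 1.0;  L[i + 1, i + 1] += 1.0
        L[i, i + 1] -= 1.0;  L[i + 1, i] -= 1.0
    A = np.eye(n) / sigma2 + lam * L
    return np.linalg.solve(A, np.asarray(y, dtype=float) / sigma2)
```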

31 Data = −5 below 50 and Data = 5 above 50. Conditional prior variance is 0.5.

32 Data = −5 below 50 and Data = 5 above 50. Conditional prior variance is 0.005.

