
1 Markov random field: A brief introduction. Tzu-Cheng Jen, Institute of Electronics, NCTU, 2007-03-28

2 Outline
1. Neighborhood system and cliques
2. Markov random field
3. Optimization-based vision problem
4. Solver for the optimization problem

3 Neighborhood system and cliques

4 Prior knowledge To explain the concept of an MRF, we first introduce the following definitions:
1. i: a site (pixel)
2. N_i: the set of sites neighboring i
3. S: the set of sites (the image)
4. f_i: the value at site i (its intensity)
[Figure: a 3x3 image whose sites are labeled f_1, f_2, f_3, f_4, f_i, f_6, f_7, f_8, f_9]

5 Neighborhood system The sites in S are related to one another via a neighborhood system, defined for S as N = {N_i | for all i in S}, where N_i is the set of sites neighboring i. The neighboring relationship has the following properties: (1) a site is not a neighbor of itself (i is not in N_i); (2) the relationship is mutual (i' is in N_i if and only if i is in N_{i'}). [Figure: the 3x3 image grid f_1 ... f_9]

6 Neighborhood system: Example [Figures: the first-order (4-connected) neighborhood system, the second-order (8-connected) neighborhood system, and the nth-order neighborhood system on a regular image grid]
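As a concrete illustration (my own addition, not from the slides), here is a minimal sketch that enumerates the first- and second-order neighbors of a pixel on a regular grid; the function and constant names are illustrative.

```python
# Enumerate the neighbors of a pixel on an H x W grid.
# First order: 4-connected; second order: 8-connected.
FIRST_ORDER = [(-1, 0), (1, 0), (0, -1), (0, 1)]
SECOND_ORDER = FIRST_ORDER + [(-1, -1), (-1, 1), (1, -1), (1, 1)]

def neighbors(i, j, height, width, offsets):
    """Return the in-bounds neighbors of site (i, j) for the given offsets."""
    return [(i + di, j + dj)
            for di, dj in offsets
            if 0 <= i + di < height and 0 <= j + dj < width]

# Example: neighbors of the center site of a 3x3 image.
print(neighbors(1, 1, 3, 3, FIRST_ORDER))   # 4 sites
print(neighbors(1, 1, 3, 3, SECOND_ORDER))  # 8 sites
```

Note how border sites get fewer neighbors, which is why the definition is phrased per site rather than as a fixed stencil.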

7 Neighborhood system: Example [Figure: an example arrangement of sites] The neighboring sites of site i are m, n, and f; the neighboring sites of site j are r and x.

8 Clique A clique C is a subset of sites in S in which every pair of distinct sites are neighbors of each other (single-site subsets also count as cliques). Some examples follow.

9 Clique: Example Take the first-order and second-order neighborhood systems as examples: [Table: neighborhood system vs. clique types; the first-order system yields single-site cliques and horizontal/vertical pair cliques, while the second-order system adds diagonal pairs, triples, and 2x2 quadruples]

10 Markov random field

11 Markov random field (MRF) View the 2D image f as a collection of random variables (a random field). A random field is said to be a Markov random field if it satisfies the following properties: (1) positivity, P(f) > 0 for every configuration f; (2) Markovianity, P(f_i | f_{S-{i}}) = P(f_i | f_{N_i}), i.e., the value at a site depends on the rest of the image only through its neighbors. [Figure: the image configuration f as the 3x3 grid f_1 ... f_9]

12 Gibbs random field (GRF) and Gibbs distribution A random field is said to be a Gibbs random field if and only if its configuration f obeys a Gibbs distribution, that is:
P(f) = Z^{-1} exp(-U(f)/T), with Z = Σ_f exp(-U(f)/T) and U(f) = Σ_{c in C} V_c(f)
U(f): energy function; T: temperature; V_c(f): clique potential. U is designed differently for different applications. [Figure: the image configuration f as the 3x3 grid f_1 ... f_9]
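A minimal numeric sketch (my own addition, not from the slides): for a tiny binary field we can evaluate the Gibbs distribution by brute force. The pair potential below, which rewards equal neighboring labels, is purely illustrative.

```python
import itertools, math

# Sites of a 2x2 binary field; pair cliques are the 4-connected edges.
SITES = [(0, 0), (0, 1), (1, 0), (1, 1)]
CLIQUES = [((0, 0), (0, 1)), ((1, 0), (1, 1)),
           ((0, 0), (1, 0)), ((0, 1), (1, 1))]
T = 1.0  # temperature

def V(fa, fb):
    """Illustrative pair potential V_c: reward equal neighboring labels."""
    return -1.0 if fa == fb else 1.0

def U(f):
    """Energy function: sum of clique potentials."""
    return sum(V(f[a], f[b]) for a, b in CLIQUES)

# Brute-force partition function Z over all 2^4 configurations.
configs = [dict(zip(SITES, bits)) for bits in itertools.product([0, 1], repeat=4)]
Z = sum(math.exp(-U(f) / T) for f in configs)

for f in configs:
    print(f, math.exp(-U(f) / T) / Z)  # P(f) = exp(-U(f)/T) / Z
```

The brute-force Z is only feasible for toy fields; for real images this intractability is exactly why the optimization methods later in the talk are needed.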

13 Markov-Gibbs equivalence Hammersley-Clifford theorem: a random field F is an MRF if and only if F is a GRF. Proof (<=): let P(f) be a Gibbs distribution on S with the neighborhood system N. [Figure: the 3x3 image grid f_1 ... f_9]

14 Markov-Gibbs equivalence Divide the clique set C into two sets A and B, with A consisting of the cliques containing i and B of the cliques not containing i. [Figure: the 3x3 image grid f_1 ... f_9]
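To fill in the algebra that the original slide carried as an image, the standard step (in the notation of slides 12-14, with f' denoting a configuration that agrees with f everywhere except possibly at site i) is:

```latex
P(f_i \mid f_{S \setminus \{i\}})
  = \frac{P(f)}{\sum_{f_i'} P(f')}
  = \frac{\prod_{c \in A} e^{-V_c(f)/T} \prod_{c \in B} e^{-V_c(f)/T}}
         {\sum_{f_i'} \prod_{c \in A} e^{-V_c(f')/T} \prod_{c \in B} e^{-V_c(f')/T}}
  = \frac{\prod_{c \in A} e^{-V_c(f)/T}}{\sum_{f_i'} \prod_{c \in A} e^{-V_c(f')/T}}
```

The factors over B cancel because cliques in B do not contain i, so V_c(f') = V_c(f) for them; the remaining expression depends only on f_i and its neighbors, which is exactly the Markovianity property of slide 11.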

15 Optimization-based vision problem

16 Denoising [Figure: a noisy signal d and the corresponding denoised signal f]

17 MAP formulation for the denoising problem Signal denoising can be modeled as a MAP estimation problem, that is,
f* = argmax_f P(f | d) = argmax_f P(d | f) P(f)
where P(f) is the prior model and P(d | f) is the observation model.

18 MAP formulation for the denoising problem Assume the observation is the true signal plus independent Gaussian noise, that is, d_i = f_i + e_i with e_i ~ N(0, σ^2). Under this assumption the observation model can be expressed as
P(d | f) ∝ exp(-U(d | f)), where U(d | f) = Σ_i (f_i - d_i)^2 / (2σ^2) is the likelihood energy.

19 MAP formulation for the denoising problem Assume the unknown data f is an MRF, so the prior model is P(f) = Z^{-1} exp(-U(f)/T). Taking T = 1 for simplicity, the posterior probability becomes P(f | d) ∝ exp(-(U(d | f) + U(f))).

20 MAP formulation for the denoising problem The MAP estimator for the problem is therefore
f* = argmax_f P(f | d) = argmin_f { U(d | f) + U(f) }
which leaves the question of how to choose the prior energy U(f).

21 MAP formulation for the denoising problem Define the smoothness prior U(f) = λ Σ_i (f_{i+1} - f_i)^2. Substituting this into the MAP estimator, we get:
f* = argmin_f { Σ_i (f_i - d_i)^2 / (2σ^2) + λ Σ_i (f_{i+1} - f_i)^2 }
where the first term comes from the observation model (a similarity measure) and the second from the prior model (a reconstruction constraint).
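A minimal sketch of this estimator (my own addition, not from the slides): the quadratic energy above is minimized by plain gradient descent for a 1D signal; σ, λ, the step size, and the iteration count are all illustrative.

```python
import numpy as np

def denoise_map(d, sigma=1.0, lam=0.5, step=0.1, iters=500):
    """Minimize E(f) = sum((f-d)^2)/(2*sigma^2) + lam*sum((f[i+1]-f[i])^2)."""
    f = d.copy()
    for _ in range(iters):
        grad_data = (f - d) / sigma**2      # gradient of the likelihood energy
        diff = np.diff(f)                   # diff[i] = f[i+1] - f[i]
        grad_prior = np.zeros_like(f)
        grad_prior[:-1] -= 2 * lam * diff   # d/df[i]   of (f[i+1]-f[i])^2
        grad_prior[1:] += 2 * lam * diff    # d/df[i+1] of (f[i+1]-f[i])^2
        f -= step * (grad_data + grad_prior)
    return f

# Usage: denoise a noisy step signal.
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])
noisy = clean + 0.2 * rng.standard_normal(100)
print(np.abs(denoise_map(noisy) - clean).mean())
```

Because both energy terms are quadratic, this energy is convex and gradient descent reaches the global minimum here; that is not true of MRF energies in general, which motivates the solvers in the last part of the talk.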

22 Super-resolution Super-Resolution (SR): a method to reconstruct high-resolution images/videos from low-resolution images/videos.

23 Super-resolution [Figure: illustration of super-resolution; the low-resolution frames d^(1), d^(2), d^(3), d^(4) are used to reconstruct the high-resolution frame f^(1)]

24 MAP formulation for the super-resolution problem Super-resolution can be modeled as a MAP estimation problem, that is,
f* = argmax_f P(f | d^(1), ..., d^(K)) = argmax_f P(d^(1), ..., d^(K) | f) P(f)
where P(f) is the prior model and the conditional PDF is the observation model.

25 MAP formulation for the super-resolution problem The conditional PDF can be modeled as a Gaussian distribution if the noise source is Gaussian. We also assume that the prior model is a joint Gaussian distribution.

26 MAP formulation for the super-resolution problem Substituting the above relations into the MAP estimator, we get an energy of the form f* = argmin_f { U(d | f) + U(f) }, combining the observation model and the prior model.
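The slide's final expression did not survive extraction; under the stated Gaussian assumptions, a typical form (with W^(k) the warp/blur/downsampling operator for frame k and Q the inverse covariance of the Gaussian prior, both my notation, not necessarily the authors') would be:

```latex
f^* = \arg\min_f \left\{ \sum_{k=1}^{K}
        \frac{\lVert d^{(k)} - W^{(k)} f \rVert^2}{2\sigma^2}
      + \frac{1}{2}\, f^{\top} Q f \right\}
```

The sum over frames is the observation-model term and the quadratic form is the prior-model term.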

27 Solver for the optimization problem

28 The solver of the optimization problem In this section, we introduce different approaches for solving the optimization problem:
1. Brute-force search (global extremum)
2. Gradient descent search (usually a local extremum)
3. Genetic algorithm (global extremum)
4. Simulated annealing algorithm (global extremum)

29 Gradient descent algorithm (1)

30 Gradient descent algorithm (2)
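The update equations on these two slides were lost in extraction; what they describe is, in standard form, the iteration

```latex
f^{(t+1)} = f^{(t)} - \alpha \, \nabla E\big(f^{(t)}\big), \qquad \alpha > 0
```

applied to the MAP energy E(f) = U(d | f) + U(f), repeated until the change in E (or the gradient norm) falls below a tolerance; α is the step size.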

31 Simulation: SR by gradient descent algorithm Use six low-resolution frames (a)-(f) to reconstruct the high-resolution frame (g).

32 Simulation: SR by gradient descent algorithm

33 The problem of the gradient descent algorithm The gradient descent algorithm may be trapped in a local extremum instead of reaching the global extremum.

34 Genetic algorithm (GA) The GA includes the following steps [the slide's step diagram is not recoverable; a generic sketch follows]:
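Below is a minimal, generic sketch of the usual GA loop (selection, crossover, mutation), not the authors' specific variant; all names and parameters are illustrative.

```python
import random

def genetic_minimize(energy, n_bits=16, pop_size=30, generations=100,
                     p_cross=0.8, p_mut=0.02):
    """Generic GA loop: tournament selection -> crossover -> mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: keep the fitter of two random individuals.
        selected = [min(random.sample(pop, 2), key=energy) for _ in range(pop_size)]
        # One-point crossover on consecutive pairs.
        children = []
        for a, b in zip(selected[::2], selected[1::2]):
            if random.random() < p_cross:
                cut = random.randrange(1, n_bits)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            children += [a, b]
        # Bit-flip mutation.
        pop = [[bit ^ (random.random() < p_mut) for bit in c] for c in children]
    return min(pop, key=energy)

# Usage: minimize the number of 1 bits (a trivial illustrative energy).
best = genetic_minimize(energy=sum)
print(best, sum(best))
```

For an MRF problem, the bit string would encode an image configuration f and `energy` would be the MAP energy U(d | f) + U(f).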

35 Simulated annealing (SA) The SA includes the following steps [the slide's step diagram is not recoverable; a generic sketch follows]:
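As with the GA slide, this is a minimal sketch of the standard SA loop with Metropolis acceptance and a geometric cooling schedule; all names and parameters are illustrative.

```python
import math, random

def simulated_annealing(energy, propose, x0, t0=1.0, cooling=0.99, iters=5000):
    """Metropolis acceptance with a geometric cooling schedule."""
    x, e, t = x0, energy(x0), t0
    best, best_e = x, e
    for _ in range(iters):
        x_new = propose(x)
        e_new = energy(x_new)
        # Always accept downhill moves; accept uphill moves with prob exp(-dE/T).
        if e_new < e or random.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
            if e < best_e:
                best, best_e = x, e
        t *= cooling  # cool down
    return best

# Usage: minimize a 1D energy with two local minima.
f = lambda x: (x**2 - 1)**2 + 0.3 * x
print(simulated_annealing(f, lambda x: x + random.gauss(0, 0.5), x0=2.0))
```

The occasional uphill moves are what let SA escape the local extrema that trap gradient descent (slide 33); as T decreases, the chain settles into a low-energy configuration.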