Markov Random Fields: Probabilistic Models for Images


Markov Random Fields: Probabilistic Models for Images
Applications in Image Segmentation and Texture Modeling
Ying Nian Wu, UCLA Department of Statistics
IPAM, July 22, 2013

Outline
Basic concepts, properties, examples
Markov chain Monte Carlo sampling
Modeling textures and objects
Application in image segmentation

Markov Chains
Markov property: Pr(future | present, past) = Pr(future | present)
i.e., future is conditionally independent of the past given the present (limited dependence).
This limited dependence is what makes modeling and learning possible.
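The Markov property can be made concrete with a tiny simulation. This is a minimal sketch (the transition matrix P and the helper name sample_chain are illustrative, not from the slides): each step draws the next state using only the current state, never the earlier history.

```python
import numpy as np

# A 2-state Markov chain. Row i of P is the conditional distribution
# Pr(next state | current state = i) -- the only thing a step looks at.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

def sample_chain(P, n_steps, start=0, rng=None):
    """Sample a trajectory; each step depends only on the current state."""
    rng = np.random.default_rng(rng)
    states = [start]
    for _ in range(n_steps):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return states

chain = sample_chain(P, n_steps=1000, start=0, rng=0)
```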

Markov Chains (higher order)
Temporal data has a natural ordering; a 2D image has no natural ordering — this motivates Markov random fields.

Markov Random Fields
Markov property: Pr(pixel | all the other pixels) = Pr(pixel | its neighbors)
Nearest neighborhood (first-order neighborhood): the pixels directly above, below, left, and right.
From slides by S. Seitz, University of Washington

Markov Random Fields
Second-order neighborhood: the eight surrounding pixels, including the diagonals.

Markov Random Fields
Can be generalized to any undirected graph (nodes, edges).
Neighborhood system: each node is connected to its neighbors, and the neighbor relation is reciprocal.
Markov property: each node depends only on its neighbors.
Note: the black lines on the left graph illustrate the 2D grid of image pixels; they are not edges of the graph, unlike the blue lines on the right.

Markov Random Fields
What is the joint distribution implied by these local Markov properties?

Hammersley-Clifford Theorem
p(X) = (1/Z) exp( -∑_c V_c(X_c) )
Z: normalizing constant (partition function); V_c: potential functions of cliques
Cliques for this neighborhood
From slides by S. Seitz, University of Washington

Hammersley-Clifford Theorem
Gibbs distribution. A clique is a set of pixels in which each member is a neighbor of every other member.
Cliques for this neighborhood
From slides by S. Seitz, University of Washington

Hammersley-Clifford Theorem
Gibbs distribution. A clique is a set of pixels in which each member is a neighbor of every other member.
Cliques for this neighborhood: ...etc. Note: the black lines illustrate the 2D grid; they are not edges in the graph.
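A Gibbs distribution over a first-order grid can be sketched in a few lines. This is an illustrative implementation (the function name gibbs_energy and the agreement potential are assumptions, not from the slides): the cliques of the first-order neighborhood are single pixels and horizontally/vertically adjacent pairs, and here only pair potentials are used.

```python
import numpy as np

def gibbs_energy(x, beta=1.0):
    """Energy of a pairwise MRF on the 4-neighborhood grid:
    E(x) = -beta * (number of neighboring pairs with equal labels).
    The Gibbs distribution is then p(x) = (1/Z) * exp(-E(x))."""
    horiz = np.sum(x[:, :-1] == x[:, 1:])   # horizontal neighbor pairs
    vert = np.sum(x[:-1, :] == x[1:, :])    # vertical neighbor pairs
    return -beta * float(horiz + vert)

x = np.array([[0, 0, 1],
              [0, 1, 1]])
E = gibbs_energy(x)
# The unnormalized probability is exp(-E); the partition function Z would
# sum exp(-E) over all 2^6 label images -- intractable for large grids.
```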

Ising model Cliques for this neighborhood From Slides by S. Seitz - University of Washington

Ising model
p(X) ∝ exp( β ∑_{s~t} x_s x_t ), x_s ∈ {-1, +1} (pair potential)
Challenge: show that the conditional distribution of each pixel given its neighbors is an auto-logistic regression.
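The auto-logistic form of the Ising conditional can be verified directly: Pr(x_s = +1 | neighbors) = 1 / (1 + exp(-2β · neighbor sum)), a logistic function of the neighbor sum. A minimal sketch (the name ising_conditional is illustrative):

```python
import numpy as np

def ising_conditional(x, i, j, beta=0.5):
    """Pr(x[i,j] = +1 | neighbors) for the Ising model with spins in {-1,+1}.
    Since Pr(+1)/Pr(-1) = exp(2*beta*s) for neighbor sum s, the conditional
    is a logistic function of s -- the 'auto-logistic regression' view."""
    h, w = x.shape
    s = sum(x[a, b] for a, b in [(i-1, j), (i+1, j), (i, j-1), (i, j+1)]
            if 0 <= a < h and 0 <= b < w)
    return 1.0 / (1.0 + np.exp(-2.0 * beta * s))

x = np.ones((3, 3), dtype=int)             # all spins +1
p = ising_conditional(x, 1, 1, beta=0.5)   # center pixel, 4 agreeing neighbors
```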

Gaussian MRF model
Continuous values with a quadratic pair potential, e.g. p(X) ∝ exp( -(β/2) ∑_{s~t} (x_s - x_t)² ).
Challenge: show that the conditional distribution of each pixel given its neighbors is an auto-regression (a Gaussian centered at the neighbor average).
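For the quadratic pair potential the conditional works out in closed form: collecting the terms in x_s gives a Gaussian with mean equal to the neighbor average and precision β times the number of neighbors. A sketch under those assumptions (gmrf_conditional is an illustrative name):

```python
import numpy as np

def gmrf_conditional(x, i, j, beta=1.0):
    """Conditional of x[i,j] given its neighbors in a Gaussian MRF with
    pair potential (beta/2)*(x_s - x_t)^2: completing the square gives a
    Gaussian with mean = neighbor average, variance = 1/(beta * n_neighbors),
    i.e. an auto-regression on the neighbors."""
    h, w = x.shape
    nbrs = [x[a, b] for a, b in [(i-1, j), (i+1, j), (i, j-1), (i, j+1)]
            if 0 <= a < h and 0 <= b < w]
    mean = float(np.mean(nbrs))
    var = 1.0 / (beta * len(nbrs))
    return mean, var

x = np.arange(9, dtype=float).reshape(3, 3)
mean, var = gmrf_conditional(x, 1, 1, beta=1.0)
```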

Sampling from MRF Models
Markov chain Monte Carlo (MCMC):
Gibbs sampler (Geman & Geman 1984)
Metropolis algorithm (Metropolis et al. 1953)
Swendsen & Wang (1987)
Hybrid (Hamiltonian) Monte Carlo

Gibbs Sampler
Repeat:
Randomly pick a pixel s
Sample x_s from Pr(x_s | current values of all the other pixels) — a simple one-dimensional distribution

Gibbs sampler for Ising model
Challenge: use the auto-logistic conditional to sample from the Ising model.
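Putting the two previous slides together, a Gibbs sampler for the Ising model sweeps over the pixels and resamples each one from its auto-logistic conditional. A minimal sketch (function name and sweep schedule are illustrative choices):

```python
import numpy as np

def gibbs_ising(h, w, beta=0.6, n_sweeps=20, rng=None):
    """Gibbs sampler for the Ising model on an h-by-w grid, spins in {-1,+1}.
    Each sweep visits every pixel and resamples it from its conditional
    Pr(x_s = +1 | neighbors) = 1 / (1 + exp(-2*beta*neighbor_sum))."""
    rng = np.random.default_rng(rng)
    x = rng.choice([-1, 1], size=(h, w))
    for _ in range(n_sweeps):
        for i in range(h):
            for j in range(w):
                s = sum(x[a, b] for a, b in
                        [(i-1, j), (i+1, j), (i, j-1), (i, j+1)]
                        if 0 <= a < h and 0 <= b < w)
                p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))
                x[i, j] = 1 if rng.random() < p_plus else -1
    return x

sample = gibbs_ising(16, 16, beta=0.6, n_sweeps=20, rng=0)
```

For large β the sampled images develop large same-spin patches, which is why the Ising model is a useful smoothness prior.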

Metropolis Algorithm
Let E(I) be the energy function, p(I) ∝ exp(-E(I)).
Repeat:
Proposal: perturb I to J by sampling from a symmetric kernel K(I, J) = K(J, I)
If E(J) ≤ E(I), change I to J; otherwise change I to J with probability exp( -(E(J) - E(I)) )

Metropolis for Ising model
Proposal: randomly pick a pixel and flip it.
Challenge: use this proposal to sample from the Ising model.
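With the single-pixel-flip proposal, the energy change is local: flipping x[i,j] changes E(x) = -β ∑_{s~t} x_s x_t by ΔE = 2β x[i,j] · (neighbor sum). A minimal sketch under those assumptions (metropolis_ising is an illustrative name):

```python
import numpy as np

def metropolis_ising(h, w, beta=0.6, n_steps=5000, rng=None):
    """Metropolis sampler for the Ising model: propose flipping one random
    pixel, accept with probability min(1, exp(-(E(J) - E(I)))).
    Only the flipped pixel's neighborhood enters delta_E, so each step is O(1)."""
    rng = np.random.default_rng(rng)
    x = rng.choice([-1, 1], size=(h, w))
    for _ in range(n_steps):
        i, j = rng.integers(h), rng.integers(w)
        s = sum(x[a, b] for a, b in
                [(i-1, j), (i+1, j), (i, j-1), (i, j+1)]
                if 0 <= a < h and 0 <= b < w)
        delta_E = 2.0 * beta * x[i, j] * s
        if delta_E <= 0 or rng.random() < np.exp(-delta_E):
            x[i, j] = -x[i, j]   # accept the flip
    return x

sample = metropolis_ising(16, 16, beta=0.6, n_steps=5000, rng=0)
```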

Modeling Images by MRF
Ising model; hidden variables, layers, RBM
Exponential family model (log-linear model, maximum entropy model):
p(I; λ) = (1/Z(λ)) exp( ∑_k λ_k F_k(I) ) q(I)
λ_k: unknown parameters; F_k: features (may also need to be learned); q: reference distribution

Modeling Images by MRF
Given training images, how to estimate the parameters?
Maximum likelihood
Pseudo-likelihood (Besag 1975)
Contrastive divergence (Hinton)

Maximum likelihood
Given observed image(s), the log-likelihood gradient is
∂ log p(I_obs; λ) / ∂λ_k = F_k(I_obs) − E_λ[ F_k(I) ]
Challenge: prove it (differentiate log Z(λ)).

Stochastic Gradient
Given observed image(s), generate synthesized images from the current model (e.g. by MCMC), then update
λ_k ← λ_k + η ( F_k(observed) − average of F_k over synthesized images )
Analysis by synthesis
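For the Ising model this analysis-by-synthesis loop is only a few lines: the sufficient statistic is the neighbor-pair sum, and each iteration nudges β toward matching the observed statistic with the statistic of a synthesized image. This sketch uses a persistent Gibbs chain for synthesis (in the spirit of contrastive divergence); the names fit_beta and neighbor_sum_stat, and the step size, are illustrative assumptions.

```python
import numpy as np

def neighbor_sum_stat(x):
    """Ising sufficient statistic: sum of x_s * x_t over neighbor pairs."""
    return float(np.sum(x[:, :-1] * x[:, 1:]) + np.sum(x[:-1, :] * x[1:, :]))

def gibbs_sweep(x, beta, rng):
    """One Gibbs sweep over all pixels at the current beta."""
    h, w = x.shape
    for i in range(h):
        for j in range(w):
            s = sum(x[a, b] for a, b in [(i-1, j), (i+1, j), (i, j-1), (i, j+1)]
                    if 0 <= a < h and 0 <= b < w)
            x[i, j] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * s)) else -1
    return x

def fit_beta(observed, n_iters=50, lr=0.001, rng=None):
    """Stochastic gradient / analysis by synthesis:
    beta += lr * (F(observed) - F(synthesized))."""
    rng = np.random.default_rng(rng)
    beta = 0.0
    x = rng.choice([-1, 1], size=observed.shape)  # persistent synthesis chain
    f_obs = neighbor_sum_stat(observed)
    for _ in range(n_iters):
        x = gibbs_sweep(x, beta, rng)
        beta += lr * (f_obs - neighbor_sum_stat(x))
    return beta

observed = np.ones((8, 8), dtype=int)   # a perfectly smooth "training image"
beta_hat = fit_beta(observed, rng=0)    # should be pushed above zero
```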

Texture Modeling

MRF for Image Segmentation
Modeling image pixel labels as an MRF (Ising)
Bayesian posterior: p(label image | real image) ∝ p(real image | label image) p(label image)
Slides by R. Huang, Rutgers University

Model joint probability: p(x, y) = (1/Z) ∏_s Φ(x_s, y_s) ∏_{s~t} Ψ(x_s, x_t)
x: region labels (label image); y: image pixels (local observations); Φ, Ψ carry the model parameters
Φ: image-label compatibility function enforcing the data constraint (label node vs. local observation)
Ψ: label-label compatibility function enforcing the smoothness constraint (neighboring label nodes)
Slides by R. Huang, Rutgers University

MRF for Image Segmentation Slides by R. Huang – Rutgers University

Inference in MRFs
Classical: Gibbs sampling, simulated annealing; iterated conditional modes (ICM)
State of the art: graph cuts, belief propagation, linear programming, tree-reweighted message passing
Slides by R. Huang, Rutgers University
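The simplest of these, iterated conditional modes, can be sketched for the segmentation model above: each pixel's label is set to the value minimizing a Gaussian data term plus an Ising-style smoothness term, holding all other labels fixed. Everything here (function name, class means mu, the specific costs) is an illustrative assumption, not the slides' implementation.

```python
import numpy as np

def icm_segment(y, mu=(0.0, 1.0), sigma=0.3, beta=1.5, n_iters=5):
    """Iterated conditional modes for binary segmentation.
    Label cost = (y - mu[label])^2 / (2*sigma^2)            (data term, Phi)
               + beta * (# disagreeing neighbors)           (smoothness, Psi).
    Greedily minimizes the posterior energy pixel by pixel."""
    h, w = y.shape
    x = (y > np.mean(mu)).astype(int)        # initialize from the data term
    for _ in range(n_iters):
        for i in range(h):
            for j in range(w):
                costs = []
                for lab in (0, 1):
                    data = (y[i, j] - mu[lab]) ** 2 / (2 * sigma ** 2)
                    smooth = beta * sum(x[a, b] != lab for a, b in
                                        [(i-1, j), (i+1, j), (i, j-1), (i, j+1)]
                                        if 0 <= a < h and 0 <= b < w)
                    costs.append(data + smooth)
                x[i, j] = int(np.argmin(costs))
    return x

# Noisy two-region image: left half near 0, right half near 1.
rng = np.random.default_rng(0)
y = np.hstack([np.zeros((8, 4)), np.ones((8, 4))]) + 0.2 * rng.standard_normal((8, 8))
labels = icm_segment(y)
```

ICM converges quickly but only to a local minimum; graph cuts or belief propagation would optimize this same energy globally (exactly, in the binary submodular case).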

Summary
MRF, Gibbs distribution
Gibbs sampler, Metropolis algorithm
Exponential family model