A Trainable Graph Combination Scheme for Belief Propagation Kai Ju Liu New York University.

Presentation transcript:

A Trainable Graph Combination Scheme for Belief Propagation Kai Ju Liu New York University

Images

Pairwise Markov Random Field Basic structure: vertices, edges

Pairwise Markov Random Field Basic structure: vertices, edges Vertex i has a set of possible states X_i and an observed value y_i Compatibility between states and observed values: φ_i(x_i, y_i) Compatibility between neighboring vertices i and j: ψ_ij(x_i, x_j)

Pairwise MRF: Probabilities Joint probability: P(x) = (1/Z) ∏_i φ_i(x_i, y_i) ∏_(i,j) ψ_ij(x_i, x_j) Marginal probability: p_i(x_i) = Σ_{all x_k, k ≠ i} P(x) –Advantage: allows averaging over ambiguous states –Disadvantage: complexity exponential in the number of vertices
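The exponential cost of exact marginals can be seen directly by brute-force enumeration. Below is a minimal sketch on a toy 3-vertex chain with arbitrary, illustrative compatibility values (the names `phi` and `psi` follow the slides' notation; the numbers are made up for demonstration):

```python
import itertools
import numpy as np

# Toy pairwise MRF: 3 vertices in a chain, S = 2 states each.
# phi[i] is the vertex compatibility (observation y_i already folded in);
# psi[(i, j)] is the edge compatibility. All values are illustrative.
S = 2
phi = {0: np.array([0.9, 0.1]),
       1: np.array([0.4, 0.6]),
       2: np.array([0.7, 0.3])}
psi = {(0, 1): np.array([[0.8, 0.2], [0.2, 0.8]]),
       (1, 2): np.array([[0.8, 0.2], [0.2, 0.8]])}

def joint(x):
    """Unnormalized joint: product of all vertex and edge compatibilities."""
    p = 1.0
    for i, xi in enumerate(x):
        p *= phi[i][xi]
    for (i, j), m in psi.items():
        p *= m[x[i], x[j]]
    return p

# Brute-force marginal of vertex 0: sum the joint over all other vertices.
# The sum ranges over S^n configurations -- exponential in n.
Z = sum(joint(x) for x in itertools.product(range(S), repeat=3))
marg0 = np.array([sum(joint((a, b, c))
                      for b in range(S) for c in range(S))
                  for a in range(S)]) / Z
```

For a grid of image pixels this enumeration is hopeless, which is what motivates the message-passing schemes on the following slides.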

Belief Propagation

Beliefs replace probabilities: b_i(x_i) ∝ φ_i(x_i, y_i) ∏_{k∈N(i)} m_ki(x_i) Messages propagate information: m_ij(x_j) = Σ_{x_i} φ_i(x_i, y_i) ψ_ij(x_i, x_j) ∏_{k∈N(i)\j} m_ki(x_i)
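A minimal sketch of these two update rules on a 3-vertex chain (same φ/ψ notation as the slides; the numeric values are illustrative, and on a chain the messages can be written as matrix-vector products):

```python
import numpy as np

# 3-vertex chain with 2 states; phi folds the observation y_i into the vertex term.
phi = [np.array([0.9, 0.1]), np.array([0.4, 0.6]), np.array([0.7, 0.3])]
psi = np.array([[0.8, 0.2], [0.2, 0.8]])  # shared edge compatibility

# Message rule m_ij(x_j) = sum_{x_i} phi_i(x_i) psi(x_i, x_j) * (incoming messages):
m_21 = psi.T @ phi[2]             # leaf vertex 2 -> 1 (no other incoming messages)
m_10 = psi.T @ (phi[1] * m_21)    # 1 -> 0, folding in the message from 2

# Belief rule: local compatibility times all incoming messages, normalized.
b0 = phi[0] * m_10
b0 /= b0.sum()
```

Because this graph is a tree, the belief `b0` computed this way equals the exact marginal of vertex 0, which is the point of the next slide.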

Belief Propagation Example

BP: Questions When can we calculate beliefs exactly? When do beliefs equal probabilities? When is belief propagation efficient? Answer: singly-connected graphs (SCGs) –Graphs without loops –Messages terminate at leaf vertices –Beliefs equal probabilities –Complexity in the previous example reduced from 13S^5 to 24S^2

BP on Loopy Graphs Messages do not terminate Energy approximation schemes [Freeman et al.]: –Standard belief propagation –Generalized belief propagation Standard belief propagation: –Approximates the Gibbs free energy of the system by the Bethe free energy –Iterates, requiring convergence criteria

BP on Loopy Graphs Tree-based reparameterization [Wainwright] –Reparameterizes distributions on singly-connected graphs –Convergence improved compared to standard belief propagation –Permits calculation of bounds on approximation errors

BP-TwoGraphs Eliminates iteration Utilizes the advantages of SCGs

BP-TwoGraphs –Consider a loopy graph with n vertices –Select two sets of SCGs that approximate the graph –Calculate beliefs on each set of SCGs –Select the set of beliefs with minimum entropy

BP-TwoGraphs on Images Rectangular grid of pixel vertices H_i: horizontal graphs G_i: vertical graphs (figures: original graph, horizontal graph, vertical graph)

Image Segmentation (figures: original image → add noise → segment)

Image Segmentation Results

Image Segmentation Revisited (figures: ground truth, added noise, max-flow result, ground truth)

Image Segmentation: Horizontal Graph Analysis

Image Segmentation: Vertical Graph Analysis

BP-TwoLines Rectangular grid of pixel vertices H_i: horizontal lines G_i: vertical lines (figures: original graph, horizontal line, vertical line)

Image Segmentation Results II

Image Segmentation Results III

Natural Image Segmentation

Boundary-Based Image Segmentation: Window Vertices Square 2-by-2 window of pixels Each pixel has two states –foreground –background

Boundary-Based Image Segmentation: Overlap

Boundary-Based Image Segmentation: Graph

Real Image Segmentation: Training

Real Image Segmentation: Results

Real Image Segmentation: Gorilla Results

Conclusion BP-TwoGraphs –Accurate and efficient –Extensive use of beliefs –Trainable parameters Future work –Multiple states –Stereo –Image fusion