Understanding Belief Propagation and its Applications
Dan Yuan, June 2004


Outline
- Motivation
- Rationale
- Applications

Probabilistic Inference
- Directed graph: Bayesian network
- Undirected graph: Markov random field
- NP-hard problem: computing the a posteriori beliefs of random variables in either type of graph

Solutions: Approximate Inference
- MCMC sampling
- Belief propagation

Parameterization and Conditioning in an Undirected Graph
The joint probability factors over the links of the graph:
P(x) = (1/Z) * Π_{(i,j)} ψ_ij(x_i, x_j),
where Z is a normalizing constant. Each link between two neighboring nodes carries a cost called the compatibility; we assume only pairwise compatibilities between nodes. For the example graph on the slide, P can be thought of as factoring into five multiplicative potential functions.
[Figure: tree-structured graph with hidden nodes A, B, C and evidence nodes E_A, E_B, E_C]
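As a concrete sketch of this factorization, the snippet below builds a tiny pairwise MRF on a three-node chain and computes Z by brute-force enumeration. The potential values, node term, and edge list are illustrative assumptions, not taken from the slides.

```python
import itertools
import numpy as np

# A tiny pairwise MRF on a 3-node chain; all potential values below
# are illustrative choices, not from the slides.
psi = np.array([[1.0, 0.5],
                [0.5, 2.0]])       # pairwise compatibility psi(x_i, x_j)
phi = np.array([1.0, 1.5])         # per-node evidence term (same at every node here)
edges = [(0, 1), (1, 2)]

def unnormalized_p(x):
    """Product of node terms and pairwise compatibilities for one configuration."""
    p = np.prod([phi[xi] for xi in x])
    for i, j in edges:
        p *= psi[x[i], x[j]]
    return p

# Z sums the unnormalized measure over all 2^3 configurations, so the
# normalized joint probabilities sum to exactly 1.
configs = list(itertools.product([0, 1], repeat=3))
Z = sum(unnormalized_p(x) for x in configs)
joint = {x: unnormalized_p(x) / Z for x in configs}
print(abs(sum(joint.values()) - 1.0) < 1e-12)   # True
```

Enumerating all configurations is only feasible for toy graphs; avoiding this exponential sum is exactly what belief propagation is for.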

Parameterization and Conditioning in an Undirected Graph with a Loop
Formulation: the same pairwise factorization
P(x) = (1/Z) * Π_{(i,j)} ψ_ij(x_i, x_j)
still applies, but the hidden nodes A, B, C now form a cycle.
[Figure: loopy graph with nodes A, B, C connected in a cycle, each attached to an evidence node E_A, E_B, E_C]
Why do we care about loopy graphs?

Probability Propagation
The max-product message update is
m_ij(x_j) = α max_{x_i} ψ_ij(x_i, x_j) Π_{k ∈ N(i)\j} m_ki(x_i),
where α denotes a normalizing constant and N(i)\j means all nodes neighboring i except j.
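A minimal sketch of this update rule; the function name, potentials, and incoming message are illustrative, not from the slides.

```python
import numpy as np

def max_product_message(psi_ij, phi_i, incoming):
    """One max-product message m_{i->j}(x_j): maximize over x_i the
    product of node i's local term phi_i(x_i), the pairwise
    compatibility psi_ij[x_i, x_j], and all messages reaching i
    except the one from j (those are passed in as `incoming`)."""
    prod = phi_i * np.prod(incoming, axis=0) if len(incoming) else phi_i
    m = np.max(psi_ij * prod[:, None], axis=0)   # maximize over sender state x_i
    return m / m.sum()                           # the normalizing constant alpha

# Example: binary variables, one incoming message.
psi = np.array([[2.0, 1.0], [1.0, 2.0]])
m = max_product_message(psi, np.array([1.0, 3.0]), [np.array([0.5, 0.5])])
print(m)   # [0.333..., 0.666...]
```

Normalizing each message is optional in exact arithmetic but keeps the numbers well scaled over many iterations.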

Probability Propagation (Cont'd)
On a singly connected (tree-structured) graph:
1. The algorithm converges to a unique fixed belief in a finite number of iterations, regardless of initial conditions.
2. At convergence, the belief for any value x_i of node i is the maximum of the posterior over all other variables, conditioned on that node having that value:
b_i(x_i) = max_{x': x'_i = x_i} P(x' | y).
3. Define the max-product assignment by x*_i = arg max_{x_i} b_i(x_i) (assuming a unique maximizing value exists). Then x* is the MAP assignment.
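These properties can be checked on a small tree. The sketch below runs max-product on a three-node binary chain (all potential values are illustrative) and compares the per-node argmax of the beliefs with the brute-force MAP assignment.

```python
import itertools
import numpy as np

# Max-product BP on a 3-node binary chain (a tree); potentials are
# illustrative choices, not taken from the slides.
phi = [np.array([1.0, 2.0]), np.array([1.0, 3.0]), np.array([3.0, 1.0])]
psi = np.array([[2.0, 1.0], [1.0, 2.0]])   # same compatibility on both edges

# One forward and one backward sweep suffice on a chain.
m_fwd = [np.ones(2)]                 # m_fwd[i]: message arriving at node i from the left
for i in range(2):
    m = np.max(psi * (phi[i] * m_fwd[-1])[:, None], axis=0)
    m_fwd.append(m / m.sum())
m_bwd = [np.ones(2)]                 # built right-to-left, then reversed
for i in range(2, 0, -1):
    m = np.max(psi * (phi[i] * m_bwd[-1])[:, None], axis=0)
    m_bwd.append(m / m.sum())
m_bwd = m_bwd[::-1]                  # m_bwd[i]: message arriving at node i from the right

# Belief at node i: local evidence times all incoming messages.
beliefs = [phi[i] * m_fwd[i] * m_bwd[i] for i in range(3)]
map_bp = tuple(int(np.argmax(b)) for b in beliefs)

# Brute-force MAP over all 8 configurations for comparison.
def score(x):
    s = phi[0][x[0]] * phi[1][x[1]] * phi[2][x[2]]
    return s * psi[x[0], x[1]] * psi[x[1], x[2]]

map_bf = max(itertools.product([0, 1], repeat=3), key=score)
print(map_bp == map_bf)   # True: per-node argmax recovers the MAP on a tree
```

With these potentials the unique maximizer is (1, 1, 0); on a graph with a tie or with loops, the per-node argmax readout is no longer guaranteed to give the exact MAP.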

Relation to the Junction Tree Algorithm
A general graph can be transformed into a junction tree, and BP on the junction tree is equivalent to exact inference on the original graph. However, the transformation becomes too expensive when the original graph is very loopy, because the clique sizes blow up.

Applications of BP in Computer Vision
- Unwrapping phase images [Frey, NIPS]
- Stereo matching [Sun, ECCV]
- Shape and reflectance inference from photographs [Weiss, ICCV]
- Image detail extrapolation [Freeman, IJCV]

Experiments
- Noise removal
- Image segmentation enhancement
[Figure: grid-structured MRF with hidden nodes x_i, x_j, observed nodes y_i, y_j, and messages m_ij(x_i) passed between neighboring hidden nodes]
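A rough sketch of how such a denoising experiment could be set up with loopy max-product BP on a grid MRF. The image, noise level, potential values, and iteration count are all illustrative assumptions, not the slides' actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny binary image: a white square on a black background, corrupted
# by 10% salt-and-pepper flips (illustrative setup only).
H, W = 8, 8
x_true = np.zeros((H, W), dtype=int)
x_true[2:6, 2:6] = 1
y = np.where(rng.random((H, W)) < 0.1, 1 - x_true, x_true)

phi = np.where(y[..., None] == np.arange(2), 0.9, 0.1)   # data term, shape (H, W, 2)
psi = np.array([[2.0, 1.0], [1.0, 2.0]])                 # smoothness: favor equal neighbors

def neighbors(i, j):
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        if 0 <= i + di < H and 0 <= j + dj < W:
            yield (i + di, j + dj)

# messages[(src, dst)]: message from pixel src to neighboring pixel dst.
messages = {((i, j), n): np.ones(2) / 2
            for i in range(H) for j in range(W)
            for n in neighbors(i, j)}

for _ in range(20):                        # synchronous sweeps (loopy, so no guarantee)
    new = {}
    for (s, t) in messages:
        prod = phi[s].copy()
        for n in neighbors(*s):
            if n != t:
                prod = prod * messages[(n, s)]
        m = np.max(psi * prod[:, None], axis=0)   # maximize over the sender's state
        new[(s, t)] = m / m.sum()
    messages = new

# Read out the per-pixel argmax of the beliefs as the estimated clean image.
x_hat = np.zeros((H, W), dtype=int)
for i in range(H):
    for j in range(W):
        b = phi[i, j].copy()
        for n in neighbors(i, j):
            b = b * messages[(n, (i, j))]
        x_hat[i, j] = int(b.argmax())
```

Because the grid graph is loopy, convergence and exactness are not guaranteed; in practice a modest number of sweeps with a smoothness term like this tends to suppress isolated flipped pixels.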

Results: Noise Removal
Pepper-and-salt noise; white Gaussian noise

Results: Image Segmentation Enhancement

Thanks. Questions?