Belief Propagation, Junction Trees, and Factor Graphs

Daniel Lowd, CIS 410/510

Inference in a chain graph

Consider a chain of variables A - B - C - D. To compute a marginal such as P(B) or P(C), we pass messages along the chain, summing out one variable at a time. Crucially, we already computed these messages when evaluating P(A) and P(D)!
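As a concrete illustration (not from the lecture: the variable names and the random tables standing in for the potentials are hypothetical), here is the chain computation in Python with numpy. The forward and backward messages are each computed once and then reused for every marginal:

```python
import numpy as np

# Hypothetical pairwise potentials for the chain A - B - C - D,
# each a (k x k) table: phi[i, j] = potential(left=i, right=j).
k = 3
rng = np.random.default_rng(0)
phi_AB, phi_BC, phi_CD = (rng.random((k, k)) for _ in range(3))

# Forward messages (left to right): m_A->B, m_B->C, m_C->D.
fwd_B = phi_AB.sum(axis=0)                       # sum over A
fwd_C = (fwd_B[:, None] * phi_BC).sum(axis=0)    # sum over B
fwd_D = (fwd_C[:, None] * phi_CD).sum(axis=0)    # sum over C

# Backward messages (right to left): m_D->C, m_C->B, m_B->A.
bwd_C = phi_CD.sum(axis=1)                       # sum over D
bwd_B = (phi_BC * bwd_C[None, :]).sum(axis=1)    # sum over C
bwd_A = (phi_AB * bwd_B[None, :]).sum(axis=1)    # sum over B

# Every single-variable marginal reuses the same messages.
p_A = bwd_A / bwd_A.sum()
p_B = fwd_B * bwd_B; p_B /= p_B.sum()
p_C = fwd_C * bwd_C; p_C /= p_C.sum()
p_D = fwd_D / fwd_D.sum()
```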

What about a tree?

Consider a tree over variables A, B, C, D, E. The same idea applies: a node can send a message to one neighbor once it has heard from all of its other neighbors. General pattern: the marginal of any variable is obtained by combining the messages arriving from all of its neighbors.

Overview of Belief Propagation

- To send an outgoing message to a neighbor, a node must first receive incoming messages from all of its other neighbors.
- The marginal probability of a variable is the product of all messages addressed to that variable.
- In a tree, we can compute all marginals by:
  1. choosing a root,
  2. sending messages from the leaves to the root,
  3. sending messages from the root to the leaves.
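In symbols (standard sum-product notation for a pairwise Markov network; this notation is assumed here, it does not appear on the slide):

$$m_{i \to j}(x_j) = \sum_{x_i} \psi_{ij}(x_i, x_j) \prod_{k \in N(i) \setminus \{j\}} m_{k \to i}(x_i), \qquad P(x_i) \propto \prod_{k \in N(i)} m_{k \to i}(x_i).$$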

Belief Propagation in a Tree mBA(A) mEA(A) B E mCB(B) mDB(B) mFE(E) C D F 1. Upward Pass

Belief Propagation in a Tree mAB(B) mAE(E) B E mBD(D) mBC(C) mEF(F) C D F 2. Downward Pass

Junction Tree Algorithm

How do we handle loops? Create a junction tree (or clique tree), in which each node is labeled with a set of variables.
- Running intersection property: if variable X appears in cliques S and T, it must appear in every clique on the path between them (a small check of this property is sketched below).
- Each potential function must be assigned to exactly one clique in the tree.
- You can use variable elimination to generate the tree!
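A sketch of checking the running intersection property on a candidate clique tree (the representation and all names here are illustrative assumptions):

```python
import collections
import itertools

def satisfies_running_intersection(cliques, tree_edges):
    """cliques: dict name -> set of variables.
    tree_edges: list of (name, name) pairs forming a tree.
    Checks that for every pair of cliques containing a variable X,
    every clique on the path between them also contains X."""
    adj = collections.defaultdict(list)
    for a, b in tree_edges:
        adj[a].append(b); adj[b].append(a)

    def path(s, t):
        # BFS in the tree, then walk parent pointers back from t.
        prev, frontier = {s: None}, [s]
        while frontier:
            nxt = []
            for u in frontier:
                for v in adj[u]:
                    if v not in prev:
                        prev[v] = u; nxt.append(v)
            frontier = nxt
        p = [t]
        while prev[p[-1]] is not None:
            p.append(prev[p[-1]])
        return p

    variables = set().union(*cliques.values())
    for x in variables:
        holders = [c for c, vs in cliques.items() if x in vs]
        for s, t in itertools.combinations(holders, 2):
            if any(x not in cliques[c] for c in path(s, t)):
                return False
    return True

# The clique tree from the example on the next slide:
cliques = {'ABE': {'A', 'B', 'E'}, 'BEC': {'B', 'E', 'C'},
           'ECD': {'E', 'C', 'D'}}
print(satisfies_running_intersection(
    cliques, [('ABE', 'BEC'), ('BEC', 'ECD')]))   # True
```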

Example

1. Start with a Markov network over A, B, C, D, E.
2. Triangulate and form the clique graph: cliques ABE, BEC, ECD.
3. Form a tree and assign the potentials: Φ_AB and Φ_AE to ABE; Φ_BC to BEC; Φ_CD and Φ_ED to ECD.
4. Run belief propagation! One twist: only sum out the variables not in the target clique.

Junction Tree Savings

- Avoids redundancy in repeated variable elimination.
- The junction tree only needs to be built once.
- Belief propagation only needs to be repeated when new evidence is received.

Loopy Belief Propagation

- Inference is efficient if the graph is a tree.
- Inference cost is exponential in the treewidth (the size of the largest clique in the graph, minus 1).
- What if the treewidth is too high? Solution: run belief propagation on the original graph.
- It may not converge, or may converge to a bad approximation; in practice it is often fast and a good approximation.

Markov networks: different factorizations, same graph

(The slide shows three different factorizations over variables A, B, and C, all of which yield the same undirected graph.)

Factor graphs: different factorizations, different graphs

(The same three factorizations over A, B, and C: drawn as factor graphs, each gives a different graph.)

Factor Graphs

A factor graph is a bipartite graph with a node for each random variable and each factor. There is an edge between a factor and each variable that participates in that factor. (The slide draws three factorizations over A, B, C, D as factor graphs.)

The same construction applies to a larger example over variables B, E, A, J, M: again, one node per variable, one node per factor, and an edge wherever a variable participates in a factor.
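One way to make this concrete is to store the bipartite structure directly. A minimal sketch, with hypothetical names:

```python
import numpy as np

class FactorGraph:
    """Bipartite graph: variable nodes and factor nodes."""
    def __init__(self):
        self.card = {}             # variable -> number of states
        self.factors = []          # list of (scope tuple, table)
        self.var_to_factors = {}   # variable -> factor indices

    def add_variable(self, name, k):
        self.card[name] = k
        self.var_to_factors[name] = []

    def add_factor(self, scope, table):
        # The table has one axis per variable in scope, in order.
        table = np.asarray(table, float)
        assert table.shape == tuple(self.card[v] for v in scope)
        idx = len(self.factors)
        self.factors.append((tuple(scope), table))
        for v in scope:
            self.var_to_factors[v].append(idx)

# The three-variable examples above, as a factor graph with two
# pairwise factors (placeholder tables of ones):
g = FactorGraph()
for v in 'ABC':
    g.add_variable(v, 2)
g.add_factor(('A', 'B'), np.ones((2, 2)))
g.add_factor(('B', 'C'), np.ones((2, 2)))
# A single joint factor over (A, B, C) would instead be one factor
# node connected to all three variables: a different graph.
```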

Belief Propagation in Factor Graphs

(Illustrated on a factor graph with variables A through E and factors f, g, h, i.) The original rule for trees sends messages directly between neighboring variables. In a factor graph, we break message passing into two steps:
1. Messages from variables to factors: multiply the incoming messages from all other neighboring factors.
2. Messages from factors to variables: multiply in the other incoming messages and sum out all other variables.
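Written out (standard sum-product notation, assumed here rather than taken from the slide), where N(x) are the factors touching variable x and x_f are the variables in factor f:

$$m_{x \to f}(x) = \prod_{g \in N(x) \setminus \{f\}} m_{g \to x}(x), \qquad m_{f \to x}(x) = \sum_{\mathbf{x}_f \setminus \{x\}} f(\mathbf{x}_f) \prod_{y \in N(f) \setminus \{x\}} m_{y \to f}(y).$$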

Loopy Belief Propagation

1. Incorporate evidence: for each evidence variable, select a potential that includes that variable and set its values to zero for every configuration that contradicts the evidence.
2. Initialize: set all messages to one.
3. Run: pass messages from variables to factors, then from factors to variables.
4. Stop: when the messages change by very little, loopy belief propagation has converged (this may not always happen!).
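Putting the steps together: a sketch of loopy sum-product on the FactorGraph sketch above, using a flooding schedule and a max-change stopping test. All of this is an illustrative assumption, not the lecture's reference implementation; evidence is incorporated beforehand by zeroing contradicted entries in the factor tables, as described above.

```python
import numpy as np

def loopy_bp(g, max_iters=100, tol=1e-6):
    """Loopy sum-product on a FactorGraph (see the sketch above).
    Returns approximate marginals; may not converge on loopy graphs."""
    # Initialize: set all messages to one.
    v2f = {(v, i): np.ones(g.card[v])
           for v in g.card for i in g.var_to_factors[v]}
    f2v = {(i, v): np.ones(g.card[v])
           for i, (scope, _) in enumerate(g.factors) for v in scope}

    for _ in range(max_iters):
        delta = 0.0
        # 1. Variables to factors: product of other incoming messages.
        for (v, i) in v2f:
            m = np.ones(g.card[v])
            for j in g.var_to_factors[v]:
                if j != i:
                    m *= f2v[(j, v)]
            m /= m.sum()
            delta = max(delta, np.abs(m - v2f[(v, i)]).max())
            v2f[(v, i)] = m
        # 2. Factors to variables: multiply in the other incoming
        #    messages, then sum out every variable except the target.
        for (i, v) in f2v:
            scope, table = g.factors[i]
            t = table.copy()
            for ax, u in enumerate(scope):
                if u != v:
                    shape = [1] * t.ndim; shape[ax] = g.card[u]
                    t = t * v2f[(u, i)].reshape(shape)
            axes = tuple(ax for ax, u in enumerate(scope) if u != v)
            m = t.sum(axis=axes)
            m /= m.sum()
            delta = max(delta, np.abs(m - f2v[(i, v)]).max())
            f2v[(i, v)] = m
        if delta < tol:   # Stop: messages changed by very little.
            break

    # Belief at each variable: product of all incoming messages.
    beliefs = {}
    for v in g.card:
        b = np.ones(g.card[v])
        for i in g.var_to_factors[v]:
            b *= f2v[(i, v)]
        beliefs[v] = b / b.sum()
    return beliefs
```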