Belief Propagation in a Continuous World
Andrew Frank, 11/02/2009
Joint work with Alex Ihler and Padhraic Smyth

Graphical Models
Nodes represent random variables. Edges represent dependencies.
[Figure: three small example graphs over the variables A, B, C.]

Markov Random Fields
[Figure: example undirected graphs over the variables A–E.]
B ⊥ E | C, D        A ⊥ C | B

Factoring Probability Distributions
Independence relations → factorization
[Figure: an undirected graph over A, B, C, D.]
p(A,B,C,D) = f(A) f(B) f(C) f(D) f(A,B) f(B,C) f(B,D)
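The example above is an instance of the general pairwise-MRF form, with node and edge potentials ψ and normalizer Z (Z returns on the partition-function slides near the end):

```latex
% General pairwise MRF: node potentials psi_s, edge potentials psi_st,
% and normalizer (partition function) Z.
\[ p(x) \;=\; \frac{1}{Z}\prod_{s\in V}\psi_s(x_s)\prod_{(s,t)\in E}\psi_{st}(x_s,x_t) \]
```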

Toy Example: A Day in Court
[Figure: a graph over the variables W, A, E, V.]
A, E, W ∈ {"Innocent", "Guilty"}
V ∈ {"Not guilty verdict", "Guilty verdict"}

Inference
Most probable explanation, and marginalization (both written out below).
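In standard form, for a joint distribution over x_1, …, x_n:

```latex
% Most probable explanation (MPE): the single most likely joint assignment.
\[ x^{*} \;=\; \arg\max_{x_1,\dots,x_n}\; p(x_1,\dots,x_n) \]

% Marginalization: sum out every variable except x_s.
\[ p(x_s) \;=\; \sum_{x \setminus x_s} p(x_1,\dots,x_n) \]
```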

Iterative Message Updates
[Figure: a node x receiving and sending messages along its edges.]

Belief Propagation
[Figure: the court-example graph over W, A, E, V, with messages m_{A→E}(E), m_{W→E}(E), and m_{E→V}(V) passed along its edges.]
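In standard sum-product notation (ψ for potentials, Γ(t) for the neighbors of node t), the messages above are instances of:

```latex
% Sum-product message from node t to neighbor s:
\[ m_{t\to s}(x_s) \;=\; \sum_{x_t} \psi_{st}(x_s,x_t)\,\psi_t(x_t)
      \prod_{u\in\Gamma(t)\setminus s} m_{u\to t}(x_t) \]

% Belief (approximate marginal) at node t:
\[ b_t(x_t) \;\propto\; \psi_t(x_t) \prod_{u\in\Gamma(t)} m_{u\to t}(x_t) \]
```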

Loopy BP
[Figure: a loopy graph over A, B, C, D, shown alongside its message-passing version.]
Does this work? Does it make any sense?

A Variational Perspective
Reformulate the problem: within a family of "tractable" distributions, find the best tractable approximation Q to the true distribution P, i.e., the Q that minimizes the divergence from P.

Choose an Approximating Family
Desired traits:
– Simple enough to enable easy computation
– Complex enough to represent P
e.g., fully factored, or structured.

Choose a Divergence Measure
Common choices: the Kullback-Leibler divergence and the alpha divergence (both written out below).
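Both divergences in Minka's (2005) parameterization, which the next slide draws on:

```latex
% Kullback-Leibler divergence:
\[ \mathrm{KL}(p\,\|\,q) \;=\; \int p(x)\,\log\frac{p(x)}{q(x)}\,dx \]

% Alpha divergence; recovers KL(q||p) as alpha -> 0 and KL(p||q) as alpha -> 1:
\[ D_\alpha(p\,\|\,q) \;=\;
   \frac{\int \alpha\,p(x) + (1-\alpha)\,q(x) - p(x)^{\alpha}\,q(x)^{1-\alpha}\,dx}
        {\alpha(1-\alpha)} \]
```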

Behavior of α-Divergence
Source: T. Minka. Divergence measures and message passing. Technical Report MSR-TR-2005-173, Microsoft Research, 2005.

Resulting Algorithms
Assuming a fully-factored form of Q, we get…*
Mean field: α = 0
Belief propagation: α = 1
Tree-reweighted BP: α ≥ 1
* By minimizing "local divergence", with Q(X_1, X_2, …, X_n) = f(X_1) f(X_2) … f(X_n).

Local vs. Global Minimization
Source: T. Minka. Divergence measures and message passing. Technical Report MSR-TR-2005-173, Microsoft Research, 2005.

Applications

Sensor Localization
[Figure: sensors A, B, C placed in a plane.]

Protein Side Chain Placement
[Figure: a protein backbone with the residue sequence RTDCYGN.]

Common Traits?
Continuous state space.

Easy Solution: Discretize!
10 bins per axis: domain size d = 100.  20 bins per axis: domain size d = 400.
Each message: O(d²).
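A minimal sketch of where the O(d²) comes from: each discrete message is a d × d matrix–vector product over all pairs of states (numpy assumed; the potentials here are placeholders, not the talk's models).

```python
import numpy as np

d = 400  # states after discretizing a 2-D domain at 20 bins per axis

# Pairwise potential on all d*d state pairs, plus the product of the
# node potential with the other incoming messages at node t.
psi_st = np.random.rand(d, d)   # psi(x_s, x_t), placeholder values
pre_msg = np.random.rand(d)     # psi_t(x_t) * prod of other messages

# One sum-product message m_{t->s}: a d x d matrix-vector product, O(d^2).
m_ts = psi_st @ pre_msg
m_ts /= m_ts.sum()              # normalize to keep values well-scaled
```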

Particle BP
We'd like to pass "continuous messages," e.g. m_{A→B}(B) as a function over all of B.
[Figure: a graph over A, B, C, D with a continuous message along the edge A→B.]
Instead, pass discrete messages over sets of particles:
{b^{(i)}} ~ W_B(B), giving m_{A→B}({b^{(i)}}) on the particles b^{(1)}, b^{(2)}, …, b^{(N)}.

PBP: Computing the Messages
Re-write the message as an expectation, then use a finite-sample approximation (both shown below).
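A sketch of those two steps in the spirit of particle BP (Ihler & McAllester, 2009), with proposal W_t at node t; notation as in the sum-product equations earlier:

```latex
% (1) The message rewritten as an expectation under the proposal W_t:
\[ m_{t\to s}(x_s) \;=\; \mathbb{E}_{x_t \sim W_t}\!\left[
     \frac{\psi_{st}(x_s, x_t)\,\psi_t(x_t)
           \prod_{u\in\Gamma(t)\setminus s} m_{u\to t}(x_t)}{W_t(x_t)} \right] \]

% (2) Finite-sample approximation on particles x_t^{(1)},...,x_t^{(N)} ~ W_t:
\[ \hat m_{t\to s}(x_s) \;=\; \frac{1}{N}\sum_{i=1}^{N}
     \frac{\psi_{st}(x_s, x_t^{(i)})\,\psi_t(x_t^{(i)})
           \prod_{u\in\Gamma(t)\setminus s} m_{u\to t}(x_t^{(i)})}{W_t(x_t^{(i)})} \]
```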

Choosing "Good" Proposals
[Figure: a graph over A, B, C, D.]
The proposal should "match" the integrand. Sample from the belief.

Iteratively Refine Particle Sets
(1) Draw a set of particles, {x_s^{(i)}} ~ W_s(x_s).
(2) Run discrete inference over the particle discretization (pairwise potential f(x_s, x_t)).
(3) Adjust W_s(x_s) and repeat.
[Figure: nodes X_s and X_t, with steps (1)–(3) annotated.]
(A code sketch of this loop follows.)
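Below, a minimal self-contained sketch of that loop for a two-node model, with Gaussian proposals refit to the beliefs each sweep. The potentials and all names are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # particles per node (the experiments later use 100 per node)

def node_pot(x):
    """psi_s(x): unit-Gaussian node potential (illustrative choice)."""
    return np.exp(-0.5 * x**2)

def edge_pot(xs, xt):
    """psi_st(x_s, x_t): pulls neighboring values together."""
    return np.exp(-0.5 * (xs[:, None] - xt[None, :])**2)

def gauss_pdf(x, mu, sig):
    """Gaussian density up to a constant, for importance weights."""
    return np.exp(-0.5 * ((x - mu) / sig)**2) / sig

# Gaussian proposals W_s, W_t, refit to the current beliefs each sweep.
mu = {"s": 0.0, "t": 0.0}
sig = {"s": 2.0, "t": 2.0}

for sweep in range(5):
    # (1) Draw particles from the current proposals.
    ps = mu["s"] + sig["s"] * rng.standard_normal(N)
    pt = mu["t"] + sig["t"] * rng.standard_normal(N)
    Ws = gauss_pdf(ps, mu["s"], sig["s"])
    Wt = gauss_pdf(pt, mu["t"], sig["t"])

    # (2) Discrete BP over the particle discretization; dividing by the
    # proposal density is the importance-sampling correction.
    m_ts = edge_pot(ps, pt) @ (node_pot(pt) / Wt) / N  # m_{t->s} at ps
    m_st = edge_pot(pt, ps) @ (node_pot(ps) / Ws) / N  # m_{s->t} at pt
    b_s = node_pot(ps) * m_ts                          # beliefs at particles
    b_t = node_pot(pt) * m_st

    # (3) Adjust the proposals: refit each Gaussian to its belief.
    for key, p, b, W in (("s", ps, b_s, Ws), ("t", pt, b_t, Wt)):
        w = (b / W) / (b / W).sum()  # weights ~ belief / proposal
        mu[key] = float(np.sum(w * p))
        sig[key] = float(np.sqrt(np.sum(w * (p - mu[key]) ** 2)) + 1e-3)

print("fitted proposal means:", mu, "stds:", sig)
```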

Benefits of PBP
– No distributional assumptions.
– Easy accuracy/speed trade-off.
– Relies on an "embedded" discrete algorithm: belief propagation, mean field, tree-reweighted BP, …

Exploring PBP: A Simple Example
[Figure: a grid model whose pairwise potentials depend on ||x_s − x_t||.]

Continuous Ising Model Marginals
[Figure: approximate vs. exact marginals for Mean Field (PBP α = 0), BP (PBP α = 1), and TRW (PBP α = 1.5).]
* Run with 100 particles per node.

A Localization Scenario

Exact Marginal

PBP Marginal

Tree-reweighted PBP Marginal

Estimating the Partition Function
Mean field provides a lower bound. Tree-reweighted BP provides an upper bound.
p(A,B,C,D) = (1/Z) f(A) f(B) f(C) f(D) f(A,B) f(B,C) f(B,D)
Z = Σ_{A,B,C,D} f(A) f(B) f(C) f(D) f(A,B) f(B,C) f(B,D)
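The mean-field side of this, in its standard variational form (writing p̃ for the unnormalized product of factors and H for entropy; the TRW upper bound instead comes from a convexified entropy, not shown):

```latex
% For any Q, Jensen's inequality gives the mean-field lower bound on log Z:
\[ \log Z \;\ge\; \mathbb{E}_{Q}\big[\log \tilde p(x)\big] + H(Q) \]
```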

Partition Function Bounds

Conclusions
– BP and related algorithms are useful!
– Particle BP lets you handle continuous RVs.
– Extensions to BP can work with PBP, too.
Thank You!