Variable Elimination: Graph-Based Perspective. Daphne Koller, Probabilistic Graphical Models (Inference).

Initial Graph
[Figure: the example network over nodes C, D, I, G, S, L, J, H, shown as a directed graph and as its moralized undirected graph]

Elimination as Graph Operation
Eliminate: C
[Figure: induced Markov network for the current set of factors]

Elimination as Graph Operation
Eliminate: D
[Figure: induced Markov network for the current set of factors]

Elimination as Graph Operation
Eliminate: I
[Figure: induced Markov network for the current set of factors]

Elimination as Graph Operation
Eliminate: H
[Figure: induced Markov network for the current set of factors]

Elimination as Graph Operation
Eliminate: G
[Figure: induced Markov network for the current set of factors]

Elimination as Graph Operation
Eliminate: L, S
[Figure: induced Markov network for the current set of factors]
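The graph operation the slides step through can be sketched in code: eliminating a variable connects all of its current neighbors (the added edges are "fill edges") and removes the node. This is a minimal sketch, not the course's code, and the moralized edge list below is an assumption based on the standard student-network example, since the slide figures are not reproduced here.

```python
def eliminate(adj, var):
    """Eliminate `var` from the undirected graph `adj` (dict of sets),
    in place. Returns the fill edges added between var's neighbors."""
    nbrs = adj.pop(var)
    for u in nbrs:
        adj[u].discard(var)
    fill = []
    for u in nbrs:
        for w in nbrs:
            if u < w and w not in adj[u]:   # neighbors not yet connected
                adj[u].add(w)
                adj[w].add(u)
                fill.append((u, w))
    return fill

# Assumed moralized student network over C, D, I, G, S, L, J, H
edges = [("C","D"),("D","G"),("I","G"),("D","I"),("I","S"),("G","L"),
         ("L","J"),("S","J"),("L","S"),("G","H"),("J","H"),("G","J")]
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

print(eliminate(adj, "C"))  # D is C's only neighbor: no fill edges
print(eliminate(adj, "D"))  # I and G are already adjacent: no fill edges
print(eliminate(adj, "I"))  # connects G and S: one fill edge
```

Each call mirrors one slide: the factor created when eliminating a variable involves exactly the variable and its current neighbors in the graph.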

Induced Graph
The induced graph I_{Φ,α}, over a set of factors Φ and an ordering α, is the undirected graph in which X_i and X_j are connected if they appear together in some factor produced during a run of the VE algorithm using α as the ordering.
[Figure: induced graph for the example network over C, D, I, G, S, L, J, H]

Cliques in the Induced Graph
Theorem: Every factor produced during VE is a clique in the induced graph.
[Figure: induced graph for the example network]

Cliques in the Induced Graph
Theorem: Every (maximal) clique in the induced graph is a factor produced during VE.
[Figure: induced graph for the example network]

Induced Width
The width of an induced graph is the number of nodes in its largest clique, minus 1.
The minimal induced width of a graph K is min_α width(I_{K,α}), the minimum over all orderings α.
This provides a lower bound on the best achievable performance of VE for a model that factorizes over K.
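The width for a given ordering can be computed by simulating the elimination and tracking the largest clique formed. A hypothetical sketch; the graph below is an assumption based on the standard student-network example, not taken from the slides.

```python
def induced_width(adj, order):
    """Width of the induced graph I_{K, order}: simulate elimination on a
    copy of `adj` (dict of sets) and track the largest clique, which is
    the eliminated node plus its current neighbors, minus 1."""
    adj = {v: set(ns) for v, ns in adj.items()}
    width = 0
    for var in order:
        nbrs = adj.pop(var)
        width = max(width, len(nbrs))   # clique size is len(nbrs) + 1
        for u in nbrs:
            adj[u].discard(var)
            adj[u] |= nbrs - {u}        # connect var's neighbors
    return width

# Assumed moralized student network over C, D, I, G, S, L, J, H
edges = [("C","D"),("D","G"),("I","G"),("D","I"),("I","S"),("G","L"),
         ("L","J"),("S","J"),("L","S"),("G","H"),("J","H"),("G","J")]
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

print(induced_width(adj, ["C","D","I","H","G","S","L","J"]))  # 3
```

An ordering that eliminates a highly connected node first (e.g. G) produces a much wider induced graph, which is exactly the cost the lower bound speaks to.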

Finding Elimination Orderings
Theorem: For a graph H, determining whether there exists an elimination ordering for H with induced width ≤ K is NP-complete.
Note: this NP-hardness result is distinct from the NP-hardness of inference itself: even given the optimal ordering, inference may still be exponential.

Finding Elimination Orderings
Greedy search using a heuristic cost function: at each point, eliminate the node with the smallest cost.
Possible cost functions:
– min-neighbors: number of neighbors in the current graph
– min-weight: weight (number of values) of the factor formed
– min-fill: number of new fill edges
– weighted min-fill: total weight of the new fill edges (edge weight = product of the weights of the two nodes)
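The greedy scheme can be sketched as follows, here with min-fill and min-neighbors as example cost functions; swapping in the other heuristics only changes the scoring function. The example graph is again the assumed student network, not reproduced from the slides.

```python
def greedy_order(adj, cost):
    """Greedy elimination ordering: repeatedly eliminate the node with
    the smallest heuristic cost in the current graph."""
    adj = {v: set(ns) for v, ns in adj.items()}
    order = []
    while adj:
        var = min(adj, key=lambda v: cost(adj, v))
        order.append(var)
        nbrs = adj.pop(var)
        for u in nbrs:
            adj[u].discard(var)
            adj[u] |= nbrs - {u}   # fill edges among var's neighbors
    return order

def min_fill(adj, v):
    """Number of fill edges that eliminating v would introduce."""
    nbrs = adj[v]
    return sum(1 for u in nbrs for w in nbrs if u < w and w not in adj[u])

def min_neighbors(adj, v):
    """Number of neighbors of v in the current graph."""
    return len(adj[v])

# Assumed moralized student network over C, D, I, G, S, L, J, H
edges = [("C","D"),("D","G"),("I","G"),("D","I"),("I","S"),("G","L"),
         ("L","J"),("S","J"),("L","S"),("G","H"),("J","H"),("G","J")]
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

order = greedy_order(adj, min_fill)
print(order)  # a full elimination ordering of all 8 variables
```

Note the search is heuristic: none of these cost functions is guaranteed to find the minimal-width ordering, consistent with the NP-completeness result above.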

Finding Elimination Orderings
Theorem: The induced graph is triangulated (chordal): it has no loops of length > 3 without a "bridge" (chord).
We can therefore find an elimination ordering by finding a low-width triangulation of the original graph H_Φ.
[Figure: example 4-cycle on nodes A, B, C, D with a chord]
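The triangulation claim can be checked mechanically: a graph is chordal if and only if the reverse of a Maximum Cardinality Search picking order is a perfect elimination ordering (the Tarjan-Yannakakis test). A minimal sketch, independent of the slides' example:

```python
def is_chordal(adj):
    """Chordality test: run Maximum Cardinality Search, then verify that
    eliminating in reverse picking order never requires a fill edge."""
    adj = {v: set(ns) for v, ns in adj.items()}
    weight = {v: 0 for v in adj}
    unpicked, picked = set(adj), []
    while unpicked:
        v = max(unpicked, key=lambda u: weight[u])  # most picked neighbors
        picked.append(v)
        unpicked.discard(v)
        for u in adj[v]:
            if u in unpicked:
                weight[u] += 1
    # Reverse picking order is a perfect elimination ordering iff chordal:
    # each eliminated vertex's remaining neighbors must form a clique.
    for v in reversed(picked):
        nbrs = {u for u in adj.pop(v) if u in adj}
        if any(w not in adj[u] for u in nbrs for w in nbrs if u != w):
            return False   # two neighbors not connected: missing chord
    return True

# A 4-cycle with no chord is the smallest non-chordal graph
square = {"A": {"B","D"}, "B": {"A","C"}, "C": {"B","D"}, "D": {"A","C"}}
print(is_chordal(square))  # False
```

Adding the chord A-C to this square makes it chordal, mirroring the slide's 4-node example.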

Robot Localization & Mapping
Square Root SAM, F. Dellaert and M. Kaess, IJRR, 2006

Robot Localization & Mapping
[Figure: SLAM model with robot poses x_1, ..., x_t, sensor observations z_1, ..., z_t, and landmarks L_1, L_2, L_3; x denotes a robot pose, z a sensor observation]

Eliminate Poses, then Landmarks

Eliminate Landmarks, then Poses

Min-Fill Elimination

Summary
– Variable elimination can be viewed in terms of transformations on an undirected graph: eliminating Z connects Z's current neighbors.
– The sizes of the cliques in the resulting induced graph directly determine the algorithm's complexity.
– Keeping the induced graph simple provides useful heuristics for selecting an elimination ordering.

END