Bayesian Networks: Bucket Elimination Algorithm. Speaker: 虞台文, Graduate Institute of Computer Science and Engineering, Tatung University, Intelligent Multimedia Lab

Content: Basic Concept; Belief Updating; Most Probable Explanation (MPE); Maximum A Posteriori (MAP)

Bayesian Networks Bucket Elimination Algorithm: Basic Concept. Graduate Institute of Computer Science and Engineering, Tatung University, Intelligent Multimedia Lab

Satisfiability Given a set of clauses (in conjunctive normal form: a conjunction of clauses, each a disjunction of literals), the satisfiability problem is to determine whether there exists a truth assignment that makes the statement true. Example: the truth assignment A=True, B=True, C=False, D=False makes the first clause set on the slide satisfiable. Is the second one satisfiable?
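
To make the problem concrete, here is a minimal brute-force satisfiability check. The clause set is a hypothetical stand-in for the slide's example, whose clauses were shown only in a figure.

```python
# Brute-force satisfiability check for a clause set in conjunctive normal
# form. A literal is (variable, polarity); a clause is a list of literals.
from itertools import product

# Hypothetical clause set: (A or not B) and (B or C) and (not C or not D).
clauses = [[("A", True), ("B", False)],
           [("B", True), ("C", True)],
           [("C", False), ("D", False)]]

variables = sorted({v for clause in clauses for v, _ in clause})

def satisfies(assignment, clauses):
    """True iff every clause contains at least one literal made true."""
    return all(any(assignment[v] == pol for v, pol in clause)
               for clause in clauses)

# Enumerate all 2^n truth assignments.
for values in product([False, True], repeat=len(variables)):
    assignment = dict(zip(variables, values))
    if satisfies(assignment, clauses):
        print("Satisfiable:", assignment)
        break
else:
    print("Unsatisfiable")
```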

Resolution The conjunction (α ∨ A) ∧ (β ∨ ¬A) can be true if and only if (α ∨ β) can be true. If resolution ever produces the empty clause □ (for example, from the unit clauses A and ¬A), the clause set is unsatisfiable.

Direct Resolution Example: given a set of clauses and an ordering d = ABCD, set the initial buckets Bucket A, Bucket B, Bucket C, Bucket D as follows: each clause is placed in the bucket of its highest variable in the ordering.

Direct Resolution Bucket A Bucket B Bucket C Bucket D Because no empty clause (□) results, the statement is satisfiable. How do we get a truth assignment?

Direct Resolution Bucket A Bucket B Bucket C Bucket D

Direct Resolution
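
A compact Python sketch of this bucket scheme (a minimal rendering, not the slide's exact pseudocode); the clause set in the usage line is hypothetical.

```python
# Directional resolution (Davis-Putnam) with buckets, following the slide's
# scheme. Clauses are frozensets of (variable, polarity) literals.

def directional_resolution(clauses, order):
    pos = {v: i for i, v in enumerate(order)}
    buckets = {v: [] for v in order}
    # Each clause goes into the bucket of its highest variable in the order.
    for clause in clauses:
        top = max(clause, key=lambda lit: pos[lit[0]])[0]
        buckets[top].append(clause)
    # Process buckets from the last variable in the ordering to the first.
    for v in reversed(order):
        with_pos = [c for c in buckets[v] if (v, True) in c]
        with_neg = [c for c in buckets[v] if (v, False) in c]
        for c1 in with_pos:
            for c2 in with_neg:
                resolvent = (c1 - {(v, True)}) | (c2 - {(v, False)})
                if not resolvent:
                    return False              # empty clause: unsatisfiable
                if any((w, not p) in resolvent for w, p in resolvent):
                    continue                  # skip tautologies
                top = max(resolvent, key=lambda lit: pos[lit[0]])[0]
                buckets[top].append(resolvent)
    return True                               # no empty clause: satisfiable

clauses = [frozenset({("A", True), ("B", False)}),
           frozenset({("B", True), ("C", True)}),
           frozenset({("A", False), ("D", True)})]
print(directional_resolution(clauses, ["A", "B", "C", "D"]))  # True
```

When the result is satisfiable, a truth assignment can be read off by assigning the variables along the ordering, choosing for each variable a value consistent with the clauses in its bucket.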

Queries on Bayesian Networks Belief updating. Finding the most probable explanation (mpe) – given evidence, find a maximum-probability assignment to the rest of the variables. Maximizing a posteriori hypothesis (map) – given evidence, find an assignment to a subset of hypothesis variables that maximizes their probability. Maximizing the expected utility (meu) – given evidence and a utility function, find an assignment to a subset of decision variables that maximizes the expected utility.

Bucket Elimination This algorithm will serve as a framework for various probabilistic inference tasks on Bayesian networks.

Preliminary – Elimination Functions Given a function h defined over a subset of variables S, where X ∈ S, eliminating X from h by summation yields (Σ_X h)(U) = Σ_x h(x, U), defined over U = S − {X}.

Preliminary – Elimination Functions Given a function h defined over a subset of variables S, where X ∈ S, X can likewise be eliminated by maximization or minimization: (max_X h)(U) = max_x h(x, U) and (min_X h)(U) = min_x h(x, U), both defined over U = S − {X}.

Preliminary – Elimination Functions Given functions h_1, …, h_n defined over subsets of variables S_1, …, S_n, respectively, their product (Π_j h_j)(U) = Π_j h_j(S_j) is defined over U = ∪_j S_j.

Preliminary – Elimination Functions Given functions h_1, …, h_n defined over subsets of variables S_1, …, S_n, respectively, their sum (Σ_j h_j)(U) = Σ_j h_j(S_j) is likewise defined over U = ∪_j S_j.
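
A small concrete rendering of these operators, with a function represented as a (scope, table) pair: the scope is an ordered list of variables and the table maps value tuples to numbers. The representation and the tiny example function are illustrative choices, not part of the slides.

```python
# Elimination and combination operators on (scope, table) factors.
from itertools import product

def sum_out(f, X):
    """(sum_X h)(u) = sum_x h(x, u), defined over U = S - {X}."""
    scope, table = f
    i = scope.index(X)
    out = {}
    for a, val in table.items():
        key = a[:i] + a[i+1:]
        out[key] = out.get(key, 0.0) + val
    return (scope[:i] + scope[i+1:], out)

def max_out(f, X):
    """(max_X h)(u) = max_x h(x, u), defined over U = S - {X}."""
    scope, table = f
    i = scope.index(X)
    out = {}
    for a, val in table.items():
        key = a[:i] + a[i+1:]
        out[key] = max(out.get(key, float("-inf")), val)
    return (scope[:i] + scope[i+1:], out)

def multiply_all(factors, domains):
    """(prod_j h_j)(u) defined over U = union of the scopes S_j."""
    scope = []
    for s, _ in factors:
        scope += [v for v in s if v not in scope]
    table = {}
    for a in product(*(domains[v] for v in scope)):
        env = dict(zip(scope, a))
        val = 1.0
        for s, t in factors:
            val *= t[tuple(env[v] for v in s)]
        table[a] = val
    return (scope, table)

h = (["X", "Y"], {(0, 0): 0.2, (0, 1): 0.8, (1, 0): 0.5, (1, 1): 0.5})
g = (["Y", "Z"], {(0, 0): 1.0, (0, 1): 0.0, (1, 0): 0.5, (1, 1): 0.5})
print(sum_out(h, "X"))   # a function of Y alone
print(max_out(h, "Y"))   # a function of X alone
print(multiply_all([h, g], {"X": (0, 1), "Y": (0, 1), "Z": (0, 1)}))
```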

Bayesian Networks Bucket Elimination Algorithm: Belief Updating. Graduate Institute of Computer Science and Engineering, Tatung University, Intelligent Multimedia Lab

Goal Bel(x_1) = P(x_1 | e) = P(x_1, e) / P(e) = α Σ_{x_2, …, x_n} Π_i P(x_i | pa_i, e), where α = 1 / P(e) is the normalization factor.

Basic Concept of Variable Elimination Example: the network over A, B, C, D, F, G with arcs A → B, A → C, {A, B} → D, {B, C} → F, F → G; the query is P(a | g = 1).

Basic Concept of Variable Elimination Example: P(a, b, c, d, f, g) = P(a) P(b | a) P(c | a) P(d | a, b) P(f | b, c) P(g | f).

Basic Concept of Variable Elimination Eliminating the variables in the order G, D, F, B, C produces the intermediate functions λ_G(f), λ_D(a, b), λ_F(b, c), λ_B(a, c), λ_C(a).

Basic Concept of Variable Elimination Initial buckets along the ordering: Bucket G: P(g | f), observed g = 1; Bucket D: P(d | a, b); Bucket F: P(f | b, c); Bucket B: P(b | a); Bucket C: P(c | a); Bucket A: P(a).

Basic Concept of Variable Elimination Processing the buckets in order: Bucket G produces λ_G(f), placed in Bucket F; Bucket D produces λ_D(a, b), placed in Bucket B; Bucket F produces λ_F(b, c), placed in Bucket B; Bucket B produces λ_B(a, c), placed in Bucket C; Bucket C produces λ_C(a), placed in Bucket A.

Basic Concept of Variable Elimination [table: λ_G(f) for f ∈ {0, 1}; the entries involve the values 0.1 and 0.7]

Basic Concept of Variable Elimination [tables: λ_G(f) and λ_D(a, b) = Σ_d P(d | a, b)]

Basic Concept of Variable Elimination [tables: λ_G(f), λ_D(a, b), and λ_F(b, c) = Σ_f P(f | b, c) λ_G(f); each entry is a weighted sum with the λ_G values 0.7 and 0.1]

Basic Concept of Variable Elimination [tables as above, plus λ_B(a, c) = Σ_b P(b | a) λ_D(a, b) λ_F(b, c); one entry, for example, is 0.5020]

Basic Concept of Variable Elimination [tables as above, plus λ_C(a) = Σ_c P(c | a) λ_B(a, c)]

Basic Concept of Variable Elimination [tables: P(a, g=1) = P(a) λ_C(a), then P(a | g=1) = P(a, g=1) / Σ_a P(a, g=1)]

Bucket Elimination Algorithm
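
The algorithm itself appears on the slide as a figure. The following runnable sketch follows the same scheme on the running example network, reusing sum_out and multiply_all from the operator sketch above; every CPT number is invented for illustration, since the slide's actual tables were lost with the figures.

```python
# BuckElim for belief updating on the example network
# A->B, A->C, {A,B}->D, {B,C}->F, F->G. Each loop iteration below plays
# the role of one bucket, processed from the end of the ordering.
from itertools import product

DOM = {v: (0, 1) for v in "ABCDFG"}            # all variables binary

def tabulate(scope, fn):
    return (scope, {a: fn(dict(zip(scope, a)))
                    for a in product(*(DOM[v] for v in scope))})

def restrict(f, var, val):
    """Instantiate an observed variable, shrinking the scope."""
    scope, table = f
    if var not in scope:
        return f
    i = scope.index(var)
    return (scope[:i] + scope[i+1:],
            {a[:i] + a[i+1:]: t for a, t in table.items() if a[i] == val})

cpts = [                                        # hypothetical CPTs
    tabulate(["A"],           lambda e: (0.6, 0.4)[e["A"]]),
    tabulate(["A", "B"],      lambda e: ((0.9, 0.1), (0.2, 0.8))[e["A"]][e["B"]]),
    tabulate(["A", "C"],      lambda e: ((0.7, 0.3), (0.4, 0.6))[e["A"]][e["C"]]),
    tabulate(["A", "B", "D"], lambda e: 0.5),   # uniform, for brevity
    tabulate(["B", "C", "F"], lambda e:
             (((0.8, 0.2), (0.5, 0.5)), ((0.3, 0.7), (0.1, 0.9)))
             [e["C"]][e["B"]][e["F"]]),
    tabulate(["F", "G"],      lambda e: ((0.9, 0.1), (0.3, 0.7))[e["F"]][e["G"]]),
]

def belief(query, evidence, order):
    factors = cpts
    for var, val in evidence.items():
        factors = [restrict(f, var, val) for f in factors]
    for X in reversed(order):                   # one bucket per iteration
        if X == query or X in evidence:
            continue
        bucket = [f for f in factors if X in f[0]]
        rest = [f for f in factors if X not in f[0]]
        factors = rest + [sum_out(multiply_all(bucket, DOM), X)]
    scope, table = multiply_all(factors, DOM)
    Z = sum(table.values())                     # normalization factor P(e)
    return {a[scope.index(query)]: v / Z for a, v in table.items()}

print(belief("A", {"G": 1}, order=list("ACBFDG")))   # P(a | g=1)
```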

Complexity The BuckElim algorithm can be applied with any ordering. The arity of the function recorded in a bucket is the number of variables appearing in the processed bucket, excluding the bucket's variable. Time and space complexity grow exponentially with the arity r. The arity depends on the ordering. How many possible orderings are there for the BN's variables? (n! for n variables.)

Determination of the Arity Consider the ordering AFDCBG: Bucket G, Bucket B, Bucket C, Bucket D, Bucket F, Bucket A are processed in that order. [figure: the network A, B, C, D, F, G and the arity recorded at each bucket]

Determination of the Arity Given the ordering, e.g., d = AFDCBG: [figures: the initial graph and the induced graph, each annotated with the width of every node]. w(d) is the width of the initial graph for ordering d; w*(d) is the width of the induced graph for ordering d. The width of a graph is the maximum width of its nodes. Here w(d) = 4 and w*(d) = 4.
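
These width definitions are mechanical to compute. The sketch below builds the induced graph for an ordering and reports w(d) and w*(d); the edge list is the moral graph of the running example (parents of a common child connected), which reproduces the slide's w(d) = w*(d) = 4 for d = AFDCBG.

```python
# Compute the width w(d) and induced width w*(d) of an ordering d:
# a node's width is its number of earlier neighbors; the induced graph is
# built by processing nodes from last to first, connecting the earlier
# neighbors of each processed node.

def widths(edges, order):
    pos = {v: i for i, v in enumerate(order)}
    adj = {v: set() for v in order}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    w = max(len([u for u in adj[v] if pos[u] < pos[v]]) for v in order)
    for v in reversed(order):                  # build the induced graph
        earlier = [u for u in adj[v] if pos[u] < pos[v]]
        for i, u1 in enumerate(earlier):
            for u2 in earlier[i+1:]:
                adj[u1].add(u2); adj[u2].add(u1)
    w_star = max(len([u for u in adj[v] if pos[u] < pos[v]]) for v in order)
    return w, w_star

# Moral graph of the example network.
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "D"),
         ("B", "F"), ("C", "F"), ("B", "C"), ("F", "G")]
print(widths(edges, list("AFDCBG")))           # (4, 4), matching the slide
```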

Definition of Tree-Width The induced width (tree-width) of a graph is the minimum induced width w*(d) over all orderings d. Goal: finding an ordering with the smallest induced width. This is NP-hard, but greedy heuristics and approximation methods are available.
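
One such greedy heuristic is min-degree: repeatedly pick a node of minimum degree, connect its remaining neighbors, and remove it. A minimal sketch, usable with the edge list from the previous snippet:

```python
# Greedy min-degree heuristic for an elimination ordering. Since buckets
# are processed from the end of the ordering, the first node picked (i.e.
# eliminated first) is placed last in the returned ordering d.

def min_degree_order(edges, nodes):
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    picked, remaining = [], set(nodes)
    while remaining:
        v = min(remaining, key=lambda x: len(adj[x] & remaining))
        nbrs = list(adj[v] & remaining)
        for i, u1 in enumerate(nbrs):          # connect v's neighbors
            for u2 in nbrs[i+1:]:
                adj[u1].add(u2); adj[u2].add(u1)
        picked.append(v)
        remaining.remove(v)
    return picked[::-1]                        # elimination order reversed

edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "D"),
         ("B", "F"), ("C", "F"), ("B", "C"), ("F", "G")]
print(min_degree_order(edges, list("ABCDFG")))
```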

Summary The complexity of the BuckElim algorithm is dominated by the time and space needed to process a bucket. Its time and space are exponential in the number of bucket variables. The induced width bounds the arity of the bucket functions.

Exercises For the network over A, B, C, D, F, G, use BuckElim to evaluate P(a | b=1) with the following two orderings: 1. d_1 = ACBFDG; 2. d_2 = AFDCBG. Give the details and draw some conclusions. How can the algorithm be improved?

Bayesian Networks Bucket Elimination Algorithm: Most Probable Explanation (MPE). Graduate Institute of Computer Science and Engineering, Tatung University, Intelligent Multimedia Lab

MPE Goal: find M = max_{x̄} P(x̄, e), a most-probable assignment x̄ to all unobserved variables given the evidence e.

MPE Goal: in product form, M = max_{x̄} Π_i P(x_i | pa_i, e).

Notations x̄ = (x_1, …, x_n); x̄_i = (x_1, …, x_i) denotes the assignment to the first i variables.

MPE Let M = max_{x̄} Π_i P(x_i | pa_i, e) = max_{x̄_{n−1}} max_{x_n} Π_i P(x_i | pa_i, e), so x_n can be maximized out first.

MPE Some terms involve x_n and some do not. X_n is conditioned on its parents; X_n conditions its children.

MPE The product splits into factors not mentioning x_n and factors mentioning x_n: the CPT of X_n itself and the CPTs of its children. x_n appears only in these CPTs, so only they participate in the maximization over x_n.

MPE Eliminate variable x_n in Bucket_n, then process the next bucket recursively.

Example [figure: the network over A, B, C, D, F, G]

Consider the ordering ACBFDG: Bucket G, Bucket D, Bucket F, Bucket B, Bucket C, Bucket A are processed in that order. [figure: the network and the initial bucket contents]

Bucket Elimination Algorithm
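
In code, the only changes from the belief-updating sketch are that buckets maximize instead of sum, and each bucket records the maximizing values so an optimal assignment can be recovered by a forward pass over the ordering. This sketch reuses restrict, multiply_all, DOM, and the hypothetical cpts defined in the earlier sketches.

```python
# MPE by bucket elimination: backward pass with max-buckets, then a
# forward pass that reads off an optimal assignment from the policies.

def argmax_out(f, X):
    """Like max_out, but also return a table of maximizing values of X."""
    scope, table = f
    i = scope.index(X)
    best, arg = {}, {}
    for a, val in table.items():
        key = a[:i] + a[i+1:]
        if key not in best or val > best[key]:
            best[key], arg[key] = val, a[i]
    u = scope[:i] + scope[i+1:]
    return (u, best), (u, arg)

def mpe(order, evidence):
    factors = cpts
    for var, val in evidence.items():
        factors = [restrict(f, var, val) for f in factors]
    policies = []
    for X in reversed(order):                  # backward: process buckets
        if X in evidence:
            continue
        bucket = [f for f in factors if X in f[0]]
        rest = [f for f in factors if X not in f[0]]
        h, pol = argmax_out(multiply_all(bucket, DOM), X)
        factors = rest + [h]
        policies.append((X, pol))
    assignment = dict(evidence)
    for X, (scope, arg) in reversed(policies): # forward: assign variables
        assignment[X] = arg[tuple(assignment[v] for v in scope)]
    return assignment

print(mpe(list("ACBFDG"), {"G": 1}))           # an MPE assignment
```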

Exercise Consider the ordering ACBFDG.

Bayesian Networks Bucket Elimination Algorithm: Maximum A Posteriori (MAP). Graduate Institute of Computer Science and Engineering, Tatung University, Intelligent Multimedia Lab

MAP Given a belief network, a subset of hypothesis variables A = (A_1, …, A_k), and evidence E = e, the goal is to determine M = max_{a_1, …, a_k} P(a_1, …, a_k | e).

Example The network over A, B, C, D, F, G, with the hypothesis (decision) variables marked in the figure; evidence g = 1.

MAP Ordering: place the hypothesis variables A_1, …, A_k first in the ordering, followed by the remaining variables (some of them may be observed).

MAP M = max_{ā} Σ_{x_{k+1}, …, x_n} Π_i P(x_i | pa_i, e): maximize over the hypothesis variables, sum over the remaining ones.

The MAP buckets combine both schemes: Bucket Elimination for belief updating (summation buckets) and Bucket Elimination for MPE (maximization buckets).

Bucket Elimination Algorithm
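
Combining the two gives a MAP sketch: sum-buckets for the summation variables and max-buckets (with recorded policies) for the hypothesis variables, which sit first in the ordering and are therefore processed last. It reuses sum_out, argmax_out, restrict, multiply_all, DOM, and the hypothetical cpts from the earlier sketches, and assumes, as the ordering CBAFDG on the following slides suggests, that the hypothesis variables are C and B.

```python
# MAP by bucket elimination: sum out non-hypothesis variables first
# (they sit at the end of the ordering), then maximize over the
# hypothesis variables and recover their optimal assignment.

def map_query(order, hypothesis, evidence):
    factors = cpts
    for var, val in evidence.items():
        factors = [restrict(f, var, val) for f in factors]
    policies = []
    for X in reversed(order):                  # buckets, last-to-first
        if X in evidence:
            continue
        bucket = [f for f in factors if X in f[0]]
        rest = [f for f in factors if X not in f[0]]
        combined = multiply_all(bucket, DOM)
        if X in hypothesis:                    # max-bucket
            h, pol = argmax_out(combined, X)
            policies.append((X, pol))
        else:                                  # sum-bucket
            h = sum_out(combined, X)
        factors = rest + [h]
    assignment = dict(evidence)
    for X, (scope, arg) in reversed(policies):
        assignment[X] = arg[tuple(assignment[v] for v in scope)]
    return {v: assignment[v] for v in hypothesis}

print(map_query(list("CBAFDG"), hypothesis={"C", "B"}, evidence={"G": 1}))
```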

Example g = 1. Consider the ordering CBAFDG: Bucket G, Bucket D, Bucket F, Bucket A, Bucket B, Bucket C are processed in that order, so the hypothesis variables C and B (first in the ordering) are processed last.

Exercise g = 1. Consider the ordering CBAFDG, with buckets Bucket G, Bucket D, Bucket F, Bucket A, Bucket B, Bucket C. Give the details.