
1 CSCE 582: Computation of the Most Probable Explanation in Bayesian Networks using Bucket Elimination - Hareesh Lingareddy, University of South Carolina

2 Bucket Elimination
- An algorithmic framework that generalizes dynamic programming to accommodate algorithms for many complex problem-solving and reasoning activities.
- Uses "buckets" to mimic the algebraic manipulations involved in each of these problems, resulting in an easily expressible algorithmic formulation.

3 Bucket Elimination Algorithm
- Partition the functions of the graph into "buckets", backwards relative to the given node ordering.
- In the bucket of variable X, place all functions that mention X but do not mention any variable having a higher index.
- Process the buckets backwards relative to the node ordering.
- The function computed when a bucket is eliminated is placed in the bucket of the 'highest' variable in its scope.
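The following Python sketch illustrates the partitioning step described in the slide above; the dict-based factor representation and the function name partition_into_buckets are illustrative choices, not part of the original slides.

```python
# Minimal sketch of the bucket-partitioning rule: each function goes into
# the bucket of the highest-ordered variable in its scope.
def partition_into_buckets(factors, ordering):
    index = {var: i for i, var in enumerate(ordering)}
    buckets = {var: [] for var in ordering}
    for f in factors:
        highest = max(f["scope"], key=lambda v: index[v])
        buckets[highest].append(f)
    return buckets

# Tiny example: two binary variables A and B with CPTs P(A) and P(B|A).
p_a = {"scope": ("A",), "table": {(0,): 0.6, (1,): 0.4}}
p_b_given_a = {"scope": ("A", "B"),
               "table": {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}}

buckets = partition_into_buckets([p_a, p_b_given_a], ordering=["A", "B"])
print([f["scope"] for f in buckets["B"]])   # [('A', 'B')]  (B is highest in its scope)
print([f["scope"] for f in buckets["A"]])   # [('A',)]
```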

4 Algorithms using Bucket Elimination
- Belief Assessment
- Most Probable Explanation (MPE)
- Maximum A Posteriori Hypothesis (MAP)
- Maximum Expected Utility (MEU)

5 Belief Assessment
Definition
- Given a set of evidence, compute the posterior probability of all the variables.
- The belief assessment task for X_k = x_k is to find bel(x_k) = P(X_k = x_k | evidence) = k · Σ over all variables except X_k of Π_i P(x_i | pa(X_i)), where k is a normalizing constant.
- In the Visit to Asia example, the belief assessment problem answers questions like: What is the probability that a person has tuberculosis, given that he/she has dyspnoea and has visited Asia recently?

6 Belief Assessment Overview
In reverse node ordering:
- Create the bucket function by multiplying all functions (given as tables) that contain the current node.
- Perform variable elimination by summation over the current node.
- Place the newly created function table into the appropriate bucket.
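A minimal Python sketch of one belief-assessment bucket step, assuming the same illustrative dict-based factor representation as the earlier sketch: multiply the bucket's tables and eliminate the bucket's variable by summation.

```python
from itertools import product

def eliminate_by_sum(bucket_factors, var, domains):
    """Multiply all factors in a bucket and sum out `var`."""
    scope = sorted({v for f in bucket_factors for v in f["scope"] if v != var})
    table = {}
    for assignment in product(*(domains[v] for v in scope)):
        ctx = dict(zip(scope, assignment))
        total = 0.0
        for value in domains[var]:
            ctx[var] = value
            prod = 1.0
            for f in bucket_factors:
                prod *= f["table"][tuple(ctx[v] for v in f["scope"])]
            total += prod
        table[assignment] = total
    return {"scope": tuple(scope), "table": table}

# Example: summing B out of bucket_B = {P(B|A)} gives a function of A that
# is identically 1, as expected for a CPT with no evidence on B.
domains = {"A": (0, 1), "B": (0, 1)}
p_b_given_a = {"scope": ("A", "B"),
               "table": {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}}
print(eliminate_by_sum([p_b_given_a], "B", domains)["table"])   # {(0,): 1.0, (1,): 1.0}
```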

7 Most Probable Explanation (MPE)
Definition
- Given evidence, find the maximum-probability assignment to the remaining variables.
- The MPE task is to find an assignment x^o = (x^o_1, …, x^o_n) such that P(x^o) = max over all assignments x = (x_1, …, x_n) consistent with the evidence of Π_i P(x_i | pa(X_i)).

8 Differences from Belief Assessment
- Replace sums with max.
- Keep track of the maximizing value at each stage.
- A "forward step" determines the maximizing assignment tuple.
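The sketch below (same assumed factor representation as the earlier snippets) illustrates the first two differences: the bucket step uses max in place of sum, and it records the maximizing value of the eliminated variable for later use in the forward step.

```python
from itertools import product

def eliminate_by_max(bucket_factors, var, domains):
    """Multiply the bucket's factors and maximize out `var`, recording the argmax."""
    scope = sorted({v for f in bucket_factors for v in f["scope"] if v != var})
    table, argmax = {}, {}
    for assignment in product(*(domains[v] for v in scope)):
        ctx = dict(zip(scope, assignment))
        best_val, best_x = -1.0, None
        for value in domains[var]:
            ctx[var] = value
            prod = 1.0
            for f in bucket_factors:
                prod *= f["table"][tuple(ctx[v] for v in f["scope"])]
            if prod > best_val:
                best_val, best_x = prod, value
        table[assignment] = best_val
        argmax[assignment] = best_x
    return {"scope": tuple(scope), "table": table}, argmax
```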

9 Elimination Algorithm for Most Probable Explanation
Visit to Asia network (α: visit to Asia, σ: smoking, τ: tuberculosis, λ: lung cancer, β: bronchitis, ε: tuberculosis or lung cancer, ξ: positive X-ray, δ: dyspnoea), evidence δ = "no", node ordering (τ, β, σ, λ, ε, δ, ξ, α).
Finding the MPE:
MPE = max over α, σ, τ, λ, β, ε, δ, ξ of P(α, σ, τ, λ, β, ε, δ, ξ)
    = max over α, σ, τ, λ, β, ε, δ, ξ of P(ξ|ε)·P(δ|ε,β)·P(ε|τ,λ)·P(λ|σ)·P(β|σ)·P(τ|α)·P(σ)·P(α)
Buckets (processed top to bottom; the H functions are the messages produced by maximization):
Bucket α: P(τ|α)·P(α)
Bucket ξ: P(ξ|ε)
Bucket δ: P(δ|ε,β), δ = "no"
Bucket ε: P(ε|τ,λ), H_ξ(ε), H_δ(ε,β)
Bucket λ: P(λ|σ), H_ε(τ,λ,β)
Bucket σ: P(β|σ)·P(σ), H_λ(τ,β,σ)
Bucket β: H_σ(τ,β)
Bucket τ: H_α(τ), H_β(τ)  →  MPE probability = max over τ of H_α(τ)·H_β(τ)
In general, processing bucket n produces H_n(u) = max over x_n of Π over C in F_n of C(x_n | x_pa), where F_n is the set of functions in the bucket and u are their arguments other than x_n.

10 Elimination Algorithm for Most Probable Explanation
Bucket α: P(τ|α)·P(α)
Bucket ξ: P(ξ|ε)
Bucket δ: P(δ|ε,β), δ = "no"
Bucket ε: P(ε|τ,λ), H_ξ(ε), H_δ(ε,β)
Bucket λ: P(λ|σ), H_ε(τ,λ,β)
Bucket σ: P(β|σ)·P(σ), H_λ(τ,β,σ)
Bucket β: H_σ(τ,β)
Bucket τ: H_α(τ), H_β(τ)
Forward part (assign values along the node ordering):
τ' = arg max over τ of H_α(τ)·H_β(τ)
β' = arg max over β of H_σ(τ', β)
σ' = arg max over σ of P(β'|σ)·P(σ)·H_λ(τ', β', σ)
λ' = arg max over λ of P(λ|σ')·H_ε(τ', λ, β')
ε' = arg max over ε of P(ε|τ', λ')·H_δ(ε, β')·H_ξ(ε)
δ' = "no"
ξ' = arg max over ξ of P(ξ|ε')
α' = arg max over α of P(τ'|α)·P(α)
Return: (τ', β', σ', λ', ε', δ', ξ', α')
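As a cross-check of the bucket table above, the snippet below reuses partition_into_buckets from the earlier sketch to reproduce the initial partition for the Visit to Asia network under the ordering read off the slides; only the factor scopes matter here, so the (unknown) probability tables are omitted.

```python
# Asia-network CPT scopes only (probabilities omitted); variable names as in the slide.
asia_cpts = [
    {"scope": ("α",)},           # P(α)
    {"scope": ("α", "τ")},       # P(τ|α)
    {"scope": ("σ",)},           # P(σ)
    {"scope": ("σ", "λ")},       # P(λ|σ)
    {"scope": ("σ", "β")},       # P(β|σ)
    {"scope": ("τ", "λ", "ε")},  # P(ε|τ,λ)
    {"scope": ("ε", "ξ")},       # P(ξ|ε)
    {"scope": ("ε", "β", "δ")},  # P(δ|ε,β)
]
ordering = ["τ", "β", "σ", "λ", "ε", "δ", "ξ", "α"]
buckets = partition_into_buckets(asia_cpts, ordering)
for var in reversed(ordering):   # processing order: α, ξ, δ, ε, λ, σ, β, τ
    print(var, [f["scope"] for f in buckets[var]])
```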

11 MPE Overview
In reverse node ordering:
- Create the bucket function by multiplying all functions (given as tables) that contain the current node.
- Perform variable elimination by maximization over the current node (recording the maximizing state function).
- Place the newly created function table into the appropriate bucket.
In forward node ordering:
- Recover the maximizing assignment (the MPE tuple) using the recorded maximizing state functions; the maximum probability itself is produced by the last bucket of the reverse pass.
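A minimal sketch of the forward pass summarized above, under the same assumed representation: after the backward (maximization) pass, each bucket holds its original tables plus the H tables it received, whose arguments are the bucket's variable and earlier variables only, so walking the ordering forward and maximizing each bucket's product under the assignments already made recovers the MPE tuple, exactly as in the forward part of slide 10.

```python
def forward_assign(ordering, buckets, domains):
    """Return the MPE assignment, given the buckets populated by the backward pass."""
    assignment = {}
    for var in ordering:                      # forward node ordering
        best_val, best_x = -1.0, None
        for value in domains[var]:
            assignment[var] = value
            prod = 1.0
            for f in buckets[var]:            # original CPTs + received H tables
                prod *= f["table"][tuple(assignment[v] for v in f["scope"])]
            if prod > best_val:
                best_val, best_x = prod, value
        assignment[var] = best_x
    return assignment
```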

12 Maximum A Posteriori Hypothesis (MAP)
Definition
- Given evidence, find an assignment to a subset of "hypothesis" variables that maximizes their probability.
- Given a set of hypothesis variables A = {A_1, …, A_k}, the MAP task is to find an assignment a^o = (a^o_1, …, a^o_k) such that P(a^o | e) is maximal, i.e. a^o = arg max over a_1, …, a_k of Σ over the remaining variables of Π_i P(x_i | pa(X_i)).
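The MAP task mixes the two elimination operators sketched earlier: with the hypothesis variables A_1, …, A_k placed first in the ordering, every non-hypothesis bucket is eliminated by summation and every hypothesis bucket by maximization. The helper below is a hypothetical dispatcher reusing eliminate_by_sum and eliminate_by_max from the earlier sketches.

```python
def eliminate_bucket(var, bucket_factors, domains, hypothesis_vars):
    """Sum out ordinary variables, maximize out hypothesis variables."""
    if var in hypothesis_vars:
        new_factor, _ = eliminate_by_max(bucket_factors, var, domains)
    else:
        new_factor = eliminate_by_sum(bucket_factors, var, domains)
    return new_factor
```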


