
1
Bayesian Networks: Bucket Elimination Algorithm. Speaker: Tai-Wen Yu, Intelligent Multimedia Lab, Department of Computer Science and Engineering, Tatung University

2
Content: Basic Concept; Belief Updating; Most Probable Explanation (MPE); Maximum A Posteriori (MAP)

3
Bayesian Networks: Bucket Elimination Algorithm. Basic Concept. Intelligent Multimedia Lab, Department of Computer Science and Engineering, Tatung University

4
Satisfiability: given a statement of clauses (in conjunctive normal form, each clause a disjunction of literals), the satisfiability problem is to determine whether there exists a truth assignment that makes the statement true. Example: the truth assignment A=True, B=True, C=False, D=False makes the statement true, so the statement is satisfiable.
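The definition above can be checked mechanically by enumeration. A minimal sketch, assuming clauses are encoded as sets of literal strings with "~" marking negation; the slide's concrete clauses are not reproduced in this transcript, so the example clauses below are made up:

```python
from itertools import product

# Brute-force satisfiability check.  A clause is a set of literals; a
# positive literal is a variable name, a negated one is prefixed "~".
# (This encoding and the example clauses are illustrative assumptions.)
def satisfiable(clauses, variables):
    """Return a satisfying assignment if one exists, else None."""
    for values in product([True, False], repeat=len(variables)):
        assign = dict(zip(variables, values))
        if all(any(not assign[l[1:]] if l.startswith("~") else assign[l]
                   for l in clause)
               for clause in clauses):
            return assign
    return None

# (A or not-B) and (B or C) and (not-C or not-D)
print(satisfiable([{"A", "~B"}, {"B", "C"}, {"~C", "~D"}],
                  ["A", "B", "C", "D"]))
```

Enumeration takes 2^n assignments; the directional resolution procedure on the following slides avoids this blow-up when the clause structure permits.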

5
Resolution: the clauses (X ∨ α) and (¬X ∨ β) can both be true if and only if their resolvent (α ∨ β) can be true. If resolution ever derives the empty clause, the statement is unsatisfiable.

6
Direct Resolution. Example: given a set of clauses and an ordering d = A, B, C, D, set up the initial buckets (Bucket A, Bucket B, Bucket C, Bucket D) by placing each clause into the bucket of the first of its variables along d.

7
Direct Resolution: process Bucket A, Bucket B, Bucket C, Bucket D in turn, resolving the clauses in each bucket over its variable and placing each resolvent into a later bucket. Because no empty clause results, the statement is satisfiable. How do we get a truth assignment?

8
Direct Resolution: the final contents of Bucket A, Bucket B, Bucket C, Bucket D after processing.

9
Direct Resolution: a truth assignment is obtained by assigning the variables in the reverse order of processing, choosing for each variable a value that satisfies every clause in its bucket given the values already assigned.
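The bucket bookkeeping above can be sketched in code. This is a hedged sketch of directional resolution, not necessarily the slide's exact procedure: literals are signed integers, each clause goes into the bucket of its earliest variable along d, buckets are processed in that order, and a model is read off in reverse order; the example clauses are made up.

```python
# Directional resolution sketch.  A literal is a signed int: +v for
# variable v, -v for its negation.  Each clause is placed in the bucket
# of its earliest variable along the ordering d; every resolvent lands
# in a later bucket, so one pass over the buckets suffices.
def directional_resolution(clauses, order):
    pos = {v: i for i, v in enumerate(order)}
    bucket = {v: set() for v in order}
    def place(c):
        bucket[min((abs(l) for l in c), key=pos.get)].add(frozenset(c))
    for c in clauses:
        place(c)
    for v in order:                          # process buckets along d
        with_v  = [c for c in bucket[v] if  v in c]
        with_nv = [c for c in bucket[v] if -v in c]
        for c1 in with_v:
            for c2 in with_nv:
                r = (c1 - {v}) | (c2 - {-v})
                if not r:
                    return None              # empty clause: unsatisfiable
                if not any(-l in r for l in r):   # skip tautologies
                    place(r)
    # Model generation: assign variables in reverse order of processing;
    # all other variables of a bucket's clauses are already assigned.
    assign = {}
    for v in reversed(order):
        def satisfied(val):
            assign[v] = val
            return all(any(assign[abs(l)] == (l > 0) for l in c)
                       for c in bucket[v])
        if not satisfied(True) and not satisfied(False):
            return None
    return assign

# (A or not-B) and (B or C) and (not-C or not-D), with A,B,C,D = 1,2,3,4
print(directional_resolution([{1, -2}, {2, 3}, {-3, -4}], [1, 2, 3, 4]))
```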

10
Queries on Bayesian Networks: Belief updating. Finding the most probable explanation (MPE): given evidence, find a maximum-probability assignment to the remaining variables. Maximizing a posteriori hypothesis (MAP): given evidence, find an assignment to a subset of hypothesis variables that maximizes their probability. Maximizing the expected utility (MEU): given evidence and a utility function, find an assignment to a subset of decision variables that maximizes the expected utility.

11
Bucket Elimination: the algorithm serves as a unifying framework for these probabilistic inference tasks on Bayesian networks.

12
Preliminary – Elimination Functions. Given a function h defined over a subset of variables S, where X ∈ S, summing out X eliminates it: (Σ_X h)(U) = Σ_x h(x, U), defined over U = S − {X}.

13
Preliminary – Elimination Functions. Similarly, given a function h defined over a subset of variables S, where X ∈ S, maximizing or minimizing out X gives (max_X h)(U) = max_x h(x, U) and (min_X h)(U) = min_x h(x, U), both defined over U = S − {X}.

14
Preliminary – Elimination Functions. Given functions h1, …, hn defined over subsets of variables S1, …, Sn respectively, their product (Π_j h_j)(u) = Π_j h_j(u restricted to S_j) is defined over U = S1 ∪ … ∪ Sn.

15
Preliminary – Elimination Functions. Likewise, given functions h1, …, hn defined over subsets of variables S1, …, Sn respectively, their sum (Σ_j h_j)(u) = Σ_j h_j(u restricted to S_j) is defined over U = S1 ∪ … ∪ Sn.
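The elimination and combination operations above can be sketched with NumPy arrays, representing a function as a (variable-list, table) pair whose axes follow the variable list; this representation is an assumption of the sketch, not notation from the slides.

```python
import numpy as np

# Functions as (variables, table) pairs; table axes follow the list.
def sum_out(X, h):
    """Sum-eliminate X:  (sum_X h) defined over U = S - {X}."""
    S, t = h
    return ([v for v in S if v != X], t.sum(axis=S.index(X)))

def max_out(X, h):
    """Max-eliminate X:  (max_X h) defined over U = S - {X}."""
    S, t = h
    return ([v for v in S if v != X], t.max(axis=S.index(X)))

def multiply(h1, h2):
    """Pointwise product defined over the union scope U = S1 + (S2 - S1)."""
    (S1, t1), (S2, t2) = h1, h2
    U = S1 + [v for v in S2 if v not in S1]
    def expand(S, t):
        # append singleton axes for the missing variables, then reorder
        t = t.reshape(t.shape + (1,) * (len(U) - len(S)))
        axes = S + [v for v in U if v not in S]   # current axis order
        return np.moveaxis(t, [axes.index(v) for v in U], range(len(U)))
    return (U, expand(S1, t1) * expand(S2, t2))

# Sum-eliminating the shared variable B from a product of two
# two-variable tables reproduces a matrix product:
h1 = (["A", "B"], np.array([[1., 2.], [3., 4.]]))
h2 = (["B", "C"], np.array([[5., 6.], [7., 8.]]))
U, t = sum_out("B", multiply(h1, h2))
print(U, t)   # ['A', 'C'], t equal to the matrix product h1 @ h2
```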

16
Bayesian Networks: Bucket Elimination Algorithm. Belief Updating. Intelligent Multimedia Lab, Department of Computer Science and Engineering, Tatung University

17
Goal: compute P(x1 | e) = α P(x1, e), where α = 1/P(e) is a normalization factor; it therefore suffices to compute P(x1, e) for every value of x1 and normalize.

18
Basic Concept of Variable Elimination. Example: a network over the variables A, B, C, D, F, G with joint distribution P(a, b, c, d, f, g) = P(a) P(b|a) P(c|a) P(d|a,b) P(f|b,c) P(g|f).

19
Basic Concept of Variable Elimination. Example: P(a, g=1) = Σ_{b,c,d,f} P(a) P(b|a) P(c|a) P(d|a,b) P(f|b,c) P(g=1|f); pushing each summation as far to the right as possible yields the elimination scheme.

20
Basic Concept of Variable Elimination: eliminating G, D, F, B, C in turn produces the intermediate functions λ_G(f), λ_D(a, b), λ_F(b, c), λ_B(a, c), λ_C(a).

21
Basic Concept of Variable Elimination. Initial buckets for the ordering A, C, B, F, D, G: Bucket G: P(g|f), g=1; Bucket D: P(d|a,b); Bucket F: P(f|b,c); Bucket B: P(b|a); Bucket C: P(c|a); Bucket A: P(a).

22
Basic Concept of Variable Elimination. Processing the buckets from the top: Bucket G produces λ_G(f), placed in Bucket F; Bucket D produces λ_D(a,b), placed in Bucket B; Bucket F produces λ_F(b,c), placed in Bucket B; Bucket B produces λ_B(a,c), placed in Bucket C; Bucket C produces λ_C(a), placed in Bucket A.

23
Basic Concept of Variable Elimination. λ_G(f) = P(g=1|f):
f = 0: 0.1
f = 1: 0.7

24
Basic Concept of Variable Elimination. λ_D(a, b) = Σ_d P(d|a,b) = 1 for every (a, b):
(a, b) = (0,0): 1; (0,1): 1; (1,0): 1; (1,1): 1

25
Basic Concept of Variable Elimination. λ_F(b, c) = Σ_f P(f|b,c) λ_G(f):
(b, c) = (0,0): 0.701; (0,1): 0.610; (1,0): 0.400; (1,1): 0.340

26
Basic Concept of Variable Elimination. λ_B(a, c) = Σ_b P(b|a) λ_D(a,b) λ_F(b,c):
(a, c) = (0,0): 0.9 × 0.701 + 0.1 × 0.400 = 0.6709
(0,1): 0.9 × 0.610 + 0.1 × 0.340 = 0.5830
(1,0): 0.6 × 0.701 + 0.4 × 0.400 = 0.5806
(1,1): 0.6 × 0.610 + 0.4 × 0.340 = 0.5020

27
Basic Concept of Variable Elimination. λ_C(a) = Σ_c P(c|a) λ_B(a, c):
a = 0: 0.75 × 0.6709 + 0.25 × 0.5830 = 0.648925
a = 1: 0.67 × 0.5806 + 0.33 × 0.5020 = 0.554662

28
Basic Concept of Variable Elimination. Finally, P(a, g=1) = P(a) λ_C(a):
a = 0: 0.7 × 0.648925 = 0.4542475
a = 1: 0.3 × 0.554662 = 0.1663986
Normalizing by 0.4542475 + 0.1663986 = 0.6206461 gives P(a | g=1):
a = 0: 0.4542475 / 0.6206461 = 0.73189
a = 1: 0.1663986 / 0.6206461 = 0.26811
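The table computations above can be reproduced in plain Python. The entries P(a), P(b|a), P(c|a), λ_D and λ_F are read off the slide tables; the CPTs P(d|a,b) and P(f|b,c) themselves are not shown in this transcript, so the sketch starts from λ_D and λ_F rather than recomputing them.

```python
# Reproducing the bucket computations from the worked example.
P_a = {0: 0.7, 1: 0.3}
P_b_given_a = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.6, 1: 0.4}}     # P(b | a)
P_c_given_a = {0: {0: 0.75, 1: 0.25}, 1: {0: 0.67, 1: 0.33}} # P(c | a)
lam_D = {(a, b): 1.0 for a in (0, 1) for b in (0, 1)}        # sums to 1
lam_F = {(0, 0): 0.701, (0, 1): 0.610, (1, 0): 0.400, (1, 1): 0.340}

# Bucket B: lam_B(a,c) = sum_b P(b|a) lam_D(a,b) lam_F(b,c)
lam_B = {(a, c): sum(P_b_given_a[a][b] * lam_D[a, b] * lam_F[b, c]
                     for b in (0, 1))
         for a in (0, 1) for c in (0, 1)}
# Bucket C: lam_C(a) = sum_c P(c|a) lam_B(a,c)
lam_C = {a: sum(P_c_given_a[a][c] * lam_B[a, c] for c in (0, 1))
         for a in (0, 1)}
# Bucket A: P(a, g=1) = P(a) lam_C(a), then normalize
joint = {a: P_a[a] * lam_C[a] for a in (0, 1)}
Z = sum(joint.values())
posterior = {a: joint[a] / Z for a in (0, 1)}
print(posterior)   # a=0: ~0.73189, a=1: ~0.26811, matching the slides
```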

29
Bucket Elimination Algorithm (Elim-Bel): partition the CPTs into buckets along the ordering; process the buckets from the last variable down to the first, multiplying the functions in each bucket and summing out the bucket variable (instantiating an observed variable instead of summing); finally multiply the functions in the first bucket and normalize.

30
Complexity: the BuckElim algorithm can be applied with any ordering. The arity of the function recorded in a bucket is the number of variables appearing in the processed bucket, excluding the bucket's variable. Time and space complexity grow exponentially with the arity r, and the arity depends on the ordering. How many possible orderings are there for the variables of a Bayesian network?

31
Determination of the Arity: consider the ordering AFDCBG. The width of a node is the number of its earlier neighbours in the ordering; processing a node connects those earlier neighbours to each other.

32
Determination of the Arity: given an ordering d, e.g., AFDCBG, w(d) is the width of the initial (moral) graph for d, and w*(d) is the width of the induced graph for d, where the width of a graph is the maximum width of its nodes. For AFDCBG, w(d) = 4 and w*(d) = 4.
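The induced width of an ordering can be computed directly: process the nodes from last to first, count each node's earlier neighbours, and connect them. A sketch, with the moral graph of the example network written out as an edge list (the moral edges are inferred from the CPT structure on the earlier slides):

```python
# Width / induced width of an ordering: process nodes from last to
# first; a node's width is its number of earlier neighbours, and
# processing connects those earlier neighbours to each other.
def induced_width(edges, order):
    adj = {v: set() for v in order}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    pos = {v: i for i, v in enumerate(order)}
    w = 0
    for v in reversed(order):
        earlier = {u for u in adj[v] if pos[u] < pos[v]}
        w = max(w, len(earlier))
        for u in earlier:                    # induced (fill-in) edges
            adj[u] |= earlier - {u}
    return w

# Moral graph of the example network over A,B,C,D,F,G: parent-child
# edges plus marriages A-B (parents of D) and B-C (parents of F).
moral = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "D"),
         ("B", "C"), ("B", "F"), ("C", "F"), ("F", "G")]
print(induced_width(moral, list("ACBFDG")),   # 2
      induced_width(moral, list("AFDCBG")))   # 4
```

The two orderings from the exercise illustrate the point: AFDCBG has induced width 4, while ACBFDG has induced width 2, so the same query is exponentially cheaper under the second ordering.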

33
Definition of Tree-Width. Goal: find an ordering with the smallest induced width. This problem is NP-hard, but greedy heuristics and approximation methods are available.
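One common greedy heuristic is min-degree: repeatedly eliminate the node with the fewest remaining neighbours, connecting those neighbours. The choice of this particular heuristic is an assumption of the sketch; the slides only state that heuristics exist.

```python
# Greedy min-degree ordering sketch.  Eliminated-first nodes go to the
# END of the returned ordering, since bucket elimination processes the
# ordering back to front.
def min_degree_order(edges, nodes):
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    remaining, order = set(nodes), []
    while remaining:
        # pick the node with the fewest neighbours still remaining
        v = min(remaining, key=lambda x: len(adj[x] & remaining))
        nbrs = adj[v] & remaining
        for u in nbrs:                       # fill-in edges
            adj[u] |= nbrs - {u}
        remaining.remove(v)
        order.append(v)
    return order[::-1]

moral = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "D"),
         ("B", "C"), ("B", "F"), ("C", "F"), ("F", "G")]
print(min_degree_order(moral, list("ABCDFG")))
```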

34
Summary: the complexity of the BuckElim algorithm is dominated by the time and space needed to process a bucket, both of which are exponential in the number of variables in the bucket. The induced width bounds the arity of the bucket functions.

35
Exercises: use BuckElim to evaluate P(a | b=1) with the following two orderings: 1. d1 = ACBFDG; 2. d2 = AFDCBG. Give the details and draw some conclusions. How could the algorithm be improved?

36
Bayesian Networks: Bucket Elimination Algorithm. Most Probable Explanation (MPE). Intelligent Multimedia Lab, Department of Computer Science and Engineering, Tatung University

37
MPE Goal: given evidence e, find an assignment x0 = argmax_x P(x, e), a maximum-probability explanation of the evidence.

38
MPE Goal: equivalently, compute M = max_x Π_i P(x_i | pa_i, e) together with the maximizing assignment.

39
Notation: x̄_i = (x1, …, xi) denotes an assignment to the first i variables in the ordering.

40
MPE: let M = max over x̄_n of Π_i P(x_i | pa_i, e).

41
MPE: some terms involve x_n and some do not. X_n is conditioned by its parents, and X_n conditions its children.

42
MPE: the CPTs split into those not conditioned by x_n and those in which x_n appears, namely its own CPT, P(x_n | pa_n), and the CPTs of its children.

43
MPE: eliminate variable X_n in Bucket n by maximizing over x_n the product of the functions that mention it, then process the next bucket recursively.
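Max-elimination with argmax recording can be illustrated on a toy chain A → B → C, smaller than the slide's six-variable network; all CPT numbers below are made up for illustration.

```python
# MPE by max-elimination with argmax recording, on a toy chain
# A -> B -> C (all CPT numbers are made up).
P_A = {0: 0.6, 1: 0.4}
P_B = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}   # P(b | a)
P_C = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}   # P(c | b)

# Eliminate C, then B, keeping (value, argmax) in each bucket function.
h_C = {b: max((P_C[b][c], c) for c in (0, 1)) for b in (0, 1)}
h_B = {a: max((P_B[a][b] * h_C[b][0], b) for b in (0, 1)) for a in (0, 1)}
best = max((P_A[a] * h_B[a][0], a) for a in (0, 1))

# Backtrack in the forward direction to recover the full assignment.
a = best[1]; b = h_B[a][1]; c = h_C[b][1]
mpe_prob, mpe = best[0], (a, b, c)
print(mpe, mpe_prob)   # (0, 0, 0), probability ~0.378
```

The recorded argmax tables are exactly what the backtracking step of the algorithm reads off once the first bucket has been maximized.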

44
Example A B D C F G

45
Consider the ordering ACBFDG for the example network over A, B, C, D, F, G: Bucket G, Bucket D, Bucket F, Bucket B, Bucket C, Bucket A.

46
Bucket Elimination Algorithm

47
Exercise: consider the ordering ACBFDG.

48
Bayesian Networks: Bucket Elimination Algorithm. Maximum A Posteriori (MAP). Intelligent Multimedia Lab, Department of Computer Science and Engineering, Tatung University

49
MAP: given a belief network, a subset of hypothesis variables A = (A1, …, Ak), and evidence E = e, the goal is to determine a0 = argmax_a P(a | e).

50
Example A B D C F G Hypothesis (Decision) Variables g = 1

51
MAP Ordering: the hypothesis variables are placed first in the ordering (so they are eliminated last, by maximization); some of them may be observed.

52
MAP

54
MAP combines bucket elimination for belief updating with bucket elimination for MPE: non-hypothesis variables are eliminated by summation, hypothesis variables by maximization.
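The mix of sum- and max-elimination can be illustrated on a toy chain A → B → C with made-up CPTs: hypothesis variable A, evidence C = 1, and non-hypothesis variable B summed out.

```python
# MAP sketch with mixed elimination on a toy chain A -> B -> C
# (all CPT numbers are made up for illustration).
P_A = {0: 0.6, 1: 0.4}
P_B = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}   # P(b | a)
P_C = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}   # P(c | b)

# Bucket B (summation): lam_B(a) = sum_b P(b|a) P(c=1|b)
lam_B = {a: sum(P_B[a][b] * P_C[b][1] for b in (0, 1)) for a in (0, 1)}
# Bucket A (maximization): argmax_a P(a) lam_B(a)
map_prob, map_a = max((P_A[a] * lam_B[a], a) for a in (0, 1))
print(map_a, map_prob)   # a = 1, probability ~0.20
```

Note the ordering constraint in action: B (summed) sits after A (maximized) in the ordering, so its bucket is processed first.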

55
Bucket Elimination Algorithm

56
Example: with g = 1, consider the ordering CBAFDG: Bucket G, Bucket D, Bucket F, Bucket A, Bucket B, Bucket C.

57
Exercise: with g = 1, consider the ordering CBAFDG (Bucket G, Bucket D, Bucket F, Bucket A, Bucket B, Bucket C) and give the details.
