
1 Junction Tree Algorithm Brookes Vision Reading Group

2 Outline
Graphical Models
– What are Graphical Models?
– Conditional Independence
– Inference
Junction Tree Algorithm
– Moralizing a graph
– Junction Tree Property
– Creating a junction tree
– Inference using the junction tree algorithm

3 Don’t we all know …
P(A) = 1 if A is certain
P(A or B) = P(A) + P(B) if A and B are mutually exclusive
P(A,B) = P(A|B) P(B) = P(B|A) P(A)
Conditional independence:
– A is conditionally independent of C given B
– P(A|B,C) = P(A|B)


5 Graphical Models Compact graphical representation of joint probability. A B P(A,B) = P(A)P(B|A) A ‘causes’ B

6 Graphical Models Compact graphical representation of joint probability. A B P(A,B) = P(B)P(A|B) B ‘causes’ A

7 Graphical Models Compact graphical representation of joint probability. A B P(A,B)

8 A Simple Example
P(A,B,C) = P(A) P(B,C|A) = P(A) P(B|A) P(C|B,A)
If C is conditionally independent of A given B:
P(A,B,C) = P(A) P(B|A) P(C|B)
Graphical representation?

9 Bayesian Network
Directed Graphical Model
P(U) = ∏ P(Vᵢ | Pa(Vᵢ))
Chain A → B → C:
P(A,B,C) = P(A) P(B|A) P(C|B)
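As a quick numeric check of this factorization, here is a minimal sketch for the chain A → B → C; all CPT values below are made up purely for illustration:

```python
# Chain A -> B -> C with hypothetical CPTs; P(U) = P(A) P(B|A) P(C|B).
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # key (b, a)
P_C_given_B = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}  # key (c, b)

def joint(a, b, c):
    """Directed factorization over the chain."""
    return P_A[a] * P_B_given_A[(b, a)] * P_C_given_B[(c, b)]

# A valid joint distribution must sum to one.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
assert abs(total - 1.0) < 1e-12
```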

10 Markov Random Fields
Undirected Graphical Model
Chain A — B — C

11 Markov Random Fields
Undirected Graphical Model
Cliques {A,B} and {B,C}, separator {B}
P(U) = ∏ P(Clique) / ∏ P(Separator)
P(A,B,C) = P(A,B) P(B,C) / P(B)
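The undirected factorization can be verified on the same kind of chain: with cliques {A,B} and {B,C} and separator {B}, multiplying the clique marginals and dividing by the separator marginal recovers the joint. A sketch with made-up numbers:

```python
# Build a joint over the chain A - B - C (hypothetical CPTs), then check
# P(A,B,C) = P(A,B) P(B,C) / P(B).
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # key (b, a)
P_C_given_B = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}  # key (c, b)
joint = {(a, b, c): P_A[a] * P_B_given_A[(b, a)] * P_C_given_B[(c, b)]
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}

# Clique and separator marginals, by summing out the other variables.
P_AB, P_BC, P_B = {}, {}, {}
for (a, b, c), p in joint.items():
    P_AB[(a, b)] = P_AB.get((a, b), 0.0) + p
    P_BC[(b, c)] = P_BC.get((b, c), 0.0) + p
    P_B[b] = P_B.get(b, 0.0) + p

# The identity holds because C is independent of A given B by construction.
for (a, b, c), p in joint.items():
    assert abs(p - P_AB[(a, b)] * P_BC[(b, c)] / P_B[b]) < 1e-12
```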


13 Bayesian Networks A is conditionally independent of B given C Bayes ball cannot reach A from B

14 Markov Random Fields
A, B, C: (sets of) nodes
C is conditionally independent of A given B if all paths from A to C go through B

15 Markov Random Fields

16 A node is conditionally independent of all others given its neighbours.


18 MAP Estimation
(c*, s*, r*, w*) = argmax_{c,s,r,w} P(C=c, S=s, R=r, W=w)

19 Computing Marginals
P(W=w) = Σ_{c,s,r} P(C=c, S=s, R=r, W=w)
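This sum can be written out directly. A brute-force sketch for the cloudy/sprinkler/rain/wet-grass network; the CPT numbers below are the common textbook values, used here purely as an illustration:

```python
# P(W=w) = sum over c, s, r of P(C=c) P(S=s|C=c) P(R=r|C=c) P(W=w|S=s,R=r).
P_C = {0: 0.5, 1: 0.5}
P_S_given_C = {(1, 1): 0.1, (0, 1): 0.9, (1, 0): 0.5, (0, 0): 0.5}      # key (s, c)
P_R_given_C = {(1, 1): 0.8, (0, 1): 0.2, (1, 0): 0.2, (0, 0): 0.8}      # key (r, c)
P_W1_given_SR = {(1, 1): 0.99, (1, 0): 0.90, (0, 1): 0.90, (0, 0): 0.0}  # key (s, r)

def joint(c, s, r, w):
    p_w = P_W1_given_SR[(s, r)] if w == 1 else 1.0 - P_W1_given_SR[(s, r)]
    return P_C[c] * P_S_given_C[(s, c)] * P_R_given_C[(r, c)] * p_w

# Enumeration costs O(2^n) in the number of variables;
# the junction tree algorithm is there to avoid this blow-up.
P_W1 = sum(joint(c, s, r, 1) for c in (0, 1) for s in (0, 1) for r in (0, 1))
```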


21 Aim
To perform exact inference efficiently
– Transform the graph into an appropriate data structure
– Ensure the joint probability remains the same
– Ensure exact marginals can be computed

22 Junction Tree Algorithm
Converts a Bayes net into an undirected tree
– Joint probability remains unchanged
– Exact marginals can be computed
Why?
– Uniform treatment of Bayes nets and MRFs
– Efficient inference is possible for undirected trees


24 Let us recap, shall we?
P(U) = ∏ P(Vᵢ | Pa(Vᵢ)) = ∏ a(Vᵢ, Pa(Vᵢ))
Each factor a(Vᵢ, Pa(Vᵢ)) is a potential
Let’s convert this to an undirected graphical model
[figure: the diamond A → B, A → C, B → D, C → D]

25 Let us recap, shall we?
[figure: the diamond with edge directions dropped]
Wait a second … something is wrong here. The cliques of this graph are inconsistent with the original one. Node D just lost a parent.

26 Solution
Ensure that a node and its parents are part of the same clique
Marry the parents for a happy family
Now you can make the graph undirected

27 Solution
A few conditional independences are lost. But then, we have only added extra edges, haven’t we?


29 Moralizing a graph
– Marry all unconnected parents
– Drop the edge directions
– Ensure the joint probability remains the same
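The two graph operations above can be sketched in a few lines. The graph encoding (a child → parents map) is my own choice; the example reuses the sprinkler network from the marginalization slides:

```python
# Moralization: given the parents of each node, marry unconnected parents
# and drop edge directions, returning an undirected edge set.
def moralize(parents):
    edges = set()
    for child, pa in parents.items():
        for p in pa:
            edges.add(frozenset((p, child)))          # drop direction
        pa = sorted(pa)
        for i in range(len(pa)):
            for j in range(i + 1, len(pa)):
                edges.add(frozenset((pa[i], pa[j])))  # marry the parents
    return edges

# The sprinkler network: C -> S, C -> R, S -> W, R -> W.
g = moralize({'C': set(), 'S': {'C'}, 'R': {'C'}, 'W': {'S', 'R'}})
assert frozenset(('S', 'R')) in g   # W's parents got married
assert len(g) == 5                  # 4 original edges + 1 marriage
```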

30 Moralizing a graph

31 Cliques: {C,S,R}, {S,R,W}; separator: {S,R}
Clique potentials a(Cᵢ), separator potentials a(Sᵢ)
Initialize a(Cᵢ) = 1, a(Sᵢ) = 1

32 Moralizing a graph
– Choose one node Vᵢ
– Find one clique Cᵢ containing Vᵢ and Pa(Vᵢ)
– Multiply a(Vᵢ, Pa(Vᵢ)) into a(Cᵢ)
– Repeat for all Vᵢ


36 Moralizing a graph
P(U) = ∏ a(Cᵢ) / ∏ a(Sᵢ)
Now we can form a tree with all the cliques we chose. That was easy. We’re ready to marginalize. OR ARE WE???

37 A few more examples …
[figure: a four-node cycle on A, B, C, D and its undirected version]


39 Cliques: {A,B}, {B,D}, {C,D}, {A,C}. Chaining them into a tree gives an inconsistency in C. Clearly we’re missing something here.


41 Junction Tree Property
In a junction tree, all cliques on the unique path between cliques Cᵢ and Cⱼ must contain Cᵢ ∩ Cⱼ
So what we want is a junction tree, right?
Q. Do all graphs have a junction tree?
A. NO
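The junction tree property is easy to check mechanically. A sketch; the clique-tree representation is my own choice, and the toy cliques match the ABD/BCD/CDE example used on the later slides:

```python
# Check: for every pair of cliques Ci, Cj, each clique on the unique tree
# path between them must contain the intersection Ci ∩ Cj.

def path(tree, i, j, seen=None):
    """Unique path between clique indices i and j in a tree (DFS)."""
    seen = seen or {i}
    if i == j:
        return [i]
    for k in tree[i]:
        if k not in seen:
            sub = path(tree, k, j, seen | {k})
            if sub:
                return [i] + sub
    return None

def has_jt_property(cliques, tree):
    n = len(cliques)
    for i in range(n):
        for j in range(i + 1, n):
            inter = cliques[i] & cliques[j]
            for k in path(tree, i, j):
                if not inter <= cliques[k]:
                    return False
    return True

cliques = [{'A', 'B', 'D'}, {'B', 'C', 'D'}, {'C', 'D', 'E'}]
chain = {0: [1], 1: [0, 2], 2: [1]}   # ABD - BCD - CDE: a junction tree
bad   = {0: [2], 1: [2], 2: [0, 1]}   # ABD - CDE - BCD: CDE lacks B
assert has_jt_property(cliques, chain)
assert not has_jt_property(cliques, bad)
```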

42 Decomposable Graphs
Undirected graph G = (V, E)
Decomposition (A, B, C):
– V = A ∪ B ∪ C
– All paths between A and B go through C
– C is a complete subset of V

43 Decomposable Graphs A, B and/or C can be empty A, B are non-empty in a proper decomposition

44 Decomposable Graphs
G is decomposable if and only if
– G is complete, OR
– G possesses a proper decomposition (A, B, C) such that both G_{A∪C} and G_{B∪C} are decomposable

45 Decomposable Graphs
[figure: the chordless four-cycle on A, B, C, D is not decomposable; with a chord added it is decomposable]

46 Decomposable Graphs
[figure: two five-node graphs on A, B, C, D, E — one not decomposable, one decomposable]

47 An Important Theorem Theorem: A graph G has a junction tree if and only if it is decomposable. Proof on white board.

48 OK. So how do I convert my graph into a decomposable one?

49 Time for more definitions
Chord of a cycle – an edge between two non-successive nodes
Chordless cycle – a cycle with no chords
Triangulated graph – a graph with no chordless cycles of length four or more

50 Another Important Theorem
Theorem: A graph G is decomposable if and only if it is triangulated.
Proof on white board.
Alright. So add edges to triangulate the graph.
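One standard way to add the chords is node elimination: process the nodes in some order, and whenever a node is eliminated, connect all of its remaining neighbours. This is a sketch of that generic construction, not necessarily the procedure used in the slides; the elimination order is arbitrary, and different orders give different triangulations:

```python
# Triangulation by elimination. adj maps node -> set of neighbours; the
# edges added while eliminating (the fill-in) are exactly the new chords.
def triangulate(adj, order):
    adj = {v: set(nb) for v, nb in adj.items()}   # work on a copy
    fill = set()
    for v in order:
        nbrs = sorted(adj[v])
        for i in range(len(nbrs)):
            for j in range(i + 1, len(nbrs)):
                a, b = nbrs[i], nbrs[j]
                if b not in adj[a]:               # connect remaining neighbours
                    adj[a].add(b)
                    adj[b].add(a)
                    fill.add(frozenset((a, b)))
        for n in adj[v]:                          # remove v from the graph
            adj[n].discard(v)
        del adj[v]
    return fill

# The slides' square A-B, A-C, B-D, C-D: one chord (B-C) triangulates it.
square = {'A': {'B', 'C'}, 'B': {'A', 'D'}, 'C': {'A', 'D'}, 'D': {'B', 'C'}}
chords = triangulate(square, ['A', 'B', 'C', 'D'])
assert chords == {frozenset(('B', 'C'))}
```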

51 Triangulating a Graph
[figure: the square on A, B, C, D with chord B–C added; cliques {A,B,C} and {B,C,D}, separator {B,C}]

52 Triangulating a Graph


54 Some Notes on Triangulation
Can we ensure the joint probability remains unchanged? Of course. Adding edges preserves the cliques found after moralization. Use the previous algorithm for initializing the potentials.
Aren’t more conditional independences lost? Yes. :-(

55 Some Notes on Triangulation
Is triangulation unique? No.
Okay then, let’s find the best triangulation. Sadly, that’s NP-hard.
Hang on. We still have a graph. We were promised a tree. Alright, let’s form a tree then.


57 Creating a Junction Tree
Cliques: {A,B,D}, {B,C,D}, {C,D,E}
The chain ABD – CDE – BCD is not a junction tree; the chain ABD – BCD – CDE is.
Clearly, we’re still missing something here.

58 Yet Another Theorem
Theorem: A maximum-weight spanning tree of the clique graph, with edge weights given by the cardinalities of the separators, is a junction tree (whenever one exists).
Proof on white board.
Alright. So let’s form an MST.

59 Forming an MST
Candidate edges between the cliques {A,B,D}, {B,C,D}, {C,D,E}, weighted by separator size: ABD–BCD (2), BCD–CDE (2), ABD–CDE (1)


62 Forming an MST
Keep the two weight-2 edges: the tree ABD – BCD – CDE
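The MST step above can be sketched with Kruskal's algorithm over the clique graph, taking |Cᵢ ∩ Cⱼ| as the edge weight (a maximum-weight tree, so heaviest separators first; the cliques are the slides' example):

```python
# Maximum-weight spanning tree over the clique graph (Kruskal + union-find).
def max_spanning_tree(cliques):
    n = len(cliques)
    edges = [(len(cliques[i] & cliques[j]), i, j)
             for i in range(n) for j in range(i + 1, n)
             if cliques[i] & cliques[j]]
    edges.sort(reverse=True)            # heaviest separators first
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                    # keep the edge if it joins components
            parent[ri] = rj
            tree.append((i, j, w))
    return tree

cliques = [{'A', 'B', 'D'}, {'B', 'C', 'D'}, {'C', 'D', 'E'}]
tree = max_spanning_tree(cliques)
# The two weight-2 edges survive; the weight-1 edge ABD-CDE is rejected.
assert sorted(w for _, _, w in tree) == [2, 2]
```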

63 A Quick Recap
The Asia network (nodes A, S, T, L, B, E, X, D)

64 A Quick Recap
1. Marry unconnected parents

65 A Quick Recap
2. Drop directionality of edges

66 A Quick Recap
3. Triangulate the graph

67 A Quick Recap
4. Find the MST clique tree. Voilà, the junction tree.
Cliques: {S,B,L}, {B,L,E}, {D,B,E}, {X,E}, {T,L,E}, {A,T}
Whew. Done!! But where are these marginals we were talking about?


69 Inference using JTA
Modify the potentials:
– Ensure the joint probability remains consistent
– Ensure consistency between neighbouring cliques
– Ensure clique potentials = clique marginals
– Ensure separator potentials = separator marginals

70 Inference using JTA
Cliques V and W, separator S
1. a*(S) = Σ_{V\S} a(V)
2. a*(W) = a(W) a*(S) / a(S)
3. a**(S) = Σ_{W\S} a*(W)
4. a*(V) = a(V) a**(S) / a*(S)
Consistency: Σ_{V\S} a*(V) = a**(S) = Σ_{W\S} a*(W)

71 Inference using JTA
Cliques V and W, separator S
1. a*(S) = Σ_{V\S} a(V)
2. a*(W) = a(W) a*(S) / a(S)
3. a**(S) = Σ_{W\S} a*(W)
4. a*(V) = a(V) a**(S) / a*(S)
Joint probability remains the same: a*(V) a*(W) / a**(S) = a(V) a(W) / a(S)
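The four updates and both properties can be checked numerically. A sketch for cliques V = {A,B} and W = {B,C} with separator S = {B}; all potential values are arbitrary positive numbers chosen only to exercise the formulas:

```python
# Two-pass update between neighbouring cliques V and W via separator S.
aV = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 3.0, (1, 1): 4.0}  # key (a, b)
aW = {(0, 0): 1.0, (0, 1): 5.0, (1, 0): 2.0, (1, 1): 1.0}  # key (b, c)
aS = {0: 1.0, 1: 1.0}                                       # key b

# Pass 1: V -> W.
aS1 = {b: sum(aV[(a, b)] for a in (0, 1)) for b in (0, 1)}      # a*(S)
aW1 = {(b, c): aW[(b, c)] * aS1[b] / aS[b] for (b, c) in aW}    # a*(W)
# Pass 2: W -> V.
aS2 = {b: sum(aW1[(b, c)] for c in (0, 1)) for b in (0, 1)}     # a**(S)
aV1 = {(a, b): aV[(a, b)] * aS2[b] / aS1[b] for (a, b) in aV}   # a*(V)

for b in (0, 1):
    # Consistency: both cliques agree with the separator marginal.
    assert abs(sum(aV1[(a, b)] for a in (0, 1)) - aS2[b]) < 1e-12
    assert abs(sum(aW1[(b, c)] for c in (0, 1)) - aS2[b]) < 1e-12
    # Invariance: a(V) a(W) / a(S) is unchanged by the updates.
    for a in (0, 1):
        for c in (0, 1):
            before = aV[(a, b)] * aW[(b, c)] / aS[b]
            after = aV1[(a, b)] * aW1[(b, c)] / aS2[b]
            assert abs(before - after) < 1e-12
```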

72 One Last Theorem Theorem: After JTA, Potentials = Marginals Proof on white board. (Then we can all go home)

73 Happy Marginalizing

