Junction Tree Algorithm. 10-708: Probabilistic Graphical Models. Recitation: 10/04/07. Ramesh Nallapati.


1 Junction Tree Algorithm. 10-708: Probabilistic Graphical Models. Recitation: 10/04/07. Ramesh Nallapati

2 Cluster Graphs: A cluster graph K for a set of factors F is an undirected graph with the following properties: each node i is associated with a subset C_i ⊆ X; family-preserving property: each factor φ is such that scope[φ] ⊆ C_i for some i; each edge between C_i and C_j is associated with a sepset S_ij = C_i ∩ C_j. Execution of variable elimination defines a cluster graph: each factor used in elimination becomes a cluster node, and an edge is drawn between two clusters if a message is passed between them in elimination. Example: next slide.
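These definitions can be sketched in a few lines of plain Python. This is a minimal illustration, not code from the recitation: clusters are frozensets of variables, the sepset of an edge is C_i ∩ C_j, and `family_preserving` checks that every factor scope fits inside at least one cluster (the function names are mine).

```python
# Minimal cluster-graph sketch (illustrative names, not from the slides).
def sepset(ci, cj):
    # S_ij = C_i ∩ C_j
    return ci & cj

def family_preserving(clusters, factor_scopes):
    # every factor scope must be contained in at least one cluster
    return all(any(scope <= c for c in clusters) for scope in factor_scopes)

clusters = [frozenset("CD"), frozenset("DIG"), frozenset("GIS"),
            frozenset("GJSL"), frozenset("HGJ")]
scopes = [frozenset("CD"), frozenset("DIG"), frozenset("IS"),
          frozenset("GJSL"), frozenset("HGJ")]

print(family_preserving(clusters, scopes))       # True
print(sorted(sepset(clusters[1], clusters[2])))  # ['G', 'I']
```

The clusters and scopes above follow the elimination example that starts on the next slide.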

3 Variable Elimination to Junction Trees: Original graph. [Figure: the original directed graph over Coherence, Difficulty, Intelligence, Grade, SAT, Letter, Happy, Job]

4 Variable Elimination to Junction Trees: Moralized graph. [Figure: the moralized undirected graph over Coherence, Difficulty, Intelligence, Grade, SAT, Letter, Happy, Job]

5 Variable Elimination to Junction Trees: Triangulated graph. [Figure: the triangulated graph over Coherence, Difficulty, Intelligence, Grade, SAT, Letter, Happy, Job]

6 Variable Elimination to Junction Trees: Elimination ordering: C, D, I, H, G, S, L. Eliminating C creates cluster {C,D} with a message over {D}.

7 Variable Elimination to Junction Trees: Elimination ordering: C, D, I, H, G, S, L. Eliminating D creates cluster {D,I,G}, connected to {C,D} by sepset {D}, with a message over {G,I}.

8 Variable Elimination to Junction Trees: Elimination ordering: C, D, I, H, G, S, L. Eliminating I creates cluster {G,I,S}, connected to {D,I,G} by sepset {G,I}, with a message over {G,S}.

9 Variable Elimination to Junction Trees: Elimination ordering: C, D, I, H, G, S, L. Eliminating H creates cluster {H,G,J} with a message over {G,J}.

10 Variable Elimination to Junction Trees: Elimination ordering: C, D, I, H, G, S, L. Eliminating G creates cluster {G,J,S,L}, connected to {G,I,S} by sepset {G,S} and to {H,G,J} by sepset {G,J}; its message has scope {J,S,L}.

11 Variable Elimination to Junction Trees: Elimination ordering: C, D, I, H, G, S, L. Eliminating S creates cluster {J,S,L} with a message over {L,J}; such sub-cliques are absorbed when forming the minimal junction tree (slide 13).


13 Properties of Junction Tree: The cluster graph G induced by variable elimination is necessarily a tree (reason: each intermediate factor is used at most once). G satisfies the Running Intersection Property (RIP): (X ∈ C_i and X ∈ C_j) ⇒ X ∈ C_k for every C_k on the path between C_i and C_j. If C_i and C_j are neighboring clusters, and C_i passes message m_ij to C_j, then scope[m_ij] = S_ij. Let F be a set of factors over X. A cluster tree over F that satisfies RIP is called a junction tree. One can obtain a minimal junction tree by eliminating the sub-cliques: no redundancies. [Figure: clique tree {C,D}, {D,I,G}, {G,I,S}, {G,J,S,L}, {H,G,J} with sepsets {D}, {G,I}, {G,S}, {G,J} and sub-cliques {J,S,L}, {L,J}]
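The RIP condition can be checked mechanically. Below is a sketch (the helper names and the adjacency-list encoding of the tree are mine, not from the slides), run on the clique tree produced by the elimination above:

```python
from collections import deque
from itertools import combinations

def path(tree, src, dst):
    """Return the unique path of clique ids from src to dst (BFS on the tree)."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for v in tree[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    p, node = [], dst
    while node is not None:
        p.append(node)
        node = prev[node]
    return p[::-1]

def satisfies_rip(tree, cliques):
    # every variable shared by C_i and C_j must appear in every clique
    # on the path between them
    for i, j in combinations(cliques, 2):
        common = cliques[i] & cliques[j]
        if any(not common <= cliques[k] for k in path(tree, i, j)):
            return False
    return True

# clique tree from the elimination example on the earlier slides
cliques = {1: frozenset("CD"), 2: frozenset("DIG"), 3: frozenset("GIS"),
           4: frozenset("GJSL"), 5: frozenset("HGJ")}
tree = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
print(satisfies_rip(tree, cliques))  # True
```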

14 Junction Trees to Variable Elimination: Now we will assume a junction tree and show how to do variable elimination. [Figure: cliques 1: C,D; 2: G,I,D; 3: G,S,I; 4: G,J,S,L; 5: H,G,J with sepsets D; G,I; G,S; G,J]

15 Junction Trees to Variable Elimination: Initialize potentials first:
π⁰_1(C,D) = P(C)P(D|C)
π⁰_2(G,I,D) = P(G|D,I)
π⁰_3(G,S,I) = P(I)P(S|I)
π⁰_4(G,J,S,L) = P(L|G)P(J|S,L)
π⁰_5(H,G,J) = P(H|G,J)
[Figure: cliques 1: C,D; 2: G,I,D; 3: G,S,I; 4: G,J,S,L; 5: H,G,J with sepsets D; G,I; G,S; G,J]
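Initialization amounts to assigning each CPD to one clique whose scope covers it. The sketch below uses a greedy "first covering clique" rule, which is my illustration; the assignment is not unique, and the slide's particular choice puts P(I) in clique 3 whereas this rule puts it in clique 2.

```python
# Sketch: assign each CPD to some clique that covers its scope.
cliques = {1: frozenset("CD"), 2: frozenset("GID"), 3: frozenset("GSI"),
           4: frozenset("GJSL"), 5: frozenset("HGJ")}
cpd_scopes = {"P(C)": frozenset("C"), "P(D|C)": frozenset("CD"),
              "P(G|D,I)": frozenset("GDI"), "P(I)": frozenset("I"),
              "P(S|I)": frozenset("SI"), "P(L|G)": frozenset("LG"),
              "P(J|S,L)": frozenset("JSL"), "P(H|G,J)": frozenset("HGJ")}

def assign(cliques, cpd_scopes):
    out = {i: [] for i in cliques}
    for name, scope in cpd_scopes.items():
        # first clique whose scope contains the CPD's scope
        home = next(i for i in sorted(cliques) if scope <= cliques[i])
        out[home].append(name)
    return out

print(assign(cliques, cpd_scopes))
```

The initial potential π⁰_i is then the product of the CPDs assigned to clique i (and the constant 1 for cliques that receive none).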

16 Junction Trees to Variable Elimination: Pass messages (C_4 is the root):
δ_{1→2}(D) = Σ_C π⁰_1(C,D)
δ_{2→3}(G,I) = Σ_D π⁰_2(G,I,D) δ_{1→2}(D)
δ_{3→4}(G,S) = Σ_I π⁰_3(G,S,I) δ_{2→3}(G,I)
δ_{5→4}(G,J) = Σ_H π⁰_5(H,G,J)
π_4(G,J,S,L) = δ_{3→4}(G,S) δ_{5→4}(G,J) π⁰_4(G,J,S,L)
[Figure: cliques 1: C,D; 2: G,I,D; 3: G,S,I; 4: G,J,S,L; 5: H,G,J with sepsets D; G,I; G,S; G,J]
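This upward pass can be sketched with numpy, assuming binary variables and random positive tables as stand-ins for the CPD products (the array names and axis orders are mine; axes follow the variable lists above):

```python
import numpy as np

rng = np.random.default_rng(0)
pi1 = rng.random((2, 2))         # π⁰_1(C, D)
pi2 = rng.random((2, 2, 2))      # π⁰_2(G, I, D)
pi3 = rng.random((2, 2, 2))      # π⁰_3(G, S, I)
pi5 = rng.random((2, 2, 2))      # π⁰_5(H, G, J)
pi4 = rng.random((2, 2, 2, 2))   # π⁰_4(G, J, S, L)

d12 = pi1.sum(axis=0)                         # δ_{1→2}(D) = Σ_C π⁰_1
d23 = (pi2 * d12[None, None, :]).sum(axis=2)  # δ_{2→3}(G, I)
d34 = (pi3 * d23[:, None, :]).sum(axis=2)     # δ_{3→4}(G, S)
d54 = pi5.sum(axis=0)                         # δ_{5→4}(G, J)
pi4_final = pi4 * d34[:, None, :, None] * d54[:, :, None, None]

# summing the root belief over all its variables gives the partition function
print(pi4_final.sum())
```

Summing `pi4_final` over subsets of its axes yields (unnormalized) marginals over the root clique's variables.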

17 Junction Tree Calibration: The aim is to compute the marginals of each node using the least computation. Similar to the 2-pass sum-product algorithm: C_i transmits a message to its neighbor C_j after it receives messages from all other neighbors. Called the "Shafer-Shenoy" clique tree algorithm. [Figure: cliques 1: C,D; 2: G,I,D; 3: G,S,I; 4: G,J,S,L; 5: H,G,J]
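The scheduling rule just stated (send to a neighbor once all other neighbors have been heard from) can be sketched as follows; the scheduler code is my illustration, run on the five-clique tree from the slides:

```python
# Sketch of the Shafer-Shenoy message schedule: clique i may send to
# neighbor j only after receiving messages from all its other neighbors.
def message_schedule(tree):
    received = {i: set() for i in tree}        # who has sent to i so far
    sent, order = set(), []
    total = sum(len(nbrs) for nbrs in tree.values())  # directed edges
    while len(sent) < total:
        progress = False
        for i in tree:
            for j in tree[i]:
                if (i, j) in sent:
                    continue
                if set(tree[i]) - {j} <= received[i]:
                    sent.add((i, j))
                    received[j].add(i)
                    order.append((i, j))
                    progress = True
        if not progress:
            break
    return order

tree = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
sched = message_schedule(tree)
print(sched)  # every directed edge appears exactly once
```

After all 2(n-1) messages have been passed, every clique is calibrated: its belief is proportional to the marginal over its variables.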

18 Message Passing with Division: Consider the calibrated potential at node C_i whose neighbor is C_j: β_i = π⁰_i · Π_k δ_{k→i}, where the product runs over all neighbors k of C_i. The message from C_i to C_j uses the messages from every neighbor except C_j: δ_{i→j}(S_ij) = Σ_{C_i∖S_ij} π⁰_i · Π_{k≠j} δ_{k→i}. Hence, one can write: δ_{i→j}(S_ij) = (Σ_{C_i∖S_ij} β_i) / δ_{j→i}.

19 Message Passing with Division: The belief-update or Lauritzen-Spiegelhalter algorithm. Each cluster C_i maintains its fully updated current beliefs β_i. Each sepset S_ij maintains μ_ij, the previous message passed between C_i and C_j, regardless of direction. Any new message passed along C_i-C_j is divided by μ_ij.

20 Belief Update Message Passing: Example: chain of cliques 1: A,B; 2: B,C; 3: C,D with sepsets B and C. Suppose the stored sepset messages are μ_12 = δ_{1→2}(B) and μ_23 = δ_{3→2}(C). Regular message passing would send δ_{2→1}(B) = Σ_C π⁰_2(B,C) δ_{3→2}(C). The actual belief-update message is σ_{2→1}(B)/μ_12 = Σ_C β_2(B,C) / δ_{1→2}(B); since β_2 = π⁰_2 δ_{1→2} δ_{3→2}, the division cancels δ_{1→2} and the two coincide. This is what we expect to send in the regular message passing!

21 Belief Update Message Passing: Another example on the same chain. Suppose C_2 sends an uninformed message σ_{2→3}(C) before hearing from C_1, and the sepset stores it: μ_23 = σ_{2→3}(C). When C_3 replies with σ_{3→2}(C), the stored message is divided out, and σ_{3→2}(C)/μ_23 is exactly the message C_2 would have received from C_3 if C_2 didn't send an uninformed message. Order of messages doesn't matter!

22 Belief Update Message Passing: Junction tree invariance. Recall the junction tree measure: P_F(X) ∝ Π_i β_i(C_i) / Π_{(i,j)} μ_ij(S_ij). A message from C_i to C_j changes only β_j and μ_ij: β'_j = β_j σ_{i→j}/μ_ij and μ'_ij = σ_{i→j}, so β'_j/μ'_ij = β_j/μ_ij. Thus the measure remains unchanged for the updated potentials too!
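A quick numeric check of this invariance on a two-clique tree (random stand-in beliefs, binary variables; the `measure` helper is my illustration of the junction tree measure Π_i β_i / Π μ_ij):

```python
import numpy as np

rng = np.random.default_rng(2)
beta1 = rng.random((2, 2))   # β_1(A, B)
beta2 = rng.random((2, 2))   # β_2(B, C)
mu = np.ones(2)              # μ_12(B)

def measure(b1, b2, m):
    # unnormalized junction tree measure over (A, B, C)
    return b1[:, :, None] * b2[None, :, :] / m[None, :, None]

before = measure(beta1, beta2, mu)

# pass one belief-update message C1 -> C2
sigma = beta1.sum(axis=0)
beta2_new = beta2 * (sigma / mu)[:, None]
after = measure(beta1, beta2_new, sigma)

print(np.allclose(before, after))  # True: the measure is invariant
```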

23 Junction Trees from Chordal Graphs: Recall: a junction tree can be obtained from the induced graph of variable elimination. Alternative approach: using chordal graphs. Recall: any chordal graph has a clique tree, and chordal graphs can be obtained through triangulation. Finding a minimum triangulation, in which the largest clique has minimum size, is NP-hard.
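Since the optimal triangulation is NP-hard, heuristics are used in practice. The sketch below shows the common greedy min-fill heuristic (not an algorithm from the slides): repeatedly eliminate the vertex whose elimination adds the fewest fill-in edges.

```python
from itertools import combinations

def min_fill_triangulate(adj):
    """Greedy min-fill: return the fill-in edges added to make adj chordal."""
    adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
    remaining, fill = set(adj), []
    while remaining:
        def n_fill(v):
            nbrs = sorted(adj[v] & remaining)
            return sum(1 for a, b in combinations(nbrs, 2) if b not in adj[a])
        v = min(sorted(remaining), key=n_fill)   # fewest fill-in edges
        for a, b in combinations(sorted(adj[v] & remaining), 2):
            if b not in adj[a]:                  # connect v's neighbors
                adj[a].add(b)
                adj[b].add(a)
                fill.append((a, b))
        remaining.discard(v)
    return fill

# a 4-cycle A-B-C-D-A is not chordal; one chord triangulates it
adj = {"A": {"B", "D"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C", "A"}}
print(min_fill_triangulate(adj))  # [('B', 'D')]
```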

24 Junction Trees from Chordal Graphs: Maximum spanning tree algorithm. Original graph. [Figure: the original directed graph over Coherence, Difficulty, Intelligence, Grade, SAT, Letter, Happy, Job]

25 Junction Trees from Chordal Graphs: Maximum spanning tree algorithm. Undirected moralized graph. [Figure: the moralized undirected graph over Coherence, Difficulty, Intelligence, Grade, SAT, Letter, Happy, Job]

26 Junction Trees from Chordal Graphs: Maximum spanning tree algorithm. Chordal (triangulated) graph. [Figure: the triangulated graph over Coherence, Difficulty, Intelligence, Grade, SAT, Letter, Happy, Job]

27 Junction Trees from Chordal Graphs: Maximum spanning tree algorithm. Cluster graph: nodes {C,D}, {D,I,G}, {G,I,S}, {G,S,L}, {L,S,J}, {G,H}; each edge is weighted by the size of its sepset, e.g. weight 2 for {D,I,G}-{G,I,S} (sepset {G,I}) and weight 1 for {C,D}-{D,I,G} (sepset {D}).

28 Junction Trees from Chordal Graphs: Maximum spanning tree algorithm. Junction tree: {C,D}-{D,I,G} (sepset D), {D,I,G}-{G,I,S} (sepset G,I), {G,I,S}-{G,S,L} (sepset G,S), {G,S,L}-{L,S,J} (sepset S,L), and {G,H} attached by sepset G.
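The maximum spanning tree step can be sketched with Kruskal's algorithm over sepset-size weights (the union-find helper is my illustration; the clusters are those of the cluster graph on the previous slide):

```python
from itertools import combinations

clusters = {1: frozenset("CD"), 2: frozenset("DIG"), 3: frozenset("GIS"),
            4: frozenset("GSL"), 5: frozenset("LSJ"), 6: frozenset("GH")}

# cluster-graph edges weighted by sepset size, heaviest first
edges = sorted(((len(clusters[i] & clusters[j]), i, j)
                for i, j in combinations(clusters, 2)
                if clusters[i] & clusters[j]),
               reverse=True)

parent = {i: i for i in clusters}
def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path halving
        x = parent[x]
    return x

jtree = []
for w, i, j in edges:                   # Kruskal on descending weights
    ri, rj = find(i), find(j)
    if ri != rj:
        parent[ri] = rj
        jtree.append((i, j, w))
print(jtree)  # five edges; sepset sizes [2, 2, 2, 1, 1]
```

The five selected edges carry the sepsets {S,L}, {G,S}, {G,I}, {G}, {D}, matching the junction tree on this slide.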

29 Summary: The junction tree is a data structure for exact inference on general graphs. Two message-passing methods: Shafer-Shenoy, and belief-update (Lauritzen-Spiegelhalter). Junction trees can be constructed from chordal graphs via the maximum spanning tree approach.

