
1 CSCI 121 Special Topics: Bayesian Networks Lecture #3: Multiply-Connected Graphs and the Junction Tree Algorithm

2 Answering Queries: Problems Answering a query such as P(D | A) is difficult if the graph is not singly connected (a.k.a. a polytree). [Figures: two small graphs over nodes A, B, C, D.] Multiply connected: more than one path from A to D; how do we compute P(D | A)? Singly connected (polytree): exactly one path from A to D.

3 Dealing with Multiply Connected Graphs Three basic options: 1) Clustering – group the "offending" nodes into "meganodes". 2) Conditioning – set variables to definite values, then build a polytree for each combination of values. 3) Stochastic simulation – generate a large number of concrete models consistent with the domain. Clustering seems to be the most popular.

4 Clustering with the Junction-Tree Algorithm (Huang & Darwiche 1994) 0) Note that a BN is a directed acyclic graph (DAG). 1) "Moralize" the DAG: for each pair of parents A, B of a node C, draw an edge between A and B; then drop the arrow directions to obtain an undirected graph. [Figures: the example DAG over nodes A–H and its moral graph.]
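
A minimal sketch of moralization in Python, assuming the DAG is given as a dict mapping each node to its list of parents. The edge set below is an assumption chosen to match the classic eight-node example (A through H) that these slides appear to use.

    from itertools import combinations

    def moralize(parents):
        """Return the moral graph as a set of undirected edges (frozensets)."""
        edges = set()
        for child, pars in parents.items():
            # Keep every original arc, dropping its direction.
            for p in pars:
                edges.add(frozenset((p, child)))
            # "Marry" the parents: connect every pair of parents of the same child.
            for a, b in combinations(pars, 2):
                edges.add(frozenset((a, b)))
        return edges

    # Assumed DAG (the standard Huang & Darwiche example over A..H):
    parents = {
        'A': [], 'B': ['A'], 'C': ['A'], 'D': ['B'],
        'E': ['C'], 'F': ['D', 'E'], 'G': ['C'], 'H': ['E', 'G'],
    }
    moral = moralize(parents)
    print(frozenset('DE') in moral, frozenset('EG') in moral)  # True True: D-E and E-G were added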

5 Clustering with the Junction-Tree Algorithm 2) Triangulate the moral graph so that every cycle of length ≥ 4 has a chord: an edge between two nodes that are nonadjacent in the cycle. Choose the elimination order with a heuristic that minimizes the number of edges added, breaking ties by the smallest number of possible values. [Figures: the moral graph and its triangulated version over nodes A–H.]
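
A sketch of elimination-based triangulation, assuming the moral graph is a dict mapping each node to a set of neighbours. It uses only the "fewest edges added" (min-fill) part of the heuristic, and also records the elimination clusters, which are used to read off the cliques in the next step.

    def triangulate(moral_adj):
        adj = {v: set(ns) for v, ns in moral_adj.items()}   # working copy
        fill_edges, clusters = set(), []
        remaining = set(adj)

        def fill_cost(v):
            # Number of edges that eliminating v would add among its neighbours.
            nbrs = [n for n in adj[v] if n in remaining]
            return sum(1 for i, a in enumerate(nbrs) for b in nbrs[i + 1:]
                       if b not in adj[a])

        while remaining:
            v = min(remaining, key=fill_cost)               # min-fill choice
            nbrs = [n for n in adj[v] if n in remaining]
            clusters.append(frozenset([v, *nbrs]))          # elimination cluster
            for i, a in enumerate(nbrs):                    # connect neighbours pairwise
                for b in nbrs[i + 1:]:
                    if b not in adj[a]:
                        adj[a].add(b); adj[b].add(a)
                        fill_edges.add(frozenset((a, b)))
            remaining.remove(v)
        return fill_edges, clusters

    # Usage (moral graph as adjacency sets):
    # fill_edges, clusters = triangulate({'A': {'B', 'C'}, 'B': {'A', 'D'}, ...})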

6 Clustering with the Junction-Tree Algorithm 3) Identify the cliques of the triangulated graph: ABD, ADE, DEF, ACE, CEG, EGH. [Figure: the triangulated graph over nodes A–H with its cliques highlighted.] "Put your hands in the air and represent your clique!" – 112, "Peaches and Cream"
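
Given the elimination clusters recorded in the triangulation sketch above, the cliques are simply the maximal clusters (those not contained in a larger cluster). A minimal sketch; the cluster list below is hard-coded to the slide's example and its ordering is an assumption.

    def maximal_cliques(clusters):
        """Keep only clusters that are not contained in another cluster."""
        cliques = []
        for c in sorted(clusters, key=len, reverse=True):
            if not any(c <= q for q in cliques):
                cliques.append(c)
        return cliques

    # Clusters as they might come out of the elimination above (assumed order);
    # non-maximal clusters such as {E, G} and {A, D} are dropped.
    clusters = [frozenset('EGH'), frozenset('CEG'), frozenset('DEF'),
                frozenset('ACE'), frozenset('ADE'), frozenset('ABD'),
                frozenset('EG'), frozenset('AD')]
    print(maximal_cliques(clusters))   # the six cliques from the slide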

7 Clustering with the Junction-Tree Algorithm 4) Connect the cliques by separation sets (sepsets, the intersection of two adjacent cliques) to form the junction tree: cliques ABD, ADE, DEF, ACE, CEG, EGH joined by sepsets AD, DE, AE, CE, EG. [Figure: the junction tree.]
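
One common way to form the junction tree (assumed here, since the slide does not spell it out) is a maximum-weight spanning tree over the cliques, where an edge's weight is the size of the candidate sepset, i.e. the intersection of the two cliques. A sketch using the slide's cliques:

    cliques = [frozenset('ABD'), frozenset('ADE'), frozenset('DEF'),
               frozenset('ACE'), frozenset('CEG'), frozenset('EGH')]

    # Candidate edges between every pair of cliques, heaviest sepset first.
    candidates = sorted(
        ((len(x & y), x, y) for i, x in enumerate(cliques) for y in cliques[i + 1:]),
        key=lambda t: -t[0])

    parent = {c: c for c in cliques}        # union-find forest to avoid cycles
    def find(c):
        while parent[c] is not c:
            parent[c] = parent[parent[c]]   # path compression
            c = parent[c]
        return c

    tree = []
    for w, x, y in candidates:              # Kruskal-style greedy selection
        if w > 0 and find(x) is not find(y):
            parent[find(x)] = find(y)
            tree.append((x, x & y, y))      # (clique, sepset, clique)

    for x, sep, y in tree:
        print(''.join(sorted(x)), '--', ''.join(sorted(sep)), '--', ''.join(sorted(y)))
    # Prints the five junction-tree edges, with sepsets AD, DE, AE, CE, EG as on the slide.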

8 Marginalization At this point, each cluster (clique; meganode) has a joint probability table. To query a variable, we (heuristically) pick a cluster containing it and marginalize the clique potential Φ over the remaining variables. For the ABD clique:

A B D | Φ(A,B,D)
T T T | .225
T T F | .025
T F T | .125
T F F | .125
F T T | .180
F T F | .020
F F T | .150
F F F | .150

Summing out A and B gives P(D):
P(D=T) = .225 + .125 + .180 + .150 = .680
P(D=F) = .025 + .125 + .020 + .150 = .320
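
The same marginalization in a few lines of Python, using the ABD table from the slide (keys are (A, B, D) truth values):

    # ABD clique potential from the slide, indexed by (A, B, D).
    phi_ABD = {
        (True,  True,  True):  .225, (True,  True,  False): .025,
        (True,  False, True):  .125, (True,  False, False): .125,
        (False, True,  True):  .180, (False, True,  False): .020,
        (False, False, True):  .150, (False, False, False): .150,
    }

    # Sum out A and B, keeping D.
    p_D = {True: 0.0, False: 0.0}
    for (a, b, d), p in phi_ABD.items():
        p_D[d] += p

    print(p_D)   # approximately {True: 0.68, False: 0.32}, as on the slide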

9 Message-Passing Sepset potentials are initialized via marginalization. When evidence is presented ("John calls"), heuristically pick a "root" clique and pass messages around the tree: collect messages toward the root, then distribute them back out. [Figure: the junction tree from slide 7, with cliques ABD, ADE, DEF, ACE, CEG, EGH and sepsets AD, DE, AE, CE, EG.]
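
A sketch of a single message pass (absorption) from clique X into clique Y through sepset R, in the style of the Huang & Darwiche update: the new sepset potential is the marginal of X onto R, and Y is rescaled by the ratio of new to old sepset values. Potentials are assumed to be dicts keyed by tuples of values, ordered by the accompanying variable lists.

    def marginalize(phi, vars_from, vars_to):
        """Sum a potential over all variables not in vars_to."""
        keep = [vars_from.index(v) for v in vars_to]
        out = {}
        for assignment, p in phi.items():
            key = tuple(assignment[i] for i in keep)
            out[key] = out.get(key, 0.0) + p
        return out

    def pass_message(phi_X, vars_X, phi_R, vars_R, phi_Y, vars_Y):
        """Absorb from X into Y through sepset R; return updated (phi_R, phi_Y)."""
        new_R = marginalize(phi_X, vars_X, vars_R)           # project X onto the sepset
        idx = [vars_Y.index(v) for v in vars_R]
        new_Y = {}
        for assignment, p in phi_Y.items():
            key = tuple(assignment[i] for i in idx)
            old = phi_R[key]
            ratio = 0.0 if old == 0 else new_R.get(key, 0.0) / old   # 0/0 treated as 0
            new_Y[assignment] = p * ratio                     # rescale Y by new/old
        return new_R, new_Y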

10 Message-Passing Messages are passed from clique X to clique Y through sepset R by multiplying and dividing table entries. Evidence is entered by "masking" table entries with a bit vector (or, more generally, a likelihood distribution). E.g., observing B = F zeroes every entry with B = T in the ABD table:

A B D | Φ(A,B,D)
T T T | 0
T T F | 0
T F T | .125
T F F | .125
F T T | 0
F T F | 0
F F T | .150
F F F | .150
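
A sketch of entering the evidence B = F by masking the ABD table with a 0/1 likelihood vector over B (the same table as above):

    phi_ABD = {
        (True,  True,  True):  .225, (True,  True,  False): .025,
        (True,  False, True):  .125, (True,  False, False): .125,
        (False, True,  True):  .180, (False, True,  False): .020,
        (False, False, True):  .150, (False, False, False): .150,
    }

    likelihood_B = {True: 0.0, False: 1.0}   # observation: B = False

    # Zero out every entry that is inconsistent with the evidence.
    masked = {(a, b, d): p * likelihood_B[b] for (a, b, d), p in phi_ABD.items()}
    print(masked)   # only the B = False rows keep their mass, matching the slide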

