
Slide 1: Bayesian networks (2)
Lirong Xia, Thursday, Feb 25, 2014

Slide 2: Reminders
Pay attention to all reminders.
Written HW 1 is out, due this Friday before class:
–typed printout preferred, or legible print handwriting; we may return illegible HWs without grading
–no email submissions
Midterm Mar 7:
–in class
–open book
–no smartphones, laptops, or WiFi
Project 2 deadline: Mar 18, midnight.

Slide 3: Reassigned project 1
Average score for reassigned project 1: 15.97.
Usage of the old util.py:
–should not matter if you only use the member functions; if so, let me know
–might be problematic if you directly access the member variables
–advice: make sure you test all your files with the autograder for project 2

Slide 4: Last class
Bayesian networks:
–compact, graphical representation of a joint probability distribution
–conditional independence

Slide 5: Bayesian network
Definition of a Bayesian network (Bayes' net or BN):
–A set of nodes, one per variable X
–A directed, acyclic graph
–A conditional distribution for each node: a collection of distributions over X, one for each combination of parents' values p(X | a1, …, an)
–CPT: conditional probability table
–Description of a noisy "causal" process
A Bayesian network = Topology (graph) + Local Conditional Probabilities
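The "topology + local CPTs" decomposition above can be written down directly as data. The sketch below is illustrative only: the two-node network, its variable names, and its numbers are my own assumptions, not from the slides.

```python
# A minimal sketch of "Bayesian network = topology + local CPTs" for a
# hypothetical two-node network Rain -> WetGrass with binary variables.
# All names and numbers here are illustrative assumptions.

# Topology: each node maps to the tuple of its parents (a DAG).
parents = {"Rain": (), "WetGrass": ("Rain",)}

# Local CPTs: for a binary variable we only store p(X = True | parent values),
# keyed by the tuple of parent values.
cpt = {
    "Rain": {(): 0.2},
    "WetGrass": {(True,): 0.9, (False,): 0.1},
}

def p(node, value, parent_values):
    """p(node = value | parents = parent_values) for a binary variable."""
    p_true = cpt[node][tuple(parent_values)]
    return p_true if value else 1.0 - p_true

print(p("WetGrass", True, (True,)))  # p(+WetGrass | +Rain) = 0.9
```

Storing only p(X = True | …) per parent combination is exactly what makes the representation compact: each node's table grows with the number of its parents, not with the total number of variables.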

Slide 6: Probabilities in BNs
Bayesian networks implicitly encode joint distributions:
–as a product of local conditional distributions: p(x1, …, xn) = Πi p(xi | parents(Xi))
–this lets us reconstruct any entry of the full joint
Not every BN can represent every joint distribution:
–the topology enforces certain conditional independencies
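As a quick worked instance of "joint = product of local conditionals", consider a hypothetical three-node network R → G ← S (R and S have no parents; G has parents R and S), with numbers borrowed from the rain-and-sprinklers CPTs later in this deck:

```python
# One joint entry reconstructed as a product of local conditionals:
# p(+r, +s, +g) = p(+r) * p(+s) * p(+g | +r, +s).
# Numbers are borrowed from the rain-and-sprinklers CPTs later in the deck;
# the three-node sub-network itself is an illustrative assumption.

p_r = 0.2             # p(+R)
p_s = 0.6             # p(+S)
p_g_given_rs = 0.9    # p(+G | +R, +S)

joint = p_r * p_s * p_g_given_rs
print(joint)  # ~ 0.108
```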

Slide 7: Reachability (D-Separation)
Question: are X and Y conditionally independent given evidence variables {Z}?
–Yes, if X and Y are "separated" by Z
–Look for active paths from X to Y
–No active paths = independence!
A path is active if each triple along it is active:
–Causal chain A→B→C where B is unobserved (either direction)
–Common cause A←B→C where B is unobserved
–Common effect A→B←C where B or one of its descendants is observed
A single inactive triple is all it takes to block a path.

Slide 8: Checking conditional independence from the BN graph
Given random variables Z1, …, Zp, we are asked whether X ⊥ Y | Z1, …, Zp.
–Step 1: shade Z1, …, Zp
–Step 2: for each undirected path from X to Y, if all triples on the path are active, then X and Y are NOT conditionally independent
–If all paths have been checked and none of them is active, then X ⊥ Y | Z1, …, Zp
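The two-step check above can be sketched as code. The helper names and the tiny common-effect example are my own, not from the slides; the sketch tests whether one given undirected path is active by classifying each consecutive triple.

```python
# Hedged sketch of the path-activity test behind d-separation.
# `edges` holds directed edges as (parent, child) pairs; `children` is the
# adjacency list used to find descendants; `evidence` is the shaded set.
# All helper names and the example network are illustrative assumptions.

def descendants(node, children):
    """All nodes reachable from `node` via directed edges."""
    out, stack = set(), [node]
    while stack:
        for c in children.get(stack.pop(), ()):
            if c not in out:
                out.add(c)
                stack.append(c)
    return out

def triple_active(a, b, c, edges, evidence, children):
    """Is the triple a - b - c active given the evidence set?"""
    ab, ba = (a, b) in edges, (b, a) in edges
    bc, cb = (b, c) in edges, (c, b) in edges
    if (ab and bc) or (cb and ba):   # causal chain (either direction)
        return b not in evidence
    if ba and bc:                    # common cause a <- b -> c
        return b not in evidence
    if ab and cb:                    # common effect a -> b <- c
        return b in evidence or bool(descendants(b, children) & evidence)
    return False

def path_active(path, edges, evidence, children):
    """A path is active iff every consecutive triple on it is active."""
    return all(triple_active(path[i], path[i + 1], path[i + 2],
                             edges, evidence, children)
               for i in range(len(path) - 2))

# Tiny example: common effect X -> Z <- Y.
edges = {("X", "Z"), ("Y", "Z")}
children = {"X": ["Z"], "Y": ["Z"]}
print(path_active(["X", "Z", "Y"], edges, set(), children))   # blocked
print(path_active(["X", "Z", "Y"], edges, {"Z"}, children))   # observing Z activates it
```

A full d-separation check would enumerate every undirected path from X to Y and declare independence only if none is active, exactly as in Step 2 above.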

Slide 9: Example (graph shown on slide; answer: Yes!)

Slide 10: Example (graph shown on slide; answer: Yes!)

Slide 11: Example (graph shown on slide; answer: Yes!)
Variables:
–R: Raining
–T: Traffic
–D: Roof drips
–S: I am sad
Questions: (shown on slide)

Slide 12: Today: inference by variable elimination

Slide 13: Inference
Inference: calculating some useful quantity from a joint probability distribution.
Examples:
–Posterior probability: p(Q | E1 = e1, …, Ek = ek)
–Most likely explanation: argmax_q p(Q = q | E1 = e1, …, Ek = ek)

Slide 14: Inference
Given unlimited time, inference in BNs is easy.
Recipe:
–State the marginal probabilities you need
–Figure out ALL the atomic probabilities you need
–Calculate and combine them
(Example shown on slide.)

Slide 15: Example: Enumeration
In this simple method, we only need the BN to synthesize the joint entries.

Slide 16: Inference by Enumeration? (details shown on slide)

Slide 17: A more elaborate rain-and-sprinklers example
Structure: R → N; R, S → G; N, G → D
(R: Rained, S: Sprinklers were on, G: Grass wet, N: Neighbor walked dog, D: Dog wet)
CPTs:
p(+R) = .2    p(+S) = .6
p(+N|+R) = .3    p(+N|-R) = .4
p(+G|+R,+S) = .9    p(+G|+R,-S) = .7    p(+G|-R,+S) = .8    p(+G|-R,-S) = .2
p(+D|+N,+G) = .9    p(+D|+N,-G) = .4    p(+D|-N,+G) = .5    p(+D|-N,-G) = .3

Slide 18: Inference
We want to know: p(+R|+D) = p(+R,+D) / p(+D).
Let's first compute p(+R,+D).
(CPTs as on Slide 17.)

Slide 19: Inference, continued
p(+R,+D) = Σs Σg Σn p(+R) p(s) p(n|+R) p(g|+R,s) p(+D|n,g)
         = p(+R) Σs p(s) Σg p(g|+R,s) Σn p(n|+R) p(+D|n,g)
(CPTs as on Slide 17.)
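The nested sum on this slide can be checked by brute-force enumeration over s, g, and n with R fixed to true. The CPT values below are the ones from the example; the helper names are mine.

```python
# Brute-force enumeration of
#   p(+R, +D) = sum over s, g, n of p(+R) p(s) p(n|+R) p(g|+R,s) p(+D|n,g).
# CPT values are from the rain-and-sprinklers example; names are my own.

p_R = 0.2                                  # p(+R)
p_S = 0.6                                  # p(+S)
p_N = {True: 0.3, False: 0.4}              # p(+N | r), keyed by r
p_G = {(True, True): 0.9, (True, False): 0.7,
       (False, True): 0.8, (False, False): 0.2}   # p(+G | r, s)
p_D = {(True, True): 0.9, (True, False): 0.4,
       (False, True): 0.5, (False, False): 0.3}   # p(+D | n, g)

def bern(p_true, value):
    """Probability of `value` for a binary variable with p(True) = p_true."""
    return p_true if value else 1.0 - p_true

p_RD = sum(p_R * bern(p_S, s) * bern(p_N[True], n)
           * bern(p_G[(True, s)], g) * p_D[(n, g)]
           for s in (True, False)
           for g in (True, False)
           for n in (True, False))
print(round(p_RD, 5))  # 0.11356
```

Enumeration touches all eight (s, g, n) assignments; variable elimination, introduced next, factors this same sum so that each partial result is reused instead of recomputed.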

Slide 20: Staring at the formula
p(+R) Σs p(s) Σg p(g|+R,s) Σn p(n|+R) p(+D|n,g)
With elimination order s > g > n:
–Σs p(s) … only involves s
–Σg p(g|+R,s) … only involves s and g
–Σn p(n|+R) p(+D|n,g) only involves s, g, and n

Slide 21: Variable elimination
From the factor Σn p(n|+R) p(+D|n,g), we sum out n to obtain a factor depending only on g:
[Σn p(n|+R) p(+D|n,+G)] = p(+N|+R) p(+D|+N,+G) + p(-N|+R) p(+D|-N,+G) = .3 × .9 + .7 × .5 = .62
[Σn p(n|+R) p(+D|n,-G)] = p(+N|+R) p(+D|+N,-G) + p(-N|+R) p(+D|-N,-G) = .3 × .4 + .7 × .3 = .33
Continuing to the left, g is summed out next, and so on (continued on the board).
(CPTs as on Slide 17.)
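The sum-out step on this slide looks like the following in code; the dictionary layout is my own choice, while the numbers are the slide's CPT entries.

```python
# Summing out n from p(n|+R) * p(+D|n,g) leaves a factor over g alone.
# Values are the slide's CPT entries; the dict layout is my own.

p_n_given_R = {True: 0.3, False: 0.7}                     # p(n | +R)
p_D_given_ng = {(True, True): 0.9, (True, False): 0.4,
                (False, True): 0.5, (False, False): 0.3}  # p(+D | n, g)

factor_g = {g: sum(p_n_given_R[n] * p_D_given_ng[(n, g)]
                   for n in (True, False))
            for g in (True, False)}

print(round(factor_g[True], 5))   # 0.62  (.3*.9 + .7*.5)
print(round(factor_g[False], 5))  # 0.33  (.3*.4 + .7*.3)
```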

Slide 22: Elimination order matters
p(+R,+D) = Σn Σs Σg p(+R) p(s) p(n|+R) p(g|+R,s) p(+D|n,g)
         = p(+R) Σn p(n|+R) Σs p(s) Σg p(g|+R,s) p(+D|n,g)
With this order, the factor produced by the innermost sum depends on two variables (both n and s remain after summing out g)!
(CPTs as on Slide 17.)

Slide 23: General method for variable elimination
To compute a marginal probability p(x1, …, xp) in a Bayesian network:
–Let Y1, …, Yk denote the remaining variables
–Step 1: fix an order over the Y's (w.l.o.g. Y1 > … > Yk)
–Step 2: rewrite the summation as Σy1 Σy2 … Σyk-1 Σyk (product of all CPTs)
–Step 3: eliminate variables from right to left: summing out Yk leaves something involving only Y1, …, Yk-1 and the X's; each further elimination leaves something involving fewer Y's, until only something involving the X's remains
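Steps 1–3 above can be sketched as a generic factor-based elimination routine. The factor representation, helper names, and driver below are my own construction, not from the slides; I check it against the rain-and-sprinklers numbers.

```python
# A hedged sketch of general variable elimination over binary variables.
# A factor is (vars, table): a tuple of variable names plus a dict from
# assignment tuples (aligned with vars) to values. Representation and
# helper names are illustrative assumptions.
from itertools import product

def multiply(f1, f2):
    """Pointwise product of two factors over the union of their variables."""
    v1, t1 = f1
    v2, t2 = f2
    vs = v1 + tuple(x for x in v2 if x not in v1)
    table = {}
    for assign in product((True, False), repeat=len(vs)):
        env = dict(zip(vs, assign))
        table[assign] = (t1[tuple(env[x] for x in v1)]
                         * t2[tuple(env[x] for x in v2)])
    return vs, table

def sum_out(f, var):
    """Eliminate var from factor f by summing over its two values."""
    vs, t = f
    i = vs.index(var)
    new_vs = vs[:i] + vs[i + 1:]
    new_t = {}
    for assign, val in t.items():
        key = assign[:i] + assign[i + 1:]
        new_t[key] = new_t.get(key, 0.0) + val
    return new_vs, new_t

def eliminate(factors, order):
    """For each Y in order: multiply the factors mentioning Y, sum Y out."""
    factors = list(factors)
    for y in order:
        touching = [f for f in factors if y in f[0]]
        factors = [f for f in factors if y not in f[0]]
        prod_f = touching[0]
        for f in touching[1:]:
            prod_f = multiply(prod_f, f)
        factors.append(sum_out(prod_f, y))
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result

# CPTs of the rain-and-sprinklers example, written as factors:
fR = (("R",), {(True,): 0.2, (False,): 0.8})
fS = (("S",), {(True,): 0.6, (False,): 0.4})
fN = (("N", "R"), {(True, True): 0.3, (False, True): 0.7,
                   (True, False): 0.4, (False, False): 0.6})
fG = (("G", "R", "S"), {(True, True, True): 0.9, (False, True, True): 0.1,
                        (True, True, False): 0.7, (False, True, False): 0.3,
                        (True, False, True): 0.8, (False, False, True): 0.2,
                        (True, False, False): 0.2, (False, False, False): 0.8})
fD = (("D", "N", "G"), {(True, True, True): 0.9, (False, True, True): 0.1,
                        (True, True, False): 0.4, (False, True, False): 0.6,
                        (True, False, True): 0.5, (False, False, True): 0.5,
                        (True, False, False): 0.3, (False, False, False): 0.7})

# Eliminate N, then G, then S; the result is a factor over R and D whose
# (R=True, D=True) entry is the joint p(+R, +D).
vs, table = eliminate([fR, fS, fN, fG, fD], ["N", "G", "S"])
print(round(table[(True, True)], 5))  # 0.11356
```

This matches the enumeration result from earlier, and the per-step factors it builds are exactly the .62/.33 and .591/.533 intermediates worked out on the board.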

Slide 24: Trees make our life much easier
Choose an extreme (leaf) variable to eliminate first:
… Σx4 p(x4|x1,x2) … Σx5 p(x5|x4) … = … Σx4 p(x4|x1,x2) [Σx5 p(x5|x4)] …
Why does this not work well for general BNs?
(Tree with nodes X1, …, X8 shown on slide.)
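Why leaves are "extreme": summing an unobserved leaf's CPT over the leaf's own values gives 1 for every parent value, so the leaf simply drops out of the product. The CPT numbers below are hypothetical, not from the slide.

```python
# Eliminating an unobserved leaf X5 first: sum_x5 p(x5 | x4) = 1 for every
# value of the parent x4, so the leaf disappears from the product.
# CPT numbers are hypothetical, not from the slide.

p_x5_given_x4 = {(True, True): 0.75, (False, True): 0.25,
                 (True, False): 0.5, (False, False): 0.5}  # p(x5 | x4)

sums = {x4: sum(p_x5_given_x4[(x5, x4)] for x5 in (True, False))
        for x4 in (True, False)}
print(sums)  # each value is 1.0
```

In a general (non-tree) BN a node can have many parents and appear in large shared factors, so eliminating it can create a factor over all of its neighbors instead of making anything vanish.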
