
1 Belief networks
Conditional independence
Syntax and semantics
Exact inference
Approximate inference
Mundhenk and Itti, 2008. Based on material from S. Russell and P. Norvig.

2 Independence

3 Conditional independence

4 Conditional Independence
(Diagram: Cavity with children Toothache and Catch.)
Other interactions may exist, but they are either insignificant, unknown, or irrelevant; we leave them out.

5 Conditional Independence
(Same diagram: Cavity with children Toothache and Catch.)
We assume that a "catch" is not influenced by a toothache, and vice versa.

6 Conditional independence

7 Conditional Independence
(1a) Since Catch does not affect Toothache once Cavity is known:
P(Toothache | Catch, Cavity) = P(Toothache | Cavity)

8 Conditional Independence
(1b) Algebraically, these statements are equivalent:
P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)
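To see why (1a) gives (1b), expand with the product rule:

P(Toothache, Catch | Cavity)
  = P(Toothache | Catch, Cavity) P(Catch | Cavity)   [product rule]
  = P(Toothache | Cavity) P(Catch | Cavity)          [by 1a]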

9 Conditional independence

10 Belief networks

11 Example
Probabilities derived from prior observations.

12 Typical Bayesian Network
(Network: Burglary → Alarm ← Earthquake; Alarm → JohnCalls, Alarm → MaryCalls.)

P(B) = 0.001    P(E) = 0.002

B E | P(A)
T T | 0.95
T F | 0.94
F T | 0.29
F F | 0.001

A | P(J)        A | P(M)
T | 0.90        T | 0.70
F | 0.05        F | 0.01

Here we see both the topology and the conditional probability tables (CPTs).
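A minimal sketch of how this topology and these CPTs might be written down in Python (our own illustration; the variable names and the helper odds are ours, not from the slides):

# Burglary network from slide 12: priors for the root nodes,
# and conditional tables keyed by the values of the parents.
P_B = 0.001
P_E = 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(A=true | B, E)
P_J = {True: 0.90, False: 0.05}   # P(J=true | A)
P_M = {True: 0.70, False: 0.01}   # P(M=true | A)

def odds(p_true, value):
    # Probability that a Boolean variable equals `value`,
    # given the probability p_true that it is true.
    return p_true if value else 1.0 - p_true

def joint(b, e, a, j, m):
    # Chain rule for this topology:
    # P(b, e, a, j, m) = P(b) P(e) P(a | b, e) P(j | a) P(m | a)
    return (odds(P_B, b) * odds(P_E, e) * odds(P_A[(b, e)], a)
            * odds(P_J[a], j) * odds(P_M[a], m))

For example, joint(False, False, True, True, True) evaluates the query posed on slide 17 (alarm sounds, no burglary, no earthquake, both call).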

13 Semantics

14 Semantics
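For reference, the standard definition from Russell and Norvig, on which these slides are based: a belief network represents the full joint distribution as a product of local conditionals, one per node, each conditioned only on its parents:

P(x1, ..., xn) = Π_i P(xi | parents(Xi))

Every entry of the full joint can therefore be read straight off the CPTs, which is what the examples below do.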

15 Markov blanket
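(For reference, the standard definition: a node's Markov blanket consists of its parents, its children, and its children's other parents; given its Markov blanket, a node is conditionally independent of every other node in the network.)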

16 Constructing belief networks

17 Example: Full Joint Distribution
(Same network and CPTs as slide 12.)
What is the probability that the alarm has sounded, but neither a burglary nor an earthquake has occurred, and both John and Mary call?
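By the factorization above, this is just a product of CPT entries; a quick check in plain arithmetic (our own illustration):

# P(j, m, a, ¬b, ¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
p = 0.90 * 0.70 * 0.001 * 0.999 * 0.998
print(p)   # about 0.000628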

18 (Same network and CPTs as slide 12, used to work the example.)

19 What if we find a new variable?
We find that storms can also set off alarms, so we add Storm to the network and extend the CPT for Alarm.
(Network: Burglary, Storm, Earthquake → Alarm; Alarm → JohnCalls, Alarm → MaryCalls.)

P(B) = 0.001    P(S) = 0.1    P(E) = 0.002

B S E | P(A)
T T T | 0.98
T T F | 0.94
T F T | 0.96
T F F | 0.95
F T T | 0.45
F T F | 0.25
F F T | 0.29
F F F | 0.001

A | P(J)        A | P(M)
T | 0.90        T | 0.70
F | 0.05        F | 0.01

Notice that the CPTs for JohnCalls and MaryCalls stay the same: storms were always there, just unaccounted for, so John's and Mary's behavior has not changed. We do, however, get better precision in P(A).

20 What if we cause a new variable? (or a new variable just occurs)
What if we inject a new cause that was not there before? Suppose we pay a crazy guy to set off the alarm frequently.
(Network: Burglary, CrazyGuy, Earthquake → Alarm; Alarm → JohnCalls, Alarm → MaryCalls.)

P(B) = 0.001    P(C) = 0.25    P(E) = 0.002

B C E | P(A)
T T T | 0.98
T T F | 0.94
T F T | 0.96
T F F | 0.95
F T T | 0.45
F T F | 0.25
F F T | 0.29
F F F | 0.001

A | P(J)        A | P(M)
T | 0.90        T | 0.70
F | 0.05        F | 0.01

The CPTs for JohnCalls and MaryCalls may no longer be valid, since we may have changed behaviors: for instance, the alarm now goes off so often that John and Mary are more likely to ignore it.

21 What if we cause a new variable?
(Same network and CPTs as slide 20.)
If the introduced variable is highly erratic, it can invalidate even more of the CPT than we would like.

22 What if we cause a new variable?
(Same network and CPTs as slide 20.)
However, some changes to the CPT may be so absurd that we never have to worry about them.

23 Things in the model can change
We can account for change in the model over time (a more advanced topic):
- John and Mary may be more or less likely to call at certain times; cyclical repetition may not be too difficult to model.
- People may become tired of their jobs and less likely to call over longer periods; this may be easy or difficult to model.
- Crime may pick up. If the trend is slow enough, the model may be able to adjust online even if we have never observed crime picking up before; otherwise, this can easily and totally throw our model off.
Keep in mind, though, that we may get good enough results even without accounting for changes over time.

24 How would we apply this to robotics?
(Network: Tree in Path → Obstacle Detect ← Rock in Path; Obstacle Detect → Execute Stop, Execute Turn.)

P(T) = 0.25    P(R) = 0.4

T R | P(O)
T T | 0.95
T F | 0.45
F T | 0.29
F F | 0.001

O | P(S)        O | P(U)
T | 0.90        T | 0.70
F | 0.05        F | 0.01

The CPT can describe other events and probabilities, such as the probability that an action succeeds given the observations.

25 Exact Inference in a Bayesian Network
What if we want to make an inference such as: what is the probability of a tree in the path, given that the robot has stopped and turned? This might be useful to a robot that judges whether there is a tree in a path from the behavior of another robot: if robot A sees robot B stop and turn, it might infer that there is a tree in the path. We compute the unnormalized probabilities and then normalize.
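A minimal enumeration sketch for this query, using the CPTs from slide 24 (our own code and variable names; the deck's intermediate figures on the next two slides may correspond to a slightly different query or parameterization):

from itertools import product

# CPTs from slide 24. T: tree in path, R: rock in path,
# O: obstacle detected, S: execute stop, U: execute turn.
P_T = 0.25
P_R = 0.4
P_O = {(True, True): 0.95, (True, False): 0.45,
       (False, True): 0.29, (False, False): 0.001}   # P(O=true | T, R)
P_S = {True: 0.90, False: 0.05}   # P(S=true | O)
P_U = {True: 0.70, False: 0.01}   # P(U=true | O)

def odds(p_true, value):
    return p_true if value else 1.0 - p_true

def p_tree_given_stop_and_turn():
    # Enumerate the hidden variables R and O for each value of T,
    # accumulating the unnormalized P(T, S=true, U=true), then normalize.
    unnorm = {}
    for t in (True, False):
        total = 0.0
        for r, o in product((True, False), repeat=2):
            total += (odds(P_R, r) * odds(P_O[(t, r)], o)
                      * P_S[o] * P_U[o])
        unnorm[t] = odds(P_T, t) * total
    return unnorm[True] / (unnorm[True] + unnorm[False])

print(p_tree_given_stop_and_turn())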

26 Bayesian Inference, cont.
We compute the unnormalized probability of tree and of not-tree, then normalize. This is essentially an enumeration of all situations for tree and for not-tree.

27 Bayesian Inference, cont.
Finishing: the two unnormalized values are 0.06 (tree) and 0.07497 (not tree), so the probability of a tree in the path is 0.06 / (0.06 + 0.07497) ≈ 0.4445.
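Numerically, that last normalization step (our own check):

unnorm_tree, unnorm_no_tree = 0.06, 0.07497
p_tree = unnorm_tree / (unnorm_tree + unnorm_no_tree)
print(round(p_tree, 4))   # 0.4445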

28 What about hidden variables?

29 Example: car diagnosis

30 Example: car insurance

31 Compact conditional distributions

32 Compact conditional distributions

33 Hybrid (discrete + continuous) networks
If subsidy were a hidden variable, would it be discrete and discreet?

34 Continuous child variables

35 Continuous child variables

36 Discrete variable w/ continuous parents

37 Discrete variable

38 Inference in belief networks
Exact inference by enumeration
Exact inference by variable elimination
Approximate inference by stochastic simulation (see the sketch below)
Approximate inference by Markov chain Monte Carlo (MCMC)
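As a minimal illustration of approximate inference by stochastic simulation, a rejection-sampling sketch on the burglary network of slide 12 (the code and function names are our own, not from the slides):

import random

# CPTs from slide 12 (burglary network).
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(A=true | B, E)
P_J = {True: 0.90, False: 0.05}   # P(J=true | A)
P_M = {True: 0.70, False: 0.01}   # P(M=true | A)

def sample_world():
    # Prior sampling: draw each variable in topological order.
    b = random.random() < P_B
    e = random.random() < P_E
    a = random.random() < P_A[(b, e)]
    j = random.random() < P_J[a]
    m = random.random() < P_M[a]
    return b, e, a, j, m

def estimate_p_burglary_given_both_call(n=1_000_000):
    # Rejection sampling: discard samples inconsistent with the evidence
    # JohnCalls = MaryCalls = true; count burglaries among the rest.
    kept = hits = 0
    for _ in range(n):
        b, e, a, j, m = sample_world()
        if j and m:
            kept += 1
            hits += b
    return hits / kept if kept else float('nan')

print(estimate_p_burglary_given_both_call())   # about 0.284 by exact inference

Because the evidence (both calling) is rare, most samples are rejected, which is exactly the weakness that likelihood weighting and MCMC address.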

