# Bayes nets: Computing conditional probability, Polytrees, Probability inferences


1 Bayes nets: Computing conditional probability, Polytrees, Probability inferences

2 Conditional probability

Formulas to remember:

Product rule: P(A, B) = P(A|B) P(B)

Bayes rule: P(B|A) = P(A|B) P(B) / P(A), where the denominator expands by total probability as P(A) = P(A|B) P(B) + P(A|¬B) P(¬B)
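The two formulas above can be checked numerically. A minimal sketch, with made-up probability values for illustration:

```python
def bayes_posterior(p_a_given_b, p_a_given_not_b, p_b):
    """P(B|A) via Bayes rule, expanding P(A) by total probability."""
    p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)  # P(A)
    return p_a_given_b * p_b / p_a

# Hypothetical numbers: P(A|B) = 0.9, P(A|not B) = 0.1, P(B) = 0.01
print(bayes_posterior(0.9, 0.1, 0.01))  # about 0.0833
```

Note how a rare cause (P(B) = 0.01) stays fairly unlikely even after observing strong evidence, because the denominator P(A) is dominated by the ¬B term.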

3 Bayes Nets

Bayes nets are also called "causal nets", "belief networks", and "influence diagrams". They provide a general technique for computing probabilities of causally related random variables given evidence for some of them.

Example net (each node is a True/False variable, each arrow a causal link):

Cold → Sore-throat, Cold → Runny-nose

Joint distribution: P(Cold, Sore-throat, Runny-nose)

4 Some "query" examples

- How likely is it that Cold, Sore-throat and Runny-nose are all true? Compute P(Cold, Sore-throat, Runny-nose).
- How likely is it that I have a sore throat given that I have a cold? Compute P(Sore-throat | Cold).
- How likely is it that I have a cold given that I have a sore throat? Compute P(Cold | Sore-throat).
- How likely is it that I have a cold given that I have a sore throat and a runny nose? Compute P(Cold | Sore-throat, Runny-nose).

5 For nets with a unique root

The joint probability distribution of all the variables in the net equals the probability of the root times the probability of each non-root node given its parents. For the net Cold → Sore-throat, Cold → Runny-nose:

P(Cold, Sore-throat, Runny-nose) = P(Cold) P(Sore-throat | Cold) P(Runny-nose | Cold)

Can you prove it?
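The factorization can be checked numerically with made-up CPT values for the Cold net (none of these numbers are from the slides): the factored joint over all eight True/False assignments must sum to 1.

```python
from itertools import product

# Assumed (illustrative) CPT values, not given on the slides
P_COLD = 0.2
P_ST_GIVEN_COLD = {True: 0.6, False: 0.1}   # P(Sore-throat | Cold)
P_RN_GIVEN_COLD = {True: 0.7, False: 0.15}  # P(Runny-nose | Cold)

def joint(cold, sore_throat, runny_nose):
    """P(Cold, ST, RN) = P(Cold) P(ST | Cold) P(RN | Cold)."""
    p = P_COLD if cold else 1 - P_COLD
    p *= P_ST_GIVEN_COLD[cold] if sore_throat else 1 - P_ST_GIVEN_COLD[cold]
    p *= P_RN_GIVEN_COLD[cold] if runny_nose else 1 - P_RN_GIVEN_COLD[cold]
    return p

total = sum(joint(c, s, r) for c, s, r in product([True, False], repeat=3))
print(total)  # 1.0: the factored joint is a proper distribution
```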

6 Proof

For the "Cold" example, the Bayes net encodes the assumption that, given Cold, each symptom is conditionally independent of the other:

P(Sore-throat | Cold, Runny-nose) = P(Sore-throat | Cold)
P(Runny-nose | Cold, Sore-throat) = P(Runny-nose | Cold)

Then, by the chain rule:

P(Cold, Sore-throat, Runny-nose)
= P(Runny-nose | Sore-throat, Cold) P(Sore-throat | Cold) P(Cold)
= P(Runny-nose | Cold) P(Sore-throat | Cold) P(Cold)

7 Further observations

If no path connects two nodes by a sequence of causal links, the nodes are conditionally independent given the root. For example, Sore-throat and Runny-nose are conditionally independent given Cold.

Since the Bayes net assumption is equivalent to conditional independence assumptions, posterior probabilities in a Bayes net can be computed using standard formulas from probability theory:

P(Cold | Sore-throat) = P(Sore-throat | Cold) P(Cold) / [P(Sore-throat | Cold) P(Cold) + P(Sore-throat | ¬Cold) P(¬Cold)]

8 An example

Net: Habitual smoking (S) → Lung cancer (L) → Chronic cough (C)

P(S) = 0.3
P(L|S) = 0.5, P(L|¬S) = 0.05
P(C|L) = 0.7, P(C|¬L) = 0.06

Joint probability distribution: P(S, L, C) = P(S) P(L|S) P(C|L) = 0.3 * 0.5 * 0.7 = 0.105

Query: P(L|C)?

9 Compute P(L|C)

P(S) = 0.3
P(L|S) = 0.5, P(L|¬S) = 0.05
P(C|L) = 0.7, P(C|¬L) = 0.06

P(L|C) = P(C|L) P(L) / P(C)

P(L) = P(L|S) P(S) + P(L|¬S) P(¬S) = 0.5*0.3 + 0.05*(1-0.3) = 0.185
P(¬L) = 1 - 0.185 = 0.815
P(C) = P(C|L) P(L) + P(C|¬L) P(¬L) = 0.7*0.185 + 0.06*0.815 = 0.1784
P(L|C) = 0.7*0.185 / 0.1784 ≈ 0.7259

General way of computing any conditional probability:
1. Express the conditional probabilities for all the nodes.
2. Use the Bayes net assumption to evaluate the joint probabilities.
3. Use P(A) + P(¬A) = 1.
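The slide's calculation can be reproduced directly in code, using the same CPT numbers:

```python
# CPTs for the chain S -> L -> C (values from the slide)
p_s = 0.3
p_l_s, p_l_ns = 0.5, 0.05   # P(L|S), P(L|not S)
p_c_l, p_c_nl = 0.7, 0.06   # P(C|L), P(C|not L)

p_l = p_l_s * p_s + p_l_ns * (1 - p_s)   # marginalize out S: P(L) = 0.185
p_c = p_c_l * p_l + p_c_nl * (1 - p_l)   # marginalize out L: P(C) = 0.1784
p_l_c = p_c_l * p_l / p_c                # Bayes rule: P(L|C)
print(round(p_l_c, 4))  # 0.7259
```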

10 A general method is not efficient

Better methods depend on systematic use of the independence assumptions implicit in the Bayes net assumption:

A set of nodes X is independent of a set of nodes Y given evidence nodes E iff every undirected path connecting a node in X to a node in Y is directly or indirectly blocked by E (indirect blockage: the blocking node has no descendants in E).

11 Examples (1)

Direct blockage: E lies on the path between X and Y, so P(X|Y, E) = P(X|E).
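Direct blockage can be verified by brute-force enumeration on a small chain X → E → Y with made-up CPT values (an illustrative sketch, not from the slides): once E is observed, Y carries no further information about X.

```python
from itertools import product

# Assumed CPTs for the chain X -> E -> Y (illustrative numbers)
P_X = 0.3
P_E_GIVEN_X = {True: 0.8, False: 0.2}   # P(E|X)
P_Y_GIVEN_E = {True: 0.9, False: 0.4}   # P(Y|E)

def joint(x, e, y):
    p = P_X if x else 1 - P_X
    p *= P_E_GIVEN_X[x] if e else 1 - P_E_GIVEN_X[x]
    p *= P_Y_GIVEN_E[e] if y else 1 - P_Y_GIVEN_E[e]
    return p

def p_x(**evidence):
    """P(X=True | evidence) by enumeration over the full joint."""
    num = sum(joint(x, e, y) for x, e, y in product([True, False], repeat=3)
              if x and all({'e': e, 'y': y}[k] == v for k, v in evidence.items()))
    den = sum(joint(x, e, y) for x, e, y in product([True, False], repeat=3)
              if all({'e': e, 'y': y}[k] == v for k, v in evidence.items()))
    return num / den

# The path X -> E -> Y is blocked by E, so these two values coincide
print(p_x(e=True, y=True), p_x(e=True))
```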

12 Examples (2)

Unblocked path: some path between X and Y is not blocked by E, so in general P(X|Y, E) ≠ P(X|E).

13 Inference in Polytrees

Singly connected networks are called polytrees. An algorithm that works on polytrees is derived in the following three steps.

Step 1: Express P(X|E) in terms of P(E⁻_X | X) and P(X | E⁺_X), where:

- E⁻_X is the set of E-nodes connected to X via X's children; P(E⁻_X | X) is the likelihood of this "evidential support" given X.
- E⁺_X is the set of E-nodes connected to X via X's parents; P(X | E⁺_X) is the probability of X given its "causal support".

14 Step 2: Express P(X | E⁺_X) recursively in terms of P(Uᵢ | E⁺_X), where the Uᵢ are the parents of X.

Step 3: Express P(E⁻_X | X) recursively in terms of P(E⁻_Yᵢ | Yᵢ) and P(Zᵢⱼ | E_Zᵢⱼ∖Yᵢ), where the Yᵢ are X's children, the Zᵢⱼ are the parents of Yᵢ, and E_Zᵢⱼ∖Yᵢ are the E-nodes connected to Zᵢⱼ except via Yᵢ.

15 The nature of probability inferences

(a) Diagnostic inferences (from effects to causes): e.g. given that JohnCalls, infer P(Burglary | JohnCalls) = 0.016.

(b) Causal inferences (from causes to effects): e.g. given Burglary, P(JohnCalls | Burglary) = 0.86 and P(MaryCalls | Burglary) = 0.67.

(c) Inter-causal inferences (between causes of a common effect): e.g. given Alarm, P(Burglary | Alarm) = 0.376. But if we add the evidence that Earthquake is true, P(Burglary | Alarm ∧ Earthquake) goes down to 0.003. Even though burglaries and earthquakes are independent a priori, observing one "explains away" the alarm and makes the other less likely.

(d) Mixed inferences (combining two or more of the above).
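The inter-causal "explaining away" effect in (c) can be reproduced by brute-force enumeration. The CPT numbers below are an assumption: they are the standard burglary-network values from Russell and Norvig's textbook, which these example figures appear to be based on; they are not given on the slide.

```python
from itertools import product

# Assumed CPTs (AIMA-style burglary network; not from the slide)
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}  # P(Alarm | B, E)

def joint(b, e, a):
    """P(B, E, A) = P(B) P(E) P(A | B, E); B and E are independent a priori."""
    p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
    return p * (P_A[(b, e)] if a else 1 - P_A[(b, e)])

def p_burglary(**ev):
    """P(B=True | evidence) by enumeration over the full joint."""
    num = sum(joint(b, e, a) for b, e, a in product([True, False], repeat=3)
              if b and all({'e': e, 'a': a}[k] == v for k, v in ev.items()))
    den = sum(joint(b, e, a) for b, e, a in product([True, False], repeat=3)
              if all({'e': e, 'a': a}[k] == v for k, v in ev.items()))
    return num / den

print(p_burglary(a=True))          # about 0.374
print(p_burglary(a=True, e=True))  # about 0.003: Earthquake explains the Alarm away
```

With these assumed CPTs the enumeration gives roughly the values quoted on the slide; any small differences come from rounding in the slide's figures.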

16 Applications of Bayes nets

- Calculating the belief in query variables given defined values for evidence variables.
- Making decisions based on probabilities in the network and on the agent's utilities.
- Deciding which additional evidence variables should be observed in order to gain useful information.
- Performing sensitivity analysis to understand which aspects of the model have the greatest impact on the probabilities of the query variables.
- Explaining the results of probabilistic inference to the user.

17 Exercises

Ex1. Calculate P(S|L) for the network on slide 8.

Ex2. Write down the formulas to compute P(Cold, Sore-throat, Runny-nose), P(Sore-throat | Cold), P(Cold | Sore-throat), and P(Cold | Sore-throat, Runny-nose) for the net Cold → Sore-throat, Cold → Runny-nose.
