Presentation on theme: "Uncertainty Uncertain Knowledge Probability Review Bayes’ Theorem Summary."— Presentation transcript:

1 Uncertainty Uncertain Knowledge Probability Review Bayes’ Theorem Summary

2 Uncertain Knowledge In many situations we cannot assign a value of true or false to world statements. Example: Symptom(p,Toothache) → Disease(p,Cavity). To be correct we must generalize: Symptom(p,Toothache) → Disease(p,Cavity) ∨ Disease(p,GumDisease) ∨ …

3 Uncertain Knowledge Solution: deal with degrees of belief, using probability theory. A probability states a degree of belief based on evidence: P(x) = 0.80 means that, based on the evidence, x occurs in 80% of the runs of the experiment. It summarizes our uncertainty about what causes x. Degree of truth, in contrast, is the subject of fuzzy logic.

4 Utility Theory Combining probability theory with utility theory gives decision theory. To make a decision (choose an action) an agent needs preferences between the possible outcomes of its plans. A rational agent should choose the action with the highest expected utility, averaged over all possible outcomes.
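Below is a minimal Python sketch of this decision rule. The two actions, their outcome probabilities, and the utility values are invented for illustration; they are not from the slides.

```python
# Minimal sketch of expected-utility decision making.
# Actions, outcome probabilities, and utilities are made-up example values.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

actions = {
    "take_umbrella":  [(0.3, 60), (0.7, 80)],    # (P(rain), utility), (P(no rain), utility)
    "leave_umbrella": [(0.3, 10), (0.7, 100)],
}

for action, outcomes in actions.items():
    print(action, expected_utility(outcomes))    # 74 and 73

best = max(actions, key=lambda a: expected_utility(actions[a]))
print("best action:", best)                      # take_umbrella
```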

5 Uncertainty Uncertain Knowledge Probability Review Bayes’ Theorem Summary

6 Random Variable Definition: A variable that can take on several values, each value having a probability of occurrence. There are two types of random variables:  Discrete. Take on a countable number of values.  Continuous. Take on a range of values.

7 Random Variable Discrete Variables  For every discrete random variable X there is a probability function (probability mass function) P(x) = P(X = x).

8 Random Variable Continuous Variables:  For every continuous random variable X we associate a probability density function f(x). The area under the density function between two points is the probability that the variable lies between those two values: P(x1 < X ≤ x2) = ∫_{x1}^{x2} f(x) dx
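As an illustration of computing such a probability as an area under a density, the sketch below assumes an exponential density f(x) = λ e^(−λx) (an example chosen here, not part of the slides), approximates the integral numerically, and compares the result with the exact value from the exponential CDF.

```python
import math

# Assumed example density: exponential with rate lam, f(x) = lam * exp(-lam * x), x >= 0.
lam = 1.0
def f(x):
    return lam * math.exp(-lam * x)

def prob_between(x1, x2, n=10000):
    """Approximate P(x1 < X <= x2) = integral of f from x1 to x2 (midpoint rule)."""
    h = (x2 - x1) / n
    return sum(f(x1 + (i + 0.5) * h) for i in range(n)) * h

x1, x2 = 0.5, 2.0
approx = prob_between(x1, x2)
exact = math.exp(-lam * x1) - math.exp(-lam * x2)   # exponential CDF difference
print(approx, exact)                                # both about 0.471
```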

9 The Sample Space  The space of all possible outcomes of a given process or situation is called the sample space S. Example (from the slide's diagram): S = {red & small, blue & small, red & large, blue & large}.

10 An Event  An event A is a subset of the sample space S (a region of the diagram on the previous slide).

11 Atomic Event An atomic event is a single point in S. Properties:  Atomic events are mutually exclusive  The set of all atomic events is exhaustive  A proposition is the disjunction of the atomic events it covers.

12 The Laws of Probability  The probability of the sample space S is 1, P(S) = 1  The probability of any event A is such that 0 <= P(A) <= 1.  Law of Addition If A and B are mutually exclusive events, then the probability that either one of them will occur is the sum of the individual probabilities: P(A or B) = P(A) + P(B)

13 The Laws of Probability If A and B are not mutually exclusive: P(A or B) = P(A) + P(B) – P(A and B)

14 Prior Probability P(A) is called the unconditional or prior probability of event A. It reflects our degree of belief in A before any evidence is observed.

15 Conditional Probabilities  Given that A and B are events in sample space S, and P(B) is different from 0, the conditional probability of A given B is P(A|B) = P(A and B) / P(B)  If A and B are independent then P(A|B) = P(A)
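A small sketch of this definition, reusing the sample space of slides 9 and 10 and assuming (purely for illustration) that the four atomic events are equally likely:

```python
from itertools import product

# Sample space of slides 9-10, with uniform atomic probabilities (an assumption).
S = list(product(["red", "blue"], ["small", "large"]))
P = {omega: 1 / len(S) for omega in S}

def prob(event):
    """Probability of an event given as a set of atomic outcomes."""
    return sum(P[w] for w in event)

A = {w for w in S if w[0] == "red"}     # event: object is red
B = {w for w in S if w[1] == "small"}   # event: object is small

p_a_given_b = prob(A & B) / prob(B)     # P(A|B) = P(A and B) / P(B)
print(p_a_given_b, prob(A))             # 0.5 and 0.5: here A and B happen to be independent
```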

16 The Laws of Probability   Law of Multiplication What is the probability that both A and B occur together? P(A and B) = P(A) P(B|A) where P(B|A) is the probability of B conditioned on A.

17 The Laws of Probability If A and B are statistically independent: P(B|A) = P(B) and then P(A and B) = P(A) P(B)

18 Conditional Independence of Two Variables If A and B are conditionally independent given C: P(A,B|C) = P(A|C) P(B|C), or equivalently P(A|B,C) = P(A|C) and P(B|A,C) = P(B|C)
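The sketch below checks conditional independence numerically on a made-up joint distribution that is constructed to factor as P(c) P(a|c) P(b|c), so A and B are conditionally independent given C by construction; all probability values are assumptions chosen for the example.

```python
# Made-up conditional probability tables (assumptions for illustration only).
p_c = {True: 0.4, False: 0.6}
p_a_given_c = {True: 0.9, False: 0.2}    # P(A=true | C=c)
p_b_given_c = {True: 0.7, False: 0.1}    # P(B=true | C=c)

def pa(a, c): return p_a_given_c[c] if a else 1 - p_a_given_c[c]
def pb(b, c): return p_b_given_c[c] if b else 1 - p_b_given_c[c]

# Joint built as P(a,b,c) = P(c) P(a|c) P(b|c).
joint = {(a, b, c): p_c[c] * pa(a, c) * pb(b, c)
         for a in (True, False) for b in (True, False) for c in (True, False)}

def prob(pred):
    return sum(p for (a, b, c), p in joint.items() if pred(a, b, c))

c = True
lhs = prob(lambda a, b, cc: a and b and cc == c) / prob(lambda a, b, cc: b and cc == c)  # P(A|B,C)
rhs = prob(lambda a, b, cc: a and cc == c) / prob(lambda a, b, cc: cc == c)              # P(A|C)
print(lhs, rhs)   # equal (both 0.9), as conditional independence requires
```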

19 Exercises Find the probability that the sum of the numbers on two unbiased dice will be even by considering the probabilities that the individual dice will show an even number.

20 Exercises X1 – first throw, X2 – second throw.

21 Exercises X1 – first throw, X2 – second throw. The sum is even exactly when both throws are odd or both are even: P_final = P(X1=1 & X2=1) + P(X1=1 & X2=3) + P(X1=1 & X2=5) + P(X1=2 & X2=2) + P(X1=2 & X2=4) + P(X1=2 & X2=6) + P(X1=3 & X2=1) + P(X1=3 & X2=3) + P(X1=3 & X2=5) + … + P(X1=6 & X2=2) + P(X1=6 & X2=4) + P(X1=6 & X2=6) = 18/36 = 1/2
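Equivalently, P(both even) + P(both odd) = 1/4 + 1/4 = 1/2. A brute-force enumeration confirming the count of 18 favorable pairs out of 36:

```python
from itertools import product

# Enumerate all 36 equally likely outcomes of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
even_sum = [(x1, x2) for x1, x2 in outcomes if (x1 + x2) % 2 == 0]
print(len(even_sum), len(outcomes), len(even_sum) / len(outcomes))   # 18 36 0.5
```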

22 Exercises Find the probabilities of throwing a sum of a) 3, b) 4 with three unbiased dice.

23 Exercises Find the probabilities of throwing a sum of a) 3, b) 4 with three unbiased dice. X = X1 + X2 + X3. P(X=3)? P(X1=1 & X2=1 & X3=1) = 1/216. P(X=4)? P(X1=1 & X2=1 & X3=2) + P(X1=1 & X2=2 & X3=1) + … = 3/216
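A brute-force enumeration confirming both answers:

```python
from itertools import product

# Enumerate all 216 equally likely outcomes of three fair dice.
rolls = list(product(range(1, 7), repeat=3))
p3 = sum(1 for r in rolls if sum(r) == 3) / len(rolls)   # 1/216
p4 = sum(1 for r in rolls if sum(r) == 4) / len(rolls)   # 3/216
print(p3, p4)
```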

24 Exercises Three men meet by chance. What are the probabilities that a) none of them, b) two of them, c) all of them have the same birthday?

25 Exercises a) None of them have the same birthday. X1 – birthday of 1st person, X2 – birthday of 2nd person, X3 – birthday of 3rd person. P(X2 differs from X1 & X3 differs from X1 and X2): P_final = (364/365)(363/365)

26 Exercises b) Exactly two of them have the same birthday. P(X1 = X2 and X3 differs) + P(X1 = X3 and X2 differs) + P(X2 = X3 and X1 differs). P(X1 = X2 and X3 differs) = (1/365)(364/365), so P_final = 3(1/365)(364/365)

27 Exercises c) All of them have the same birthday. P(X1 = X2 = X3): P_final = (1/365)(1/365) = (1/365)^2
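A quick numerical check of the three cases (ignoring leap years); they are exhaustive and mutually exclusive, so they sum to 1:

```python
# Birthday exercise for three people, assuming 365 equally likely birthdays.
p_none = (364 / 365) * (363 / 365)       # all three birthdays different
p_two  = 3 * (1 / 365) * (364 / 365)     # exactly one matching pair
p_all  = (1 / 365) ** 2                  # all three the same
print(p_none, p_two, p_all)
print(p_none + p_two + p_all)            # 1.0
```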

28 Multivariate or Joint Distributions P(x,y) = P(X = x and Y = y).  P'(x) = P(X = x) = Σ_y P(x,y); this is called the marginal distribution of X. The same can be done on Y to define the marginal distribution of Y, P''(y).  If X and Y are independent then P(x,y) = P'(x) P''(y)

29 Expectations: The Mean  Let X be a discrete random variable taking the values x1, x2, x3, …, xn, with respective probabilities P(x1), P(x2), P(x3), …, P(xn). Then the expected value of X is defined as E(X) = x1 P(x1) + x2 P(x2) + x3 P(x3) + … + xn P(xn) = Σ_i xi P(xi)
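A one-line illustration of this definition, using a fair die as the example random variable:

```python
# E(X) = sum_i x_i P(x_i) for X = value shown by one fair die.
values = [1, 2, 3, 4, 5, 6]
probs  = [1 / 6] * 6

expectation = sum(x * p for x, p in zip(values, probs))
print(expectation)   # 3.5
```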

30 Exercises Suppose that X is a random variable taking the values {-1, 0, 1} with equal probabilities and that Y = X^2. Find the joint distribution and the marginal distributions of X and Y, and also the conditional distributions of X given a) Y = 0 and b) Y = 1.

31 Exercises Joint distribution P(X = x, Y = y), with the marginals in the last row and column:

         Y = 0    Y = 1    P(X)
X = -1     0       1/3     1/3
X =  0    1/3       0      1/3
X =  1     0       1/3     1/3
P(Y)      1/3      2/3

If Y = 0 then X = 0 with probability 1. If Y = 1 then X is equally likely to be +1 or -1.
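The same joint, marginal, and conditional distributions can be computed programmatically; this sketch rebuilds them directly from the definition Y = X^2:

```python
from collections import defaultdict

# X is uniform on {-1, 0, 1} and Y = X**2.
joint = defaultdict(float)
for x in (-1, 0, 1):
    joint[(x, x * x)] += 1 / 3          # P(X = x, Y = x^2) = 1/3

marg_x, marg_y = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    marg_x[x] += p                      # marginal distribution of X
    marg_y[y] += p                      # marginal distribution of Y

def conditional_x_given(y):
    """Conditional distribution P(X = x | Y = y)."""
    return {x: p / marg_y[y] for (x, yy), p in joint.items() if yy == y}

print(dict(marg_x))                     # {-1: 1/3, 0: 1/3, 1: 1/3} (as floats)
print(dict(marg_y))                     # {0: 1/3, 1: 2/3} (as floats)
print(conditional_x_given(0))           # {0: 1.0}
print(conditional_x_given(1))           # {-1: 0.5, 1: 0.5}
```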

32 Uncertainty Uncertain Knowledge Probability Review Bayes’ Theorem Summary

33 Bayes’ Theorem P(A,B) = P(A|B) P(B) and P(B,A) = P(B|A) P(A). Since P(A,B) = P(B,A), the theorem follows: P(B|A) = P(A|B) P(B) / P(A)
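A short numerical illustration of the theorem; the prior, sensitivity, and false-positive rate of the hypothetical diagnostic test below are made-up numbers, not from the slides.

```python
# Hypothetical example: A = "test is positive", B = "patient has the disease".
p_b = 0.01               # prior P(B)
p_a_given_b = 0.95       # P(A|B), test sensitivity
p_a_given_not_b = 0.05   # P(A|not B), false-positive rate

p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)   # total probability P(A)
p_b_given_a = p_a_given_b * p_b / p_a                   # Bayes: P(B|A) = P(A|B) P(B) / P(A)
print(p_b_given_a)   # about 0.16: a positive test still leaves the disease fairly unlikely
```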

34 More General Bayes’ Theorem P(Y|X,e) = P(X|Y,e) P(Y|e) / P(X|e), where e is the background evidence.

35 Thomas Bayes Born in London (1701). Studied logic and theology (University of Edinburgh). Fellow of the Royal Society (1742). Given white and black balls in an urn, what is the probability of drawing one or the other? Given one or more drawn balls, what can be said about the contents of the urn?

36 Uncertainty Uncertain Knowledge Probability Review Bayes’ Theorem Summary

37 Summary Uncertainty comes from ignorance of the true state of the world. Probabilities express our degree of belief in an event. Concepts covered: random variables, prior probabilities, conditional probabilities, joint distributions, conditional independence, Bayes’ theorem.

38 Application: Predicting the Stock Market with Bayesian Networks Bayesian networks (BNs) have been exploited to predict the behavior of the stock market. BNs can be constructed from daily stock returns collected over a certain period of time. Stocks can be drawn from well-known repositories, e.g., the S&P 500 index.

