1 Uncertainty (Chapter 14). Copyright © 1996 Dale Carnegie & Associates, Inc.

2 Uncertainty (CS 471/598 by H. Liu)
- Evolution of an intelligent agent: problem solving, planning, and now uncertainty
- Uncertainty is an unavoidable problem in reality.
- An agent must act under uncertainty.
- To make decisions under uncertainty, we need:
  - Probability theory
  - Utility theory
  - Decision theory

3 Sources of uncertainty
- No access to the whole truth
- No categorical answer
- Incompleteness
  - The qualification problem: it is impossible to explicitly enumerate all conditions
- Incorrectness of information about conditions
- The rational decision depends on both the relative importance of the various goals and the likelihood that they will be achieved.

4 Handling uncertain knowledge
- Difficulties in using FOL to cope with uncertain knowledge
  - Example: a dental diagnosis system written in FOL
- Reasons:
  - Laziness: too much work to list everything!
  - Theoretical ignorance: we don't know everything
  - Practical ignorance: we cannot include all the relevant facts for a particular case
- Represent uncertain knowledge with a degree of belief
- The tool for handling degrees of belief is probability theory

5
- Probability provides a way of summarizing the uncertainty that comes from our laziness and ignorance. How wonderful it is!
- Probability is a degree of belief in the truth of a sentence:
  - 1: certainly true; 0: certainly false; 0 < P < 1: intermediate degrees of belief
- Degree of truth (fuzzy logic) vs. degree of belief (probability theory)
- Are there alternatives to probability theory?

6
- All probability statements must indicate the evidence with respect to which the probability is being assessed:
  - Prior (unconditional) probability: before any evidence is obtained
  - Posterior (conditional) probability: after new evidence is obtained

7 Uncertainty & rational decisions
- Without uncertainty, decision making is simple: a plan either achieves the goal or it does not.
- With uncertainty, the choice itself becomes uncertain, e.g., among the three plans A90, A120, and A1440.
- We first need preferences between the different possible outcomes of the plans.
- Utility theory is used to represent and reason with such preferences.

8 Rationality
- Decision theory = probability theory + utility theory
- The Maximum Expected Utility (MEU) principle defines rationality:
  - An agent is rational iff it chooses the action that yields the highest utility, averaged over all possible outcomes of the action.
- A decision-theoretic agent (Fig 14.1, p. 419)
  - Is it any different from the other agents we have learned about?
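
The MEU principle above can be sketched in a few lines of Python. The actions echo the airport-departure plans from the previous slide, but the outcome probabilities and utilities below are invented for illustration, not taken from the text.

```python
# Maximum Expected Utility: choose the action whose utility, averaged
# over all possible outcomes, is highest.
# All probabilities and utilities here are hypothetical.
actions = {
    # action: list of (probability_of_outcome, utility_of_outcome)
    "leave_90_min_early":  [(0.70, 100), (0.30, -500)],    # risk missing the flight
    "leave_120_min_early": [(0.95, 80),  (0.05, -500)],
    "leave_24_h_early":    [(0.9999, -200), (0.0001, -500)],  # safe but a miserable wait
}

def expected_utility(outcomes):
    """Average utility over all outcomes of one action."""
    return sum(p * u for p, u in outcomes)

best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # prints "leave_120_min_early" for these numbers
```

With these made-up numbers, the 90-minute plan has expected utility -80 and the 120-minute plan +51, so a rational agent leaves 120 minutes early.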

9 Basic probability notation
- Prior probability
  - Proposition: P(Sunny)
  - Random variable: P(Weather = Sunny)
  - Each random variable has a domain, e.g., (sunny, rain, cloudy, snow)
  - Probability distribution: P(Weather) assigns a probability to every value in the domain
- Joint probability P(A ^ B)
  - the probabilities of all combinations of the values of a set of random variables
  - more on this later
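
A prior distribution over the Weather domain can be written as a mapping from domain values to probabilities. The specific numbers below are illustrative, not from the slides; the only requirement is that they sum to 1.

```python
# A prior distribution P(Weather) over the domain (sunny, rain, cloudy, snow).
# The numbers are hypothetical; a valid distribution must sum to 1.
weather_dist = {"sunny": 0.7, "rain": 0.2, "cloudy": 0.08, "snow": 0.02}

# Sanity check: the entries form a proper distribution.
assert abs(sum(weather_dist.values()) - 1.0) < 1e-9

# P(Weather = Sunny) is a single entry of the distribution.
print(weather_dist["sunny"])  # prints 0.7
```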

10 Conditional probability
- Conditional probability
  - P(A|B) = P(A ^ B) / P(B)
  - Product rule: P(A ^ B) = P(A|B) P(B)
- Probabilistic inference does not work like logical inference:
  - "P(A|B) = 0.6" does not mean "whenever B is true, P(A) is 0.6"
  - as evidence accumulates, the relevant quantity changes: P(A), then P(A|B), then P(A|B,C), ...
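
The definition and the product rule above are two views of the same identity; a quick numeric check, with hypothetical values for the joint and marginal:

```python
# P(A|B) = P(A ^ B) / P(B), and conversely P(A ^ B) = P(A|B) * P(B).
# The two probabilities below are hypothetical.
p_a_and_b = 0.12   # P(A ^ B)
p_b = 0.40         # P(B)

p_a_given_b = p_a_and_b / p_b   # approximately 0.3

# The product rule recovers the joint from the conditional and the marginal.
assert abs(p_a_given_b * p_b - p_a_and_b) < 1e-12
```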

11 The axioms of probability
- All probabilities are between 0 and 1.
- Necessarily true (valid) propositions have probability 1; necessarily false (unsatisfiable) propositions have probability 0.
- The probability of a disjunction: P(A v B) = P(A) + P(B) - P(A ^ B)
  - A Venn diagram illustrates this.
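
The disjunction axiom can be verified against a full joint over two Boolean variables; the four atomic-event probabilities below are hypothetical.

```python
# Inclusion-exclusion: P(A v B) = P(A) + P(B) - P(A ^ B),
# checked against a (hypothetical) full joint over Booleans A and B.
joint = {  # (A, B) -> probability of that atomic event
    (True, True): 0.12, (True, False): 0.18,
    (False, True): 0.28, (False, False): 0.42,
}
assert abs(sum(joint.values()) - 1.0) < 1e-9  # atomic events are exhaustive

p_a = joint[(True, True)] + joint[(True, False)]   # marginal P(A) = 0.30
p_b = joint[(True, True)] + joint[(False, True)]   # marginal P(B) = 0.40
p_a_and_b = joint[(True, True)]                    # P(A ^ B) = 0.12
p_a_or_b = 1.0 - joint[(False, False)]             # P(A v B) = 0.58

# The axiom holds for any consistent joint.
assert abs(p_a_or_b - (p_a + p_b - p_a_and_b)) < 1e-9
```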

12 The joint probability distribution
- The joint completely specifies an agent's probability assignments to all propositions in the domain.
- A probabilistic model consists of a set of random variables (X1, ..., Xn).
- An atomic event is an assignment of particular values to all the variables.

13 Joint probabilities
- An example with two Boolean variables (the joint table shown on the slide is not reproduced in this transcript)
- Observations: the atomic events are mutually exclusive and collectively exhaustive
- What are P(Cavity), P(Cavity v Toothache), and P(Cavity|Toothache)?
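
The slide's table did not survive extraction, so the entries below are one common set of illustrative values for a Cavity/Toothache joint, not numbers taken from this transcript. Given any full joint, the three queried probabilities can be read off directly:

```python
# A (hypothetical) full joint over two Boolean variables. The four atomic
# events are mutually exclusive and collectively exhaustive, so they sum to 1.
joint = {
    ("cavity", "toothache"): 0.04,
    ("cavity", "no_toothache"): 0.06,
    ("no_cavity", "toothache"): 0.01,
    ("no_cavity", "no_toothache"): 0.89,
}

# P(Cavity): sum the atomic events where Cavity holds.
p_cavity = sum(p for (c, t), p in joint.items() if c == "cavity")

# P(Cavity v Toothache): sum the events where either holds.
p_cavity_or_tooth = sum(p for (c, t), p in joint.items()
                        if c == "cavity" or t == "toothache")

# P(Cavity | Toothache) = P(Cavity ^ Toothache) / P(Toothache).
p_toothache = sum(p for (c, t), p in joint.items() if t == "toothache")
p_cavity_given_tooth = joint[("cavity", "toothache")] / p_toothache

print(p_cavity, p_cavity_or_tooth, p_cavity_given_tooth)
```

With these illustrative entries the answers come out to roughly 0.10, 0.11, and 0.8.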

14 Joint (2)
- It is impractical to specify all 2^n entries of the joint over n Boolean variables.
- If we do have the joint, we can read off any probability we need.
- Alternative: sidestep the joint and work directly with conditional probabilities.

15 Bayes' rule
- Deriving the rule via the product rule:
  - P(B|A) = P(A|B) P(B) / P(A)
  - The more general form: P(X|Y) = P(Y|X) P(X) / P(Y)
- Bayes' rule conditionalized on background evidence E:
  - P(X|Y,E) = P(Y|X,E) P(X|E) / P(Y|E)
- Applying the rule to medical diagnosis:
  - Given meningitis with P(M) = 1/50,000, stiff neck with P(S) = 1/20, and P(S|M) = 0.5, what is P(M|S)?
  - Why is this kind of inference useful?
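
The meningitis question can be answered directly with the numbers given on the slide:

```python
# Bayes' rule with the slide's numbers:
# P(M) = 1/50,000, P(S) = 1/20, P(S|M) = 0.5.
p_m = 1 / 50_000        # prior probability of meningitis
p_s = 1 / 20            # prior probability of a stiff neck
p_s_given_m = 0.5       # likelihood of a stiff neck given meningitis

# P(M|S) = P(S|M) * P(M) / P(S)
p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)  # approximately 0.0002: 1 in 5,000 stiff-neck patients
```

The diagnostic probability P(M|S) is hard to assess directly, but the causal quantity P(S|M) is stable medical knowledge, which is why inference in this direction is useful.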

16 Applying Bayes' rule
- Relative likelihood
  - Comparing meningitis and whiplash given a stiff neck, which is more likely?
  - P(M|S) / P(W|S) = P(S|M) P(M) / (P(S|W) P(W))
- Avoiding direct assessment of the prior P(S):
  - P(M|S) = ? P(¬M|S) = ? We know P(M|S) + P(¬M|S) = 1.
  - This trades the need for P(S) for the need for P(S|¬M): P(S) = ? P(S|¬M) = ?
- Normalization: P(Y|X) = α P(X|Y) P(Y), where α is the normalizing constant
  - How do we normalize (Ex. 14.7)?
  - Make the entries in the table P(Y|X) sum to 1.
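
Normalization in code: compute the unnormalized product P(X|Y) P(Y) for each value of Y, then scale so the posterior sums to 1, never touching P(X) directly. The prior P(M) is the slide's 1/50,000; the likelihood P(S|¬M) is an assumed illustrative value.

```python
# P(Y|X) = alpha * P(X|Y) * P(Y): normalize instead of assessing P(X).
p_y = {"m": 1 / 50_000, "not_m": 1 - 1 / 50_000}  # prior (slide's 1/50,000)
p_x_given_y = {"m": 0.5, "not_m": 0.05}           # P(S|M) from the slide;
                                                  # P(S|not_m) is hypothetical

# Unnormalized posterior for each hypothesis.
unnorm = {y: p_x_given_y[y] * p_y[y] for y in p_y}

# alpha is chosen so the entries sum to 1.
alpha = 1.0 / sum(unnorm.values())
posterior = {y: alpha * v for y, v in unnorm.items()}

assert abs(sum(posterior.values()) - 1.0) < 1e-12
print(posterior["m"])  # still a tiny probability of meningitis
```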

17 Using Bayes' rule
- Combining evidence
  - from P(Cavity|Toothache) and P(Cavity|Catch) to P(Cavity|Toothache, Catch)
- Bayesian updating
  - from P(Cavity|T) = P(Cavity) P(T|Cavity) / P(T)
  - to P(Cavity|T,Catch) = P(Cavity|T) P(Catch|T,Cavity) / P(Catch|T)
- Independent events A and B:
  - P(B|A) = P(B), P(A|B) = P(A), P(A,B) = P(A) P(B)
- Conditional independence (X and Y are independent given Z):
  - P(X|Y,Z) = P(X|Z)
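
Under conditional independence, P(Catch|T,Cavity) = P(Catch|Cavity), so each new piece of evidence just multiplies in another likelihood factor before normalization. A small sketch of this updating scheme, with hypothetical probabilities:

```python
# Bayesian updating with conditionally independent evidence:
# P(Cavity | T, Catch) is proportional to
#   P(Cavity) * P(T|Cavity) * P(Catch|Cavity).
# All numbers below are hypothetical.
prior = {"cavity": 0.1, "no_cavity": 0.9}
p_toothache = {"cavity": 0.6, "no_cavity": 0.05}  # P(T | Cavity)
p_catch = {"cavity": 0.9, "no_cavity": 0.2}       # P(Catch | Cavity)

# One likelihood factor per evidence variable, then normalize.
unnorm = {h: prior[h] * p_toothache[h] * p_catch[h] for h in prior}
total = sum(unnorm.values())
posterior = {h: v / total for h, v in unnorm.items()}

assert abs(sum(posterior.values()) - 1.0) < 1e-12
print(posterior["cavity"])  # belief in a cavity after both observations
```

Adding a third conditionally independent symptom would only add one more factor to the product, which is what makes this updating effective with many pieces of evidence.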

18 Where do probabilities come from?
- There are three positions:
  - The frequentist: numbers can come only from experiments
  - The objectivist: probabilities are real aspects of the universe
  - The subjectivist: probabilities characterize an agent's degree of belief
- What is the probability that the sun will still exist tomorrow? (p. 430)
- The reference class problem
  - An example: how a doctor categorizes patients

19 Summary
- Uncertainty exists in the real world.
- It is good (it allows for laziness) and bad (we need new tools).
- Priors, posteriors, and the joint distribution
- Bayes' rule: the basis of Bayesian inference
- Conditional independence allows Bayesian updating to work effectively with many pieces of evidence.
- But ...

