
Uncertainty: Chapter 13

Uncertainty
Evolution of an intelligent agent: problem solving, planning, and now uncertainty.
Dealing with uncertainty is an unavoidable problem in reality: an agent must act under uncertainty.
To make decisions under uncertainty, we need:
- Probability theory
- Utility theory
- Decision theory

Sources of uncertainty
- No access to the whole truth; often no categorical answer
- Incompleteness: the qualification problem - it is impossible to explicitly enumerate all conditions
- Incorrectness of information about conditions
A rational decision depends on both the relative importance of the various goals and the likelihood that they will be achieved.

Handling uncertain knowledge
First-order logic has difficulty coping with uncertain knowledge. Consider a dental diagnosis system written in FOL:
- Symptom(p, Toothache) => Disease(p, Cavity)
- Disease(p, Cavity) => Symptom(p, Toothache)
Are these rules correct? No, for three reasons:
- Laziness - listing the complete set of antecedents and consequents is too much work
- Theoretical ignorance - we do not know everything about the domain
- Practical ignorance - even if we did, we cannot include it all
Instead, represent uncertain knowledge with a degree of belief. The tool for handling degrees of belief is probability theory.

Probability provides a way of summarizing the uncertainty that comes from our laziness and ignorance - how wonderful it is!
A probability is a degree of belief in the truth of a sentence:
- 1 - certainly true; 0 - certainly false
- 0 < P < 1 - intermediate degrees of belief in the truth of the sentence
Distinguish degree of truth (fuzzy logic) from degree of belief (probability).
Alternatives to probability theory? Yes, to be discussed in later chapters.

All probability statements must indicate the evidence with respect to which the probability is being assessed:
- Prior (unconditional) probability: before any evidence is obtained
- Posterior (conditional) probability: after new evidence is obtained

Uncertainty & rational decisions
Without uncertainty, decision making is simple: a plan either achieves the goal or it does not.
With uncertainty, comparing plans is harder - e.g., the airport plans A90, A120, and A1440 (leave 90, 120, or 1440 minutes before the flight).
We first need preferences between the different possible outcomes of the plans.
Utility theory is used to represent and reason with such preferences.

Rationality
Decision theory = probability theory + utility theory.
The principle of Maximum Expected Utility (MEU) defines rationality: an agent is rational iff it chooses the action that yields the highest expected utility, averaged over all possible outcomes of the action.
A decision-theoretic agent (Fig 13.1, p. 466) - is it any different from the other agents we have studied? (See the sketch below.)
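A minimal sketch of the MEU choice. The action names, outcome probabilities, and utilities below are hypothetical illustration values (loosely echoing the airport plans), not numbers from the textbook:

```python
# Minimal sketch of the Maximum Expected Utility principle.
# Probabilities and utilities are hypothetical illustration values.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

# Airport plans: (P(outcome), utility) pairs, e.g. on-time vs. missed flight.
actions = {
    "A90":   [(0.70, 100.0), (0.30, -500.0)],
    "A120":  [(0.95,  95.0), (0.05, -500.0)],
    "A1440": [(0.9999, -100.0), (0.0001, -500.0)],  # huge wait at the airport
}

# The rational (MEU) agent picks the action with highest expected utility.
best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best, expected_utility(actions[best]))  # -> A120 65.25
```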

Basic probability notation
Prior probability:
- of a proposition: P(Sunny)
- of a random variable value: P(Weather = Sunny)
- random variables may be Boolean, discrete, or continuous
- each random variable has a domain, e.g., Weather's domain is (sunny, rain, cloudy, snow)
Probability distribution: P(Weather) = <0.7, 0.2, 0.08, 0.02>, one probability for each value in the domain
Joint probability, P(A^B): the probabilities of all combinations of the values of a set of random variables - more later

Conditional probability
P(A|B) = P(A^B)/P(B)
Product rule: P(A^B) = P(A|B)P(B)
Probabilistic inference does not work like logical inference: "P(A|B) = 0.6" does not mean "whenever B is true, P(A) is 0.6". As evidence accumulates, the relevant probability keeps changing: P(A), P(A|B), P(A|B,C), ...
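A quick worked instance of both rules, using the toothache-world numbers from Fig 13.3 (which appear later in this deck): with P(cavity^toothache) = 0.12 and P(toothache) = 0.2,
P(cavity|toothache) = 0.12 / 0.2 = 0.6
and the product rule recovers the joint: P(cavity^toothache) = P(cavity|toothache) P(toothache) = 0.6 * 0.2 = 0.12.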

The axioms of probability
- All probabilities are between 0 and 1: 0 <= P(A) <= 1
- Necessarily true (valid) propositions have probability 1; necessarily false (unsatisfiable) propositions have probability 0
- The probability of a disjunction: P(A v B) = P(A) + P(B) - P(A^B) (a Venn diagram illustrates this)
Exercise: derive the rule of negation from P(a v !a).
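The derivation the exercise asks for, in the slide's own notation:
P(a v !a) = P(a) + P(!a) - P(a ^ !a)   (disjunction axiom)
1 = P(a) + P(!a) - 0                   (a v !a is valid; a ^ !a is unsatisfiable)
so P(!a) = 1 - P(a).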

The joint probability distribution
The joint completely specifies an agent's probability assignments to all propositions in the domain.
A probabilistic model consists of a set of random variables (X1, ..., Xn). An atomic event is an assignment of particular values to all the variables.
Given Boolean random variables A and B, what are the atomic events? (A^B, A^!B, !A^B, !A^!B)

Joint probabilities
An example with two Boolean variables: the joint table over Cavity and Toothache.
Observations: the atomic events are mutually exclusive and collectively exhaustive.
What are P(Cavity), P(Cavity v Toothache), P(Cavity|Toothache)? (See the sketch below.)
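A runnable sketch of these three queries. The joint table itself did not survive the transcript, so the entries below are assumed from AIMA's classic two-variable example:

```python
# Joint distribution over two Boolean variables, keyed (cavity, toothache).
# Table values assumed from AIMA's classic two-variable example.
joint2 = {
    (True, True): 0.04, (True, False): 0.06,
    (False, True): 0.01, (False, False): 0.89,
}

p_cavity = sum(p for (c, t), p in joint2.items() if c)          # marginal
p_cavity_or_tooth = sum(p for (c, t), p in joint2.items() if c or t)
p_tooth = sum(p for (c, t), p in joint2.items() if t)
p_cavity_given_tooth = joint2[(True, True)] / p_tooth           # conditional

print(p_cavity, p_cavity_or_tooth, p_cavity_given_tooth)
# -> approximately 0.1, 0.11, 0.8
```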

Joint (2)
If we have the joint, we can read off any probability we need. Is that true? How? (Yes - by summing atomic events, as the next slide shows.)
But it is impractical to specify all 2^n entries of the joint over n Boolean variables.
One way out: sidestep the joint and work directly with conditional probabilities.

Inference using full joint distributions
Marginal probability (Fig 13.3): P(cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2
Marginalization - summing out all the variables other than cavity:
P(Y) = Σz P(Y, z)
Conditioning - a variant of marginalization using the product rule:
P(Y) = Σz P(Y|z) P(z)
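A runnable sketch of marginalization over the full toothache/catch/cavity joint; the eight entries are the Fig 13.3 values from AIMA:

```python
# Full joint over (cavity, toothache, catch); entries from AIMA Fig 13.3.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def marginal(var_index, value):
    """P(Var = value), obtained by summing out all the other variables."""
    return sum(p for event, p in joint.items() if event[var_index] == value)

print(marginal(0, True))  # P(cavity)    -> 0.2
print(marginal(1, True))  # P(toothache) -> 0.2
```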

Normalization
Method 1, using the definition of conditional probability:
- P(cavity|toothache) = P(cavity^toothache)/P(toothache)
- P(!cavity|toothache) = P(!cavity^toothache)/P(toothache)
Method 2, using normalization:
- P(cavity|toothache) = α P(cavity, toothache) = α[P(cavity, toothache, catch) + P(cavity, toothache, !catch)] = α[0.108 + 0.012] = 0.12 α
What is α?
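Answering the slide's question with the Fig 13.3 numbers: α is whatever constant makes the conditional distribution sum to 1.
P(cavity|toothache) = 0.12 α and P(!cavity|toothache) = α[0.016 + 0.064] = 0.08 α
0.12 α + 0.08 α = 1, so α = 1/0.2 = 5, i.e., α = 1/P(toothache).
Hence P(cavity|toothache) = 0.6 and P(!cavity|toothache) = 0.4.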

Independence
Consider P(Toothache, Catch, Cavity, Weather): a total of 32 entries, since Weather has 4 values.
How is one's tooth problem related to the weather? By the product rule, P(T, Ch, Cy, W=cloudy) = P(W=cloudy | T, Ch, Cy) P(T, Ch, Cy).
But whose tooth problem can influence the weather? So P(W=cloudy | T, Ch, Cy) = P(W=cloudy).
Hence P(T, Ch, Cy, W=cloudy) = P(W=cloudy) P(T, Ch, Cy).
How many joint distribution tables do we need now? Two, of sizes 4 and 8, instead of one of size 32.
Independence between X and Y means P(X|Y) = P(X), or P(Y|X) = P(Y), or P(X,Y) = P(X)P(Y).
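A tiny numeric check of the factored representation. The weather prior below is a hypothetical distribution assumed purely for illustration; the tooth table is the same Fig 13.3 joint as in the earlier sketch:

```python
# Hypothetical weather prior (values assumed purely for illustration).
p_weather = {"sunny": 0.7, "rain": 0.2, "cloudy": 0.08, "snow": 0.02}

# Same Fig 13.3 joint over (cavity, toothache, catch) as above.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

# Independence of Weather from the tooth variables means the 32-entry
# joint is just the product of the 4-entry and 8-entry tables.
full_joint = {(event, w): p * pw for event, p in joint.items()
                                 for w, pw in p_weather.items()}
assert len(full_joint) == 32
assert abs(sum(full_joint.values()) - 1.0) < 1e-9
print(full_joint[((True, True, True), "cloudy")])  # P(cav^tooth^catch^cloudy)
```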

Bayes' rule
Deriving the rule via the product rule: P(B|A) = P(A|B)P(B)/P(A)
The more general case: P(X|Y) = P(Y|X)P(X)/P(Y)
Bayes' rule conditionalized on background evidence E: P(X|Y,E) = P(Y|X,E)P(X|E)/P(Y|E)
Applying the rule to medical diagnosis: meningitis (P(M) = 1/50,000), stiff neck (P(S) = 1/20), P(S|M) = 0.5 - what is P(M|S)? (Worked below.)
Why is this kind of inference useful?
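The worked answer, straight from Bayes' rule with the slide's numbers:
P(M|S) = P(S|M) P(M) / P(S) = (0.5 * 1/50,000) / (1/20) = 0.00001 / 0.05 = 0.0002
So even given a stiff neck, meningitis remains very unlikely - about 1 in 5,000.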

Applying Bayes' rule
Relative likelihood: comparing meningitis and whiplash given a stiff neck - which is more likely?
P(M|S)/P(W|S) = [P(S|M)P(M)] / [P(S|W)P(W)] - the P(S) terms cancel.
Avoiding direct assessment of the prior P(S):
P(M|S) = ? P(!M|S) = ? Since P(M|S) + P(!M|S) = 1, do we still need P(S)? What do we need instead? P(S|!M).
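Spelling out the normalization trick the slide is pointing at:
P(M|S) = α P(S|M) P(M) and P(!M|S) = α P(S|!M) P(!M)
Since P(M|S) + P(!M|S) = 1, α = 1 / [P(S|M)P(M) + P(S|!M)P(!M)]
so P(S) is never assessed directly - P(S|!M) takes its place.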

Using Bayes' rule
Combining evidence: from P(Cavity|Toothache) and P(Cavity|Catch) to P(Cavity|Toothache, Catch).
Bayesian updating, one piece of evidence at a time:
- from P(Cavity|T) = P(T|Cavity)P(Cavity)/P(T)   [P(A|B) = P(B|A)P(A)/P(B)]
- to P(Cavity|T, Catch) = P(Catch|T, Cavity)P(Cavity|T)/P(Catch|T)   [P(A|B,C) = P(B|A,C)P(A|C)/P(B|C)]

Recall that for independent events A, B: P(B|A) = P(B), P(A|B) = P(A), P(A,B) = P(A)P(B).
Conditional independence (X and Y are independent given Z):
P(X|Y,Z) = P(X|Z) and P(Y|X,Z) = P(Y|Z)
P(X,Y|Z) = P(X|Z)P(Y|Z), derived just as in the absolute-independence case.
Given Cavity, Toothache and Catch are conditionally independent:
P(T,Ch,Cy) = P(T,Ch|Cy)P(Cy) = P(T|Cy)P(Ch|Cy)P(Cy)
One large table (2^3 - 1 = 7 independent entries) is decomposed into 3 smaller tables - P(T|Cy), P(Ch|Cy), and P(Cy) - with only 2*(2^1 - 1) + 2*(2^1 - 1) + 1 = 5 entries.

Independence, decomposition, naïve Bayes
If all n symptoms are conditionally independent given Cavity, the size of the representation grows as O(n) instead of O(2^n).
The decomposition of large probabilistic domains into weakly connected subsets via conditional independence is one of the most important developments in modern AI.
Naïve Bayes model (one Cause, many Effects):
P(Cause, E1, ..., En) = P(Cause) Π_i P(Ei|Cause)
An amazingly successful classifier as well. (A sketch follows.)
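A minimal sketch of a naïve Bayes classifier built directly from this factorization. The conditional probability tables below are derived from the Fig 13.3 joint (P(cavity) = 0.2, P(toothache|cavity) = 0.6, P(catch|cavity) = 0.9, P(toothache|!cavity) = 0.1, P(catch|!cavity) = 0.2):

```python
# Naive Bayes: P(Cause | e1..en) is proportional to P(Cause) * Π P(ei|Cause).
# CPT values derived from the Fig 13.3 joint distribution.

prior = {True: 0.2, False: 0.8}            # P(Cavity)
cpt = {                                    # P(Effect = true | Cavity)
    "toothache": {True: 0.6, False: 0.1},
    "catch":     {True: 0.9, False: 0.2},
}

def posterior_cavity(evidence):
    """evidence: dict effect -> observed bool; returns P(Cavity=true | e)."""
    score = {}
    for cavity in (True, False):
        s = prior[cavity]
        for effect, observed in evidence.items():
            p_true = cpt[effect][cavity]
            s *= p_true if observed else 1.0 - p_true
        score[cavity] = s
    return score[True] / (score[True] + score[False])   # normalization

print(posterior_cavity({"toothache": True, "catch": True}))
# -> 0.108 / (0.108 + 0.016) ≈ 0.871
```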

Where do probabilities come from?
There are three positions:
- The frequentist: numbers can come only from experiments
- The objectivist: probabilities are real aspects of the universe
- The subjectivist: probabilities characterize an agent's beliefs
The reference class problem - an intrusion of subjectivity: a frequentist doctor wants to consider patients similar to the current one, but how similar must two patients be?
Laplace's principle of indifference: propositions that are syntactically "symmetric" with respect to the evidence should be accorded equal probability.

Summary
Uncertainty exists in the real world. It is good (it allows for laziness) and bad (we need new tools).
Priors, posteriors, and the joint distribution.
Bayes' rule - the basis of Bayesian inference.
Conditional independence allows Bayesian updating to work effectively with many pieces of evidence. But...