Uncertainty Management for Intelligent Systems (SEP502, June 2006) 김 진형, KAIST

2 Contents Uncertainty Probability Bayes’ Rule Where Do Probabilities Come From? Summary

3 Uncertainty Motivation: the truth value is unknown, or too complex to compute before a decision must be made; this is characteristic of real-world applications. Sources of uncertainty: cannot be explained by a deterministic model (decay of radioactive substances); not well understood (disease transmission mechanisms); too complex to compute (coin tossing).

4 Types of Uncertainty Randomness: which side will be up if I toss a coin? Vagueness: am I pretty? Confidence: how confident are you in your decision? One formalism for all vs. separate formalisms. Representation + computational engine.

5 Uncertainty Representation Binary Logic Multi-valued Logic Probability Theory Upper/Lower Probability Possibility Theory

6 Applications Wide range of applications: disease diagnosis, language understanding, pattern recognition, managerial decision making. Useful answers from uncertain, conflicting knowledge: acquiring qualitative and quantitative relationships, data fusion, combining multiple experts’ opinions.

7 Handling Uncertain Knowledge
Diagnosis rule: ∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity)
More honestly: ∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity) ∨ Disease(p, GumDisease) ∨ Disease(p, ImpactedWisdom) ∨ …
Causal rule: ∀p Disease(p, Cavity) ⇒ Symptom(p, Toothache)
But not every cavity causes a toothache.

8 Why First-Order Logic Fails Laziness: too much work to prepare a complete set of exception-free rules, and the enormous rules that result are too hard to use. Theoretical ignorance: medical science has no complete theory for the domain. Practical ignorance: all the necessary tests cannot be run, even if we knew all the rules.

9 Degree of Belief An agent can assign a degree of belief to a sentence. Main tool: probability theory, which assigns a numerical degree of belief between 0 and 1 to sentences; this is a way of summarizing the uncertainty that comes from laziness and ignorance. Probabilities can be derived from statistical data.

10 Degree of Belief vs. Degree of Truth Degree of Belief The sentence itself is in fact either true or false Same ontological commitment as logic ; the facts either do or do not hold in the world Probability theory Degree of Truth Not a question of the external world Case of vagueness or uncertainty about the meaning of the linguistic term “tall”, “pretty” Fuzzy logic

11 Evidence in Uncertain Reasoning Assign probability to a proposition based on the percepts that it has received to date Evidence : perception that an agent receives Probabilities can change when more evidence is acquired Prior / unconditional probability : no evidence at all Posterior / conditional probability : after evidence is obtained

12 Uncertainty and Rational Decisions No plan can guarantee to achieve the goal. To make a choice, the agent must have preferences between the different possible outcomes of the various plans, e.g., missing the plane vs. a long wait at the airport. Utility theory is used to represent and reason with preferences. Utility: the quality of being useful (degree of usefulness).

13 Principle of Maximum Expected Utility Decision Theory = Probability Theory + Utility Theory. An agent is rational if and only if it chooses the action that yields the highest expected utility, averaged over all possible outcomes of the action.
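The principle above can be sketched in a few lines of Python. All numbers here are illustrative assumptions (made-up probabilities and utilities for the leave-for-airport example on the previous slide), not values from the lecture.

```python
# Sketch of the maximum-expected-utility principle.
# The outcome probabilities and utilities below are illustrative only.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

def best_action(actions):
    """actions: dict mapping action name -> list of (probability, utility)."""
    return max(actions, key=lambda a: expected_utility(actions[a]))

# Leaving earlier raises P(catch the flight) but lowers the utility of
# the "on time" outcome (long wait at the airport).
actions = {
    "leave_90_min_early":  [(0.95, 60), (0.05, -500)],    # (P(on time), U), (P(miss), U)
    "leave_180_min_early": [(0.999, 30), (0.001, -500)],
}
print(best_action(actions))
```

With these assumed numbers, the shorter wait wins: 0.95·60 + 0.05·(−500) = 32 beats 0.999·30 + 0.001·(−500) ≈ 29.5.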

14 Prior Probability P(A) : unconditional or prior probability that the proposition A is true No other information on the proposition P(Cavity) = 0.1 Proposition can include equality using random variable P(Weather = Sunny) = 0.7, P(Weather = Rain) = 0.2 P(Weather = Cloudy) = 0.08, P(Weather = Snow) = 0.02 Each random variable X has domain of possible values

15 Prior Probability P(Weather) denotes a vector of values: the probability distribution for the random variable Weather. P(Weather, Cavity): the probabilities of all combinations of the values of a set of random variables, a 4 × 2 table. Propositions can combine variables: P(Cavity ∧ ¬Insured) = 0.06.
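A prior distribution like P(Weather) is just a table over the variable's domain whose entries sum to 1; a minimal sketch, using the values from the previous slide:

```python
# P(Weather) as a table over the domain of the random variable Weather
# (values taken from the slide; a valid distribution must sum to 1).
P_weather = {"Sunny": 0.7, "Rain": 0.2, "Cloudy": 0.08, "Snow": 0.02}

assert abs(sum(P_weather.values()) - 1.0) < 1e-9  # normalization check
print(P_weather["Sunny"])
```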

16 Conditional Probability As soon as evidence is obtained concerning a previously unknown proposition, prior probabilities are no longer applicable; we use conditional or posterior probabilities instead.
P(A|B): the probability of A given that all we know is B, e.g., P(Cavity|Toothache) = 0.8. If new information C becomes known: P(A|B ∧ C).
P(A|B) = P(A ∧ B) / P(B)
P(A ∧ B) = P(A|B)P(B) = P(B|A)P(A)
P(X,Y) = P(X|Y)P(Y)
P(X=x1 ∧ Y=y2) = P(X=x1|Y=y2)P(Y=y2)
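The product rule above can be verified numerically. The numbers below are illustrative assumptions for two generic propositions A and B:

```python
# Product rule: P(A ∧ B) = P(A|B) P(B) = P(B|A) P(A),
# checked with illustrative (assumed) numbers.
p_a, p_b = 0.3, 0.2
p_a_and_b = 0.1                    # must not exceed min(p_a, p_b)

p_a_given_b = p_a_and_b / p_b      # P(A|B) = P(A ∧ B) / P(B)
p_b_given_a = p_a_and_b / p_a      # P(B|A) = P(A ∧ B) / P(A)

# Both factorizations recover the same joint probability.
assert abs(p_a_given_b * p_b - p_a_and_b) < 1e-12
assert abs(p_b_given_a * p_a - p_a_and_b) < 1e-12
print(p_a_given_b, p_b_given_a)
```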

17 Axioms of Probability All probabilities are between 0 and 1: 0 ≤ P(A) ≤ 1. Necessarily true propositions have probability 1 and necessarily false propositions have probability 0: P(True) = 1, P(False) = 0. The probability of a disjunction is P(A ∨ B) = P(A) + P(B) − P(A ∧ B).

18 Axioms of Probability (Venn diagram illustrating A, B, and their overlap A ∧ B)

19 Axioms of Probability
P(A ∨ ¬A) = P(A) + P(¬A) − P(A ∧ ¬A)
1 = P(A) + P(¬A)
P(¬A) = 1 − P(A)

20 Joint Probability Distribution Completely assigns probabilities to all propositions in the domain. The joint probability distribution P(X1, X2, …, Xn) assigns probabilities to all possible atomic events. Atomic event: an assignment of particular values to all the variables (a complete specification of the state of the domain), e.g., the four combinations of Cavity/¬Cavity and Toothache/¬Toothache.

21 Joint Probability Distribution Atomic events are mutually exclusive: any conjunction of distinct atomic events is necessarily false. Atomic events are collectively exhaustive: the disjunction of all atomic events is necessarily true.
P(Cavity ∨ Toothache) = 0.04 + 0.06 + 0.01 = 0.11
P(Cavity|Toothache) = P(Cavity ∧ Toothache) / P(Toothache) = 0.04/(0.04 + 0.01) = 0.80
Not practical to define 2^n entries for the joint probability distribution over n Boolean variables.
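Inference from the full joint can be sketched directly. The four table entries below are derived from the numbers on the slide (P(Cavity ∧ Toothache) = 0.04, P(Cavity|Toothache) = 0.80, P(Cavity ∨ Toothache) = 0.11, and the requirement that the table sum to 1):

```python
# Full joint distribution over (Cavity, Toothache), keys are truth values.
joint = {
    (True,  True):  0.04,   # Cavity ∧ Toothache
    (True,  False): 0.06,   # Cavity ∧ ¬Toothache
    (False, True):  0.01,   # ¬Cavity ∧ Toothache
    (False, False): 0.89,   # ¬Cavity ∧ ¬Toothache
}
assert abs(sum(joint.values()) - 1.0) < 1e-12   # collectively exhaustive

# Marginal: P(Toothache) = sum over Cavity of P(Cavity, Toothache)
p_toothache = sum(p for (c, t), p in joint.items() if t)

# Disjunction: P(Cavity ∨ Toothache) = sum of entries where either holds
p_cavity_or_toothache = sum(p for (c, t), p in joint.items() if c or t)

# Conditional: P(Cavity | Toothache) = P(Cavity ∧ Toothache) / P(Toothache)
p_cavity_given_toothache = joint[(True, True)] / p_toothache

print(p_cavity_or_toothache, p_cavity_given_toothache)
```

The same summing-out pattern works for any number of variables, which is exactly why the 2^n-entry table becomes impractical as n grows.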

22 Bayes’ Rule Allows converting between P(A|B) and P(B|A). From the product rule P(A ∧ B) = P(A|B)P(B) = P(B|A)P(A), it follows that
P(B|A) = P(A|B)P(B) / P(A)
In general: P(Y|X) = P(X|Y)P(Y) / P(X), and with background evidence E: P(Y|X,E) = P(X|Y,E)P(Y|E) / P(X|E)

23 Applying Bayes’ Rule Allows eliciting psychologically obtainable values: P(symptom | disease) vs. P(disease | symptom); P(cause | effect) vs. P(effect | cause); P(object | attribute) vs. P(attribute | object).
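A minimal sketch of this use of Bayes' rule in the diagnosis direction; the three input probabilities are illustrative assumptions, not clinical values:

```python
# Bayes' rule: P(disease | symptom) = P(symptom | disease) P(disease) / P(symptom).
# Illustrative (assumed) numbers for a diagnosis setting.
p_symptom_given_disease = 0.9   # causal direction, easy to elicit
p_disease = 0.01                # prior probability of the disease
p_symptom = 0.1                 # overall probability of the symptom

p_disease_given_symptom = p_symptom_given_disease * p_disease / p_symptom
print(p_disease_given_symptom)
```

Note how a symptom that is very likely given the disease (0.9) still yields a modest posterior (0.09) when the prior is small; this is the point of eliciting P(effect | cause) and letting the rule do the inversion.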

24 Bayes’ Rule: Normalization Since P(M|S) + P(¬M|S) = 1, we have P(S) = P(S|M)P(M) + P(S|¬M)P(¬M). Normalization: 1/P(S) acts as a normalization constant that makes the conditional terms sum to 1. P(Y|X) = α P(X|Y)P(Y), where α is the normalization constant that makes the entries in the table sum to 1.
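The normalization trick can be sketched as follows: compute the unnormalized terms P(S|y)P(y) for each hypothesis y, then divide by their sum instead of ever computing P(S) separately. The numbers are illustrative assumptions:

```python
# Posterior P(M|S) via normalization, without knowing P(S) directly.
# Illustrative (assumed) numbers.
p_s_given_m     = 0.8    # P(S | M)
p_s_given_not_m = 0.1    # P(S | ¬M)
p_m             = 0.2    # prior P(M)

# Unnormalized terms P(S|y) P(y) for each hypothesis y.
unnormalized = {
    "M":     p_s_given_m * p_m,
    "not_M": p_s_given_not_m * (1 - p_m),
}

# α = 1 / Σ_y P(S|y) P(y) = 1 / P(S)
alpha = 1.0 / sum(unnormalized.values())
posterior = {h: alpha * v for h, v in unnormalized.items()}

assert abs(sum(posterior.values()) - 1.0) < 1e-12  # entries sum to 1
print(posterior["M"])
```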

25 Where Do Probabilities Come From? Endless debate over the source and status of probability numbers. Frequentist: the numbers can come only from experiments. Objectivist: probabilities are real aspects of the universe (propensities of objects to behave in certain ways). Subjectivist: probabilities are a way of characterizing an agent’s beliefs, rather than having any external physical significance.

26 Where Do Probabilities Come From? Probability that the sun will still exist tomorrow (a question raised by Hume’s Inquiry). The probability is undefined, because there has never been an experiment that tested the existence of the sun tomorrow. The probability is 1, because in all the experiments that have been done (on past days) the sun has existed. The probability is 1 − ε, where ε is the proportion of stars in the universe that go supernova and explode per day. The probability is (d+1)/(d+2), where d is the number of days that the sun has existed so far (Laplace). The probability can be derived from the type, age, size, and temperature of the sun, even though we have never observed another star with those exact properties.
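Laplace's answer is simple enough to sketch directly; it gives 1/2 with no history and approaches 1 as the run of sunrises grows:

```python
# Laplace's rule of succession: after d consecutive successes (sunrises),
# the probability of one more success is (d + 1) / (d + 2).
def laplace_rule(d):
    return (d + 1) / (d + 2)

print(laplace_rule(0))      # no history yet: 1/2
print(laplace_rule(10000))  # long history: close to 1
```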

27 Summary Probability is the right way to reason about uncertainty. Uncertainty arises because of both laziness and ignorance; it is inescapable in complex, dynamic, or inaccessible worlds. Uncertainty means that many of the simplifications that are possible with deductive inference are no longer valid. Probability expresses the agent’s inability to reach a definite decision regarding the truth of a sentence, and summarizes the agent’s beliefs. Basic probability statements include prior probabilities and conditional probabilities over simple and complex propositions. The axioms of probability specify constraints on reasonable assignments of probabilities to propositions; an agent that violates the axioms will behave irrationally in some circumstances.

28 Summary The joint probability distribution specifies the probability of each complete assignment of values to random variables; it is usually far too large to create or use. Bayes’ rule allows unknown probabilities to be computed from known, stable ones. In the general case, combining many pieces of evidence may require assessing a large number of conditional probabilities. Conditional independence brought about by direct causal relationships in the domain allows Bayesian updating to work effectively even with multiple pieces of evidence.