Uncertainty Management for Intelligent Systems : for SEP502 June 2006 김 진형 KAIST


1 Uncertainty Management for Intelligent Systems: for SEP502, June 2006, 김 진형, KAIST, prof_jkim@kaist.ac.kr

2 Contents
- Uncertainty
- Probability
- Bayes' Rule
- Where Do Probabilities Come From?
- Summary

3 Uncertainty
Motivation
- Truth value is unknown
- Too complex to compute prior to making a decision
- Characteristic of real-world applications
Sources of uncertainty
- Cannot be explained by a deterministic model: decay of radioactive substances
- Not well understood: disease transmission mechanisms
- Too complex to compute: coin tossing

4 Types of Uncertainty
- Randomness: Which side will come up if I toss a coin?
- Vagueness: Am I pretty?
- Confidence: How confident are you in your decision?
One formalism for all vs. separate formalisms
Representation + computational engine

5 Uncertainty Representation
- Binary logic
- Multi-valued logic
- Probability theory
- Upper/lower probability
- Possibility theory

6 Applications
Wide range of applications
- diagnosing disease
- language understanding
- pattern recognition
- managerial decision making
Useful answers from uncertain, conflicting knowledge
- acquiring qualitative and quantitative relationships
- data fusion
- combining multiple experts' opinions

7 Handling Uncertain Knowledge
Diagnostic rule:
∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity)
∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity) ∨ Disease(p, GumDisease) ∨ Disease(p, ImpactedWisdom) ∨ …
Causal rule:
∀p Disease(p, Cavity) ⇒ Symptom(p, Toothache)
But not every cavity causes a toothache.

8 Why Does First-Order Logic Fail?
- Laziness: it is too much work to prepare a complete set of exceptionless rules, and too hard to use the enormous rule set that results
- Theoretical ignorance: medical science has no complete theory for the domain
- Practical ignorance: even if we knew all the rules, not all the necessary tests can be run

9 Degree of Belief
- An agent can attach a degree of belief to a sentence
- Main tool: probability theory, which assigns a numerical degree of belief between 0 and 1 to sentences
- Probability is a way of summarizing the uncertainty that comes from laziness and ignorance
- Probabilities can be derived from statistical data

10 Degree of Belief vs. Degree of Truth
Degree of belief
- The sentence itself is in fact either true or false
- Same ontological commitment as logic: the facts either do or do not hold in the world
- Handled by probability theory
Degree of truth
- Not a question about the external world
- A case of vagueness or uncertainty about the meaning of a linguistic term such as "tall" or "pretty"
- Handled by fuzzy logic

11 Evidence in Uncertain Reasoning
- An agent assigns a probability to a proposition based on the percepts it has received to date
- Evidence: the perceptions an agent receives
- Probabilities can change as more evidence is acquired
- Prior / unconditional probability: no evidence at all
- Posterior / conditional probability: after evidence is obtained

12 Uncertainty and Rational Decisions
- No plan can guarantee achieving the goal
- To make a choice, the agent must have preferences between the different possible outcomes of the various plans (e.g., missing the plane vs. a long wait at the airport)
- Utility theory represents and reasons with preferences
- Utility: the quality of being useful (degree of usefulness)

13 Principle of Maximum Expected Utility
Decision theory = probability theory + utility theory
An agent is rational if and only if it chooses the action that yields the highest expected utility, averaged over all possible outcomes of the action.
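The maximum-expected-utility principle on this slide can be sketched in a few lines. The actions, outcome probabilities, and utilities below are made-up illustration values, not from the lecture; they echo the plane-vs-waiting example from the previous slide.

```python
# Sketch of the MEU principle: EU(action) = sum over outcomes of
# P(outcome | action) * U(outcome); the rational agent maximizes EU.
# All numbers below are hypothetical.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

# Leave early for the airport vs. leave late (toy numbers).
actions = {
    "leave_early": [(0.95, 50), (0.05, -10)],   # long wait, rarely miss the plane
    "leave_late":  [(0.60, 80), (0.40, -200)],  # no wait, often miss the plane
}

best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # the action with the highest expected utility
```

With these toy numbers, leaving early wins (EU = 47.0 vs. -32.0), even though leaving late has the single best outcome.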

14 Prior Probability
- P(A): unconditional or prior probability that proposition A is true, given no other information about the proposition
- P(Cavity) = 0.1
- Propositions can include equalities involving random variables:
  P(Weather = Sunny) = 0.7, P(Weather = Rain) = 0.2,
  P(Weather = Cloudy) = 0.08, P(Weather = Snow) = 0.02
- Each random variable X has a domain of possible values

15 Prior Probability
- P(Weather): a vector of values, the probability distribution for the random variable Weather
- P(Weather, Cavity): probabilities of all combinations of the values of a set of random variables, a 4 × 2 table
- P(Cavity ∧ ¬Insured) = 0.06
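Slides 14-15 can be made concrete: a distribution is just a table over a variable's domain that sums to 1, and a joint distribution over several variables has one entry per value combination.

```python
# The slide's prior distribution P(Weather) as a table over the
# variable's domain; a valid distribution must sum to 1.
P_weather = {"Sunny": 0.7, "Rain": 0.2, "Cloudy": 0.08, "Snow": 0.02}
assert abs(sum(P_weather.values()) - 1.0) < 1e-9

# P(Weather, Cavity) is then a 4 x 2 table: one entry per combination
# of a Weather value with a Cavity value.
values_weather = list(P_weather)
values_cavity = [True, False]
n_entries = len(values_weather) * len(values_cavity)
print(n_entries)  # 8
```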

16 Conditional Probability
As soon as evidence is obtained concerning the previously unknown propositions making up the domain, prior probabilities are no longer applicable; we use conditional or posterior probabilities instead.
- P(A|B): probability of A given that all we know is B
- P(Cavity|Toothache) = 0.8
- If new information C becomes known, we use P(A|B ∧ C)
- P(A|B) = P(A ∧ B) / P(B)
- Product rule: P(A ∧ B) = P(A|B)P(B) = P(B|A)P(A)
- P(X, Y) = P(X|Y)P(Y)
- P(X=x1 ∧ Y=y2) = P(X=x1|Y=y2)P(Y=y2)
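The definition and the product rule can be checked numerically. The figures 0.04 and 0.05 are taken from the joint-distribution table on slide 20 (P(Cavity ∧ Toothache) = 0.04, P(Toothache) = 0.04 + 0.01 = 0.05), which reproduces the slide's P(Cavity|Toothache) = 0.8.

```python
def conditional(p_a_and_b, p_b):
    """P(A|B) = P(A ∧ B) / P(B), defined when P(B) > 0."""
    return p_a_and_b / p_b

# Numbers from the lecture's dental joint distribution (slide 20).
p_cavity_and_toothache = 0.04
p_toothache = 0.05

print(round(conditional(p_cavity_and_toothache, p_toothache), 2))  # 0.8

# Product rule: P(A ∧ B) = P(A|B) P(B) recovers the joint entry.
recovered = conditional(p_cavity_and_toothache, p_toothache) * p_toothache
assert abs(recovered - p_cavity_and_toothache) < 1e-12
```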

17 Axioms of Probability
- All probabilities are between 0 and 1: 0 ≤ P(A) ≤ 1
- Necessarily true propositions have probability 1 and necessarily false propositions have probability 0: P(True) = 1, P(False) = 0
- The probability of a disjunction is P(A ∨ B) = P(A) + P(B) - P(A ∧ B)

18 Axioms of Probability
[Venn diagram of sets A and B with overlapping region A ∧ B, illustrating the disjunction axiom]

19 Axioms of Probability
P(A ∨ ¬A) = P(A) + P(¬A) - P(A ∧ ¬A)
1 = P(A) + P(¬A)
P(¬A) = 1 - P(A)

20 Joint Probability Distribution
- Completely assigns probabilities to all propositions in the domain
- The joint probability distribution P(X1, X2, …, Xn) assigns probabilities to all possible atomic events
- Atomic event: an assignment of particular values to all the variables (a complete specification of the state of the domain)

               Cavity   ¬Cavity
  Toothache     0.04     0.01
  ¬Toothache    0.06     0.89

21 Joint Probability Distribution
- Atomic events are mutually exclusive: the conjunction of any two distinct atomic events is necessarily false
- Atomic events are collectively exhaustive: the disjunction of all atomic events is necessarily true
- P(Cavity ∨ Toothache) = 0.04 + 0.06 + 0.01 = 0.11
- P(Cavity|Toothache) = P(Cavity ∧ Toothache) / P(Toothache) = 0.04 / (0.04 + 0.01) = 0.80
- Not practical: defining the joint distribution over n Boolean variables requires 2^n entries
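The two computations on this slide can be done mechanically from the slide-20 table: every query over the domain is answered by summing the atomic events in which the queried proposition holds.

```python
# Slide 20's full joint distribution over (Cavity, Toothache),
# keyed by atomic event.
joint = {
    ("cavity", "toothache"):       0.04,
    ("no_cavity", "toothache"):    0.01,
    ("cavity", "no_toothache"):    0.06,
    ("no_cavity", "no_toothache"): 0.89,
}

# P(Cavity ∨ Toothache): sum every atomic event where either holds.
p_cavity_or_toothache = sum(p for (c, t), p in joint.items()
                            if c == "cavity" or t == "toothache")

# P(Cavity | Toothache) = P(Cavity ∧ Toothache) / P(Toothache),
# where P(Toothache) marginalizes over Cavity.
p_toothache = sum(p for (c, t), p in joint.items() if t == "toothache")
p_cavity_given_toothache = joint[("cavity", "toothache")] / p_toothache

print(round(p_cavity_or_toothache, 2))     # 0.11
print(round(p_cavity_given_toothache, 2))  # 0.8
```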

22 Bayes' Rule
Allows converting P(A|B) to P(B|A) and vice versa.
From the product rule P(A ∧ B) = P(A|B)P(B) = P(B|A)P(A):
- P(B|A) = P(A|B)P(B) / P(A)
- P(Y|X) = P(X|Y)P(Y) / P(X)
- Conditionalized on background evidence E: P(Y|X,E) = P(X|Y,E)P(Y|E) / P(X|E)
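A one-line function suffices to apply the rule. The disease/symptom numbers below are hypothetical, chosen only to show a diagnostic probability being computed from a causal one.

```python
# Bayes' rule: P(B|A) = P(A|B) P(B) / P(A). All numbers are made up.
def bayes(p_a_given_b, p_b, p_a):
    return p_a_given_b * p_b / p_a

p_symptom_given_disease = 0.9   # assumed causal knowledge
p_disease = 0.01                # assumed prior
p_symptom = 0.1                 # assumed evidence probability

p_disease_given_symptom = bayes(p_symptom_given_disease, p_disease, p_symptom)
print(round(p_disease_given_symptom, 2))  # 0.09
```

Note how a symptom that is very likely given the disease (0.9) still yields a small posterior (0.09) because the prior is small, which is exactly why eliciting P(symptom | disease) and applying Bayes' rule beats guessing P(disease | symptom) directly.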

23 Applying Bayes' Rule
Allows eliciting psychologically obtainable values:
- P(symptom | disease) vs. P(disease | symptom)
- P(cause | effect) vs. P(effect | cause)
- P(object | attribute) vs. P(attribute | object)

24 Bayes' Rule: Normalization
Since P(M|S) + P(¬M|S) = 1,
P(S) = P(S|M)P(M) + P(S|¬M)P(¬M)
Normalization: treat 1/P(S) as a normalization constant that makes the conditional terms sum to 1.
P(Y|X) = α P(X|Y)P(Y), where α is the normalization constant that makes the entries in the table sum to 1.
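Normalization means P(X) never has to be assessed directly: compute the unnormalized products P(X|Y)P(Y) for each value of Y, then divide by their sum. The conditional probabilities and prior below are made-up illustration values.

```python
# Normalization as on this slide: alpha = 1 / P(S) is recovered from the
# requirement that the posterior entries sum to 1. Numbers are hypothetical.
p_s_given_m, p_m = 0.8, 0.1          # assumed P(S|M) and P(M)
p_s_given_not_m, p_not_m = 0.1, 0.9  # assumed P(S|¬M) and P(¬M) = 1 - P(M)

unnorm = [p_s_given_m * p_m, p_s_given_not_m * p_not_m]  # [0.08, 0.09]
alpha = 1.0 / sum(unnorm)                                # = 1 / P(S)
posterior = [alpha * u for u in unnorm]                  # [P(M|S), P(¬M|S)]

assert abs(sum(posterior) - 1.0) < 1e-9
print([round(p, 4) for p in posterior])
```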

25 Where Do Probabilities Come From?
There is endless debate over the source and status of probability numbers.
- Frequentist: the numbers can come only from experiments
- Objectivist: probabilities are real aspects of the universe (propensities of objects to behave in certain ways)
- Subjectivist: probabilities are a way of characterizing an agent's beliefs, rather than having any external physical significance

26 Where Do Probabilities Come From?
What is the probability that the sun will still exist tomorrow? (a question raised by Hume's Inquiry)
- The probability is undefined, because there has never been an experiment that tested the existence of the sun tomorrow.
- The probability is 1, because in all the experiments that have been done (on past days) the sun has existed.
- The probability is 1 - ε, where ε is the proportion of stars in the universe that go supernova and explode per day.
- The probability is (d+1)/(d+2), where d is the number of days that the sun has existed so far. (Laplace)
- The probability can be derived from the type, age, size, and temperature of the sun, even though we have never observed another star with those exact properties.
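Laplace's answer, the rule of succession, is a one-line formula worth seeing in action: with no observed sunrises it gives an even 1/2, and it approaches 1 as the run of sunrises grows.

```python
# Laplace's rule of succession from this slide: after d consecutive
# successes, P(success on the next trial) = (d + 1) / (d + 2).
def laplace_rule(d):
    return (d + 1) / (d + 2)

print(laplace_rule(0))              # 0.5 before any observation
print(round(laplace_rule(2), 2))    # 0.75 after two sunrises
print(round(laplace_rule(10000), 4))  # approaches 1 for long runs
```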

27 Summary
- Probability is the right way to reason about uncertainty.
- Uncertainty arises from both laziness and ignorance. It is inescapable in complex, dynamic, or inaccessible worlds.
- Uncertainty means that many of the simplifications possible with deductive inference are no longer valid.
- Probability expresses the agent's inability to reach a definite decision regarding the truth of a sentence, and summarizes the agent's beliefs.
- Basic probability statements include prior probabilities and conditional probabilities over simple and complex propositions.
- The axioms of probability specify constraints on reasonable assignments of probabilities to propositions. An agent that violates the axioms will behave irrationally in some circumstances.

28 Summary
- The joint probability distribution specifies the probability of each complete assignment of values to random variables. It is usually far too large to create or use.
- Bayes' rule allows unknown probabilities to be computed from known, stable ones.
- In the general case, combining many pieces of evidence may require assessing a large number of conditional probabilities.
- Conditional independence brought about by direct causal relationships in the domain allows Bayesian updating to work effectively even with multiple pieces of evidence.

