Uncertainty ECE457 Applied Artificial Intelligence Spring 2007 Lecture #8.



ECE457 Applied Artificial Intelligence, R. Khoury (2007)

Outline
- Uncertainty
- Probability
- Bayes' Theorem
- Russell & Norvig, chapter 13

Limits of FOL
- FOL works only when facts are known to be true or false.
- "Some purple mushrooms are poisonous": ∃x Purple(x) ∧ Mushroom(x) ∧ Poisonous(x)
- In real life there is almost always uncertainty: "There's a 70% chance that a purple mushroom is poisonous."
- This cannot be represented as an FOL sentence.

Acting Under Uncertainty
- So far, the rational decision has been to pick the action with the "best" outcome.
- Two actions: #1 leads to a great outcome, #2 leads to a good outcome. It is only rational to pick #1.
- This assumes the outcome is 100% certain. What if the outcome is not certain?
- Two actions: #1 has a 1% probability of leading to a great outcome, #2 has a 90% probability of leading to a good outcome. What is the rational decision?

Acting Under Uncertainty
- Maximum Expected Utility (MEU): pick the action that leads to the best outcome, averaged over all possible outcomes of the action.
- Same principle as Expectiminimax, used to solve games of chance (see Game Playing, lecture #5).
- How do we compute the MEU? First, we need to compute the probability of each event.

Types of Uncertain Variables
- Boolean: can be true or false. Warm ∈ {True, False}
- Discrete: can take a value from a limited, countable domain. Temperature ∈ {Hot, Warm, Cool, Cold}
- Continuous: can take a value from a set of real numbers. Temperature ∈ [-35, 35]
- We'll focus on discrete variables for now.

Probability
- Each possible value in the domain of an uncertain variable is assigned a probability, representing how likely it is that the variable will have this value.
- P(Temperature=Warm): the probability that the "Temperature" variable has the value "Warm". We can simply write P(Warm).

Probability Axioms
- P(x) ∈ [0, 1]
- P(x) = 1: x is necessarily true, or certain to occur.
- P(x) = 0: x is necessarily false, or certain not to occur.
- P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
- If P(A ∧ B) = 0, A and B are said to be mutually exclusive.
- Σx P(x) = 1, if all values of x are mutually exclusive.
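The axioms above can be checked numerically. A minimal Python sketch, using the Temperature distribution that appears on a later slide (the variable names are illustrative):

```python
# Hypothetical sketch: checking the probability axioms on the
# discrete Temperature distribution used later in these slides.
temperature = {"Hot": 0.2, "Warm": 0.6, "Cool": 0.15, "Cold": 0.05}

# Axiom: every probability lies in [0, 1].
assert all(0.0 <= p <= 1.0 for p in temperature.values())

# Axiom: mutually exclusive values sum to 1.
assert abs(sum(temperature.values()) - 1.0) < 1e-9

# Inclusion-exclusion: P(A or B) = P(A) + P(B) - P(A and B).
# Hot and Cold are mutually exclusive, so P(Hot and Cold) = 0.
p_hot_or_cold = temperature["Hot"] + temperature["Cold"] - 0.0
print(round(p_hot_or_cold, 2))  # 0.25
```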

Visualizing Probabilities
- P(A) is the proportion of the event space in which A is true: area of the green circle / total area.
- P(A) = 0 if the green circle doesn't exist; P(A) = 1 if the green circle covers the entire event space.
- (Figure: the event space, with a region P(A) and its complement P(¬A).)

Visualizing Probabilities
- P(A ∨ B) = P(A) + P(B) − P(A ∧ B): sum of the areas of both circles / total area.
- P(A ∧ B) = 0: there is no intersection between the two regions; A and B can't happen together, so they are mutually exclusive.
- (Figure: two overlapping circles P(A) and P(B), with intersection P(A ∧ B).)

Prior (Unconditional) Probability
- Probability that A is true in the absence of any other information: P(A)
- Example:
  P(Temperature=Hot) = 0.2
  P(Temperature=Warm) = 0.6
  P(Temperature=Cool) = 0.15
  P(Temperature=Cold) = 0.05
- P(Temperature) = {0.2, 0.6, 0.15, 0.05}: this is a probability distribution.

Joint Probability Distribution
- Let's add another variable: Condition ∈ {Sunny, Cloudy, Raining}
- We can compute P(Temperature, Condition), a table with one probability per (Temperature, Condition) pair:

          Sunny   Cloudy   Raining
  Hot
  Warm             0.20
  Cool
  Cold

Joint Probability Distribution
- Given a joint probability distribution P(A, B), we can compute P(A = Ai):
  P(Ai) = Σj P(Ai, Bj)
- Assumes all events (Ai, Bj) are mutually exclusive.
- This is called marginalization.
- Example: P(Warm) = P(Warm, Sunny) + P(Warm, Cloudy) + P(Warm, Raining) = 0.6
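Marginalization is straightforward to sketch in code. The individual joint entries below are hypothetical; they are chosen only to be consistent with the slides' P(Warm) = 0.6 and P(Warm, Cloudy) = 0.2:

```python
# Sketch of marginalization over a joint distribution P(Temperature, Condition).
# The joint values here are illustrative assumptions, not the slide's table.
joint = {
    ("Warm", "Sunny"): 0.3,
    ("Warm", "Cloudy"): 0.2,
    ("Warm", "Raining"): 0.1,
}

# P(Warm) = sum over all conditions of P(Warm, condition)
p_warm = sum(p for (temp, _), p in joint.items() if temp == "Warm")
print(round(p_warm, 2))  # 0.6
```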

Visualizing Marginalization
- P(A) = P(A, B) + P(A, C) + P(A, D)
- Requires that no area of A is left uncovered by B, C, or D, and that B, C, and D do not intersect inside A.
- (Figure: circle P(A) partitioned into regions P(A,B), P(A,C), and P(A,D) by circles P(B), P(C), and P(D).)

Posterior (Conditional) Probability
- Probability that A is true given that we know that B is true: P(A|B)
- Can be computed using prior and joint probabilities: P(A|B) = P(A, B) / P(B)
- Example: P(Warm|Cloudy) = P(Warm, Cloudy) / P(Cloudy) = 0.2 / 0.32 = 0.625
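The definition is a one-liner; a small sketch checked against the slide's own numbers (the function name is an illustrative choice):

```python
# Conditional probability from a joint and a marginal: P(A|B) = P(A,B) / P(B)
def conditional(p_joint, p_given):
    return p_joint / p_given

# Slide values: P(Warm, Cloudy) = 0.2, P(Cloudy) = 0.32
print(round(conditional(0.2, 0.32), 3))  # 0.625
```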

Visualizing Posterior Probability
- P(A|B) = P(A, B) / P(B)
- We know that B is true, so we want the area of B where A is also true; we don't care about the area P(¬B).
- (Figure: circles P(A) and P(B); the overlap, as a fraction of B, is P(A|B).)

Bayes' Theorem
- Start from the previous conditional probability equation:
  P(A|B)P(B) = P(A, B)
  P(B|A)P(A) = P(B, A)
  P(A|B)P(B) = P(B|A)P(A)
  P(A|B) = P(B|A)P(A) / P(B)  (important!)
- P(A|B): posterior probability
- P(A): prior probability
- P(B|A): likelihood
- P(B): normalizing constant
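Bayes' Theorem maps directly to a function. A minimal sketch, sanity-checked with the spam numbers from a later slide (the function name is an illustrative choice):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
def bayes(p_b_given_a, p_a, p_b):
    return p_b_given_a * p_a / p_b

# Spam example from a later slide: P(C|F) = 0.9 * 0.2 / 0.4
print(round(bayes(0.9, 0.2, 0.4), 2))  # 0.45
```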

Bayes' Theorem
- Allows us to compute P(A|B) without knowing P(A, B).
- In many real-life situations, P(A|B) cannot be measured directly, but P(B|A) is available.
- Bayes' Theorem underlies all modern probabilistic AI systems.

Visualizing Bayes' Theorem
- P(A|B) = P(B|A)P(A) / P(B)
- We know the portion of the event space where A is true and the portion where B is true, and we know the portion of A where B is also true: P(B|A).
- We want the portion of B where A is also true.
- (Figure: overlapping circles P(A) and P(B), with the overlap labeled P(B|A) relative to A.)

Bayes' Theorem Example #1
- We want to design a classifier (for spam): compute the probability that an item belongs to class C (spam) given that it exhibits feature F (the word "Viagra").
- We know that:
  20% of items in the world belong to class C
  90% of items in class C exhibit feature F
  40% of items in the world exhibit feature F
- P(C|F) = P(F|C) * P(C) / P(F) = 0.9 * 0.2 / 0.4 = 0.45

Bayes' Theorem Example #2
- A drug test returns "positive" if drugs are detected in an athlete's system, but it can make mistakes:
  If an athlete took drugs, 99% chance of a positive test.
  If an athlete didn't take drugs, 10% chance of a positive test.
  5% of athletes take drugs.
- What's the probability that an athlete who tested positive really does take drugs?
- P(drug|+) = P(+|drug) * P(drug) / P(+)
- P(+) = P(+|drug)P(drug) + P(+|nodrug)P(nodrug) = 0.99 * 0.05 + 0.10 * 0.95 = 0.1445
- P(drug|+) = 0.99 * 0.05 / 0.1445 ≈ 0.343
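The same computation in code; note how the normalizing constant P(+) is obtained by summing over the two cases, drug and no drug (variable names are illustrative):

```python
# Drug-test example: Bayes' theorem with a marginalized normalizing constant.
p_drug = 0.05
p_pos_given_drug = 0.99
p_pos_given_nodrug = 0.10

# P(+) = P(+|drug)P(drug) + P(+|nodrug)P(nodrug)
p_pos = p_pos_given_drug * p_drug + p_pos_given_nodrug * (1 - p_drug)

# P(drug|+) = P(+|drug)P(drug) / P(+)
p_drug_given_pos = p_pos_given_drug * p_drug / p_pos
print(round(p_pos, 4))             # 0.1445
print(round(p_drug_given_pos, 4))  # 0.3426
```

Even with a 99%-accurate test, a positive result implies only about a 34% chance of actual drug use, because the 5% base rate is so low.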

Bayes' Theorem
- We computed the normalizing constant using marginalization!
- P(B) = Σi P(B|Ai)P(Ai)

Chain Rule
- Recall that P(A, B) = P(A|B)P(B). This can be extended to multiple variables.
- Three variables: P(A, B, C) = P(A|B, C)P(B, C) = P(A|B, C)P(B|C)P(C)
- General form:
  P(A1, A2, …, An) = P(A1|A2, …, An) P(A2|A3, …, An) … P(An-1|An) P(An)
- Lets us compute the full joint probability distribution; this is simple if the variables are conditionally independent.
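The chain rule can be verified numerically. A sketch over a small hypothetical joint distribution on three binary variables (all the joint values below are invented for illustration; they sum to 1):

```python
# Verify the chain rule P(A,B,C) = P(A|B,C) P(B|C) P(C)
# on a hypothetical joint distribution over three binary variables.
from itertools import product

joint = dict(zip(product([True, False], repeat=3),
                 [0.10, 0.05, 0.20, 0.15, 0.05, 0.15, 0.10, 0.20]))

def marginal(fixed):
    # Sum of joint entries consistent with the fixed variable assignments.
    return sum(p for (a, b, c), p in joint.items()
               if all({"A": a, "B": b, "C": c}[k] == v for k, v in fixed.items()))

p_abc = joint[(True, True, True)]                        # P(A,B,C)
p_a_given_bc = p_abc / marginal({"B": True, "C": True})  # P(A|B,C)
p_b_given_c = marginal({"B": True, "C": True}) / marginal({"C": True})  # P(B|C)
p_c = marginal({"C": True})                              # P(C)

# The product of the conditionals recovers the joint probability.
assert abs(p_a_given_bc * p_b_given_c * p_c - p_abc) < 1e-12
```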

Visualizing Chain Rule
- P(A, B, C) = P(A|B, C)P(B|C)P(C)
- We want the proportion of the event space where A, B, and C are all true: the proportion of (B, C) where A is also true, times the proportion of C where B is also true, times the proportion of the event space where C is true.
- (Figure: nested regions P(C), P(B, C), and P(A, B, C), with P(B|C) and P(A|B, C) as the successive ratios.)

Independence
- Two variables are independent if knowledge of one does not affect the probability of the other:
  P(A|B) = P(A)
  P(B|A) = P(B)
  P(A ∧ B) = P(A)P(B)
- Impact on the chain rule:
  P(A1, A2, …, An) = P(A1)P(A2)…P(An) = Πi=1..n P(Ai)
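Under independence, the full joint collapses to a product of priors. A tiny sketch using the classic coin-flip case (an assumed example, not from the slides):

```python
# For independent variables, P(A1,...,An) = P(A1) * ... * P(An).
# Example: three independent fair coin flips all landing heads.
import math

p_heads = 0.5
p_three_heads = math.prod([p_heads] * 3)
print(p_three_heads)  # 0.125
```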

Conditional Independence
- Independence is hard to satisfy.
- Two variables are conditionally independent given a third if knowledge of one does not affect the probability of the other when the value of the third is known:
  P(A|B, C) = P(A|C)
  P(B|A, C) = P(B|C)
- Impact on the chain rule:
  P(A1, A2, …, An) = P(A1|An)P(A2|An)…P(An-1|An)P(An) = P(An) Πi=1..n-1 P(Ai|An)

Bayes' Theorem Example #3
- We want to design a classifier: compute the probability that an item belongs to class C given that it exhibits features F1 to Fn.
- We know:
  the % of items in the world that belong to class C
  the % of items in class C that exhibit feature Fi
  the % of items in the world that exhibit features F1 to Fn
- P(C|F1, …, Fn) = P(F1, …, Fn|C) * P(C) / P(F1, …, Fn)
- P(F1, …, Fn|C) * P(C) = P(C, F1, …, Fn) by the chain rule
- P(C, F1, …, Fn) = P(C) Πi P(Fi|C), assuming the features are conditionally independent given the class
- P(C|F1, …, Fn) = P(C) Πi P(Fi|C) / P(F1, …, Fn)

Bayes' Theorem Example #3
- P(F1, …, Fn) is independent of the class C, so in multi-class problems it makes no difference to which class scores highest:
  P(C|F1, …, Fn) ∝ P(C) Πi P(Fi|C)
- This is called the Naïve Bayes Classifier: "naïve" because it assumes conditional independence of the Fi given C, whether it's actually true or not.
- Often used in practice in cases where the Fi are not conditionally independent given C, with very good results.
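A minimal Naïve Bayes sketch for the spam setting described in these slides. All the probability values and feature words below are hypothetical illustration values:

```python
# Hypothetical naïve Bayes classifier: pick the class maximizing
# P(C) * prod_i P(Fi | C), ignoring the constant P(F1,...,Fn).
priors = {"spam": 0.2, "ham": 0.8}                   # P(C), assumed values
likelihoods = {                                       # P(Fi | C), assumed values
    "spam": {"viagra": 0.9, "meeting": 0.05},
    "ham":  {"viagra": 0.01, "meeting": 0.6},
}

def naive_bayes_score(cls, features):
    # Score proportional to the posterior P(C | F1,...,Fn).
    score = priors[cls]
    for f in features:
        score *= likelihoods[cls][f]
    return score

features = ["viagra"]
best = max(priors, key=lambda c: naive_bayes_score(c, features))
print(best)  # spam
```

Because P(F1, …, Fn) is the same for every class, skipping it never changes which class wins the comparison.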