CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes March 13, 2012.


Outline
– Probabilistic Reasoning
– Conditional Probability
– Independence
– Bayes' Rule
– “Expert” systems and the combinatorics of joint probabilities
– Bayes' Rule and Conditional Independence
Thanks to Prof. Andrew Moore (CMU) for some slides on probabilistic AI.

Probabilistic Reasoning
Basic element: random variable.
Similar to propositional logic: possible worlds are defined by an assignment of values to random variables.
– Boolean random variables, e.g., Cavity (do I have a cavity?)
– Discrete random variables, e.g., Weather is one of ⟨sunny, rainy, cloudy, snow⟩
Domain values must be exhaustive and mutually exclusive.
Elementary propositions are constructed by assigning a value to a single random variable, e.g., Weather = sunny, Cavity = false (sometimes abbreviated as ¬cavity).
Complex propositions are formed from elementary propositions and standard logical connectives, e.g., Weather = sunny ∨ Cavity = false.

Syntax
Atomic event: a complete specification of the state of the world about which the agent is uncertain.
E.g., if the world consists of only two Boolean variables, Cavity and Toothache, then there are 4 distinct atomic events:
– Cavity = false ∧ Toothache = false
– Cavity = false ∧ Toothache = true
– Cavity = true ∧ Toothache = false
– Cavity = true ∧ Toothache = true
Atomic events are mutually exclusive and exhaustive (often called “outcomes”).
Events in general are sets of atomic events, such as “Cavity = true”.
The textbook uses boldface P to denote a distribution over the set of atomic events: P(Cavity, Toothache) is called the probability distribution.
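To make this concrete, here is a minimal Python sketch (added for illustration, not part of the original notes); the variable names and dictionary layout are arbitrary choices:

```python
# Enumerate the atomic events of a world with two Boolean random variables.
from itertools import product

variables = {"Cavity": [True, False], "Toothache": [True, False]}

atomic_events = [dict(zip(variables, values)) for values in product(*variables.values())]

for event in atomic_events:
    print(event)
# Prints the 4 mutually exclusive, exhaustive outcomes, e.g.
# {'Cavity': True, 'Toothache': True} ... {'Cavity': False, 'Toothache': False}
```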

Axioms of probability
For any propositions A, B:
– 0 ≤ P(A) ≤ 1
– P(true) = 1 and P(false) = 0
– P(A ∨ B) = P(A) + P(B) − P(A ∧ B)

Prior probability
Prior or unconditional probabilities of propositions, e.g., P(Cavity = true) = 0.1 and P(Weather = sunny) = 0.72, correspond to belief prior to the arrival of any (new) evidence.
A probability distribution gives values for all possible assignments:
P(Weather) = ⟨0.72, 0.1, 0.08, 0.1⟩ (sums to 1)
A joint probability distribution for a set of random variables gives the probability of every atomic event on those random variables:
P(Weather, Cavity) = a 4 × 2 matrix of values (rows: Cavity = true / false; columns: Weather = sunny / rainy / cloudy / snow).
Every probability question about a domain can be answered by the joint distribution.
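As a small illustrative sketch (not from the notes), a prior distribution can be stored as a mapping from domain values to probabilities, with the "sums to 1" check made explicit:

```python
# P(Weather) as a dictionary; the values follow the lecture example above.
P_weather = {"sunny": 0.72, "rainy": 0.1, "cloudy": 0.08, "snow": 0.1}

assert abs(sum(P_weather.values()) - 1.0) < 1e-9  # a proper distribution sums to 1
print(P_weather["sunny"])  # prior probability that Weather = sunny -> 0.72
```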

Conditional probability
Conditional or posterior probabilities, e.g., P(cavity | toothache) = 0.8, i.e., given that toothache is all I know.
(Notation for conditional distributions uses boldface: P(Cavity | Toothache) = a 2-element vector of 2-element vectors.)
If we know more, e.g., cavity is also given, then we have P(cavity | toothache, cavity) = 1.
New evidence may be irrelevant, allowing simplification, e.g., cavity does not depend on weather:
P(cavity | toothache, sunny) = P(cavity | toothache) = 0.8
This kind of inference, sanctioned by domain knowledge, is crucial for AI reasoning. How does it relate to what we learned earlier in this class?

Conditional probability
Definition of conditional probability:
P(a | b) = P(a ∧ b) / P(b) if P(b) > 0
The product rule gives an alternative formulation:
P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
A general version holds for whole distributions, e.g.,
P(Weather, Cavity) = P(Weather | Cavity) P(Cavity)
(View as a set of 8 equations: 4 Weather values × 2 Cavity values.)
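A small Python sketch of the definition (added for illustration; the joint numbers are made up, chosen only to be consistent with P(cavity) = 0.1 and P(cavity | toothache) = 0.8 from the slides above):

```python
# P(a | b) = P(a ∧ b) / P(b), computed from a toy joint over (cavity, toothache).
joint = {
    (True, True): 0.04,    # cavity ∧ toothache
    (True, False): 0.06,   # cavity ∧ ¬toothache
    (False, True): 0.01,   # ¬cavity ∧ toothache
    (False, False): 0.89,  # ¬cavity ∧ ¬toothache
}

def P(pred):
    """Probability of the event {atomic events where pred holds}."""
    return sum(p for outcome, p in joint.items() if pred(outcome))

p_toothache = P(lambda o: o[1])
p_cavity_and_toothache = P(lambda o: o[0] and o[1])
print(p_cavity_and_toothache / p_toothache)  # P(cavity | toothache) = 0.04 / 0.05 = 0.8
```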

Conditional probability
The chain rule is derived by successive application of the product rule:
P(X_1, …, X_n) = P(X_1, …, X_{n-1}) P(X_n | X_1, …, X_{n-1})
               = P(X_1, …, X_{n-2}) P(X_{n-1} | X_1, …, X_{n-2}) P(X_n | X_1, …, X_{n-1})
               = …
               = P(X_1) P(X_2 | X_1) P(X_3 | X_1, X_2) … P(X_n | X_1, …, X_{n-1})
or: P(X_1, …, X_n) = Π_{i=1..n} P(X_i | X_1, …, X_{i-1})
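The sketch below (illustrative only, with an arbitrary made-up joint) checks the chain-rule factorization numerically for three Boolean variables:

```python
# Verify P(x1, x2, x3) = P(x1) P(x2 | x1) P(x3 | x1, x2) on a made-up joint.
from itertools import product

weights = [3, 1, 4, 1, 5, 9, 2, 6]  # arbitrary positive numbers
joint = {
    outcome: w / sum(weights)
    for outcome, w in zip(product([False, True], repeat=3), weights)
}

def P(pred):
    return sum(p for o, p in joint.items() if pred(o))

def P_cond(pred_a, pred_b):
    """P(A | B) = P(A ∧ B) / P(B)."""
    return P(lambda o: pred_a(o) and pred_b(o)) / P(pred_b)

for x1, x2, x3 in joint:
    p1 = P(lambda o: o[0] == x1)
    p2 = P_cond(lambda o: o[1] == x2, lambda o: o[0] == x1)
    p3 = P_cond(lambda o: o[2] == x3, lambda o: o[:2] == (x1, x2))
    assert abs(joint[(x1, x2, x3)] - p1 * p2 * p3) < 1e-12
print("chain-rule factorization matches the joint")
```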

Inference by enumeration
Start with the joint probability distribution for Toothache, Catch, and Cavity (the textbook's dentist example):

                 toothache              ¬toothache
             catch     ¬catch       catch     ¬catch
 cavity      0.108     0.012        0.072     0.008
 ¬cavity     0.016     0.064        0.144     0.576

For any proposition φ, sum the atomic events where it is true: P(φ) = Σ_{ω : ω ⊨ φ} P(ω)
P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2

Inference by enumeration
Start with the joint probability distribution above.
Can also compute conditional probabilities:
P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
                       = (0.016 + 0.064) / 0.2 = 0.4
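The following Python sketch (added for illustration) carries out the same enumeration over the joint table shown above:

```python
# Inference by enumeration over the Toothache/Catch/Cavity joint distribution.
# Keys are (toothache, catch, cavity); values are the table entries above.
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def P(phi):
    """P(phi) = sum of the probabilities of the atomic events in which phi holds."""
    return sum(p for omega, p in joint.items() if phi(omega))

p_toothache = P(lambda w: w[0])
p_not_cavity_given_toothache = P(lambda w: w[0] and not w[2]) / p_toothache
print(round(p_toothache, 3), round(p_not_cavity_given_toothache, 3))  # 0.2 0.4
```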

In-class exercise
Given the joint distribution shown below and the definition P(a | b) = P(a ∧ b) / P(b):
– What is P(Cavity = true)?
– What is P(Weather = sunny)?
– What is P(Cavity = true | Weather = sunny)?
Given the meta-equation:
– P(Weather, Cavity) = P(Weather | Cavity) P(Cavity)
What are the 8 equations represented here?

[Joint distribution table: rows Cavity = true / false; columns Weather = sunny, rainy, cloudy, snow]

Independence
A and B are independent iff P(A | B) = P(A) or P(B | A) = P(B) or P(A, B) = P(A) P(B)
P(Toothache, Catch, Cavity, Weather) = P(Toothache, Catch, Cavity) P(Weather)
32 entries reduced to 12; for n independent biased coins, O(2^n) → O(n)
Absolute independence is powerful but rare.
Dentistry is a large field with hundreds of variables, none of which are independent. What to do?
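A short sketch (illustrative; the joint below is made up so that the two variables come out independent) of testing P(A, B) = P(A) P(B):

```python
# Absolute independence: P(A, B) = P(A) P(B) for every combination of values.
joint_ab = {
    (True,  True):  0.06, (True,  False): 0.14,   # P(A = true)  = 0.2
    (False, True):  0.24, (False, False): 0.56,   # P(A = false) = 0.8
}

def marginal(index, value):
    return sum(p for outcome, p in joint_ab.items() if outcome[index] == value)

independent = all(
    abs(p - marginal(0, a) * marginal(1, b)) < 1e-9
    for (a, b), p in joint_ab.items()
)
print(independent)  # True for these numbers
```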

Bayes' Rule
Product rule: P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
⇒ Bayes' rule: P(a | b) = P(b | a) P(a) / P(b)
or in distribution form: P(Y | X) = P(X | Y) P(Y) / P(X) = α P(X | Y) P(Y)
Useful for assessing diagnostic probability from causal probability:
– P(Cause | Effect) = P(Effect | Cause) P(Cause) / P(Effect)
– E.g., let M be meningitis, S be stiff neck:
  P(m | s) = P(s | m) P(m) / P(s) = 0.8 × 0.0001 / 0.1 = 0.0008
– Note: the posterior probability of meningitis is still very small!
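As a quick sketch of the diagnostic direction (added here; P(m) = 0.0001 follows the standard textbook example assumed above):

```python
# P(Cause | Effect) = P(Effect | Cause) P(Cause) / P(Effect)
def bayes(p_effect_given_cause, p_cause, p_effect):
    return p_effect_given_cause * p_cause / p_effect

# Meningitis / stiff-neck example: P(m | s) = 0.8 * 0.0001 / 0.1
print(bayes(p_effect_given_cause=0.8, p_cause=0.0001, p_effect=0.1))  # ≈ 0.0008
```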

Example: Expert Systems for Medical Diagnosis
– 100 diseases (assume only one at a time!)
– 20 symptoms
– How many parameters are needed to calculate P(Di | symptoms) when a patient provides his/her symptoms?
– Strategy to reduce the size: assume independence of all symptoms (given the disease).
– Recalculate the number of parameters needed.
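A back-of-the-envelope count (an illustration added here; it assumes the "independence" in question is conditional independence of the symptoms given the disease, and the exact treatment of normalization constraints is a modelling choice):

```python
# Parameter counts for 100 mutually exclusive diseases and 20 Boolean symptoms.
n_diseases, n_symptoms = 100, 20

# Full joint over (Disease, Symptom_1, ..., Symptom_20): one number per atomic
# event, minus one for the normalization constraint.
full_joint = n_diseases * 2 ** n_symptoms - 1

# Naive Bayes: P(Disease) plus P(Symptom_i | Disease) for every symptom/disease pair.
naive_bayes = (n_diseases - 1) + n_diseases * n_symptoms

print(full_joint, naive_bayes)  # 104857599 2099
```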

Bayes' Rule and conditional independence
P(Cavity | toothache ∧ catch)
  = α P(toothache ∧ catch | Cavity) P(Cavity)
  = α P(toothache | Cavity) P(catch | Cavity) P(Cavity)
This is an example of a naïve Bayes model:
P(Cause, Effect_1, …, Effect_n) = P(Cause) Π_i P(Effect_i | Cause)
The total number of parameters is linear in n.
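A naive-Bayes sketch for P(Cavity | toothache, catch) (added for illustration; the conditionals 0.6, 0.9, 0.1, 0.2 and the prior 0.2 are the values implied by the joint table earlier in these notes):

```python
# Naive Bayes: P(Cause | effects) ∝ P(Cause) * product of P(effect_i | Cause).
p_cavity = 0.2
cpt = {  # P(effect = true | Cavity) for each effect
    "toothache": {True: 0.6, False: 0.1},
    "catch":     {True: 0.9, False: 0.2},
}

def posterior_cavity(evidence):
    """P(Cavity = true | evidence); evidence maps effect name -> observed bool."""
    scores = {}
    for cavity, prior in ((True, p_cavity), (False, 1 - p_cavity)):
        score = prior
        for effect, observed in evidence.items():
            p_true = cpt[effect][cavity]
            score *= p_true if observed else 1 - p_true
        scores[cavity] = score
    return scores[True] / (scores[True] + scores[False])  # the alpha normalization

print(posterior_cavity({"toothache": True, "catch": True}))  # ≈ 0.871
```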

Conditional independence
P(Toothache, Cavity, Catch) has 2^3 − 1 = 7 independent entries.
If I have a cavity, the probability that the probe catches in it doesn't depend on whether I have a toothache:
(1) P(catch | toothache, cavity) = P(catch | cavity)
The same independence holds if I haven't got a cavity:
(2) P(catch | toothache, ¬cavity) = P(catch | ¬cavity)
Catch is conditionally independent of Toothache given Cavity:
P(Catch | Toothache, Cavity) = P(Catch | Cavity)
Equivalent statements:
P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)

Conditional independence contd.
Write out the full joint distribution using the chain rule:
P(Toothache, Catch, Cavity)
  = P(Toothache | Catch, Cavity) P(Catch, Cavity)
  = P(Toothache | Catch, Cavity) P(Catch | Cavity) P(Cavity)
  = P(Toothache | Cavity) P(Catch | Cavity) P(Cavity)
I.e., 2 + 2 + 1 = 5 independent numbers.
In most cases, the use of conditional independence reduces the size of the representation of the joint distribution from exponential in n to linear in n.
Conditional independence is our most basic and robust form of knowledge about uncertain environments.
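The sketch below (illustrative) checks numerically that the joint table used earlier does factor as P(Toothache | Cavity) P(Catch | Cavity) P(Cavity):

```python
# Verify the conditional-independence factorization against the joint table.
joint = {  # (toothache, catch, cavity) -> probability
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def P(pred):
    return sum(p for o, p in joint.items() if pred(o))

for t, c, cav in joint:
    p_cav = P(lambda o: o[2] == cav)
    p_t_given_cav = P(lambda o: o[0] == t and o[2] == cav) / p_cav
    p_c_given_cav = P(lambda o: o[1] == c and o[2] == cav) / p_cav
    assert abs(joint[(t, c, cav)] - p_t_given_cav * p_c_given_cav * p_cav) < 1e-9
print("the joint factors as P(T | Cav) P(C | Cav) P(Cav)")
```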

Bayesian networks
A simple, graphical notation for conditional independence assertions, and hence for compact specification of full joint distributions.
Syntax:
– a set of nodes, one per variable
– a directed, acyclic graph (link ≈ "directly influences")
– a conditional distribution for each node given its parents: P(X_i | Parents(X_i))
In the simplest case, the conditional distribution is represented as a conditional probability table (CPT) giving the distribution over X_i for each combination of parent values.
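A minimal data-structure sketch (added for illustration; the node layout and CPT encoding are arbitrary choices, and the numbers are the toy dentist values used above) of how a Bayesian network pairs each variable with its parents and a CPT:

```python
# Each node stores its parents and a CPT mapping parent-value tuples to P(node = true).
network = {
    "Cavity":    {"parents": [], "cpt": {(): 0.2}},
    "Toothache": {"parents": ["Cavity"], "cpt": {(True,): 0.6, (False,): 0.1}},
    "Catch":     {"parents": ["Cavity"], "cpt": {(True,): 0.9, (False,): 0.2}},
}

def joint_probability(assignment):
    """P(x1, ..., xn) = product over nodes of P(xi | parents(xi))."""
    prob = 1.0
    for var, node in network.items():
        parent_values = tuple(assignment[p] for p in node["parents"])
        p_true = node["cpt"][parent_values]
        prob *= p_true if assignment[var] else 1 - p_true
    return prob

print(joint_probability({"Cavity": True, "Toothache": True, "Catch": True}))  # ≈ 0.108
```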