Ch. 14 – Probabilistic Reasoning Supplemental slides for CSE 327 Prof. Jeff Heflin.

Conditional Independence
If effects E1, E2, …, En are conditionally independent given cause C, then
P(E1, …, En | C) = P(E1|C) P(E2|C) … P(En|C).
This can be used to factor joint distributions:
P(Weather, Cavity, Toothache, Catch)
  = P(Weather) P(Cavity, Toothache, Catch)
  = P(Weather) P(Cavity) P(Toothache|Cavity) P(Catch|Cavity)
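The factoring above can be sketched numerically. The probability values below are hypothetical placeholders for illustration, not numbers from the slides:

```python
# Hypothetical probabilities (placeholders, not from the slides).
p_weather_sunny = 0.6
p_cavity = 0.2
p_toothache_given_cavity = 0.6
p_catch_given_cavity = 0.9

# Conditional independence of Toothache and Catch given Cavity lets the
# full joint entry be computed as a product of small factors:
p_joint = (p_weather_sunny * p_cavity
           * p_toothache_given_cavity * p_catch_given_cavity)
print(p_joint)
```

The payoff is table size: the factors need far fewer stored numbers than the full joint distribution over all four variables.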

Bayes Net Example (from Fig. 14.2, p. 512)

Structure: Burglary → Alarm ← Earthquake; Alarm → JohnCalls; Alarm → MaryCalls

P(B) = 0.001        P(E) = 0.002

P(A | B,E):
  B=T, E=T: 0.95
  B=T, E=F: 0.94
  B=F, E=T: 0.29
  B=F, E=F: 0.001

P(J | A):           P(M | A):
  A=T: 0.90           A=T: 0.70
  A=F: 0.05           A=F: 0.01
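The conditional probability tables of this network can be written down directly; a minimal sketch using plain Python dicts (the variable names are my own, not from the slides):

```python
# CPTs for the burglary network (AIMA Fig. 14.2).
P_B = 0.001                      # prior P(Burglary = true)
P_E = 0.002                      # prior P(Earthquake = true)
P_A = {(True, True): 0.95,       # P(Alarm = true | B, E), keyed by (B, E)
       (True, False): 0.94,
       (False, True): 0.29,
       (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}  # P(JohnCalls = true | A), keyed by A
P_M = {True: 0.70, False: 0.01}  # P(MaryCalls = true | A), keyed by A
```

Only ten numbers are stored, versus the 31 independent entries a full joint table over five Boolean variables would require.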

Global Semantics

Atomic event using the chain rule:
P(b, ¬e, a, j, ¬m) = P(b) P(¬e|b) P(a|b,¬e) P(j|b,¬e,a) P(¬m|b,¬e,a,j)

Atomic event using a Bayesian network:
P(b, ¬e, a, j, ¬m) = P(b) P(¬e) P(a|b,¬e) P(j|a) P(¬m|a)
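The Bayesian-network factoring of this atomic event can be checked with a few multiplications, using the CPT values from the example network:

```python
# P(b, ¬e, a, j, ¬m) = P(b) P(¬e) P(a|b,¬e) P(j|a) P(¬m|a)
p = 0.001 * (1 - 0.002) * 0.94 * 0.90 * (1 - 0.70)
print(p)  # ≈ 0.000253292
```

Each factor is a single CPT lookup: because Alarm's parents are Burglary and Earthquake, and the calls depend only on Alarm, the long chain-rule conditions collapse to conditions on parents alone.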

Local Semantics
Each node is conditionally independent of its non-descendants, given its parents.
[Figure: node X with parents U1, U2; children Y1, Y2; non-descendants Z1, Z2 (the other parents of Y1, Y2).]
P(X | U1, U2, Z1, Z2) = P(X | U1, U2)

Bayes Net Inference

Formula (inference by enumeration): P(X | e) = α Σy P(X, e, y), summing over all values y of the unobserved variables.

Example:
P(b|j,¬m) = α P(b) [P(e) [P(a|b,e) P(j|a) P(¬m|a) + P(¬a|b,e) P(j|¬a) P(¬m|¬a)]
          + P(¬e) [P(a|b,¬e) P(j|a) P(¬m|a) + P(¬a|b,¬e) P(j|¬a) P(¬m|¬a)]]
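The enumeration in the example can be sketched as a nested sum over the hidden variables E and A (CPT values are from the example network; the helper names are my own):

```python
# Unnormalized P(b, j, ¬m): sum over hidden variables E and A.
P_B, P_E = 0.001, 0.002
P_A = {True: 0.95, False: 0.94}      # P(a | B=true, E), keyed by E
P_J = {True: 0.90, False: 0.05}      # P(j | A), keyed by A
P_not_M = {True: 0.30, False: 0.99}  # P(¬m | A), keyed by A

unnorm = 0.0
for e, p_e in ((True, P_E), (False, 1 - P_E)):
    for a in (True, False):
        p_a = P_A[e] if a else 1 - P_A[e]
        unnorm += p_e * p_a * P_J[a] * P_not_M[a]
unnorm *= P_B
print(unnorm)  # ≈ 0.00025677441
```

This mirrors the nested bracket structure of the formula: the outer loop is the sum over E, the inner loop the sum over A, and P(b) factors out front.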

Tree of Inference Calculations
[Figure: enumeration tree; each path multiplies the values along it, and the + nodes sum the branches.]
P(b) = .001
  P(e) = .002
    P(a|b,e) = .95:  P(j|a) = .90, P(¬m|a) = .30
    P(¬a|b,e) = .05: P(j|¬a) = .05, P(¬m|¬a) = .99
  P(¬e) = .998
    P(a|b,¬e) = .94:  P(j|a) = .90, P(¬m|a) = .30
    P(¬a|b,¬e) = .06: P(j|¬a) = .05, P(¬m|¬a) = .99

Calculating P(b|j,¬m) and P(¬b|j,¬m)

P(b|j,¬m) = α P(b)[P(e)[P(a|b,e)P(j|a)P(¬m|a) + P(¬a|b,e)P(j|¬a)P(¬m|¬a)]
          + P(¬e)[P(a|b,¬e)P(j|a)P(¬m|a) + P(¬a|b,¬e)P(j|¬a)P(¬m|¬a)]]
 = α(0.001)[(0.002)[(0.95)(0.9)(0.3) + (0.05)(0.05)(0.99)]
          + (0.998)[(0.94)(0.9)(0.3) + (0.06)(0.05)(0.99)]]
 = α(0.001)[(0.002)[0.2565 + 0.002475] + (0.998)[0.2538 + 0.00297]]
 = α(0.001)[(0.002)(0.258975) + (0.998)(0.25677)]
 = α(0.001)[0.00051795 + 0.25625646]
 = α(0.001)(0.25677441)
 = α(0.00025677441)

P(¬b|j,¬m) = α P(¬b)[P(e)[P(a|¬b,e)P(j|a)P(¬m|a) + P(¬a|¬b,e)P(j|¬a)P(¬m|¬a)]
           + P(¬e)[P(a|¬b,¬e)P(j|a)P(¬m|a) + P(¬a|¬b,¬e)P(j|¬a)P(¬m|¬a)]]
 = α(0.999)[(0.002)[(0.29)(0.9)(0.3) + (0.71)(0.05)(0.99)]
          + (0.998)[(0.001)(0.9)(0.3) + (0.999)(0.05)(0.99)]]
 = α(0.999)[(0.002)[0.0783 + 0.035145] + (0.998)[0.00027 + 0.0494505]]
 = α(0.999)[(0.002)(0.113445) + (0.998)(0.0497205)]
 = α(0.999)[0.00022689 + 0.04962106]
 = α(0.999)(0.04984795)
 = α(0.04979810)

Normalizing the Answer
P(b|j,¬m) = α(0.00025677441)
P(¬b|j,¬m) = α(0.04979810)
α = 1 / (0.00025677441 + 0.04979810)
α = 1 / 0.05005488 ≈ 19.978
P(b|j,¬m) ≈ (19.978)(0.00025677441) ≈ 0.0051
P(¬b|j,¬m) ≈ (19.978)(0.04979810) ≈ 0.9949
P(B|j,¬m) ≈ <0.0051, 0.9949>
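The normalization step is just dividing each unnormalized value by their sum, so the two answers add to 1:

```python
# Normalize the two unnormalized values from the enumeration.
p_b = 0.00025677441      # unnormalized P(b, j, ¬m)
p_not_b = 0.04979810     # unnormalized P(¬b, j, ¬m)
alpha = 1.0 / (p_b + p_not_b)
print(round(alpha * p_b, 4), round(alpha * p_not_b, 4))  # 0.0051 0.9949
```

Note that α never needs P(j,¬m) computed separately; summing the unnormalized values over both values of B yields it automatically.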