PROBABILISTIC REASONING Heng Ji 04/05, 04/08, 2016

Bayesian networks
- More commonly called graphical models
- A way to depict conditional independence relationships between random variables
- A compact specification of full joint distributions

Structure
- Nodes: random variables
  - Can be assigned (observed) or unassigned (unobserved)
- Arcs: interactions
  - An arrow from one variable to another indicates direct influence
  - Encode conditional independence
    - Weather is independent of the other variables
    - Toothache and Catch are conditionally independent given Cavity
- Must form a directed, acyclic graph (DAG)
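A minimal sketch (not from the slides) of how this structure can be written down in Python: the Weather/Cavity/Toothache/Catch example as a map from each node to its parents.

```python
# Parent map for the Weather/Cavity/Toothache/Catch example.
parents = {
    "Weather":   [],          # independent of everything else
    "Cavity":    [],
    "Toothache": ["Cavity"],  # Cavity directly influences Toothache...
    "Catch":     ["Cavity"],  # ...and Catch
}

def is_root(node):
    """A node with no parents carries an unconditional (prior) distribution."""
    return not parents[node]

print(is_root("Weather"))  # True
print(parents["Catch"])    # ['Cavity']
```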

Example: N independent coin flips
- Complete independence: no interactions
- Network: n isolated nodes X1, X2, …, Xn with no arcs
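A small sketch (made-up probabilities, not from the slides) of why independence is compact: the joint is just the product of the marginals, so n numbers describe what would otherwise take 2^n − 1 entries.

```python
p_heads = [0.5, 0.7, 0.9]  # P(Xi = heads) for three biased coins

def joint(outcomes):
    """P(X1=o1, ..., Xn=on) for outcomes given as booleans (True = heads)."""
    result = 1.0
    for p, heads in zip(p_heads, outcomes):
        result *= p if heads else 1 - p
    return result

print(joint([True, True, False]))  # 0.5 * 0.7 * 0.1 = 0.035
```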

Example: Traffic
- Variables:
  - R: It rains
  - T: There is traffic
- Model 1: independence (R and T are separate nodes, no arc)
- Model 2: rain causes traffic (R → T)
- Why is an agent using model 2 better?

Example: Traffic II
Let's build a causal graphical model!
- Variables:
  - T: Traffic
  - R: It rains
  - L: Low pressure
  - D: Roof drips
  - B: Ballgame
  - C: Cavity

Example: Alarm Network
- Variables:
  - B: Burglary
  - A: Alarm goes off
  - M: Mary calls
  - J: John calls
  - E: Earthquake!

Example: Naïve Bayes spam filter
- Random variables:
  - C: message class (spam or not spam)
  - W1, …, Wn: words comprising the message
- Structure: C is the single parent of every Wi (C → W1, …, C → Wn)
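A minimal sketch (with made-up word probabilities, not from the slides) of scoring a message under this network: P(C | w1, …, wn) ∝ P(C) ∏i P(wi | C).

```python
import math

prior = {"spam": 0.4, "ham": 0.6}
word_given_class = {   # P(word appears | class), hypothetical values
    "spam": {"free": 0.30, "meeting": 0.02},
    "ham":  {"free": 0.03, "meeting": 0.20},
}

def posterior(words):
    """Return P(C | words) by normalizing P(C) * prod_i P(wi | C)."""
    scores = {}
    for c in prior:
        log_score = math.log(prior[c])  # logs avoid underflow on long messages
        for w in words:
            log_score += math.log(word_given_class[c][w])
        scores[c] = math.exp(log_score)
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

print(posterior(["free"]))  # spam is more likely given "free" (~0.87)
```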

Example: Burglar Alarm
I have a burglar alarm that is sometimes set off by minor earthquakes. My two neighbors, John and Mary, promised to call me at work if they hear the alarm.
- Example inference task: suppose Mary calls and John doesn't call. What is the probability of a burglary?
- What are the random variables?
  - Burglary, Earthquake, Alarm, JohnCalls, MaryCalls
- What are the direct influence relationships?
  - A burglar can set the alarm off
  - An earthquake can set the alarm off
  - The alarm can cause Mary to call
  - The alarm can cause John to call

Example: Burglar Alarm
What are the model parameters?

Conditional probability distributions
To specify the full joint distribution, we need to specify a conditional distribution for each node given its parents: P(X | Parents(X)).
If X has parents Z1, …, Zn, the node X stores P(X | Z1, …, Zn).
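A minimal sketch (hypothetical numbers) of one way to store P(X | Z1, …, Zn) for Boolean variables: a table keyed by the parents' values.

```python
cpt_x = {   # P(X = true | Z1, Z2), one row per combination of parent values
    (True,  True):  0.9,
    (True,  False): 0.6,
    (False, True):  0.3,
    (False, False): 0.05,
}

def p_x(x, z1, z2):
    """P(X = x | Z1 = z1, Z2 = z2); rows for X = false follow from 1 - p."""
    p_true = cpt_x[(z1, z2)]
    return p_true if x else 1 - p_true

print(p_x(True, z1=False, z2=True))  # 0.3
```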

Example: Burglar Alarm

The joint probability distribution
For each node Xi, we know P(Xi | Parents(Xi)). How do we get the full joint distribution P(X1, …, Xn)? Using the chain rule:
P(X1, …, Xn) = ∏_i P(Xi | Parents(Xi))
For example:
P(j, m, a, ¬b, ¬e) = P(¬b) P(¬e) P(a | ¬b, ¬e) P(j | a) P(m | a)
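Computing the example joint, assuming the standard Russell & Norvig CPT values for this network (the slide's own tables were in a figure that did not survive the transcript):

```python
p_b, p_e = 0.001, 0.002                  # P(Burglary), P(Earthquake)
p_a_given = {(True, True): 0.95, (True, False): 0.94,
             (False, True): 0.29, (False, False): 0.001}
p_j_given_a = {True: 0.90, False: 0.05}  # P(JohnCalls | Alarm)
p_m_given_a = {True: 0.70, False: 0.01}  # P(MaryCalls | Alarm)

# P(j, m, a, ¬b, ¬e) = P(¬b) P(¬e) P(a | ¬b, ¬e) P(j | a) P(m | a)
p = ((1 - p_b) * (1 - p_e) * p_a_given[(False, False)]
     * p_j_given_a[True] * p_m_given_a[True])
print(p)  # ~0.000628
```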

Conditional independence
Key assumption: X is conditionally independent of every non-descendant node given its parents.
Example: causal chain X → Y → Z
- Are X and Z independent? Not in general
- Is Z independent of X given Y? Yes

Conditional independence
Common cause: X ← Y → Z
- Are X and Z independent? No
- Are they conditionally independent given Y? Yes
Common effect: X → Y ← Z
- Are X and Z independent? Yes
- Are they conditionally independent given Y? No (observing the common effect Y makes its causes compete to "explain" it)
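A numeric check of "explaining away" in the common-effect pattern Burglary → Alarm ← Earthquake, using the same assumed Russell & Norvig numbers as in the sketch above:

```python
p_b, p_e = 0.001, 0.002
p_a_given = {(True, True): 0.95, (True, False): 0.94,
             (False, True): 0.29, (False, False): 0.001}

def joint(b, e, a):
    """P(B=b, E=e, A=a) = P(b) P(e) P(a | b, e)."""
    pb = p_b if b else 1 - p_b
    pe = p_e if e else 1 - p_e
    pa = p_a_given[(b, e)] if a else 1 - p_a_given[(b, e)]
    return pb * pe * pa

def p_burglary(earthquake=None):
    """P(B = true | A = true [, E = earthquake]) by enumeration."""
    es = [True, False] if earthquake is None else [earthquake]
    num = sum(joint(True, e, True) for e in es)
    den = sum(joint(b, e, True) for b in (True, False) for e in es)
    return num / den

print(p_burglary())                 # ~0.374: the alarm alone suggests burglary
print(p_burglary(earthquake=True))  # ~0.003: the earthquake explains it away
```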

Compactness
- Suppose we have a Boolean variable Xi with k Boolean parents. How many rows does its conditional probability table have?
  - 2^k rows for all the combinations of parent values
  - Each row requires one number p for Xi = true
- If each variable has no more than k parents, how many numbers does the complete network require?
  - O(n · 2^k) numbers, vs. O(2^n) for the full joint distribution
- How many numbers for the burglary network?
  - 1 + 1 + 4 + 2 + 2 = 10 numbers (vs. 2^5 − 1 = 31)
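The count follows directly from the parent map; a short sketch:

```python
# Each Boolean node with k Boolean parents needs 2**k numbers (one per row).
parents = {"Burglary": [], "Earthquake": [],
           "Alarm": ["Burglary", "Earthquake"],
           "JohnCalls": ["Alarm"], "MaryCalls": ["Alarm"]}

n_params = sum(2 ** len(ps) for ps in parents.values())
print(n_params)               # 1 + 1 + 4 + 2 + 2 = 10
print(2 ** len(parents) - 1)  # 31 for the full joint over 5 Boolean variables
```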

Constructing Bayesian networks
1. Choose an ordering of variables X1, …, Xn
2. For i = 1 to n:
   - add Xi to the network
   - select parents from X1, …, Xi−1 such that P(Xi | Parents(Xi)) = P(Xi | X1, …, Xi−1)
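A sketch of step 2 as a greedy loop. The conditional-independence test `is_cond_independent` is an assumed oracle (in practice it comes from domain knowledge or data); with the ordering M, J, A, B, E this loop would reproduce the parent sets derived on the following slides.

```python
def build_network(ordering, is_cond_independent):
    """For each Xi, keep only those predecessors that are needed as parents:
    drop a predecessor p if Xi is independent of p given the others kept."""
    parents = {}
    for i, x in enumerate(ordering):
        chosen = list(ordering[:i])
        for p in ordering[:i]:
            trial = [q for q in chosen if q != p]
            if is_cond_independent(x, p, given=trial):
                chosen = trial  # p is screened off; not a parent
        parents[x] = chosen
    return parents
```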

Example
Suppose we choose the ordering M, J, A, B, E
P(J | M) = P(J)?

Example
Suppose we choose the ordering M, J, A, B, E
P(J | M) = P(J)? No

Example
Suppose we choose the ordering M, J, A, B, E
P(J | M) = P(J)? No
P(A | J, M) = P(A)?
P(A | J, M) = P(A | J)?
P(A | J, M) = P(A | M)?

Example
Suppose we choose the ordering M, J, A, B, E
P(J | M) = P(J)? No
P(A | J, M) = P(A)? No
P(A | J, M) = P(A | J)? No
P(A | J, M) = P(A | M)? No

Example
Suppose we choose the ordering M, J, A, B, E
P(J | M) = P(J)? No
P(A | J, M) = P(A)? No
P(A | J, M) = P(A | J)? No
P(A | J, M) = P(A | M)? No
P(B | A, J, M) = P(B)?
P(B | A, J, M) = P(B | A)?

Example
Suppose we choose the ordering M, J, A, B, E
P(J | M) = P(J)? No
P(A | J, M) = P(A)? No
P(A | J, M) = P(A | J)? No
P(A | J, M) = P(A | M)? No
P(B | A, J, M) = P(B)? No
P(B | A, J, M) = P(B | A)? Yes

Example
Suppose we choose the ordering M, J, A, B, E
P(J | M) = P(J)? No
P(A | J, M) = P(A)? No
P(A | J, M) = P(A | J)? No
P(A | J, M) = P(A | M)? No
P(B | A, J, M) = P(B)? No
P(B | A, J, M) = P(B | A)? Yes
P(E | B, A, J, M) = P(E)?
P(E | B, A, J, M) = P(E | A, B)?

Example
Suppose we choose the ordering M, J, A, B, E
P(J | M) = P(J)? No
P(A | J, M) = P(A)? No
P(A | J, M) = P(A | J)? No
P(A | J, M) = P(A | M)? No
P(B | A, J, M) = P(B)? No
P(B | A, J, M) = P(B | A)? Yes
P(E | B, A, J, M) = P(E)? No
P(E | B, A, J, M) = P(E | A, B)? Yes

Example contd.
- Deciding conditional independence is hard in noncausal directions
- The causal direction seems much more natural
- Network is less compact: 1 + 2 + 4 + 2 + 4 = 13 numbers needed (vs. 10 for the causal ordering)

A more realistic Bayes Network: Car diagnosis
- Initial observation: car won't start
- Orange: "broken, so fix it" nodes
- Green: testable evidence
- Gray: "hidden variables" that ensure sparse structure and reduce parameters

Car insurance

In research literature…
Causal Protein-Signaling Networks Derived from Multiparameter Single-Cell Data
Karen Sachs, Omar Perez, Dana Pe'er, Douglas A. Lauffenburger, and Garry P. Nolan. Science 308 (5721), 523, 22 April 2005.

In research literature…
Describing Visual Scenes Using Transformed Objects and Parts
E. Sudderth, A. Torralba, W. T. Freeman, and A. Willsky. International Journal of Computer Vision, No. 1-3, May 2008.

Summary
- Bayesian networks provide a natural representation for (causally induced) conditional independence
- Topology + conditional probability tables
- Generally easy for domain experts to construct

Exercise
Compute:
- Prob(D=T)
- Prob(D=F, C=T)
- Prob(A=T | C=T)
- Prob(A=T | D=F)
- Prob(A=T, D=T | B=F)
- Prob(C=T | A=F, E=T)
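The exercise's network over A–E and its CPTs were shown in a figure that did not survive the transcript. As a stand-in, here is a generic inference-by-enumeration helper that can answer every query above once the actual parent map and CPTs are filled in; the two-node network at the end is purely hypothetical, just to show the calling convention.

```python
from itertools import product

def enumerate_query(target, evidence, parents, cpt):
    """P(target | evidence) over Boolean variables. target and evidence are
    dicts like {'A': True}; parents maps node -> parent list; cpt[node] maps
    a tuple of parent values -> P(node = True | parents)."""
    nodes = list(parents)

    def joint(assignment):
        p = 1.0
        for x in nodes:
            p_true = cpt[x][tuple(assignment[u] for u in parents[x])]
            p *= p_true if assignment[x] else 1 - p_true
        return p

    def total(fixed):
        # Sum the joint over every completion of the unfixed (hidden) variables.
        hidden = [x for x in nodes if x not in fixed]
        return sum(joint({**fixed, **dict(zip(hidden, vals))})
                   for vals in product([True, False], repeat=len(hidden)))

    return total({**evidence, **target}) / total(evidence)

# Hypothetical two-node example A -> B, to show usage:
parents = {"A": [], "B": ["A"]}
cpt = {"A": {(): 0.3}, "B": {(True,): 0.8, (False,): 0.1}}
print(enumerate_query({"B": True}, {}, parents, cpt))  # 0.3*0.8 + 0.7*0.1 = 0.31
```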