
Bayesian networks

Motivation We saw that the full joint probability distribution can be used to answer any question about the domain, but it can become intractable as the number of variables grows. Furthermore, specifying probabilities of atomic events is rather unnatural and can be very difficult unless a large amount of data is available from which to gather statistics. Human performance, by contrast, exhibits a different complexity ordering: probabilistic judgments on conditional statements involving a small number of propositions are issued swiftly and reliably, while judging the likelihood of a conjunction of many propositions is done with great difficulty and hesitancy. This suggests that the elementary building blocks of human knowledge are not the entries of a joint probability table but, rather, low-order conditional probabilities defined over small clusters of propositions.

Bayesian networks A simple, graphical notation for conditional independence assertions and hence for compact specification of full joint distributions. Syntax:
–a set of nodes, one per variable
–a directed, acyclic graph (a link means "directly influences")
–a conditional distribution for each node given its parents: P(X_i | Parents(X_i))
The conditional distribution is represented as a conditional probability table (CPT) giving the distribution over X_i for each combination of parent values.
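To make the syntax concrete, here is a minimal sketch (Python) of one way a node and its CPT could be stored; the class and the numeric probabilities are illustrative assumptions, not something specified in the slides.

```python
# A node of a Bayesian network: its name, its parents (the incoming links of the
# DAG), and a CPT mapping each combination of parent values to P(X = True).
class Node:
    def __init__(self, name, parents, cpt):
        self.name = name          # variable name, e.g. "Cavity"
        self.parents = parents    # list of parent Nodes
        self.cpt = cpt            # dict: tuple of parent values -> P(X = True)

    def prob(self, value, parent_values):
        """Return P(X = value | Parents = parent_values)."""
        p_true = self.cpt[tuple(parent_values)]
        return p_true if value else 1.0 - p_true

# Example: a root node and a child with one parent (probabilities are made up).
cavity = Node("Cavity", [], {(): 0.2})                    # P(Cavity) = 0.2
toothache = Node("Toothache", [cavity],
                 {(True,): 0.6,                           # P(Toothache | Cavity)
                  (False,): 0.1})                         # P(Toothache | ¬Cavity)
print(toothache.prob(True, (True,)))                      # 0.6
```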

Example Topology of the network encodes conditional independence assertions: Weather is independent of the other variables; Toothache and Catch are conditionally independent given Cavity, which is indicated by the absence of a link between them.

Another Example I'm at work, neighbor John calls to say my alarm is ringing, but neighbor Mary doesn't call. Sometimes it's set off by minor earthquakes. Is there a burglar? Variables: Burglary, Earthquake, Alarm, JohnCalls, MaryCalls. Network topology reflects "causal" knowledge:
–A burglar can set the alarm off
–An earthquake can set the alarm off
–The alarm can cause Mary to call
–The alarm can cause John to call

Example cont’d The topology shows that burglary and earthquakes directly affect the probability of the alarm, but whether Mary or John calls depends only on the alarm. Thus our assumptions are that they don't perceive any burglaries directly, and they don't confer before calling.

Compactness of Conditional Probability Tables (CPTs) A CPT for a Boolean X_i with k Boolean parents has 2^k rows, one for each combination of parent values. Each row requires one number p for X_i = true (the number for X_i = false is just 1 − p). If each variable has no more than k parents, the complete network requires O(n · 2^k) numbers, i.e. it grows linearly with n, vs. O(2^n) for the full joint distribution. For the burglary net, 1 + 1 + 4 + 2 + 2 = 10 numbers (vs. 2^5 − 1 = 31).
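As a quick check on this arithmetic, here is a small sketch (Python) comparing the two representation sizes; the parent counts are those of the burglary network described on the next slides.

```python
# Independent parameters per Boolean node: 2^(number of parents),
# one probability for each combination of parent values.
num_parents = {"Burglary": 0, "Earthquake": 0, "Alarm": 2, "JohnCalls": 1, "MaryCalls": 1}

network_size = sum(2 ** k for k in num_parents.values())   # 1 + 1 + 4 + 2 + 2 = 10
full_joint_size = 2 ** len(num_parents) - 1                # 2^5 - 1 = 31

print(network_size, full_joint_size)                        # 10 31
```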

Semantics The full joint distribution is defined as the product of the local conditional distributions: P(x_1, …, x_n) = ∏_{i=1..n} P(x_i | parents(X_i)) e.g., P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = P(j | a) P(m | a) P(a | ¬b, ¬e) P(¬b) P(¬e) = …
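Here is a minimal sketch (Python) of that product. The CPT values are the commonly used textbook numbers for the burglary example; they are assumed here, since the slides above do not list the CPTs.

```python
# P(j, m, a, ¬b, ¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
# Assumed (standard textbook) CPT values:
p_b, p_e = 0.001, 0.002        # P(Burglary), P(Earthquake)
p_a_given_nb_ne = 0.001        # P(Alarm | ¬Burglary, ¬Earthquake)
p_j_given_a = 0.90             # P(JohnCalls | Alarm)
p_m_given_a = 0.70             # P(MaryCalls | Alarm)

p = p_j_given_a * p_m_given_a * p_a_given_nb_ne * (1 - p_b) * (1 - p_e)
print(p)                       # ≈ 0.000628
```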

Constructing Bayesian networks
1. Choose an ordering of variables X_1, …, X_n
2. For i = 1 to n:
–add X_i to the network
–select parents from X_1, …, X_{i−1} such that P(X_i | Parents(X_i)) = P(X_i | X_1, …, X_{i−1})
This choice of parents guarantees:
P(X_1, …, X_n) = ∏_{i=1..n} P(X_i | X_1, …, X_{i−1}) (chain rule)
= ∏_{i=1..n} P(X_i | Parents(X_i)) (by construction)
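A minimal sketch (Python) of this construction loop. The is_independent callback stands in for the conditional-independence judgments that in practice come from domain knowledge; it is a hypothetical helper, not part of the slides.

```python
def construct_network(ordering, is_independent):
    """Pick a parent set for each variable following the given ordering.

    is_independent(x, candidate, given) should answer whether
    P(x | given + [candidate]) = P(x | given), i.e. whether the candidate
    predecessor can be dropped from the parent set.
    """
    parents = {}
    for i, x in enumerate(ordering):
        predecessors = ordering[:i]
        chosen = list(predecessors)
        # Greedily drop predecessors that add no information, so that
        # P(X_i | Parents(X_i)) = P(X_i | X_1, ..., X_{i-1}).
        for cand in predecessors:
            rest = [p for p in chosen if p != cand]
            if is_independent(x, cand, rest):
                chosen = rest
        parents[x] = chosen
    return parents
```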

Example The ordering of variables is very important. E.g., suppose we choose the ordering M, J, A, B, E. Adding MaryCalls: no parents. Adding JohnCalls: is P(J | M) = P(J)? That is, is P(John calling) independent of P(Mary calling)? Clearly not, since, on any given day, if Mary called, then the probability that John called is much higher than the background probability that he called. So we add a link from MaryCalls to JohnCalls.

Example Suppose we choose the ordering M, J, A, B, E. Adding the A (Alarm) node: is P(A | J, M) = P(A | J)? Is P(A | J, M) = P(A)? No. Clearly, if both call, it's more likely that the alarm has gone off than if just one or neither calls, so we need both MaryCalls and JohnCalls as parents.

Example Suppose we choose the ordering M, J, A, B, E. Adding the B (Burglary) node: is P(B | A, J, M) = P(B | A)? Is P(B | A, J, M) = P(B)? Yes for the first, no for the second. If we know the alarm state, then a call from John or Mary might give us information about the phone ringing or about Mary's music, but not about a burglary. So we need just Alarm as a parent.

Example Suppose we choose the ordering M, J, A, B, E. Adding the E (Earthquake) node: is P(E | B, A, J, M) = P(E | A)? Is P(E | B, A, J, M) = P(E | A, B)? No for the first, yes for the second. If the alarm is on, it is more likely that there has been an earthquake. But if we know there has been a burglary, then that explains the alarm, and the probability of an earthquake would be only slightly above normal. Hence we need both Alarm and Burglary as parents.

Example cont’d So the network is less compact if we go non-causal: 1 + 2 + 4 + 2 + 4 = 13 numbers are needed, instead of the 10 needed if we go in the causal direction. Deciding conditional independence is hard in non-causal directions. Causal models and conditional independence seem hardwired for humans!

So… Bayesian networks provide a natural representation for (causally induced) conditional independence. Topology + CPTs = compact representation of the joint distribution. Generally easy for domain experts to construct.

Noisy-OR Even if the maximum number of parents k is small, filling in the CPT for a node is tedious. Uncertain relationships can often be characterized by the so-called "noisy-OR" relation, which is a generalization of logical OR. In propositional logic, we might say that Fever is true if Cold, Flu, or Malaria is true. The noisy-OR model allows for uncertainty about the ability of each parent cause to make the child true:
–the causal relationship may be inhibited, and so a patient could have a cold but not exhibit a fever.
So, suppose we managed to find out the inhibition probabilities: P(¬fever | cold, ¬flu, ¬malaria) = 0.6, P(¬fever | ¬cold, flu, ¬malaria) = 0.2, P(¬fever | ¬cold, ¬flu, malaria) = 0.1. Now we can easily construct the full "truth" table (see the sketch after the next slide):

Observe that by using the noisy-OR relation we needed to specify only 3 entries instead of 8. In general, for k parents, we need to specify k entries instead of 2^k.
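A minimal sketch (Python) of how the full 8-row CPT follows from the three inhibition probabilities given above. The noisy-OR assumption is that the inhibitions of the individual causes are independent, so P(¬fever | parents) is the product of the inhibition probabilities of whichever parents are true.

```python
from itertools import product

# Inhibition probabilities P(¬fever | only that cause present), from the slide above.
inhibit = {"cold": 0.6, "flu": 0.2, "malaria": 0.1}

# Noisy-OR: P(¬fever | parent values) = product of inhibitions of the true parents.
for cold, flu, malaria in product([False, True], repeat=3):
    q = 1.0
    for name, present in zip(inhibit, (cold, flu, malaria)):
        if present:
            q *= inhibit[name]
    print(f"cold={cold!s:<5} flu={flu!s:<5} malaria={malaria!s:<5} P(fever) = {1 - q:.3f}")
# e.g. cold=True, flu=True, malaria=False gives P(fever) = 1 - 0.6 * 0.2 = 0.88
```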

Exact Inference in Bayesian Networks The basic task for any probabilistic inference system is to compute the posterior probability for a set of query variables, given some observed event – that is, some assignment of values to a set of evidence variables. Notation:
–X denotes the query variable
–E denotes the set of evidence variables E_1, …, E_m, and e is a particular event, i.e. an assignment to the variables in E
–Y denotes the set of the remaining (hidden) variables
A typical query asks for the posterior probability P(X = x | e), i.e. P(x | e_1, …, e_m). E.g., we could ask: what's the probability of a burglary if both Mary and John call, P(burglary | johncalls, marycalls)?

Inference by enumeration A slightly intelligent way to sum out variables from the joint without actually constructing its explicit representation: P(X | e) = α P(X, e) = α Σ_y P(X, e, y), where each joint term P(X, e, y) is evaluated as a product of CPT entries.

Numerically…
P(b | j, m) = α P(b) Σ_e P(e) Σ_a P(a | b, e) P(j | a) P(m | a) = … = α × ___
P(¬b | j, m) = α P(¬b) Σ_e P(e) Σ_a P(a | ¬b, e) P(j | a) P(m | a) = … = α × ___
P(B | j, m) = α ⟨___, ___⟩ = ⟨___, ___⟩. Complete it for exercise.
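The sketch below (Python) carries out this enumeration for the burglary network. The CPT numbers are the commonly used textbook values, assumed here because the slides do not list them; with those values the query comes out to roughly P(B | j, m) ≈ ⟨0.284, 0.716⟩.

```python
# Assumed (standard textbook) CPTs for the burglary network.
P_B = {True: 0.001, False: 0.999}               # P(Burglary)
P_E = {True: 0.002, False: 0.998}               # P(Earthquake)
P_A = {                                          # P(Alarm = True | Burglary, Earthquake)
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,
}
P_J = {True: 0.90, False: 0.05}                  # P(JohnCalls = True | Alarm)
P_M = {True: 0.70, False: 0.01}                  # P(MaryCalls = True | Alarm)

def unnormalized(b):
    """P(b) * sum_e P(e) * sum_a P(a | b, e) P(j | a) P(m | a), with j = m = True."""
    total = 0.0
    for e in (True, False):
        for a in (True, False):
            p_a = P_A[(b, e)] if a else 1 - P_A[(b, e)]
            total += P_E[e] * p_a * P_J[a] * P_M[a]
    return P_B[b] * total

vals = {b: unnormalized(b) for b in (True, False)}
alpha = 1.0 / sum(vals.values())
print({b: round(alpha * v, 3) for b, v in vals.items()})    # {True: 0.284, False: 0.716}
```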