BAYESIAN NETWORKS

SIGNIFICANCE OF CONDITIONAL INDEPENDENCE
Consider Grade(CS101), Intelligence, and SAT. Ostensibly, the grade in a course doesn't have a direct relationship with SAT scores, but good students are more likely to get good SAT scores, so they are not independent. It is reasonable to believe that Grade(CS101) and SAT are conditionally independent given Intelligence.

BAYESIAN NETWORK
Explicitly represent independence among propositions. Notice that Intelligence is the "cause" of both Grade and SAT, and the causality is represented explicitly.
P(I,G,S) = P(G,S|I) P(I) = P(G|I) P(S|I) P(I)
[Diagram: Intelligence → Grade and Intelligence → SAT, with tables P(I), P(G|I) over grades 'A'/'B'/'C', and P(S|I) over low/high; the 'high' row of P(S|I) reads 0.2 / 0.8]
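To make the factorization concrete, here is a minimal Python sketch of P(I,G,S) = P(G|I) P(S|I) P(I). The CPT values are hypothetical placeholders (only the 0.2/0.8 row of P(S|I) is visible on the slide), chosen solely to illustrate the computation.

```python
# Hypothetical CPTs for the Intelligence/Grade/SAT example; only the
# 0.2/0.8 row of P(S|I) comes from the slide, the rest are placeholders.
P_I = {"low": 0.7, "high": 0.3}
P_G_given_I = {"low":  {"A": 0.2, "B": 0.3, "C": 0.5},
               "high": {"A": 0.7, "B": 0.2, "C": 0.1}}
P_S_given_I = {"low":  {"low": 0.9, "high": 0.1},
               "high": {"low": 0.2, "high": 0.8}}

def joint_igs(i, g, s):
    """P(I=i, G=g, S=s) via the factorization P(I) P(G|I) P(S|I)."""
    return P_I[i] * P_G_given_I[i][g] * P_S_given_I[i][s]

print(joint_igs("high", "A", "high"))  # 0.3 * 0.7 * 0.8 = 0.168
```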

A MORE COMPLEX BN
[Diagram: Burglary and Earthquake (causes) → Alarm → JohnCalls and MaryCalls (effects)]
Directed acyclic graph. Intuitive meaning of an arc from x to y: "x has direct influence on y".

A MORE COMPLEX BN
[Diagram: Burglary, Earthquake → Alarm → JohnCalls, MaryCalls, with the CPTs below (the standard R&N values)]

P(B) = 0.001        P(E) = 0.002

B  E  P(A|B,E)
T  T  0.95
T  F  0.94
F  T  0.29
F  F  0.001

A  P(J|A)        A  P(M|A)
T  0.90          T  0.70
F  0.05          F  0.01

Size of the CPT for a node with k parents: 2^k
10 probabilities, instead of 31

SIGNIFICANCE OF BAYESIAN NETWORKS
If we know that some variables are conditionally independent, we should be able to decompose the joint distribution to take advantage of it. Bayesian networks are a way of efficiently factoring the joint distribution into conditional probabilities, and also of building complex joint distributions from smaller models of probabilistic relationships. But:
What knowledge does the BN encode about the distribution?
How do we use a BN to compute probabilities of variables that we are interested in?

WHAT DOES THE BN ENCODE?
[Burglary network diagram]
Each of the beliefs JohnCalls and MaryCalls is independent of Burglary and Earthquake given Alarm or ¬Alarm. For example, John does not observe any burglaries directly:
P(B ∧ J) ≠ P(B) P(J)
P(B ∧ J | A) = P(B|A) P(J|A)

WHAT DOES THE BN ENCODE?
[Burglary network diagram]
The beliefs JohnCalls and MaryCalls are independent given Alarm or ¬Alarm. For instance, the reasons why John and Mary may not call if there is an alarm are unrelated:
P(B ∧ J | A) = P(B|A) P(J|A)
P(J ∧ M | A) = P(J|A) P(M|A)
A node is independent of its non-descendants given its parents.

WHAT DOES THE BN ENCODE?
[Burglary network diagram]
Burglary and Earthquake are independent. The beliefs JohnCalls and MaryCalls are independent given Alarm or ¬Alarm. For instance, the reasons why John and Mary may not call if there is an alarm are unrelated.
A node is independent of its non-descendants given its parents.

LOCALLY STRUCTURED WORLD
A world is locally structured (or sparse) if each of its components interacts directly with relatively few other components. In a sparse world, the CPTs are small and the BN contains far fewer probabilities than the full joint distribution. If the number of entries in each CPT is bounded by a constant, i.e., O(1), then the number of probabilities in a BN is linear in n, the number of propositions, instead of 2^n for the joint distribution. The sketch below checks this count for the burglary network.
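A short sanity check on the counting argument, assuming boolean variables and counting one probability per CPT row:

```python
# Parameters per node: 2^(number of parents); full joint table: 2^n - 1.
parents = {"Burglary": [], "Earthquake": [],
           "Alarm": ["Burglary", "Earthquake"],
           "JohnCalls": ["Alarm"], "MaryCalls": ["Alarm"]}
bn_count = sum(2 ** len(p) for p in parents.values())  # 1+1+4+2+2 = 10
joint_count = 2 ** len(parents) - 1                    # 2^5 - 1 = 31
print(bn_count, joint_count)  # 10 31
```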

BUT DOES A BN REPRESENT A BELIEF STATE? IN OTHER WORDS, CAN WE COMPUTE THE FULL JOINT DISTRIBUTION OF THE PROPOSITIONS FROM IT?

CALCULATION OF JOINT PROBABILITY
[Burglary network diagram with the CPTs given earlier]
P(J ∧ M ∧ A ∧ ¬B ∧ ¬E) = ??

P(J ∧ M ∧ A ∧ ¬B ∧ ¬E)
= P(J ∧ M | A, ¬B, ¬E) × P(A ∧ ¬B ∧ ¬E)
= P(J | A, ¬B, ¬E) × P(M | A, ¬B, ¬E) × P(A ∧ ¬B ∧ ¬E)   (J and M are independent given A)
P(J | A, ¬B, ¬E) = P(J|A)   (J and B, and J and E, are independent given A)
P(M | A, ¬B, ¬E) = P(M|A)
P(A ∧ ¬B ∧ ¬E) = P(A | ¬B, ¬E) × P(¬B | ¬E) × P(¬E) = P(A | ¬B, ¬E) × P(¬B) × P(¬E)   (B and E are independent)
So P(J ∧ M ∧ A ∧ ¬B ∧ ¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)
[Burglary network diagram]

CALCULATION OF JOINT PROBABILITY
[Burglary network diagram with the CPTs given earlier]
P(J ∧ M ∧ A ∧ ¬B ∧ ¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)
= 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00063
In general:
P(x1 ∧ x2 ∧ … ∧ xn) = Π_{i=1,…,n} P(xi | parents(Xi))  →  the full joint distribution table
Since a BN defines the full joint distribution of a set of propositions, it represents a belief state.
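A minimal Python sketch of this chain-rule factorization for the burglary network, using the CPT values shown earlier (the standard R&N numbers):

```python
# CPTs for the burglary network, stored as P(X=True | parent values).
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}  # P(JohnCalls=True | Alarm)
P_M = {True: 0.70, False: 0.01}  # P(MaryCalls=True | Alarm)

def prob(value, p_true):
    """P(X=value) given P(X=True)."""
    return p_true if value else 1.0 - p_true

def joint(j, m, a, b, e):
    """P(J,M,A,B,E) as the product of each node's CPT entry."""
    return (prob(b, P_B) * prob(e, P_E) * prob(a, P_A[(b, e)])
            * prob(j, P_J[a]) * prob(m, P_M[a]))

print(joint(True, True, True, False, False))  # P(J∧M∧A∧¬B∧¬E) ≈ 0.00063
```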

WHAT DOES THE BN ENCODE?
[Burglary network diagram]
Burglary ⊥ Earthquake
JohnCalls ⊥ MaryCalls | Alarm
JohnCalls ⊥ Burglary | Alarm
JohnCalls ⊥ Earthquake | Alarm
MaryCalls ⊥ Burglary | Alarm
MaryCalls ⊥ Earthquake | Alarm
A node is independent of its non-descendants, given its parents.

READING OFF INDEPENDENCE RELATIONSHIPS
[Burglary network diagram]
How about Burglary ⊥ Earthquake | Alarm? No! Why?

READING OFF INDEPENDENCE RELATIONSHIPS
[Burglary network diagram]
How about Burglary ⊥ Earthquake | Alarm? No! Why?
P(B ∧ E | A) = P(A|B,E) P(B ∧ E) / P(A) = P(A|B,E) P(B) P(E) / P(A) ≠ P(B|A) P(E|A)
Conditioning on a common effect couples its causes ("explaining away").

READING OFF INDEPENDENCE RELATIONSHIPS
[Burglary network diagram]
How about Burglary ⊥ Earthquake | JohnCalls? No! Why? Knowing JohnCalls affects the probability of Alarm, which makes Burglary and Earthquake dependent.

INDEPENDENCE RELATIONSHIPS
Rough intuition (this holds for tree-like graphs, i.e., polytrees):
Evidence on the (directed) road between two variables makes them independent.
Evidence on an "A" node makes descendants independent.
Evidence on a "V" node, or below the V, makes the ancestors of the variables dependent (otherwise they are independent).
Formal property in the general case: d-separation ⇒ independence (see R&N).

BENEFITS OF SPARSE MODELS
Modeling: fewer relationships need to be encoded (either through understanding or statistics), and large networks can be built up from smaller ones.
Intuition: dependencies/independencies between variables can be inferred through network structures.
Tractable probabilistic inference.

PROBABILISTIC INFERENCE IN BAYES NETS

PROBABILISTIC INFERENCE
Probabilistic inference is the following problem:
Given: a belief state P(X1,…,Xn) in some form (e.g., a Bayes net or a joint probability table), a query variable indexed by q, and some subset of evidence variables indexed by e1,…,ek.
Find: P(Xq | Xe1,…, Xek)

PROBABILISTIC INFERENCE WITHOUT EVIDENCE
For the moment we'll assume no evidence variables. Find P(Xq).

PROBABILISTIC INFERENCE WITHOUT EVIDENCE
In a joint probability table, we can find P(Xq) through marginalization. Computational complexity: 2^(n-1) (assuming boolean random variables). In Bayesian networks, we find P(Xq) using top-down inference.

TOP-DOWN INFERENCE
[Burglary network diagram with the CPTs given earlier]
Suppose we want to compute P(Alarm):
1. P(A) = Σ_{b,e} P(A, b, e)
2. P(A) = Σ_{b,e} P(A | b, e) P(b) P(e)
3. P(A) = P(A|B,E)P(B)P(E) + P(A|B,¬E)P(B)P(¬E) + P(A|¬B,E)P(¬B)P(E) + P(A|¬B,¬E)P(¬B)P(¬E)
4. P(A) = 0.95×0.001×0.002 + 0.94×0.001×0.998 + 0.29×0.999×0.002 + 0.001×0.999×0.998 ≈ 0.002516
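The same computation as a sketch, reusing the CPT dictionaries and prob() helper from the earlier joint-probability sketch:

```python
# P(A) = sum over b, e of P(A|b,e) P(b) P(e)  (top-down inference).
P_alarm = sum(prob(b, P_B) * prob(e, P_E) * P_A[(b, e)]
              for b in (True, False) for e in (True, False))
print(P_alarm)  # ≈ 0.002516
```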

TOP-DOWN INFERENCE
[Burglary network diagram with the CPTs given earlier]
Now, suppose we want to compute P(MaryCalls):
1. P(M) = P(M|A)P(A) + P(M|¬A)P(¬A)
2. P(M) = 0.70×0.002516 + 0.01×(1 − 0.002516) ≈ 0.0117
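Continuing the sketch, P(M) follows from P(A) in one step:

```python
# P(M) = P(M|A) P(A) + P(M|¬A) P(¬A), with P_alarm from the previous step.
P_mary = P_M[True] * P_alarm + P_M[False] * (1.0 - P_alarm)
print(P_mary)  # ≈ 0.0117
```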

TOP-DOWN INFERENCE WITH EVIDENCE
[Burglary network diagram with the CPTs given earlier]
Suppose we want to compute P(Alarm | Earthquake), i.e., P(A|e):
1. P(A|e) = Σ_b P(A, b | e)
2. P(A|e) = Σ_b P(A | b, e) P(b)
3. P(A|e) = 0.95×0.001 + 0.29×0.999 ≈ 0.2907
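And the evidence case, again reusing the same dictionaries; conditioning on Earthquake = true just fixes e in the sum:

```python
# P(A | E=true) = sum over b of P(A | b, e=True) P(b).
P_alarm_given_e = sum(prob(b, P_B) * P_A[(b, True)] for b in (True, False))
print(P_alarm_given_e)  # 0.95*0.001 + 0.29*0.999 ≈ 0.2907
```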

TOP-DOWN INFERENCE
Only works if the graph of ancestors of a variable is a polytree, with evidence given on ancestor(s) of the query variable. Efficient: O(d·2^k) time, where d is the number of ancestors of a variable and k is a bound on the number of parents. Evidence on an ancestor cuts off the influence of the portion of the graph above the evidence node.

QUERYING THE BN
The BN gives P(T|C). What about P(C|T)?
[Diagram: Cavity → Toothache, with P(C) = 0.1 and a CPT P(T|C) for C = T/F]

BAYES' RULE
P(A ∧ B) = P(A|B) P(B) = P(B|A) P(A)
So… P(A|B) = P(B|A) P(A) / P(B)

APPLYING BAYES' RULE
Let A be a cause and B be an effect, and let's say we know P(B|A) and P(A) (conditional probability tables). What's P(B)?

APPLYING BAYES' RULE
Let A be a cause and B be an effect, and let's say we know P(B|A) and P(A) (conditional probability tables). What's P(B)?
P(B) = Σ_a P(B, A=a)   [marginalization]
P(B, A=a) = P(B|A=a) P(A=a)   [conditional probability]
So, P(B) = Σ_a P(B | A=a) P(A=a)

APPLYING BAYES' RULE
Let A be a cause and B be an effect, and let's say we know P(B|A) and P(A) (conditional probability tables). What's P(A|B)?

APPLYING BAYES' RULE
Let A be a cause and B be an effect, and let's say we know P(B|A) and P(A) (conditional probability tables). What's P(A|B)?
P(A|B) = P(B|A) P(A) / P(B)   [Bayes' rule]
P(B) = Σ_a P(B | A=a) P(A=a)   [last slide]
So, P(A|B) = P(B|A) P(A) / [Σ_a P(B | A=a) P(A=a)]
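A small sketch of this recipe for a discrete cause A; the prior and likelihood values are hypothetical placeholders:

```python
# Posterior P(A=a | B=True) = P(B|a) P(a) / sum over a' of P(B|a') P(a').
P_prior = {"a1": 0.3, "a2": 0.7}   # hypothetical P(A=a)
P_lik = {"a1": 0.9, "a2": 0.2}     # hypothetical P(B=True | A=a)

z = sum(P_lik[a] * P_prior[a] for a in P_prior)   # P(B=True) = 0.41
posterior = {a: P_lik[a] * P_prior[a] / z for a in P_prior}
print(posterior)  # {'a1': ~0.659, 'a2': ~0.341}
```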

HOW DO WE READ THIS?
P(A|B) = P(B|A) P(A) / [Σ_a P(B | A=a) P(A=a)]
[An equation that holds for all values A can take on, and all values B can take on]
P(A=a|B=b) = P(B=b|A=a) P(A=a) / [Σ_a P(B=b | A=a) P(A=a)]
Are these the same a? NO!
P(A=a|B=b) = P(B=b|A=a) P(A=a) / [Σ_{a'} P(B=b | A=a') P(A=a')]
Be careful about indices!

QUERYING THE BN
The BN gives P(T|C). What about P(C|T)?
[Diagram: Cavity → Toothache, with P(C) = 0.1 and a CPT P(T|C)]
P(Cavity|Toothache) = P(Toothache|Cavity) P(Cavity) / P(Toothache)   [Bayes' rule]
The denominator is computed by summing out the numerator over Cavity and ¬Cavity.
Querying a BN is just applying Bayes' rule on a larger scale…
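A worked sketch of this query; P(C) = 0.1 is from the slide, while the P(T|C) entries are hypothetical placeholders since the slide's CPT values did not survive extraction:

```python
P_C = 0.1
P_T_given_C = {True: 0.6, False: 0.05}  # hypothetical P(T=True | C)

numerator = P_T_given_C[True] * P_C                  # P(T|C) P(C)
P_T = numerator + P_T_given_C[False] * (1.0 - P_C)   # sum out Cavity
print(numerator / P_T)  # P(C|T) = 0.06 / 0.105 ≈ 0.571
```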

NAÏVE BAYES MODELS
P(Cause, Effect1, …, Effectn) = P(Cause) Π_i P(Effecti | Cause)
[Diagram: Cause → Effect1, Effect2, …, Effectn]

NAÏVE BAYES CLASSIFIER
P(Class, Feature1, …, Featuren) = P(Class) Π_i P(Featurei | Class)
[Diagram: Class → Feature1, Feature2, …, Featuren]
P(C | F1, …, Fk) = P(C, F1, …, Fk) / P(F1, …, Fk) = (1/Z) P(C) Π_i P(Fi|C)
Given features, what class? E.g., Spam / Not Spam, or English / French / Latin, with word occurrences as features.
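A minimal naive Bayes classifier sketch along these lines, with hypothetical word probabilities; the class score is computed in log space to avoid underflow with many features:

```python
import math

P_class = {"spam": 0.4, "ham": 0.6}                  # hypothetical priors
P_word = {"spam": {"offer": 0.30, "meeting": 0.02},  # hypothetical P(Fi|C)
          "ham":  {"offer": 0.03, "meeting": 0.20}}

def classify(words):
    """Return argmax over C of log P(C) + sum_i log P(Fi | C)."""
    scores = {c: math.log(P_class[c]) +
                 sum(math.log(P_word[c][w]) for w in words)
              for c in P_class}
    return max(scores, key=scores.get)

print(classify(["offer"]))    # -> spam
print(classify(["meeting"]))  # -> ham
```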

MORE COMPLEX NETWORKS
Two limitations of the current discussion: "tree-like" networks, and evidence at specific locations.
Next time: tractable exact inference without "loops" (with loops, inference is NP-hard; compare to CSPs), and approximate inference for general networks.

SOME APPLICATIONS OF BN
Medical diagnosis
Troubleshooting of hardware/software systems
Fraud/uncollectible debt detection
Data mining
Analysis of genetic sequences
Data interpretation, computer vision, image understanding

MORE COMPLICATED SINGLY-CONNECTED BELIEF NET
[Diagram: a car-diagnosis network over Battery, Radio, SparkPlugs, Gas, Starts, Moves]

[Diagram: an image-understanding network; regions R1–R4 related by "Above", each with Region ∈ {Sky, Tree, Grass, Rock}]

HOMEWORK
Read R&N