Conditional Probability, Bayes’ Theorem, and Belief Networks
CISC 2315 Discrete Structures, Spring 2010
Professor William G. Tanner, Jr.

Conditional probability
P(A|B) denotes the conditional probability of A given that all we know is B. Once we receive evidence concerning a proposition, prior probabilities no longer apply; we must assess the conditional probability of that proposition given all the available evidence. For example, in the Venn diagram on the slide [Figure: regions A, B, and ~B; not reproduced here], P(A) = 0.25, P(B) = 0.5, P(A & B) = 0.25, and so P(A|B) = P(A & B)/P(B) = 0.5.
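A minimal Python sketch (illustrative only, using the numbers from the example above) showing how the conditional probability is obtained from the joint:

    # Probabilities from the Venn-diagram example above
    p_a_and_b = 0.25   # P(A & B)
    p_b = 0.5          # P(B)

    # Definition of conditional probability: P(A|B) = P(A & B) / P(B)
    p_a_given_b = p_a_and_b / p_b
    print(p_a_given_b)  # 0.5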

Examples of conditional probabilities
P(Cavity=true | Toothache=true) = 0.8 is a conditional probability statement. We could also have more evidence, e.g., P(Cavity | Toothache, Earthquake). Such evidence may be irrelevant: P(Cavity | Toothache, Earthquake) = P(Cavity | Toothache) = 0.8. Also, P(Cavity | Toothache, Cavity) = 1.

Bayes’ Rule
Here is the derivation of the rule. By the definition of conditional probability, P(A & B) = P(A|B) P(B), and symmetrically P(A & B) = P(B|A) P(A). Equating the two products and dividing by P(B) gives Bayes’ Rule:
P(A|B) = P(B|A) P(A) / P(B)
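As a sanity check, here is a small Python sketch (using the same illustrative Venn-diagram numbers as before) verifying that the rule agrees with the direct computation:

    p_a, p_b, p_a_and_b = 0.25, 0.5, 0.25

    p_b_given_a = p_a_and_b / p_a                  # P(B|A) = 1.0
    p_a_given_b_bayes = p_b_given_a * p_a / p_b    # via Bayes' Rule
    p_a_given_b_direct = p_a_and_b / p_b           # via the definition

    print(p_a_given_b_bayes, p_a_given_b_direct)   # 0.5 0.5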

The significance of Bayes’ Rule
Bayes’ Rule underlies many probabilistic reasoning systems in artificial intelligence (AI). It is useful because, in practice, we often know the probabilities on the right-hand side of the rule and wish to estimate the probability on the left.

Example of the use of Bayes’ Rule
Bayes’ Rule is particularly useful for assessing disease hypotheses from symptoms:
P(Cause|Effect) = P(Effect|Cause) P(Cause) / P(Effect)
From knowledge of conditional probabilities on causal relationships in medicine, we can derive probabilities of diagnoses. Let S be the proposition that a patient has a stiff neck, and M the proposition that the patient has meningitis, and suppose we want to know P(M|S). Given
P(S|M) = 0.5
P(M) = 1/50000
P(S) = 1/20
Bayes’ Rule yields P(M|S) = P(S|M) P(M) / P(S) = 0.0002.
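The same calculation as a short Python sketch (numbers taken from the slide above):

    p_s_given_m = 0.5       # P(S|M): stiff neck given meningitis
    p_m = 1 / 50000         # P(M): prior probability of meningitis
    p_s = 1 / 20            # P(S): prior probability of a stiff neck

    # Bayes' Rule: P(M|S) = P(S|M) P(M) / P(S)
    p_m_given_s = p_s_given_m * p_m / p_s
    print(p_m_given_s)      # 0.0002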

Belief networks
Belief networks represent dependence between variables. A belief network B = (V, E) is a directed acyclic graph with nodes V and directed edges E, where:
- Each node in V corresponds to a random variable.
- There is a directed edge from node X to node Y if variable X has a direct influence on variable Y.
- Each node in V has an associated conditional probability table (CPT). The CPT specifies the conditional distribution of the node given its parents, i.e., P(Xi | Parents(Xi)). The parents of a node are all those nodes that have arrows pointing to it.

A simple belief network
[Figure: a five-node network. Burglary and Earthquake each have a directed edge into Alarm; Alarm has directed edges into JohnCalls and MaryCalls.]

A simple belief net with CPTs
The same network, annotated with conditional probability tables:
Burglary: P(B) = 0.001
Earthquake: P(E) = 0.002
Alarm: P(A|B,E) = 0.95; P(A|B, not E) = 0.94; P(A|not B, E) = 0.29; P(A|not B, not E) = 0.001
JohnCalls: P(J|A) = 0.9; P(J|not A) = 0.05
MaryCalls: P(M|A) = 0.7; P(M|not A) = 0.01
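One way to encode this network in plain Python (a sketch under the usual belief-net semantics, not any particular library's API): store each CPT as a dictionary keyed by the parents' truth values, and compute any entry of the full joint distribution as the product of the local conditionals.

    # CPTs from the slide, keyed by parent values
    P_B = 0.001
    P_E = 0.002
    P_A = {(True, True): 0.95, (True, False): 0.94,
           (False, True): 0.29, (False, False): 0.001}  # P(A | B, E)
    P_J = {True: 0.90, False: 0.05}                     # P(J | A)
    P_M = {True: 0.70, False: 0.01}                     # P(M | A)

    def joint(b, e, a, j, m):
        """P(B=b, E=e, A=a, J=j, M=m) as a product of local conditionals."""
        p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
        p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
        p *= P_J[a] if j else 1 - P_J[a]
        p *= P_M[a] if m else 1 - P_M[a]
        return p

    # e.g., P(no burglary, no earthquake, alarm sounds, both neighbors call)
    print(joint(False, False, True, True, True))  # about 0.00063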

An Example
Given the network above, we can compute quantities such as P(Alarm|Burglary) using probabilistic inference (taught in the AI class).
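For this particular query the inference is short enough to do by hand: Earthquake is independent of Burglary in this network, so conditioning on Burglary = true we only need to sum Earthquake out of Alarm's CPT. A self-contained Python sketch using the slide's numbers:

    # P(A | B=true) = sum over e of P(A | B=true, E=e) P(E=e)
    p_e = 0.002
    p_a_given_b_and_e = 0.95      # P(A | B, E)
    p_a_given_b_not_e = 0.94      # P(A | B, not E)

    p_a_given_b = p_a_given_b_and_e * p_e + p_a_given_b_not_e * (1 - p_e)
    print(p_a_given_b)  # 0.94002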

A Successful Belief Net Application
PATHFINDER is a diagnostic expert system for lymph-node diseases, built by the Stanford Medical Computer Science program in the 1980s. The system deals with over 60 diseases. Four versions have been built, and PATHFINDER IV uses a belief network. PATHFINDER IV was tested on 53 actual cases of patients referred to a lymph-node specialist, and it scored highly. A recent comparison between medical experts and PATHFINDER IV shows the system outperforming the experts, some of whom are among the world’s leading pathologists, and some of whom were consulted to build the system in the first place!