Reasoning Under Uncertainty: Belief Networks. Computer Science CPSC 322, Lecture 27 (Textbook Chpt 6.3), Nov 13, 2013.


Slide 2: Big Picture: R&R Systems
Representation and reasoning technique, by problem type and environment:
- Static (Query), Deterministic: Constraint Satisfaction (Vars + Constraints; reasoning: Search, Arc Consistency, SLS); Logics (reasoning: Search)
- Static (Query), Stochastic: Belief Nets (reasoning: Var. Elimination)
- Sequential (Planning), Deterministic: STRIPS (reasoning: Search)
- Sequential (Planning), Stochastic: Decision Nets (reasoning: Var. Elimination); Markov Processes (reasoning: Value Iteration)

Slide 3: Key Points Recap
We model the environment as a set of ….
Why is the joint not an adequate representation? "Representation, reasoning and learning" are "exponential" in ….
Solution: exploit marginal & conditional independence.
But how does independence allow us to simplify the joint?

Slide 4: Lecture Overview
Belief Networks
- Build sample BN
- Intro: Inference, Compactness, Semantics
- More Examples

Slide 5: Belief Nets: Burglary Example
There might be a burglar in my house.
The anti-burglar alarm in my house may go off.
I have an agreement with two of my neighbors, John and Mary, that they call me if they hear the alarm go off while I am at work.
Minor earthquakes may occur and sometimes they set off the alarm.
Variables: …
The joint has … entries/probs.

Slide 6: Belief Nets: Simplify the Joint
Typically order vars to reflect causal knowledge (i.e., causes before effects):
- A burglar (B) can set the alarm (A) off
- An earthquake (E) can set the alarm (A) off
- The alarm can cause Mary to call (M)
- The alarm can cause John to call (J)
Apply the Chain Rule, then simplify according to marginal & conditional independence.
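With the causal ordering above, the simplified joint factors as P(B, E, A, J, M) = P(B) P(E) P(A | B, E) P(J | A) P(M | A). A minimal sketch of this factorization, using the CPT values from Russell & Norvig's version of the example (an assumption; the lecture's numbers may differ):

```python
# Burglary network, factored form. CPT values follow Russell & Norvig's
# version of this example (an assumption, not necessarily the lecture's).
P_B = {True: 0.001, False: 0.999}   # P(Burglary)
P_E = {True: 0.002, False: 0.998}   # P(Earthquake)
P_A = {  # P(A=True | B, E); the A=False entry is the complement
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,
}
P_J = {True: 0.90, False: 0.05}     # P(J=True | A)
P_M = {True: 0.70, False: 0.01}     # P(M=True | A)

def joint(b, e, a, j, m):
    """P(B=b, E=e, A=a, J=j, M=m) via the factored form."""
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pj = P_J[a] if j else 1 - P_J[a]
    pm = P_M[a] if m else 1 - P_M[a]
    return P_B[b] * P_E[e] * pa * pj * pm

# e.g. P(john calls, mary calls, alarm, no burglary, no earthquake)
print(joint(False, False, True, True, True))
```

Five small factors replace a 32-entry joint table, yet any joint entry can still be recovered by multiplying them out.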

Slide 7: Belief Nets: Structure + Probs
Express remaining dependencies as a network:
- Each var is a node
- For each var, the conditioning vars are its parents
- Associate to each node the corresponding conditional probabilities
The result is a Directed Acyclic Graph (DAG).

Slide 8: Burglary: Complete BN

P(B=T)  P(B=F)            P(E=T)  P(E=F)
  …       …                 …       …

B  E    P(A=T | B,E)   P(A=F | B,E)
T  T         …              …
T  F         …              …
F  T         …              …
F  F         …              …

A   P(J=T | A)   P(J=F | A)
T        …            …
F        …            …

A   P(M=T | A)   P(M=F | A)
T        …            …
F       .01          .99

Slide 9: Lecture Overview
Belief Networks
- Build sample BN
- Intro: Inference, Compactness, Semantics
- More Examples

Slide 10: Burglary Example: BNets Inference
(Ex1) I'm at work; neighbor John calls to say my alarm is ringing; neighbor Mary doesn't call. No news of any earthquakes. Is there a burglar?
(Ex2) I'm at work; I receive a message that neighbor John called; there is news of minor earthquakes. Is there a burglar?
Our BN can answer any probabilistic query that can be answered by processing the joint!
Set decimal places to monitor to 5.

Slide 11: Burglary Example: BNets Inference
(Ex1) I'm at work; neighbor John calls to say my alarm is ringing; neighbor Mary doesn't call. No news of any earthquakes. Is there a burglar?
Our BN can answer any probabilistic query that can be answered by processing the joint!
The probability of Burglary will:
A. Go down
B. Remain the same
C. Go up

Slide 12: Bayesian Networks: Inference Types
- Diagnostic (from effect to cause): observe JohnCalls = T; query P(Burglary | J)
- Predictive (from cause to effect): observe Burglary = T; query P(JohnCalls | B)
- Intercausal (explaining away): observe Alarm = T and Earthquake = T; query P(Burglary | A, E)
- Mixed (diagnostic + predictive combined): observe a neighbor's call = T and Earthquake = F; query P(Alarm | call, not E)
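These query types can be checked by brute-force enumeration over the full joint, which is feasible here (2^5 = 32 worlds). A sketch, again assuming the CPT values from Russell & Norvig's version of the example:

```python
# Brute-force posterior computation for the burglary network, illustrating
# the four inference types. CPT values follow Russell & Norvig's version of
# this example (an assumption).
from itertools import product

P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def joint(b, e, a, j, m):
    return (P_B[b] * P_E[e]
            * (P_A[(b, e)] if a else 1 - P_A[(b, e)])
            * (P_J[a] if j else 1 - P_J[a])
            * (P_M[a] if m else 1 - P_M[a]))

VARS = "BEAJM"

def query(target, **evidence):
    """P(target=True | evidence), by summing consistent joint entries."""
    num = den = 0.0
    for vals in product([True, False], repeat=5):
        world = dict(zip(VARS, vals))
        if any(world[v] != val for v, val in evidence.items()):
            continue
        p = joint(*vals)
        den += p
        if world[target]:
            num += p
    return num / den

print(query("B"))                  # prior P(B) = 0.001
print(query("B", J=True))          # diagnostic: effect -> cause
print(query("J", B=True))          # predictive: cause -> effect
print(query("B", A=True))          # diagnosing burglary from the alarm alone
print(query("B", A=True, E=True))  # intercausal: earthquake explains away the alarm
```

Note how the last two queries differ: once the earthquake is observed, it "explains away" the alarm and the posterior on Burglary drops sharply.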

Slide 13: BNets: Compactness
(Repeats the Burglary CPTs from Slide 8 as a lead-in to the counting argument on the next slide.)

Slide 14: BNets: Compactness
In general:
- A CPT for a boolean Xi with k boolean parents has … rows for the combinations of parent values.
- Each row requires one number pi for Xi = true (the number for Xi = false is just 1 - pi).
- If each variable has no more than k parents, the complete network requires O( … ) numbers.
- For k << n, this is a substantial improvement: the numbers required grow linearly with n, vs. O(2^n) for the full joint distribution.
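The counting argument above can be checked directly for the burglary network (parents per node: B and E have none, A has two, J and M have one each). A minimal sketch:

```python
# Parameter-count comparison: a CPT for a boolean variable with k boolean
# parents has 2**k rows (one free number per row), so n such variables need
# at most n * 2**k numbers, versus 2**n - 1 for the full joint.
def bn_numbers(n, k):
    return n * 2 ** k

def joint_numbers(n):
    return 2 ** n - 1

# Burglary example: parent counts are B:0, E:0, A:2, J:1, M:1
burglary = sum(2 ** k for k in [0, 0, 2, 1, 1])
print(burglary)           # 10 numbers in the BN
print(joint_numbers(5))   # 31 for the full joint over 5 boolean vars
print(bn_numbers(30, 5))  # linear growth in n ...
print(joint_numbers(30))  # ... vs exponential for the joint
```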

Slide 15: BNets Construction: General Semantics
The full joint distribution can be defined as the product of conditional distributions (chain rule):
  P(X1, …, Xn) = P(X1) P(X2 | X1) … P(Xn | X1, …, Xn-1)
Simplify according to marginal & conditional independence.
Express remaining dependencies as a network:
- Each var is a node
- For each var, the conditioning vars are its parents
- Associate to each node the corresponding conditional probabilities
  P(X1, …, Xn) = Πi P(Xi | Parents(Xi))

Slide 16: BNets Construction: General Semantics (cont'd)
  P(X1, …, Xn) = Πi P(Xi | Parents(Xi))
Every node is independent of its non-descendants given its parents.

Slide 17: Lecture Overview
Belief Networks
- Build sample BN
- Intro: Inference, Compactness, Semantics
- More Examples

Slide 18: Other Examples: Fire Diagnosis (textbook Ex. 6.10)
Suppose you want to diagnose whether there is a fire in a building:
- You receive a noisy report about whether everyone is leaving the building.
- If everyone is leaving, this may have been caused by a fire alarm.
- If there is a fire alarm, it may have been caused by a fire or by tampering.
- If there is a fire, there may be smoke rising from the building.
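The dependencies listed above can be sketched as a parent map. This is a minimal sketch; the variable names are my own labels for the quantities in the text:

```python
# Fire Diagnosis structure as described above, as a parent map.
# Variable names are illustrative labels, not taken from the textbook verbatim.
parents = {
    "Tampering": [],
    "Fire": [],
    "Alarm": ["Tampering", "Fire"],  # fire or tampering can cause the alarm
    "Smoke": ["Fire"],               # fire may produce smoke
    "Leaving": ["Alarm"],            # the alarm may cause everyone to leave
    "Report": ["Leaving"],           # noisy report that people are leaving
}

# Per the compactness argument: numbers needed if all variables are boolean
numbers = sum(2 ** len(ps) for ps in parents.values())
print(numbers)  # vs 2**6 - 1 = 63 for the full joint
```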

Slide 19: Other Examples (cont'd)
- Make sure you explore and understand the Fire Diagnosis example (we'll expand on it to study Decision Networks).
- Electrical Circuit example (textbook Ex. 6.11)
- Patient's wheezing and coughing example (Ex. 6.14)
- Several other examples on

Slide 20: Realistic BNet: Liver Diagnosis [network figure] (Source: Onisko et al., 1999)


Slide 22: Realistic BNet: Liver Diagnosis (Source: Onisko et al., 1999)
Assuming there are ~60 nodes in this BNet and that they are all binary, how many numbers are required for the JPD vs. the BNet?
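A back-of-envelope answer, assuming every node is binary and (as an illustration only; the real network's in-degrees vary) that no node has more than 4 parents:

```python
# ~60 binary nodes: full joint vs. belief network parameter counts.
# k = 4 is an illustrative bound on the number of parents, not a fact
# about the liver-diagnosis network.
n, k = 60, 4
print(2 ** n - 1)   # full joint: on the order of 10**18 numbers
print(n * 2 ** k)   # BNet upper bound: 960 numbers
```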

Slide 23: Answering Queries under Uncertainty
Probability Theory
- Static: Belief Networks & Variable Elimination
- Dynamic: Dynamic Bayesian Networks; Hidden Markov Models
Applications: spam filters, diagnostic systems (e.g., medicine), natural language processing, student tracing in tutoring systems, monitoring (e.g., credit cards)

Slide 24: Learning Goals for Today's Class
You can:
- Build a Belief Network for a simple domain
- Classify the types of inference
- Compute the representational saving in terms of the number of probabilities required

Slide 25: Next Class (Wednesday!)
Bayesian Networks Representation:
- Additional dependencies encoded by BNets
- More compact representations for CPTs
- A very simple but extremely useful BNet (the Bayes Classifier)

Slide 26: Belief Network Summary
- A belief network is a directed acyclic graph (DAG) that effectively expresses independence assertions among random variables.
- The parents of a node X are those variables on which X directly depends.
- Consideration of causal dependencies among variables typically helps in constructing a BNet.