Artificial Intelligence Chapter 19 Reasoning with Uncertain Information Biointelligence Lab School of Computer Sci. & Eng. Seoul National University.


(C) SNU CSE Biointelligence Lab

Outline
Review of Probability Theory
Probabilistic Inference
Bayes Networks
Patterns of Inference in Bayes Networks
Uncertain Evidence
D-Separation
Probabilistic Inference in Polytrees

19.1 Review of Probability Theory (1/4)
Random variables
Joint probability
Ex. Four Boolean random variables: B (BAT_OK), M (MOVES), L (LIFTABLE), G (GAUGE). The joint distribution assigns a probability to each of the 16 truth assignments:
(True, True, True, True)
(True, True, True, False)
(True, True, False, True)
(True, True, False, False)
…

19.1 Review of Probability Theory (2/4)
Marginal probability
Conditional probability
Ex. The probability that the battery is charged given that the arm does not move: p(B | ¬M).
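A standard way to write these two notions for the four-variable example above:

```latex
\begin{align*}
  % marginal probability: sum the joint over the unmentioned variables
  p(B) &= \sum_{m,\,l,\,g} p(B,\, M{=}m,\, L{=}l,\, G{=}g) \\
  % conditional probability (defined when p(\lnot M) > 0)
  p(B \mid \lnot M) &= \frac{p(B, \lnot M)}{p(\lnot M)}
\end{align*}
```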

19.1 Review of Probability Theory (3/4)
Figure 19.1: A Venn diagram.

19.1 Review of Probability Theory (4/4)
Chain rule
Bayes' rule
Set notation: p(𝒱) is an abbreviation for the joint probability of the variables in the set, where 𝒱 = {V1, …, Vk}.
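The two rules named above, in their standard form:

```latex
\begin{align*}
  % chain rule: any joint distribution factors into a product of conditionals
  p(V_1, \ldots, V_k) &= \prod_{i=1}^{k} p(V_i \mid V_{i-1}, \ldots, V_1) \\
  % Bayes' rule: reverses the direction of conditioning
  p(V_i \mid V_j) &= \frac{p(V_j \mid V_i)\, p(V_i)}{p(V_j)}
\end{align*}
```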

19.2 Probabilistic Inference
We want to calculate the probability that some variable Vi has value vi given the evidence ε = e.
Example joint distribution:
p(P, Q, R) = 0.3
p(P, Q, ¬R) = 0.2
p(P, ¬Q, R) = 0.2
p(P, ¬Q, ¬R) = 0.1
p(¬P, Q, R) = 0.05
p(¬P, Q, ¬R) = 0.1
p(¬P, ¬Q, R) = 0.05
p(¬P, ¬Q, ¬R) = 0.0
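With the whole joint distribution in hand, any marginal or conditional probability can be computed by summing table entries. A minimal sketch in Python (the variable and function names are ours, not from the text):

```python
# Exhaustive joint distribution over three Boolean variables P, Q, R,
# using the table above. Keys are (P, Q, R) truth assignments.
joint = {
    (True,  True,  True):  0.3,
    (True,  True,  False): 0.2,
    (True,  False, True):  0.2,
    (True,  False, False): 0.1,
    (False, True,  True):  0.05,
    (False, True,  False): 0.1,
    (False, False, True):  0.05,
    (False, False, False): 0.0,
}

def marginal(var_index, value):
    """p(V = value): sum the joint over the other variables."""
    return sum(p for assign, p in joint.items() if assign[var_index] == value)

def conditional(qi, qv, ei, ev):
    """p(V_qi = qv | V_ei = ev) = p(both) / p(evidence)."""
    num = sum(p for a, p in joint.items() if a[qi] == qv and a[ei] == ev)
    return num / marginal(ei, ev)

p_Q = marginal(1, True)                      # p(Q) = 0.3 + 0.2 + 0.05 + 0.1 = 0.65
p_P_given_Q = conditional(0, True, 1, True)  # p(P | Q) = 0.5 / 0.65
```

For instance, p(Q) = 0.65 and p(P | Q) = 0.5/0.65 ≈ 0.77. This brute-force summation is exact but scales exponentially in the number of variables, which is what motivates Bayes networks below.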

Statistical Independence
Conditional independence: p(Vi | Vj, ε) = p(Vi | ε), where ε is a set of variables. Intuition: knowing Vj tells us nothing more about Vi than we already knew from ε.
Mutual conditional independence
Unconditional independence: the case in which ε is empty, i.e. p(Vi | Vj) = p(Vi).

19.3 Bayes Networks (1/2)
A directed, acyclic graph (DAG) whose nodes are labeled by random variables.
Characteristics of Bayesian networks:
Node Vi is conditionally independent of any subset of nodes that are not descendants of Vi, given its parents.
Prior probabilities (for root nodes)
Conditional probability tables (CPTs) (for non-root nodes)
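The conditional-independence property is what makes a Bayes network compact: the joint distribution factors into one CPT entry per node. For the block-lifting network (B → G, B → M, L → M) this reads:

```latex
\begin{align*}
  % general factorization: one factor per node, conditioned on its parents
  p(V_1, \ldots, V_k) &= \prod_{i=1}^{k} p\bigl(V_i \mid \mathrm{Pa}(V_i)\bigr) \\
  % block-lifting example
  p(B, M, L, G) &= p(B)\, p(L)\, p(G \mid B)\, p(M \mid B, L)
\end{align*}
```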

19.3 Bayes Networks (2/2)
Bayes network for the block-lifting example: B → G, B → M, L → M.

19.4 Patterns of Inference in Bayes Networks (1/3)
Causal or top-down inference
Ex. The probability that the arm moves given that the block is liftable, p(M | L): expand over B using the chain rule, then simplify using the structure of the network.

19.4 Patterns of Inference in Bayes Networks (2/3)
Diagnostic or bottom-up inference: using an effect (or symptom) to infer a cause.
Ex. The probability that the block is not liftable given that the arm does not move, p(¬L | ¬M): apply Bayes' rule, then evaluate p(¬M | ¬L) using causal reasoning.

19.4 Patterns of Inference in Bayes Networks (3/3)
Explaining away
One piece of evidence: ¬M (the arm does not move).
Additional evidence: ¬B (the battery is not charged).
¬B explains ¬M, making ¬L less certain: p(¬L | ¬B, ¬M) = 0.30 < 0.7379 = p(¬L | ¬M).
(The derivation uses Bayes' rule, the definition of conditional probability, and the structure of the Bayes network.)
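All three patterns can be checked by brute-force enumeration over the joint. This sketch assumes the standard CPT values for the block-lifting example (p(B) = 0.95, p(L) = 0.7, and the p(M | B, L) table below); these values reproduce the 0.30 and 0.7379 figures quoted above. G is omitted because it does not affect these queries.

```python
from itertools import product

# Assumed CPTs for the network B -> M <- L (standard textbook values).
pB = {True: 0.95, False: 0.05}                 # p(BAT_OK)
pL = {True: 0.7,  False: 0.3}                  # p(LIFTABLE)
pM = {(True, True): 0.9, (True, False): 0.05,  # p(MOVES | B, L)
      (False, True): 0.0, (False, False): 0.0}

def joint(b, l, m):
    """Joint probability from the network factorization p(B) p(L) p(M|B,L)."""
    p_m = pM[(b, l)] if m else 1.0 - pM[(b, l)]
    return pB[b] * pL[l] * p_m

def prob(query, evidence):
    """p(query | evidence) by enumeration; both arguments map
    a variable name ('B', 'L', 'M') to a truth value."""
    num = den = 0.0
    for b, l, m in product([True, False], repeat=3):
        world = {'B': b, 'L': l, 'M': m}
        p = joint(b, l, m)
        if all(world[v] == t for v, t in evidence.items()):
            den += p
            if all(world[v] == t for v, t in query.items()):
                num += p
    return num / den

causal     = prob({'M': True},  {'L': True})               # p(M | L) = 0.855
diagnostic = prob({'L': False}, {'M': False})              # p(~L | ~M) ~ 0.7379
explained  = prob({'L': False}, {'M': False, 'B': False})  # p(~L | ~B, ~M) = 0.30
```

Note how the diagnostic belief p(¬L | ¬M) ≈ 0.7379 drops back to the prior p(¬L) = 0.30 once ¬B is observed: the dead battery explains away the motionless arm.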

19.5 Uncertain Evidence
To condition on evidence nodes, we must be certain about the truth or falsity of the propositions they represent, so each uncertain evidence node should be given a child node about which we can be certain.
Ex. Suppose the robot is not certain that its arm did not move.
–Introduce M': "The arm sensor says that the arm moved." We can be certain that this proposition is either true or false.
–Use p(¬L | ¬B, ¬M') instead of p(¬L | ¬B, ¬M).
Ex. Suppose we are uncertain about whether or not the battery is charged.
–Introduce G: the battery gauge reading.
–Use p(¬L | ¬G, ¬M') instead of p(¬L | ¬B, ¬M').

19.6 D-Separation (1/3)
D-separation: direction-dependent separation.
Two nodes Vi and Vj are conditionally independent given a set of nodes ε if, for every undirected path in the Bayes network between Vi and Vj, there is some node Vb on the path having one of the following three properties:
1. Vb is in ε, and both arcs on the path lead out of Vb.
2. Vb is in ε, and one arc on the path leads into Vb and one arc leads out.
3. Neither Vb nor any descendant of Vb is in ε, and both arcs on the path lead into Vb.
Vb blocks the path given ε when any one of these conditions holds for the path. If all paths between Vi and Vj are blocked, we say that ε d-separates Vi and Vj.
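The three blocking rules translate directly into code. A sketch for the block-lifting network (the graph encoding and helper names are ours):

```python
# A small d-separation checker implementing the three blocking rules
# above, for the block-lifting network B -> G, B -> M, L -> M.
parents = {'B': set(), 'L': set(), 'G': {'B'}, 'M': {'B', 'L'}}

def children(node):
    """Nodes that have `node` as a parent."""
    return {v for v, ps in parents.items() if node in ps}

def descendants(node):
    """All nodes reachable from `node` along directed arcs."""
    seen, stack = set(), [node]
    while stack:
        for c in children(stack.pop()):
            if c not in seen:
                seen.add(c)
                stack.append(c)
    return seen

def undirected_paths(src, dst, path=None):
    """Yield every simple path from src to dst, ignoring arc direction."""
    path = path or [src]
    if path[-1] == dst:
        yield path
        return
    for nbr in parents[path[-1]] | children(path[-1]):
        if nbr not in path:
            yield from undirected_paths(src, dst, path + [nbr])

def blocks(prev, node, nxt, given):
    """Do rules 1-3 make `node` block the segment prev-node-nxt, given `given`?"""
    arcs_in = sum(1 for nbr in (prev, nxt) if nbr in parents[node])
    if arcs_in < 2:              # fork (rule 1) or chain (rule 2)
        return node in given
    # both arcs lead into node (rule 3): blocked unless node or one of
    # its descendants is in the conditioning set
    return node not in given and not (descendants(node) & given)

def d_separated(vi, vj, given):
    """True iff every undirected path between vi and vj is blocked."""
    return all(
        any(blocks(p[k - 1], p[k], p[k + 1], given) for k in range(1, len(p) - 1))
        for p in undirected_paths(vi, vj)
    )
```

This reproduces the worked examples that follow: d_separated('G', 'L', {'B'}), d_separated('G', 'L', set()), and d_separated('B', 'L', set()) are all True, while conditioning on the collider M opens the B–L path, so d_separated('B', 'L', {'M'}) is False.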

19.6 D-Separation (2/3)
Figure 19.3: Conditional independence via blocking nodes.

19.6 D-Separation (3/3)
Ex. (for the block-lifting network B → G, B → M, L → M):
I(G, L | B), by rules 1 and 3: by rule 1, B blocks the (only) path between G and L, given B; by rule 3, M also blocks this path given B.
I(G, L): by rule 3, M blocks the path between G and L.
I(B, L): by rule 3, M blocks the path between B and L.
Even using d-separation, probabilistic inference in Bayes networks is, in general, NP-hard.

19.7 Probabilistic Inference in Polytrees (1/2)
Polytree: a DAG in which there is just one path, along arcs in either direction, between any two nodes.

19.7 Probabilistic Inference in Polytrees (2/2)
A node is above Q if it is connected to Q only through Q's parents.
A node is below Q if it is connected to Q only through Q's immediate successors.
Three cases for the evidence:
All evidence nodes are above Q.
All evidence nodes are below Q.
There are evidence nodes both above and below Q.

Evidence Above (1/2)
Bottom-up recursive algorithm.
Ex. p(Q | P5, P4): condition Q on its parents, then push the evidence up through the parents, using the structure of the Bayes network and d-separation to simplify.

Evidence Above (2/2)
Calculating p(P7 | P4) and p(P6 | P5).
Calculating p(P1 | P5): here the evidence is "below," so we use Bayes' rule.

Evidence Below (1/2)
Top-down recursive algorithm, again using d-separation to split the evidence.

(C) SNU CSE Biointelligence Lab23 Evidence Below (2/2)

Evidence Above and Below
Split the evidence ε into ε+ (the evidence above Q) and ε− (the evidence below Q). By d-separation, p(Q | ε) = k p(ε− | Q) p(Q | ε+), where k is a normalizing constant, and we have already calculated both factors.

A Numerical Example (1/2)
We want to calculate p(Q | U), using Bayes' rule. To determine the normalizing constant k, we also need to calculate p(¬Q | U).

A Numerical Example (2/2)
Bayes' rule similarly gives p(¬Q | U); normalizing the two values finally yields p(Q | U).

Other methods for probabilistic inference in Bayes networks (when the network is not a polytree):
Bucket elimination
Monte Carlo methods
Clustering

Additional Readings (1/5)
[Feller 1968]: probability theory
[Goldszmidt, Morris & Pearl 1990]: non-monotonic inference through probabilistic methods
[Pearl 1982a; Kim & Pearl 1983]: message-passing algorithm
[Russell & Norvig 1995, pp. 447ff]: polytree methods

Additional Readings (2/5)
[Shachter & Kenley 1989]: Bayesian networks for continuous random variables
[Wellman 1990]: qualitative networks
[Neapolitan 1990]: probabilistic methods in expert systems
[Henrion 1990]: probabilistic inference in Bayesian networks

Additional Readings (3/5)
[Jensen 1996]: Bayesian networks; the HUGIN system
[Neal 1991]: relationships between Bayesian networks and neural networks
[Heckerman 1991; Heckerman & Nathwani 1992]: PATHFINDER
[Pradhan et al. 1994]: CPCS-BN

Additional Readings (4/5)
[Shortliffe 1976; Buchanan & Shortliffe 1984]: MYCIN, which uses certainty factors
[Duda, Hart & Nilsson 1987]: PROSPECTOR, which uses sufficiency and necessity indices
[Zadeh 1975; Zadeh 1978; Elkan 1993]: fuzzy logic and possibility theory
[Dempster 1968; Shafer 1979]: Dempster-Shafer combination rules

Additional Readings (5/5)
[Nilsson 1986]: probabilistic logic
[Tversky & Kahneman 1982]: humans generally lose consistency when facing uncertainty
[Shafer & Pearl 1990]: collected papers on uncertain inference
Proceedings & journals: Uncertainty in Artificial Intelligence (UAI); International Journal of Approximate Reasoning