Bayes Nets
Rong Jin

Hidden Markov Model

- Inferring from the observations (o_i) to the hidden variables (q_i).
- This is a general framework for representing and reasoning about uncertainty:
  - Uncertain information is represented with random variables (nodes).
  - Relationships between pieces of information are represented with conditional probability distributions (directed arcs).
  - We infer from the observations (shaded nodes) to the hidden variables (circled nodes).

[Figure: HMM chain q_0 → q_1 → q_2 → q_3 → q_4, with each hidden state q_t emitting an observation O_t.]
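
To make the observation-to-hidden-variable inference concrete, here is a minimal sketch of HMM filtering (the forward algorithm) in Python. The two-state transition matrix, emission matrix, and prior are hypothetical placeholders, not numbers from the lecture.

```python
import numpy as np

# Minimal HMM filtering sketch: compute P(q_t | o_0 .. o_t) left to right.
# All transition/emission/prior numbers are hypothetical placeholders.
A = np.array([[0.7, 0.3],        # A[i, j] = P(q_t = j | q_{t-1} = i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],        # B[i, o] = P(o_t = o | q_t = i)
              [0.2, 0.8]])
prior = np.array([0.5, 0.5])     # P(q_0)

def forward_filter(observations):
    """Return the filtered belief P(q_t | o_0 .. o_t) for every time step t."""
    belief = prior * B[:, observations[0]]      # fold in the first observation
    belief /= belief.sum()
    beliefs = [belief]
    for o in observations[1:]:
        belief = (A.T @ belief) * B[:, o]       # predict one step, then reweight
        belief /= belief.sum()
        beliefs.append(belief)
    return beliefs

for t, b in enumerate(forward_filter([0, 0, 1, 0])):
    print(f"P(q_{t} | o_0..o_{t}) = {b}")
```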

An Example of a Bayes Network

- S: It is sunny.
- L: Ali arrives slightly late.
- O: Slides are put on the web late.

Bayes Network Example

[Figure: arcs S → L and O → L.]

- Absence of an arrow between S and O: the random variables S and O are independent; knowing S will not help predict O.
- Two arrows into L: L depends on S and O; knowing S and O will help predict L.

Inference in Bayes Network

- S = 1, O = 0: P(L) = ?
- S = 1: P(O) = ?, P(L) = ?
- L = 1: P(S) = ?, P(O) = ?
- L = 1, S = 1: P(O) = ?

(Same network: S → L ← O.)

Conditional Independence

- Formal definition: A and B are conditionally independent given C iff
  P(A, B | C) = P(A | C) P(B | C), or equivalently P(A | B, C) = P(A | C).
- This is different from (unconditional) independence.
- Example (common cause C → A, C → B):
  - A: shoe size
  - B: glove size
  - C: height
  Shoe size is not independent of glove size (knowing one tells you something about the other), but the two are conditionally independent given height.

Distinguish Two Cases

- Common cause, C → A and C → B (A: shoe size, B: glove size, C: height):
  - Given C, A and B are independent.
  - Without C, A and B can be dependent.
- Common effect, S → L ← O (S: it is sunny, L: Ali arrives slightly late, O: slides are put on the web late):
  - Without L, S and O are independent.
  - Given L, S and O can be dependent.

A numeric check of both cases is sketched below.
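
To make the contrast concrete, here is a small numeric check in Python. All CPT numbers are made up for illustration; only the structure matters: in the common-cause net, A and B are dependent marginally but independent once C is given, while in the common-effect net, S and O are independent marginally but become dependent once L is observed ("explaining away").

```python
import itertools

vals = (0, 1)
bern = lambda p1, x: p1 if x else 1 - p1     # P(X = x) given P(X = 1) = p1

def marginal(joint, keep):
    """Sum a joint {assignment_tuple: prob} down to the given index positions."""
    out = {}
    for assign, p in joint.items():
        key = tuple(assign[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

# Case 1: common cause C -> A, C -> B (C: height, A: shoe size, B: glove size).
pC = [0.5, 0.5]                              # P(C = c), hypothetical
pA1_C = [0.1, 0.8]                           # P(A = 1 | C = c), hypothetical
pB1_C = [0.2, 0.7]                           # P(B = 1 | C = c), hypothetical
joint1 = {(a, b, c): pC[c] * bern(pA1_C[c], a) * bern(pB1_C[c], b)
          for a, b, c in itertools.product(vals, repeat=3)}

pAB, pA, pB = marginal(joint1, (0, 1)), marginal(joint1, (0,)), marginal(joint1, (1,))
print(pAB[(1, 1)], pA[(1,)] * pB[(1,)])      # differ: A, B marginally dependent
print(joint1[(1, 1, 1)] / pC[1], pA1_C[1] * pB1_C[1])  # equal: independent given C

# Case 2: common effect S -> L <- O (S: sunny, O: slides late, L: Ali late).
pS1, pO1 = 0.3, 0.4                          # P(S = 1), P(O = 1), hypothetical
pL1_SO = {(0, 0): 0.1, (0, 1): 0.7, (1, 0): 0.7, (1, 1): 0.7}  # P(L = 1 | S, O)
joint2 = {(s, o, l): bern(pS1, s) * bern(pO1, o) * bern(pL1_SO[(s, o)], l)
          for s, o, l in itertools.product(vals, repeat=3)}

pSO = marginal(joint2, (0, 1))
print(pSO[(1, 1)], pS1 * pO1)                # equal: S, O marginally independent
pL1 = sum(p for (s, o, l), p in joint2.items() if l == 1)
pSO_L1 = {so: joint2[so + (1,)] / pL1 for so in itertools.product(vals, repeat=2)}
pS1_L1 = pSO_L1[(1, 0)] + pSO_L1[(1, 1)]
pO1_L1 = pSO_L1[(0, 1)] + pSO_L1[(1, 1)]
print(pSO_L1[(1, 1)], pS1_L1 * pO1_L1)       # differ: dependent once L = 1 is seen
```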

Another Example of a Bayes Net

Inference questions:
- W = 1: P(R) = ?
- W = 1: P(C) = ?
- W = 1, C = 1: P(S) = ?, P(C) = ?, P(S, R) = ?

[Figure: Cloudy → Sprinkler, Cloudy → Rain, Sprinkler → WetGrass, Rain → WetGrass.]

Bayes Nets Formalized

A Bayes net (also called a belief network) is an augmented directed acyclic graph, represented by the pair (V, E), where:
- V is a set of vertices;
- E is a set of directed edges joining vertices; no loops of any length are allowed.

Each vertex in V contains the following information:
- the name of a random variable;
- a probability distribution table indicating how the probability of this variable's values depends on all possible combinations of parental values.

Building a Bayes Net

1. Choose a set of relevant variables.
2. Choose an ordering for them.
3. Call them X_1, ..., X_m (where X_1 is the first in the ordering, X_2 is the second, etc.).
4. For i = 1 to m:
   a. Add the node X_i to the network.
   b. Set Parents(X_i) to be a minimal subset of {X_1, ..., X_{i-1}} such that X_i is conditionally independent of all other members of {X_1, ..., X_{i-1}} given Parents(X_i).
   c. Define the probability table of P(X_i = k | assignments of Parents(X_i)).
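
As a concrete sketch of steps 1–4, here is the Cloudy / Sprinkler / Rain / WetGrass network (it reappears on the following slides) built in that ordering: each vertex records its parents, all of which come earlier in the ordering, plus a table of P(X_i = 1 | parent values). The probability numbers are hypothetical placeholders, not values from the lecture.

```python
# Sketch of the construction procedure with the ordering
# X1 = Cloudy, X2 = Sprinkler, X3 = Rain, X4 = WetGrass.
# Each entry: parents (a subset of the earlier variables) and a table
# mapping parent assignments to P(X_i = 1 | parents). Numbers are hypothetical.
net = {
    "Cloudy":    ([], {(): 0.5}),
    "Sprinkler": (["Cloudy"], {(0,): 0.5, (1,): 0.1}),
    "Rain":      (["Cloudy"], {(0,): 0.2, (1,): 0.8}),
    "WetGrass":  (["Sprinkler", "Rain"],
                  {(0, 0): 0.0, (0, 1): 0.9, (1, 0): 0.9, (1, 1): 0.99}),
}

def check_ordering(net):
    """Step 4's invariant: every parent appears earlier in the ordering,
    so the resulting graph is acyclic by construction."""
    seen = set()
    for name, (parents, _table) in net.items():   # dicts keep insertion order
        assert all(p in seen for p in parents), f"{name} breaks the ordering"
        seen.add(name)

check_ordering(net)
```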

Example of Building a Bayes Net

Suppose we're building a nuclear power station. There are the following random variables:
- GRL: gauge reads low
- CTL: core temperature is low
- FG: gauge is faulty
- FA: alarm is faulty
- AS: alarm sounds

If the alarm is working properly, the alarm is meant to sound if the gauge stops reading a low temperature. If the gauge is working properly, the gauge is meant to read the temperature of the core.

Bayes Net for the Power Station

- GRL: gauge reads low
- CTL: core temperature is low
- FG: gauge is faulty
- FA: alarm is faulty
- AS: alarm sounds

[Figure: CTL → GRL ← FG and GRL → AS ← FA.]
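
As a sketch, the same graph written as parent lists. Note the arcs here are reconstructed from the two "working properly" statements on the previous slide rather than copied from the lecture's figure:

```python
# Parent structure of the power-station net (graph only, no CPTs), reconstructed
# from the slide's description: the gauge reading depends on the core temperature
# and on whether the gauge is faulty; the alarm depends on the gauge reading and
# on whether the alarm is faulty.
parents = {
    "CTL": [],              # core temperature is low (root)
    "FG":  [],              # gauge is faulty (root)
    "FA":  [],              # alarm is faulty (root)
    "GRL": ["CTL", "FG"],   # gauge reads low
    "AS":  ["GRL", "FA"],   # alarm sounds
}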

Inference with Bayes Nets

- Key issue: computing the joint probability
  P(X_1 = x_1 ∧ X_2 = x_2 ∧ ... ∧ X_n = x_n).
- Use the conditional independence relations to simplify the computation: the joint factors into local terms,
  P(x_1, ..., x_n) = ∏_{i=1..n} P(x_i | Parents(X_i)),
  so each variable contributes one lookup in its own probability table (see the sketch below).
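
A sketch of that simplification in Python: the joint becomes one CPT lookup per variable. The network is the hypothetical sprinkler net from the construction sketch above, repeated here so the snippet stands alone.

```python
# Joint probability as a product of local terms:
#   P(x1, ..., xn) = prod_i P(xi | Parents(Xi)).
# Same hypothetical sprinkler numbers as in the construction sketch.
net = {
    "Cloudy":    ([], {(): 0.5}),
    "Sprinkler": (["Cloudy"], {(0,): 0.5, (1,): 0.1}),
    "Rain":      (["Cloudy"], {(0,): 0.2, (1,): 0.8}),
    "WetGrass":  (["Sprinkler", "Rain"],
                  {(0, 0): 0.0, (0, 1): 0.9, (1, 0): 0.9, (1, 1): 0.99}),
}

def joint_prob(net, assignment):
    """One probability-table lookup per variable, multiplied together."""
    prob = 1.0
    for name, (parents, table) in net.items():
        p1 = table[tuple(assignment[p] for p in parents)]   # P(X = 1 | parents)
        prob *= p1 if assignment[name] == 1 else 1.0 - p1
    return prob

# P(C=1, S=0, R=1, W=1) = 0.5 * (1 - 0.1) * 0.8 * 0.9 = 0.324
print(joint_prob(net, {"Cloudy": 1, "Sprinkler": 0, "Rain": 1, "WetGrass": 1}))
```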

Example for Inference

Inference questions:
- W = 1: P(R) = ?
- W = 1: P(C) = ?
- W = 1, C = 1: P(S) = ?, P(C) = ?, P(S, R) = ?

[Figure: Cloudy → Sprinkler, Cloudy → Rain, Sprinkler → WetGrass, Rain → WetGrass.]

Problem with Inference using Bayes Nets

- Inference: infer from the observations E_o to the unknown variables E_u, i.e. compute
  P(E_u | E_o) = P(E_u ∧ E_o) / P(E_o),
  where each term is obtained by summing the joint over all variables the expression does not mention.
- Suppose you have m binary-valued variables in your Bayes net and the expression E_o mentions k variables. How much work is the above computation? (Hint: count the assignments of the remaining m − k variables; see the sketch below.)
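
The naive answer is on the order of 2^(m−k) joint evaluations, one per assignment of the unmentioned variables. A sketch on the earlier S, O, L network (hypothetical CPT numbers), which also answers the queries from the earlier inference slide:

```python
import itertools

# Inference by enumeration on the S -> L <- O net (hypothetical numbers):
# every query reduces to sums of the joint over the unmentioned variables,
# i.e. 2^(m - k) joint evaluations for m binary variables with k fixed.
pS1, pO1 = 0.3, 0.4                                       # P(S = 1), P(O = 1)
pL1_SO = {(0, 0): 0.1, (0, 1): 0.7, (1, 0): 0.7, (1, 1): 0.9}  # P(L = 1 | S, O)

def joint(s, o, l):
    """P(S=s, O=o, L=l) from the factorization P(S) P(O) P(L | S, O)."""
    p = (pS1 if s else 1 - pS1) * (pO1 if o else 1 - pO1)
    return p * (pL1_SO[(s, o)] if l else 1 - pL1_SO[(s, o)])

def prob(evidence):
    """Sum the joint over every variable not fixed by `evidence`."""
    hidden = [n for n in ("S", "O", "L") if n not in evidence]
    total = 0.0
    for vals in itertools.product((0, 1), repeat=len(hidden)):  # 2^(m-k) terms
        full = dict(evidence, **dict(zip(hidden, vals)))
        total += joint(full["S"], full["O"], full["L"])
    return total

# Queries from the earlier slide, each as P(query, evidence) / P(evidence):
print(prob({"S": 1, "O": 0, "L": 1}) / prob({"S": 1, "O": 0}))  # P(L=1 | S=1, O=0)
print(prob({"S": 1, "L": 1}) / prob({"L": 1}))                  # P(S=1 | L=1)
```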

Problem with Inference using Bayes Nets

- General querying of Bayes nets is NP-complete.
- Some solutions:
  - Belief propagation: take advantage of the structure of the Bayes net.
  - Stochastic simulation: similar to the sampling approaches used for Bayesian averaging (sketched below).
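
A minimal sketch of the stochastic-simulation idea, using plain rejection sampling on the same hypothetical S, O, L net: sample each variable from its CPT in topological order, discard samples that contradict the evidence, and estimate the query from the survivors. (More refined schemes, such as likelihood weighting, avoid discarding samples.)

```python
import random

# Rejection-sampling sketch for P(S = 1 | L = 1) on the hypothetical S, O, L net.
pS1, pO1 = 0.3, 0.4
pL1_SO = {(0, 0): 0.1, (0, 1): 0.7, (1, 0): 0.7, (1, 1): 0.9}

def sample_net():
    """Draw (s, o, l) from the prior, sampling parents before children."""
    s = int(random.random() < pS1)
    o = int(random.random() < pO1)
    l = int(random.random() < pL1_SO[(s, o)])
    return s, o, l

def estimate(n=200_000):
    matches = s_hits = 0
    for _ in range(n):
        s, o, l = sample_net()
        if l == 1:              # keep only samples consistent with the evidence
            matches += 1
            s_hits += s
    return s_hits / matches

print(estimate())               # approximates P(S = 1 | L = 1)
```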

More Interesting Questions

- Learning Bayes nets:
  - Given the topological structure of a Bayes net, learn all the conditional probability tables from examples (example: hierarchical mixture models).
  - Learning the topological structure itself is a very, very hard question; unfortunately, the lecturer does not have enough knowledge to teach it, even if he wanted to!

Learning Conditional Probabilities in Bayes Nets

- Three types of training examples (for the Cloudy / Sprinkler / Rain / WetGrass net):
  1. C, S, R, W all observed
  2. C, R, W observed (S missing)
  3. S, C, W observed (R missing)
- Maximum likelihood approach for estimating the conditional probabilities (see the counting sketch below).
- EM algorithm for the optimization when some variables are unobserved.
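
For type 1 (fully observed) examples, the maximum likelihood estimates are just normalized counts; here is a sketch with made-up data. For types 2 and 3, the same counts are replaced by expected counts computed in the E-step of EM.

```python
from collections import Counter

# ML estimation of one conditional probability table from fully observed data.
# The examples below are made up for illustration; each tuple is (C, S, R, W).
data = [
    (1, 0, 1, 1), (1, 0, 1, 1), (0, 1, 0, 1), (0, 0, 0, 0),
    (1, 0, 0, 0), (0, 1, 0, 1), (1, 1, 1, 1), (0, 0, 1, 1),
]

# Estimate P(R = 1 | C = c) by counting: N(C = c, R = 1) / N(C = c).
n_c = Counter(c for c, s, r, w in data)
n_cr = Counter((c, r) for c, s, r, w in data)
for c in (0, 1):
    print(f"P(R=1 | C={c}) ~ {n_cr[(c, 1)] / n_c[c]:.2f}")
```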