A Brief Introduction to Bayesian networks

CIS 391 – Introduction to Artificial Intelligence

Bayesian networks
A simple, graphical notation for conditional independence assertions, and hence for compact specification of full joint distributions.
Syntax:
- A set of nodes, one per random variable
- A set of directed edges (a link means roughly "directly influences"), yielding a directed, acyclic graph
- A conditional distribution for each node given its parents: P(Xi | Parents(Xi))
In the simplest case, each conditional distribution is represented as a conditional probability table (CPT) giving the distribution over Xi for each combination of parent values.
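To make "compact" concrete (a back-of-the-envelope note, not spelled out on the slide): with Boolean variables, a node with k parents needs one number per combination of parent values, so the network as a whole needs far fewer parameters than the full joint table:

```latex
\text{full joint: } 2^n - 1
\quad\text{vs.}\quad
\text{Bayes net: } \sum_{i=1}^{n} 2^{k_i},
\qquad k_i = |\mathrm{Parents}(X_i)|
```

For the five-node example that follows, this works out to 31 numbers versus 10.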

The Required-by-Law Burglar Example
I'm at work, and my neighbor John calls to say my burglar alarm is ringing, but my neighbor Mary doesn't call. Sometimes the alarm is set off by minor earthquakes. Is there a burglar?
Variables: Burglary, Earthquake, Alarm, JohnCalls, MaryCalls
Network topology reflects "causal" knowledge:
- A burglar can set the alarm off
- An earthquake can set the alarm off
- The alarm can cause Mary to call
- The alarm can cause John to call
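The slide's figure (with its CPT numbers) does not survive in the transcript. As a stand-in, here is a minimal Python encoding of the network, using the CPT values from the Russell & Norvig textbook version of this same example:

```python
# A minimal, dictionary-based encoding of the burglary network.
# Structure follows the slide; the probability values are the standard
# ones from the Russell & Norvig textbook version of this example.

# parents[X] lists the parents of node X, in the order used by its CPT keys.
parents = {
    "Burglary":   [],
    "Earthquake": [],
    "Alarm":      ["Burglary", "Earthquake"],
    "JohnCalls":  ["Alarm"],
    "MaryCalls":  ["Alarm"],
}

# cpt[X] maps a tuple of parent truth values to P(X = true | parent values).
cpt = {
    "Burglary":   {(): 0.001},
    "Earthquake": {(): 0.002},
    "Alarm": {
        (True,  True):  0.95,
        (True,  False): 0.94,
        (False, True):  0.29,
        (False, False): 0.001,
    },
    "JohnCalls": {(True,): 0.90, (False,): 0.05},
    "MaryCalls": {(True,): 0.70, (False,): 0.01},
}
```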

Belief network for the example
[Figure: Burglary → Alarm ← Earthquake; Alarm → JohnCalls and Alarm → MaryCalls, with a CPT at each node.]

Semantics
Local semantics give rise to global semantics.
Local semantics: each node is conditionally independent of its non-descendants, given its parents.
Global semantics:
P(x1, …, xn) = ∏i P(xi | parents(Xi))
Why? Order the variables so that every node's parents precede it. The chain rule gives P(x1, …, xn) = ∏i P(xi | x1, …, xi−1), and by local semantics each factor reduces to P(xi | parents(Xi)).
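Continuing the Python sketch above, the global semantics is a single loop over nodes; the printed value reproduces the textbook's worked example P(j ∧ m ∧ a ∧ ¬b ∧ ¬e):

```python
def joint_probability(assignment, parents, cpt):
    """P(x1, ..., xn) as the product of local conditionals, one per node.
    `assignment` maps every variable name to True or False."""
    p = 1.0
    for var, value in assignment.items():
        parent_values = tuple(assignment[par] for par in parents[var])
        p_true = cpt[var][parent_values]          # P(var = true | parents)
        p *= p_true if value else (1.0 - p_true)
    return p

# John and Mary both call, the alarm rang, but there was neither a
# burglary nor an earthquake:
event = {"Burglary": False, "Earthquake": False, "Alarm": True,
         "JohnCalls": True, "MaryCalls": True}
print(joint_probability(event, parents, cpt))
# 0.90 * 0.70 * 0.001 * 0.999 * 0.998 ≈ 0.000628
```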

Belief Network Construction Algorithm
Choose an ordering of variables X1, …, Xn.
For i = 1 to n:
- add Xi to the network
- select a minimal set of parents from X1, …, Xi−1 such that P(Xi | Parents(Xi)) = P(Xi | X1, …, Xi−1)
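A schematic rendering of this loop (a sketch, not code from the course). The `is_independent` oracle is hypothetical: in practice the conditional-independence question is answered by a domain expert's judgment or a statistical test.

```python
def build_network(order, is_independent):
    """Sketch of the construction algorithm above.

    `is_independent(x, y, given)` is an assumed oracle answering whether
    x is conditionally independent of y given the variables in `given`.
    """
    parents = {}
    for i, x in enumerate(order):
        # Start with all predecessors, then greedily drop every variable
        # that x is conditionally independent of given the rest.
        candidates = list(order[:i])
        for y in order[:i]:
            rest = [v for v in candidates if v != y]
            if is_independent(x, y, rest):
                candidates = rest
        parents[x] = candidates
    return parents
```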

Construction Example
Suppose we choose the ordering M, J, A, B, E.
P(J | M) = P(J)?  No
P(A | J, M) = P(A)?  P(A | J, M) = P(A | J)?  No
P(B | A, J, M) = P(B)?  No
P(B | A, J, M) = P(B | A)?  Yes
P(E | B, A, J, M) = P(E | A)?  No
P(E | B, A, J, M) = P(E | A, B)?  Yes

Lessons from the Example
The resulting network is less compact: 13 numbers (compared to 10 for the causal ordering B, E, A, J, M).
The ordering of variables can make a big difference!
Intuitions about causality are useful.
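Where the counts come from (the arithmetic is implied but not shown on the slide): each Boolean node with k Boolean parents contributes 2^k numbers.

```latex
\underbrace{1}_{M} + \underbrace{2}_{J \mid M} + \underbrace{4}_{A \mid J,M}
  + \underbrace{2}_{B \mid A} + \underbrace{4}_{E \mid A,B} = 13
\qquad\text{vs.}\qquad
\underbrace{1}_{B} + \underbrace{1}_{E} + \underbrace{4}_{A \mid B,E}
  + \underbrace{2}_{J \mid A} + \underbrace{2}_{M \mid A} = 10
```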

Belief Networks and Hidden Variables: An Example

Compact Conditional Probability Tables
CPT size grows exponentially in the number of parents, and continuous variables have infinite CPTs!
Solution: compactly defined canonical distributions, e.g.
- Boolean functions
- Gaussian and other standard distributions
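One standard canonical model for Boolean causes is noisy-OR (not named on the slide, but a textbook instance of a compactly defined distribution): it needs one parameter per parent instead of 2^k CPT rows. A minimal sketch:

```python
def noisy_or_prob(cause_strengths, active, leak=0.0):
    """P(effect = true) under a noisy-OR model.

    The effect occurs unless every active cause independently fails to
    trigger it (and the background `leak` cause also fails).
    With leak = 0, cause_strengths[i] = P(effect | only cause i present).
    """
    p_fail = 1.0 - leak
    for strength, is_active in zip(cause_strengths, active):
        if is_active:
            p_fail *= 1.0 - strength
    return 1.0 - p_fail

# Two active causes with strengths 0.8 and 0.5:
# P = 1 - (1 - 0.8) * (1 - 0.5) = 0.9
print(noisy_or_prob([0.8, 0.5, 0.3], [True, True, False]))  # 0.9
```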

For Inference in Bayesian Networks… CIS 520 (Machine Learning)
CIS 520 provides a fundamental introduction to the mathematics, algorithms, and practice of machine learning. Topics covered include:
- Supervised learning: least-squares regression, logistic regression, perceptron, generalized linear models, discriminant analysis, naive Bayes, support vector machines
- Model and feature selection; ensemble methods, bagging, boosting
- Learning theory: bias/variance tradeoff; union and Chernoff/Hoeffding bounds; VC dimension; online learning
- Unsupervised learning: clustering, k-means, EM, mixtures of Gaussians, factor analysis, PCA, MDS, pPCA, independent component analysis (ICA)
- Graphical models: HMMs, Bayesian and Markov networks; inference; variable elimination
- Reinforcement learning: MDPs, Bellman equations, value iteration, policy iteration, Q-learning, value function approximation
(But check prerequisites: CIS 320, statistics, linear algebra)