Graphical Models Lei Tang

Review of Graphical Models. Directed graphs (DAG, Bayesian network, belief network) are typically used to represent causal relationships. Undirected graphs (Markov random field, Markov network) are usually used when the relationships between variables are not clearly directional.

Some rules (1): A graph can represent a regression problem. A plate is used to represent repetition of nodes.

Some rules (2): Suppose we also have some parameters. Observed variables are shaded in the graph.

Model Representation (DAG). Usually, the higher-numbered variables correspond to terminal nodes of the graph, representing the observations; lower-numbered nodes are latent variables. A graph representing the naïve Bayes model.

Factorization. For a directed graph, the joint distribution factorizes into a product of conditional distributions, one per node given its parents (this factorization also enables ancestral sampling):

p(\mathbf{x}) = \prod_i p(x_i \mid \mathrm{pa}_i)

For an undirected graph, the joint factorizes into potential functions \psi_C(\mathbf{x}_C) over the maximal cliques C, normalized by the partition function Z:

p(\mathbf{x}) = \frac{1}{Z} \prod_C \psi_C(\mathbf{x}_C), \qquad Z = \sum_{\mathbf{x}} \prod_C \psi_C(\mathbf{x}_C)

Each potential can be written in terms of an energy function E: \psi_C(\mathbf{x}_C) = \exp\{-E(\mathbf{x}_C)\}.
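As an illustration of the directed factorization, here is a minimal ancestral-sampling sketch on a hypothetical three-node chain A -> B -> C with made-up binary probability tables (the variable names and numbers are assumptions for illustration, not from the slides): each node is sampled after its parents, in topological order.

```python
import random

random.seed(0)

# Hypothetical 3-node DAG: A -> B -> C, binary variables.
# p(A, B, C) = p(A) p(B|A) p(C|B); all numbers are made up.
p_A = {0: 0.6, 1: 0.4}
p_B_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
p_C_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}

def sample_binary(dist):
    """Draw a value from a {value: probability} table."""
    return 0 if random.random() < dist[0] else 1

def ancestral_sample():
    # Sample each node after its parents, following the topological order.
    a = sample_binary(p_A)
    b = sample_binary(p_B_given_A[a])
    c = sample_binary(p_C_given_B[b])
    return a, b, c

samples = [ancestral_sample() for _ in range(100_000)]
# Empirical p(A=1) should be close to 0.4.
print(sum(s[0] for s in samples) / len(samples))
```

The empirical marginals of the samples approach the marginals implied by the factorized joint, which is what makes ancestral sampling a direct way to draw from a directed model.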

Directed-> Undirected Graph

Moralization (marrying the parents): add links between all pairs of parents of each node, then drop the arrows. Moralization adds the fewest extra links while retaining the maximum number of independence properties.
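A minimal moralization sketch, assuming the DAG is given as a hypothetical `parents` dictionary (the representation and node names are invented for illustration): marry each node's parents, then drop edge directions.

```python
from itertools import combinations

def moralize(parents):
    """Moralize a DAG given as {node: set of parent nodes}.

    Returns the edge set of the undirected moral graph, with each edge
    represented as a frozenset of its two endpoints.
    """
    edges = set()
    for child, pars in parents.items():
        for p in pars:
            # Drop directions: parent-child links become undirected edges.
            edges.add(frozenset((p, child)))
        for p, q in combinations(sorted(pars), 2):
            # "Marry the parents": connect every pair of parents.
            edges.add(frozenset((p, q)))
    return edges

# Classic v-structure A -> C <- B: moralization adds the extra edge A - B.
dag = {"A": set(), "B": set(), "C": {"A", "B"}}
print(moralize(dag))
```

On the v-structure example the moral graph has three edges: the two original links plus the marrying edge between the parents.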

Perfect Map. If every independence property of the distribution is reflected in the graph and vice versa, the graph is a perfect map. Not every directed graph can be represented as an undirected graph (as in the previous example), and not every undirected graph can be represented as a directed graph.

Inference on a Chain (1). With N variables, each with K states, naive marginalization sums over O(K^(N-1)) terms per marginal entry.
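To make that cost concrete, here is a brute-force marginalization sketch on a small chain with made-up random pairwise potentials (`N`, `K`, and the tables `psi` are illustrative assumptions): computing one marginal enumerates all K^N joint configurations.

```python
import random
from itertools import product

random.seed(1)
N, K = 6, 3  # small enough that K^N = 729 configurations is tractable

# Hypothetical pairwise potentials psi[n][x_n][x_{n+1}] for the chain
# x_0 - x_1 - ... - x_{N-1}; values are random for illustration.
psi = [[[random.random() for _ in range(K)] for _ in range(K)]
       for _ in range(N - 1)]

def joint(xs):
    """Unnormalized joint: product of pairwise potentials along the chain."""
    val = 1.0
    for n in range(N - 1):
        val *= psi[n][xs[n]][xs[n + 1]]
    return val

def brute_marginal(n):
    """Marginal of x_n by summing the joint over all K^N configurations."""
    p = [0.0] * K
    for xs in product(range(K), repeat=N):
        p[xs[n]] += joint(xs)
    Z = sum(p)  # partition function
    return [v / Z for v in p]

print(brute_marginal(2))
```

The loop over `product(range(K), repeat=N)` is exactly the exponential blow-up the slide points at; message passing (next slides) removes it.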

Inference on a Chain (2). Complexity: O(K^2 N), since each step of the computation is a K-by-K summation performed at each of the N positions.

Inference on a Chain (3). Message passed forwards along the chain:

\mu_\alpha(x_n) = \sum_{x_{n-1}} \psi_{n-1,n}(x_{n-1}, x_n)\, \mu_\alpha(x_{n-1})

Message passed backwards along the chain:

\mu_\beta(x_n) = \sum_{x_{n+1}} \psi_{n,n+1}(x_n, x_{n+1})\, \mu_\beta(x_{n+1})

The marginal is then p(x_n) = \frac{1}{Z} \mu_\alpha(x_n)\, \mu_\beta(x_n).

Inference on a Chain (4). This message-passing scheme is more efficient for finding the marginal distributions of all variables. If some nodes in the graph are observed, there is no summation over the corresponding variables (they are clamped to their observed values). If some parameters are not observed, apply the EM algorithm (discussed later).
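The forward/backward message passing described above can be sketched as follows, on a hypothetical chain with made-up random pairwise potentials (all tables and sizes are assumptions for illustration): every message update is a K-by-K summation, so all marginals come out in O(K^2 N) total.

```python
import random

random.seed(1)
N, K = 6, 3

# Hypothetical pairwise potentials psi[n][x_n][x_{n+1}], random for illustration.
psi = [[[random.random() for _ in range(K)] for _ in range(K)]
       for _ in range(N - 1)]

# Forward messages: mu_alpha(x_n) = sum_{x_{n-1}} psi(x_{n-1}, x_n) mu_alpha(x_{n-1})
alpha = [[1.0] * K for _ in range(N)]
for n in range(1, N):
    alpha[n] = [sum(alpha[n - 1][i] * psi[n - 1][i][j] for i in range(K))
                for j in range(K)]

# Backward messages: mu_beta(x_n) = sum_{x_{n+1}} psi(x_n, x_{n+1}) mu_beta(x_{n+1})
beta = [[1.0] * K for _ in range(N)]
for n in range(N - 2, -1, -1):
    beta[n] = [sum(psi[n][i][j] * beta[n + 1][j] for j in range(K))
               for i in range(K)]

def marginal(n):
    """p(x_n) is proportional to the product of the two incoming messages."""
    unnorm = [alpha[n][x] * beta[n][x] for x in range(K)]
    Z = sum(unnorm)
    return [v / Z for v in unnorm]

print(marginal(2))
```

One forward sweep and one backward sweep suffice for all N marginals, which is the efficiency gain the slide refers to.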

Factor Graph. We can apply a similar strategy (message passing) to undirected/directed trees and to polytrees as well. A polytree is a directed graph whose underlying undirected graph is a tree, but in which a node may have two or more parents. In a factor graph, a circle node represents a variable and an additional square node represents a factor.

The factor graph of a distribution is not unique.

A polytree example. As a factor graph, it is still a tree, without loops!

The Sum-Product Algorithm. This algorithm is equivalent to belief propagation, which was originally proposed for directed graphs without loops.
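A minimal sum-product sketch on a tiny tree-structured factor graph (the variables and factor tables are invented for illustration, not from the slides): leaf variables send unit messages, each factor sums out its other arguments, and a variable's marginal is the normalized product of its incoming messages.

```python
import random

random.seed(2)
K = 2  # binary variables

# Hypothetical tree factor graph: x1 -- f_a -- x2 -- f_b -- x3,
# plus a unary factor f_c attached to x2. Tables are random for illustration.
f_a = [[random.random() for _ in range(K)] for _ in range(K)]  # f_a(x1, x2)
f_b = [[random.random() for _ in range(K)] for _ in range(K)]  # f_b(x2, x3)
f_c = [random.random() for _ in range(K)]                      # f_c(x2)

# Leaf variable nodes (x1, x3) send the unit message, so each
# factor-to-variable message just sums out the factor's other argument.
msg_fa_to_x2 = [sum(f_a[x1][x2] for x1 in range(K)) for x2 in range(K)]
msg_fb_to_x2 = [sum(f_b[x2][x3] for x3 in range(K)) for x2 in range(K)]
msg_fc_to_x2 = f_c  # unary factor: the message is the factor itself

# Marginal of x2: product of all incoming messages, normalized.
unnorm = [msg_fa_to_x2[x] * msg_fb_to_x2[x] * msg_fc_to_x2[x]
          for x in range(K)]
Z = sum(unnorm)
marg_x2 = [v / Z for v in unnorm]
print(marg_x2)
```

Because the graph is a tree, one inward pass of messages toward x2 already yields its exact marginal; on graphs with loops this guarantee is lost, which motivates the junction tree and approximate methods below.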

An Intuitive Example

Inference in General Graphs. Exact inference: the junction tree algorithm. Approximate inference is needed when there is no closed form for the distribution, or when the dimensionality of the latent space is too high.