Bayes Networks and Markov Networks. Noah Berlow

Bayesian -> Markov (Section 4.5.1) Given a Bayesian network B, how can we turn it into a Markov network? The general idea:
– Convert the CPDs into factors.
Given a directed graph G, how can G become an undirected graph H? The general idea:
– Convert the directed independencies into undirected ones using moral graphs.

Background Material: Gibbs Distribution “Borrowed” from Dr. Sridharan’s slides

Distribution Perspective Suppose P_B is the distribution of a Bayesian network B with associated graph G. The parameterization of B can directly become a Gibbs distribution: each CPD P(X | Pa(X)) becomes a factor φ(X, Pa(X)). What is the normalization value Z for this factor product? Since the CPDs already multiply to a normalized joint distribution, Z = 1.

Distribution Example Consider the simple two-node network A -> B:

P(A):      A0 = .4,  A1 = .6
P(B | A):  A0: B0 = .3, B1 = .7
           A1: B0 = .1, B1 = .9

Product factor P(A) * P(B | A) over (A, B):
A0,B0 = .4*.3 = .12    A0,B1 = .4*.7 = .28
A1,B0 = .6*.1 = .06    A1,B1 = .6*.9 = .54

The entries sum to 1, so Z = 1.
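As a sanity check, here is a minimal Python sketch (not part of the original slides) that multiplies the two CPD factors above and confirms that the resulting Gibbs distribution already sums to one:

# Multiply the CPDs of the two-node network A -> B from the example
# above; the product of the factors is the joint, which is already
# normalized, so Z = 1.
from itertools import product

p_a = {"a0": 0.4, "a1": 0.6}                           # P(A)
p_b_given_a = {("a0", "b0"): 0.3, ("a0", "b1"): 0.7,   # P(B | A)
               ("a1", "b0"): 0.1, ("a1", "b1"): 0.9}

# Product of the CPD factors = joint factor over (A, B).
joint = {(a, b): p_a[a] * p_b_given_a[(a, b)]
         for a, b in product(p_a, ("b0", "b1"))}

print(joint)                # {..., ('a1', 'b1'): 0.54}
print(sum(joint.values()))  # 1.0 (up to float rounding): Z = 1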

Conditioning Perspective Suppose B is conditioned on evidence E = e, and let W = X − E. Then P_B(W | e) is a Gibbs distribution defined by the factors of B reduced to the context e, where each factor is a CPD of B with the entries inconsistent with E = e removed. What is the normalizing factor Z? Here Z = P_B(e), the probability of the evidence.

Conditioning Example Continuing the example, condition on the evidence A = A0. Reducing the joint factor to the entries consistent with A0 leaves:

A0,B0 = .4*.3 = .12    A0,B1 = .4*.7 = .28

The normalizing factor is Z = .12 + .28 = .4 = P(A = A0); dividing by Z gives P(B | A0) = (.3, .7).
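A matching sketch (again not from the slides) that reduces the joint factor to the evidence A = A0 and recovers Z = P(A0):

# Reduce the joint factor from the previous snippet to A = a0,
# then renormalize; the constant that falls out is P(a0).
joint = {("a0", "b0"): 0.12, ("a0", "b1"): 0.28,
         ("a1", "b0"): 0.06, ("a1", "b1"): 0.54}

reduced = {ab: p for ab, p in joint.items() if ab[0] == "a0"}
Z = sum(reduced.values())                        # 0.4 = P(A = a0)
posterior = {ab: p / Z for ab, p in reduced.items()}
print(Z, posterior)  # 0.4 {('a0','b0'): 0.3, ('a0','b1'): 0.7}

As expected, the normalized result matches the A0 row of the original CPD P(B | A).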

Background Material: Moral Graphs The moral graph of G, denoted M[G], is the undirected graph over X that contains an undirected edge X – Y if:
– X -> Y or X <- Y is an edge of G, or
– X, Y ∈ Pa(Z) for some Z ∈ X (X and Y are co-parents).
The BN graph G is itself moral if: X, Y ∈ Pa(Z) => X -> Y or X <- Y, i.e., all co-parents are already connected.

Moral Graph Example [Figure: the Student network G (Difficulty, Intelligence, Grade, SAT, Letter, Job) and its moral graph H.] Adapted from the Student example in the K&F book.

Moral Graphs of G For any distribution P_B such that B is a parameterization of G, M[G] is an I-map for P_B. Moreover, M[G] is a minimal I-map for G. (1) This moral graph is a Markov network H: moral graphs are undirected and encode some (but not necessarily all) of the independencies in the original graph.
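A short Python sketch of moralization (not from the slides; the edge list below is an assumed version of the Student network, since the slide's figure did not survive): connect every pair of co-parents, then drop edge directions.

from itertools import combinations

def moralize(dag_edges):
    # Undirected edges of M[G]: the original edges plus an edge
    # between every pair of co-parents ("marrying" the parents).
    parents = {}
    for u, v in dag_edges:                         # edge u -> v
        parents.setdefault(v, set()).add(u)
    undirected = {frozenset(e) for e in dag_edges}
    for pa in parents.values():                    # marry co-parents
        undirected |= {frozenset(p) for p in combinations(sorted(pa), 2)}
    return undirected

# Assumed Student edges: D->G, I->G, I->S, G->L, L->J, S->J.
student = [("Difficulty", "Grade"), ("Intelligence", "Grade"),
           ("Intelligence", "SAT"), ("Grade", "Letter"),
           ("Letter", "Job"), ("SAT", "Job")]
for e in sorted(sorted(edge) for edge in moralize(student)):
    print(e)
# New edges: Difficulty-Intelligence (co-parents of Grade) and
# Letter-SAT (co-parents of Job); all others come from the DAG.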

Sketch of Proof for (1) The Markov blanket of X in G d-separates X from the rest of G, and no strict subset of the Markov blanket has this property, which gives minimality. However, information can be lost:
– V-structures are the biggest culprit.
– MNs cannot encode v-structures: e.g., in X -> Z <- Y the marginal independence (X ⊥ Y) holds in G, but M[G] adds the edge X – Y and loses it.

When are things perfect? If G itself is moral, then M[G] is a P-map of G (proof available on pg. 135 of K&F). However, a moral G is not the norm.

Markov -> Bayesian (Section 4.5.2) How can we find a Bayesian network that is a minimal I-map for a Markov network? "Sadly, it is not quite so easy."

Background Material: Chordal Graphs Chord: in a loop X1 – X2 – ... – Xk – X1, a chord is an edge connecting two nonconsecutive nodes Xi and Xj. An undirected graph H is chordal if every loop of length n > 3 has a chord.

Properties of Markov -> Bayesian
– If H is a Markov network, a Bayesian network minimal I-map G for H cannot have immoralities; therefore G must be chordal.
– If H is nonchordal, there is no Bayesian network P-map corresponding to H (Theorem 4.11).
– If H is a chordal Markov network, then there is a Bayesian network P-map of H (Theorem 4.13).
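A quick way to test the chordality requirement, sketched with networkx (assuming it is installed); the 4-loop below admits no Bayesian P-map until a chord is added:

import networkx as nx

square = nx.cycle_graph(4)            # loop 0-1-2-3-0, no chord
print(nx.is_chordal(square))          # False -> no Bayesian P-map

triangulated = square.copy()
triangulated.add_edge(0, 2)           # add a chord
print(nx.is_chordal(triangulated))    # True -> a Bayesian P-map exists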

Refresher: Active Trails in Bayesian Networks

Markov -> Bayesian Example [Figure: a nonchordal Markov network H over A, B, C, D, E, F, and the chordal graph G obtained from it.]
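In the same spirit, a hedged sketch of triangulating a nonchordal Markov network with networkx (complete_to_chordal_graph is assumed available, networkx >= 2.3; the 6-loop here is only a stand-in for the slide's figure, whose exact structure did not survive):

import networkx as nx

H = nx.cycle_graph(["A", "B", "C", "D", "E", "F"])   # nonchordal 6-loop
chordal, order = nx.complete_to_chordal_graph(H)
print(nx.is_chordal(H), nx.is_chordal(chordal))      # False True
print(sorted(nx.difference(chordal, H).edges()))     # the fill-in chords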

Resources Koller & Friedman book, Sections 4.5.1 and 4.5.2. Dr. Sridharan's class notes.