CS774. Markov Random Field : Theory and Application Lecture 02

1 CS774. Markov Random Field : Theory and Application Lecture 02
Hello, my name is Kyomin Jung, and in this talk I will discuss learning in complex networks, especially designing algorithms that exploit structural properties of the system. Kyomin Jung, KAIST

2 Markov Random Field (MRF) : Definition 1
A collection of random variables X = (X_v)_{v in V} defined on a graph G = (V, E) is called an MRF iff the conditional probability distribution of X_v at each vertex v depends only on its neighbors N(v):
P[X_v = x_v | X_u = x_u for all u ≠ v] = P[X_v = x_v | X_u = x_u for all u in N(v)]
Formally, an MRF is a collection of random variables defined on a graph G, together with a probability distribution defined on them. A variable X_v is associated with each vertex v, and each X_v takes values in a finite alphabet set. (figure: graph G)
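The defining property can be checked numerically on a tiny example. The sketch below (the 3-node chain, its potentials, and all function names are hypothetical, chosen only for illustration) builds a chain MRF and confirms that conditioning X_0 on everything else equals conditioning on its single neighbor X_1:

```python
import itertools

# Hypothetical 3-node chain MRF on vertices 0 - 1 - 2.
# Pairwise potential that favors equal neighbors.
def psi(a, b):
    return 2.0 if a == b else 1.0

states = [0, 1]
# Joint distribution P(x0, x1, x2) proportional to psi(x0,x1) * psi(x1,x2)
weights = {x: psi(x[0], x[1]) * psi(x[1], x[2])
           for x in itertools.product(states, repeat=3)}
Z = sum(weights.values())
P = {x: w / Z for x, w in weights.items()}

def cond_x0(x0, x1, x2):
    # P[X0 = x0 | X1 = x1, X2 = x2]
    return P[(x0, x1, x2)] / sum(P[(a, x1, x2)] for a in states)

def cond_x0_given_x1(x0, x1):
    # P[X0 = x0 | X1 = x1] -- conditioning only on the neighbor of vertex 0
    num = sum(P[(x0, x1, b)] for b in states)
    den = sum(P[(a, x1, b)] for a in states for b in states)
    return num / den

# Markov property: once the neighbor X1 is fixed, X2 is irrelevant to X0.
for x0, x1, x2 in itertools.product(states, repeat=3):
    assert abs(cond_x0(x0, x1, x2) - cond_x0_given_x1(x0, x1)) < 1e-12
print("local Markov property holds")
```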

5 Markov Random Field (MRF) : Definition 2
Given a graph G = (V, E), a subset S of V is called a cut of G if removal of S from G induces two or more connected components, i.e. there exist nonempty disjoint A, B with A ∪ B = V \ S and no edge connecting A and B.
Definition 2. X is called an MRF if for every cut S of G, X_A and X_B are conditionally independent given X_S. (figure: a cut S separating graph G)

6 Hammersley-Clifford Theorem
A subset C of V is called a clique of G if the subgraph of G induced by C is a complete graph. Consider the following form of probability distribution for X:
P[X = x] = (1/Z) ∏_{cliques C of G} ψ_C(x_C)
Thm 1. The above X satisfies the definition of an MRF.
Thm 2. Every positive MRF (i.e. P[X = x] > 0 for all x) can be expressed in the above form.
In other words, by the Hammersley–Clifford theorem, any strictly positive MRF decomposes as a product of functions associated with the cliques of the graph. Note that computing Z is computationally equivalent to computing P[X = x] for any one specific assignment x.

7 Pair-wise MRF
X is a pair-wise MRF if
P[X = x] = (1/Z) ∏_{v in V} φ_v(x_v) ∏_{(u,v) in E} ψ_{uv}(x_u, x_v)
for some node potentials φ_v and edge potentials ψ_{uv}, and Z is called the partition function of the above expression.
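For intuition, the partition function of a tiny pair-wise MRF can be computed by brute-force enumeration. The graph and potential values below are made up for illustration; exact enumeration is exponential in |V| and infeasible for large graphs:

```python
import itertools

n = 3
edges = [(0, 1), (1, 2), (0, 2)]    # a triangle graph, hypothetical
states = [0, 1]

def phi(a):                          # node potential: slight bias toward state 1
    return 1.0 if a == 0 else 1.5

def psi(a, b):                       # edge potential: prefers agreement
    return 3.0 if a == b else 1.0

def weight(x):
    # unnormalized probability: product of node and edge potentials
    w = 1.0
    for v in range(n):
        w *= phi(x[v])
    for u, v in edges:
        w *= psi(x[u], x[v])
    return w

Z = sum(weight(x) for x in itertools.product(states, repeat=n))
print(Z)                             # partition function; P[X=x] = weight(x)/Z
```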

8 Problem of Interest 1: Computing Maximum A Posteriori (MAP)
MAP (Maximum A Posteriori) assignment: the most likely assignment, i.e. the mode of the distribution. It is NP-hard to compute in general and corresponds to an "optimization" problem; heuristics or approximation algorithms for specific MRFs are commonly used. In the weather condition example, the MAP is the most likely weather condition over all the states.
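On small instances the MAP assignment can be found by exhaustive search over all assignments (the chain and potentials below are illustrative; for large graphs one must fall back on the heuristics mentioned above):

```python
import itertools

edges = [(0, 1), (1, 2)]             # a 3-node chain, hypothetical
states = [-1, +1]

def weight(x):
    # unnormalized probability: favors agreement along edges and +1 at node 0
    w = 1.0
    for u, v in edges:
        w *= 2.0 if x[u] == x[v] else 1.0
    w *= 1.5 if x[0] == +1 else 1.0
    return w

x_map = max(itertools.product(states, repeat=3), key=weight)
print(x_map)                         # mode of the distribution: (1, 1, 1)
```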

9 Example 1 : Image denoising
We want to restore a binary (-1/+1) image Y of size n × n with noise added. Consider Y as an element of {-1, +1}^{n×n}; the color of each pixel is black or white. We will use an MRF model to restore the original image. The underlying graph is a grid graph of size n × n.

10 Example 1 : Image denoising
We will utilize two properties of the original image: (1) it is similar to Y; (2) it is smooth, i.e. the number of edges whose endpoints have different colors is small. Define the following MRF (one standard choice of potentials, with parameters h, β > 0):
P[X = x] ∝ exp( h Σ_{v} x_v y_v + β Σ_{(u,v) in E} x_u x_v )
MAP assignment ≈ original image
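One simple heuristic for the MAP of such a model is Iterated Conditional Modes (ICM): repeatedly set each pixel to the value that maximizes the local objective, holding the rest fixed. The sketch below (parameter values and the tiny test image are made up) implements this for the objective h Σ x_v y_v + β Σ x_u x_v:

```python
def icm_denoise(Y, h=1.0, beta=2.0, sweeps=5):
    """Greedy local search (ICM) for the denoising MRF; a heuristic,
    not guaranteed to find the exact MAP."""
    n, m = len(Y), len(Y[0])
    X = [row[:] for row in Y]                # start from the noisy image
    for _ in range(sweeps):
        for i in range(n):
            for j in range(m):
                best, best_val = X[i][j], float("-inf")
                for s in (-1, +1):
                    # local objective: data term plus smoothness term
                    val = h * s * Y[i][j]
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        a, b = i + di, j + dj
                        if 0 <= a < n and 0 <= b < m:
                            val += beta * s * X[a][b]
                    if val > best_val:
                        best, best_val = s, val
                X[i][j] = best
    return X

# a white 3x3 image with one flipped pixel is smoothed back
noisy = [[1, 1, 1], [1, -1, 1], [1, 1, 1]]
print(icm_denoise(noisy))                    # [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
```

For this two-label model the exact MAP can also be computed in polynomial time via graph cuts; ICM is shown here only because it is short.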

11 Example 2 : Maximum Weight Independent Set (MWIS)
Given a graph G = (V, E), a subset I of V is called an independent set if for every edge e in E, the two endpoints of e do not both belong to I. When the vertices are weighted, an independent set I is called an MWIS if the sum of the weights of its vertices is maximum. Finding an MWIS is equivalent to finding a MAP assignment in the following MRF on {0,1}^V, where w_v is the weight at node v:
φ_v(x_v) = e^{w_v x_v},  ψ_{uv}(x_u, x_v) = 0 if x_u = x_v = 1, and 1 otherwise.
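On a small graph the problem is easy to solve by brute force: enumerate all subsets and keep the heaviest independent set. The graph and weights below are hypothetical:

```python
import itertools

V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (3, 0)]         # a 4-cycle, hypothetical
w = {0: 3.0, 1: 2.0, 2: 3.0, 3: 2.0}         # node weights

def independent(S):
    # no edge may have both endpoints inside S
    return all(not (u in S and v in S) for u, v in E)

best = max((S for r in range(len(V) + 1)
              for S in itertools.combinations(V, r)
              if independent(set(S))),
           key=lambda S: sum(w[v] for v in S))
print(best)                                   # (0, 2), total weight 6.0
```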

12 Example 2 : Maximum Weight Independent Set (MWIS)
This has an application to wireless networks with queues: interference in the network requires that the set of transmitters active at each time be an independent set. The Maximum Weight Scheduling policy [Tassiulas, Ephremides '92] is abstracted as finding an (approximate) MWIS with weights given by the queue sizes of the vertices.

13 Problem of Interest 2: Computing marginal probability
The marginal P[X_v = a] = Σ_{x : x_v = a} P[X = x] can be computed by computing partition functions of sub-MRFs. Note that computing Z of an MRF X is (poly-time) computationally equivalent to computing P[X = x] for one x. Both are NP-hard to compute in general.
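The reduction to partition functions is direct on a small example: P[X_v = a] = Z_v(a) / Z, where Z_v(a) sums the unnormalized weights over assignments with x_v fixed to a. The chain and potentials below are illustrative:

```python
import itertools

edges = [(0, 1), (1, 2)]             # a 3-node chain, hypothetical
states = [0, 1]

def psi(a, b):                       # edge potential favoring agreement
    return 2.0 if a == b else 1.0

def weight(x):
    return psi(x[0], x[1]) * psi(x[1], x[2])

def restricted_Z(v, a):
    # partition sum over assignments with x_v fixed to value a
    return sum(weight(x) for x in itertools.product(states, repeat=3)
               if x[v] == a)

Z = sum(weight(x) for x in itertools.product(states, repeat=3))
marginal = {a: restricted_Z(1, a) / Z for a in states}
print(marginal)                      # P[X_1 = a]; here {0: 0.5, 1: 0.5}
```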

