
6.4 Random Fields on Graphs 6.5 Random Fields Models In “Adaptive Cooperative Systems” Summarized by Ho-Sik Seok.


1 6.4 Random Fields on Graphs 6.5 Random Fields Models In “Adaptive Cooperative Systems” Summarized by Ho-Sik Seok

2 Gibbs-Markov Equivalence for Random Fields on Graphs (1/4) A graph G on Λ: a set of vertices or sites Λ and a set of edges connecting pairs of elements of Λ.
- Neighbors: sites i and j are neighbors if they are connected by an edge.
- N_k: the set of neighbors of a site k.
- Clique: a subset of Λ whose sites are all neighbors of one another.
As neural networks:
- A graph represents a set of neurons (sites or vertices) and their interactions (edges).
- Given a set of N neurons, each neuron can be in one of a finite number of states 1, 2, …, A.
- Configuration: an assignment of a label to each of the neurons of the graph.
A random field (Ω, F, p):
- Ω: the space of all configurations.
- (Ω, F, p) is a Markov random field if the joint probabilities are strictly positive for all configurations and the conditional probabilities satisfy the Markov relationship p(x_k | x_j, j ≠ k) = p(x_k | x_j, j ∈ N_k).
© 2009 SNU CSE Biointelligence Lab
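The graph notions above (neighbor sets N_k, cliques) can be made concrete in a short sketch. The four-site example graph below is an illustrative choice, not one from the text:

```python
from itertools import combinations

# A small example graph on sites {0, 1, 2, 3}: a square with one diagonal.
# Edges define the neighbor sets N_k; cliques are fully connected subsets.
edges = {(0, 1), (1, 2), (2, 3), (0, 3), (0, 2)}

def neighbors(k):
    """N_k: the set of sites joined to site k by an edge."""
    return {j for (a, b) in edges for j in (a, b) if k in (a, b) and j != k}

def is_clique(sites):
    """True if every pair of sites in the subset is connected by an edge."""
    return all((a, b) in edges or (b, a) in edges
               for a, b in combinations(sorted(sites), 2))

# Enumerate all cliques (here singletons count as cliques of size 1).
cliques = [set(c) for r in range(1, 5)
           for c in combinations(range(4), r) if is_clique(c)]

print(neighbors(0))          # the neighbor set N_0
print(is_clique({0, 2, 3}))  # True: 0-2, 0-3, 2-3 are all edges
print(is_clique({0, 1, 3}))  # False: 1 and 3 are not neighbors
```

A configuration would then assign one of the labels 1, …, A to each of the four sites.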

3 Gibbs-Markov Equivalence for Random Fields on Graphs (2/4) For any x ∈ Ω, A ⊂ Λ, and non-neighboring pair of sites a and b such that a ∈ A, b ∈ A, a random field (Ω, F, p) is a Markov random field if
- The joint probabilities are strictly positive for all configurations
- The joint probabilities satisfy the nearest-neighbor relationship: the state at a given site b depends only on the states at neighboring sites, namely on the states at sites that interact with the given site
Gibbs distribution
- A random field (Ω, F, p) is a Gibbs distribution if there exists a potential function V such that for all x ∈ Ω the probability is given by summing V over all cliques in L(x), where L(x) is the set of sites not labeled 0 in the configuration (Eq. 6.27)
Möbius inversion formula
- For two real-valued functions F and G defined on the subsets of a finite set: if G(A) = Σ_{B⊆A} F(B) for all A, then F(A) = Σ_{B⊆A} (−1)^{|A|−|B|} G(B), where |C| is the number of elements in the set C
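The Möbius inversion pair stated above can be checked numerically. The sketch below assigns random values F(B) to the subsets of a three-site set, forms the subset sums G(A), and recovers F by the alternating-sign inversion:

```python
from itertools import chain, combinations
import random

def subsets(A):
    """All subsets of A, as tuples, from the empty set up to A itself."""
    A = list(A)
    return chain.from_iterable(combinations(A, r) for r in range(len(A) + 1))

random.seed(0)
sites = (0, 1, 2)
F = {frozenset(B): random.random() for B in subsets(sites)}

# Forward relation: G(A) = sum over B subset of A of F(B)
G = {frozenset(A): sum(F[frozenset(B)] for B in subsets(A))
     for A in subsets(sites)}

def invert(A):
    """Möbius inversion: F(A) = sum over B subset of A of (-1)^{|A|-|B|} G(B)."""
    return sum((-1) ** (len(A) - len(B)) * G[frozenset(B)] for B in subsets(A))

# The inversion recovers F exactly (up to floating-point error).
assert all(abs(invert(frozenset(A)) - F[frozenset(A)]) < 1e-9
           for A in subsets(sites))
```

Eq. 6.27 applies exactly this inversion to the log-probabilities to define the potential V.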

4 Gibbs-Markov Equivalence for Random Fields on Graphs (3/4) Theorem: a random field (Ω, F, p) is a Gibbs distribution if and only if it is a Markov random field.
- Suppose (Ω, F, p) is a Markov random field and V is defined by Eq. 6.27 for all x ∈ Ω, A ⊆ L(x). Then V(x_A) vanishes unless A is a clique: if A contains two sites a and b that are not neighbors, the Markov property makes the terms of the inversion sum cancel in pairs, so V(x_A) = 0.
- Thus (Ω, F, p) is a Gibbs random field with a potential given by Eq. 6.27.

5 Gibbs-Markov Equivalence for Random Fields on Graphs (4/4) Conversely, suppose that (Ω, F, p) is a Gibbs random field with a potential function V.
- By definition the probabilities are strictly positive.
- The conditional probability at a site reduces to a function of the states at neighboring sites only, so (Ω, F, p) is a Markov random field.

6 Higher-Order Interactions Higher-order interactions are required to describe the correlations among spiking neurons. In a linear model there is a one-to-one correspondence between the weights assigned to edges and the coefficients of the interaction potentials; this mapping fails when higher-order terms are included.
Alternative approach to the treatment of higher-order interactions:
- The coefficients appearing in the polynomial expansion of the potential function were used to model joint probabilities of fields in databases.
- Solution: renormalize the potentials appearing in the Möbius inversion relation; the renormalized potentials satisfy the corresponding inversion relations.

7 Besag’s Auto Models A random field model for which Q(x) has contributions only from single-site and pair interactions. Upon evaluating the quantity Q(x) − Q(0), one obtains the conditional probability structure; random fields of this form are called auto or linear models.
- Besag’s auto models follow from the additional requirements that the conditional probability at each site is of an exponential form and that b_i is linear in x_i, so that Q(x) simplifies.
- Autologistic models are auto models for which the random variable at each site may assume one of two values. The potential function for an autologistic model yields normalized conditional probabilities of logistic form.
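For the autologistic case with states {0, 1}, the normalized conditional probability at a site is a logistic function of a term linear in the neighboring values. A minimal sketch (parameter names a_i and b are illustrative, not from the text):

```python
import math

def autologistic_cond(a_i, b, x_nbrs):
    """P(x_i = 1 | neighbors) for an autologistic model: the log-odds are
    linear in the neighboring values, as in Besag's auto-model form."""
    s = a_i + sum(b_ij * x_j for b_ij, x_j in zip(b, x_nbrs))
    return 1.0 / (1.0 + math.exp(-s))

# With all neighbors in state 0 the conditional reduces to sigmoid(a_i):
print(autologistic_cond(0.0, [1.0, 1.0], [0, 0]))  # 0.5
# Positive couplings make state 1 more likely when neighbors are 1:
print(autologistic_cond(0.0, [1.0, 1.0], [1, 1]))
```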

8 The Autobinomial Model A texture generation model.
- In modeling textures, the brightness of each pixel of the lattice is regarded as a random variable that may assume one of G possible values in the range 0 to G−1.
- In the autobinomial model the brightness values for a given pixel in the lattice are binomially distributed, with parameters determined by the neighboring pixels: the probability of success θ_i depends on the values of the neighboring pixels.
- It is possible to use this model to synthesize textured images by means of a Metropolis sampling procedure: generate a Markov chain of texture states that converges to an autobinomial limiting distribution characterized by a given set of parameters, with transition probabilities defined for any pair of texture realizations x and y.
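A minimal sketch of such a Metropolis texture synthesizer, assuming a toroidal lattice with four nearest neighbors and illustrative parameter values (a, b, lattice size, and sweep count are my choices, not the text's). Because each proposal changes a single pixel, the acceptance ratio for realizations x and y reduces to the ratio of binomial conditional probabilities at that pixel:

```python
import math
import random

G = 4             # grey levels 0 .. G-1
N = 16            # lattice size (small demo lattice)
a, b = -1.0, 0.4  # autobinomial parameters (illustrative values)

random.seed(1)
img = [[random.randrange(G) for _ in range(N)] for _ in range(N)]

def theta(i, j):
    """Success probability from the four nearest neighbors (toroidal)."""
    s = a + b * (img[(i - 1) % N][j] + img[(i + 1) % N][j]
                 + img[i][(j - 1) % N] + img[i][(j + 1) % N])
    return 1.0 / (1.0 + math.exp(-s))

def binom_pmf(k, t):
    """Binomial(G-1, t) probability of brightness value k."""
    return math.comb(G - 1, k) * t**k * (1 - t)**(G - 1 - k)

for _ in range(20000):                       # Metropolis updates
    i, j = random.randrange(N), random.randrange(N)
    new = random.randrange(G)                # propose a new brightness
    t = theta(i, j)
    ratio = binom_pmf(new, t) / binom_pmf(img[i][j], t)
    if random.random() < min(1.0, ratio):    # accept with Metropolis rule
        img[i][j] = new
```

Run long enough, the chain converges to the autobinomial limiting distribution for the chosen parameters, and `img` is a sampled texture.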

9 Wide-Sense Markov Processes Strict-sense Markov random field models: the conditional probabilities associated with the joint probability distribution of the configuration of a system possess a neighborhood property.
Wide-sense Markov random field:
- The random variables X_ij satisfy the conditional error relation for (i, j), where ∂_ij is the noncausal neighborhood of the lattice point (i, j).
- Through an autoregressive family of linear equations (Eq. 6.50): f_mn is expressed as a linear combination of its neighbors plus noise, where h_kl is a set of coefficients and ε_mn is a random process that is orthogonal to f_mn. The h_kl are the coefficients of the linear minimum mean-square-error estimate of f_mn given its neighbors; the coefficient for f_mn vanishes for sites (m, n) not in the neighborhood η_kl.
- The pixel grey value at a given site is considered as a linear combination of the pixel grey values at the neighboring sites plus an additive noise term. Stacking the rows of an N × N image forms a vector f of length N², so the relation can be written with an N² × N² matrix B.
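A minimal sketch of the stacked relation f = Bf + ε on a toroidal lattice, with an illustrative coefficient of 0.2 for each of the four nearest neighbors (my choice, not the text's). The fixed-point iteration f ← Bf + ε converges because the neighbor coefficients sum to less than 1; with Gaussian ε this construction also yields a Gauss-Markov random field as on the next slide:

```python
import random

N = 8       # N x N lattice, i.e. a length-N^2 stacked vector f
c = 0.2     # AR coefficient h_kl for each of the four nearest neighbors

random.seed(0)
# Gaussian noise field eps (the orthogonal random process in Eq. 6.50).
eps = [[random.gauss(0, 1) for _ in range(N)] for _ in range(N)]
f = [[0.0] * N for _ in range(N)]

# Solve f = B f + eps by fixed-point iteration; the contraction factor
# is 4c = 0.8 < 1, so 200 iterations converge far below 1e-6.
for _ in range(200):
    f = [[c * (f[(i - 1) % N][j] + f[(i + 1) % N][j]
               + f[i][(j - 1) % N] + f[i][(j + 1) % N]) + eps[i][j]
          for j in range(N)] for i in range(N)]
```

Each pixel grey value f[i][j] now equals the linear combination of its neighbors plus its noise term, as Eq. 6.50 requires.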

10 Gauss-Markov Random Fields Wide-sense AR representation of the form given by Eq. 6.50 in which the random variables ε_mn are Gaussian distributed.
[Figure: examples of synthesized textures, showing the original texture and results derived using the full model.]
