
Slide 1: Learning Segmentation by Random Walks / A Random Walks View of Spectral Segmentation
Marina Meila and Jianbo Shi
Presented by Markus Herrgard, UCSD Bioengineering and Bioinformatics
CSE 291, Fall 2001 (10/11/2001)

Slide 2: Overview
- Introduction: why random walks?
- Review of the Ncut algorithm
- Finite Markov chains
- Spectral properties of Markov chains
- Conductance of a Markov chain
- Block-stochastic matrices
- Application: supervised segmentation

Slide 3: Introduction
Why bother with mapping a segmentation problem to a random walk problem? To exploit the strong connections between:
- Graph theory
- The theory of stochastic processes
- Matrix algebra

Slide 4: Applications of random walks
- Markov chain Monte Carlo: approximate high-dimensional integration, e.g. in Bayesian inference. How do we sample efficiently from a complex distribution?
- Randomized algorithms: approximate counting in high-dimensional spaces. How do we sample points efficiently inside a convex polytope?

Slide 5: Segmentation as graph partitioning
- Consider an image I with a similarity S_ij defined between every pair of pixels i, j ∈ I.
- Represent S as a graph G = (I, S): pixels are the nodes of the graph, and S_ij is the weight of the edge between nodes i and j.
- Degree of node i: d_i = Σ_j S_ij
- Volume of a set A ⊆ I: vol A = Σ_{i∈A} d_i
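The degree and volume definitions are one-liners in code. A minimal sketch on a hypothetical 4-pixel similarity matrix (the matrix values are illustrative, not from the slides):

```python
import numpy as np

# Hypothetical 4-pixel "image": symmetric, nonnegative similarity matrix S.
# Pixels 0,1 are similar to each other, as are pixels 2,3.
S = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])

d = S.sum(axis=1)       # degree of node i: d_i = sum_j S_ij
A = [0, 1]              # a candidate segment A, a subset of I
volA = d[A].sum()       # vol A = sum of the degrees of the nodes in A

print(d)                # each degree is 2.1
print(volA)             # vol A = 4.2 (up to float rounding)
```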

Slide 6: Simple example
[Figure: data points with both distance and color cues, and the corresponding similarity matrix]

Slide 7: The normalized cut criterion
- A partition of G into A and its complement Ā is found by minimizing the normalized cut criterion:
  NCut(A, Ā) = cut(A, Ā)/vol A + cut(A, Ā)/vol Ā, where cut(A, Ā) = Σ_{i∈A, j∈Ā} S_ij
- Produces more balanced partitions than the plain minimum graph cut.
- An approximate solution can be found with spectral methods.

Slide 8: The normalized cut algorithm
- Define the diagonal matrix D with D_ii = d_i, and the Laplacian of the graph G: L = D − S.
- Solve the generalized eigenvalue problem L x = λ D x.
- Let x_L be the eigenvector corresponding to the 2nd smallest eigenvalue λ_L.
- Partition x_L into two sets containing roughly equal values → graph partition.
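The steps above can be sketched in a few lines of numpy. Two caveats: the sketch solves the generalized problem through the equivalent symmetric matrix D^{-1/2} L D^{-1/2} (a standard transformation, not spelled out on the slide), and it splits the second eigenvector at its median; the similarity matrix is a made-up toy.

```python
import numpy as np

# Toy similarity matrix with two obvious groups: {0, 1} and {2, 3}.
S = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
d = S.sum(axis=1)
L = np.diag(d) - S                        # Laplacian L = D - S

# L x = lambda D x  <=>  (D^-1/2 L D^-1/2) y = lambda y, with x = D^-1/2 y
Dinv_sqrt = np.diag(1.0 / np.sqrt(d))
vals, vecs = np.linalg.eigh(Dinv_sqrt @ L @ Dinv_sqrt)
x_L = Dinv_sqrt @ vecs[:, 1]              # eigenvector of 2nd smallest eigenvalue

labels = (x_L > np.median(x_L)).astype(int)  # threshold into two sets
print(labels)                                # separates pixels {0,1} from {2,3}
```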

Slide 9: What does this actually mean?
Spectral methods are easy to apply but notoriously hard to understand intuitively. Some questions:
- Why does it work? (see Shi & Malik)
- Why this particular eigenvector?
- Why would x_L be piecewise constant?
- What if there are more than two segments?
- What if x_L is not piecewise constant? (see Kannan, Vempala & Vetta)

Slide 10: Interlude: finite Markov chains
- A discrete-time, finite-state random process.
- State of the system at time t_n: x_n.
- Probability of being in state i at time t_n: π_i(n) = Pr(x_n = i).
- The probability distribution over all states is the column vector π(n).
- Markov property: Pr(x_{n+1} = j | x_n, x_{n−1}, …, x_0) = Pr(x_{n+1} = j | x_n)

Slide 11: Transition matrix
- Transition matrix: P_ij = Pr(x_{n+1} = j | x_n = i).
- P is a (row-)stochastic matrix: P_ij ≥ 0 and Σ_j P_ij = 1.
- If the distribution at t_n is π(n), the distribution at t_{n+1} is given by π(n+1)^T = π(n)^T P.
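As a quick illustration (the transition matrix here is invented for the example), propagating a distribution one step is a vector-matrix product:

```python
import numpy as np

# An invented row-stochastic transition matrix on 3 states.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
assert np.all(P >= 0) and np.allclose(P.sum(axis=1), 1.0)  # stochasticity

pi0 = np.array([1.0, 0.0, 0.0])   # start in state 0 with certainty
pi1 = pi0 @ P                     # pi(n+1)^T = pi(n)^T P
print(pi1)                        # [0.5 0.3 0.2] -- row 0 of P
```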

Slide 12: Example of a Markov chain
[Figure: three-state chain with states Play, Work, and Sleep]

Slide 13: Some terminology
- The stationary distribution π is given by π^T P = π^T.
- A Markov chain is reversible if the "detailed balance" condition holds: π_i P_ij = π_j P_ji.
- A reversible finite Markov chain is called a random walk.

Slide 14: Spectra of stochastic matrices
- For reversible Markov chains, the eigenvalues of P are real and the eigenvectors orthogonal.
- The spectral radius is ρ(P) = 1 (i.e. |λ_i| ≤ 1).
- The right (left) eigenvector corresponding to λ_1 = 1 is x_1 = 1 (x_1 = π).

Slide 15: Back to Ncut
- How is Ncut related to random walks on graphs?
- Transform the similarity matrix S into a stochastic matrix: P = D^{−1} S, i.e. P_ij = S_ij / d_i.
- P_ij is the probability of moving from pixel i to pixel j in one step of a random walk on the graph representation of the image.
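Constructing P from S is a single row normalization; a sketch on the same kind of toy matrix:

```python
import numpy as np

S = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
d = S.sum(axis=1)
P = S / d[:, None]                # P = D^{-1} S: divide each row i by d_i

assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution
print(P[0])                       # transition probabilities out of pixel 0
```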

Slide 16: Relationship to random walks
- The generalized eigenvalue problem in Ncut, L x = λ D x, can be rewritten in terms of P: since L = D − S and P = D^{−1} S, it is equivalent to P x = (1 − λ) x.
- How are the spectra related? Same eigenvectors, x = x_P, and eigenvalues λ = 1 − λ_P.
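The relation λ = 1 − λ_P follows from L x = λ D x ⇔ (I − D^{-1}S) x = λ x ⇔ P x = (1 − λ) x, and is easy to confirm numerically (toy matrix again):

```python
import numpy as np

S = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
d = S.sum(axis=1)
P = S / d[:, None]
L = np.diag(d) - S

# Generalized eigenvalues of L x = lambda D x are the eigenvalues of D^-1 L.
lam_ncut = np.sort(np.real(np.linalg.eigvals(np.diag(1.0 / d) @ L)))
lam_P = np.sort(np.real(np.linalg.eigvals(P)))

print(np.allclose(lam_ncut, np.sort(1.0 - lam_P)))   # True
```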

Slide 17: Simple example
[Figure: the similarity matrix S and the transition matrix P = D^{−1} S]

Slide 18: Eigenvalues and eigenvectors of P
[Figure: eigenvalues and eigenvectors of P for the simple example]

Slide 19: Why the second eigenvector?
- The smallest eigenvalue in Ncut corresponds to the largest eigenvalue of P.
- The corresponding eigenvector x_1 = 1 carries no information about the partitioning.

Slide 20: Conductance of a Markov chain
- Conductance of a set A: Φ(A) = (Σ_{i∈A} Σ_{j∉A} π_i P_ij) / π(A), where π(A) = Σ_{i∈A} π_i.
- If we start from a random node in A (drawn according to π), this is the probability of moving out of A in one step.
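A direct computation of Φ(A) from the definition, using the random-walk chain P = D^{-1} S on the toy matrix (for this chain the stationary distribution is π_i = d_i / Σ_j d_j):

```python
import numpy as np

S = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
d = S.sum(axis=1)
P = S / d[:, None]
pi = d / d.sum()                  # stationary distribution of P = D^{-1} S

A, Ac = [0, 1], [2, 3]
# Phi(A) = (sum over i in A, j not in A of pi_i P_ij) / pi(A)
phi = (pi[A][:, None] * P[np.ix_(A, Ac)]).sum() / pi[A].sum()
print(phi)                        # probability of leaving A in one step
```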

Slide 21: Conductance and the Ncut criterion
- Assume the random walk is started from its stationary distribution, π_i = d_i / vol I.
- Using this and P_ij = S_ij / d_i we can write: Φ(A) = (Σ_{i∈A, j∉A} S_ij / vol I) / (vol A / vol I) = cut(A, Ā) / vol A.

Slide 22: Interpretation of the Ncut criterion
- Alternative representation of the Ncut criterion: NCut(A, Ā) = Φ(A) + Φ(Ā).
- Minimizing NCut is therefore equivalent to minimizing the conductance between A and its complement, i.e. the probability of moving between A and its complement in one step.

Slide 23: Block-stochastic matrices
- Let Δ = (A_1, A_2, …, A_k) be a partition of I.
- P is a block-stochastic matrix (equivalently, the Markov chain is aggregatable) iff the total transition probability Σ_{j∈A_t} P_ij is the same for every i ∈ A_s, for each pair of blocks s, t.

Slide 24: Aggregation
- A Markov chain defined by P on the state space i ∈ I can then be aggregated to a Markov chain on the smaller state space {A_s} with transition matrix R.
- The k eigenvalues of R are the same as the k largest eigenvalues of P.
- Aggregation can be performed as a linear transformation: R = U P V.
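A sketch of aggregation on an invented block-stochastic P with blocks {0, 1} and {2, 3}. The particular U and V used here (U averages each block's states with stationary-distribution weights, V indicates block membership) are one standard choice and an assumption of this sketch:

```python
import numpy as np

# Invented block-stochastic P: every row of a block puts the same total
# mass on each block (0.8/0.2 from block 1, 0.3/0.7 from block 2).
P = np.array([[0.40, 0.40, 0.10, 0.10],
              [0.40, 0.40, 0.10, 0.10],
              [0.15, 0.15, 0.35, 0.35],
              [0.15, 0.15, 0.35, 0.35]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

blocks = [[0, 1], [2, 3]]
k, n = len(blocks), P.shape[0]
U, V = np.zeros((k, n)), np.zeros((n, k))
for s, A in enumerate(blocks):
    U[s, A] = pi[A] / pi[A].sum()   # U: pi-weighted average within a block
    V[A, s] = 1.0                   # V: block membership indicator

R = U @ P @ V                       # aggregated 2x2 transition matrix
print(R)                            # [[0.8 0.2], [0.3 0.7]]
print(np.sort(np.real(np.linalg.eigvals(R))))  # 0.5 and 1: the 2 largest eigenvalues of P
```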

Slide 25: Aggregation example
[Figure: the transition matrix P and the aggregated transition matrix R]

Slide 26: Why piecewise constant eigenvectors?
- If P is block-stochastic with k blocks, then its first k eigenvectors are piecewise constant.
- Hence Ncut is exact for block-stochastic matrices, in addition to block-diagonal ones.
- Ncut groups pixels by the similarity of their transition probabilities to subsets of I.

Slide 27: Block-stochastic matrix example
[Figure: a block-stochastic transition matrix P and a piecewise constant eigenvector x]

Slide 28: The modified Ncut algorithm
Finds k segments in one pass; requires that the k eigenvalues of R be larger than the other n − k spurious eigenvalues of P.
1. Compute the eigenvalues and eigenvectors of P.
2. Select the eigenvectors corresponding to the k largest eigenvalues.
3. Run k-means on the k eigenvectors to obtain the segmentation.
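The three steps can be sketched end to end. The eigendecomposition goes through the symmetric matrix D^{-1/2} S D^{-1/2} (whose eigenvectors map back to those of P), and a tiny deterministic k-means stands in for the clustering step; both choices are implementation details of this sketch, not from the slides.

```python
import numpy as np

def modified_ncut(S, k, n_iter=50):
    """Sketch: embed pixels with the k leading eigenvectors of P = D^-1 S,
    then cluster the embedded rows with a small k-means."""
    d = S.sum(axis=1)
    Ds = np.sqrt(d)
    # D^-1/2 S D^-1/2 is symmetric and similar to P = D^-1 S.
    M = S / (Ds[:, None] * Ds[None, :])
    vals, vecs = np.linalg.eigh(M)             # ascending eigenvalues
    X = vecs[:, -k:] / Ds[:, None]             # k leading eigenvectors of P

    # k-means with deterministic farthest-point initialization.
    centers = [X[0]]
    for _ in range(k - 1):
        dist = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(dist)])
    centers = np.array(centers)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels

S = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
print(modified_ncut(S, 2))   # two segments: {0, 1} and {2, 3}
```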

Slide 29: Supervised image segmentation
- Training data: from a human-segmented image, define target transition probabilities P*_ij.
- Features: different criteria f^q_ij, q = 1, …, Q, each measuring the similarity between pixels i and j.

Slide 30: Supervised segmentation criterion
- Model: a parametrized similarity function S(θ) built from the features f^q_ij.
- Optimization criterion: minimize the Kullback-Leibler divergence between the target transition matrix P* and P(θ) = D^{−1} S(θ).
- This corresponds to maximizing the cross-entropy J(θ) = Σ_ij π*_i P*_ij log P_ij(θ).

Slide 31: Supervised segmentation algorithm
The maximization can be done by gradient ascent in θ: θ ← θ + η ∂J/∂θ.
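To make this concrete, here is a sketch assuming one common model form, S_ij(θ) = exp(Σ_q θ_q f^q_ij); that exponential form, and all the data in the example, are assumptions of this sketch rather than content from the slides. Under it the gradient works out to ∂J/∂θ_q = Σ_i π*_i Σ_j (P*_ij − P_ij(θ)) f^q_ij, which the code checks against a finite-difference estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
n, Q = 5, 2
F = rng.normal(size=(Q, n, n))              # hypothetical features f^q_ij
pi_star = np.full(n, 1.0 / n)               # hypothetical weights pi*_i
P_star = rng.random((n, n))                 # hypothetical target transitions
P_star /= P_star.sum(axis=1, keepdims=True)

def P_of(theta):
    S = np.exp(np.einsum('q,qij->ij', theta, F))   # S_ij = exp(sum_q theta_q f^q_ij)
    return S / S.sum(axis=1, keepdims=True)        # P = D^-1 S

def J(theta):                                # cross-entropy objective
    return np.sum(pi_star[:, None] * P_star * np.log(P_of(theta)))

def grad_J(theta):                           # analytic gradient (derived above)
    return np.einsum('i,ij,qij->q', pi_star, P_star - P_of(theta), F)

theta = np.array([0.3, -0.7])
eps = 1e-6
numeric = np.array([(J(theta + eps * np.eye(Q)[q]) - J(theta - eps * np.eye(Q)[q]))
                    / (2 * eps) for q in range(Q)])
print(np.allclose(grad_J(theta), numeric, atol=1e-6))   # True
```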

Slide 32: Toy example
- Cues: distance and "color" (intensity).
- Training segmentation 1 (by distance): θ_1 = −1.19, θ_2 = 1.04.
- Training segmentation 2 (by color): θ_1 = −0.19, θ_2 = −4.55.

Slide 33: Toy example results
[Figure: test data and the segmentations produced under training segmentation 1 (by distance) and training segmentation 2 (by color)]

Slide 34: Application to real image segmentation
Cues:
- Intervening contour
- Edge flow

Slide 35: Training
[Figure: training images and segmentations]

Slide 36: Testing
[Figure: test images and segmentations]

Slide 37: Conclusions I
The random walks perspective provides new insights into the Ncut algorithm:
- It relates the Ncut algorithm to the spectral properties of random walks.
- It interprets the Ncut criterion in terms of the conductance of a random walk.
- It proves that Ncut is exact for block-stochastic matrices.

Slide 38: Conclusions II
Is any of this useful in practice?
- A supervised segmentation method.
- Comparing different spectral clustering methods in terms of the underlying random walks.
- Choosing the kernel to allow effective clustering (approximately block-stochastic).
- New clustering criteria, e.g. bipartite clustering.

Slide 39: References
- Kemeny JG, Snell JL: Finite Markov Chains. Springer, 1976.
- Stewart WJ: Introduction to the Numerical Solution of Markov Chains. Princeton University Press, 1994.
- Lovász L: Random Walks on Graphs: A Survey.
- Jerrum M, Sinclair A: The Markov Chain Monte Carlo Method: An Approach to Approximate Counting and Integration.
