A Unified View of Kernel k-means, Spectral Clustering and Graph Cuts


1 A Unified View of Kernel k-means, Spectral Clustering and Graph Cuts
Dhillon, Inderjit S., Yuqiang Guan and Brian Kulis

2 Outline
(Kernel) k-means and weighted kernel k-means
Spectral clustering algorithms
The connection between kernel k-means and spectral clustering algorithms
The unified problem and ways to solve it
Experimental results

3 k-means and Kernel k-means

4 Weighted Kernel k-means
Distance from a_i to the centroid of cluster c, and its matrix form (a sketch of the distance formula follows)
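For reference, a sketch of that distance written out in LaTeX, following the standard weighted kernel k-means expansion (the notation a_i, w_j, K, \pi_c is assumed, since the slide's own formula is not in the transcript):

d(a_i, m_c) \;=\; K_{ii}
  \;-\; \frac{2\sum_{j \in \pi_c} w_j K_{ij}}{\sum_{j \in \pi_c} w_j}
  \;+\; \frac{\sum_{j,l \in \pi_c} w_j w_l K_{jl}}{\left(\sum_{j \in \pi_c} w_j\right)^{2}}

Every term involves only kernel entries and weights, so the centroid m_c never has to be formed explicitly.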

5 Spectral Methods
Represent the data by a graph:
Each data point corresponds to a node of the graph
The weight of the edge between two nodes represents the similarity between the two corresponding data points
The similarity can be a kernel function, such as the RBF kernel (a small construction sketch follows this list)
Use spectral theory to find a cut of the graph: spectral clustering
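A minimal NumPy sketch of that graph construction with an RBF kernel (the function name, variable names, and the bandwidth sigma are illustrative assumptions):

import numpy as np

def rbf_affinity(X, sigma=1.0):
    # Pairwise squared Euclidean distances between the rows (data points) of X.
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * (X @ X.T)
    # RBF similarity: close to 1 for nearby points, near 0 for distant ones.
    A = np.exp(-sq_dists / (2.0 * sigma**2))
    np.fill_diagonal(A, 0.0)  # no self-loops in the similarity graph
    return A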

6 Spectral Methods

7 Spectral Methods
High similarity within a cluster, low similarity between clusters

8 Represented in Matrix Form
Ratio association
Ratio cut
Graph Laplacian L for the normalized cut
Normalized association
(sketched in standard notation below)
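A sketch of those matrix forms (A the affinity matrix, D the diagonal degree matrix, L = D - A the graph Laplacian, Y an orthonormal cluster-indicator matrix; this is a reconstruction, not the slide's exact equations):

\text{Ratio association:}\quad \max_{Y}\ \operatorname{trace}(Y^{\top} A\, Y)
\text{Ratio cut:}\quad \min_{Y}\ \operatorname{trace}(Y^{\top} L\, Y), \qquad L = D - A
\text{Normalized association/cut:}\quad \max_{Y}\ \operatorname{trace}(Y^{\top} D^{-1/2} A D^{-1/2}\, Y)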

9 Weighted Graph Cut
Weighted association
Weighted cut
(sketched below)
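In the paper's notation these objectives look roughly as follows, where links(A, B) is the total edge weight between vertex sets A and B and w(V_c) is the total vertex weight of cluster c (a sketch):

\mathrm{WAssoc}(G) = \max_{V_1,\dots,V_k} \sum_{c=1}^{k} \frac{\mathrm{links}(V_c, V_c)}{w(V_c)},
\qquad
\mathrm{WCut}(G) = \min_{V_1,\dots,V_k} \sum_{c=1}^{k} \frac{\mathrm{links}(V_c, V \setminus V_c)}{w(V_c)}

Ratio and normalized association/cut are recovered by choosing the vertex weights as 1 or as the node degrees, respectively.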

10 Conclusion
Spectral methods are a special case of weighted kernel k-means
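As an illustration of the correspondence, the normalized-cut case (as I recall it from the paper; treat the exact constants as an assumption) uses the degree matrix D as the weights and a degree-normalized affinity as the kernel:

W = D, \qquad K = D^{-1} A\, D^{-1}\ \ (\text{optionally shifted by } \sigma D^{-1} \text{ to make it positive definite, cf. slides 13--14})

With these choices, the weighted kernel k-means objective and the normalized association/cut objective coincide, so either algorithm can be used to optimize the other's criterion.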

11 Solving the Unified Problem
A standard result in linear algebra states that if we relax the trace maximization so that Y is an arbitrary orthonormal matrix, then the optimal Y is of the form V_k Q, where V_k consists of the leading k eigenvectors of W^{1/2} K W^{1/2} and Q is an arbitrary k × k orthogonal matrix. As these eigenvectors are not indicator vectors, we must then perform postprocessing on them to obtain a discrete clustering of the points (a sketch follows).
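A minimal NumPy sketch of this relaxed solution (function and variable names are illustrative; w holds the diagonal of W, K is the kernel/affinity matrix, k the number of clusters):

import numpy as np

def relaxed_solution(K, w, k):
    # Form W^{1/2} K W^{1/2} for diagonal W.
    W_half = np.diag(np.sqrt(w))
    M = W_half @ K @ W_half
    # eigh returns eigenvalues in ascending order; take the leading k eigenvectors.
    eigvals, eigvecs = np.linalg.eigh(M)
    V_k = eigvecs[:, -k:]
    return V_k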

12 From Eigenvectors to Cluster Indicators
Normalize U so that each row has L2 norm equal to 1, then recover the discrete cluster indicators from the normalized rows (see the sketch below)
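One common way to do this postprocessing (a sketch in the spirit of standard spectral clustering, not necessarily the slide's exact procedure) is to normalize each row of U to unit L2 norm and then run ordinary k-means on the rows:

import numpy as np
from sklearn.cluster import KMeans

def eigenvectors_to_labels(U, k):
    # Normalize every row of U to unit length (guarding against zero rows).
    norms = np.linalg.norm(U, axis=1, keepdims=True)
    U_norm = U / np.maximum(norms, 1e-12)
    # Cluster the normalized rows; each row corresponds to one data point.
    return KMeans(n_clusters=k, n_init=10).fit_predict(U_norm)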

13 The Other Way
Use kernel k-means to solve the graph-cut problem (random starting points plus EM-style iterations; converges to a local optimum). To guarantee that kernel k-means converges, the kernel matrix must be positive definite, which is not true for an arbitrary kernel matrix (a sketch of the fix follows).
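A one-line sketch of that fix, the diagonal shift K' = K + sigma * W^{-1} discussed on the next slide (sigma and the weight vector w are assumed inputs; a sufficiently large sigma makes W^{1/2} K' W^{1/2} positive definite):

import numpy as np

def shift_kernel(K, w, sigma):
    # Adding sigma * W^{-1} to K shifts the eigenvalues of W^{1/2} K W^{1/2} up by sigma.
    return K + sigma * np.diag(1.0 / w)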

14 The Effect of the Regularization
How the shifted distance changes when a_i is in cluster π_c versus when a_i is not in π_c (a sketch of the derivation follows)
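Working this out from the distance formula on slide 4 with the shifted kernel K' = K + \sigma W^{-1} (my own derivation, shown as an illustration; s_c denotes the total weight of cluster \pi_c):

a_i \in \pi_c:\quad d'(a_i, m_c) = d(a_i, m_c) + \frac{\sigma}{w_i} - \frac{\sigma}{s_c}
a_i \notin \pi_c:\quad d'(a_i, m_c) = d(a_i, m_c) + \frac{\sigma}{w_i} + \frac{\sigma}{s_c}

The σ/w_i term is the same for every cluster, so the shift only lowers the distance to a point's current cluster and raises it to the others: convergence is guaranteed, but an overly large σ can freeze the clustering near its starting point.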

15 Experimental Results

16 Results (ratio association)

17 Results (normalized association)

18 Image Segmentation

19 Thank you. Any questions?

