Presentation on theme: "EE462 MLCV 1 Lecture 3-4 Clustering (1hr) Gaussian Mixture and EM (1hr) Tae-Kyun Kim."— Presentation transcript:
EE462 MLCV 1 Lecture 3-4 Clustering (1hr) Gaussian Mixture and EM (1hr) Tae-Kyun Kim
Slide 2: Vector Clustering. 2D data vectors (green) are grouped into two homogeneous clusters (blue and red); clustering is achieved by an iterative algorithm (shown left to right). The cluster centers are marked ×.
EE462 MLCV 3 ` RGBRGB Pixel Clustering (Image Quantisation) Image pixels are represented by 3D vectors of R,G,B values. The vectors are grouped to K=10,3,2 clusters, and represented by the mean values of the respective clusters.
Slide 4: Patch Clustering (BoW, Lecture 9-10). Image patches are harvested around feature points in a large number of images. Each patch is represented as a D-dimensional vector (e.g. a SIFT descriptor, or a 20×20 raw-pixel patch giving D = 400), and the vectors are clustered to form a visual dictionary of K codewords.
Slide 5: Image Clustering. Whole images are represented as finite-dimensional vectors, and homogeneous vectors are grouped together in Euclidean space.
Slide 6: K-means vs GMM. Two representative techniques are k-means and the Gaussian Mixture Model (GMM). K-means performs hard clustering: each data point is assigned to exactly one cluster, the one with the nearest center. GMM performs soft clustering: each data point is assigned to multiple Gaussian components probabilistically, according to the densities that best represent the data.
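The soft assignment can be made concrete with a small sketch (1-D Gaussians assumed here for simplicity): the responsibility of component k for a point x is its weighted density divided by the total over all components.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    """Density of a 1-D Gaussian N(x | mu, var)."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def responsibilities(x, weights, means, variances):
    """Soft assignment of x: posterior probability of each Gaussian
    component, i.e. w_k N(x | mu_k, var_k) normalised over k."""
    p = np.array([w * gaussian_pdf(x, m, v)
                  for w, m, v in zip(weights, means, variances)])
    return p / p.sum()
```

A point midway between two equal components gets responsibility 0.5 from each, whereas k-means would assign it wholly to one cluster.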
Slide 7: Matrix and Vector Derivatives.
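The equations on this slide were not transcribed; the standard vector-derivative identities typically listed here, and needed for the k-means and GMM derivations that follow, include for example:

```latex
\frac{\partial}{\partial \mathbf{x}} \, \mathbf{a}^{\top}\mathbf{x} = \mathbf{a},
\qquad
\frac{\partial}{\partial \mathbf{x}} \, \mathbf{x}^{\top} A \mathbf{x} = (A + A^{\top})\,\mathbf{x},
\qquad
\frac{\partial}{\partial \boldsymbol{\mu}} \, \|\mathbf{x} - \boldsymbol{\mu}\|^{2} = -2\,(\mathbf{x} - \boldsymbol{\mu}).
```

The last identity is the one used to derive the k-means mean-update step: setting the derivative of the distortion with respect to each mean to zero yields the cluster average.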
Slide 9: K-means Clustering.
Slide 11: The assignment step (assign each point to its nearest cluster center) and the update step (set each center to the mean of its assigned points) are alternated till convergence.
Slide 12: A worked example with K = 2 clusters, means μ₁ and μ₂, and binary assignment variables r_nk.
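In this notation (assignments r_nk, means μ_k), the distortion that k-means minimises is typically written as:

```latex
J = \sum_{n=1}^{N} \sum_{k=1}^{K} r_{nk} \, \|\mathbf{x}_n - \boldsymbol{\mu}_k\|^{2},
\qquad
r_{nk} =
\begin{cases}
1 & \text{if } k = \arg\min_{j} \|\mathbf{x}_n - \boldsymbol{\mu}_j\|^{2}, \\
0 & \text{otherwise,}
\end{cases}
\qquad
\boldsymbol{\mu}_k = \frac{\sum_{n} r_{nk}\,\mathbf{x}_n}{\sum_{n} r_{nk}}.
```

The assignment step minimises J over the r_nk with the means fixed, and the update step minimises J over the μ_k with the assignments fixed.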
Slide 13: K-means is guaranteed to converge, since neither the assignment step nor the update step can increase the distortion J, but it is not guaranteed to reach the global minimum; the result depends on the initialisation.
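The convergence guarantee can be checked numerically. A sketch (my own illustration, not the lecture's code) that records the distortion J after each assignment step always produces a non-increasing trace; re-running with different seeds can still end at different local minima.

```python
import numpy as np

def kmeans_trace(X, K, iters=10, seed=1):
    """Run k-means on (N, D) data X and return the distortion J
    measured after each assignment step."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), K, replace=False)].astype(float)
    trace = []
    for _ in range(iters):
        # Assignment step; record J = sum_n min_k ||x_n - mu_k||^2.
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        trace.append(float((d.min(axis=1) ** 2).sum()))
        # Update step: each centre becomes the mean of its points.
        for k in range(K):
            if np.any(labels == k):
                centres[k] = X[labels == k].mean(axis=0)
    return trace
```

Both steps can only decrease (or leave unchanged) J, so the recorded trace is monotonically non-increasing; since J is bounded below by zero, the algorithm must converge.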