Sparse Modeling for Finding Representative Objects
Ehsan Elhamifar (Johns Hopkins University), Guillermo Sapiro (University of Minnesota), René Vidal (Johns Hopkins University)
Outline
Introduction
Problem Formulation
Geometry of Representatives
Representatives of Subspaces
Practical Considerations and Extensions
Experimental Results
Introduction
Two important problems related to large databases:
Reducing the feature-space dimension
Reducing the object-space dimension
Introduction: Kmeans
Step 1: k initial "means" are randomly selected from the data set.
Step 2: k clusters are created by assigning every observation to the nearest mean.
Step 3: The centroid of each of the k clusters becomes the new mean.
Step 4: Steps 2 and 3 are repeated until convergence is reached. (See the sketch below.)
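A minimal NumPy sketch of these four steps (illustrative only; the array shapes and the convergence test are assumptions, not from the slides):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    # X: (N, d) array, one observation per row.
    rng = np.random.default_rng(seed)
    # Step 1: k initial "means" are randomly selected from the data set.
    means = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Step 2: assign every observation to its nearest mean.
        dists = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Step 3: the centroid of each cluster becomes the new mean
        # (an empty cluster keeps its previous mean).
        new_means = means.copy()
        for j in range(k):
            members = X[labels == j]
            if len(members) > 0:
                new_means[j] = members.mean(axis=0)
        # Step 4: repeat until convergence (the means stop moving).
        if np.allclose(new_means, means):
            break
        means = new_means
    return means, labels
```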
Introduction: Kmedoids
A variant of Kmeans that changes Step 3: instead of the centroid, the medoid (the cluster member with the smallest total dissimilarity to the other members) becomes the new center, so every cluster center is an actual data point. (See the sketch of this update below.)
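A minimal sketch of the modified Step 3 (the Euclidean pairwise distance is an assumption; the slides do not specify a dissimilarity):

```python
import numpy as np

def medoid_update(X, labels, k):
    # Step 3 variant: the medoid of each cluster becomes the new center.
    centers = []
    for j in range(k):
        idx = np.flatnonzero(labels == j)
        if idx.size == 0:
            continue  # skip empty clusters
        Xj = X[idx]
        D = np.linalg.norm(Xj[:, None, :] - Xj[None, :, :], axis=2)
        centers.append(idx[D.sum(axis=1).argmin()])  # member minimizing total distance
    return np.array(centers)  # indices into X: centers are actual data points
```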
Introduction: Rank-Revealing QR (RRQR)
A matrix decomposition algorithm based on the QR factorization. [T. Chan, "Rank revealing QR factorizations," Linear Algebra and its Applications, 1987]
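Column-pivoted QR (a common practical stand-in for a full RRQR) can be used to pick k representative columns; a sketch using SciPy:

```python
import numpy as np
from scipy.linalg import qr

def rrqr_representatives(Y, k):
    # Y: (d, N), one data point per column. Column-pivoted QR orders the
    # columns so that the leading pivots are the most linearly independent.
    _, _, piv = qr(Y, mode='economic', pivoting=True)
    return piv[:k]  # indices of the selected columns
```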
Introduction
Consider the optimization problem

    min_C ||Y - YC||_F^2   s.t.   ||C||_{row,0} <= k

We wish to find at most k << N representatives that best reconstruct the data collection; C is the coefficient matrix, and ||C||_{row,0} counts the number of nonzero rows of C.
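A one-line illustration of the counting norm ||C||_{row,0} used above (the tolerance is an assumption):

```python
import numpy as np

def row_l0(C, tol=1e-8):
    # ||C||_{row,0}: the number of rows of C containing any nonzero entry.
    return int((np.abs(C).max(axis=1) > tol).sum())
```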
Introduction Applications to video summarization
Problem Formulation
Finding compact dictionaries to represent data: minimize the objective function

    min_{D,X} Σ_i ||y_i - D x_i||_2^2

where D is the dictionary and X = [x_1, ..., x_N] is the coefficient matrix.
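A quick way to try this objective in practice is scikit-learn's dictionary learner (note its convention is transposed relative to the slides: samples are rows and Y ≈ X D; the sizes here are arbitrary):

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

Y = np.random.randn(200, 50)              # 200 data points, 50 features
dl = DictionaryLearning(n_components=20, alpha=1.0, max_iter=50)
X = dl.fit_transform(Y)                   # coefficient matrix (200 x 20)
D = dl.components_                        # dictionary, one atom per row (20 x 50)
print(np.linalg.norm(Y - X @ D))          # reconstruction error
```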
Problem Formulation: Finding Representative Data
Consider a modification to the dictionary-learning framework in which the dictionary is the data matrix itself, D = Y:

    min_C ||Y - YC||_F^2   s.t.   ||C||_{row,0} <= k

where Y is the matrix of data points and C is the coefficient matrix. We enforce row sparsity of C; ||C||_{row,0} counts the number of nonzero rows of C.
Problem Formulation
This is an NP-hard problem. A standard convex relaxation of this optimization is obtained as

    min_C ||C||_{1,q}   s.t.   ||Y - YC||_F <= ε,    where ||C||_{1,q} = Σ_i ||c^i||_q

with c^i the i-th row of C, q > 1, and ε an appropriately chosen parameter. (See the solver sketch below.)
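A minimal CVXPY sketch of the Lagrangian form of this relaxation with q = 2 (the value of λ and the threshold for calling a row nonzero are assumptions; the affine constraint 1ᵀC = 1ᵀ discussed in the paper is omitted here):

```python
import cvxpy as cp
import numpy as np

def smrs(Y, lam=2.0):
    # Y: (d, N), one data point per column. Solve
    #   min_C  lam * sum_i ||c^i||_2 + 1/2 * ||Y - Y C||_F^2
    # The nonzero rows of C index the representatives.
    N = Y.shape[1]
    C = cp.Variable((N, N))
    obj = lam * cp.sum(cp.norm(C, 2, axis=1)) + 0.5 * cp.sum_squares(Y - Y @ C)
    cp.Problem(cp.Minimize(obj)).solve()
    rows = np.linalg.norm(C.value, axis=1)
    return np.flatnonzero(rows > 1e-3 * rows.max()), C.value
```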
Problem Formulation
The nonzero rows of C indicate the representatives: y_i is selected as a representative exactly when row c^i of C is nonzero.
Geometry of Representatives
The program minimizes the number of representatives that can reconstruct the collection of data points up to an ε error. To study the geometry, set ε = 0 (exact reconstruction).
Geometry of Representatives
Theorem. Let H be the convex hull of the columns of Y and let k be the number of vertices of H. Then, after a suitable permutation of the data points, the optimal solution has the block form

    C* = P^T [ I_k  Δ ; 0  0 ] P

where P is a permutation matrix, I_k is the k-dimensional identity matrix, and the elements of Δ lie in [0, 1). In other words, the representatives are the vertices of H.
Representatives of Subspaces
[Figure: the coefficient matrix corresponding to data from two subspaces.]
Practical Considerations and Extensions
Dealing with Outliers
Among the rows of the coefficient matrix:
○ a true data point has many nonzero elements (it participates in representing other points)
○ an outlier has just one nonzero element (it can only represent itself)
Practical Considerations and Extensions
Define the row-sparsity index (rsi) of each candidate representative:
○ for outliers, the rsi value is close to 1
○ for true representatives, the rsi value is close to 0
(See the sketch of one such index below.)
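The slide does not show the formula; one standard sparsity measure with exactly these limits (an assumption, used here for illustration) is rsi(i) = (√N - ||c^i||_1 / ||c^i||_2) / (√N - 1): a row with a single nonzero gives 1, a fully dense uniform row gives 0.

```python
import numpy as np

def row_sparsity_index(C):
    # Hypothetical rsi matching the slide's limits:
    # one-nonzero row -> 1 (outlier), uniform dense row -> 0 (representative).
    N = C.shape[1]
    l1 = np.abs(C).sum(axis=1)
    l2 = np.linalg.norm(C, axis=1) + 1e-12   # avoid division by zero
    return (np.sqrt(N) - l1 / l2) / (np.sqrt(N) - 1)

def reject_outliers(rep_idx, C, tau=0.9):
    # Keep candidates whose rsi stays below the (assumed) threshold tau.
    rsi = row_sparsity_index(C[rep_idx])
    return rep_idx[rsi < tau]
```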
Experimental Results: Video Summarization
The relaxed program is solved using Lagrange multipliers, i.e. in the unconstrained form min_C λ||C||_{1,q} + ½||Y - YC||_F^2. (See the toy example below.)
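As a toy illustration of the pipeline (the feature matrix is synthetic; smrs() is the hypothetical solver sketched in the relaxation section above):

```python
import numpy as np

# Pretend each column of Y is a feature vector for one video frame.
rng = np.random.default_rng(0)
Y = rng.standard_normal((300, 120))    # 300-dim features, 120 frames

reps, C = smrs(Y, lam=5.0)             # smrs() as sketched earlier
print("representative frames:", sorted(reps))
```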
Experimental Results
Investigate the effect of changing the regularization parameter λ: larger values of λ promote more row sparsity and therefore fewer representatives.
Experimental Results: Classification Using Representatives
Evaluate the performance of:
○ Sparse Modeling Representative Selection (SMRS) - the proposed algorithm
○ Kmedoids
○ Rank-Revealing QR (RRQR)
○ simple random selection of training data (Rand)
Experimental Results
Several standard classification algorithms:
○ Nearest Neighbor (NN)
○ Nearest Subspace (NS)
○ Sparse Representation-based Classification (SRC)
○ Linear Support Vector Machine (SVM)
(See the evaluation sketch below.)
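A minimal sketch of such an evaluation (the unified select() interface is a hypothetical wrapper; the slides do not specify one):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def evaluate_with_representatives(X_tr, y_tr, X_te, y_te, select, k):
    # Pick k representatives per class with `select` (SMRS, Kmedoids, RRQR,
    # or random), then train a 1-NN classifier on those points only.
    keep = []
    for c in np.unique(y_tr):
        idx = np.flatnonzero(y_tr == c)
        keep.extend(idx[select(X_tr[idx].T, k)])  # select() returns local column indices
    clf = KNeighborsClassifier(n_neighbors=1).fit(X_tr[keep], y_tr[keep])
    return clf.score(X_te, y_te)
```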
Experimental Results
Experiments are run on the USPS digits database and the Extended YaleB face database.
SRC and NS work well when the data in each class lie in a union of low-dimensional subspaces.
NN often needs enough training samples near each test sample to perform well.
Experimental Results: Outlier Rejection
A dataset of N = 1,024 images is built: a (1 - ρ) fraction of the images are randomly selected from the Extended YaleB face database, and a ρ fraction are random images downloaded from the internet.