1 Structured Sparse Principal Component Analysis Reading Group Presenter: Peng Zhang Cognitive Radio Institute Friday, October 01, 2010 Authors: Rodolphe Jenatton, Guillaume Obozinski, Francis Bach

2 Outline
■ Introduction (from an imaging perspective)
□ Principal Component Analysis (PCA)
□ Sparse PCA (SPCA)
□ Structured Sparse PCA (SSPCA)
■ Problem Statement
■ The SSPCA Algorithm
■ Experiments
■ Conclusion and Other Thoughts

3 Introduction (an imaging perspective)
■ The face recognition problem
□ A database contains a huge number of face images
□ How can a computer recognize different faces using this database?
■ The challenge
□ Huge amount of data
□ Computational complexity
■ The trick
□ Represent each face with a weighted "face dictionary"
▪ Similar to a code book in data compression
▪ Example: a 200 × 200 pixel face can be represented by 100 coefficients over the "face dictionary"
■ The solution
□ Principal component analysis (PCA)

4 PCA
■ PCA
□ A compression method
□ Given a large set of sample vectors {x}
□ Form the 2nd-moment (covariance) statistics of the samples
□ Eigen-decomposition of this matrix gives the "dictionary" and the "energy" of its elements (a minimal sketch follows)
▪ Eigenvectors {v} form the "dictionary"
▪ Eigenvalues {d} give the "energy" of the dictionary elements
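To make this concrete, here is a minimal NumPy sketch of PCA seen this way (the function and variable names are my own, not from the paper): the eigenvectors of the sample covariance matrix form the "dictionary" and the eigenvalues give the "energy" of each element.

```python
import numpy as np

def pca_dictionary(X):
    """PCA as eigen-decomposition of the 2nd-moment (covariance) matrix.

    X : (n_samples, n_features) array of sample vectors {x}.
    Returns the eigenvector "dictionary" V (as columns) and the eigenvalue
    "energies" d, sorted from largest to smallest energy.
    """
    Xc = X - X.mean(axis=0)                 # center the samples
    C = Xc.T @ Xc / (X.shape[0] - 1)        # sample covariance (2nd-moment statistics)
    d, V = np.linalg.eigh(C)                # eigh: C is symmetric
    order = np.argsort(d)[::-1]             # sort by decreasing eigenvalue ("energy")
    return V[:, order], d[order]
```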

5 PCA
■ The original signal can be represented using only part of the dictionary
▪ The data are compressed into fewer coefficients (see the sketch below)
■ Meaning of a "dictionary" element v:
□ It gives the weight of each entry of x in that component
■ The problem with PCA for face recognition: the "dictionary" has no physical meaning
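Continuing the sketch above (again with illustrative names), compression keeps only the top-k highest-energy dictionary elements, so each face is stored as k coefficients instead of all of its pixels:

```python
import numpy as np

def compress_and_reconstruct(X, V, k):
    """Keep only the k highest-energy dictionary elements.

    X : (n_samples, n_features) data
    V : (n_features, n_features) PCA dictionary, columns sorted by decreasing eigenvalue.
    Each sample is represented by k coefficients (its weights on the first k
    dictionary elements), e.g. 100 coefficients for a 200 x 200 pixel face.
    """
    mean = X.mean(axis=0)
    Vk = V[:, :k]                       # truncated dictionary
    coeffs = (X - mean) @ Vk            # k coefficients per sample
    X_hat = coeffs @ Vk.T + mean        # approximate reconstruction from the codes
    return coeffs, X_hat
```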

6 PCA
■ Face recognition
(Figure: face samples → PCA → the "dictionary" of eigen-faces)
These eigen-faces can reconstruct the original faces perfectly, but they make no sense as real faces

7 Structured SPCA
■ The SPCA goal:
□ Make the dictionary more interpretable
□ The "sparse" solution: limit the number of nonzeros in each dictionary element (a small off-the-shelf example follows)
(Figure: non-sparse eigen-faces from PCA vs. sparse eigen-faces from SPCA)
But the sparse eigen-faces are still meaningless most of the time
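As a point of reference only (this uses scikit-learn's generic SparsePCA routine on random stand-in data, not the formulation studied in this paper), the idea of limiting the nonzeros in each dictionary element can be tried off the shelf; alpha controls how sparse the eigen-faces become:

```python
import numpy as np
from sklearn.decomposition import SparsePCA

# X: (n_faces, n_pixels) matrix of vectorized face images.
# Random data here only as a stand-in for a real face database.
rng = np.random.RandomState(0)
X = rng.randn(100, 16 * 16)

# Larger alpha -> stronger l1 penalty -> fewer nonzero pixels per component.
spca = SparsePCA(n_components=10, alpha=2.0, random_state=0)
spca.fit(X)

sparse_eigenfaces = spca.components_          # (10, 256) sparse "dictionary"
nonzero_fraction = np.mean(sparse_eigenfaces != 0)
print(f"fraction of nonzero dictionary entries: {nonzero_fraction:.3f}")
```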

8 Structured SPCA
■ The new idea, SSPCA
□ Eigen-faces become meaningful when structured constraints are imposed
□ The meaningful areas of a face are constrained to lie on "grids", i.e. contiguous patches of pixels (one possible encoding is sketched below)
(Figure: eigen-faces from SSPCA)
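One way I can imagine encoding the "grid" idea (a hypothetical construction for illustration, not necessarily the paper's exact group design): define groups of pixels as half-planes of the image grid, so that zeroing out a union of groups leaves an axis-aligned rectangular patch as the support of the eigen-face.

```python
import numpy as np

def rectangular_groups(n_rows, n_cols):
    """Hypothetical groups of pixel indices for grid-structured sparsity.

    Each group is a half-plane of the image grid (everything above/below a row,
    or left/right of a column).  If a structured penalty zeroes out a union of
    such groups, the surviving nonzeros form an axis-aligned rectangular patch.
    Indices refer to row-major vectorized images.
    """
    idx = np.arange(n_rows * n_cols).reshape(n_rows, n_cols)
    groups = []
    for i in range(1, n_rows):
        groups.append(idx[:i, :].ravel())    # rows above row i
        groups.append(idx[i:, :].ravel())    # row i and everything below
    for j in range(1, n_cols):
        groups.append(idx[:, :j].ravel())    # columns left of column j
        groups.append(idx[:, j:].ravel())    # column j and everything to the right
    return groups

groups = rectangular_groups(20, 20)
print(len(groups), "half-plane groups for a 20x20 pixel grid")
```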

9 Structured SPCA
■ This paper's contributions
□ Add a "structure" constraint to make the dictionary more meaningful
□ Show how the constraint works
□ A meaningful dictionary is closer to the "true" dictionary
□ A meaningful dictionary is more robust against noise
□ A meaningful dictionary gives more accurate face recognition

10 Outline
■ Introduction
□ Principal Component Analysis (PCA)
□ Sparse PCA (SPCA)
□ Structured Sparse PCA (SSPCA)
■ Problem Statement
■ The SSPCA Algorithm
■ Experiments
■ Conclusion and Other Thoughts

11 Problem Statement
■ From SPCA to SSPCA
□ The optimization problem, in matrix-factorization form (up to normalization constants):
$$\min_{U,V}\ \tfrac{1}{2}\,\|X - U V^{\top}\|_F^2 \;+\; \lambda \sum_{k} \Omega(v^{k}) \quad \text{s.t. } \|u^{k}\|_2 \le 1$$
□ X is the sample matrix, U is the coefficient matrix, V is the dictionary
□ ||·||_F (on the data-fit term) and Ω(·) (the penalty on the dictionary) are different types of norms
□ The trick in SPCA
▪ Taking Ω to be the ℓ1 norm forces the dictionary to be sparse

12 Problem Statement
■ Structured SPCA, however, replaces the ℓ1 penalty with a mixed ℓ1/ℓ2 norm defined over a set of groups G with weights d:
$$\Omega(v) \;=\; \sum_{G \in \mathcal{G}} \big\| d^{G} \circ v \big\|_2$$
where each group G is a subset of indices and d^G is a weight vector supported on G (see the sketch below)
■ Right now the roles of G and d are still hard for me to grasp
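To make G and d concrete for myself, here is a small sketch of the mixed ℓ1/ℓ2 norm as I read the formulation (my own code, with illustrative groups and weights): an ℓ2 norm inside each, possibly overlapping, group, weighted by d, summed over groups in ℓ1 fashion.

```python
import numpy as np

def structured_norm(v, groups, weights):
    """Mixed l1/l2 norm: Omega(v) = sum_G || d^G * v_G ||_2.

    v       : (p,) dictionary element (e.g., a vectorized eigen-face)
    groups  : list of index arrays, the (possibly overlapping) groups G
    weights : list of arrays, weights d^G for the entries inside each group
    The outer sum acts like an l1 norm over groups (it drives whole groups to
    zero); the inner l2 norm expresses no preference within a group.
    """
    return sum(np.linalg.norm(w * v[g]) for g, w in zip(groups, weights))

# Tiny usage example with two overlapping groups on a 4-dimensional v.
v = np.array([0.0, 2.0, -1.0, 0.0])
groups = [np.array([0, 1]), np.array([1, 2, 3])]
weights = [np.ones(2), np.ones(3)]
print(structured_norm(v, groups, weights))   # 2.0 + sqrt(5)
```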

13 Problem Statement
■ In short, the norm constraints have the following effects
□ The dictionary elements acquire structure
□ All nonzeros of each dictionary element are confined to a grid-shaped region (e.g., a rectangular patch of pixels)

14 Outline
■ Introduction
□ Principal Component Analysis (PCA)
□ Sparse PCA (SPCA)
□ Structured Sparse PCA (SSPCA)
■ Problem Statement
■ The SSPCA Algorithm
■ Experiments
■ Conclusion and Other Thoughts

15 The SSPCA Algorithm
■ Making the dictionary sparser
□ The norm Ω(·), built from the groups G and weights d, is the sparsity-inducing penalty
□ The new SSPCA problem: the same matrix-factorization objective as before, with Ω(v^k) replacing the plain ℓ1 penalty on each dictionary element

16 The SSPCA Algorithm
■ The problem is not jointly convex in (U, V), but it is convex in each factor with the other fixed, so it is solved as a sequence of convex subproblems, alternating between updates of U and V (a rough sketch follows)
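A rough sketch of such an alternating scheme, using a plain ℓ1 penalty so that the V-step is a simple soft-threshold (the paper's actual algorithm handles the structured norm Ω and uses a more sophisticated reformulation, so treat this only as an illustration of alternating convex subproblems; all names are my own):

```python
import numpy as np

def sparse_pca_alternating(X, r=5, lam=0.1, n_iter=50, seed=0):
    """Alternating minimization for  min_{U,V} 0.5*||X - U V^T||_F^2 + lam*||V||_1
                                     s.t. ||u_k||_2 <= 1.

    Each sub-problem is convex: U for fixed V (least squares, then a heuristic
    clipping of columns to the unit ball), V for fixed U (one proximal-gradient
    step, i.e. a gradient step followed by l1 soft-thresholding).
    """
    n, p = X.shape
    rng = np.random.RandomState(seed)
    U = rng.randn(n, r)
    V = rng.randn(p, r)
    for _ in range(n_iter):
        # --- U-step: least squares, then scale columns with norm > 1 back to 1 ---
        U = X @ V @ np.linalg.pinv(V.T @ V)
        norms = np.maximum(np.linalg.norm(U, axis=0), 1.0)
        U = U / norms
        # --- V-step: gradient step on the smooth part, then soft-threshold ---
        L = np.linalg.norm(U.T @ U, 2) + 1e-12      # Lipschitz constant of the gradient
        grad = V @ (U.T @ U) - X.T @ U
        Z = V - grad / L
        V = np.sign(Z) * np.maximum(np.abs(Z) - lam / L, 0.0)
    return U, V

# Usage on random stand-in data (illustrative only).
X = np.random.RandomState(1).randn(60, 100)
U, V = sparse_pca_alternating(X, r=5, lam=0.5)
print("nonzero fraction of dictionary V:", np.mean(V != 0))
```

Swapping the soft-thresholding line for the proximal operator of the structured norm Ω would turn this loop into a structured variant of the same idea.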

17 Excerpt from Author's slide
■ Excerpt from the authors' slides (figure only; no text captured in this transcript)

18 Excerpt from Author's slide
(figure only; no text captured in this transcript)

19–25 (These slides contained only images; no text was captured in this transcript.)

26 Outline
■ Introduction
□ Principal Component Analysis (PCA)
□ Sparse PCA (SPCA)
□ Structured Sparse PCA (SSPCA)
■ Problem Statement
■ The SSPCA Algorithm
■ Experiments
■ Conclusion and Other Thoughts

27 Conclusion and Other Thoughts
■ Conclusion
□ This paper shows how to formulate and use SSPCA
□ SSPCA achieves better performance in denoising, face recognition, and classification
■ Other thoughts
□ For communication signals, the meaningful dictionary is usually the Fourier dictionary
□ But the Fourier dictionary may not fit transient or time-varying signals
□ How can we manipulate G, d, and the norms to set constraints that match our own needs? (a toy idea is sketched below)
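As a toy answer to that last question (purely hypothetical, not from the paper): if the dictionary lives in the Fourier domain, the groups G could be contiguous frequency bands, so the structured penalty would favor dictionary elements concentrated in a few bands; the weights d could then emphasize or de-emphasize particular bands.

```python
import numpy as np

def frequency_band_groups(n_freqs, band_width):
    """Hypothetical groups for a Fourier-domain dictionary: contiguous bands.

    Penalizing these groups with the structured norm would push each dictionary
    element to use only a few frequency bands, analogous to the rectangular
    pixel patches in the face example.
    """
    starts = range(0, n_freqs, band_width)
    return [np.arange(s, min(s + band_width, n_freqs)) for s in starts]

bands = frequency_band_groups(n_freqs=256, band_width=16)
print(len(bands), "frequency-band groups of width 16")
```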

28 THANK YOU!

