PCA & LDA for Face Recognition


1 PCA & LDA for Face Recognition
2017/4/21 2010 Scientific Computing: PCA & LDA for Face Recognition. J.-S. Roger Jang (張智星), CS Dept., Tsing Hua Univ., Taiwan.

2 Face Recognition
Image database of known faces (A, B, C, ...); test image: who is this person?
Characteristics of FR:
A mode of biometric identification
Easy for humans, hard for machines

3 Biometric Identification
Identification of people from their physical characteristics, such as faces, voices, fingerprints, palm prints, hand vein distributions, hand shapes and sizes, and retinal scans.

4 FR via PCA
First paper: M. Turk and A. Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, vol. 3, no. 1, pp. 71–86, 1991.
Characteristics:
Efficient computation
Proven mathematics
Applicable to face detection

5 Problem Definition
Input:
A dataset of face images of n persons
An unknown person's face image
Output:
The identity of the unknown person

6 AT&T Face Dataset
Origin: Olivetti Research Laboratory, 1992–1994
Stats: 40 subjects, each with 10 images
Characteristics:
Same-size black-and-white photos
Centered faces in different poses

7 Face Recognition via PCA
Compute mean face → compute eigenvectors (eigenfaces) → select 6 principal eigenfaces → facial signatures.
Basic algorithm: calculate the mean face, subtract it from all 400 images, and compute the eigenvectors of the entire image dataset (the eigenfaces). These give the principal components from which the images can be built up, in order of significance. To reduce the data, take the top 6 eigenfaces and, based on them, compute a set of signatures for the known faces; the reconstruction formula shows how a face is rebuilt from these components. Note where the big data is in this application: all images must be processed at once to derive the optimum eigenvectors, but doing so achieves great data reduction. As the number of faces in the database increases, the number of principal eigenfaces also increases, but not linearly; the literature on this topic suggests that only about 100 eigenfaces are needed to uniquely identify anybody.

8 Steps of Feature Extraction via PCA
Three simple steps:
Data preprocessing: each sample image is rearranged into a column vector of length 112*92 = 10304. All images are put into a matrix F of size 10304x400, and the mean face is subtracted from each column.
PCA: find the eigenvectors of F*F'.
Projection: select the top k eigenvectors with the k largest eigenvalues → k eigenfaces! Project along these eigenfaces to find new features for classification.
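The three steps above can be sketched in NumPy (a stand-in for the talk's MATLAB code); the `images` array below is random illustration data, not the actual AT&T faces:

```python
import numpy as np

# Random stand-in for 400 face images of size 112 x 92 (hypothetical data).
rng = np.random.default_rng(0)
images = rng.random((400, 112, 92))

# Step 1: data preprocessing -- flatten each image into a 10304-vector,
# stack them as the columns of F, and subtract the mean face.
F = images.reshape(400, -1).T                     # 10304 x 400
mean_face = F.mean(axis=1, keepdims=True)
F = F - mean_face

# Step 2: PCA -- eigenvectors of F*F', obtained here via the economy-size
# SVD, whose left singular vectors are exactly those eigenvectors.
U, s, _ = np.linalg.svd(F, full_matrices=False)   # U: 10304 x 400

# Step 3: projection -- keep the top-k eigenfaces and project.
k = 28
eigenfaces = U[:, :k]                             # 10304 x k
features = eigenfaces.T @ F                       # k x 400 feature matrix

print(features.shape)                             # (28, 400)
```

The singular values `s` are the square roots of the eigenvalues of F*F', so the columns of U already come sorted by decreasing eigenvalue.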

9 Details for Step 2: PCA
Problem: F*F' is large, 10304x10304 (849 MB in double precision)! How can we compute its eigenvectors?
Observation: if u is an eigenvector of F'F with eigenvalue λ, then Fu is an eigenvector of FF' with the same eigenvalue λ, since FF'(Fu) = F(F'Fu) = λFu.
Note that FF' has 10304 eigenvalues, while F'F has only 400, corresponding to the 400 largest eigenvalues of FF'.
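A minimal NumPy sketch of this trick, using a random stand-in for the mean-subtracted data matrix F:

```python
import numpy as np

# Hypothetical 10304 x 400 mean-subtracted data matrix (random stand-in).
rng = np.random.default_rng(0)
F = rng.random((10304, 400))
F = F - F.mean(axis=1, keepdims=True)

# Eigen-decompose the small 400 x 400 matrix F'F instead of the
# huge 10304 x 10304 matrix FF'.
small = F.T @ F                        # 400 x 400
eigvals, U = np.linalg.eigh(small)     # eigenvalues in ascending order

# Map a small eigenvector u to the big eigenvector Fu, then normalize.
u = U[:, -1]                           # eigenvector of F'F, largest eigenvalue
v = F @ u                              # eigenvector of FF', same eigenvalue
v = v / np.linalg.norm(v)

# Check: (FF')v equals eigval * v, without ever forming FF'.
lhs = F @ (F.T @ v)
print(np.allclose(lhs, eigvals[-1] * v))   # True
```

Note that the 10304x10304 product is never formed; only matrix-vector products with F and F' are needed.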

10 Details for Step 3: Projection (1/2)
Each face x (minus the mean face m) in the training set can be represented as a linear combination of the best k eigenvectors u_1, ..., u_k:
x - m ≈ w_1*u_1 + w_2*u_2 + ... + w_k*u_k
Typical eigenfaces when k = 4.

11 Details for Step 3: Projection (2/2)
Since {u_1, ..., u_k} is an orthonormal basis, any face (after mean subtraction) can be represented in this basis. The feature vector of a face x is the vector of new coordinates:
w_i = u_i'(x - m), for i = 1, ..., k
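The projection can be sketched as follows, with a random orthonormal basis and random vectors standing in for the real eigenfaces, mean face, and test face:

```python
import numpy as np

# Hypothetical stand-ins: any orthonormal d x k basis, a "mean face" m,
# and a face image x as a d-vector.
rng = np.random.default_rng(1)
d, k = 10304, 4
Uk, _ = np.linalg.qr(rng.random((d, k)))   # orthonormal columns
m = rng.random(d)
x = rng.random(d)

w = Uk.T @ (x - m)                         # k-dimensional feature vector
x_hat = m + Uk @ w                         # reconstruction from k eigenfaces

# Orthonormality makes w the least-squares optimal coefficients:
# the residual is orthogonal to the basis.
print(np.allclose(Uk.T @ (x - x_hat), 0))  # True
```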

12 Classification
Once the features are extracted, we can apply any classification method to obtain the final recognition result, including:
Minimum distance classifier
Support vector machines
Neural networks
Quadratic classifier
Gaussian mixture models
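For instance, the minimum distance (nearest-class-mean) classifier can be sketched in NumPy on random stand-in features shaped like the AT&T setup (40 classes, 10 samples each, 28 dimensions):

```python
import numpy as np

# Random stand-in features: 40 classes x 10 samples x 28 dimensions,
# with a per-class offset so the classes are separable (hypothetical data).
rng = np.random.default_rng(0)
n_class, n_per, dim = 40, 10, 28
feats = rng.random((n_class, n_per, dim)) + 5 * rng.random((n_class, 1, dim))
labels = np.repeat(np.arange(n_class), n_per)

# Class means estimated from the training features.
means = feats.mean(axis=1)                     # 40 x 28

def classify(x, means):
    """Return the class whose mean is closest to x in Euclidean distance."""
    return int(np.argmin(np.linalg.norm(means - x, axis=1)))

# On well-separated data, each sample's own class mean is usually closest.
pred = np.array([classify(x, means) for x in feats.reshape(-1, dim)])
acc = np.mean(pred == labels)
print(acc)
```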

13 Face Detection Using Eigenfaces

14 Distance from Face Space (DFFS)
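DFFS (the quantity used for face detection above) can be sketched as the norm of the part of an image that the eigenfaces cannot represent: small for face-like images, large otherwise. The basis and mean below are random stand-ins:

```python
import numpy as np

# Hypothetical eigenface basis and mean face (random stand-ins).
rng = np.random.default_rng(2)
d, k = 10304, 28
Uk, _ = np.linalg.qr(rng.random((d, k)))   # orthonormal d x k basis
m = rng.random(d)

def dffs(x, Uk, m):
    """Norm of the residual after projecting x - m onto the eigenfaces."""
    y = x - m
    return float(np.linalg.norm(y - Uk @ (Uk.T @ y)))

# An image built inside the face space has near-zero DFFS;
# an arbitrary image does not.
in_space = m + Uk @ rng.random(k)
off_space = rng.random(d)
print(dffs(in_space, Uk, m), dffs(off_space, Uk, m))
```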

15 PCA for AT&T Dataset
(figure: variance vs. number of eigenvalues used; the first 16 eigenfaces)

16 PCA for AT&T Dataset: Accuracy
Accuracy vs. number of eigenvalues used → an accuracy of 98.50% is achieved when the dimensionality is 28.

17 PCA for AT&T Dataset: DFFS

18 PCA for AT&T Dataset: Similarity

19 PCA for AT&T Dataset: Demo
Face recognition via PCA (eigenfaces):
load faceData.mat
frOpt.method = 'pca';
frOpt.pcaDim = 7;
frOpt.plot = 1;
faceRecogDemo(faceData, frOpt);

20 PCA+LDA for FR
Steps for FR via fisherfaces:
Perform PCA to reduce to 60 dimensions
Perform LDA to find the best dimensionality
→ An accuracy of 99.00% is achieved when the dimensionality is 14.
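The fisherface pipeline's LDA step can be sketched in NumPy on random stand-in PCA features (60-dimensional, reduced to 14 as in the slide); the scatter-matrix formulation below is the standard LDA recipe, not the talk's actual code:

```python
import numpy as np

# Random stand-in PCA features: 40 classes x 10 samples x 60 dims,
# with a per-class offset (hypothetical data).
rng = np.random.default_rng(0)
n_class, n_per, pca_dim, lda_dim = 40, 10, 60, 14
X = rng.random((n_class, n_per, pca_dim)) + 3 * rng.random((n_class, 1, pca_dim))

# Within-class (Sw) and between-class (Sb) scatter matrices.
class_means = X.mean(axis=1)                       # 40 x 60
grand_mean = class_means.mean(axis=0)
Sw = np.zeros((pca_dim, pca_dim))
Sb = np.zeros((pca_dim, pca_dim))
for c in range(n_class):
    dev = X[c] - class_means[c]
    Sw += dev.T @ dev
    diff = (class_means[c] - grand_mean)[:, None]
    Sb += n_per * (diff @ diff.T)

# Fisher directions: top eigenvectors of Sw^{-1} Sb
# (maximize between-class over within-class scatter).
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
order = np.argsort(-eigvals.real)
W = eigvecs[:, order[:lda_dim]].real               # 60 x 14 projection

features = X.reshape(-1, pca_dim) @ W              # 400 x 14 LDA features
print(features.shape)                              # (400, 14)
```

Running PCA first keeps Sw well-conditioned (invertible), which is why the fisherface recipe reduces to 60 dimensions before applying LDA.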

