EE4-62 MLCV Lecture 13-14: Face Recognition – Subspace/Manifold Learning
Tae-Kyun Kim

Face Image Tagging and Retrieval
Face tagging at commercial weblogs. Key issues:
– User interaction for face tags
– Representation of long-term accumulated data
– Online and efficient learning
An active research area in the Face Recognition Test, and in MPEG-7 for face image retrieval and automatic passport control; our proposal was promoted to the MPEG-7 ISO/IEC standard.

Principal Component Analysis (PCA)
– Maximum variance formulation of PCA
– Minimum-error formulation of PCA
– Probabilistic PCA

Maximum Variance Formulation of PCA
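In brief (the equation slides are not reproduced in this transcript; notation follows Bishop, Pattern Recognition and Machine Learning, Ch. 12, on which these lectures draw): given data $x_1, \dots, x_N \in \mathbb{R}^D$, the first principal component is the unit vector $u_1$ maximising the variance of the projections $z_n = u_1^{\mathsf T} x_n$:
\[
\bar{x} = \frac{1}{N}\sum_{n=1}^{N} x_n ,
\qquad
S = \frac{1}{N}\sum_{n=1}^{N} (x_n - \bar{x})(x_n - \bar{x})^{\mathsf T} ,
\]
\[
\max_{u_1}\; u_1^{\mathsf T} S\, u_1
\quad \text{subject to} \quad u_1^{\mathsf T} u_1 = 1 .
\]
A Lagrange multiplier $\lambda_1$ gives $S u_1 = \lambda_1 u_1$: the optimal $u_1$ is the eigenvector of the covariance matrix $S$ with the largest eigenvalue, and the projected variance equals $\lambda_1$. Subsequent components are the eigenvectors with the next-largest eigenvalues, taken orthogonal to those already chosen.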


Minimum-Error Formulation of PCA
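Sketched briefly (same notation as above): each point is approximated by its projection onto an $M$-dimensional subspace with orthonormal basis $u_1, \dots, u_M$, and the mean squared reconstruction error is minimised:
\[
J = \frac{1}{N}\sum_{n=1}^{N} \lVert x_n - \tilde{x}_n \rVert^2 ,
\qquad
\tilde{x}_n = \bar{x} + \sum_{i=1}^{M} \bigl( u_i^{\mathsf T} (x_n - \bar{x}) \bigr)\, u_i .
\]
The optimal basis consists of the top-$M$ eigenvectors of $S$, and the minimum error is the sum of the discarded eigenvalues, $J = \sum_{i=M+1}^{D} \lambda_i$; the minimum-error and maximum-variance formulations therefore select the same subspace.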


Applications of PCA to Face Recognition

(Recap) Geometrical interpretation of PCA
Principal components are the vectors in the directions of maximum variance of the projected samples. For given 2D data points, u1 and u2 are found as the principal components. Each two-dimensional point is transformed into a single variable z1, its projection onto the eigenvector u1; the data projected onto u1 have the maximum variance. PCA thus infers the inherent structure of high-dimensional data, whose intrinsic dimensionality is often much smaller.

Eigenfaces
Collect a set of face images and normalise them for scale and orientation (using the eye locations). Each w × h image is vectorised into a D-dimensional point, D = wh. Construct the covariance matrix of these vectors and obtain its eigenvectors, as sketched below.
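A minimal MATLAB sketch of this step, assuming the normalised images are already loaded as the columns of a D-by-N matrix X (variable names here are illustrative, not taken from the course demo):

  % X: D-by-N, one vectorised h-by-w face image per column (D = w*h).
  N  = size(X, 2);
  mu = mean(X, 2);                           % the mean face
  Xc = bsxfun(@minus, X, mu);                % centre the data
  S  = (Xc * Xc') / N;                       % D-by-D covariance matrix
  [U, L] = eig(S);                           % eigenvectors / eigenvalues of S
  [lambda, idx] = sort(diag(L), 'descend');  % order by decreasing variance
  U = U(:, idx);                             % columns of U are the eigenfaces

In practice N is much smaller than D for face images, so one would instead take the eigenvectors V of the N-by-N matrix Xc'*Xc and map them back via U = Xc*V (renormalising each column to unit length), which yields the same nonzero-eigenvalue eigenvectors far more cheaply.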

Eigenfaces (cont.)
Project the data onto the subspace spanned by the leading eigenvectors; the reconstruction of an image is obtained from its projection coefficients and the mean face. Use the distance to the subspace, i.e. the reconstruction error, for face recognition (see the sketch below).
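Continuing the same illustrative MATLAB sketch (M is the chosen number of components; x is a new D-by-1 test image):

  W = U(:, 1:M);                    % top-M eigenfaces spanning the subspace
  Z = W' * Xc;                      % M-by-N projection coefficients
  Xrec = bsxfun(@plus, W * Z, mu);  % reconstructions from M coefficients

  % Distance of a test image x to the face subspace:
  xc = x - mu;
  z  = W' * xc;                     % coefficients of x
  d  = norm(xc - W * z);            % reconstruction error = distance to subspace
  % Recognition: nearest neighbour on z against the stored columns of Z;
  % a large d flags the input as a non-face.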


Matlab Demos – Face Recognition by PCA

The demos cover:
– Face images
– Eigenvectors and the eigenvalue plot
– Face image reconstruction
– Projection coefficients (visualisation of high-dimensional data)
– Face recognition
(A plotting sketch in the same spirit follows below.)
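For example, the plots could be produced along these lines; this is a sketch only, reusing the assumed variables lambda, Z, Xrec, w, h from the snippets above, not the actual demo code:

  figure; plot(lambda, '.-');                 % eigenvalue spectrum
  xlabel('component index'); ylabel('eigenvalue (variance)');

  figure; scatter(Z(1,:), Z(2,:));            % each face as its first two coefficients:
  xlabel('z_1'); ylabel('z_2');               % a 2D view of the high-dimensional data

  figure; imagesc(reshape(Xrec(:,1), h, w));  % first reconstructed face
  colormap gray; axis image;                  % assumes column-major vectorisation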

Probabilistic PCA
The subspace is spanned by an orthonormal basis (the eigenvectors computed from the covariance matrix). Each observation can instead be interpreted with a generative model, estimating (approximately) the probability of generating each observation from a Gaussian distribution. PCA amounts to a uniform prior on the subspace; PPCA uses a Gaussian distribution.
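Concretely, the PPCA generative model (the standard Tipping and Bishop formulation, which the following slides develop) is
\[
x = W z + \mu + \epsilon ,
\qquad
z \sim \mathcal{N}(0, I_M) ,
\qquad
\epsilon \sim \mathcal{N}(0, \sigma^2 I_D) ,
\]
so that marginally
\[
x \sim \mathcal{N}\bigl( \mu ,\; W W^{\mathsf T} + \sigma^2 I_D \bigr) ,
\]
where the columns of the $D \times M$ matrix $W$ span the principal subspace and $\sigma^2$ is an isotropic noise variance.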

Continuous Latent Variables


Probabilistic PCA


Maximum Likelihood PCA
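The closed-form result (again after Tipping and Bishop; the derivation slides are not reproduced here) is
\[
W_{\mathrm{ML}} = U_M \bigl( L_M - \sigma^2 I_M \bigr)^{1/2} R ,
\qquad
\sigma^2_{\mathrm{ML}} = \frac{1}{D - M} \sum_{i = M+1}^{D} \lambda_i ,
\]
where $U_M$ holds the top-$M$ eigenvectors of the sample covariance $S$, $L_M = \mathrm{diag}(\lambda_1, \dots, \lambda_M)$, and $R$ is an arbitrary $M \times M$ orthogonal matrix. The maximum-likelihood noise variance is the average of the discarded eigenvalues, and the ML subspace coincides with the standard PCA subspace.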


Limitations of PCA

PCA vs LDA (Linear Discriminant Analysis)
PCA is unsupervised learning: it ignores class labels, so its directions of maximum variance need not separate the classes; LDA is supervised and seeks the projection that best discriminates them.
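For contrast, a sketch of the LDA objective (the standard Fisher criterion; the scatter matrices are defined here, not taken from the slides):
\[
w^{\star} = \arg\max_{w} \; \frac{w^{\mathsf T} S_B\, w}{w^{\mathsf T} S_W\, w} ,
\]
where $S_B$ is the between-class scatter and $S_W$ the within-class scatter; the solution is given by the leading eigenvectors of $S_W^{-1} S_B$, and unlike PCA it uses the class labels.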

PCA vs Kernel PCA
PCA is a linear model: a linear manifold is a subspace. Kernel PCA captures a nonlinear manifold by performing PCA in a nonlinear feature space.
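In outline (standard kernel PCA, stated here for completeness): with a kernel $k(x_i, x_j) = \phi(x_i)^{\mathsf T} \phi(x_j)$ for an implicit feature map $\phi$, PCA in feature space reduces to an eigenproblem on the $N \times N$ kernel matrix,
\[
K_{ij} = k(x_i, x_j) , \qquad K \alpha = N \lambda\, \alpha ,
\]
and the projection of a point $x$ onto the $m$-th nonlinear component is
\[
z_m(x) = \sum_{n=1}^{N} \alpha^{(m)}_n\, k(x, x_n) ,
\]
with $K$ centred in feature space and the $\alpha^{(m)}$ normalised so that the feature-space eigenvectors have unit length; $\phi$ is never computed explicitly.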

PCA vs ICA (Independent Component Analysis)
PCA rests on a Gaussian distribution assumption; ICA instead seeks statistically independent, non-Gaussian components. [Figure: the same data shown with principal axes PC1, PC2 and independent axes IC1, IC2.]

[Final figure, captioned "(also by ICA)".]