Published by Unique Rankin. Modified over 10 years ago.
1
EigenFaces and EigenPatches
Useful model of variation in a region
– Region must be a fixed shape (e.g. a rectangle)
Developed for face recognition
Generalised for
– face location
– object location/recognition
2
Overview
Model of variation in a region
3
Overview of Construction
1. Mark face region on training set
2. Sample region
3. Normalise
4. Statistical analysis
4
Sampling a region
– Must sample at equivalent points across the region
– Place a grid on the image and rotate/scale it as necessary
– Use interpolation to sample the image at each grid node
5
Interpolation
Pixel values are known at integer positions.
– What is a suitable value at a non-integer position?
[Figure: values known at integer positions; estimate the value in between]
6
Interpolation in 1D
Estimate a continuous function, f(x), that passes through a set of points (i, g(i)).
[Figure: plot of f(x) against x]
7
1D Interpolation techniques
[Figure: plots of f(x) against x for nearest neighbour, linear, and cubic interpolation]
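The nearest-neighbour and linear cases can be sketched in a few lines (a minimal illustration, assuming the samples g(i) are stored in a 1D array; the function names are ours, not the lecture's):

```python
import numpy as np

def interp_nearest(g, x):
    """Nearest neighbour: take the value at the closest integer position."""
    return g[int(round(x))]

def interp_linear(g, x):
    """Linear: weighted average of the two surrounding integer positions."""
    i = int(np.floor(x))
    a = x - i                                   # fractional part in [0, 1)
    return (1 - a) * g[i] + a * g[min(i + 1, len(g) - 1)]

g = np.array([0.0, 10.0, 20.0, 10.0])           # values g(i) at i = 0..3
print(interp_nearest(g, 1.2))                   # 10.0
print(interp_linear(g, 1.5))                    # 15.0
```

Cubic interpolation fits a third-order polynomial through four neighbouring samples and is smoother, at higher cost.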
8
2D Interpolation
Extension of the 1D case: nearest neighbour, bilinear.
– Bilinear: interpolate in y at x=0 and at x=1, then interpolate in x between the two results.
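Bilinear interpolation as described above can be sketched as follows (a minimal illustration; the function name and toy image are ours):

```python
import numpy as np

def bilinear(img, x, y):
    """Bilinear interpolation of img at non-integer position (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)
    y1 = min(y0 + 1, img.shape[0] - 1)
    ax, ay = x - x0, y - y0
    # interpolate in y at the two surrounding integer x positions
    f_x0 = (1 - ay) * img[y0, x0] + ay * img[y1, x0]
    f_x1 = (1 - ay) * img[y0, x1] + ay * img[y1, x1]
    # then interpolate in x between the two results
    return (1 - ax) * f_x0 + ax * f_x1

img = np.array([[0.0, 4.0],
                [8.0, 12.0]])
print(bilinear(img, 0.5, 0.5))   # 6.0 (centre of the four pixels)
```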
9
Representing Regions
Represent each region as a vector g
– Raster-scan the values
– An n x m region gives a vector g of length nm
10
Normalisation
Allow for global lighting variations.
Common linear approach
– Shift and scale so that the mean of the elements is zero and the variance of the elements is 1
Alternative non-linear approach
– Histogram equalization: transforms the values so that there are similar numbers of each grey-scale value
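The linear normalisation can be sketched as follows (a minimal illustration; the function name and sample values are ours):

```python
import numpy as np

def normalise(g):
    """Shift and scale so the elements have zero mean and unit variance."""
    g = np.asarray(g, dtype=float)
    return (g - g.mean()) / g.std()

# a raster-scanned region as a vector (toy values)
g = np.array([10.0, 20.0, 30.0, 40.0])
gn = normalise(g)
print(gn.mean(), gn.var())   # ~0.0, ~1.0
```

Because the result is invariant to g -> a*g + c, a global brightness shift or contrast change leaves the normalised vector unchanged.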
11
Review of Construction
1. Mark face region on training set
2. Sample region
3. Normalise
4. Statistical analysis – the fun step
12
Multivariate Statistical Analysis
Need to model the distribution of the normalised vectors, so that we can
– generate plausible new examples
– test whether a new region is similar to the training set
– classify regions
13
Fitting a Gaussian
The mean μ and covariance matrix S of the data define a Gaussian model:
  μ = (1/N) Σ_i x_i,   S = (1/N) Σ_i (x_i - μ)(x_i - μ)^T
14
Principal Component Analysis
Compute the eigenvectors of the covariance matrix S
– Eigenvectors: the main directions of variation
– Eigenvalue: the variance along its eigenvector
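PCA as described above can be sketched on toy data (a minimal illustration assuming numpy; the mixing matrix and variable names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
# toy data: 200 samples of a 3-D vector with correlated coordinates
X = rng.normal(size=(200, 3)) @ np.array([[3.0, 0.0, 0.0],
                                          [1.0, 1.0, 0.0],
                                          [0.0, 0.0, 0.1]])
mu = X.mean(axis=0)
S = np.cov(X, rowvar=False)          # covariance matrix of the data
evals, evecs = np.linalg.eigh(S)     # eigh: S is symmetric
order = np.argsort(evals)[::-1]      # sort by decreasing variance
evals, evecs = evals[order], evecs[:, order]

# the variance of the data along eigenvector i equals eigenvalue i
b = (X - mu) @ evecs                 # project onto the eigenvectors
print(np.allclose(b.var(axis=0, ddof=1), evals))   # True
```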
15
Eigenvector Decomposition
If A is a square matrix, then an eigenvector of A is a vector, p, such that
  A p = λ p
for some scalar λ (the eigenvalue). Usually p is scaled to have unit length, |p| = 1.
16
Eigenvector Decomposition
If K is an n x n covariance matrix, there exist n linearly independent eigenvectors, and all the corresponding eigenvalues are non-negative. We can decompose K as
  K = P Λ P^T
where P = (p_1 | p_2 | … | p_n) has the eigenvectors as its columns and Λ = diag(λ_1, …, λ_n).
17
Eigenvector Decomposition
Recall that a normal pdf has the exponent
  -(1/2) (x - μ)^T K^{-1} (x - μ)
The inverse of the covariance matrix is
  K^{-1} = P Λ^{-1} P^T
18
Fun with Eigenvectors
The normal distribution has the form
  p(x) = (2π)^{-n/2} |K|^{-1/2} exp( -(1/2) (x - μ)^T K^{-1} (x - μ) )
19
Fun with Eigenvectors
Consider the transformation
  b = P^T (x - μ)
so that x = μ + P b.
20
Fun with Eigenvectors
The exponent of the distribution becomes
  -(1/2) (x - μ)^T K^{-1} (x - μ) = -(1/2) b^T Λ^{-1} b = -(1/2) Σ_i b_i^2 / λ_i
21
Normal distribution
Thus, by applying the transformation b = P^T (x - μ), the normal distribution simplifies to a product of independent 1D Gaussians:
  p(b) = Π_i (2π λ_i)^{-1/2} exp( -b_i^2 / (2 λ_i) )
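This simplification can be checked numerically: after transforming with b = P^T (x - μ), the product of 1D Gaussian densities matches the full multivariate density (a sketch on toy data; all names here are ours):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.normal(size=(n, n))
K = A @ A.T                          # a valid (symmetric, PSD) covariance
mu = rng.normal(size=n)
evals, P = np.linalg.eigh(K)         # K = P diag(evals) P^T

x = rng.normal(size=n)
b = P.T @ (x - mu)                   # the transformation from the slides

# full multivariate normal density at x
p_full = np.exp(-0.5 * (x - mu) @ np.linalg.inv(K) @ (x - mu)) \
         / np.sqrt((2 * np.pi) ** n * np.linalg.det(K))
# product of independent 1-D Gaussians in the transformed coordinates
p_prod = np.prod(np.exp(-0.5 * b**2 / evals) / np.sqrt(2 * np.pi * evals))
print(np.allclose(p_full, p_prod))   # True
```

This works because |K| = Π_i λ_i and the exponent splits into the sum of the b_i^2 / λ_i terms.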
22
Dimensionality Reduction
– Coordinates are often correlated
– Nearby points move together
23
Dimensionality Reduction
The data lies (approximately) in a subspace of reduced dimension: for some t < n,
  λ_i ≈ 0 for i > t
24
Approximation
Each data example can then be written as
  x ≈ μ + Σ_{i=1}^{t} b_i p_i
using only the first t eigenvectors.
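The approximation can be demonstrated on toy data that lies close to a low-dimensional subspace (a sketch; the data construction and names are ours):

```python
import numpy as np

rng = np.random.default_rng(2)
# toy data lying close to a 2-D subspace of a 5-D space
B = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5))
X = B + 0.01 * rng.normal(size=(100, 5))       # small off-subspace noise
mu = X.mean(axis=0)
evals, P = np.linalg.eigh(np.cov(X, rowvar=False))
P = P[:, np.argsort(evals)[::-1]]              # columns sorted by variance

t = 2
Pt = P[:, :t]                                  # first t eigenvectors
x = X[0]
b = Pt.T @ (x - mu)                            # the t model parameters
x_approx = mu + Pt @ b                         # x ≈ mu + sum_i b_i p_i
print(np.linalg.norm(x - x_approx))            # small (≈ noise level)
```

Five numbers per example are replaced by two, with only a noise-level reconstruction error.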
25
Normal PDF
In the reduced space, the model pdf is
  p(b) = Π_{i=1}^{t} (2π λ_i)^{-1/2} exp( -b_i^2 / (2 λ_i) )
26
Useful Trick
If x is of high dimension, S is huge (dim(x) x dim(x)).
If the number of samples N < dim(x), use the smaller N x N matrix instead: with D the matrix whose columns are the deviations (x_i - μ), S = (1/N) D D^T; the eigenvectors p'_i of (1/N) D^T D give the eigenvectors of S as p_i ∝ D p'_i, with the same eigenvalues.
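The trick can be verified numerically: each eigenvector of the small N x N matrix, mapped up through D, is an eigenvector of the big covariance with the same eigenvalue (a sketch; the sizes and names are ours):

```python
import numpy as np

rng = np.random.default_rng(3)
N, d = 10, 100                     # far fewer samples than dimensions
X = rng.normal(size=(N, d))
mu = X.mean(axis=0)
D = (X - mu).T                     # d x N matrix of deviations

# eigendecompose the small N x N matrix (1/N) D^T D
# instead of the huge d x d matrix S = (1/N) D D^T
small = (D.T @ D) / N
evals, V = np.linalg.eigh(small)

# map the largest small-matrix eigenvector back up: p ∝ D v
i = np.argmax(evals)
p = D @ V[:, i]
p /= np.linalg.norm(p)             # rescale to unit length

S = (D @ D.T) / N                  # the big covariance (for checking only)
print(np.allclose(S @ p, evals[i] * p))   # True: S p = λ p
```

The check works because S (D v) = D ((1/N) D^T D) v = λ (D v).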
27
Building Eigen-Models
Given examples {x_i}
– Compute the mean μ and the eigenvectors of the covariance matrix
– The model is then x = μ + P b
  P – the first t eigenvectors of the covariance matrix
  b – the model parameters
28
Eigen-Face models
Model of variation in a region
29
Applications: Locating objects
– Scan a window over the target region
– At each position: sample, normalise, evaluate p(g)
– Select the position with the largest p(g)
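The search loop above can be sketched as follows. This is a minimal illustration, not the lecture's code: the eigen-model log-density uses the product-of-1D-Gaussians form from earlier, and the tiny model built at the bottom is a toy assumption (identity basis, uniform eigenvalues) just to exercise the function:

```python
import numpy as np

def log_p(g, mu, P, evals):
    """Log-density of g under the eigen-model (first t modes only)."""
    b = P.T @ (g - mu)
    return -0.5 * np.sum(b**2 / evals + np.log(2 * np.pi * evals))

def locate(image, model, size):
    """Scan a window over the image; return the position with largest p(g)."""
    mu, P, evals = model
    h, w = size
    best, best_pos = -np.inf, None
    for r in range(image.shape[0] - h + 1):
        for c in range(image.shape[1] - w + 1):
            g = image[r:r+h, c:c+w].ravel()       # raster-scan the window
            g = (g - g.mean()) / g.std()          # normalise
            lp = log_p(g, mu, P, evals)
            if lp > best:
                best, best_pos = lp, (r, c)
    return best_pos

rng = np.random.default_rng(4)
image = rng.normal(size=(8, 8))
target = image[3:5, 2:4].ravel()                  # the "object" to find
mu = (target - target.mean()) / target.std()
model = (mu, np.eye(4), np.full(4, 0.1))          # toy model, not learned
print(locate(image, model, (2, 2)))               # (3, 2)
```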
30
Multi-Resolution Search
Train models at each level of a pyramid
– Gaussian pyramid with step size 2
– Use the same points but different local models
Start the search at coarse resolution
– Refine at finer resolutions
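A Gaussian pyramid with step size 2 can be built by repeatedly smoothing and subsampling. A minimal sketch; the 1-2-1 smoothing kernel is our choice of a simple separable filter, not specified by the slides:

```python
import numpy as np

def pyramid_down(img):
    """One pyramid step: smooth with a small kernel, then subsample by 2."""
    k = np.array([1.0, 2.0, 1.0]) / 4.0        # simple 1-2-1 smoothing kernel
    smooth = lambda v: np.convolve(np.pad(v, 1, mode='edge'), k, mode='valid')
    sm = np.apply_along_axis(smooth, 1, img)   # smooth along rows
    sm = np.apply_along_axis(smooth, 0, sm)    # then along columns
    return sm[::2, ::2]                        # subsample by 2 in each axis

def gaussian_pyramid(img, levels):
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(pyramid_down(pyr[-1]))
    return pyr

img = np.arange(64, dtype=float).reshape(8, 8)
pyr = gaussian_pyramid(img, 3)
print([p.shape for p in pyr])   # [(8, 8), (4, 4), (2, 2)]
```

Searching the coarse level first narrows the region that the finer-level models need to examine.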
31
Application: Object Detection
– Scan the image to find the points with the largest p(g)
– If p(g) > p_min then the object is present
– Strictly, one should use a background model and test the likelihood ratio p(g | object) / p(g | background) against a threshold
– This only works if the PDFs are good approximations – often not the case
32
Application: Face Recognition
Eigenfaces were developed for face recognition
– More about this later