Eigenfaces (for Face Recognition)


1 Eigenfaces (for Face Recognition)

2 Face Recognition, Eigenfaces, Matrices
We start by looking at a problem such as: 4 apples plus 5 bananas cost $10; 7 apples plus 6 bananas cost $20. What is the price of an apple or a banana? So, first we call the apple price A and the banana price B, and get: 4A + 5B = 10 and 7A + 6B = 20

3 Linear Equations and Matrices
4A + 5B = 10 and 7A + 6B = 20. Those two equations can be written as a matrix equation:

[4 5] [A]   [10]
[7 6] [B] = [20]

which is the matrix notation for this kind of problem.

4 Going from one notation to another
4A + 5B = 10 and 7A + 6B = 20

[4 5] [A]   [10]
[7 6] [B] = [20]

Note that we have not actually solved anything by doing this to get the new notation; we have merely got a different way to express the problem. But now the solution will be easier.

5 Matrices
[4 5] [A]   [10]
[7 6] [B] = [20]

We said the solution will be easier. Well, what happens in mathematics is that they tell you: name the matrix as, say, P. Then find the inverse of P, called P^-1. Then multiply both sides by P^-1.

6 Matrices
So, we name the matrix as P. Let

P = [4 5]
    [7 6]

so we get

P [A] = [10]
  [B]   [20]

Then find the inverse of P, called P^-1, and multiply both sides by P^-1, multiplying by putting it on the left. So, we get

P^-1 P [A] = P^-1 [10]
       [B]        [20]
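The inverse-multiplication recipe above can be sketched in a few lines of Python (a minimal sketch using the closed-form 2x2 inverse, not library code):

```python
# Solve P [A; B] = [10; 20] for P = [[4, 5], [7, 6]] by multiplying
# both sides (on the left) by the inverse of P.

def inverse_2x2(m):
    """Closed-form inverse of a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c          # must be nonzero for the inverse to exist
    return [[d / det, -b / det],
            [-c / det, a / det]]

def matvec(m, v):
    """Multiply a 2x2 matrix by a length-2 vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

P = [[4, 5], [7, 6]]
rhs = [10, 20]
A, B = matvec(inverse_2x2(P), rhs)   # [A, B] = P^-1 [10, 20]

# Check: the solution satisfies both original equations.
assert abs(4 * A + 5 * B - 10) < 1e-9
assert abs(7 * A + 6 * B - 20) < 1e-9
```

In practice one would call a library routine rather than hand-code the inverse, exactly as the later slide on computing inverses notes.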

7 Matrices
So, we have

P^-1 P [A] = P^-1 [10]
       [B]        [20]

Now, for some matrix facts:
1) For any square matrix P, P^-1 x P = Identity.
2) For any matrix or vector T, Identity x T = T.
3) For any square matrix P, if P is of dimension 2x2, then its inverse, P^-1, is also of dimension 2x2.

8 Matrices 4) For two entities (matrix or vector) to multiply, when the dimensions are written out, the inner numbers must agree, i.e., be common; the resulting answer will have dimensions that are specified by the two outer numbers of the writing. example1 for Fact 4: A B = C is 2x7 in the example above, = means “gives the result” example2 : A x B is not valid. Why?? __x__ 2x3 3x7 2x3 2x3

9 Matrices
For Fact 4, Example 3: for the product A (2x3) x B x C (5x3) to be valid, the dimensions of B MUST be 3x5.
5) The Identity matrix is a square matrix with zeroes everywhere except the main diagonal, which has 1's. The main diagonal is the one that runs from the northwest to the southeast, or, said another way, from the top left to the bottom right.
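Fact 4 is mechanical enough to state as code. A small sketch that computes the shape of a product, or rejects it when the inner numbers disagree:

```python
# Fact 4 as code: matrix-product dimensions. The inner numbers must
# match; the result takes the two outer numbers.

def matmul_shape(shape_a, shape_b):
    """Return the shape of A x B, or raise if the inner dims disagree."""
    rows_a, cols_a = shape_a
    rows_b, cols_b = shape_b
    if cols_a != rows_b:
        raise ValueError(f"cannot multiply {shape_a} by {shape_b}")
    return (rows_a, cols_b)

print(matmul_shape((2, 3), (3, 7)))                           # Example 1: (2, 7)
print(matmul_shape(matmul_shape((2, 3), (3, 5)), (5, 3)))     # Example 3: (2, 3)
```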

10 Matrices Some notes on Matrices
1) We have not said anything about how to compute the inverse. This topic is covered in a Matrix Algebra class in about the 3rd or 4th week. Suffice it to say that we rarely have to write the actual computation code for the inverse, because it has become a standard add-on to most libraries, in particular to platforms like MATLAB.
2) Whether we multiply on the left or the right depends on the position of the item we are trying to remove.

11 Redundant Equations for Matrices
Our equations had been: 4A + 5B = 10 and 7A + 6B = 20. Now, consider: 4A + 5B = 10 and 8A + 10B = 20. Our matrix then is

P = [4  5]
    [8 10]

12 Redundant Equations for Matrices
Our matrix then is

P = [4  5]
    [8 10]

Our system of two equations, and our new matrix P, does not have what is called a Unique Solution. I hope you know (very well) that the reason is that the second equation is not adding anything new to the first (the 2nd simply doubles the 1st).

13 Redundant Equations for Matrices
Our matrix then is

P = [4  5]
    [8 10]

The 2nd simply doubling the 1st is an example of redundant equations. In general, a system of equations can have: 1) exactly one solution, 2) no solution, or 3) infinitely many solutions.

14 Redundant Equations for Matrices
In systems of equations, one can have: 1) exactly one solution, 2) no solution, or 3) infinitely many solutions. This can be seen geometrically as: 1) two lines crossing (intersecting) at a unique point; 2) two lines never intersecting each other, being parallel to each other; 3) two lines being identical, lying on each other (infinitely many solutions).

15 Redundant Equations for Matrices
Cases 2 and 3, namely 2) two lines never intersecting each other, being parallel, and 3) two lines being identical, lying on each other (infinitely many solutions), arise from the following situation: 4A + 5B = 10 and 8A + 10B = ??

16 Redundant Equations for Matrices
Cases 2 and 3 arise from the following: 4A + 5B = 10 and 8A + 10B = ??. If ?? is 2x10 (i.e., 20), the lines are identical (Case 3); if not, they are parallel (Case 2).
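The three cases can be detected mechanically: a zero determinant means the rows are redundant (parallel or identical lines), and checking whether the right-hand side scales the same way then separates Case 2 from Case 3. A minimal sketch (2x2 systems only, second row assumed nonzero):

```python
def classify_2x2(a, b, c, d, e, f):
    """Classify the system aA + bB = e, cA + dB = f."""
    det = a * d - b * c
    if det != 0:
        return "unique solution"            # Case 1: lines cross once
    # Rows are proportional; check whether the RHS scales the same way.
    if a * f == c * e and b * f == d * e:
        return "infinitely many solutions"  # Case 3: identical lines
    return "no solution"                    # Case 2: parallel lines

print(classify_2x2(4, 5, 7, 6, 10, 20))    # the original system: unique
print(classify_2x2(4, 5, 8, 10, 10, 20))   # 2nd doubles 1st: identical
print(classify_2x2(4, 5, 8, 10, 10, 30))   # same lines, wrong RHS: parallel
```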

17 Redundant Equations for Matrices
For equations like 4A + 5B = 10 and 8A + 10B = ??: if ?? is 2x10 (i.e., 20), the lines are identical (Case 3); if not, they are parallel (Case 2). (Case 2 is sometimes called Inconsistent Equations.) So, mathematics evolved techniques to automatically tell which case a given system falls into.

18 Redundant Equations for Matrices
Not only did mathematics evolve techniques to automatically tell which case a given system falls into, it went further. Consider equations like 4A + 5B = 10 and 8A + 10B = 20, but with the numbers of the second equation perturbed slightly. These lines (equations) are not exactly identical, but almost are.

19 Redundant Equations for Matrices
For such almost-identical equations, math invented techniques to tell that the rows of a matrix are "almost" redundant. In fact, the techniques can tell how strongly non-redundant (called independent) a row is. This black-box technique is called EigenAnalysis. You feed it a square matrix, and it comes back with a set of new columns (called eigenvectors) and a set of scalar numbers (called eigenvalues).

20 Redundant Equations for Matrices
In EigenAnalysis, we feed in a square matrix, and it comes back with a set of new columns (called eigenvectors) and a set of scalar numbers (called eigenvalues). The eigenvectors tell us about a new way to represent any original vector of the space. The eigenvalues tell us how much redundancy there is in the data vectors.
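For a 2x2 matrix the eigenvalues can be written in closed form from the trace and determinant, and a zero (or tiny) eigenvalue is exactly the redundancy signal described above. A sketch (pure Python, 2x2 matrices with real eigenvalues only):

```python
import math

def eigenvalues_2x2(m):
    """Eigenvalues of a 2x2 matrix, from the characteristic polynomial
    lambda^2 - trace*lambda + det = 0 (assumes real roots)."""
    (a, b), (c, d) = m
    trace, det = a + d, a * d - b * c
    disc = math.sqrt(trace * trace - 4 * det)
    return sorted([(trace - disc) / 2, (trace + disc) / 2])

# The redundant matrix from the slides: its rows are proportional,
# so one eigenvalue is exactly zero.
print(eigenvalues_2x2([[4, 5], [8, 10]]))   # [0.0, 14.0]
```

An eigenvalue near (but not exactly) zero is how the "almost redundant" rows of the previous slides show up in practice.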

21 Redundant Equations for Matrices
The eigenvalues tell us how much redundancy there is in the data vectors. It turns out that in the world of faces there is a lot of redundancy. So, face recognition can be sped up by exploiting that redundancy.

22 Brute Force way to Recognize Faces
Imagine we have a database of M faces, each 256x256 in size. Then the matching task, i.e., the recognition task, can be done by simply getting a match score between the 256x256 test face picture and each database picture, then finding the minimum score (or the lowest 5 scores, or something like that). Now, since this face recognition topic is a CS topic, we examine the time cost of this approach: the cost of getting the match score for matching the test pic to each database pic.

23 Brute Force way to Recognize Faces
Since this face recognition topic is a CS topic, we examine the time cost of this approach, i.e., of getting the match score for matching a test pic to each database pic. The match score can be computed by an algorithm that loops thru each pixel position and compares the test pic's pixel and the database pic's pixel (at the same position). The comparison could take the form of a simple subtraction followed by an absolute value calculation. Then, all these absolute values (one per position in the image) are added up to give the match score. So, our score is

score = sum over i = 1 to 256x256 of |TestPic pixel i - DatabasePic pixel i|
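The score-and-minimize loop above can be sketched directly (tiny 2x2 "images" stand in for 256x256 faces; the image layout as a flat pixel list is an assumption for the sketch):

```python
def match_score(test_pic, db_pic):
    """Sum of absolute pixel differences between two equally sized
    images, each stored as a flat list of pixel values."""
    return sum(abs(t - d) for t, d in zip(test_pic, db_pic))

def recognize(test_pic, database):
    """Return the index of the database face with the lowest score."""
    scores = [match_score(test_pic, face) for face in database]
    return scores.index(min(scores))

# Tiny 2x2 "images" standing in for 256x256 faces:
db = [[0, 0, 0, 0], [100, 100, 100, 100], [9, 11, 10, 10]]
print(recognize([10, 10, 10, 10], db))   # 2 (the closest face)
```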

24 Brute Force way to Recognize Faces
Now, for a database of M faces, we are matching a test pic to the stored faces. So total work is M x (Number of steps in matching the test pic to a single database pic face.) Each face takes 256x256 loop executions. Total work is 256x256 loop executions x M. This is the Brute Force Approach. It is the simplest approach that would come to mind, without much thought. The challenge, then, is to beat the brute force speed.

25 Brute Force way to Recognize Faces
The challenge is to beat the brute force speed. What the eigenface approach does is lower this from 256x256 executions (per picture match in the brute force approach), to about 50 loop executions, per match to a picture in the database. How is the lowering achieved?

26 Redundant Equations for Matrices
The eigenvectors are going to give us a new representation of each original face, plus we will know how much redundancy there is in the system, so that the true independent dimensions are much fewer than 256x256. In the brute force approach, each database pic is represented by its 256x256 number of pixels, hence the matching had to go thru each of these 256x256 numbers. If we can represent each pic now by fewer numbers, say 50, we will have a speedier approach. We next look at how this lowering is done.
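The arithmetic of the speedup is simple to check. In this sketch, each face is assumed to already be represented by about 50 coefficients (its coordinates along the top eigenvectors; how those are computed comes next in the slides), so a match costs ~50 loop steps instead of 256x256:

```python
# Cost comparison: pixels-per-match in the brute force approach
# versus coefficients-per-match in the (hypothetical, k = 50)
# reduced representation.

K = 50                    # ~50 numbers per face after reduction
PIXELS = 256 * 256        # 65,536 numbers per face in brute force

print(PIXELS)             # 65536 loop steps per match, brute force
print(PIXELS // K)        # ~1310x fewer steps per match when reduced
```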

