
1 Principal Component Analysis Presented by: Sabbir Ahmed Roll: FH-227

2 Overview
Variance and Covariance
Eigenvector and Eigenvalue
Principal Component Analysis
Application of PCA in Image Processing

3 Variance and Covariance (1/2) The variance is a measure of how far a set of numbers is spread out. The equation of the variance is var(X) = sum_i (X_i - mean(X))^2 / (n - 1)

4 Variance and Covariance (2/2) Covariance is a measure of how much two random variables change together. The equation of the covariance is cov(X, Y) = sum_i (X_i - mean(X)) * (Y_i - mean(Y)) / (n - 1)
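
To make the two formulas concrete, here is a minimal numpy sketch (the variable names are mine) that computes the sample variance and covariance with the n - 1 denominator, using the X/Y data from the example in slide 16:

```python
import numpy as np

x = np.array([2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1])
y = np.array([2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9])

# Sample variance: var(X) = sum_i (X_i - mean(X))^2 / (n - 1)
var_x = np.sum((x - x.mean()) ** 2) / (len(x) - 1)

# Sample covariance: cov(X, Y) = sum_i (X_i - mean(X)) * (Y_i - mean(Y)) / (n - 1)
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / (len(x) - 1)

print(var_x, np.var(x, ddof=1))    # both are approx. 0.6166
print(cov_xy, np.cov(x, y)[0, 1])  # both are approx. 0.6154
```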

5 Covariance Matrix The covariance matrix is an n*n matrix where element (i, j) is defined as C_ij = cov(dimension_i, dimension_j). A covariance matrix over a 2-dimensional dataset is
C = | cov(x, x)  cov(x, y) |
    | cov(y, x)  cov(y, y) |
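
As an illustrative sketch (not part of the slides), np.cov builds this matrix in one call when each row of its input is one dimension of the dataset:

```python
import numpy as np

# Two-dimensional dataset: rows are the dimensions X and Y, columns are observations.
data = np.array([
    [2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1],  # X
    [2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9],  # Y
])

# Element (i, j) is cov(dimension_i, dimension_j); the diagonal holds the variances.
C = np.cov(data)
print(C)
# [[0.61655556 0.61544444]
#  [0.61544444 0.71655556]]
```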

6 Eigenvector The eigenvectors of a square matrix A are the non-zero vectors x that, after being multiplied by the matrix, remain parallel to the original vector.

7 Eigenvalue For each eigenvector, the corresponding eigenvalue is the factor by which the eigenvector is scaled when multiplied by the matrix.

8 Eigenvector and Eigenvalue (1/2) The vector x is an eigenvector of the matrix A with eigenvalue λ (lambda) if the following equation holds: A x = λ x

9 Eigenvector and Eigenvalue (2/2) Calculating the eigenvalues: solve the characteristic equation det(A - λI) = 0 for λ. Calculating the eigenvectors: for each eigenvalue λ, solve (A - λI) x = 0 for the non-zero vectors x.
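
Solving the characteristic equation by hand only scales to small matrices; a quick numpy check (an illustrative sketch with a made-up 2x2 matrix) gives both at once:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the corresponding (column) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # 3 and 1 (order may vary)
print(eigenvectors)   # each column is one eigenvector

# Verify A x = lambda x for the first pair.
x = eigenvectors[:, 0]
print(np.allclose(A @ x, eigenvalues[0] * x))  # True
```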

10 Principal Component Analysis (1/3) PCA (Principal Component Analysis) is defined as an orthogonal linear transformation that transforms the data to a new coordinate system such that the greatest variance comes to lie on the first coordinate, the second greatest variance on the second coordinate, and so on.
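
The same transformation is available off the shelf; a hedged sketch with scikit-learn (not part of the original slides), where explained_variance_ shows the variances already sorted from greatest to smallest:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
x = rng.normal(size=200)
# Correlated 2-D data: the second dimension mostly follows the first.
data = np.column_stack([x, 0.8 * x + 0.1 * rng.normal(size=200)])

pca = PCA(n_components=2)
scores = pca.fit_transform(data)   # the data expressed in the new coordinate system

print(pca.explained_variance_)     # decreasing: the first coordinate carries the most variance
print(pca.components_)             # rows are the new orthogonal axes
```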

11 Principal Component Analysis (2/3)

12 Principal Component Analysis (3/3)

13 Principal Component Each coordinate in Principal Component Analysis is called a principal component. C_i = b_i1 x_1 + b_i2 x_2 + … + b_in x_n, where C_i is the i-th principal component, b_ij is the regression coefficient for observed variable j for principal component i, and the x_j are the variables/dimensions.
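
Each component value is therefore a dot product between a row of coefficients and an observation; a tiny sketch with a hypothetical coefficient matrix B and observation x (both invented for illustration):

```python
import numpy as np

# Hypothetical coefficient matrix B: row i holds b_i1 ... b_in for component C_i.
B = np.array([[0.68,  0.74],
              [0.74, -0.68]])
x = np.array([1.2, 0.7])  # one observation (x_1, x_2)

C = B @ x                 # C_i = b_i1 * x_1 + b_i2 * x_2
print(C)                  # the observation expressed as principal components
```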

14 Eigenvector and Principal Component It turns out that the eigenvectors of the covariance matrix of the data set are the principal components of the data set. The eigenvector with the highest eigenvalue is the first principal component, the one with the 2nd highest eigenvalue is the second principal component, and so on.

15 Steps to find Principal Components
1. Adjust the dataset to a zero-mean dataset.
2. Find the covariance matrix M.
3. Calculate the normalized eigenvectors and eigenvalues of M.
4. Sort the eigenvectors by eigenvalue, from highest to lowest.
5. Form the feature vector F using the transpose of the eigenvectors.
6. Multiply the transposed dataset with F.
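
A compact numpy sketch of these six steps (the function and variable names are mine, not from the slides):

```python
import numpy as np

def pca_steps(data):
    """data: array of shape (n_samples, n_dimensions)."""
    # 1. Adjust the dataset to a zero-mean dataset.
    mean = data.mean(axis=0)
    adjusted = data - mean

    # 2. Find the covariance matrix M (rowvar=False: columns are dimensions).
    M = np.cov(adjusted, rowvar=False)

    # 3. Calculate the normalized eigenvectors and eigenvalues of M
    #    (np.linalg.eigh returns unit-length eigenvectors for a symmetric M).
    eigenvalues, eigenvectors = np.linalg.eigh(M)

    # 4. Sort the eigenvectors by eigenvalue, from highest to lowest.
    order = np.argsort(eigenvalues)[::-1]
    eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

    # 5. Form the feature vector F using the transpose of the eigenvectors.
    F = eigenvectors.T

    # 6. Multiply the transposed dataset with F.
    final = F @ adjusted.T
    return final, F, mean, eigenvalues
```

With the dataset of slide 16 this reproduces the FinalData of slides 20-21, up to the sign of each eigenvector, which is arbitrary.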

16 Example
AdjustedDataSet = OriginalDataSet - Mean

Original Data      Adjusted Dataset
  X     Y            X      Y
 2.5   2.4          0.69   0.49
 0.5   0.7         -1.31  -1.21
 2.2   2.9          0.39   0.99
 1.9   2.2          0.09   0.29
 3.1   3.0          1.29   1.09
 2.3   2.7          0.49   0.79
 2.0   1.6          0.19  -0.31
 1.0   1.1         -0.81  -0.81
 1.5   1.6         -0.31  -0.31
 1.1   0.9         -0.71  -1.01
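
The mean adjustment can be reproduced directly; a small sketch (the mean of this dataset works out to (1.81, 1.91)):

```python
import numpy as np

original = np.array([
    [2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
    [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9],
])

mean = original.mean(axis=0)   # [1.81, 1.91]
adjusted = original - mean     # matches the Adjusted Dataset column for column
print(adjusted)
```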

17 Covariance Matrix The covariance matrix of the adjusted dataset is
M = | 0.616555556  0.615444444 |
    | 0.615444444  0.716555556 |

18 Eigenvalues and Eigenvectors The eigenvalues of matrix M are 1.28402771 and 0.0490833989. The normalized eigenvectors with the corresponding eigenvalues are
(-0.677873399, -0.735178656) for λ = 1.28402771
(-0.735178656,  0.677873399) for λ = 0.0490833989
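
These numbers can be checked with a few lines of numpy (a sketch; the eigenvector signs may come out flipped, which does not affect PCA):

```python
import numpy as np

M = np.array([[0.61655556, 0.61544444],
              [0.61544444, 0.71655556]])

# eigh is meant for symmetric matrices and returns eigenvalues in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(M)
print(eigenvalues)    # approx. [0.0491, 1.2840]
print(eigenvectors)   # columns are the unit-length eigenvectors
```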

19 Feature Vector Sorted eigenvectors (highest eigenvalue first):
(-0.677873399, -0.735178656), (-0.735178656, 0.677873399)
Feature vector (transposed eigenvectors, one per row):
F = | -0.677873399  -0.735178656 |
    | -0.735178656   0.677873399 |

20 Final Data (1/2)
FinalData = F x AdjustedDataSetTransposed

      X              Y
 -0.827970186   -0.175115307
  1.77758033     0.142857227
 -0.992197494    0.384374989
 -0.274210416    0.130417207
 -1.67580142    -0.209498461
 -0.912949103    0.175282444
  0.0991094375  -0.349824698
  1.14457216     0.0464172582
  0.438046137    0.0177646297
  1.22382056    -0.162675287

21 Final Data (2/2)
FinalData = F x AdjustedDataSetTransposed (keeping only the first principal component)

      X
 -0.827970186
  1.77758033
 -0.992197494
 -0.274210416
 -1.67580142
 -0.912949103
  0.0991094375
  1.14457216
  0.438046137
  1.22382056

22 Retrieving Original Data (1/2)
FinalData = F x AdjustedDataSetTransposed
AdjustedDataSetTransposed = F^-1 x FinalData
but, because F is orthonormal, F^-1 = F^T
So, AdjustedDataSetTransposed = F^T x FinalData
and, OriginalDataSet = AdjustedDataSet + Mean
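
A short numpy sketch of the round trip (projection and recovery): with every component kept the original data comes back exactly, and with only the first component kept a lossy approximation is recovered.

```python
import numpy as np

original = np.array([
    [2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
    [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9],
])
mean = original.mean(axis=0)
adjusted = original - mean

M = np.cov(adjusted, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(M)
F = eigenvectors[:, ::-1].T              # feature vector: sorted eigenvectors as rows

final = F @ adjusted.T                   # FinalData = F x AdjustedDataSetTransposed

# Because F is orthonormal, F^-1 = F^T, so the adjusted data comes straight back.
recovered = (F.T @ final).T + mean
print(np.allclose(recovered, original))  # True

# Lossy reconstruction keeping only the first principal component.
approx = (F[:1].T @ final[:1]).T + mean
print(approx)
```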

23 Retrieving Original Data (2/2)

24 Application of PCA in Image Processing
Pattern Recognition
Image Compression
Determination of Object Orientation and Rotation
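
As a taste of the image-compression use, one common setup (a sketch only, assuming a grayscale image stored as a 2-D array) treats each row of the image as an observation, keeps the k strongest principal components, and reconstructs from them:

```python
import numpy as np

def compress_rows(image, k):
    """Lossy reconstruction of an image from its k leading principal components."""
    mean = image.mean(axis=0)
    adjusted = image - mean
    # Eigen-decomposition of the covariance matrix of the columns (dimensions).
    eigenvalues, eigenvectors = np.linalg.eigh(np.cov(adjusted, rowvar=False))
    top = eigenvectors[:, ::-1][:, :k]   # the k directions with the largest variance
    scores = adjusted @ top              # compressed representation (n_rows x k)
    return scores @ top.T + mean         # back to the original shape

# Demo on a synthetic 64x64 "image".
rng = np.random.default_rng(1)
image = rng.random((64, 64))
print(np.abs(compress_rows(image, 16) - image).mean())
```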

25 Questions?

26 References
Principal Component Analysis in Wikipedia: http://en.wikipedia.org/wiki/Principal_component_analysis
A tutorial on Principal Components Analysis by Lindsay I Smith: http://www.sccg.sk/~haladova/principal_components.pdf
Principal Component Analysis in Image Processing by M. Mudrová and A. Procházka: http://dsp.vscht.cz/konference_matlab/matlab05/prispevky/mudrova/mudrova.pdf

