Slide 1 EE3J2 Data Mining Lecture 9(b): Principal Components Analysis Martin Russell

Slide 2 EE3J2 Data Mining Objectives
• To illustrate PCA through an example application
• 3D dance motion modelling

Slide 3 EE3J2 Data Mining Data
• Analysis of dance sequence data
• Body position represented as a 90-dimensional vector
• A dance sequence represented as a sequence of these vectors
• MEng FYP 2004/5, Wan Ni Chong

Slide 4 EE3J2 Data Mining Data Capture (1)

Slide 5 EE3J2 Data Mining Data Capture (2)

Slide 6 EE3J2 Data Mining Data Capture (3)

Slide 7 EE3J2 Data Mining Calculating PCA
• Step 1: Arrange the data as a matrix
– Rows correspond to individual data points
– Number of columns = dimension of the data (= 90)
– Number of rows = number of data points = N
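A minimal MATLAB sketch of this step; the loader function and file name below are hypothetical placeholders for however the motion-capture frames are actually read in:

% Arrange the N body-position vectors as the rows of an N-by-90 matrix X.
% load_dance_frames and 'dance01.dat' are hypothetical stand-ins.
X = load_dance_frames('dance01.dat');
[N, d] = size(X);   % N data points, each of dimension d = 90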

Slide 8 EE3J2 Data Mining Calculating PCA (step 2)
• Compute the covariance matrix of the data
• In MATLAB: >> C = cov(X)
• Alternatively (as in the slides from the last lecture):
– calculate the mean vector m,
– subtract m from each row of X to give Y,
– then C = YᵀY / (N − 1), which matches MATLAB's cov (normalisation by N − 1)
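A sketch of the alternative computation, assuming X is the N-by-90 matrix from step 1:

m = mean(X, 1);             % 1-by-90 mean vector
Y = X - repmat(m, N, 1);    % subtract m from each row of X
C = (Y' * Y) / (N - 1);     % 90-by-90 covariance matrix
% Sanity check: max(max(abs(C - cov(X)))) should be close to zero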

Slide 9 EE3J2 Data Mining Calculating PCA (step 3)
• Do an eigenvector decomposition of C, so that: C = UDUᵀ
• where
– U is an orthogonal (rotation) matrix,
– D is a diagonal matrix (in fact all elements of D will be real and non-negative, because C is a symmetric, positive semi-definite covariance matrix)
• In MATLAB type: >> [U, D] = eig(C)
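The columns returned by eig are not guaranteed to come out sorted by eigenvalue, so in practice it helps to order them, most significant first; a small sketch:

[U, D] = eig(C);                             % C = U*D*U'
[lambda, order] = sort(diag(D), 'descend');  % eigenvalues, largest first
U = U(:, order);                             % reorder the principal vectors to match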

Slide 10 EE3J2 Data Mining Calculating PCA (step 4)
• Each column of U is a principal vector
• The corresponding eigenvalue indicates the variance of the data along that direction
• Large eigenvalues indicate significant components of the data
• Small eigenvalues indicate that the variation along the corresponding eigenvectors may be noise
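One common way to quantify this, assuming the sorted eigenvalues lambda from the previous sketch (the 95% threshold is an illustrative choice, not from the lecture):

explained = lambda / sum(lambda);   % fraction of the total variance per component
cumulative = cumsum(explained);
k = find(cumulative >= 0.95, 1);    % smallest k that retains 95% of the variance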

Slide 11 EE3J2 Data Mining Eigenvalues
[Figure: the 90 eigenvalues plotted in decreasing order against the principal components, 1st to 90th; the first few eigenvalues are large (more significant components) while the remaining ones are small (insignificant components)]

Slide 12 EE3J2 Data Mining Calculating PCA (step 6)
• It may be advantageous to ignore the dimensions which correspond to small eigenvalues and to consider only the projection of the data onto the most significant eigenvectors (see the sketch below)
• In this way the dimension of the data can be reduced
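A minimal sketch of the reduction, reusing the zero-mean Y and sorted U from the earlier steps; k = 10 mirrors the example on the later slides:

k = 10;            % keep only the 10 most significant components
Uk = U(:, 1:k);    % 90-by-k matrix of principal vectors
Z = Y * Uk;        % N-by-k reduced representation of the data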

Slide 13 EE3J2 Data Mining Visualising PCA
[Diagram: Original pattern (blue) → transform by U into the eigenspace → set coordinates 11–90 to zero → transform back by U⁻¹ → Reduced pattern (red)]
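The round trip on slide 13 can be written directly in MATLAB, reusing m, Y and the sorted U from the sketches above; since U is orthogonal, U⁻¹ is simply Uᵀ:

E = Y * U;                      % coordinates of every pattern in the eigenspace
E(:, 11:end) = 0;               % set coordinates 11-90 to zero
Yred = E * U';                  % transform back (U' = inv(U) for orthogonal U)
Xred = Yred + repmat(m, N, 1);  % restore the mean: the reduced (red) pattern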

Slide 14 EE3J2 Data Mining PCA Example
• Original 90-dimensional data reduced to just 1 dimension

Slide 15 EE3J2 Data Mining PCA Example
• Original 90-dimensional data reduced to 10 dimensions

Slide 16 EE3J2 Data Mining Summary
• Example of PCA
• Analysis of 90-dimensional 3D dance data
• The analysis shows that PCA can reduce the 90-dimensional representation to just 10 dimensions with minimal loss of accuracy