3D Geometry for Computer Graphics Class 7

The plan today: Review of SVD basics; Connection to PCA; Applications: Eigenfaces, Animation compression.

The geometry of linear transformations: a linear transformation A always takes hyper-spheres to hyper-ellipses.

The geometry of linear transformations: thus, one good way to understand what A does is to find which vectors are mapped to the "main axes" of the ellipsoid.

Spectral decomposition: if we are lucky, A = V Λ V^T, with V orthogonal and Λ diagonal. The eigenvectors of A are the axes of the ellipse.

Spectral decomposition: starting from the standard basis, first rotate by V^T. Au = V Λ (V^T u).

Spectral decomposition: then scale by Λ: x'' = λ_1 x', y'' = λ_2 y'. Au = V Λ (V^T u).

Spectral decomposition: finally rotate back by V. Au = V (Λ (V^T u)).
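As a sanity check, here is a minimal NumPy sketch (with a made-up symmetric matrix and input vector) of this rotate, scale, rotate-back view of a symmetric transformation:

```python
import numpy as np

# Hypothetical symmetric matrix and input vector, just to illustrate the idea.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
u = np.array([1.0, 0.5])

# Eigendecomposition A = V @ diag(lam) @ V.T (V is orthogonal because A is symmetric).
lam, V = np.linalg.eigh(A)

rotated = V.T @ u            # rotate into the eigenbasis
scaled = lam * rotated       # scale each coordinate by its eigenvalue
back = V @ scaled            # rotate back

print(np.allclose(A @ u, back))   # True: Au = V Λ (V^T u)
```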

General linear transformations: SVD. In general, A will also contain rotations, not just scales: the unit circle (radii 1, 1) is mapped to an ellipse with semi-axes σ_1, σ_2.

SVD: starting from the standard basis, first rotate by V^T. Au = U Σ (V^T u).

SVD: then scale by Σ: x'' = σ_1 x', y'' = σ_2 y'. Au = U Σ (V^T u).

SVD: finally rotate again, this time by U. Au = U (Σ (V^T u)).

SVD more formally: SVD exists for any matrix. Formal definition: for square matrices A ∈ ℝ^{n×n}, there exist orthogonal matrices U, V ∈ ℝ^{n×n} and a diagonal matrix Σ such that all the diagonal values σ_i of Σ are non-negative and A = U Σ V^T.
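A small NumPy sketch of this definition (the matrix here is just random data): compute the SVD and verify orthogonality, non-negativity, and the reconstruction:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))      # any square matrix will do

U, s, Vt = np.linalg.svd(A)          # A = U @ diag(s) @ Vt

print(np.all(s >= 0))                        # singular values are non-negative
print(np.allclose(U @ U.T, np.eye(4)))       # U is orthogonal
print(np.allclose(Vt.T @ Vt, np.eye(4)))     # V is orthogonal
print(np.allclose(A, U @ np.diag(s) @ Vt))   # A = U Σ V^T
```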

Reduced SVD: for rectangular matrices, we have two forms of SVD. The reduced SVD looks like this: A (M×n) = U (M×n) Σ (n×n) V^T (n×n). The columns of U are orthonormal. This is the cheaper form for computation and storage.

Full SVD: we can complete U to a full orthogonal matrix and pad Σ with zeros accordingly: A (M×n) = U (M×M) Σ (M×n) V^T (n×n).
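A quick sketch, assuming NumPy, showing the shapes of both forms for a tall M×n matrix (the sizes are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
M, n = 6, 3
A = rng.standard_normal((M, n))

# Full SVD: U is M x M and Σ is (conceptually) M x n, padded with zero rows.
U_full, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U_full.shape, s.shape, Vt.shape)    # (6, 6) (3,) (3, 3)

# Reduced SVD: U is M x n with orthonormal columns, cheaper to compute and store.
U_red, s, Vt = np.linalg.svd(A, full_matrices=False)
print(U_red.shape, s.shape, Vt.shape)     # (6, 3) (3,) (3, 3)
```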

SVD is the "workhorse" of linear algebra: there are numerical algorithms to compute it, and once you have it, you have many things: the matrix inverse (so you can solve square linear systems), the numerical rank of a matrix, least-squares solutions, PCA, and many more.
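For instance, a hedged sketch of how the numerical rank and a least-squares solution drop out of the SVD (random test data, with np.linalg.lstsq used only as a cross-check):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((10, 3))     # overdetermined system A x ≈ b
b = rng.standard_normal(10)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Numerical rank: number of singular values above a tolerance.
tol = max(A.shape) * np.finfo(float).eps * s[0]
rank = int(np.sum(s > tol))

# Least-squares solution via the pseudoinverse: x = V diag(1/σ_i) U^T b.
x = Vt.T @ ((U.T @ b) / s)

print(rank)
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # matches lstsq
```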

SVD is the "workhorse" of linear algebra, but you must remember: SVD is expensive! For an n×n matrix, SVD costs O(n^3). For sparse matrices it can sometimes be cheaper.
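For the sparse case, one common option (a sketch assuming SciPy is available; the matrix sizes are made up) is to compute only the k leading singular triplets instead of the full decomposition:

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import svds

# A large, very sparse matrix with made-up dimensions.
A = sparse_random(2000, 1000, density=0.001, random_state=0)

# Compute only the 5 largest singular triplets rather than a full O(n^3) SVD.
U, s, Vt = svds(A, k=5)
print(np.sort(s)[::-1])    # svds returns the singular values in ascending order
```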

Shape matching: we have two objects in correspondence and want to find the rigid transformation that aligns them.

Shape matching: when the objects are aligned, the lengths of the connecting lines are small.

Shape matching, formalization: align two point sets {p_i} and {q_i}. Find a translation vector t and rotation matrix R so that Σ_i ||(R p_i + t) − q_i||^2 is minimized.

Summary of rigid alignment: Translate the input points to their centroids: x_i = p_i − p̄, y_i = q_i − q̄. Compute the "covariance matrix" H = Σ_i x_i y_i^T. Compute the SVD of H: H = U Σ V^T. The optimal rotation is R = V U^T. The translation vector is t = q̄ − R p̄. Note that H is only a 2×2 or 3×3 matrix!
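A compact NumPy sketch of this alignment procedure; the function name and test data are hypothetical, and a determinant check is added to guard against reflections, which the slide's summary leaves implicit:

```python
import numpy as np

def rigid_align(P, Q):
    """Find rotation R and translation t minimizing sum ||(R p_i + t) - q_i||^2.

    P, Q: (n, d) arrays of corresponding points (d = 2 or 3).
    """
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    X, Y = P - p_bar, Q - q_bar              # translate both sets to their centroids
    H = X.T @ Y                              # the d x d "covariance matrix"
    U, s, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T                           # optimal rotation
    if np.linalg.det(R) < 0:                 # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = q_bar - R @ p_bar                    # optimal translation
    return R, t

# Hypothetical test: recover a known rotation/translation from noiseless points.
rng = np.random.default_rng(3)
P = rng.standard_normal((50, 3))
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
Q = P @ R_true.T + t_true
R, t = rigid_align(P, Q)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```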

SVD for animation compression Chicken animation

3D animations: each frame is a 3D model (mesh). Connectivity – mesh faces.

3D animations: each frame is a 3D model (mesh). Connectivity – mesh faces. Geometry – 3D coordinates of the vertices.

3D animations: connectivity is usually constant (at least on large segments of the animation), but the geometry changes in each frame → vast amount of data, huge file size! 13 seconds, 3000 vertices/frame, 26 MB.

Animation compression by dimensionality reduction: the geometry of each frame is a vector in ℝ^{3N} (N = #vertices); stacking all f frames gives a 3N × f matrix.

Animation compression by dimensionality reduction: find a few vectors of ℝ^{3N} that will best represent our frame vectors! Reduced SVD of the frame matrix: A (3N×f) = U (3N×f) Σ (f×f) V^T (f×f).

Animation compression by dimensionality reduction: the first principal components u_1, u_2, u_3, … are the important ones.

Animation compression by dimensionality reduction: approximate each frame by a linear combination of the first principal components: frame ≈ α_1 u_1 + α_2 u_2 + α_3 u_3 + … The more components we use, the better the approximation. Usually, the number of components needed is much smaller than f.

Animation compression by dimensionality reduction: the compressed representation consists of the chosen principal component vectors u_i and the coefficients α_i for each frame. (Demo: the animation with only 2 principal components vs. with 20 out of 400 principal components.)
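A rough sketch of the whole pipeline on a hypothetical frame matrix (random data here, so it compresses far worse than a real animation would):

```python
import numpy as np

# Hypothetical animation: N vertices per frame, f frames, keep k components.
N, f, k = 3000, 400, 20
rng = np.random.default_rng(4)
frames = rng.standard_normal((3 * N, f))     # each column is one frame in R^{3N}

# Center the frames, then take the reduced SVD of the 3N x f matrix.
mean = frames.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(frames - mean, full_matrices=False)

# Compressed representation: the first k principal components + per-frame coefficients.
components = U[:, :k]                        # 3N x k basis vectors u_i
coeffs = s[:k, None] * Vt[:k]                # k x f coefficients alpha_i

# Decompression: each frame is the mean plus a linear combination of the components.
reconstructed = mean + components @ coeffs
print(frames.shape, components.shape, coeffs.shape)
```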

Eigenfaces: the same principal component analysis can be applied to images.

Eigenfaces: each image is a vector in ℝ^{250·300}. We want to find the principal axes – vectors that best represent the input database of images.

Reconstruction with a few vectors: represent each image by its first few (n) principal components.
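A minimal sketch, assuming a made-up image database (image size scaled down so the demo runs quickly), of computing the principal axes and reconstructing one image from its first n coefficients:

```python
import numpy as np

# Hypothetical database: m face images of h x w pixels, flattened to row vectors.
m, h, w, n = 200, 50, 60, 25
rng = np.random.default_rng(5)
images = rng.random((m, h * w))

mean_face = images.mean(axis=0)
U, s, Vt = np.linalg.svd(images - mean_face, full_matrices=False)
eigenfaces = Vt[:n]                    # first n principal axes, each a vector in R^{h*w}

# Reconstruct one image from its n PCA coefficients.
x = images[0] - mean_face
alphas = eigenfaces @ x                # project onto the eigenfaces
x_rec = mean_face + eigenfaces.T @ alphas
print(np.linalg.norm(images[0] - x_rec))   # reconstruction error with n components
```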

Face recognition: given a new image of a face, w ∈ ℝ^{250·300}, represent w using the first n PCA vectors: w' = α_1 u_1 + … + α_n u_n. Now find the image in the database whose representation in the PCA basis is the closest, i.e. the angle between w and w' is the smallest.
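A hedged sketch of the lookup itself; `eigenfaces`, `mean_face`, and the database coefficients are assumed to come from a PCA like the one above, and the function name is hypothetical:

```python
import numpy as np

def recognize(w, eigenfaces, mean_face, db_coeffs):
    """Return the index of the database face closest to image w in the PCA basis.

    eigenfaces: (n, D) first n principal axes; mean_face: (D,) mean image;
    db_coeffs: (m, n) PCA coefficients of the m database images.
    Closeness = smallest angle (largest cosine) between coefficient vectors.
    """
    a = eigenfaces @ (w - mean_face)          # coefficients of the new face
    a = a / np.linalg.norm(a)
    db = db_coeffs / np.linalg.norm(db_coeffs, axis=1, keepdims=True)
    return int(np.argmax(db @ a))             # largest cosine = smallest angle
```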

Non-linear dimensionality reduction: more sophisticated methods can discover non-linear structures in face datasets (e.g. Isomap, Science, Dec. 2000).

See you next time