CS246 Linear Algebra Review

A Brief Review of Linear Algebra
- Vector as a list of numbers
- Addition
- Scalar multiplication
- Dot product
- Dot product as a projection
- Q: (1, 0) vs (0, 1). Are they the same vectors?
- A: The choice of basis determines the "meaning" of the numbers
- Matrix
- Matrix multiplication
- Four ways to look at matrix multiplication
- Matrix as a vector transformation
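A minimal NumPy sketch of the dot-product-as-projection idea (the vectors here are our own examples, not the slides'):

```python
import numpy as np

a = np.array([1.0, 0.0])   # a unit vector
b = np.array([3.0, 4.0])

# Dot product as a projection: since a has length 1,
# a . b is the length of b's shadow along a.
print(np.dot(a, b))        # 3.0

# Scalar multiplication and addition work componentwise:
print(2 * a + b)           # [5. 4.]
```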

Change of Coordinates (1)
- Two coordinate systems
- Q: What are the coordinates of (2, 0) under the second coordinate system?
- Q: What about (1, 1)?

Change of Coordinates (2)
- In general, we get the new coordinates of a vector under the new basis vectors by multiplying the original coordinates by the matrix whose rows are the new basis vectors (written in the original coordinates): $x' = Qx$
- Verify with the previous example
- Q: What does this matrix look like? How can we identify a coordinate-change matrix?
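The slide's second coordinate system lives in a figure that did not survive the transcript; assuming, for illustration, a basis rotated 45 degrees from the standard one, the recipe looks like this in NumPy:

```python
import numpy as np

# Hypothetical new basis (the slide's figure is lost): the standard
# basis rotated by 45 degrees.
b1 = np.array([1.0, 1.0]) / np.sqrt(2)
b2 = np.array([-1.0, 1.0]) / np.sqrt(2)
Q = np.vstack([b1, b2])    # rows of Q are the new basis vectors

# New coordinates = Q times old coordinates.
for v in (np.array([2.0, 0.0]), np.array([1.0, 1.0])):
    print(v, "->", Q @ v)
# [2. 0.] -> [ 1.4142... -1.4142...]
# [1. 1.] -> [ 1.4142...  0.       ]
```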

Matrix and Change of Coordinates
- Orthonormal matrix: a matrix whose row vectors are orthonormal to each other (unit length and mutually perpendicular), so that $QQ^T = I$
- An orthonormal matrix can be interpreted as a change-of-coordinate transformation
- The rows of the matrix Q are the new basis vectors
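A quick check of the claim, reusing the 45-degree basis from the sketch above:

```python
import numpy as np

Q = np.array([[1.0, 1.0],
              [-1.0, 1.0]]) / np.sqrt(2)   # rows: new basis vectors

print(np.allclose(Q @ Q.T, np.eye(2)))     # True: rows are orthonormal
# Q.T undoes the change of coordinates:
v = np.array([2.0, 0.0])
print(Q.T @ (Q @ v))                       # [2. 0.]
```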

Linear Transformation
- Linear transformation: a map T with $T(ax + by) = aT(x) + bT(y)$
- Every linear transformation can be represented as a matrix, by selecting appropriate basis vectors
- The matrix form of a linear transformation can be obtained simply by recording how the basis vectors transform
- Verify with a 45-degree rotation (see the sketch below)
- Q: What transformations are possible for a linear transformation?
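A sketch of "record how the basis vectors transform": the images of e1 and e2 become the columns of the matrix. The 45-degree rotation is the slide's own verification case.

```python
import numpy as np

theta = np.pi / 4    # 45-degree rotation

# Where the standard basis vectors land under the rotation:
Te1 = np.array([np.cos(theta), np.sin(theta)])
Te2 = np.array([-np.sin(theta), np.cos(theta)])

# The matrix form has those images as its columns.
T = np.column_stack([Te1, Te2])
print(T @ np.array([1.0, 0.0]))  # equals Te1
print(T @ np.array([1.0, 1.0]))  # [0. 1.4142...]: (1,1) rotated 45 degrees
```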

Linear Transformations that We Know
- Rotation
- Stretching
- Anything else?
- Claim: any linear transformation is a stretching followed by a rotation
- This is the "meaning" of singular value decomposition, an important result of linear algebra
- Let us learn why this is the case

Rotation
- Q: What is the matrix form of a rotation? What property will it have?
- Remember the rotation matrix $R = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$
- The rows of R are orthonormal unit basis vectors, so R is an orthonormal matrix
- Orthonormal matrix ⟺ change of coordinates ⟺ rotation

Stretching (1)
- Q: What is the matrix form of stretching by 3 along the x, y, and z axes in 3D?
- Q: What is the matrix form of stretching by 3 along the x axis and by 2 along the y axis in 3D?
- Q: Stretching matrix ⟺ diagonal matrix?
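The answers the slide is leading to, as a sketch: axis-aligned stretching is exactly a diagonal matrix.

```python
import numpy as np

S1 = np.diag([3.0, 3.0, 3.0])   # stretch by 3 along x, y, and z
S2 = np.diag([3.0, 2.0, 1.0])   # by 3 along x, by 2 along y, z unchanged

print(S2 @ np.array([1.0, 1.0, 1.0]))   # [3. 2. 1.]
```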

Stretching (2)
- Q: What is the matrix form of stretching by 3 along (1, 1) and by 2 along (-1, 1)?
- Verify by transforming (1, 1) and (-1, 1)
- The decomposition $T = Q T' Q^T$ shows the transformation in a different coordinate system
- Under the matrix form alone, the simplicity of the stretching transformation may not be obvious
- Q: What if we chose the stretching axes as the basis?
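A sketch of the decomposition, with Q's columns holding the normalized stretching axes so that $T = QDQ^T$:

```python
import numpy as np

# Columns of Q: the normalized stretching axes (1,1) and (-1,1).
Q = np.array([[1.0, -1.0],
              [1.0,  1.0]]) / np.sqrt(2)
D = np.diag([3.0, 2.0])          # stretch factors in that basis
T = Q @ D @ Q.T
print(T)                         # [[2.5 0.5] [0.5 2.5]]: simplicity hidden

print(T @ np.array([1.0, 1.0]))  # [3. 3.]  = 3 * (1, 1)
print(T @ np.array([-1.0, 1.0])) # [-2. 2.] = 2 * (-1, 1)
```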

Stretching (3)
- Under a good choice of basis vectors, an orthogonal-stretching transformation can always be represented as a diagonal matrix
- Q: How can we tell whether a matrix corresponds to an orthogonal-stretching transformation?

Stretching – Orthogonal Stretching (1)
- Remember that the previous example is orthogonal stretching along (1, 1) and (-1, 1)
- If a transformation is orthogonal stretching, we should always be able to represent it as $QDQ^T$ for some Q, where Q holds the stretching axes
- Q: What is the matrix form of the transformation that stretches by 5 along (4/5, 3/5) and by 4 along (-3/5, 4/5)?
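A worked sketch for the slide's question:

```python
import numpy as np

Q = np.array([[4/5, -3/5],
              [3/5,  4/5]])      # columns: the stretching axes
D = np.diag([5.0, 4.0])          # the stretching factors
T = Q @ D @ Q.T
print(T)
# [[4.64 0.48]
#  [0.48 4.36]]
print(T @ np.array([4/5, 3/5]))  # [4. 3.] = 5 * (4/5, 3/5)
```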

Stretching – Orthogonal Stretching (2)  Q: Given a matrix, how do we know whether it is orthogonal-stretching?  A: When it can be decomposed to T = QDQ T  A: Spectral Theorem  Any symmetric matrix T can always be decomposed into T = QDQ T  Symmetric matrix orthogonal stretching  Q: How can we decompose T to QDQ T ?  A: If T stretches along X, then TX = X for some.  X: eigenvector of T  : eigenvalue of T  Solve the equation for and X

Eigenvalues, Eigenvectors, and Orthogonal Stretching
- Eigenvector: stretching axis
- Eigenvalue: stretching factor
- All eigenvectors orthogonal ⟺ orthogonal stretching ⟺ symmetric matrix (spectral theorem)
- Example. Q: What transformation is this?

Singular Value Decomposition (SVD)
- Any linear transformation T can be decomposed as T = R S (R: rotation, S: orthogonal stretching)
- One of the basic results of linear algebra
- In matrix form, any matrix T can be decomposed into $T = Q_2 D Q_1^T$, with $Q_1$, $Q_2$ orthonormal and D diagonal
- The diagonal entries of D are the singular values
- Example. Q: What transformation is this?
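A sketch with NumPy (the matrix is our own example): svd returns exactly this rotate-stretch-rotate factorization.

```python
import numpy as np

T = np.array([[3.0, 1.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(T)      # T = U @ diag(s) @ Vt
print(s)                         # singular values, in descending order
print(np.allclose(U @ np.diag(s) @ Vt, T))   # True
```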

Singular Value Decomposition (2)
- Q: For an (n × m) matrix T, what will be the dimensions of the three matrices after SVD?
- Q: What is the meaning of a non-square diagonal matrix?
- The diagonal matrix is also responsible for projection (or dimension padding)
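The dimensions, checked on a small non-square example: in the full SVD of an n × m matrix, the first factor is n × n, the middle (diagonal) factor is n × m, and the last is m × m.

```python
import numpy as np

T = np.random.rand(4, 3)             # n = 4, m = 3
U, s, Vt = np.linalg.svd(T)          # full SVD
print(U.shape, s.shape, Vt.shape)    # (4, 4) (3,) (3, 3)

# The non-square (4 x 3) diagonal factor pads/projects between R^3 and R^4:
D = np.zeros((4, 3))
np.fill_diagonal(D, s)
print(np.allclose(U @ D @ Vt, T))    # True
```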

Singular Values vs. Eigenvalues
- Q: What is this transformation?
- A: $Q_1$: the eigenvectors of $T^T T$; D: the square roots of the eigenvalues of $T^T T$
- Similarly, $Q_2$: the eigenvectors of $TT^T$; D: the square roots of the eigenvalues of $TT^T$
- So SVD can be done by computing the eigenvalues and eigenvectors of $T^T T$ and $TT^T$
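A numeric check of this relationship, reusing the example matrix from above:

```python
import numpy as np

T = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# Eigen-decompose T^T T: eigenvectors give Q1, and the square roots
# of the eigenvalues give the singular values.
lam, Q1 = np.linalg.eigh(T.T @ T)
print(np.sqrt(lam[::-1]))      # descending; matches...
print(np.linalg.svd(T)[1])     # ...the singular values from svd
```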

SVD as Matrix Approximation
- Q: If we want to reduce the rank of T to 2, what will be a good choice?
- The best rank-k approximation of any matrix T is to keep the first k components of its SVD: the k largest singular values and their singular vectors
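A sketch of the truncation:

```python
import numpy as np

def rank_k_approx(T, k):
    # Keep only the k largest singular values and their vectors.
    U, s, Vt = np.linalg.svd(T, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

T = np.random.rand(5, 5)
T2 = rank_k_approx(T, 2)
print(np.linalg.matrix_rank(T2))   # 2
```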

SVD Approximation
- Example: a 1000 × 1000 matrix with entries in (0…255), i.e., a grayscale image
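The slides' image itself is not preserved in this transcript; a synthetic 1000 × 1000 grayscale matrix stands in below to show the mechanics of the approximations on the following slides.

```python
import numpy as np

img = np.random.rand(1000, 1000) * 255   # stand-in for the slide's image

U, s, Vt = np.linalg.svd(img, full_matrices=False)
for k in (1, 10, 100):
    approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
    err = np.linalg.norm(img - approx) / np.linalg.norm(img)
    print(f"rank {k}: relative error {err:.3f}")
```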

Image of the original matrix (1000 × 1000)

SVD: rank-1 approximation

SVD: rank-10 approximation

SVD: rank-100 approximation

Original vs. Rank-100 Approximation
- Q: How many numbers do we keep for each?
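A worked count, assuming the rank-k factors are stored as an n × k matrix, the k singular values, and a k × m matrix:

```python
n = m = 1000
k = 100

original = n * m               # 1,000,000 numbers
truncated = k * (n + m + 1)    # n*k + k + k*m
print(original, truncated)     # 1000000 200100
```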