# Lecture 20: SVD and Its Applications (Shang-Hua Teng)


Spectral Theorem and Spectral Decomposition: Every symmetric matrix A can be written as $A = \lambda_1 x_1 x_1^T + \lambda_2 x_2 x_2^T + \cdots + \lambda_n x_n x_n^T$, where $x_1, \dots, x_n$ are the n orthonormal eigenvectors of A; they are the principal axes of A. Each $x_i x_i^T$ is the projection matrix onto $x_i$.
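The decomposition above can be checked numerically; a minimal sketch, using an illustrative 2x2 symmetric matrix of my choosing:

```python
import numpy as np

# Verify the spectral decomposition A = sum_i lambda_i x_i x_i^T
# for a small symmetric example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eigh(A)   # eigh: eigendecomposition for symmetric matrices

# Rebuild A as a sum of rank-one projections lambda_i * x_i x_i^T
A_rebuilt = sum(lam * np.outer(x, x)
                for lam, x in zip(eigvals, eigvecs.T))

print(np.allclose(A, A_rebuilt))  # True
```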

Singular Value Decomposition: Any m by n matrix A may be factored as $A = U \Sigma V^T$, where U is m by m and orthogonal (its columns are the left singular vectors), V is n by n and orthogonal (its columns are the right singular vectors), and $\Sigma$ is m by n and diagonal, carrying the r singular values.
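A minimal sketch of this factorization with NumPy, on an illustrative 3x2 matrix (note `np.linalg.svd` returns the singular values as a vector, so we embed them in an m x n matrix to match the formula):

```python
import numpy as np

# Full SVD of a small example matrix: A = U Sigma V^T
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])

U, s, Vt = np.linalg.svd(A)           # full SVD: U is 3x3, Vt is 2x2
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)  # embed singular values in an m x n matrix

print(U.shape, Sigma.shape, Vt.shape)   # (3, 3) (3, 2) (2, 2)
print(np.allclose(A, U @ Sigma @ Vt))   # True
```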

The Singular Value Decomposition: r = the rank of A = number of linearly independent columns/rows. In matrix shapes, $A_{m \times n} = U_{m \times m} \, \Sigma_{m \times n} \, V^T_{n \times n}$, where $\Sigma$ holds the r nonzero singular values on its diagonal and zeros elsewhere.

SVD Properties: U and V give us orthonormal bases for the four fundamental subspaces of A:
- first r columns of U: column space of A
- last m - r columns of U: left nullspace of A
- first r columns of V: row space of A
- last n - r columns of V: nullspace of A

Implication: rank(A) = r.
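These subspace bases can be read directly off the SVD factors; a minimal sketch on an illustrative rank-1 matrix:

```python
import numpy as np

# Read the four fundamental subspaces off the full SVD of a rank-1
# example matrix (m = n = 3, r = 1).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0]])      # every row is a multiple of (1, 2, 3)

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))           # numerical rank

col_space  = U[:, :r]                # basis of the column space of A
left_null  = U[:, r:]                # basis of the left nullspace of A
row_space  = Vt[:r, :].T             # basis of the row space of A
null_space = Vt[r:, :].T             # basis of the nullspace of A

print(r)                                   # 1
print(np.allclose(A @ null_space, 0))      # True: A sends its nullspace to 0
print(np.allclose(left_null.T @ A, 0))     # True: left nullspace annihilates A
```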

The Singular Value Decomposition: the full factorization $A_{m \times n} = U_{m \times m} \, \Sigma_{m \times n} \, V^T_{n \times n}$ can be trimmed to the reduced form $A_{m \times n} = U_{m \times r} \, \Sigma_{r \times r} \, V^T_{r \times n}$ by dropping the columns of U and rows of $V^T$ that multiply the zero block of $\Sigma$.

Singular Value Decomposition: $A = \sigma_1 u_1 v_1^T + \sigma_2 u_2 v_2^T + \cdots + \sigma_r u_r v_r^T$, where $u_1, \dots, u_r$ are the r orthonormal vectors that form a basis of $C(A)$ and $v_1, \dots, v_r$ are the r orthonormal vectors that form a basis of $C(A^T)$.
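A minimal sketch of this rank-one expansion, rebuilding an illustrative 3x2 matrix from its outer-product terms:

```python
import numpy as np

# Rebuild A from the rank-one sum sigma_1 u_1 v_1^T + ... + sigma_r u_r v_r^T.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)   # thin SVD: U is 3x2
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))

print(np.allclose(A, A_rebuilt))   # True
```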

SVD Proof: Any m x n matrix A has two symmetric covariance matrices: the m x m matrix $AA^T$ and the n x n matrix $A^TA$.

Spectral Decomposition of the Covariance Matrices: $AA^T = U \Sigma^2 U^T$ (m x m); the columns of U are called the left singular vectors of A. $A^TA = V \Sigma^2 V^T$ (n x n); the columns of V are called the right singular vectors of A. Claim: the nonzero eigenvalues of $AA^T$ and $A^TA$ are the same.
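The claim can be checked numerically; a minimal sketch on an illustrative 3x2 matrix, comparing both eigenvalue sets with the squared singular values:

```python
import numpy as np

# The nonzero eigenvalues of A A^T and A^T A coincide, and both equal
# the squared singular values sigma_i^2 of A.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])

s = np.linalg.svd(A, compute_uv=False)
eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]   # 3 eigenvalues (one is 0)
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # 2 eigenvalues

print(np.allclose(eig_AAt[:2], eig_AtA))   # True: nonzero eigenvalues match
print(np.allclose(eig_AtA, s ** 2))        # True: they are the sigma_i^2
```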

Singular Value Decomposition Proof: for each eigenpair $A^TA\, v_i = \sigma_i^2 v_i$ with $\sigma_i > 0$, define $u_i = A v_i / \sigma_i$; these $u_i$ are orthonormal eigenvectors of $AA^T$, and stacking the relations $A v_i = \sigma_i u_i$ gives $AV = U\Sigma$, i.e., $A = U \Sigma V^T$.

All Singular Values Are Nonnegative: for any x, $x^T A^T A\, x = \|Ax\|^2 \ge 0$, so the eigenvalues $\sigma_i^2$ of $A^TA$ are nonnegative, and we may take each $\sigma_i \ge 0$.

Row and Column Space Projection: Suppose A is an m by n matrix of rank r, with r << n and r << m.
- Then A has r nonzero singular values.
- Let $A = U \Sigma V^T$ be the reduced SVD of A, where $\Sigma$ is an r by r diagonal matrix.
- Examine the shapes of the factors:

The Singular Value Projection: $A_{m \times n} = U_{m \times r} \, \Sigma_{r \times r} \, V^T_{r \times n}$.

Therefore the rows of $U\Sigma$ are r-dimensional projections of the rows of A, and the columns of $\Sigma V^T$ are r-dimensional projections of the columns of A. So we can compute their distances or dot products in a lower-dimensional space.
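A minimal sketch of this idea: since $AA^T = (U\Sigma)(U\Sigma)^T$, all pairwise dot products between the m rows of A can be computed from their r-dimensional projections (the low-rank example matrix below is an illustrative construction of my choosing):

```python
import numpy as np

# Pairwise dot products between rows of A equal pairwise dot products
# between rows of U Sigma, the r-dimensional row projections.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 2))
C = rng.standard_normal((2, 3))
A = B @ C                          # a 4x3 matrix of rank at most 2

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-10))
rows_proj = U[:, :r] * s[:r]       # rows of U Sigma: each row of A in r dims

print(np.allclose(A @ A.T, rows_proj @ rows_proj.T))   # True
```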

Eigenvalues and Determinants. Product law: $\lambda_1 \lambda_2 \cdots \lambda_n = \det(A)$. Summation law: $\lambda_1 + \lambda_2 + \cdots + \lambda_n = \operatorname{trace}(A)$. Both can be proved by examining the characteristic polynomial.
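Both laws can be checked numerically; a minimal sketch on an illustrative triangular 2x2 matrix, whose eigenvalues are its diagonal entries:

```python
import numpy as np

# Product law: prod(lambda_i) = det(A). Summation law: sum(lambda_i) = trace(A).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals = np.linalg.eigvals(A)    # 2 and 3 for this triangular example

print(np.isclose(np.prod(eigvals), np.linalg.det(A)))   # True: 2 * 3 = 6
print(np.isclose(np.sum(eigvals), np.trace(A)))         # True: 2 + 3 = 5
```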

Eigenvalues and Pivots: If A is symmetric, the number of positive (negative) eigenvalues equals the number of positive (negative) pivots in $A = LDL^T$. Topological proof: scale the off-diagonal entries of L continuously down to 0, i.e., move L continuously to I; any change of sign in an eigenvalue along the way would require it to cross 0.
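A minimal sketch of this sign agreement, using a hand-rolled unpivoted $LDL^T$ on an illustrative symmetric indefinite 2x2 matrix (this simple routine assumes nonzero leading minors; it is not a production factorization):

```python
import numpy as np

def ldl(A):
    """Plain LDL^T factorization (no pivoting; assumes nonzero leading minors)."""
    n = A.shape[0]
    L = np.eye(n)
    D = np.zeros(n)
    for j in range(n):
        D[j] = A[j, j] - L[j, :j] ** 2 @ D[:j]
        for i in range(j + 1, n):
            L[i, j] = (A[i, j] - L[i, :j] * L[j, :j] @ D[:j]) / D[j]
    return L, D

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])          # eigenvalues 3 and -1: one of each sign
L, D = ldl(A)                       # pivots: 1 and -3, also one of each sign
eigvals = np.linalg.eigvalsh(A)

print(np.sign(np.sort(D)))          # signs of the pivots
print(np.sign(np.sort(eigvals)))    # signs of the eigenvalues: they agree
```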

Next Lecture: dimensionality reduction for Latent Semantic Analysis, and eigenvalue problems in Web analysis.