 # CHAPTER SIX Eigenvalues


Outline
- Systems of linear ODEs (omitted)
- Diagonalization
- Hermitian matrices
- Quadratic forms
- Positive definite matrices

Motivations: To simplify the representation of a linear dynamical system as much as possible, and to understand the characteristics of a linear system, e.g., the long-term behavior of its dynamics.

Example: In a town, each year 30% of married women get divorced and 20% of single women get married. In the 1st year there are 8000 married women and 2000 single women, and the total population of women remains constant. Let $w^{(i)} = (m_i, s_i)^T$ be the women numbers at year $i$, where $m_i$ and $s_i$ represent married and single women respectively. Then $w^{(i+1)} = A w^{(i)}$ with $A = \begin{pmatrix} 0.7 & 0.2 \\ 0.3 & 0.8 \end{pmatrix}$.
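The iteration above can be checked numerically. This is a minimal sketch, assuming the transition matrix $A = \begin{pmatrix}0.7 & 0.2\\ 0.3 & 0.8\end{pmatrix}$ reconstructed from the stated divorce/marriage rates:

```python
import numpy as np

# Yearly transition: 70% of married women stay married, 20% of single women
# marry; 30% of married women divorce, 80% of single women stay single.
A = np.array([[0.7, 0.2],
              [0.3, 0.8]])

w = np.array([8000.0, 2000.0])   # year 1: [married, single]
for _ in range(50):              # iterate w <- A w
    w = A @ w

# The iteration settles near the steady state [4000, 6000], no matter
# how the 10000 women are initially split between the two groups.
```

Running the loop from a different initial split (e.g. `[1000, 9000]`) lands on the same limit, which is exactly the question the next slide raises.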

Question: Why does $w^{(i)} = A^{i-1} w^{(1)}$ converge? Why does it converge to the same limit even when the initial condition is different?

Ans: Choose a basis of eigenvectors. Here $A x_1 = x_1$ for $x_1 = (0.4, 0.6)^T$ and $A x_2 = \tfrac{1}{2} x_2$ for $x_2 = (-1, 1)^T$. Given an initial $w^{(1)} = c_1 x_1 + c_2 x_2$ for some scalars $c_1, c_2$, we get $A^i w^{(1)} = c_1 x_1 + c_2 (\tfrac{1}{2})^i x_2 \to c_1 x_1$. Question: How does one know to choose such a basis?

Def: Let $A \in \mathbb{C}^{n \times n}$. A scalar $\lambda$ is said to be an eigenvalue or characteristic value of $A$ if there exists a nonzero vector $x$ such that $Ax = \lambda x$. The vector $x$ is said to be an eigenvector or characteristic vector belonging to $\lambda$. $(\lambda, x)$ is called an eigenpair of $A$. Question: Given $A$, how do we compute eigenvalues and eigenvectors?

$(\lambda, x)$ is an eigenpair of $A$ $\iff$ $(A - \lambda I)x = 0$ for some $x \neq 0$ $\iff$ $A - \lambda I$ is singular $\iff$ $\det(A - \lambda I) = 0$. Note that $p(\lambda) = \det(A - \lambda I)$ is a polynomial, called the characteristic polynomial of $A$, of degree $n$ in $\lambda$. Thus, by the Fundamental Theorem of Algebra, $A$ has exactly $n$ eigenvalues, counting multiplicities. Any nonzero $x \in N(A - \lambda I)$ is an eigenvector associated with the eigenvalue $\lambda$, while $N(A - \lambda I)$ is the eigenspace of $A$ associated with $\lambda$.
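A quick numerical illustration of the characteristic-polynomial route versus a direct eigenvalue solver. The $2 \times 2$ matrix below is my own illustrative choice (not from the slides); its eigenvalues come out to 2 and 3:

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])    # illustrative matrix with eigenvalues 2 and 3

# Coefficients of the characteristic polynomial det(A - lambda*I):
# for a 2x2 matrix this is lambda^2 - tr(A)*lambda + det(A).
coeffs = np.poly(A)            # here: [1, -5, 6]
roots = np.roots(coeffs)       # eigenvalues as roots of the polynomial

eigvals = np.linalg.eigvals(A) # same eigenvalues, computed directly
```

In practice one calls `eigvals`/`eig` rather than finding polynomial roots, but for small matrices the two agree.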

Example: Suppose 2 and 3 are the eigenvalues of $A$. To find the eigenspace of 2 (i.e., $N(A - 2I)$), solve $(A - 2I)x = 0$.

To find the eigenspace of 3 (i.e., $N(A - 3I)$), solve $(A - 3I)x = 0$.

Let $A \in \mathbb{C}^{n \times n}$. Then $\lambda$ is an eigenvalue of $A$ $\iff$ $(A - \lambda I)x = 0$ has a nontrivial solution $\iff$ $A - \lambda I$ is singular $\iff$ $A - \lambda I$ loses rank.

If $\lambda$ is an eigenvalue of $A$ with eigenvector $x$, then $A^2 x = A(\lambda x) = \lambda^2 x$, and in general $A^k x = \lambda^k x$. This means that $(\lambda^k, x)$ is also an eigenpair of $A^k$.

Let $p(\lambda) = \det(A - \lambda I) = (\lambda_1 - \lambda)(\lambda_2 - \lambda)\cdots(\lambda_n - \lambda)$, where $\lambda_1, \dots, \lambda_n$ are the eigenvalues of $A$. (i) Let $\lambda = 0$: then $\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n$. (ii) Comparing the coefficients of $\lambda^{n-1}$ on both sides, we have $\operatorname{tr}(A) = a_{11} + \cdots + a_{nn} = \lambda_1 + \cdots + \lambda_n$.
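Both identities can be spot-checked numerically; the random matrix here is my own illustration:

```python
import numpy as np

# Check det(A) = product of eigenvalues and tr(A) = sum of eigenvalues
# on a random 5x5 matrix (illustrative, not from the slides).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

lam = np.linalg.eigvals(A)              # generally complex for a real matrix
det_matches = np.isclose(np.prod(lam), np.linalg.det(A))
trace_matches = np.isclose(np.sum(lam), np.trace(A))
```

Complex eigenvalues of a real matrix come in conjugate pairs, so both the product and the sum end up (numerically) real.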

Theorem 6.1.1: Let $B = S^{-1} A S$ for some nonsingular matrix $S$. Then $\det(B - \lambda I) = \det(A - \lambda I)$, and consequently $A$ and $B$ have the same eigenvalues. Pf: $\det(B - \lambda I) = \det(S^{-1} A S - \lambda I) = \det(S^{-1}(A - \lambda I)S) = \det(S^{-1})\det(A - \lambda I)\det(S) = \det(A - \lambda I)$.
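A numerical sanity check of Theorem 6.1.1, using random matrices of my own choosing:

```python
import numpy as np

# Similar matrices B = S^{-1} A S share the characteristic polynomial,
# hence the eigenvalues.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
S = rng.standard_normal((4, 4))       # a random matrix is almost surely nonsingular

B = np.linalg.inv(S) @ A @ S
eigA = np.sort_complex(np.linalg.eigvals(A))
eigB = np.sort_complex(np.linalg.eigvals(B))
# eigA and eigB agree up to floating-point roundoff
```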

Diagonalization. Goal: Given $A$, find a nonsingular matrix $S$ such that $S^{-1} A S = D$, a diagonal matrix. Question 1: Are all matrices diagonalizable? Question 2: What kinds of $A$ are diagonalizable? Question 3: How do we find $S$ if $A$ is diagonalizable?

NOT all matrices are diagonalizable.
e.g., let $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ (a standard counterexample). If $A$ were diagonalizable, there would be a nonsingular matrix $S$ with $S^{-1} A S = D$ diagonal; but then $D^2 = S^{-1} A^2 S = 0$ forces $D = 0$, hence $A = 0$, a contradiction.

To answer Q2, suppose $A$ is diagonalizable: there is a nonsingular matrix $S$ with $S^{-1} A S = D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$. Let $S = (x_1, \dots, x_n)$ column by column; then $AS = SD$ gives $A x_j = \lambda_j x_j$, so $(\lambda_j, x_j)$ is an eigenpair of $A$ for $j = 1, \dots, n$. This gives a condition for diagonalizability and a way to find $S$.

Theorem 6.3.2: Let $A \in \mathbb{C}^{n \times n}$. $A$ is diagonalizable $\iff$
$A$ has $n$ linearly independent eigenvectors. Note: a similarity transformation is a change of coordinates; diagonalization is a change to a coordinate system of eigenvectors.
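The recipe implied by the theorem (eigenvectors as columns of $S$, then $S^{-1}AS = D$) can be sketched as follows; the test matrix is my own choice, with eigenvalues 2 and 3:

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])          # illustrative matrix, eigenvalues 2 and 3

lam, S = np.linalg.eig(A)            # columns of S are eigenvectors of A
D = np.linalg.inv(S) @ A @ S         # S^{-1} A S should be diag(lam)
```

Since the two eigenvalues are distinct, the eigenvectors are automatically independent and `S` is invertible.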

Theorem 6.3.1: If $\lambda_1, \dots, \lambda_k$ are distinct eigenvalues of a
matrix $A$ with corresponding eigenvectors $x_1, \dots, x_k$, then $x_1, \dots, x_k$ are linearly independent. Pf: Suppose, to get a contradiction, that $x_1, \dots, x_k$ are linearly dependent, so $c_1 x_1 + \cdots + c_k x_k = 0$ with the $c_i$ not all zero. Since the $\lambda_i$ are distinct, applying suitable factors $A - \lambda_j I$ eliminates the terms one at a time and forces every $c_i = 0$, a contradiction. Hence $x_1, \dots, x_k$ are linearly independent.

Remarks: Let $S^{-1} A S = D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ with $S = (x_1, \dots, x_n)$. (i) $(\lambda_j, x_j)$ is an eigenpair of $A$ for $j = 1, \dots, n$. (ii) The diagonalizing matrix $S$ is not unique, because its columns can be reordered or multiplied by a nonzero scalar. (iii) If $A$ has $n$ distinct eigenvalues, $A$ is diagonalizable. If the eigenvalues are not distinct, then $A$ may or may not be diagonalizable, depending on whether or not $A$ has $n$ linearly independent eigenvectors. (iv) $A = S D S^{-1}$, and hence $A^k = S D^k S^{-1}$.

Example: Let For Let

Def: If an $n \times n$ matrix $A$ has fewer than $n$ linearly independent eigenvectors, we say that $A$ is defective. e.g. (i) is defective (ii) is defective
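The concrete defective examples did not survive in this transcript, so here is a standard one of my own, a $2 \times 2$ Jordan block:

```python
import numpy as np

# Jordan block: the classic defective matrix (my choice, not from the slides)
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# lambda = 2 has algebraic multiplicity 2, but the eigenspace N(A - 2I)
# is only one-dimensional, so A has fewer than 2 independent eigenvectors.
nullity = 2 - np.linalg.matrix_rank(A - 2 * np.eye(2))
```

Here `nullity` is 1, not 2, which is exactly the rank/nullity count used in the next example to decide diagonalizability.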

Example: Let $A$ and $B$ both have the same eigenvalues. Nullity$(A - 2I) = 1$, so the eigenspace associated with $\lambda = 2$ has only one dimension, and $A$ is NOT diagonalizable. However, Nullity$(B - 2I) = 2$, so $B$ is diagonalizable.

Question: Is the following matrix diagonalizable?

The Exponential of a Matrix.
Motivation: The general solution of $x' = ax$ is $x(t) = c e^{at}$; the unique solution of $x' = ax$, $x(0) = x_0$, is $x(t) = e^{at} x_0$. Analogously, the solution of $x' = Ax$, $x(0) = x_0$, is $x(t) = e^{At} x_0$. Question: What is $e^{At}$, and how do we compute it?

Note that, for a scalar $a$, $e^a = \sum_{k=0}^{\infty} \frac{a^k}{k!}$. Define $e^A = \sum_{k=0}^{\infty} \frac{A^k}{k!} = I + A + \frac{A^2}{2!} + \cdots$.

Suppose $A$ is diagonalizable with $A = S D S^{-1}$. Then $A^k = S D^k S^{-1}$, so $e^A = S e^D S^{-1}$, where $e^D = \operatorname{diag}(e^{\lambda_1}, \dots, e^{\lambda_n})$.

Example: Compute $e^A$. Sol: Find the eigenvalues of $A$ and their eigenvectors, then apply $e^A = S e^D S^{-1}$.
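A sketch of both routes to $e^A$: the diagonalization formula and the defining power series. The matrix is again my illustrative one with eigenvalues 2 and 3:

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])          # illustrative matrix, eigenvalues 2 and 3

# Route 1: e^A via diagonalization, A = S D S^{-1}  =>  e^A = S e^D S^{-1}.
lam, S = np.linalg.eig(A)
expA_diag = (S * np.exp(lam)) @ np.linalg.inv(S)   # S @ diag(e^lam) @ S^{-1}

# Route 2: e^A via the truncated power series I + A + A^2/2! + ...
expA_series = np.zeros_like(A)
term = np.eye(2)                     # current term A^k / k!
for k in range(1, 30):
    expA_series += term
    term = term @ A / k
```

For large or non-diagonalizable matrices one would instead use a dedicated routine such as `scipy.linalg.expm`; the series above is only for illustration.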

Hermitian matrices: Let $A \in \mathbb{C}^{m \times n}$. Then $A$ can be written as $A = B + iC$, where $B, C \in \mathbb{R}^{m \times n}$.

Let $A \in \mathbb{C}^{m \times n}$. Then the conjugate transpose is $A^H = \bar{A}^T = B^T - i C^T$.

Def: A matrix $A$ is said to be Hermitian if $A^H = A$. $A$ is said to be skew-Hermitian if $A^H = -A$. $A$ is said to be unitary if $A^H A = I$ (→ its column vectors form an orthonormal set in $\mathbb{C}^n$).

Theorem 6.4.1: Let $A$ be Hermitian. Then (i) the eigenvalues of $A$ are real; (ii) eigenvectors belonging to distinct eigenvalues are orthogonal. Pf: (i) Let $(\lambda, x)$ be an eigenpair of $A$. Then $(x^H A x)^H = x^H A^H x = x^H A x$, so $x^H A x$ is real, and $\lambda = x^H A x / x^H x$ is real. (ii) Let $(\lambda_1, x_1)$ and $(\lambda_2, x_2)$ be two eigenpairs of $A$ with $\lambda_1 \neq \lambda_2$. Then $\lambda_1 x_2^H x_1 = x_2^H A x_1 = (A x_2)^H x_1 = \lambda_2 x_2^H x_1$, so $(\lambda_1 - \lambda_2) x_2^H x_1 = 0$ and hence $x_2^H x_1 = 0$.
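Both conclusions of the theorem can be verified numerically. The Hermitian matrix below is generated by symmetrizing a random complex matrix (my own construction):

```python
import numpy as np

# A = B + B^H is Hermitian by construction: A^H = A.
rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T

# eigh is the solver specialized for Hermitian matrices:
# it returns real eigenvalues and orthonormal eigenvector columns.
lam, U = np.linalg.eigh(A)

eigs_are_real = np.allclose(lam.imag, 0)
columns_orthonormal = np.allclose(U.conj().T @ U, np.eye(4))
```

Using `eigh` instead of `eig` for Hermitian input is both faster and guarantees the real-eigenvalue/orthogonal-eigenvector structure proved above.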

Theorem: Let $A$ be skew-Hermitian. Then the eigenvalues of $A$ are purely imaginary. Pf: Let $(\lambda, x)$ be an eigenpair of $A$. Then $(x^H A x)^H = x^H A^H x = -x^H A x$, so $x^H A x$ is purely imaginary, and hence $\lambda = x^H A x / x^H x$ is purely imaginary.

Theorem 6.4.3 (Schur's Theorem): Let $A \in \mathbb{C}^{n \times n}$.
Then there exists a unitary matrix $U$ such that $U^H A U = T$ is upper triangular. Pf: Let $(\lambda_1, x_1)$ be an eigenpair of $A$ with $\|x_1\| = 1$. Choose $U_1$ to be unitary with $x_1$ as its first column; then the first column of $U_1^H A U_1$ is $(\lambda_1, 0, \dots, 0)^T$. Apply the same argument to the remaining $(n-1) \times (n-1)$ block, choosing the next unitary factor accordingly. Continue this process; we have the theorem.

Theorem 6.4.4 (Spectral Theorem):
If $A$ is Hermitian, then there exists a unitary matrix $U$ that diagonalizes $A$. Pf: By the previous theorem, there is a unitary $U$ with $U^H A U = T$, where $T$ is upper triangular. Then $T^H = (U^H A U)^H = U^H A^H U = U^H A U = T$, so $T$ is both upper and lower triangular; hence $T$ is a diagonal matrix.

Cor: Let $A$ be a real symmetric matrix.
Then (i) the eigenvalues of $A$ are real; (ii) there exists an orthogonal matrix $U$ such that $U^T A U$ is a diagonal matrix. Remark: If $A$ is Hermitian, then, by Th 6.4.4, $A$ has a complete orthonormal eigenbasis, namely the columns of $U$.

Example: Find an orthogonal matrix $U$ that diagonalizes $A$. Sol: (i) find the eigenvalues of $A$; (ii) find bases of the eigenspaces; (iii) orthonormalize them by the Gram-Schmidt process. The columns of the resulting $U$ form an orthonormal eigenbasis (WHY?).

Note: If $A$ has an orthonormal eigenbasis, then $A = U D U^H$ where $U$ is unitary and $D$ is diagonal, so $A A^H = U D D^H U^H = U D^H D U^H = A^H A$.
Question: In addition to Hermitian matrices, are there any other matrices possessing an orthonormal eigenbasis?

Def: $A$ is said to be normal if $A A^H = A^H A$.
Remark: Hermitian, skew-Hermitian and unitary matrices are all normal.

Theorem 6.4.6: $A$ is normal $\iff$ $A$ possesses an
orthonormal eigenbasis. Pf: ($\Leftarrow$) shown above. ($\Rightarrow$) By Schur's theorem, there is a unitary $U$ such that $U^H A U = T$ is upper triangular; then $T$ is also normal. Comparing the diagonal elements of $T T^H$ and $T^H T$ row by row shows that $T$ is diagonal, so $A$ has an orthonormal eigenbasis (WHY?).
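A rotation matrix is a convenient non-Hermitian normal example (orthogonal, hence normal, but not symmetric); this sketch, with an angle of my own choosing, checks both sides of the theorem:

```python
import numpy as np

# A plane rotation: orthogonal (hence normal) but not symmetric.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

is_normal = np.allclose(A @ A.T, A.T @ A)

# Its eigenvalues are the complex pair e^{±i theta}, and the (complex)
# eigenvector matrix returned by eig has orthonormal columns, i.e. is unitary.
lam, U = np.linalg.eig(A)
eigbasis_is_unitary = np.allclose(U.conj().T @ U, np.eye(2), atol=1e-10)
```

So the orthonormal eigenbasis exists even though it is genuinely complex; a real orthogonal diagonalization would require $A$ to be symmetric.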

Singular Value Decomposition (SVD):
Theorem: Let $A \in \mathbb{C}^{m \times n}$ with $\operatorname{rank}(A) = r$. Then there exist unitary matrices $U \in \mathbb{C}^{m \times m}$ and $V \in \mathbb{C}^{n \times n}$ with $A = U \Sigma V^H$, where $\Sigma \in \mathbb{R}^{m \times n}$ is diagonal with diagonal entries $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r > 0 = \sigma_{r+1} = \cdots$.

Remark: In the SVD $A = U \Sigma V^H$, the scalars $\sigma_i$ are called the singular values of $A$, the columns of $U$ are called the left singular vectors of $A$, and the columns of $V$ are called the right singular vectors of $A$.

Pf: Note that $A^H A$ is Hermitian
and positive semidefinite, so there is a unitary matrix $V$ with $V^H A^H A V = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, where $\lambda_1 \ge \cdots \ge \lambda_r > 0 = \lambda_{r+1} = \cdots = \lambda_n$. (1) Define $\sigma_i = \sqrt{\lambda_i}$. (2) Define $u_i = \frac{1}{\sigma_i} A v_i$ for $i = 1, \dots, r$; these are orthonormal. (3) Extend $u_1, \dots, u_r$ to an orthonormal basis of $\mathbb{C}^m$; then $U = (u_1, \dots, u_m)$ is unitary and $A = U \Sigma V^H$.

Remark: In the SVD $A = U \Sigma V^H$: the singular values of $A$ are unique, while $U$ and $V$ are not unique; the columns of $U$ are an orthonormal eigenbasis for $A A^H$; the columns of $V$ are an orthonormal eigenbasis for $A^H A$; and $\{v_{r+1}, \dots, v_n\}$ is an orthonormal basis for $N(A)$.
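The eigenbasis claims in this remark are easy to verify numerically on a random matrix of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A)            # A = U @ Sigma @ Vt, s descending

# The squared singular values are the eigenvalues of A^T A
# (eigvalsh returns them ascending, so reverse for comparison).
evals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
spectra_match = np.allclose(evals, s**2)

# Reassemble A from the factors (rank is 3 here, so 3 triples suffice).
recon = U[:, :3] @ np.diag(s) @ Vt
```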

rank(A) = number of nonzero singular values,
but rank(A) ≠ number of nonzero eigenvalues in general; for example, $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ has rank 1 and one nonzero singular value, but both of its eigenvalues are 0.

Example: Find an SVD of $A$. Sol: An orthonormal eigenbasis of $A^T A$, associated with its eigenvalues, can be found; this gives $V$ and the singular values. To find $U$: a set of candidates for its columns is $\{\frac{1}{\sigma_i} A v_i\}$, extended to an orthonormal basis so that $U$ is orthogonal. Thus $A = U \Sigma V^T$.

Lemma 6.5.2: Let $Q$ be orthogonal. Then $\|Qx\|_2 = \|x\|_2$ for all $x$. Pf: $\|Qx\|_2^2 = x^T Q^T Q x = x^T x = \|x\|_2^2$.

Cor: Let $A = U \Sigma V^T$ be the SVD of $A$. Then $\|A\|_2 = \|\Sigma\|_2 = \sigma_1$.

We'll state the next result without proof.
Theorem 6.5.3: Hypotheses: (1) $A = U \Sigma V^T$ is the SVD of $A$; (2) Conclusion:

Application: Digital Image Processing
(especially efficient for matrices of low rank).
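The compression idea is to keep only the largest few singular triples. A minimal sketch on synthetic data (a low-rank "image" I construct myself, plus small noise):

```python
import numpy as np

# A toy "image": a rank-2 matrix plus small noise (synthetic data).
rng = np.random.default_rng(4)
img = np.outer(rng.random(50), rng.random(40)) \
    + np.outer(rng.random(50), rng.random(40)) \
    + 1e-3 * rng.standard_normal((50, 40))

U, s, Vt = np.linalg.svd(img, full_matrices=False)

k = 2                                    # keep only the top-k singular triples
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storage drops from 50*40 values to k*(50 + 40 + 1) values, while the
# relative error stays on the order of the noise level.
rel_err = np.linalg.norm(img - approx) / np.linalg.norm(img)
```

For a genuine photograph one would load the pixel matrix and pick `k` by inspecting how fast the singular values decay.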

Optimization: an application of calculus.

Def: A quadratic equation in two variables $x$ and $y$
is an equation of the form $ax^2 + bxy + cy^2 + dx + ey + f = 0$.

Standard forms of conic sections:
(i) circle; (ii) ellipse; (iii) hyperbola; (iv) parabola. Note: Is there any difference between the eigenvalues of the matrix $A$ of the quadratic form in each of these cases?

Goal : Try to transform the quadratic equation
into standard form by suitable translation and rotation

Example: (no $xy$ term) The eigenvalues of the quadratic part are 9 and 4, both positive → ellipse. A translation (completing the square) brings the equation to standard form.

Example: (with $xy$ term) By direct computation, find an orthogonal matrix $U$ that diagonalizes the matrix of the quadratic form (why does such a $U$ exist?). Let $x = Uy$; then the original equation becomes one with no cross term.

Example : → Let or

Optimization: Let $f : \mathbb{R}^n \to \mathbb{R}$ be twice continuously differentiable. It is known from Taylor's Theorem of calculus that $f(x_0 + h) = f(x_0) + \nabla f(x_0)^T h + \frac{1}{2} h^T H(x_0) h + o(\|h\|^2)$, where $H(x_0) = [\partial^2 f / \partial x_i \partial x_j(x_0)]$ is the Hessian matrix. If $\nabla f(x_0) = 0$, then $x_0$ is a candidate local extremum. If in addition $H(x_0) > 0$, then $x_0$ is a local minimum.

Def: A real symmetric matrix $A$ is said to be
(i) positive definite, denoted by $A > 0$, if $x^T A x > 0$ for all $x \neq 0$; (ii) negative definite, denoted by $A < 0$, if $x^T A x < 0$ for all $x \neq 0$; (iii) positive semidefinite, denoted by $A \ge 0$, if $x^T A x \ge 0$ for all $x$; (iv) negative semidefinite, denoted by $A \le 0$, if $x^T A x \le 0$ for all $x$. A matrix satisfying none of these is indefinite. Question: Given a real symmetric matrix, how can we determine its definiteness efficiently?
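One answer (anticipating Theorem 6.5.1) is to classify by eigenvalue signs; here is a sketch, with illustrative matrices of my own:

```python
import numpy as np

def definiteness(A, tol=1e-12):
    """Classify a real symmetric matrix by the signs of its eigenvalues."""
    lam = np.linalg.eigvalsh(A)          # real eigenvalues, symmetric solver
    if np.all(lam > tol):
        return "positive definite"
    if np.all(lam < -tol):
        return "negative definite"
    if np.all(lam >= -tol):
        return "positive semidefinite"
    if np.all(lam <= tol):
        return "negative semidefinite"
    return "indefinite"

# Illustrative matrices (my own, not from the slides)
pd_case = definiteness(np.array([[2.0, 1.0], [1.0, 2.0]]))      # eigenvalues 1, 3
indef_case = definiteness(np.array([[1.0, 0.0], [0.0, -1.0]]))  # eigenvalues ±1
```

For large matrices, attempting a Cholesky factorization (which fails unless $A > 0$) is cheaper than computing the full spectrum.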

Theorem 6.5.1: Let $A$ be real symmetric. Then $A > 0$ $\iff$ all the eigenvalues of $A$ are positive. Pf: ($\Rightarrow$) Let $(\lambda, x)$ be an eigenpair of $A$; then $\lambda \|x\|^2 = x^T A x > 0$, so $\lambda > 0$. ($\Leftarrow$) Suppose all eigenvalues are positive. Let $u_1, \dots, u_n$ be an orthonormal eigenbasis of $A$ (why can we assume this?). For $x = \sum_i c_i u_i \neq 0$, $x^T A x = \sum_i \lambda_i c_i^2 > 0$.

Example: Find the local extrema of $f$.
Sol: Find the critical points from $\nabla f = 0$ and test the Hessian at each. Thus $f$ has a local maximum at the critical point where the Hessian is negative definite, while the remaining critical points are saddle points.

Positive Definite Matrices:
Property I: If $P > 0$, then $P$ is nonsingular. Property II: If $A > 0$, then $\det(A) > 0$ and all the leading principal submatrices of $A$ are positive definite.

Property III: $P > 0$ can be reduced to upper triangular form using only row operation III, and the pivot elements will all be positive. Sketch of the proof: the leading principal minors are positive, and the determinant is invariant under row operations of type III, so each pivot (a ratio of consecutive leading principal minors) is positive. Continuing this process, the property can be proved.

Property IV: Let $A > 0$. Then (i) $A$ can be decomposed as $A = LU$, where $L$ is lower triangular and $U$ is upper triangular; (ii) $A$ can be decomposed as $A = LDU$, where $L$ is lower triangular and $U$ is upper triangular, both with all diagonal elements equal to 1, and $D$ is a diagonal matrix. Pf: by Gaussian elimination and the fact that the product of two lower (upper) triangular matrices is lower (upper) triangular.

Example: Thus $A = LU$. Also $A = LDU$, with $D$ containing the pivots of $U$.
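The worked matrices did not survive in this transcript, so here is a small sketch with a positive definite matrix of my own, computing $LU$ by type-III eliminations and then extracting the $LDU$ form:

```python
import numpy as np

def lu_nopivot(A):
    """A = L U by Gaussian elimination using only type-III row operations.
    Valid when no pivot vanishes, e.g. for a positive definite A."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for j in range(n):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]      # multiplier stored in L
            U[i, :] -= L[i, j] * U[j, :]     # type-III row operation
    return L, U

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])                   # symmetric positive definite
L, U = lu_nopivot(A)

# LDU: pull the pivots out of U so its diagonal becomes all ones.
D = np.diag(np.diag(U))
U1 = np.linalg.inv(D) @ U                    # now A = L @ D @ U1
```

Since this `A` is symmetric, `U1` comes out equal to `L.T`, which is exactly Property V below.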

Property V: Let $A = A^T = LDU$. Then $U = L^T$. Pf: $A = A^T = U^T D L^T$; here $U^T$ is lower triangular and $L^T$ is upper triangular, both with diagonal elements 1, so by the uniqueness of the LDU factorization, $U = L^T$.

Property VI: Let $A > 0$ be symmetric. Then $A$ can be factored into $A = L D L^T$, where $D$ is a diagonal matrix and $L$ is lower triangular with 1's along the diagonal. Pf: since the LDU representation is unique, $U = L^T$ by Property V.

Property VII (Cholesky decomposition):
Let $A > 0$ be symmetric. Then $A$ can be factored into $A = L_1 L_1^T$, where $L_1$ is lower triangular with positive diagonal. Hint: $A = L D L^T = (L D^{1/2})(L D^{1/2})^T$.

Example: We have seen that $A = L D L^T$.
Note that the diagonal entries of $D$ are positive. Define $L_1 = L D^{1/2}$; we have the Cholesky decomposition $A = L_1 L_1^T$.
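The hint $L_1 = L D^{1/2}$ can be checked against a library Cholesky routine; the $2 \times 2$ positive definite matrix and its $LDL^T$ factors below are my own worked numbers:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])            # symmetric positive definite

L1 = np.linalg.cholesky(A)            # lower triangular, positive diagonal

# Same factor built by hand from A = L D L^T:
# here L = [[1, 0], [0.5, 1]] and D = diag(4, 2), so L1 = L @ D^{1/2}.
L1_by_hand = np.array([[1.0, 0.0],
                       [0.5, 1.0]]) @ np.diag(np.sqrt([4.0, 2.0]))
```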

Theorem 6.6.1: Let $A$ be real symmetric. Then the following are equivalent: (i) $A > 0$;
(ii) all the leading principal submatrices have positive determinants; (iii) $A$ can be reduced to an upper triangular matrix $U$ using only elementary row operations of type III, and the pivots are all positive; (iv) $A$ has a Cholesky decomposition $L L^T$; (v) $A$ can be factored into $B^T B$ for some nonsingular matrix $B$. Pf: We have shown that (i) $\Rightarrow$ (ii) $\Rightarrow$ (iii) $\Rightarrow$ (iv). In addition, (iv) $\Rightarrow$ (v) and (v) $\Rightarrow$ (i) are trivial.

Householder Transformation:
Def: Let $v \neq 0$. Then the matrix $Q = I - \frac{2}{v^T v} v v^T$ is called a Householder transformation. Geometrical interpretation: $Q$ reflects a vector across the hyperplane orthogonal to $v$. $Q$ is symmetric: $Q^T = Q$. $Q$ is orthogonal: $Q^T Q = I$. What are the eigenvalues, eigenvectors and determinant of $Q$?
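The stated properties, and the answer to the closing question, can be checked directly. The vector $v$ is my own example:

```python
import numpy as np

def householder(v):
    """Q = I - 2 v v^T / (v^T v): reflection across the hyperplane normal to v."""
    v = np.asarray(v, dtype=float)
    return np.eye(len(v)) - 2.0 * np.outer(v, v) / (v @ v)

v = np.array([3.0, 4.0])
Q = householder(v)

is_symmetric = np.allclose(Q, Q.T)
is_orthogonal = np.allclose(Q @ Q.T, np.eye(2))

# Q v = -v (eigenvalue -1), while vectors orthogonal to v are fixed
# (eigenvalue +1); hence det(Q) = -1.
reflects_v = np.allclose(Q @ v, -v)
detQ = np.linalg.det(Q)
```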

Given $A$, find a QR factorization $A = QR$ using Householder transformations.

Theorem: Let $A$ be nonsingular and $A = U \Sigma V^T$ be an SVD of $A$ with singular values $\sigma_1 \ge \cdots \ge \sigma_n > 0$. Then $\|A\|_2 = \sigma_1$. Pf: $\|A\|_2 = \max_{\|x\|_2 = 1} \|U \Sigma V^T x\|_2 = \max_{\|y\|_2 = 1} \|\Sigma y\|_2 = \sigma_1$. Cor: Let $A$ be nonsingular with singular values $\sigma_1 \ge \cdots \ge \sigma_n > 0$. Then $\|A\|_2 = \sigma_1$ and $\|A^{-1}\|_2 = 1/\sigma_n$.

Application: In solving $Ax = b$, what is the effect on the
solution when $b$ contains measurement error?

$\operatorname{cond}(A) = \|A\| \, \|A^{-1}\|$ is said to be the condition number of $A$.
If $A$ is orthogonal, then $\operatorname{cond}_2(A) = \sigma_1 / \sigma_n = 1$. This means that, for a given error in $b$, the resulting deviation of the solution of $Ax = b$ is smallest when $A$ is orthogonal.
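A minimal numerical sketch of both regimes, using matrices of my own choosing (a rotation, and a nearly singular matrix):

```python
import numpy as np

# Orthogonal matrices have 2-norm condition number 1: all singular values equal 1.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
kappa_Q = np.linalg.cond(Q)              # sigma_max / sigma_min = 1

# A nearly singular matrix has a huge condition number: small perturbations
# of b can produce large changes in the solution of A x = b.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
kappa_A = np.linalg.cond(A)              # on the order of 10^4
```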

Example: $A$ is close to singular. Note that $x$ is the solution for $b$, while $\tilde{x}$ is the solution for the perturbed $\tilde{b}$; a small deviation in $b$ results in a large deviation in $x$. What does this mean? Similarly, a small deviation in $x$ results in a large deviation in $b$. This is the reason why we use orthogonal factorizations in numerically solving $Ax = b$.