CHAPTER SIX Eigenvalues

Outline: Systems of linear ODEs (omitted); Diagonalization; Hermitian matrices; Quadratic forms; Positive definite matrices.

Motivation: to simplify the description of a linear dynamical system as far as possible, and to understand the system's characteristics, e.g., the long-term behavior of its dynamics.

Example: In a certain town, each year 30% of the married women get divorced and 20% of the single women get married. In the first year there are 8000 married women and 2000 single women, and the total population remains constant. Let $x^{(i)} = (m_i, s_i)^T$ be the numbers of women at year $i$, where $m_i$ and $s_i$ represent married and single women respectively; then $x^{(i+1)} = A x^{(i)}$ with $A = \begin{pmatrix} 0.7 & 0.2 \\ 0.3 & 0.8 \end{pmatrix}$.

If $x^{(n+1)} = A x^{(n)}$, then $x^{(n)} = A^n x^{(0)}$. Question: Why does $x^{(n)}$ converge? Why does it converge to the same limit even when the initial condition is different?

Ans: Choose a basis of eigenvectors $\{u_1, u_2\}$ of A. Given an initial $x^{(0)} = c_1 u_1 + c_2 u_2$ for some scalars $c_1, c_2$, we get $A^n x^{(0)} = c_1 \lambda_1^n u_1 + c_2 \lambda_2^n u_2$; for this example $\lambda_1 = 1$ and $\lambda_2 = 0.5$, so $A^n x^{(0)} \to c_1 u_1$. Question: How does one know how to choose such a basis?
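A minimal numerical sketch of this convergence (Python/NumPy); the transition matrix is the one reconstructed above from the stated rates, and the second initial vector is an arbitrary stand-in to show the limit does not depend on the starting split:

```python
import numpy as np

A = np.array([[0.7, 0.2],    # married next year: 70% stay married + 20% of singles marry
              [0.3, 0.8]])   # single next year: 30% of married divorce + 80% stay single

for x0 in (np.array([8000.0, 2000.0]), np.array([1000.0, 9000.0])):
    x = x0
    for _ in range(50):      # iterate x_{n+1} = A x_n
        x = A @ x
    print(x0, "->", x)       # both initial conditions approach (4000, 6000)
```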

Def: Let $A \in \mathbb{C}^{n \times n}$. A scalar $\lambda$ is said to be an eigenvalue or characteristic value of A if there exists a nonzero vector $x$ such that $Ax = \lambda x$. The vector $x$ is said to be an eigenvector or characteristic vector belonging to $\lambda$, and $(\lambda, x)$ is called an eigenpair of A. Question: Given A, how do we compute its eigenvalues and eigenvectors?

$(\lambda, x)$ is an eigenpair of A $\iff (A - \lambda I)x = 0$ for some $x \neq 0 \iff A - \lambda I$ is singular $\iff \det(A - \lambda I) = 0$. Note that $p(\lambda) = \det(A - \lambda I)$ is a polynomial of degree n in $\lambda$, called the characteristic polynomial of A. Thus, by the Fundamental Theorem of Algebra, A has exactly n eigenvalues, counting multiplicities. Any nonzero $x \in N(A - \lambda I)$ is an eigenvector associated with the eigenvalue $\lambda$, while $N(A - \lambda I)$ is the corresponding eigenspace of A.

Example: Suppose $\det(A - \lambda I) = (\lambda - 2)(\lambda - 3)$, so 2 and 3 are the eigenvalues of A. To find the eigenspace of 2 (i.e., $N(A - 2I)$), solve $(A - 2I)x = 0$ for its nontrivial solutions.

To find the eigenspace of 3 (i.e., $N(A - 3I)$), solve $(A - 3I)x = 0$ in the same way, and let the nontrivial solutions be the eigenvectors belonging to 3.
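A numerical sketch of this computation; the matrix A below is a hypothetical stand-in with eigenvalues 2 and 3, since the slide's original matrix is not recoverable:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Roots of the characteristic polynomial det(A - lambda*I) = 0
print(np.roots(np.poly(A)))          # eigenvalues 2 and 3 (as roots)

# Eigenpairs directly; column i of V spans the eigenspace N(A - w[i] I)
w, V = np.linalg.eig(A)
for lam, v in zip(w, V.T):
    print(lam, v, np.allclose(A @ v, lam * v))   # verify A v = lambda v
```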

Let $A \in \mathbb{C}^{n \times n}$. Then $\lambda$ is an eigenvalue of A $\iff (A - \lambda I)x = 0$ has a nontrivial solution $\iff A - \lambda I$ is singular $\iff \det(A - \lambda I) = 0 \iff A - \lambda I$ loses rank.

Let $A \in \mathbb{R}^{n \times n}$. If $\lambda$ is an eigenvalue of A with eigenvector $x$, then $A\bar{x} = \overline{Ax} = \overline{\lambda x} = \bar{\lambda}\bar{x}$. This means that $(\bar{\lambda}, \bar{x})$ is also an eigenpair of A.

Let $A \in \mathbb{C}^{n \times n}$, where $\lambda_1, \dots, \lambda_n$ are the eigenvalues of A, so that $p(\lambda) = \det(A - \lambda I) = (\lambda_1 - \lambda)(\lambda_2 - \lambda)\cdots(\lambda_n - \lambda)$. (i) Letting $\lambda = 0$ gives $\det(A) = \lambda_1\lambda_2\cdots\lambda_n$. (ii) Comparing the coefficients of $\lambda^{n-1}$ on both sides, we have $\operatorname{tr}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n$.
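Both identities are easy to check numerically; the random test matrix below is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
w = np.linalg.eigvals(A)

print(np.allclose(np.prod(w), np.linalg.det(A)))  # det(A) = product of eigenvalues
print(np.allclose(np.sum(w), np.trace(A)))        # tr(A)  = sum of eigenvalues
```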

Theorem 6.1.1: Let A and B be similar. Then $\det(A - \lambda I) = \det(B - \lambda I)$, and consequently A and B have the same eigenvalues. Pf: Let $B = S^{-1}AS$ for some nonsingular matrix S. Then $\det(B - \lambda I) = \det(S^{-1}(A - \lambda I)S) = \det(S^{-1})\det(A - \lambda I)\det(S) = \det(A - \lambda I)$.

Diagonalization. Goal: Given $A \in \mathbb{C}^{n \times n}$, find a nonsingular matrix S such that $S^{-1}AS$ is a diagonal matrix. Question 1: Are all matrices diagonalizable? Question 2: What kinds of A are diagonalizable? Question 3: How do we find S if A is diagonalizable?

NOT all matrices are diagonalizable. E.g., let $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$. If A were diagonalizable, there would be a nonsingular matrix S with $S^{-1}AS = D$ diagonal; but both eigenvalues of A are 0, so D = 0 and hence $A = SDS^{-1} = 0$, a contradiction.

To answer Q2, suppose A is diagonalizable: there is a nonsingular matrix S with $S^{-1}AS = D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, i.e., $AS = SD$. Writing $S = (s_1, \dots, s_n)$ column by column, $As_j = \lambda_j s_j$, so $(\lambda_j, s_j)$ is an eigenpair of A for $j = 1, \dots, n$. This gives a condition for diagonalizability and a way to find S.

Theorem 6.3.2: Let $A \in \mathbb{C}^{n \times n}$. A is diagonalizable $\iff$ A has n linearly independent eigenvectors. Note: a similarity transformation is a change of coordinates; diagonalization is the change to eigenvector coordinates, in which A acts diagonally.
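A small sketch of the diagonalization recipe, using a hypothetical matrix with distinct eigenvalues 2 and 3:

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])          # eigenvalues 2 and 3, hence diagonalizable

w, S = np.linalg.eig(A)              # columns of S are eigenvectors
D = np.linalg.inv(S) @ A @ S         # similarity transform

print(np.allclose(D, np.diag(w)))                          # S^{-1} A S = diag(w)
print(np.allclose(S @ np.diag(w) @ np.linalg.inv(S), A))   # A = S D S^{-1}
```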

Theorem 6.3.1: If $\lambda_1, \dots, \lambda_k$ are distinct eigenvalues of a matrix A with corresponding eigenvectors $x_1, \dots, x_k$, then $x_1, \dots, x_k$ are linearly independent. Pf: Suppose not; choose a minimal dependent subset, so that $c_1x_1 + \cdots + c_mx_m = 0$ with all $c_i \neq 0$. Applying A gives $c_1\lambda_1x_1 + \cdots + c_m\lambda_mx_m = 0$; subtracting $\lambda_m$ times the first relation yields $\sum_{i=1}^{m-1} c_i(\lambda_i - \lambda_m)x_i = 0$. Since the $\lambda_i$ are distinct, this is a shorter nontrivial dependence, a contradiction. Hence $x_1, \dots, x_k$ are linearly independent.

Remarks: Let $S^{-1}AS = D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ and $S = (s_1, \dots, s_n)$. (i) $(\lambda_j, s_j)$ is an eigenpair of A for $j = 1, \dots, n$. (ii) The diagonalizing matrix S is not unique, because its columns can be reordered or multiplied by any nonzero scalar. (iii) If A has n distinct eigenvalues, A is diagonalizable; if the eigenvalues are not distinct, then A may or may not be diagonalizable, depending on whether or not A has n linearly independent eigenvectors. (iv) If $A = SDS^{-1}$, then $A^k = SD^kS^{-1}$.

Example: Let A be a diagonalizable matrix with eigenpairs $(\lambda_1, x_1), \dots, (\lambda_n, x_n)$. For $S = (x_1, \dots, x_n)$, we get $S^{-1}AS = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$.

Def: If an n × n matrix A has fewer than n linearly independent eigenvectors, we say that A is defective. E.g., (i) $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ is defective; (ii) $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ is defective: each has a double eigenvalue with only a one-dimensional eigenspace.

Example: Let A and B both have the same eigenvalues, with 2 as a repeated eigenvalue. Nullity(A − 2I) = 1, i.e., the eigenspace associated with $\lambda = 2$ is only one-dimensional, so A is NOT diagonalizable. However, Nullity(B − 2I) = 2, so B is diagonalizable.
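A sketch of this nullity test; A and B below are hypothetical matrices sharing the repeated eigenvalue 2, in the spirit of the example:

```python
import numpy as np

def nullity(M, tol=1e-10):
    # nullity = (number of columns) - rank, with rank computed from the SVD
    return M.shape[1] - np.linalg.matrix_rank(M, tol=tol)

A = np.array([[2.0, 1.0], [0.0, 2.0]])   # Jordan block: defective
B = np.array([[2.0, 0.0], [0.0, 2.0]])   # 2I: diagonalizable

I = np.eye(2)
print(nullity(A - 2 * I))   # 1 -> only one independent eigenvector: NOT diagonalizable
print(nullity(B - 2 * I))   # 2 -> two independent eigenvectors: diagonalizable
```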

Question: Is the following matrix diagonalizable?

The Exponential of a Matrix. Motivation: The general solution of $x'(t) = ax(t)$ is $x(t) = ce^{at}$; the unique solution of $x'(t) = Ax(t)$, $x(0) = x_0$, is $x(t) = e^{At}x_0$. Question: What is $e^{At}$ and how do we compute it?

Note that $e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots$ for scalars. Define $e^A = I + A + \frac{A^2}{2!} + \frac{A^3}{3!} + \cdots$.

Suppose A is diagonalizable with $A = SDS^{-1}$, $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$. Then $A^k = SD^kS^{-1}$, so $e^A = Se^DS^{-1} = S\operatorname{diag}(e^{\lambda_1}, \dots, e^{\lambda_n})S^{-1}$.

Example: Compute $e^{At}$ for a given diagonalizable A. Sol: The eigenvalues of A, with their eigenvectors, give $A = SDS^{-1}$; then $e^{At} = Se^{Dt}S^{-1} = S\operatorname{diag}(e^{\lambda_1 t}, \dots, e^{\lambda_n t})S^{-1}$.
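A sketch of the eigendecomposition route to $e^{At}$, cross-checked against SciPy's expm; the matrix is a hypothetical stand-in for the slide's example:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 1.0],
              [0.0, 2.0]])            # distinct eigenvalues 1, 2 -> diagonalizable
t = 0.5

w, S = np.linalg.eig(A)                          # A = S D S^{-1}
eAt = S @ np.diag(np.exp(w * t)) @ np.linalg.inv(S)

print(np.allclose(eAt, expm(A * t)))             # True: both routes agree
```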

Hermitian matrices: Let $A \in \mathbb{C}^{m \times n}$; then A can be written as $A = B + iC$, where B and C are real m × n matrices (the real and imaginary parts of A).

Let $A \in \mathbb{C}^{m \times n}$; then the conjugate transpose of A is $A^H = \bar{A}^T$.

Def: A matrix A is said to be Hermitian if $A^H = A$. A is said to be skew-Hermitian if $A^H = -A$. A is said to be unitary if $A^HA = I$ (→ its column vectors form an orthonormal set in $\mathbb{C}^n$).

Theorem 6.4.1: Let A be Hermitian. Then (i) the eigenvalues of A are real; (ii) eigenvectors belonging to distinct eigenvalues are orthogonal. Pf: (i) Let $(\lambda, x)$ be an eigenpair of A; then $\lambda x^Hx = x^HAx = (Ax)^Hx = \bar{\lambda}x^Hx$, so $\lambda = \bar{\lambda}$ is real. (ii) Let $(\lambda_1, x_1)$ and $(\lambda_2, x_2)$ be two eigenpairs of A with $\lambda_1 \neq \lambda_2$; then $\lambda_1 x_2^Hx_1 = x_2^HAx_1 = (Ax_2)^Hx_1 = \lambda_2 x_2^Hx_1$, so $x_2^Hx_1 = 0$.

Theorem: Let A be skew-Hermitian. Then every eigenvalue of A is purely imaginary. Pf: Let $(\lambda, x)$ be an eigenpair of A; then $\lambda x^Hx = x^HAx = -(Ax)^Hx = -\bar{\lambda}x^Hx$, so $\bar{\lambda} = -\lambda$, i.e., $\lambda$ is purely imaginary.

Theorem 6.4.3 (Schur's Theorem): Let $A \in \mathbb{C}^{n \times n}$. Then there exists a unitary matrix U such that $U^HAU = T$ is upper triangular. Pf (sketch): Let $(\lambda_1, x_1)$ be an eigenpair of A with $\|x_1\| = 1$. Choose $U_1 = (x_1, u_2, \dots, u_n)$ to be unitary (extend $x_1$ to an orthonormal basis). Then $U_1^HAU_1 = \begin{pmatrix} \lambda_1 & * \\ 0 & A_1 \end{pmatrix}$. Choose a unitary $U_2$ doing the same for the (n−1) × (n−1) block $A_1$; continuing this process, we have the theorem.

Theorem 6.4.4 (Spectral Theorem): If A is Hermitian, then there exists a unitary matrix U that diagonalizes A. Pf: By the previous theorem, there is a unitary matrix U with $U^HAU = T$, where T is upper triangular. Then $T^H = U^HA^HU = U^HAU = T$, so T is both upper and lower triangular, i.e., T is a diagonal matrix.

Cor: Let A be a real symmetric matrix. Then (i) the eigenvalues of A are real; (ii) there exists an orthogonal matrix U such that $U^TAU$ is a diagonal matrix. Remark: If A is Hermitian, then, by Th 6.4.4, $A = UDU^H$ with U unitary, i.e., A has a complete orthonormal eigenbasis.

Example: Find an orthogonal matrix U that diagonalizes a real symmetric A. Sol: (i) Find the eigenvalues of A. (ii) Find a basis of each eigenspace. (iii) Orthonormalize each basis by the Gram–Schmidt process. The columns of U then form an orthonormal eigenbasis (WHY?).
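A numerical version of these steps via numpy.linalg.eigh, which returns an orthonormal eigenbasis directly; the symmetric matrix below is a hypothetical example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

w, U = np.linalg.eigh(A)                     # U^T A U = diag(w) for symmetric A
print(w)                                     # real eigenvalues: [1. 3.]
print(np.allclose(U.T @ U, np.eye(2)))       # True: U is orthogonal
print(np.allclose(U.T @ A @ U, np.diag(w)))  # True: U diagonalizes A
```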

Note: If A has an orthonormal eigenbasis, then $A = UDU^H$, where U is unitary and D is diagonal. Question: In addition to Hermitian matrices, are there any other matrices possessing an orthonormal eigenbasis?

Def: A is said to be normal if $AA^H = A^HA$. Remark: Hermitian, skew-Hermitian, and unitary matrices are all normal.

Theorem 6.4.6: A is normal $\iff$ A possesses an orthonormal eigenbasis. Pf: ($\Leftarrow$) was proved above. ($\Rightarrow$) By Th. 6.4.3, there is a unitary U such that $U^HAU = T$ is upper triangular; T is then also normal, $TT^H = T^HT$. Comparing the diagonal elements of $TT^H$ and $T^HT$ row by row shows that T is a diagonal matrix, so A has an orthonormal eigenbasis (WHY?).

Singular Value Decomposition (SVD): Theorem: Let $A \in \mathbb{C}^{m \times n}$ with rank(A) = r. Then there exist unitary matrices $U \in \mathbb{C}^{m \times m}$ and $V \in \mathbb{C}^{n \times n}$ with $A = U\Sigma V^H$, where $\Sigma = \begin{pmatrix} \Sigma_r & 0 \\ 0 & 0 \end{pmatrix} \in \mathbb{R}^{m \times n}$, $\Sigma_r = \operatorname{diag}(\sigma_1, \dots, \sigma_r)$, and $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0$.

Remark: In the SVD, the scalars $\sigma_1, \dots, \sigma_r$ are called the singular values of A; the columns of U are called the left singular vectors of A; the columns of V are called the right singular vectors of A.

Pf (sketch): Note that $A^HA$ is Hermitian and positive semidefinite, so there is a unitary matrix V with $V^H(A^HA)V = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, where $\lambda_1 \geq \cdots \geq \lambda_r > 0 = \lambda_{r+1} = \cdots = \lambda_n$. (1) Define $\sigma_i = \sqrt{\lambda_i}$. (2) Define $u_i = Av_i/\sigma_i$ for $i = 1, \dots, r$; these are orthonormal. (3) Define $u_{r+1}, \dots, u_m$ so that $U = (u_1, \dots, u_m)$ is unitary. Then $U^HAV = \Sigma$, i.e., $A = U\Sigma V^H$.

Remark: In the SVD $A = U\Sigma V^H$: the singular values of A are unique, while U and V are not unique; the columns of U form an orthonormal eigenbasis for $AA^H$; the columns of V form an orthonormal eigenbasis for $A^HA$; and $\{v_{r+1}, \dots, v_n\}$ is an orthonormal basis for $N(A)$.

rank(A) = number of nonzero singular values, but rank(A) ≠ number of nonzero eigenvalues in general; for example, $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ has rank 1 and one nonzero singular value, yet both of its eigenvalues are 0.

Example: Find the SVD of a given A. Sol: Compute the eigenvalues of $A^TA$; their square roots are the singular values. An orthonormal eigenbasis associated with $A^TA$ can be found and taken as the columns of V. Find $u_i = Av_i/\sigma_i$ for the nonzero $\sigma_i$; a set of candidates for the remaining columns of U (so that U is orthogonal) comes from an orthonormal basis of $N(A^T)$. Thus $A = U\Sigma V^T$.
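A sketch of the same computation with numpy.linalg.svd; the rank-1 matrix below is a hypothetical example chosen so that exactly one singular value is nonzero:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 0.0]])                  # rank 1

U, s, Vt = np.linalg.svd(A)                 # A = U diag(s) V^T
print(s)                                    # singular values: one nonzero
print(np.linalg.matrix_rank(A))             # 1 = number of nonzero singular values

Sigma = np.zeros(A.shape)                   # rebuild the m x n Sigma
Sigma[:len(s), :len(s)] = np.diag(s)
print(np.allclose(U @ Sigma @ Vt, A))       # True: reconstruction
```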

Lemma 6.5.2: Let Q be orthogonal. Then $\|Qx\| = \|x\|$ for every x. Pf: $\|Qx\|^2 = x^TQ^TQx = x^Tx = \|x\|^2$.

Cor: Let $A = U\Sigma V^T$ be the SVD of A. Then $\|A\|_2 = \max_{\|x\|=1}\|Ax\| = \|\Sigma\|_2 = \sigma_1$.

We'll state the next result without proof. Theorem 6.5.3: Hypotheses: (1) $A = U\Sigma V^H$ is the SVD of A; (2) $A_k = \sum_{i=1}^{k}\sigma_iu_iv_i^H$ for $k < r$. Conclusion: $\operatorname{rank}(A_k) = k$ and $\|A - A_k\|_2 = \sigma_{k+1} = \min\{\|A - B\|_2 : \operatorname{rank}(B) \leq k\}$, i.e., $A_k$ is the best rank-k approximation of A.

Application: digital image processing (especially efficient for a matrix which has low rank, since the truncated SVD $A_k$ needs only $k(m + n + 1)$ numbers instead of $mn$).

Quadratic Forms: used to classify the type of a quadratic curve or surface. Optimization: an application to calculus.

Def: A quadratic equation in two variables x and y is an equation of the form $ax^2 + 2bxy + cy^2 + dx + ey + f = 0$, i.e., $x^TAx + Bx + f = 0$ with $A = \begin{pmatrix} a & b \\ b & c \end{pmatrix}$ symmetric.

Standard forms of conic sections: (i) circle: $x^2 + y^2 = r^2$; (ii) ellipse: $x^2/\alpha^2 + y^2/\beta^2 = 1$; (iii) hyperbola: $x^2/\alpha^2 - y^2/\beta^2 = 1$; (iv) parabola: $y^2 = \alpha x$ or $x^2 = \alpha y$. Note: Is there any difference between the eigenvalues of the matrix A of the quadratic form in each case?

Goal: Transform the quadratic equation into standard form by a suitable translation and rotation.

Example (no xy term): the eigenvalues of the matrix of the quadratic terms are 9 and 4, both positive → ellipse. Completing the square in x and y (a translation) brings the equation to standard form.

Example (with xy term): write the quadratic part as $x^TAx$ with A symmetric. By direct computation find an orthogonal U with $U^TAU = D$ diagonal (why does such a U exist?). Let $x = Uy$ (a rotation); then $x^TAx = y^TDy$, and the original equation becomes one with no cross term, which can be handled as before.

Example: Another quadratic equation with a cross term. Let $x = Uy$ as above; depending on the signs of the eigenvalues, the equation reduces to the standard form of an ellipse or a hyperbola.
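A sketch of this principal-axes classification; the quadratic form $x^2 + 4xy + y^2 = 1$ is a hypothetical example, not necessarily the slide's:

```python
import numpy as np

A = np.array([[1.0, 2.0],     # x^T A x with a = c = 1 and 2b = 4
              [2.0, 1.0]])

w, U = np.linalg.eigh(A)      # substitution x = U y gives w[0]*y1^2 + w[1]*y2^2
print(w)                      # [-1. 3.]: opposite signs

if np.all(w > 0) or np.all(w < 0):
    print("ellipse")
elif np.all(np.abs(w) > 1e-12):
    print("hyperbola (eigenvalues of opposite signs)")
else:
    print("degenerate / parabolic case")
```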

Optimization: Let $f: \mathbb{R}^n \to \mathbb{R}$ be twice continuously differentiable. It is known from Taylor's theorem of calculus that $f(x_0 + h) \approx f(x_0) + \nabla f(x_0)^Th + \frac{1}{2}h^TH(x_0)h$, where $H(x_0) = \left[\frac{\partial^2 f}{\partial x_i \partial x_j}(x_0)\right]$ is the Hessian matrix. A critical point $x_0$ (where $\nabla f(x_0) = 0$) is a candidate local extremum; if $H(x_0)$ is positive definite, then $x_0$ is a local minimum.

Def: A real symmetric matrix A is said to be (i) positive definite, denoted A > 0, if $x^TAx > 0$ for all $x \neq 0$; (ii) negative definite, denoted A < 0, if $x^TAx < 0$ for all $x \neq 0$; (iii) positive semidefinite, denoted A ≥ 0, if $x^TAx \geq 0$ for all x; (iv) negative semidefinite, denoted A ≤ 0, if $x^TAx \leq 0$ for all x. Example: a matrix whose quadratic form takes both positive and negative values is indefinite. Question: Given a real symmetric matrix, how can we determine its definiteness efficiently?

Theorem 6.5.1: Let A be real symmetric. Then A > 0 $\iff$ all eigenvalues of A are positive. Pf: ($\Rightarrow$) Let $(\lambda, x)$ be an eigenpair of A with $\|x\| = 1$; then $\lambda = x^TAx > 0$. ($\Leftarrow$) Suppose all $\lambda_i > 0$. Let $\{u_1, \dots, u_n\}$ be an orthonormal eigenbasis of A (why can we assume this?). Writing $x = \sum_i c_iu_i \neq 0$, we get $x^TAx = \sum_i \lambda_ic_i^2 > 0$.

Example: Find the local extrema of a given $f(x, y)$. Sol: Solve $\nabla f = 0$ for the critical points, then check the definiteness of the Hessian at each: f has a local maximum where $H < 0$, while critical points with indefinite H are saddle points.
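A sketch of this second-derivative test; the function $f(x, y) = 3x - x^3 - y^2$ and its critical points $(\pm 1, 0)$ are a hypothetical example, chosen so that one critical point is a local maximum and the other a saddle:

```python
import numpy as np

def hessian(x, y):
    # H = [[f_xx, f_xy], [f_xy, f_yy]] for f = 3x - x^3 - y^2
    return np.array([[-6.0 * x, 0.0],
                     [0.0,     -2.0]])

for cp in [(1.0, 0.0), (-1.0, 0.0)]:        # solutions of grad f = 0
    w = np.linalg.eigvalsh(hessian(*cp))
    if np.all(w > 0):
        kind = "local minimum"
    elif np.all(w < 0):
        kind = "local maximum"
    else:
        kind = "saddle point"
    print(cp, w, kind)                      # (1,0): maximum; (-1,0): saddle
```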

Positive Definite Matrices. Property I: A positive definite matrix A is nonsingular. Property II: If A > 0, then all the leading principal submatrices of A are positive definite.

Property III: A positive definite A can be reduced to upper triangular form using only row operation III, and the pivot elements will all be positive. Sketch of the proof: the first pivot is $a_{11} = \det(A_1) > 0$, and the determinant of each leading principal submatrix is invariant under row operations of type III, so the k-th pivot equals $\det(A_k)/\det(A_{k-1}) > 0$. Continuing this process, the property can be proved.

Property IV: Let A > 0. Then (i) A can be decomposed as A = LU, where L is lower triangular with 1's on the diagonal and U is upper triangular; (ii) A can be decomposed as A = LDU, where L is lower triangular and U is upper triangular, both with all diagonal elements equal to 1, and D is a diagonal matrix. Pf: by Gaussian elimination and the fact that the product of two lower (upper) triangular matrices is lower (upper) triangular.

Example: Apply Gaussian elimination to a positive definite A to obtain A = LU; factoring the pivots out of U as a diagonal matrix D also gives A = LDU with unit-diagonal L and U.

Property V (uniqueness): Let A > 0. If $A = L_1D_1U_1 = L_2D_2U_2$, then $L_1 = L_2$, $D_1 = D_2$, $U_1 = U_2$. Pf: $L_2^{-1}L_1 = D_2U_2U_1^{-1}D_1^{-1}$; the LHS is lower triangular with diagonal elements 1 and the RHS is upper triangular, so both equal I. Hence $L_1 = L_2$ and $D_1U_1 = D_2U_2$; comparing diagonals gives $D_1 = D_2$ and then $U_1 = U_2$.

Property VI: Let A > 0. Then A can be factored into $A = LDL^T$, where D is a diagonal matrix and L is lower triangular with 1's along the diagonal. Pf: $A = A^T$ gives $LDU = (LDU)^T = U^TDL^T$; since the LDU representation is unique, $U = L^T$.

Property VII (Cholesky decomposition): Let A > 0. Then A can be factored into $A = LL^T$, where L is lower triangular with positive diagonal. Hint: $A = LDL^T = (LD^{1/2})(LD^{1/2})^T$.

Example: We have seen that a positive definite A factors as $A = LDL^T$ with D the diagonal matrix of positive pivots. Note that $D = D^{1/2}D^{1/2}$. Define $L_1 = LD^{1/2}$; we have the Cholesky decomposition $A = L_1L_1^T$.
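A quick check of the factorization with numpy.linalg.cholesky; the matrix is a hypothetical positive definite example:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])              # symmetric, leading minors 4 and 8 > 0

L = np.linalg.cholesky(A)               # lower triangular with positive diagonal
print(L)
print(np.allclose(L @ L.T, A))          # True: A = L L^T
```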

Theorem 6.6.1: Let A be real symmetric. Then the following are equivalent: (i) A > 0; (ii) all the leading principal submatrices of A have positive determinants; (iii) A ~ U using only elementary row operations of type III, where U is upper triangular and the pivots are all positive; (iv) A has a Cholesky decomposition $LL^T$; (v) A can be factored into $B^TB$ for some nonsingular matrix B. Pf: We have shown that (i) ⇒ (ii) ⇒ (iii) ⇒ (iv). In addition, (iv) ⇒ (v) ⇒ (i) is straightforward: for (v) ⇒ (i), $x^TAx = \|Bx\|^2 > 0$ for $x \neq 0$.

Householder Transformation: Def: Let $v \in \mathbb{R}^n$, $v \neq 0$. Then the matrix $Q = I - \frac{2}{v^Tv}vv^T$ is called a Householder transformation. Geometrical interpretation: Q reflects each vector across the hyperplane orthogonal to v. Q is symmetric, $Q^T = Q$; Q is orthogonal, $Q^TQ = I$. What are the eigenvalues, eigenvectors, and determinant of Q?
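A sketch answering these questions numerically: Q is symmetric and orthogonal, has determinant −1, and has the eigenpair (−1, v), with eigenvalue +1 on the hyperplane orthogonal to v (the vector v below is an arbitrary choice):

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
Q = np.eye(3) - 2.0 * np.outer(v, v) / (v @ v)   # Q = I - 2 v v^T / (v^T v)

print(np.allclose(Q, Q.T))              # symmetric
print(np.allclose(Q @ Q.T, np.eye(3)))  # orthogonal
print(np.isclose(np.linalg.det(Q), -1)) # reflection: determinant -1
print(np.allclose(Q @ v, -v))           # eigenpair (-1, v); the rest are +1
```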

Given $A \in \mathbb{R}^{m \times n}$, find its QR factorization by applying Householder transformations to zero out the entries below the diagonal, one column at a time.

Theorem: Let $A \in \mathbb{R}^{m \times n}$ and $A = U\Sigma V^T$ be an SVD of A with singular values $\sigma_1 \geq \cdots \geq \sigma_n$. Then $\|A\|_2 = \sigma_1$. Pf: $\|Ax\| = \|U\Sigma V^Tx\| = \|\Sigma V^Tx\|$ since U is orthogonal, and over unit vectors x this is maximized at $\sigma_1$. Cor: Let A be nonsingular with singular values $\sigma_1 \geq \cdots \geq \sigma_n > 0$. Then $\|A^{-1}\|_2 = 1/\sigma_n$ and $\kappa_2(A) = \|A\|_2\|A^{-1}\|_2 = \sigma_1/\sigma_n$.

Application: In solving Ax = b, what is the effect on the solution x when b contains a measurement error?

$\kappa(A) = \|A\|\,\|A^{-1}\|$ is said to be the condition number of A. If $Ax = b$ and $A(x + \delta x) = b + \delta b$, then $\frac{\|\delta x\|}{\|x\|} \leq \kappa(A)\frac{\|\delta b\|}{\|b\|}$. If A is orthogonal, then $\kappa_2(A) = 1$. This means that, for a given relative error in b, the deviation of the associated solution of Ax = b is minimal when A is orthogonal.

Example: When A is close to singular, $\kappa(A)$ is huge. Note that x may be the solution for b while a slightly perturbed $b + \delta b$ has a solution $x + \delta x$ far from x. What does this mean? Similarly, a small deviation in x can result in a large deviation in b. This is the reason why we use orthogonal factorizations when numerically solving Ax = b.
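A sketch of this sensitivity; the nearly singular matrix and right-hand sides below are hypothetical stand-ins for the slide's example:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])           # nearly singular
print(np.linalg.cond(A))                # condition number ~ 4e4

b = np.array([2.0, 2.0001])
x1 = np.linalg.solve(A, b)                            # -> [1, 1]
x2 = np.linalg.solve(A, b + np.array([0.0, 0.0001]))  # tiny perturbation of b
print(x1, x2)                           # solutions [1, 1] vs [0, 2]: large change
```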