2 Outline
- System of linear ODEs (omitted)
- Diagonalization
- Hermitian matrices
- Quadratic forms
- Positive definite matrices
3 Motivation
- To transform a linear dynamical system into a form that is as simple as possible.
- To understand the characteristics of a linear system, e.g., the behavior of its dynamics.
4 Example: In a town, each year 30% of married women get divorced and 20% of single women get married. In the first year there are 8000 married women and 2000 single women, and the total population remains constant. Let $x_i = (m_i, s_i)^T$ be the women numbers at year $i$, where $m_i$ and $s_i$ represent married and single women, respectively. Then
$$x_{i+1} = A x_i, \qquad A = \begin{pmatrix} 0.7 & 0.2 \\ 0.3 & 0.8 \end{pmatrix}.$$
5 Iterating gives $x_n = A^n x_0$. Question: Why does $x_n$ converge? Why does it converge to the same limit even when the initial condition is different?
6 Ans: Choose a basis of eigenvectors. Here $A v_1 = v_1$ for $v_1 = (2, 3)^T$ and $A v_2 = 0.5\, v_2$ for $v_2 = (1, -1)^T$. Given an initial $x_0 = c_1 v_1 + c_2 v_2$ for some scalars $c_1, c_2$,
$$x_n = A^n x_0 = c_1 v_1 + c_2 (0.5)^n v_2 \to c_1 v_1.$$
For example, $x_0 = (8000, 2000)^T = 2000\, v_1 + 4000\, v_2$, so $x_n \to 2000\, v_1 = (4000, 6000)^T$. Question: How does one know to choose such a basis?
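The convergence claimed above can be checked numerically by iterating $x_{k+1} = A x_k$ directly. A minimal Python sketch, using the transition matrix implied by the stated divorce/marriage rates:

```python
# Iterate x_{k+1} = A x_k for the married/single example,
# with A = [[0.7, 0.2], [0.3, 0.8]] and x_0 = (8000, 2000).
def step(x):
    m, s = x
    return (0.7 * m + 0.2 * s, 0.3 * m + 0.8 * s)

x = (8000.0, 2000.0)
for _ in range(50):
    x = step(x)

# The iterates approach the steady state (4000, 6000); note each step
# preserves the total m + s, so the population stays 10000.
assert abs(x[0] - 4000.0) < 1e-6
assert abs(x[1] - 6000.0) < 1e-6
assert abs(x[0] + x[1] - 10000.0) < 1e-9
```

Starting from any other split of the 10000 women gives the same limit, since only the coefficient of the eigenvector for $\lambda = 0.5$ changes.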
7 Def: Let $A \in \mathbb{C}^{n \times n}$. A scalar $\lambda$ is said to be an eigenvalue or characteristic value of $A$ if there exists a nonzero vector $x$ such that $Ax = \lambda x$. The vector $x$ is said to be an eigenvector or characteristic vector belonging to $\lambda$. $(\lambda, x)$ is called an eigenpair of $A$. Question: Given $A$, how does one compute eigenvalues and eigenvectors?
8 $(\lambda, x)$ is an eigenpair of $A$
$\iff Ax = \lambda x$ for some $x \neq 0$
$\iff (A - \lambda I)x = 0$ has a nontrivial solution
$\iff A - \lambda I$ is singular
$\iff \det(A - \lambda I) = 0$.
Note that $p(\lambda) = \det(A - \lambda I)$ is a polynomial of degree $n$ in $\lambda$, called the characteristic polynomial of $A$. Thus, by the Fundamental Theorem of Algebra, $A$ has exactly $n$ eigenvalues, counting multiplicities. Any nonzero $x \in N(A - \lambda I)$ is an eigenvector associated with eigenvalue $\lambda$, while $N(A - \lambda I)$ is the eigenspace of $A$ corresponding to $\lambda$.
9 Example: Let $A$ be the given matrix. Solving $\det(A - \lambda I) = 0$ yields the eigenvalues of $A$. To find the eigenspace of $\lambda = 2$: compute $N(A - 2I)$ (i.e., solve $(A - 2I)x = 0$).
11 Let $\lambda \in \mathbb{C}$. Then
$\lambda$ is an eigenvalue of $A$
$\iff (A - \lambda I)x = 0$ has a nontrivial solution
$\iff A - \lambda I$ is singular
$\iff \det(A - \lambda I) = 0$
$\iff A - \lambda I$ loses rank.
12 Let $A \in \mathbb{R}^{n \times n}$. If $\lambda$ is an eigenvalue of $A$ with eigenvector $x$, then conjugating $Ax = \lambda x$ gives $A\bar{x} = \bar{\lambda}\bar{x}$ (since $A$ is real). This means that $(\bar{\lambda}, \bar{x})$ is also an eigenpair of $A$.
13 Let $\det(A - \lambda I) = (\lambda_1 - \lambda)(\lambda_2 - \lambda)\cdots(\lambda_n - \lambda)$, where $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$.
(i) Let $\lambda = 0$: then $\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n$.
(ii) Comparing the coefficients of $\lambda^{n-1}$ on both sides, we have $\operatorname{tr}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n$.
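For a $2 \times 2$ matrix the characteristic polynomial is $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A)$, so both identities above can be verified directly. A small Python sketch, applied to the transition matrix of the earlier town example (assuming real eigenvalues so the quadratic formula stays real):

```python
import math

def eig2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] as roots of lam^2 - tr*lam + det = 0."""
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes a real spectrum
    return ((tr + disc) / 2, (tr - disc) / 2)

l1, l2 = eig2(0.7, 0.2, 0.3, 0.8)      # the 0.7/0.2 transition matrix
assert abs(l1 - 1.0) < 1e-12 and abs(l2 - 0.5) < 1e-12
# (i) det(A) = product of eigenvalues; (ii) tr(A) = sum of eigenvalues
assert abs(l1 * l2 - (0.7 * 0.8 - 0.2 * 0.3)) < 1e-12
assert abs(l1 + l2 - (0.7 + 0.8)) < 1e-12
```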
14 Theorem 6.1.1: Let $B = S^{-1} A S$ for some nonsingular matrix $S$ (i.e., $B$ is similar to $A$). Then $\det(B - \lambda I) = \det(A - \lambda I)$, and consequently $A$ and $B$ have the same eigenvalues.
Pf: $\det(B - \lambda I) = \det(S^{-1} A S - \lambda S^{-1} S) = \det(S^{-1}(A - \lambda I)S) = \det(S^{-1}) \det(A - \lambda I) \det(S) = \det(A - \lambda I)$.
15 Diagonalization
Goal: Given $A \in \mathbb{R}^{n \times n}$, find a nonsingular matrix $S$ such that $S^{-1} A S = D$, a diagonal matrix.
Question 1: Are all matrices diagonalizable?
Question 2: What kinds of $A$ are diagonalizable?
Question 3: How does one find $S$ if $A$ is diagonalizable?
16 NOT all matrices are diagonalizable. E.g., let $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$. If $A$ were diagonalizable, there would exist a nonsingular matrix $S$ with $S^{-1} A S = D$; since the only eigenvalue of $A$ is $0$, this forces $D = 0$ and hence $A = S D S^{-1} = 0$, a contradiction.
17 To answer Q2, suppose $A$ is diagonalizable: there exists a nonsingular matrix $S$ with $S^{-1} A S = D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$, i.e., $AS = SD$. Let $S = [s_1 \; s_2 \; \cdots \; s_n]$. Then $A s_j = \lambda_j s_j$, so $(\lambda_j, s_j)$ is an eigenpair of $A$ for $j = 1, \ldots, n$. This gives a condition for diagonalizability and a way to find $S$.
18 Theorem 6.3.2: $A \in \mathbb{C}^{n \times n}$ is diagonalizable $\iff$ $A$ has $n$ linearly independent eigenvectors.
Note: a similarity transformation is a change of coordinates; diagonalization is the change of coordinates to a basis of eigenvectors.
19 Theorem 6.3.1: If $\lambda_1, \ldots, \lambda_k$ are distinct eigenvalues of a matrix $A$ with corresponding eigenvectors $x_1, \ldots, x_k$, then $x_1, \ldots, x_k$ are linearly independent.
Pf: Suppose not, and let $m$ be the smallest index such that $x_1, \ldots, x_m$ are linearly dependent, so $c_1 x_1 + \cdots + c_m x_m = 0$ with the $c_i$ not all zero. Applying $A - \lambda_m I$ gives $c_1(\lambda_1 - \lambda_m)x_1 + \cdots + c_{m-1}(\lambda_{m-1} - \lambda_m)x_{m-1} = 0$. Since the $\lambda_i$ are distinct and $x_1, \ldots, x_{m-1}$ are linearly independent, $c_1 = \cdots = c_{m-1} = 0$; then $c_m x_m = 0$ forces $c_m = 0$, a contradiction.
20 Remarks: Let $S^{-1} A S = D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ and $S = [s_1 \; \cdots \; s_n]$.
(i) $(\lambda_i, s_i)$ is an eigenpair of $A$ for $i = 1, \ldots, n$.
(ii) The diagonalizing matrix $S$ is not unique, because its columns can be reordered or multiplied by a nonzero scalar.
(iii) If $A$ has $n$ distinct eigenvalues, $A$ is diagonalizable. If the eigenvalues are not distinct, then $A$ may or may not be diagonalizable, depending on whether or not $A$ has $n$ linearly independent eigenvectors.
(iv) $A = S D S^{-1}$, hence $A^k = S D^k S^{-1}$ for every $k \geq 1$.
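Computing $A^k$ as $S D^k S^{-1}$ is one payoff of diagonalization: only the diagonal entries get raised to the $k$-th power. A Python sketch for the married/single transition matrix (the eigenvector columns $(2,3)^T$ and $(1,-1)^T$ and the inverse $S^{-1}$ are computed here, not given on the slide):

```python
def matmul(A, B):
    """Plain triple-loop matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[0.7, 0.2], [0.3, 0.8]]
S = [[2.0, 1.0], [3.0, -1.0]]          # columns: eigenvectors for lam = 1, 0.5
S_inv = [[0.2, 0.2], [0.6, -0.4]]      # inverse of S (det S = -5)
k = 20
Dk = [[1.0 ** k, 0.0], [0.0, 0.5 ** k]]

Ak = matmul(matmul(S, Dk), S_inv)      # A^k = S D^k S^{-1}

# Cross-check against repeated multiplication.
P = [[1.0, 0.0], [0.0, 1.0]]
for _ in range(k):
    P = matmul(P, A)
assert all(abs(Ak[i][j] - P[i][j]) < 1e-12 for i in range(2) for j in range(2))
assert abs(Ak[0][0] - 0.4) < 1e-5      # A^k -> S diag(1,0) S^{-1} as k grows
```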
31 Def: A matrix $A \in \mathbb{C}^{n \times n}$ is said to be Hermitian if $A^H = A$ (where $A^H = \bar{A}^T$).
$A$ is said to be skew-Hermitian if $A^H = -A$.
$A$ is said to be unitary if $A^H A = I$ ($\to$ its column vectors form an orthonormal set in $\mathbb{C}^n$).
32 Theorem 6.4.1: Let $A = A^H$. Then
(i) the eigenvalues of $A$ are all real;
(ii) eigenvectors belonging to distinct eigenvalues are orthogonal.
Pf: (i) Let $(\lambda, x)$ be an eigenpair of $A$. Then $x^H A x = \lambda x^H x$, and $\overline{x^H A x} = (x^H A x)^H = x^H A^H x = x^H A x$, so $x^H A x$ is real; since $x^H x > 0$, $\lambda$ is real.
(ii) Let $(\lambda_1, x_1)$ and $(\lambda_2, x_2)$ be two eigenpairs of $A$ with $\lambda_1 \neq \lambda_2$. Then $\lambda_1 x_2^H x_1 = x_2^H A x_1 = (A x_2)^H x_1 = \lambda_2 x_2^H x_1$ (using $\lambda_2$ real), so $(\lambda_1 - \lambda_2) x_2^H x_1 = 0$ and hence $x_2^H x_1 = 0$.
33 Theorem: Let $A^H = -A$. Then every eigenvalue of $A$ is pure imaginary.
Pf: Let $(\lambda, x)$ be an eigenpair of $A$. Then $x^H A x = \lambda x^H x$, and $(x^H A x)^H = x^H A^H x = -x^H A x$, so $x^H A x$ is pure imaginary; since $x^H x > 0$ is real, $\lambda$ is pure imaginary.
34 Theorem 6.4.3 (Schur's Theorem): Let $A \in \mathbb{C}^{n \times n}$. Then there exists a unitary matrix $U$ such that $U^H A U = T$ is upper triangular.
Pf: Let $(\lambda_1, x_1)$ be an eigenpair of $A$ with $\|x_1\| = 1$. Choose $U_1 = [x_1 \; u_2 \; \cdots \; u_n]$ to be unitary (extend $x_1$ to an orthonormal basis). Then
$$U_1^H A U_1 = \begin{pmatrix} \lambda_1 & * \\ 0 & A_1 \end{pmatrix}, \qquad A_1 \in \mathbb{C}^{(n-1) \times (n-1)}.$$
Choose a unitary $V_2$ that puts $A_1$ into the same form, and set $U_2 = \operatorname{diag}(1, V_2)$, which is unitary. Continuing this process, we have the theorem.
35 Theorem 6.4.4 (Spectral Theorem): If $A = A^H$, then there exists a unitary matrix $U$ that diagonalizes $A$.
Pf: By the previous theorem, there is a unitary $U$ with $U^H A U = T$, where $T$ is upper triangular. Then $T^H = U^H A^H U = U^H A U = T$, so $T$ is both upper and lower triangular; hence $T$ is a diagonal matrix.
36 Cor: Let $A$ be a real symmetric matrix. Then
(i) the eigenvalues of $A$ are all real;
(ii) there exists an orthogonal matrix $U$ such that $U^T A U$ is a diagonal matrix.
Remark: If $A$ is Hermitian, then, by Th 6.4.4, $A = U D U^H = \sum_{i=1}^n \lambda_i u_i u_i^H$, where the columns $u_1, \ldots, u_n$ of $U$ form a complete orthonormal eigenbasis.
37 Example: Find an orthogonal matrix $U$ that diagonalizes $A$.
Sol: (i) Find the eigenvalues of $A$.
(ii) Find a basis for each eigenspace.
(iii) Orthonormalize each eigenspace basis by the Gram-Schmidt process.
The columns of $U = [u_1 \; \cdots \; u_n]$ form an orthonormal eigenbasis. (WHY? Eigenvectors from distinct eigenvalues are already orthogonal by Th 6.4.1, and Gram-Schmidt makes each eigenspace basis orthonormal.)
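Step (iii) above can be sketched in a few lines. A minimal Gram-Schmidt in Python (the two input vectors are an arbitrary illustration, not from the slide):

```python
import math

def gram_schmidt(vs):
    """Orthonormalize a list of linearly independent vectors in R^n."""
    basis = []
    for v in vs:
        w = list(v)
        for q in basis:
            proj = sum(wi * qi for wi, qi in zip(w, q))   # <w, q>
            w = [wi - proj * qi for wi, qi in zip(w, q)]  # subtract projection
        norm = math.sqrt(sum(wi * wi for wi in w))
        basis.append([wi / norm for wi in w])
    return basis

q1, q2 = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
dot = lambda a, b: sum(x * y for x, y in zip(a, b))
assert abs(dot(q1, q1) - 1.0) < 1e-12 and abs(dot(q2, q2) - 1.0) < 1e-12
assert abs(dot(q1, q2)) < 1e-12
```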
38 Note: If $A$ has an orthonormal eigenbasis, then $A = U D U^H$, where $U$ is unitary and $D$ is diagonal.
Question: In addition to Hermitian matrices, are there any other matrices possessing an orthonormal eigenbasis?
39 Def: $A$ is said to be normal if $A^H A = A A^H$.
Remark: Hermitian, skew-Hermitian and unitary matrices are all normal.
40 Theorem 6.4.6: $A$ is normal $\iff$ $A$ possesses an orthonormal eigenbasis.
Pf: ($\Leftarrow$) If $A = U D U^H$ with $U$ unitary, then $A^H A = U \bar{D} D U^H = U D \bar{D} U^H = A A^H$, as proved before.
($\Rightarrow$) By Schur's theorem, there is a unitary $U$ such that $U^H A U = T$ is upper triangular. $T$ is also normal, since $T^H T = U^H A^H A U = U^H A A^H U = T T^H$. Comparing the diagonal elements of $T^H T$ and $T T^H$, row by row, forces every off-diagonal entry of $T$ to vanish, so $T$ is diagonal and the columns of $U$ form an orthonormal eigenbasis. (WHY? E.g., the $(1,1)$ entries give $|t_{11}|^2 = |t_{11}|^2 + |t_{12}|^2 + \cdots + |t_{1n}|^2$.)
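A real rotation matrix illustrates the theorem: it is orthogonal (hence normal) but not symmetric, and for a real rotation angle its eigenvalues are complex, so its orthonormal eigenbasis lives in $\mathbb{C}^2$. A quick normality check in Python (the angle $\pi/3$ is an arbitrary choice):

```python
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

t = math.pi / 3
R = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]       # rotation: orthogonal, not symmetric

lhs, rhs = matmul(transpose(R), R), matmul(R, transpose(R))   # R^T R vs R R^T
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(2) for j in range(2))
assert abs(R[0][1] - R[1][0]) > 0.1     # not symmetric, yet normal
```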
41 Singular Value Decomposition (SVD)
Theorem: Let $A \in \mathbb{C}^{m \times n}$ with $\operatorname{rank}(A) = r$. Then there exist unitary matrices $U \in \mathbb{C}^{m \times m}$ and $V \in \mathbb{C}^{n \times n}$ with
$$A = U \Sigma V^H, \qquad \Sigma = \begin{pmatrix} \Sigma_r & 0 \\ 0 & 0 \end{pmatrix} \in \mathbb{R}^{m \times n},$$
where $\Sigma_r = \operatorname{diag}(\sigma_1, \ldots, \sigma_r)$ with $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0$.
42 Remark: In the SVD $A = U \Sigma V^H$:
- The scalars $\sigma_1, \ldots, \sigma_r$ are called singular values of $A$.
- Columns of $U$ are called left singular vectors of $A$.
- Columns of $V$ are called right singular vectors of $A$.
43 Pf: Note that $A^H A \in \mathbb{C}^{n \times n}$ is Hermitian and positive semidefinite, so there is a unitary matrix $V$ with $V^H (A^H A) V = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$, where $\lambda_1 \geq \cdots \geq \lambda_r > 0 = \lambda_{r+1} = \cdots = \lambda_n$.
(1) Define $\sigma_i = \sqrt{\lambda_i}$ and partition $V = [V_1 \; V_2]$, where $V_1$ consists of the first $r$ columns. Since $\|A v_i\|^2 = \lambda_i = 0$ for $i > r$, we have $A V_2 = 0$.
(2) Define $U_1 = A V_1 \Sigma_r^{-1}$; then $U_1^H U_1 = \Sigma_r^{-1} V_1^H A^H A V_1 \Sigma_r^{-1} = I$.
(3) Define $U_2$ so that $U = [U_1 \; U_2]$ is unitary. Then $U^H A V = \Sigma$, i.e., $A = U \Sigma V^H$.
44 Remark: In the SVD $A = U \Sigma V^H$:
- The singular values of $A$ are unique, while $U$ and $V$ are not unique.
- Columns of $U$ form an orthonormal eigenbasis for $A A^H$.
- Columns of $V$ form an orthonormal eigenbasis for $A^H A$.
- $\{v_{r+1}, \ldots, v_n\}$ is an orthonormal basis for $N(A)$, and $\{u_1, \ldots, u_r\}$ is an orthonormal basis for $R(A)$.
45 rank(A) = number of nonzero singular values, but rank(A) $\neq$ number of nonzero eigenvalues in general. For example, $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ has rank 1 and singular values $1, 0$, yet both of its eigenvalues are $0$.
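The gap between eigenvalues and singular values can be checked by hand for the nilpotent matrix $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, whose characteristic polynomial is $\lambda^2$. A short Python verification:

```python
import math

A = [[0.0, 1.0], [0.0, 0.0]]
# Eigenvalues: det(A - lam*I) = lam^2, so both eigenvalues are 0.
# Singular values: sqrt of eigenvalues of A^T A.
AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
assert AtA == [[0.0, 0.0], [0.0, 1.0]]        # diagonal, so eigenvalues are 0 and 1
sing = sorted((math.sqrt(AtA[0][0]), math.sqrt(AtA[1][1])), reverse=True)
assert sing == [1.0, 0.0]
# One nonzero singular value = rank(A) = 1, yet zero nonzero eigenvalues.
```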
46 Example: Find the SVD of $A$.
Sol: Form $A^T A$ and find its eigenvalues $\lambda_i$; an orthonormal eigenbasis $\{v_1, \ldots, v_n\}$ associated with $A^T A$ can be found, giving $V$ and $\sigma_i = \sqrt{\lambda_i}$. Find $u_i = \frac{1}{\sigma_i} A v_i$ for $i = 1, \ldots, r$; $U$ is orthogonal once a set of candidates for the remaining columns $u_{r+1}, \ldots, u_m$ is chosen to complete an orthonormal basis. Thus $A = U \Sigma V^T$.
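The recipe above can be traced on a small example. Since the slide's matrix was not preserved, the sketch below uses the hypothetical $A = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}$, for which $A^T A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$ has eigenvalues $2$ and $0$:

```python
import math

A = [[1.0, 1.0], [0.0, 0.0]]          # hypothetical matrix, rank r = 1
# Orthonormal eigenvectors of A^T A: v1 = (1,1)/sqrt(2), v2 = (1,-1)/sqrt(2)
s = 1.0 / math.sqrt(2.0)
v1 = (s, s)
sigma1 = math.sqrt(2.0)               # sigma_1 = sqrt(lambda_1) = sqrt(2)
# u1 = (1/sigma1) * A v1; any unit vector orthogonal to u1 completes U
u1 = tuple(sum(A[i][j] * v1[j] for j in range(2)) / sigma1 for i in range(2))
assert abs(u1[0] - 1.0) < 1e-12 and abs(u1[1]) < 1e-12

# Since r = 1, A = sigma1 * u1 v1^T entrywise:
for i in range(2):
    for j in range(2):
        assert abs(A[i][j] - sigma1 * u1[i] * v1[j]) < 1e-12
```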
58 Optimization: Let $f: \mathbb{R}^n \to \mathbb{R}$ be twice continuously differentiable. It is known from Taylor's Theorem of calculus that
$$f(x_0 + h) = f(x_0) + \nabla f(x_0)^T h + \tfrac{1}{2} h^T H(x_0) h + o(\|h\|^2),$$
where $H(x_0) = \left[\frac{\partial^2 f}{\partial x_i \partial x_j}(x_0)\right]$ is the Hessian matrix. If $x_0$ is a local extremum, then $\nabla f(x_0) = 0$. If $\nabla f(x_0) = 0$ and $H(x_0) > 0$, then $x_0$ is a local minimum.
59 Def: A real symmetric matrix $A$ is said to be
(i) positive definite, denoted by $A > 0$, if $x^T A x > 0$ for all $x \neq 0$;
(ii) negative definite, denoted by $A < 0$, if $x^T A x < 0$ for all $x \neq 0$;
(iii) positive semidefinite, denoted by $A \geq 0$, if $x^T A x \geq 0$ for all $x$;
(iv) negative semidefinite, denoted by $A \leq 0$, if $x^T A x \leq 0$ for all $x$.
Example: $A = \operatorname{diag}(1, -1)$ is indefinite, since $x^T A x = x_1^2 - x_2^2$ takes both signs.
Question: Given a real symmetric matrix, how does one determine its definiteness efficiently?
60 Theorem 6.5.1: Let $A = A^T \in \mathbb{R}^{n \times n}$. Then $A > 0 \iff$ all eigenvalues of $A$ are positive.
Pf: ($\Rightarrow$) Let $(\lambda, x)$ be an eigenpair of $A$ with $\|x\| = 1$; then $\lambda = x^T A x > 0$.
($\Leftarrow$) Suppose all eigenvalues are positive. Let $\{u_1, \ldots, u_n\}$ be an orthonormal eigenbasis of $A$. (Why can we assume this? $A$ is symmetric, so the spectral theorem applies.) Write $x = \sum_i c_i u_i \neq 0$; then $x^T A x = \sum_i \lambda_i c_i^2 > 0$.
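For symmetric $2 \times 2$ matrices the eigenvalue test is explicit, since the spectrum is always real. A Python sketch (the two test matrices are illustrative choices):

```python
import math

def sym2_eigs(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]] (always real)."""
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(tr * tr - 4 * det)   # discriminant >= 0 for symmetric input
    return ((tr + disc) / 2, (tr - disc) / 2)

# [[2, 1], [1, 2]]: eigenvalues 3 and 1, so positive definite
assert all(l > 0 for l in sym2_eigs(2.0, 1.0, 2.0))
# [[1, 2], [2, 1]]: eigenvalues 3 and -1, so indefinite
l1, l2 = sym2_eigs(1.0, 2.0, 1.0)
assert l1 > 0 > l2
```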
61 Example: Find the local extrema of $f$.
Sol: Solve $\nabla f = 0$ for the critical points, then evaluate the Hessian $H$ at each of them. Thus $f$ has a local maximum at the critical point where $H < 0$, while the critical points where $H$ is indefinite are saddle points.
62 Positive Definite Matrices:
Property I: If $P > 0$, then $P$ is nonsingular. (If $Px = 0$ for some $x \neq 0$, then $x^T P x = 0$, a contradiction.)
Property II: If $A > 0$, then $\det(A) > 0$, and all the leading principal submatrices of $A$ are positive definite.
63 Property III: If $A > 0$, then $A$ can be reduced to upper triangular form using only row operation III (adding a multiple of one row to another), and the pivot elements will all be positive.
Sketch of the proof: $a_{11} = \det(A_1) > 0$, and the determinant of each leading principal submatrix is invariant under row operations of type III, so after eliminating below the first pivot, the second pivot equals $\det(A_2)/\det(A_1) > 0$. Continuing this process, the property can be proved: the $k$-th pivot is $\det(A_k)/\det(A_{k-1}) > 0$.
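The pivot test is easy to run by hand or in code: eliminate below each pivot with type-III row operations only and inspect the diagonal. A Python sketch on an illustrative symmetric positive definite $3 \times 3$ matrix:

```python
def pivots(A):
    """Reduce A to upper triangular form with row operation III only; return the pivots."""
    A = [row[:] for row in A]
    n = len(A)
    for i in range(n):
        for j in range(i + 1, n):
            m = A[j][i] / A[i][i]          # nonzero pivot is guaranteed when A > 0
            A[j] = [A[j][k] - m * A[i][k] for k in range(n)]
    return [A[i][i] for i in range(n)]

A = [[4.0, 2.0, 2.0],
     [2.0, 5.0, 3.0],
     [2.0, 3.0, 6.0]]                      # symmetric positive definite
p = pivots(A)
assert all(x > 0 for x in p)               # all pivots positive
# pivots are ratios of leading principal minors; their product is det(A) = 64
assert abs(p[0] * p[1] * p[2] - 64.0) < 1e-9
```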
64 Property IV: Let $A > 0$. Then
(i) $A$ can be decomposed as $A = LU$, where $L$ is lower triangular with 1's on the diagonal and $U$ is upper triangular;
(ii) $A$ can be decomposed as $A = LDU$, where $L$ is lower triangular and $U$ is upper triangular, each with all diagonal elements equal to 1, and $D$ is a diagonal matrix.
Pf: By Gaussian elimination and the fact that the product of two lower (upper) triangular matrices is lower (upper) triangular.
66 Property V: Let $A = L_1 D_1 U_1 = L_2 D_2 U_2$ be two LDU factorizations of $A$ as in Property IV(ii). Then $L_1 = L_2$, $D_1 = D_2$, $U_1 = U_2$; i.e., the LDU factorization is unique.
Pf: $L_2^{-1} L_1 D_1 = D_2 U_2 U_1^{-1}$. The LHS is lower triangular and the RHS is upper triangular, so both are diagonal; since $L_2^{-1} L_1$ has diagonal elements 1 and $D_1$ is nonsingular, $L_2^{-1} L_1 = I$, i.e., $L_1 = L_2$. Then $D_2^{-1} D_1 = U_2 U_1^{-1}$ is both diagonal and upper triangular with diagonal elements 1, hence $= I$, giving $D_1 = D_2$ and $U_1 = U_2$.
67 Property VI: Let $A = A^T$ admit an LDU factorization. Then $A$ can be factored into $A = L D L^T$, where $D$ is a diagonal matrix and $L$ is lower triangular with 1's along the diagonal.
Pf: Write $A = LDU$. Then $A = A^T = U^T D L^T$, which is another LDU factorization. Since the LDU representation is unique (Property V), $U = L^T$.
68 Property VII (Cholesky decomposition): Let $A > 0$. Then $A$ can be factored into $A = LL^T$, where $L$ is lower triangular with positive diagonal.
Hint: Write $A = L_1 D L_1^T$ with $D = \operatorname{diag}(d_1, \ldots, d_n)$, $d_i > 0$; set $L = L_1 D^{1/2}$, where $D^{1/2} = \operatorname{diag}(\sqrt{d_1}, \ldots, \sqrt{d_n})$.
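The decomposition can also be computed directly, column by column, without going through $LDL^T$. A minimal Cholesky routine in Python (the test matrix is the same illustrative SPD matrix used for the pivot check):

```python
import math

def cholesky(A):
    """A = L L^T for symmetric positive definite A; L lower triangular, positive diagonal."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # positive when A > 0
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

A = [[4.0, 2.0, 2.0], [2.0, 5.0, 3.0], [2.0, 3.0, 6.0]]
L = cholesky(A)
# verify L L^T = A entrywise and positive diagonal
for i in range(3):
    for j in range(3):
        assert abs(sum(L[i][k] * L[j][k] for k in range(3)) - A[i][j]) < 1e-12
assert all(L[i][i] > 0 for i in range(3))
```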
69 Example: We have seen that $A = L D L^T$ with positive diagonal entries in $D$. Note that $D = D^{1/2} D^{1/2}$. Defining $L_1 = L D^{1/2}$, we have the Cholesky decomposition $A = L_1 L_1^T$.
70 Theorem 6.6.1: Let $A = A^T \in \mathbb{R}^{n \times n}$. Then the following are equivalent:
(i) $A > 0$.
(ii) All the leading principal submatrices of $A$ have positive determinants.
(iii) $A$ can be reduced to an upper triangular matrix $U$ using only elementary row operations of type III, and the pivots are all positive.
(iv) $A$ has a Cholesky decomposition $LL^T$.
(v) $A$ can be factored into $B^T B$ for some nonsingular matrix $B$.
Pf: We have shown that (i) $\Rightarrow$ (ii) $\Rightarrow$ (iii) $\Rightarrow$ (iv). In addition, (iv) $\Rightarrow$ (v), and (v) $\Rightarrow$ (i) is trivial: $x^T B^T B x = \|Bx\|^2 > 0$ for $x \neq 0$.
71 Householder Transformation:
Def: Let $u \in \mathbb{R}^n$ with $\|u\| = 1$. Then the matrix $Q = I - 2uu^T$ is called a Householder transformation.
Geometrical interpretation: $Qx$ is the reflection of $x$ about the hyperplane $u^\perp$.
$Q$ is symmetric: $Q^T = Q$. $Q$ is orthogonal: $Q^T Q = (I - 2uu^T)^2 = I - 4uu^T + 4u(u^T u)u^T = I$.
What are the eigenvalues, eigenvectors and determinant of $Q$?
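The question at the end of the slide can be explored numerically: $Qu = -u$, while $Qw = w$ for any $w \perp u$. A Python sketch (the unit vector $u$ is an arbitrary illustration):

```python
import math

def householder(u):
    """Q = I - 2 u u^T for a unit vector u (reflection about the hyperplane u-perp)."""
    n = len(u)
    return [[(1.0 if i == j else 0.0) - 2.0 * u[i] * u[j] for j in range(n)]
            for i in range(n)]

u = [1.0 / math.sqrt(2), 1.0 / math.sqrt(2), 0.0]
Q = householder(u)

# Q u = -u: u is an eigenvector with eigenvalue -1
Qu = [sum(Q[i][j] * u[j] for j in range(3)) for i in range(3)]
assert all(abs(Qu[i] + u[i]) < 1e-12 for i in range(3))
# Q w = w for w orthogonal to u: eigenvalue 1 with multiplicity n-1
w = [1.0, -1.0, 3.0]                      # w . u = 0
Qw = [sum(Q[i][j] * w[j] for j in range(3)) for i in range(3)]
assert all(abs(Qw[i] - w[i]) < 1e-12 for i in range(3))
# hence det(Q) = (-1) * 1 * ... * 1 = -1
```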
73 Theorem: Let $A \in \mathbb{C}^{m \times n}$ and let $A = U \Sigma V^H$ be an SVD of $A$ with $\sigma_1 \geq \cdots \geq \sigma_r > 0$. Then $\|A\|_2 = \sigma_1$.
Pf: $\|Ax\|_2 = \|U \Sigma V^H x\|_2 = \|\Sigma V^H x\|_2 \leq \sigma_1 \|V^H x\|_2 = \sigma_1 \|x\|_2$, with equality at $x = v_1$; hence $\|A\|_2 = \max_{\|x\|_2 = 1} \|Ax\|_2 = \sigma_1$.
Cor: Let $A \in \mathbb{C}^{n \times n}$ be nonsingular with singular values $\sigma_1 \geq \cdots \geq \sigma_n > 0$. Then $\|A\|_2 = \sigma_1$ and $\|A^{-1}\|_2 = 1/\sigma_n$.
74 Application: In solving $Ax = b$, what is the effect on the solution when the measurement $b$ contains an error $\delta b$? If $A(x + \delta x) = b + \delta b$, then $\delta x = A^{-1} \delta b$, and since $\|b\| \leq \|A\| \|x\|$,
$$\frac{\|\delta x\|}{\|x\|} \leq \|A\| \|A^{-1}\| \frac{\|\delta b\|}{\|b\|}.$$
75 $\kappa(A) = \|A\| \|A^{-1}\| = \sigma_1/\sigma_n$ is said to be the condition number of $A$. Note $\kappa(A) \geq 1$ always, and if $A$ is orthogonal then $\kappa(A) = 1$. This means that, for a given error in $b$, the deviation of the associated solution of $Ax = b$ is smallest when $A$ is orthogonal.
76 Example: When $A$ is close to singular, $\kappa(A)$ is large. Note that if $x_1$ is the solution for $b_1$ and $x_2$ is the solution for a slightly perturbed $b_2$, then $x_1$ and $x_2$ can differ greatly. What does this mean? A small deviation in $b$ results in a large deviation in $x$. Similarly, a small deviation in $x$ can result in a large deviation in $b$. This is the reason why we use orthogonal factorization in numerically solving $Ax = b$.
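Since the slide's matrix was not preserved, the amplification effect can be illustrated on the hypothetical nearly singular system $A = \begin{pmatrix} 1 & 1 \\ 1 & 1.0001 \end{pmatrix}$: perturbing $b$ by $0.005\%$ moves the solution by roughly $70\%$:

```python
import math

def solve2(A, b):
    """Cramer's rule for a 2x2 system."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return ((b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det)

A = [[1.0, 1.0], [1.0, 1.0001]]        # nearly singular, kappa(A) ~ 4e4
x1 = solve2(A, (2.0, 2.0))             # solution (2, 0)
x2 = solve2(A, (2.0, 2.0001))          # tiny change in b -> solution (1, 1)

norm = lambda v: math.sqrt(v[0] ** 2 + v[1] ** 2)
rel_db = norm((0.0, 0.0001)) / norm((2.0, 2.0))
rel_dx = norm((x2[0] - x1[0], x2[1] - x1[1])) / norm(x1)
assert rel_dx / rel_db > 1e4           # amplification on the order of kappa(A)
```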