Ch 7.7: Fundamental Matrices

Ch 7.7: Fundamental Matrices Suppose that x(1)(t),…, x(n)(t) form a fundamental set of solutions for x' = P(t)x on α < t < β. The matrix Ψ(t) whose columns are x(1)(t),…, x(n)(t) is a fundamental matrix for the system x' = P(t)x. This matrix is nonsingular since its columns are linearly independent, and hence det Ψ(t) ≠ 0. Note also that since x(1)(t),…, x(n)(t) are solutions of x' = P(t)x, Ψ satisfies the matrix differential equation Ψ' = P(t)Ψ.
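The two defining properties (det Ψ(t) ≠ 0 and Ψ' = P(t)Ψ) can be checked numerically. The sketch below assumes the constant-coefficient system A = [[1, 1], [4, 1]] and its known exponential solutions as an illustrative example:

```python
import numpy as np

# Assumed example system x' = Ax with A = [[1, 1], [4, 1]].
A = np.array([[1.0, 1.0], [4.0, 1.0]])

def Psi(t):
    # Columns are the solutions e^(3t)(1, 2)^T and e^(-t)(1, -2)^T.
    return np.array([[np.exp(3*t),   np.exp(-t)],
                     [2*np.exp(3*t), -2*np.exp(-t)]])

t, h = 0.7, 1e-6
dPsi = (Psi(t + h) - Psi(t - h)) / (2*h)          # central-difference derivative
assert abs(np.linalg.det(Psi(t))) > 0             # nonsingular: det != 0
assert np.allclose(dPsi, A @ Psi(t), rtol=1e-5)   # Psi' = A Psi
```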

Example 1: Consider the homogeneous equation x' = Ax below, where A = [1 1; 4 1]. In Chapter 7.5, we found the following fundamental solutions for this system: x(1)(t) = e^(3t) (1, 2)^T and x(2)(t) = e^(-t) (1, -2)^T. Thus a fundamental matrix for this system is Ψ(t) = [ e^(3t) e^(-t); 2e^(3t) -2e^(-t) ].

Fundamental Matrices and General Solution The general solution of x' = P(t)x can be expressed as x = Ψ(t)c, where c is a constant vector with components c1,…, cn: x = c1 x(1)(t) + … + cn x(n)(t) = Ψ(t)c.

Fundamental Matrix & Initial Value Problem Consider an initial value problem x' = P(t)x, x(t0) = x0, where α < t0 < β and x0 is a given initial vector. Now the solution has the form x = Ψ(t)c, hence we choose c so as to satisfy x(t0) = x0. Recalling that Ψ(t0) is nonsingular, it follows that c = Ψ⁻¹(t0) x0. Thus our solution x = Ψ(t)c can be expressed as x = Ψ(t) Ψ⁻¹(t0) x0.
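The steps above translate directly into a linear solve for c followed by a matrix-vector product. The system and its fundamental matrix below are assumed example values (A = [[1, 1], [4, 1]]):

```python
import numpy as np

# Assumed fundamental matrix for the example system A = [[1, 1], [4, 1]].
def Psi(t):
    return np.array([[np.exp(3*t),   np.exp(-t)],
                     [2*np.exp(3*t), -2*np.exp(-t)]])

t0 = 0.0
x0 = np.array([2.0, 0.0])                 # given initial vector
c = np.linalg.solve(Psi(t0), x0)          # c = Psi^{-1}(t0) x0

def x(t):
    return Psi(t) @ c                     # x(t) = Psi(t) c

assert np.allclose(x(t0), x0)             # the initial condition is satisfied
```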

Recall: Theorem 7.4.4 Let x(1),…, x(n) be solutions of x' = P(t)x on I: α < t < β that satisfy the initial conditions x(1)(t0) = (1, 0, …, 0)^T, …, x(n)(t0) = (0, 0, …, 1)^T for some t0 in I. Then x(1),…, x(n) are fundamental solutions of x' = P(t)x.

Fundamental Matrix & Theorem 7.4.4 Suppose x(1)(t),…, x(n)(t) form the fundamental solutions given by Thm 7.4.4. Denote the corresponding fundamental matrix by Φ(t). Then the columns of Φ(t) are x(1)(t),…, x(n)(t), and hence Φ(t0) = I. Thus Φ⁻¹(t0) = I, and hence the general solution to the corresponding initial value problem is x = Φ(t) Φ⁻¹(t0) x0 = Φ(t) x0. It follows that for any fundamental matrix Ψ(t), Φ(t) = Ψ(t) Ψ⁻¹(t0).

The Fundamental Matrix  and Varying Initial Conditions Thus when using the fundamental matrix (t), the general solution to an IVP is This representation is useful if same system is to be solved for many different initial conditions, such as a physical system that can be started from many different initial states. Also, once (t) has been determined, the solution to each set of initial conditions can be found by matrix multiplication, as indicated by the equation above. Thus (t) represents a linear transformation of the initial conditions x0 into the solution x(t) at time t.

Example 2: Find (t) for 2 x 2 System (1 of 5) Find (t) such that (0) = I for the system below. Solution: First, we must obtain x(1)(t) and x(2)(t) such that We know from previous results that the general solution is Every solution can be expressed in terms of the general solution, and we use this fact to find x(1)(t) and x(2)(t).

Example 2: Use General Solution (2 of 5) Thus, to find x(1)(t), express it in terms of the general solution and then find the coefficients c1 and c2. To do so, use the initial condition x(1)(0) = (1, 0)^T to obtain c1 (1, 2)^T + c2 (1, -2)^T = (1, 0)^T, or equivalently, c1 + c2 = 1, 2c1 - 2c2 = 0.

Example 2: Solve for x(1)(t) (3 of 5) To find x(1)(t), we therefore solve this system by row reducing the augmented matrix: [1 1 | 1; 2 -2 | 0] → [1 0 | 1/2; 0 1 | 1/2], so c1 = 1/2, c2 = 1/2. Thus x(1)(t) = ( (1/2)e^(3t) + (1/2)e^(-t), e^(3t) - e^(-t) )^T.

Example 2: Solve for x(2)(t) (4 of 5) To find x(2)(t), we similarly solve c1 + c2 = 0, 2c1 - 2c2 = 1 by row reducing the augmented matrix: [1 1 | 0; 2 -2 | 1] → [1 0 | 1/4; 0 1 | -1/4], so c1 = 1/4, c2 = -1/4. Thus x(2)(t) = ( (1/4)e^(3t) - (1/4)e^(-t), (1/2)e^(3t) + (1/2)e^(-t) )^T.

Example 2: Obtain (t) (5 of 5) The columns of (t) are given by x(1)(t) and x(2)(t), and thus from the previous slide we have Note (t) is more complicated than (t) found in Ex 1. However, now that we have (t), it is much easier to determine the solution to any set of initial conditions.

Matrix Exponential Functions Consider the following two cases: The solution to x' = ax, x(0) = x0, is x = x0 e^(at), where e^0 = 1. The solution to x' = Ax, x(0) = x0, is x = Φ(t) x0, where Φ(0) = I. Comparing the form of the solution in these two cases, we might expect Φ(t) to have an exponential character. Indeed, it can be shown that Φ(t) = e^(At), where e^(At) = Σ (n = 0 to ∞) A^n t^n / n! is a well-defined matrix function that has all the usual properties of an exponential function. See text for details. Thus the solution to x' = Ax, x(0) = x0, is x = e^(At) x0.
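The defining series can be approximated directly by truncation, which makes the exponential character concrete. A minimal sketch, with A = [[1, 1], [4, 1]] and x0 assumed example values:

```python
import numpy as np

# Assumed example matrix; any square A would do.
A = np.array([[1.0, 1.0], [4.0, 1.0]])

def exp_At(t, terms=30):
    # Truncated series e^{At} ~ sum_{n=0}^{terms-1} A^n t^n / n!
    out = np.eye(2)
    term = np.eye(2)
    for n in range(1, terms):
        term = term @ (A * t) / n      # builds A^n t^n / n! incrementally
        out = out + term
    return out

x0 = np.array([1.0, 1.0])
x = lambda t: exp_At(t) @ x0           # solution of x' = Ax, x(0) = x0

assert np.allclose(exp_At(0.0), np.eye(2))                 # e^{A*0} = I
assert np.allclose(exp_At(0.5) @ exp_At(0.5), exp_At(1.0)) # semigroup property
```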

Example 3: Matrix Exponential Function Consider a diagonal matrix A = [a1 0; 0 a2]. Then A^2 = [a1^2 0; 0 a2^2], and in general, A^n = [a1^n 0; 0 a2^n]. Thus e^(At) = Σ (n = 0 to ∞) A^n t^n / n! = [ e^(a1 t) 0; 0 e^(a2 t) ].

Coupled Systems of Equations Recall that our constant coefficient homogeneous system written as x' = Ax, with, for example, A = [1 1; 4 1], i.e. x1' = x1 + x2, x2' = 4x1 + x2, is a system of coupled equations that must be solved simultaneously to find all the unknown variables.

Uncoupled Systems & Diagonal Matrices In contrast, if each equation involved only one variable, solved for independently of the other equations, then the task would be easier. In this case our system would have the form x1' = d1 x1, x2' = d2 x2, …, xn' = dn xn, or x' = Dx, where D is a diagonal matrix: D = diag(d1, d2, …, dn).

Uncoupling: Transform Matrix T In order to explore transforming our given system x' = Ax of coupled equations into an uncoupled system x' = Dx, where D is a diagonal matrix, we will use the eigenvectors of A. Suppose A is n x n with n linearly independent eigenvectors ξ(1),…, ξ(n), and corresponding eigenvalues λ1,…, λn. Define the n x n matrices T and D using the eigenvalues and eigenvectors of A: T = [ ξ(1) … ξ(n) ] (columns are the eigenvectors), D = diag(λ1,…, λn). Note that T is nonsingular, and hence T⁻¹ exists.

Uncoupling: T⁻¹AT = D Recall the definitions of A, T and D above. Then the columns of AT are Aξ(1),…, Aξ(n), and hence AT = [ λ1 ξ(1) … λn ξ(n) ] = TD. It follows that T⁻¹AT = D.
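This identity is easy to confirm numerically: build T from the eigenvectors, D from the eigenvalues, and compare. A = [[1, 1], [4, 1]] below is an assumed example:

```python
import numpy as np

# Assumed example matrix with two distinct real eigenvalues.
A = np.array([[1.0, 1.0], [4.0, 1.0]])

eigvals, T = np.linalg.eig(A)      # columns of T are eigenvectors of A
D = np.diag(eigvals)               # D = diag(lambda_1, lambda_2)

assert np.allclose(np.linalg.inv(T) @ A @ T, D)   # T^{-1} A T = D
assert np.allclose(sorted(eigvals), [-1.0, 3.0])  # eigenvalues of this example
```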

Similarity Transformations Thus, if the eigenvalues and eigenvectors of A are known, then A can be transformed into a diagonal matrix D, with T⁻¹AT = D. This process is known as a similarity transformation, and A is said to be similar to D. Alternatively, we could say that A is diagonalizable.

Similarity Transformations: Hermitian Case Recall: Our similarity transformation of A has the form T⁻¹AT = D, where D is diagonal and the columns of T are eigenvectors of A. If A is Hermitian, then A has n linearly independent orthogonal eigenvectors ξ(1),…, ξ(n), normalized so that (ξ(i), ξ(i)) = 1 for i = 1,…, n, and (ξ(i), ξ(k)) = 0 for i ≠ k. With this selection of eigenvectors, it can be shown that T⁻¹ = T*. In this case we can write our similarity transform as T*AT = D.
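For a real symmetric (hence Hermitian) matrix, np.linalg.eigh returns orthonormal eigenvectors, so T⁻¹ = T* reduces to T⁻¹ = Tᵀ. The matrix below is an assumed example:

```python
import numpy as np

# Assumed example symmetric matrix.
A = np.array([[2.0, 1.0], [1.0, 2.0]])

eigvals, T = np.linalg.eigh(A)     # eigh: orthonormal eigenvectors for Hermitian A

assert np.allclose(T.T @ T, np.eye(2))             # T* T = I, i.e. T^{-1} = T*
assert np.allclose(T.T @ A @ T, np.diag(eigvals))  # T* A T = D
```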

Nondiagonalizable A Finally, if A is n x n with fewer than n linearly independent eigenvectors, then there is no matrix T such that T⁻¹AT = D. In this case, A is not similar to a diagonal matrix and A is not diagonalizable.
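A classic defective example is A = [[1, 1], [0, 1]] (an assumed illustration, not from the slides): the eigenvalue 1 is repeated but admits only one independent eigenvector, which shows up numerically as a (nearly) singular eigenvector matrix:

```python
import numpy as np

# Assumed example of a defective (nondiagonalizable) matrix.
A = np.array([[1.0, 1.0], [0.0, 1.0]])

eigvals, vecs = np.linalg.eig(A)

assert np.allclose(eigvals, [1.0, 1.0])     # repeated eigenvalue
# The columns of vecs are numerically dependent, so no invertible T exists:
assert abs(np.linalg.det(vecs)) < 1e-6
```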

Example 4: Find Transformation Matrix T (1 of 2) For the matrix A = [1 1; 4 1] below, find the similarity transformation matrix T and show that A can be diagonalized. We already know that the eigenvalues are λ1 = 3, λ2 = -1 with corresponding eigenvectors ξ(1) = (1, 2)^T, ξ(2) = (1, -2)^T. Thus T = [1 1; 2 -2], D = [3 0; 0 -1].

Example 4: Similarity Transformation (2 of 2) To find T⁻¹, augment the identity to T and row reduce: [1 1 | 1 0; 2 -2 | 0 1] → [1 0 | 1/2 1/4; 0 1 | 1/2 -1/4], so T⁻¹ = [1/2 1/4; 1/2 -1/4]. Then T⁻¹AT = [1/2 1/4; 1/2 -1/4][1 1; 4 1][1 1; 2 -2] = [3 0; 0 -1] = D. Thus A is similar to D, and hence A is diagonalizable.
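The row-reduction result can be double-checked in a few lines; A, T, and T⁻¹ below are the assumed example values from this worked problem:

```python
import numpy as np

# Assumed example values for A, T, and the hand-computed T^{-1}.
A = np.array([[1.0, 1.0], [4.0, 1.0]])
T = np.array([[1.0, 1.0], [2.0, -2.0]])
T_inv = np.array([[0.5, 0.25], [0.5, -0.25]])

assert np.allclose(np.linalg.inv(T), T_inv)             # row reduction was correct
assert np.allclose(T_inv @ A @ T, np.diag([3.0, -1.0])) # T^{-1} A T = D
```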

Fundamental Matrices for Similar Systems (1 of 3) Recall our original system of differential equations x' = Ax. If A is n x n with n linearly independent eigenvectors, then A is diagonalizable. The eigenvectors form the columns of the nonsingular transform matrix T, and the eigenvalues are the corresponding diagonal entries of the diagonal matrix D. Suppose x satisfies x' = Ax, and let y be the n x 1 vector such that x = Ty; that is, let y be defined by y = T⁻¹x. Since x' = Ax and T is a constant matrix, we have Ty' = ATy, and hence y' = T⁻¹ATy = Dy. Therefore y satisfies y' = Dy, the system similar to x' = Ax. Both of these systems have fundamental matrices, which we examine next.

Fundamental Matrix for Diagonal System (2 of 3) A fundamental matrix for y' = Dy is given by Q(t) = e^(Dt). Recalling the definition of e^(Dt), we have Q(t) = diag( e^(λ1 t), …, e^(λn t) ).

Fundamental Matrix for Original System (3 of 3) To obtain a fundamental matrix Ψ(t) for x' = Ax, recall that the columns of Ψ(t) consist of fundamental solutions x satisfying x' = Ax. We also know x = Ty, and hence it follows that Ψ(t) = T Q(t) = T e^(Dt), whose columns are e^(λ1 t) ξ(1), …, e^(λn t) ξ(n). The columns of Ψ(t) give the expected fundamental solutions of x' = Ax.

Example 5: Fundamental Matrices for Similar Systems We now use the analysis and results of the last few slides. Applying the transformation x = Ty to x' = Ax below, with A = [1 1; 4 1] and T = [1 1; 2 -2], this system becomes y' = T⁻¹ATy = Dy, where D = [3 0; 0 -1]. A fundamental matrix for y' = Dy is given by Q(t) = e^(Dt) = [ e^(3t) 0; 0 e^(-t) ]. Thus a fundamental matrix Ψ(t) for x' = Ax is Ψ(t) = T Q(t) = [ e^(3t) e^(-t); 2e^(3t) -2e^(-t) ].
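The whole chain (T from eigenvectors, Q(t) = e^(Dt), then Ψ(t) = T Q(t)) can be verified numerically; A, T, and the eigenvalues below are assumed example values consistent with the sketch above:

```python
import numpy as np

# Assumed example: A = [[1, 1], [4, 1]], eigenvalues 3 and -1,
# eigenvector matrix T = [[1, 1], [2, -2]].
A = np.array([[1.0, 1.0], [4.0, 1.0]])
T = np.array([[1.0, 1.0], [2.0, -2.0]])

def Psi(t):
    Q = np.diag([np.exp(3*t), np.exp(-t)])   # Q(t) = e^{Dt} for D = diag(3, -1)
    return T @ Q                             # Psi(t) = T Q(t)

t, h = 0.3, 1e-6
dPsi = (Psi(t + h) - Psi(t - h)) / (2*h)
assert np.allclose(dPsi, A @ Psi(t), rtol=1e-5)   # each column solves x' = Ax
assert abs(np.linalg.det(Psi(t))) > 0             # Psi(t) is nonsingular
```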