Boyce/DiPrima 9th ed, Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Boyce/DiPrima 9th ed, Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues. Elementary Differential Equations and Boundary Value Problems, 9th edition, by William E. Boyce and Richard C. DiPrima, ©2009 by John Wiley & Sons, Inc. A system of n linear equations in n variables, a11x1 + a12x2 + … + a1nxn = b1, …, an1x1 + an2x2 + … + annxn = bn, can be expressed as a matrix equation Ax = b, where A is the n x n coefficient matrix, x is the column vector of unknowns, and b is the column vector of right-hand sides. If b = 0, then the system is homogeneous; otherwise it is nonhomogeneous.

Nonsingular Case If the coefficient matrix A is nonsingular, then it is invertible and we can solve Ax = b as follows: x = A^(-1)b. This solution is therefore unique. Also, if b = 0, it follows that the unique solution to Ax = 0 is x = A^(-1)0 = 0. Thus if A is nonsingular, then the only solution to Ax = 0 is the trivial solution x = 0.
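A minimal NumPy sketch of the nonsingular case; the matrix and right-hand side below are illustrative placeholders, not the ones from the slides:

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])   # placeholder nonsingular matrix
    b = np.array([1.0, 2.0, 3.0])     # placeholder right-hand side

    assert np.linalg.det(A) != 0          # A is nonsingular
    x = np.linalg.solve(A, b)             # preferred over forming the inverse
    x_via_inverse = np.linalg.inv(A) @ b  # x = A^(-1) b, same result
    print(x, np.allclose(x, x_via_inverse))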

Example 1: Nonsingular Case (1 of 3) From a previous example, we know that the matrix A is nonsingular, with its inverse A^(-1) as given. Using the definition of matrix multiplication, it follows that the only solution of Ax = 0 is x = A^(-1)0 = 0.

Example 1: Nonsingular Case (2 of 3) Now let's solve the nonhomogeneous linear system Ax = b using A^(-1). Writing the system of equations in the matrix form Ax = b, the solution is then x = A^(-1)b.

Example 1: Nonsingular Case (3 of 3) Alternatively, we could solve the nonhomogeneous linear system Ax = b below using row reduction. To do so, form the augmented matrix (A|b) and reduce, using elementary row operations.
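A sketch of the row-reduction route using SymPy's rref(); the entries are illustrative placeholders, not the system from the slides:

    import sympy as sp

    A = sp.Matrix([[2, 1, 0],
                   [1, 3, 1],
                   [0, 1, 2]])           # placeholder coefficient matrix
    b = sp.Matrix([1, 2, 3])             # placeholder right-hand side

    augmented = A.row_join(b)             # form the augmented matrix (A|b)
    rref_form, pivots = augmented.rref()  # reduce with elementary row operations
    print(rref_form)                      # last column holds the unique solution x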

Singular Case If the coefficient matrix A is singular, then A^(-1) does not exist, and either a solution to Ax = b does not exist, or there is more than one solution (not unique). Further, the homogeneous system Ax = 0 has more than one solution. That is, in addition to the trivial solution x = 0, there are infinitely many nontrivial solutions. The nonhomogeneous case Ax = b has no solution unless (b, y) = 0 for all vectors y satisfying A*y = 0, where A* is the adjoint of A. In this case, Ax = b has infinitely many solutions, each of the form x = x(0) + ξ, where x(0) is a particular solution of Ax = b and ξ is any solution of Ax = 0.
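A sketch of this solvability test: Ax = b has solutions exactly when b is orthogonal to every solution y of A*y = 0. The matrix and vector are illustrative placeholders:

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],
                  [1.0, 1.0, 1.0]])   # rank 2, hence singular (placeholder)
    b = np.array([1.0, 2.0, 0.0])     # placeholder right-hand side

    Y = null_space(A.conj().T)          # columns span {y : A*y = 0}
    solvable = np.allclose(Y.T @ b, 0)  # is (b, y) = 0 for all such y?
    print(solvable)                     # True here, so Ax = b has infinitely many solutions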

Example 2: Singular Case (1 of 2) Solve the nonhomogeneous linear system Ax = b below using row reduction. Observe that the coefficients are nearly the same as in the previous example. We will form the augmented matrix (A|b) and use some of the steps in Example 1 to transform the matrix more quickly.

Example 2: Singular Case (2 of 2) From the previous slide, unless the components of b satisfy a certain condition, there is no solution to the system of equations. Assuming that condition holds and choosing particular values for the components of b, the reduced augmented matrix (A|b) yields the solution. It can be shown that the second term in x is a particular solution of the nonhomogeneous equation, and that the first term, with its arbitrary constant α, is the most general solution of the homogeneous equation.

Linear Dependence and Independence A set of vectors x(1), x(2),…, x(n) is linearly dependent if there exist scalars c1, c2,…, cn, not all zero, such that c1x(1) + c2x(2) + … + cnx(n) = 0. If the only solution of this equation is c1 = c2 = … = cn = 0, then x(1), x(2),…, x(n) is linearly independent.
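A sketch of this test in NumPy: put the vectors in the columns of a matrix and check whether c = 0 is the only solution of the homogeneous system, i.e. whether the matrix has full column rank. The vectors are illustrative placeholders:

    import numpy as np

    vectors = [np.array([1.0, 2.0, 3.0]),
               np.array([0.0, 1.0, 1.0]),
               np.array([1.0, 3.0, 4.0])]   # third = first + second (placeholder set)

    X = np.column_stack(vectors)
    independent = np.linalg.matrix_rank(X) == len(vectors)
    print(independent)   # False: this set is linearly dependent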

Example 3: Linear Dependence (1 of 2) Determine whether the following vectors are linearly dependent or linearly independent. We need to solve c1x(1) + c2x(2) + … + cnx(n) = 0, or equivalently the corresponding homogeneous linear system for the coefficients c1, c2,…, cn.

Example 3: Linear Dependence (2 of 2) We can reduce the augmented matrix (A|b), as before, and obtain a nontrivial solution for the coefficients; so the vectors are linearly dependent. Alternatively, we could show that the determinant of the matrix whose columns are the given vectors is zero.

Linear Independence and Invertibility Consider the previous two examples: The first matrix was known to be nonsingular, and its column vectors were linearly independent. The second matrix was known to be singular, and its column vectors were linearly dependent. This is true in general: the columns (or rows) of A are linearly independent iff A is nonsingular iff A^(-1) exists. Also, A is nonsingular iff det A ≠ 0, hence the columns (or rows) of A are linearly independent iff det A ≠ 0. Further, if A = BC, then det(A) = det(B)det(C). Thus if the columns (or rows) of A and B are linearly independent, then the columns (or rows) of C are also.
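A quick numeric check of the product rule just stated, det(A) = det(B)det(C) when A = BC; the matrices are random placeholders:

    import numpy as np

    rng = np.random.default_rng(0)
    B = rng.standard_normal((3, 3))   # placeholder factors
    C = rng.standard_normal((3, 3))
    A = B @ C

    print(np.isclose(np.linalg.det(A), np.linalg.det(B) * np.linalg.det(C)))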

Linear Dependence & Vector Functions Now consider vector functions x(1)(t), x(2)(t),…, x(n)(t), whose components are functions of t defined on an interval I. As before, x(1)(t), x(2)(t),…, x(n)(t) is linearly dependent on I if there exist scalars c1, c2,…, cn, not all zero, such that c1x(1)(t) + c2x(2)(t) + … + cnx(n)(t) = 0 for all t in I. Otherwise x(1)(t), x(2)(t),…, x(n)(t) is linearly independent on I. See the text for more discussion of this.

Eigenvalues and Eigenvectors The equation Ax = y can be viewed as a linear transformation that maps (or transforms) x into a new vector y. Nonzero vectors x that transform into multiples of themselves are important in many applications. Thus we solve Ax = λx or, equivalently, (A - λI)x = 0. This equation has a nonzero solution only if we choose λ such that det(A - λI) = 0. Such values of λ are called eigenvalues of A, and the corresponding nonzero solutions x are called eigenvectors.
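A minimal sketch of the eigenvalue problem with NumPy; the 2 x 2 matrix is an illustrative placeholder chosen to have eigenvalues 2 and -1, matching Example 4 below, though it is not necessarily the matrix from the slides:

    import numpy as np

    A = np.array([[3.0, -1.0],
                  [4.0, -2.0]])   # placeholder matrix with eigenvalues 2 and -1

    eigenvalues, eigenvectors = np.linalg.eig(A)   # eigenvectors are the columns
    for lam, v in zip(eigenvalues, eigenvectors.T):
        # each pair satisfies A v = lambda v (up to floating-point error)
        print(lam, np.allclose(A @ v, lam * v))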

Example 4: Eigenvalues (1 of 3) Find the eigenvalues and eigenvectors of the matrix A. Solution: Choose λ such that det(A - λI) = 0, as follows.

Example 4: First Eigenvector (2 of 3) To find the eigenvectors of the matrix A, we need to solve (A - λI)x = 0 for λ = 2 and λ = -1. Eigenvector for λ = 2: Solve (A - 2I)x = 0; the resulting relation between the components of x determines the eigenvector up to an arbitrary nonzero constant.

Example 4: Second Eigenvector (3 of 3) Eigenvector for λ = -1: Solve (A + I)x = 0; again the resulting relation between the components of x determines the eigenvector up to an arbitrary nonzero constant.

Normalized Eigenvectors From the previous example, we see that eigenvectors are determined up to a nonzero multiplicative constant. If this constant is specified in some particular way, then the eigenvector is said to be normalized. For example, eigenvectors are sometimes normalized by choosing the constant so that ||x|| = (x, x)^(1/2) = 1.
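A sketch of this normalization; the vector is an illustrative placeholder:

    import numpy as np

    x = np.array([1.0, 4.0])          # an eigenvector, determined up to a constant
    x_unit = x / np.linalg.norm(x)    # choose the constant so that ||x|| = 1
    print(x_unit, np.isclose(np.linalg.norm(x_unit), 1.0))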

Algebraic and Geometric Multiplicity In finding the eigenvalues λ of an n x n matrix A, we solve det(A - λI) = 0. Since this involves finding the determinant of an n x n matrix, the problem reduces to finding the roots of an nth-degree polynomial. Denote these roots, or eigenvalues, by λ1, λ2, …, λn. If an eigenvalue is repeated m times, then its algebraic multiplicity is m. Each eigenvalue has at least one eigenvector, and an eigenvalue of algebraic multiplicity m may have q linearly independent eigenvectors, 1 ≤ q ≤ m; q is called the geometric multiplicity of the eigenvalue.
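A sketch of the distinction: the geometric multiplicity of λ is n - rank(A - λI). The matrix is an illustrative placeholder with a repeated eigenvalue:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 2.0]])        # eigenvalue 2 with algebraic multiplicity 2 (placeholder)
    lam = 2.0
    n = A.shape[0]

    geometric = n - np.linalg.matrix_rank(A - lam * np.eye(n))
    print(geometric)                  # 1: only one linearly independent eigenvector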

Eigenvectors and Linear Independence If an eigenvalue λ has algebraic multiplicity 1, then it is said to be simple, and its geometric multiplicity is 1 also. If each eigenvalue of an n x n matrix A is simple, then A has n distinct eigenvalues. It can be shown that the n eigenvectors corresponding to these eigenvalues are linearly independent. If the matrix has one or more repeated eigenvalues, then there may be fewer than n linearly independent eigenvectors, since for each repeated eigenvalue we may have q < m. This may lead to complications in solving systems of differential equations.

Example 5: Eigenvalues (1 of 5) Find the eigenvalues and eigenvectors of the matrix A. Solution: Choose λ such that det(A - λI) = 0, as follows.

Example 5: First Eigenvector (2 of 5) Eigenvector for λ = 2: Solve (A - 2I)x = 0, as follows.

Example 5: 2nd and 3rd Eigenvectors (3 of 5) Eigenvectors for λ = -1: Solve (A + I)x = 0, as follows.

Example 5: Eigenvectors of A (4 of 5) Thus A has three eigenvectors x(1), x(2), x(3), where x(2) and x(3) correspond to the double eigenvalue λ = -1. It can be shown that x(1), x(2), x(3) are linearly independent. Hence A is a 3 x 3 symmetric matrix (A = A^T) with 3 real eigenvalues and 3 linearly independent eigenvectors.

Example 5: Eigenvectors of A (5 of 5) Note that if we had chosen x(2) and x(3) differently, then the eigenvectors would be orthogonal, since each pairwise inner product would be zero. Thus A is a 3 x 3 symmetric matrix with 3 real eigenvalues and 3 linearly independent, orthogonal eigenvectors.

Hermitian Matrices A self-adjoint, or Hermitian, matrix satisfies A = A*, where we recall that A* is the conjugate transpose of A. Thus for a Hermitian matrix, aij equals the complex conjugate of aji. Note that if A has real entries and is symmetric (see the last example), then A is Hermitian. An n x n Hermitian matrix A has the following properties: All eigenvalues of A are real. There exists a full set of n linearly independent eigenvectors of A. If x(1) and x(2) are eigenvectors that correspond to different eigenvalues of A, then x(1) and x(2) are orthogonal. Corresponding to an eigenvalue of algebraic multiplicity m, it is possible to choose m mutually orthogonal eigenvectors, and hence A has a full set of n linearly independent, orthogonal eigenvectors.
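A sketch checking these properties with NumPy's eigh, which is intended for Hermitian (or real symmetric) matrices. The 2 x 2 matrix is an illustrative placeholder satisfying A = A*:

    import numpy as np

    A = np.array([[2.0, 1.0 - 1.0j],
                  [1.0 + 1.0j, 3.0]])        # placeholder Hermitian matrix
    assert np.allclose(A, A.conj().T)        # A equals its conjugate transpose

    eigenvalues, eigenvectors = np.linalg.eigh(A)
    print(np.allclose(eigenvalues.imag, 0.0))  # all eigenvalues are real
    # columns of 'eigenvectors' are mutually orthogonal (they form a unitary matrix)
    print(np.allclose(eigenvectors.conj().T @ eigenvectors, np.eye(2)))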