Linear Equations in Linear Algebra
1.4 THE MATRIX EQUATION Ax = b
© 2012 Pearson Education, Inc.

THE MATRIX EQUATION Ax = b
Definition: If A is an m × n matrix, with columns a1, …, an, and if x is in R^n, then the product of A and x, denoted by Ax, is the linear combination of the columns of A using the corresponding entries in x as weights; that is,
    Ax = [a1  a2  …  an] x = x1 a1 + x2 a2 + … + xn an.
Ax is defined only if the number of columns of A equals the number of entries in x.
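The definition can be checked numerically. Below is a minimal sketch, assuming NumPy and an illustrative matrix and vector (not taken from the slides), that computes Ax both with the built-in matrix-vector product and as the weighted sum of the columns of A.

    import numpy as np

    # Illustrative 2x3 matrix and length-3 vector (chosen for the example).
    A = np.array([[1.0, 2.0, -1.0],
                  [0.0, -5.0, 3.0]])
    x = np.array([4.0, 3.0, 7.0])

    # Built-in matrix-vector product.
    direct = A @ x

    # The definition: Ax = x1*a1 + x2*a2 + ... + xn*an,
    # a weighted sum of the columns of A with the entries of x as weights.
    as_combination = sum(x[j] * A[:, j] for j in range(A.shape[1]))

    assert np.allclose(direct, as_combination)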

THE MATRIX EQUATION Ax = b
Example 1: For v1, v2, v3 in R^3, write the linear combination 3v1 - 5v2 + 7v3 as a matrix times a vector.
Solution: Place v1, v2, v3 into the columns of a matrix A and place the weights 3, -5, and 7 into a vector x. That is,
    3v1 - 5v2 + 7v3 = [v1  v2  v3] [3, -5, 7]^T = Ax.

THE MATRIX EQUATION Ax = b
Now, write a system of linear equations as a vector equation involving a linear combination of vectors. For example, the system
    x1 + 2x2 - x3 = 4
        -5x2 + 3x3 = 1        ----(1)
is equivalent to
    x1 [1, 0]^T + x2 [2, -5]^T + x3 [-1, 3]^T = [4, 1]^T.        ----(2)

THE MATRIX EQUATION Ax = b
As in Example 1, the linear combination on the left side of (2) is a matrix times a vector, so that (2) becomes
    [1  2  -1; 0  -5  3] [x1, x2, x3]^T = [4, 1]^T.        ----(3)
Equation (3) has the form Ax = b. Such an equation is called a matrix equation.

THE MATRIX EQUATION Ax = b
Theorem 3: If A is an m × n matrix, with columns a1, …, an, and if b is in R^m, then the matrix equation
    Ax = b
has the same solution set as the vector equation
    x1 a1 + x2 a2 + … + xn an = b,
which, in turn, has the same solution set as the system of linear equations whose augmented matrix is
    [a1  a2  …  an  b].
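Theorem 3 is easy to illustrate numerically. A small sketch, assuming NumPy and a square invertible example matrix (so that np.linalg.solve applies): the same vector x that solves the matrix equation Ax = b is also the weight vector that writes b as a linear combination of the columns of A.

    import numpy as np

    # Invertible 3x3 example (illustrative only, not from the slides).
    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    b = np.array([3.0, 5.0, 4.0])

    # Solve the matrix equation Ax = b.
    x = np.linalg.solve(A, b)

    # The same x solves the vector equation x1*a1 + x2*a2 + x3*a3 = b.
    combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
    assert np.allclose(combo, b)

    # And the original matrix equation is satisfied, i.e. the system with
    # augmented matrix [A  b] is consistent with this solution.
    assert np.allclose(A @ x, b)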

EXISTENCE OF SOLUTIONS
The equation Ax = b has a solution if and only if b is a linear combination of the columns of A.
Theorem 4: Let A be an m × n matrix. Then the following statements are logically equivalent. That is, for a particular A, either they are all true statements or they are all false.
a. For each b in R^m, the equation Ax = b has a solution.
b. Each b in R^m is a linear combination of the columns of A.
c. The columns of A span R^m.
d. A has a pivot position in every row.
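Condition (d) can be spot-checked in code: A has a pivot position in every row exactly when its rank equals the number of rows. A minimal sketch, assuming NumPy, whose matrix_rank is an SVD-based numerical rank and is adequate for small, well-conditioned examples:

    import numpy as np

    def columns_span_Rm(A, tol=1e-10):
        # True when the columns of A span R^m, i.e. when A has a pivot
        # position in every row (numerical rank equals the row count).
        m = A.shape[0]
        return np.linalg.matrix_rank(A, tol=tol) == m

    # A pivot in both rows: the columns span R^2.
    print(columns_span_Rm(np.array([[1.0, 2.0, -1.0],
                                    [0.0, -5.0, 3.0]])))   # True

    # No pivot in the second row (it is a multiple of the first):
    # the columns do not span R^2.
    print(columns_span_Rm(np.array([[1.0, 2.0],
                                    [2.0, 4.0]])))          # False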

PROOF OF THEOREM 4
Statements (a), (b), and (c) are logically equivalent. So, it suffices to show (for an arbitrary matrix A) that (a) and (d) are either both true or both false.
Let U be an echelon form of A. Given b in R^m, we can row reduce the augmented matrix [A  b] to an augmented matrix [U  d] for some d in R^m:
    [A  b] ~ … ~ [U  d].
If statement (d) is true, then each row of U contains a pivot position, and there can be no pivot in the augmented column.

PROOF OF THEOREM 4
So Ax = b has a solution for any b, and (a) is true.
If (d) is false, then the last row of U is all zeros. Let d be any vector with a 1 in its last entry. Then [U  d] represents an inconsistent system.
Since row operations are reversible, [U  d] can be transformed into the form [A  b]. The new system Ax = b is also inconsistent, and (a) is false.

COMPUTATION OF Ax
Example 2: Compute Ax, where
    A = [2  3  4; -1  5  -3; 6  -2  8]   and   x = [x1, x2, x3]^T.
Solution: From the definition,
    Ax = x1 [2, -1, 6]^T + x2 [3, 5, -2]^T + x3 [4, -3, 8]^T.

COMPUTATION OF Ax
    Ax = [2x1 + 3x2 + 4x3,  -x1 + 5x2 - 3x3,  6x1 - 2x2 + 8x3]^T.        ----(1)
The first entry in the product Ax is a sum of products (a dot product), using the first row of A and the entries in x.

COMPUTATION OF Ax
That is, the first entry is
    2x1 + 3x2 + 4x3,
obtained from the first row (2, 3, 4) of A and the entries x1, x2, x3 of x.
Similarly, the second entry in Ax can be calculated by multiplying the entries in the second row of A by the corresponding entries in x and then summing the resulting products.
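The symbolic computation can be reproduced with SymPy. A minimal sketch, assuming SymPy is available and using the entries of A as reconstructed in Example 2 above:

    from sympy import Matrix, symbols

    x1, x2, x3 = symbols('x1 x2 x3')

    A = Matrix([[2, 3, 4],
                [-1, 5, -3],
                [6, -2, 8]])
    x = Matrix([x1, x2, x3])

    # Each entry of A*x is the dot product of a row of A with x.
    print(A * x)
    # The result is the column with entries
    # 2*x1 + 3*x2 + 4*x3, -x1 + 5*x2 - 3*x3, 6*x1 - 2*x2 + 8*x3.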

ROW-VECTOR RULE FOR COMPUTING Ax
Likewise, the third entry in Ax can be calculated from the third row of A and the entries in x.
If the product Ax is defined, then the ith entry in Ax is the sum of the products of corresponding entries from row i of A and from the vector x.
The matrix with 1s on the diagonal and 0s elsewhere is called an identity matrix and is denoted by I. For example,
    I = [1  0  0; 0  1  0; 0  0  1]
is an identity matrix. It can be verified that Ix = x for every x in R^3.
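A sketch of the row-vector rule, assuming NumPy; it reuses the matrix entries from Example 2 with an arbitrary numeric x, computes each entry of Ax as a dot product of a row of A with x, and also checks that the identity matrix leaves x unchanged.

    import numpy as np

    A = np.array([[2.0, 3.0, 4.0],
                  [-1.0, 5.0, -3.0],
                  [6.0, -2.0, 8.0]])
    x = np.array([1.0, 2.0, 3.0])   # arbitrary numeric vector for illustration

    # Row-vector rule: entry i of Ax is the dot product of row i of A with x.
    by_rows = np.array([A[i, :] @ x for i in range(A.shape[0])])
    assert np.allclose(by_rows, A @ x)

    # The identity matrix: I x = x for every x.
    I3 = np.eye(3)
    assert np.allclose(I3 @ x, x)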

PROPERTIES OF THE MATRIX-VECTOR PRODUCT Ax
Theorem 5: If A is an m × n matrix, u and v are vectors in R^n, and c is a scalar, then:
a. A(u + v) = Au + Av;
b. A(cu) = c(Au).
Proof: For simplicity, take n = 3, A = [a1  a2  a3], and u, v in R^3. For i = 1, 2, 3, let ui and vi be the ith entries in u and v, respectively.

PROPERTIES OF THE MATRIX-VECTOR PRODUCT Ax
To prove statement (a), compute A(u + v) as a linear combination of the columns of A using the entries in u + v as weights:
    A(u + v) = [a1  a2  a3] [u1 + v1, u2 + v2, u3 + v3]^T
             = (u1 + v1) a1 + (u2 + v2) a2 + (u3 + v3) a3
             = (u1 a1 + u2 a2 + u3 a3) + (v1 a1 + v2 a2 + v3 a3)
             = Au + Av.
Here the entries in u + v are the weights and the columns of A are the vectors being combined.

PROPERTIES OF THE MATRIX-VECTOR PRODUCT Ax
To prove statement (b), compute A(cu) as a linear combination of the columns of A using the entries in cu as weights:
    A(cu) = [a1  a2  a3] [cu1, cu2, cu3]^T
          = (cu1) a1 + (cu2) a2 + (cu3) a3
          = c(u1 a1 + u2 a2 + u3 a3)
          = c(Au).
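Both properties in Theorem 5 are easy to spot-check numerically. A minimal sketch, assuming NumPy and randomly generated example data:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 4))   # an arbitrary 3x4 matrix
    u = rng.standard_normal(4)
    v = rng.standard_normal(4)
    c = 2.5

    # (a) A(u + v) = Au + Av
    assert np.allclose(A @ (u + v), A @ u + A @ v)

    # (b) A(cu) = c(Au)
    assert np.allclose(A @ (c * u), c * (A @ u))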