4.8 Rank

Rank enables one to relate matrices to vectors, and vice versa.

Definition: Let A be an m × n matrix. The rows of A may be viewed as row vectors r1, …, rm, and the columns as column vectors c1, …, cn. Each row vector has n components, and each column vector has m components. The row vectors span a subspace of R^n called the row space of A, and the column vectors span a subspace of R^m called the column space of A.

Example 1: Let A be a 3 × 4 matrix. (1) The row vectors of A are vectors in R^4; they span a subspace of R^4 called the row space of A. (2) The column vectors of A are vectors in R^3; they span a subspace of R^3 called the column space of A.
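Since the specific matrix of Example 1 is not reproduced here, the following is a minimal Python/NumPy sketch with a hypothetical 3 × 4 matrix, showing how the row and column vectors are read off:

```python
import numpy as np

# Hypothetical 3 x 4 matrix standing in for the A of Example 1.
A = np.array([[1, 2, 0, 3],
              [4, 0, 5, 8],
              [8, 1, 5, 6]])

rows = [A[i, :] for i in range(A.shape[0])]   # row vectors, each in R^4
cols = [A[:, j] for j in range(A.shape[1])]   # column vectors, each in R^3

print("row vectors:", rows)     # these span the row space, a subspace of R^4
print("column vectors:", cols)  # these span the column space, a subspace of R^3
```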

Theorem 4.16: The row space and the column space of a matrix A have the same dimension.

Definition: The common dimension of the row space and the column space of a matrix A is called the rank of A, denoted rank(A).
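The equality of row rank and column rank is easy to check numerically. A small sketch with NumPy (the matrix is an arbitrary example of our own):

```python
import numpy as np

A = np.array([[1, 2, 0, 3],
              [2, 4, 1, 1],
              [3, 6, 1, 4]])   # third row = row 1 + row 2

# The rank of A (dimension of the column space) equals the rank of
# its transpose (dimension of the row space), illustrating Theorem 4.16.
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
print(np.linalg.matrix_rank(A))  # -> 2
```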

Example 2: Determine the rank of the matrix

A =
[ 1  2  3 ]
[ 0  1  2 ]
[ 2  5  8 ]

Solution: The third row of A is a linear combination of the first two rows:

(2, 5, 8) = 2(1, 2, 3) + (0, 1, 2)

Hence the three rows of A are linearly dependent, so the rank of A must be less than 3. Since (1, 2, 3) is not a scalar multiple of (0, 1, 2), these two vectors are linearly independent and form a basis for the row space of A. Thus rank(A) = 2.
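A quick numerical confirmation of Example 2 (a sketch using NumPy):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [0, 1, 2],
              [2, 5, 8]])

# The third row is 2*(row 1) + (row 2), so the rows are dependent.
assert np.allclose(A[2], 2 * A[0] + A[1])

# The first two rows are independent, so the rank is 2.
assert np.linalg.matrix_rank(A) == 2
```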

Theorem 4.17: The nonzero row vectors of a matrix A that is in reduced echelon form are a basis for the row space of A. The rank of A is the number of nonzero row vectors.

Example 3: Find the rank of the matrix

A =
[ 1  2  0  0 ]
[ 0  0  1  0 ]
[ 0  0  0  1 ]
[ 0  0  0  0 ]

This matrix is in reduced echelon form. There are three nonzero row vectors, namely (1, 2, 0, 0), (0, 0, 1, 0), and (0, 0, 0, 1). By the previous theorem, these three vectors form a basis for the row space of A, so rank(A) = 3.

Theorem 4.18: Let A and B be row equivalent matrices. Then (1) A and B have the same row space, and (2) rank(A) = rank(B).

Theorem 4.19: Let E be a reduced echelon form of a matrix A. The nonzero row vectors of E form a basis for the row space of A. The rank of A is the number of nonzero row vectors in E.

Example 4: Find a basis for the row space of a given matrix A, and determine its rank.

Solution: Use elementary row operations to find a reduced echelon form of A. The nonzero rows of the reduced echelon form are (1, 0, 7) and (0, 1, -2). By Theorem 4.19, these two vectors form a basis for the row space of A, and rank(A) = 2.
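The original matrix A is not shown in the transcript, but the computation can be sketched with SymPy's rref, here run on a hypothetical matrix whose reduced echelon form has the nonzero rows above:

```python
from sympy import Matrix

# Hypothetical matrix; its reduced echelon form has nonzero rows
# (1, 0, 7) and (0, 1, -2), matching Example 4.
A = Matrix([[1, 0, 7],
            [0, 1, -2],
            [1, 1, 5]])   # third row = row 1 + row 2

E, pivots = A.rref()      # reduced echelon form and pivot columns
print(E)                  # Matrix([[1, 0, 7], [0, 1, -2], [0, 0, 0]])
print(len(pivots))        # rank = number of nonzero rows -> 2
```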

4.9 Orthonormal Vectors and Projections

Definition: A set of vectors in a vector space V is said to be an orthogonal set if every pair of vectors in the set is orthogonal. The set is said to be an orthonormal set if it is orthogonal and each vector is a unit vector.

Example 1: Show that a given set of vectors is an orthonormal set.

Solution: (1) Orthogonal: verify that the dot product of each pair of vectors in the set is zero. (2) Unit vectors: verify that each vector in the set has norm 1. The set is thus an orthonormal set.
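Since the specific vectors of Example 1 are lost in the transcript, here is a small sketch of the general check in Python/NumPy (the function name is_orthonormal is our own):

```python
import numpy as np

def is_orthonormal(vectors, tol=1e-10):
    """Check that every pair is orthogonal and every vector has norm 1."""
    V = np.array(vectors, dtype=float)
    # V @ V.T collects the dot products of all pairs; for an orthonormal
    # set it must equal the identity matrix.
    return np.allclose(V @ V.T, np.eye(len(V)), atol=tol)

# Example: two orthonormal vectors in R^2.
print(is_orthonormal([(1/np.sqrt(2), 1/np.sqrt(2)),
                      (1/np.sqrt(2), -1/np.sqrt(2))]))  # True
```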

Theorem 4.20: An orthogonal set of nonzero vectors in a vector space is linearly independent.

Proof: Let {v1, …, vm} be an orthogonal set of nonzero vectors in a vector space V. Consider the identity

c1 v1 + c2 v2 + … + cm vm = 0

Let vi be the ith vector of the orthogonal set. Take the dot product of each side of the equation with vi and use the properties of the dot product. We get

c1 (v1 · vi) + c2 (v2 · vi) + … + cm (vm · vi) = 0

Since the vectors v1, …, vm are mutually orthogonal, vj · vi = 0 unless j = i. Thus

ci (vi · vi) = 0

Since vi is nonzero, vi · vi ≠ 0, so ci = 0. Letting i = 1, …, m, we get c1 = 0, …, cm = 0, proving that the vectors are linearly independent.
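A quick numerical illustration (a sketch, not part of the proof): stacking an orthogonal set of nonzero vectors into a matrix gives full row rank, i.e. the vectors are independent.

```python
import numpy as np

# An orthogonal set of nonzero vectors in R^3.
V = np.array([[1, 1, 0],
              [1, -1, 0],
              [0, 0, 2]], dtype=float)

G = V @ V.T
# Pairwise dot products vanish off the diagonal ...
assert np.allclose(G, np.diag(np.diag(G)))
# ... and the rank equals the number of vectors: they are independent.
assert np.linalg.matrix_rank(V) == 3
```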

Definition: A basis that is an orthogonal set is said to be an orthogonal basis. A basis that is an orthonormal set is said to be an orthonormal basis.

Standard bases (all orthonormal):
R^2: {(1, 0), (0, 1)}
R^3: {(1, 0, 0), (0, 1, 0), (0, 0, 1)}
R^n: {(1, 0, …, 0), …, (0, 0, …, 1)}

Theorem 4.21: Let {u1, …, un} be an orthonormal basis for a vector space V, and let v be a vector in V. Then v can be written as a linear combination of these basis vectors as follows:

v = (v · u1) u1 + (v · u2) u2 + … + (v · un) un
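A sketch verifying Theorem 4.21 numerically for an orthonormal basis of R^2 (the basis and the vector v are our own choices):

```python
import numpy as np

# Orthonormal basis of R^2.
u1 = np.array([1, 1]) / np.sqrt(2)
u2 = np.array([1, -1]) / np.sqrt(2)
v = np.array([3.0, 1.0])

# The coefficients are just dot products with the basis vectors.
c1, c2 = v @ u1, v @ u2
assert np.allclose(c1 * u1 + c2 * u2, v)  # v = (v.u1)u1 + (v.u2)u2
```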

Example 2: Let u1, u2, and u3 form an orthonormal basis for R^3. Express the vector v = (7, -5, 10) as a linear combination of these vectors.

Solution: By Theorem 4.21,

v = (v · u1) u1 + (v · u2) u2 + (v · u3) u3

Compute the three dot products v · u1, v · u2, and v · u3; these are the coefficients of u1, u2, and u3 in the linear combination.

Projection of One Vector onto Another Vector

Let v and u be vectors in R^n with angle α (0 ≤ α ≤ π) between them (Figure 4.17). Let OA be the projection of v onto u. Its length is

‖v‖ cos α = ‖v‖ (v · u)/(‖v‖ ‖u‖) = (v · u)/‖u‖

and its direction is that of the unit vector u/‖u‖. So we define

Definition: The projection of a vector v onto a nonzero vector u in R^n is denoted proj_u v and is defined by

proj_u v = ((v · u)/(u · u)) u

(Figure 4.18)

Example 3: Determine the projection of the vector v = (6, 7) onto the vector u = (1, 4).

Solution:

v · u = (6)(1) + (7)(4) = 34
u · u = (1)(1) + (4)(4) = 17

Thus

proj_u v = ((v · u)/(u · u)) u = (34/17)(1, 4) = 2(1, 4) = (2, 8)

The projection of v onto u is (2, 8).
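The same computation as a small Python/NumPy sketch (the function name proj is our own):

```python
import numpy as np

def proj(v, u):
    """Projection of v onto the nonzero vector u: ((v.u)/(u.u)) u."""
    v, u = np.asarray(v, dtype=float), np.asarray(u, dtype=float)
    return (v @ u) / (u @ u) * u

print(proj([6, 7], [1, 4]))  # -> [2. 8.], matching Example 3
```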

Theorem 4.22 (The Gram-Schmidt Orthogonalization Process): Let {v1, …, vn} be a basis for a vector space V. The set of vectors {u1, …, un} defined as follows is orthogonal:

u1 = v1
u2 = v2 - proj_{u1} v2
u3 = v3 - proj_{u1} v3 - proj_{u2} v3
…
un = vn - proj_{u1} vn - … - proj_{u(n-1)} vn

To obtain an orthonormal basis for V, normalize each of the vectors u1, …, un. (Figure 4.19)

Example 4: The set {(1, 2, 0, 3), (4, 0, 5, 8), (8, 1, 5, 6)} is linearly independent in R^4. The vectors form a basis for a three-dimensional subspace V of R^4. Construct an orthonormal basis for V.

Solution: Let v1 = (1, 2, 0, 3), v2 = (4, 0, 5, 8), v3 = (8, 1, 5, 6). Use the Gram-Schmidt process to construct an orthogonal set {u1, u2, u3} from these vectors:

u1 = v1 = (1, 2, 0, 3)

u2 = v2 - proj_{u1} v2 = (4, 0, 5, 8) - (28/14)(1, 2, 0, 3) = (2, -4, 5, 2)

u3 = v3 - proj_{u1} v3 - proj_{u2} v3 = (8, 1, 5, 6) - (28/14)(1, 2, 0, 3) - (49/49)(2, -4, 5, 2) = (4, 1, 0, -2)

The set {(1, 2, 0, 3), (2, -4, 5, 2), (4, 1, 0, -2)} is an orthogonal basis for V. Normalizing each vector (‖u1‖ = √14, ‖u2‖ = 7, ‖u3‖ = √21) gives an orthonormal basis for V:

{ (1/√14)(1, 2, 0, 3), (1/7)(2, -4, 5, 2), (1/√21)(4, 1, 0, -2) }
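A sketch of the whole process in Python/NumPy, run on the vectors of Example 4 (the function name gram_schmidt is our own):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of the given basis vectors."""
    ortho = []
    for v in np.array(vectors, dtype=float):
        # Subtract the projection of v onto each previous u (Theorem 4.22).
        for u in ortho:
            v = v - (v @ u) / (u @ u) * u
        ortho.append(v)
    # Normalize to turn the orthogonal set into an orthonormal one.
    return [u / np.linalg.norm(u) for u in ortho]

basis = gram_schmidt([(1, 2, 0, 3), (4, 0, 5, 8), (8, 1, 5, 6)])
for u in basis:
    print(u)
# The three printed vectors are unit multiples of (1, 2, 0, 3),
# (2, -4, 5, 2), and (4, 1, 0, -2), matching Example 4.
```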