4.8 Rank

Rank enables one to relate matrices to vectors, and vice versa.

Definition. Let A be an m × n matrix. The rows of A may be viewed as row vectors r1, …, rm, and the columns as column vectors c1, …, cn. Each row vector has n components, and each column vector has m components. The row vectors span a subspace of Rn called the row space of A, and the column vectors span a subspace of Rm called the column space of A.
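A minimal sketch of the definition (the matrix here is illustrative, not one from the text): the rows of a 2 × 3 matrix are vectors in R3, and its columns are vectors in R2.

```python
# An illustrative 2x3 matrix: m = 2, n = 3.
A = [[1, 2, 3],
     [4, 5, 6]]

rows = A                            # row vectors r1, r2, each with n = 3 components
cols = [list(c) for c in zip(*A)]   # column vectors c1, c2, c3, each with m = 2 components

print(rows)   # [[1, 2, 3], [4, 5, 6]]
print(cols)   # [[1, 4], [2, 5], [3, 6]]
```

The row vectors span a subspace of R3 (the row space) and the column vectors span a subspace of R2 (the column space).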

Example 1 Consider the matrix (1) The row vectors of A are (2) The column vectors of A are

Theorem 4.16. The row space and the column space of a matrix A have the same dimension.

Definition. The common dimension of the row space and the column space of a matrix A is called the rank of A, denoted rank(A).

Example 2. Determine the rank of the matrix. Solution:

Theorem 4.17. The nonzero row vectors of a matrix A that is in reduced echelon form form a basis for the row space of A, and the rank of A is the number of nonzero row vectors.

Example 3 Find the rank of the matrix

Theorem 4.18. Let A and B be row equivalent matrices. Then (1) A and B have the same row space, and (2) rank(A) = rank(B).

Theorem 4.19. Let E be a reduced echelon form of a matrix A. The nonzero row vectors of E form a basis for the row space of A, and the rank of A is the number of nonzero row vectors in E.
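Theorems 4.16–4.19 can be sketched computationally. Below is a minimal pure-Python row reduction using exact `Fraction` arithmetic; the matrix and the helper names `rref` and `row_space_basis` are illustrative, not from the text. The nonzero rows of the reduced echelon form give a basis for the row space, counting them gives the rank, and repeating the computation on the transpose illustrates that the row and column spaces have the same dimension.

```python
from fractions import Fraction

def rref(M):
    """Reduce M (a list of rows) to reduced echelon form, using exact arithmetic."""
    R = [[Fraction(x) for x in row] for row in M]
    nrows, ncols = len(R), len(R[0])
    lead = 0
    for col in range(ncols):
        if lead == nrows:
            break
        # Find a row at or below `lead` with a nonzero entry in this column.
        pr = next((r for r in range(lead, nrows) if R[r][col] != 0), None)
        if pr is None:
            continue
        R[lead], R[pr] = R[pr], R[lead]
        piv = R[lead][col]
        R[lead] = [x / piv for x in R[lead]]        # scale so the pivot is 1
        for r in range(nrows):                      # clear the column elsewhere
            if r != lead and R[r][col] != 0:
                f = R[r][col]
                R[r] = [a - f * b for a, b in zip(R[r], R[lead])]
        lead += 1
    return R

def row_space_basis(M):
    """Theorem 4.19: the nonzero rows of a reduced echelon form of A are a basis for its row space."""
    return [row for row in rref(M) if any(x != 0 for x in row)]

A = [[1, 2, 3],
     [2, 4, 6],   # = 2 * row 1, so it contributes nothing new
     [0, 1, 1]]
basis = row_space_basis(A)
print(len(basis))                    # 2, so rank(A) = 2
At = [list(c) for c in zip(*A)]
print(len(row_space_basis(At)))      # also 2: same dimension for the column space (Thm 4.16)
```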

Example 4 Find a basis for the row space of the following matrix A, and determine its rank.

4.9 Orthonormal Vectors and Projections

Definition. A set of vectors in a vector space V is said to be an orthogonal set if every pair of vectors in the set is orthogonal. The set is said to be an orthonormal set if it is orthogonal and each vector is a unit vector.
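The definition translates directly into a check: every pair of distinct vectors must have dot product 0, and for orthonormality each vector must additionally have length 1. A minimal sketch (the helper name `is_orthonormal` and the sample vectors are illustrative):

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def is_orthonormal(vectors, tol=1e-12):
    """True if every vector is a unit vector and every distinct pair is orthogonal."""
    for i, u in enumerate(vectors):
        if not math.isclose(dot(u, u), 1.0, abs_tol=tol):
            return False
        for w in vectors[i + 1:]:
            if not math.isclose(dot(u, w), 0.0, abs_tol=tol):
                return False
    return True

s = 1 / math.sqrt(2)
print(is_orthonormal([(s, s), (s, -s)]))   # True
print(is_orthonormal([(1, 1), (1, -1)]))   # False: orthogonal, but not unit vectors
```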

Example 1 Show that the set is an orthonormal set. Solution

Theorem 4.20. An orthogonal set of nonzero vectors in a vector space is linearly independent.

Proof: Let {v1, …, vm} be an orthogonal set of nonzero vectors, and suppose c1v1 + … + cmvm = 0. Taking the dot product of both sides with vi, every term other than ci(vi · vi) vanishes by orthogonality, so ci(vi · vi) = 0. Since vi ≠ 0, we have vi · vi ≠ 0, and thus ci = 0 for each i. Hence the set is linearly independent.

Definition. A basis that is an orthogonal set is said to be an orthogonal basis. A basis that is an orthonormal set is said to be an orthonormal basis.

Standard bases: R2: {(1, 0), (0, 1)}. R3: {(1, 0, 0), (0, 1, 0), (0, 0, 1)}. Rn: {(1, 0, …, 0), (0, 1, …, 0), …, (0, 0, …, 1)}.

Theorem 4.21. Let {u1, …, un} be an orthonormal basis for a vector space V, and let v be a vector in V. Then v can be written as a linear combination of these basis vectors as follows: v = (v · u1)u1 + (v · u2)u2 + … + (v · un)un.
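Theorem 4.21 says the coordinates of v relative to an orthonormal basis are just dot products, with no system of equations to solve. A minimal sketch in R2, using an illustrative orthonormal basis (not the book's Example 2 vectors):

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

# An illustrative orthonormal basis of R^2.
s = 1 / math.sqrt(2)
u1, u2 = (s, s), (s, -s)

v = (3.0, 1.0)

# Theorem 4.21: the coefficients are the dot products v . u_i.
c1, c2 = dot(v, u1), dot(v, u2)

# Reconstruct v as c1*u1 + c2*u2 and confirm it matches.
w = tuple(c1 * a + c2 * b for a, b in zip(u1, u2))
print(c1, c2)   # 2*sqrt(2) and sqrt(2), approximately
print(w)        # approximately (3.0, 1.0)
```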

Example 2 The following vectors u1, u2, and u3 form an orthonormal basis for R3. Express the vector v = (7, -5, 10) as a linear combination of these vectors. Solution

Projection of One Vector onto Another Vector. Let v and u be vectors in Rn with angle α (0 ≤ α ≤ π) between them. Figure 4.17

Definition. The projection of a vector v onto a nonzero vector u in Rn is denoted proj_u v and is defined by proj_u v = ((v · u)/(u · u)) u. Figure 4.18

Example 3. Determine the projection of the vector v = (6, 7) onto the vector u = (1, 4). Solution: v · u = (6)(1) + (7)(4) = 34 and u · u = 1 + 16 = 17, so proj_u v = (34/17)u = 2(1, 4) = (2, 8).
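The projection formula is one line of code. A minimal sketch using the vectors of Example 3 (the helper name `proj` is illustrative):

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def proj(v, u):
    """Projection of v onto a nonzero u: ((v . u) / (u . u)) * u."""
    c = dot(v, u) / dot(u, u)
    return tuple(c * a for a in u)

v, u = (6, 7), (1, 4)
print(proj(v, u))   # (2.0, 8.0): v.u = 34, u.u = 17, so the scalar is 2
```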

Theorem 4.22 (The Gram-Schmidt Orthogonalization Process). Let {v1, …, vn} be a basis for a vector space V. The set of vectors {u1, …, un} defined as follows is orthogonal:
u1 = v1,
u2 = v2 − proj_u1 v2,
u3 = v3 − proj_u1 v3 − proj_u2 v3,
…,
un = vn − proj_u1 vn − … − proj_u(n−1) vn.
To obtain an orthonormal basis for V, normalize each of the vectors u1, …, un. Figure 4.19

Example 4. The set {(1, 2, 0, 3), (4, 0, 5, 8), (8, 1, 5, 6)} is linearly independent in R4. The vectors form a basis for a three-dimensional subspace V of R4. Construct an orthonormal basis for V. Solution: Applying Gram-Schmidt, u1 = (1, 2, 0, 3); u2 = (4, 0, 5, 8) − 2u1 = (2, −4, 5, 2); u3 = (8, 1, 5, 6) − 2u1 − u2 = (4, 1, 0, −2). Normalizing (‖u1‖ = √14, ‖u2‖ = 7, ‖u3‖ = √21) gives the orthonormal basis {(1/√14)(1, 2, 0, 3), (1/7)(2, −4, 5, 2), (1/√21)(4, 1, 0, −2)}.
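The steps of Theorem 4.22 can be sketched as a short classical Gram-Schmidt routine, run here on the basis of Example 4 (the helper name `gram_schmidt` is illustrative):

```python
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(vs):
    """Theorem 4.22: u1 = v1; u_i = v_i minus its projections onto u_1, ..., u_{i-1}."""
    us = []
    for v in vs:
        w = list(v)
        for u in us:
            c = dot(v, u) / dot(u, u)   # classical GS projects the ORIGINAL v_i onto each u_j
            w = [a - c * b for a, b in zip(w, u)]
        us.append(w)
    return us

vs = [(1, 2, 0, 3), (4, 0, 5, 8), (8, 1, 5, 6)]
orth = gram_schmidt(vs)
print(orth)   # [[1, 2, 0, 3], [2.0, -4.0, 5.0, 2.0], [4.0, 1.0, 0.0, -2.0]]

# Normalize each u_i to get an orthonormal basis for V.
onb = [tuple(x / math.sqrt(dot(u, u)) for x in u) for u in orth]
```

Note that classical Gram-Schmidt, as stated in the theorem, projects each original v_i onto the already-built u_j; a numerically sturdier variant (modified Gram-Schmidt) would project the partially reduced vector instead, but both agree in exact arithmetic.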