Orthogonality and Least Squares

THE GRAM-SCHMIDT PROCESS (from Orthogonality and Least Squares, © 2016 Pearson Education, Inc.)

THE GRAM-SCHMIDT PROCESS

Theorem 11: The Gram-Schmidt Process
Given a basis $\{x_1, \dots, x_p\}$ for a nonzero subspace $W$ of $\mathbb{R}^n$, define

$$v_1 = x_1$$
$$v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\, v_1$$
$$v_3 = x_3 - \frac{x_3 \cdot v_1}{v_1 \cdot v_1}\, v_1 - \frac{x_3 \cdot v_2}{v_2 \cdot v_2}\, v_2$$
$$\vdots$$
$$v_p = x_p - \frac{x_p \cdot v_1}{v_1 \cdot v_1}\, v_1 - \frac{x_p \cdot v_2}{v_2 \cdot v_2}\, v_2 - \dots - \frac{x_p \cdot v_{p-1}}{v_{p-1} \cdot v_{p-1}}\, v_{p-1}$$

Then $\{v_1, \dots, v_p\}$ is an orthogonal basis for $W$. In addition,

$$\mathrm{Span}\{v_1, \dots, v_k\} = \mathrm{Span}\{x_1, \dots, x_k\} \quad \text{for } 1 \le k \le p. \tag{1}$$
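A minimal NumPy sketch of Theorem 11, not from the slides: the function name gram_schmidt and the sample input are illustrative. The input columns below are hypothetical vectors chosen so the output reproduces the orthogonal basis shown in Example 3; the actual x-vectors of Example 1 do not appear in this transcript.

```python
import numpy as np

def gram_schmidt(X):
    """Columns of X form the basis {x1, ..., xp}; returns a matrix whose
    columns {v1, ..., vp} are the orthogonal basis of Theorem 11."""
    V = []
    for x in X.T:                        # iterate over the columns of X
        v = x.astype(float)              # start from x_k
        for w in V:
            v -= (x @ w) / (w @ w) * w   # subtract projection of x_k onto each earlier v_j
        V.append(v)
    return np.column_stack(V)

# Hypothetical basis whose Gram-Schmidt output matches Example 3 below.
X = np.array([[3.0, 3.0],
              [6.0, 6.0],
              [0.0, 2.0]])
V = gram_schmidt(X)
print(V)                  # columns (3, 6, 0) and (0, 0, 2)
print(V[:, 0] @ V[:, 1])  # 0.0: the columns are orthogonal
```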

THE GRAM-SCHMIDT PROCESS

Proof. For $1 \le k \le p$, let $W_k = \mathrm{Span}\{x_1, \dots, x_k\}$. Set $v_1 = x_1$, so that $\mathrm{Span}\{v_1\} = \mathrm{Span}\{x_1\}$. Suppose, for some $k < p$, we have constructed $v_1, \dots, v_k$ so that $\{v_1, \dots, v_k\}$ is an orthogonal basis for $W_k$. Define

$$v_{k+1} = x_{k+1} - \mathrm{proj}_{W_k} x_{k+1} \tag{2}$$

By the Orthogonal Decomposition Theorem, $v_{k+1}$ is orthogonal to $W_k$. Furthermore, $v_{k+1} \ne 0$ because $x_{k+1}$ is not in $W_k = \mathrm{Span}\{x_1, \dots, x_k\}$. Hence $\{v_1, \dots, v_{k+1}\}$ is an orthogonal set of nonzero vectors in the $(k+1)$-dimensional space $W_{k+1}$. By the Basis Theorem in Section 4.5, this set is an orthogonal basis for $W_{k+1}$, so $W_{k+1} = \mathrm{Span}\{v_1, \dots, v_{k+1}\}$. When $k + 1 = p$, the process stops.
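A small numeric check of the inductive step (2), with made-up vectors: the residual after removing the projection onto $W_k$ is orthogonal to every $v_j$ already constructed. It reuses the gram_schmidt sketch above.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))                      # columns span a 3-dimensional W_3
V = gram_schmidt(X)                                  # orthogonal basis of W_3 (sketch above)
x_new = rng.standard_normal(5)                       # almost surely not in W_3
proj = sum((x_new @ v) / (v @ v) * v for v in V.T)   # proj_{W_3} x_new via the orthogonal basis
v_new = x_new - proj                                 # the step (2)
print(np.round(V.T @ v_new, 10))                     # ~ [0 0 0]: v_new is orthogonal to W_3
```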

ORTHONORMAL BASES

Example 3. Example 1 constructed the orthogonal basis

$$v_1 = \begin{bmatrix} 3 \\ 6 \\ 0 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} 0 \\ 0 \\ 2 \end{bmatrix}$$

An orthonormal basis is

$$u_1 = \frac{1}{\|v_1\|}\, v_1 = \frac{1}{\sqrt{45}} \begin{bmatrix} 3 \\ 6 \\ 0 \end{bmatrix} = \begin{bmatrix} 1/\sqrt{5} \\ 2/\sqrt{5} \\ 0 \end{bmatrix}, \qquad u_2 = \frac{1}{\|v_2\|}\, v_2 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$$
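Reproducing Example 3 numerically is a one-line normalization per vector; the snippet below is a sketch using the vectors shown above.

```python
import numpy as np

v1 = np.array([3.0, 6.0, 0.0])
v2 = np.array([0.0, 0.0, 2.0])
u1 = v1 / np.linalg.norm(v1)   # (1/sqrt(45)) * v1 = (1/sqrt(5), 2/sqrt(5), 0)
u2 = v2 / np.linalg.norm(v2)   # (0, 0, 1)
print(u1, u2)                  # [0.4472 0.8944 0.] [0. 0. 1.]
```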

QR FACTORIZATION OF MATRICES

Theorem 12: The QR Factorization
If $A$ is an $m \times n$ matrix with linearly independent columns, then $A$ can be factored as $A = QR$, where $Q$ is an $m \times n$ matrix whose columns form an orthonormal basis for $\mathrm{Col}\,A$ and $R$ is an $n \times n$ upper triangular invertible matrix with positive entries on its diagonal.

Proof. The columns of $A$ form a basis $\{x_1, \dots, x_n\}$ for $\mathrm{Col}\,A$. Construct an orthonormal basis $\{u_1, \dots, u_n\}$ for $W = \mathrm{Col}\,A$ with property (1) in Theorem 11. This basis may be constructed by the Gram-Schmidt process or some other means.

QR FACTORIZATION OF MATRICES

Let

$$Q = [\, u_1 \;\; u_2 \;\; \cdots \;\; u_n \,]$$

For $k = 1, \dots, n$, $x_k$ is in $\mathrm{Span}\{x_1, \dots, x_k\} = \mathrm{Span}\{u_1, \dots, u_k\}$. So there are constants $r_{1k}, \dots, r_{kk}$ such that

$$x_k = r_{1k} u_1 + \dots + r_{kk} u_k + 0 \cdot u_{k+1} + \dots + 0 \cdot u_n$$

We may assume that $r_{kk} \ge 0$ (if $r_{kk} < 0$, multiply both $r_{kk}$ and $u_k$ by $-1$). This shows that $x_k$ is a linear combination of the columns of $Q$ using as weights the entries in the vector

QR FACTORIZATION OF MATRICES

$$r_k = \begin{bmatrix} r_{1k} \\ \vdots \\ r_{kk} \\ 0 \\ \vdots \\ 0 \end{bmatrix}$$

That is, $x_k = Q r_k$ for $k = 1, \dots, n$. Let $R = [\, r_1 \;\; \cdots \;\; r_n \,]$. Then

$$A = [\, x_1 \;\; \cdots \;\; x_n \,] = [\, Q r_1 \;\; \cdots \;\; Q r_n \,] = QR$$

The fact that $R$ is invertible follows easily from the fact that the columns of $A$ are linearly independent. Since $R$ is clearly upper triangular, its nonnegative diagonal entries must be positive.
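A sketch assembling $Q$ and $R$ exactly as in this proof, assuming the gram_schmidt helper from the earlier snippet; the name qr_from_theorem12 is illustrative, not from the text. When the orthonormal basis comes from Gram-Schmidt, $r_{kk} = \|v_k\| > 0$ automatically, so no sign flip is needed.

```python
import numpy as np

def qr_from_theorem12(A):
    """Q from the orthonormalized columns of A; R from the weights r_{jk} = u_j . x_k."""
    V = gram_schmidt(A)                    # orthogonal columns v_1, ..., v_n (sketch above)
    Q = V / np.linalg.norm(V, axis=0)      # normalize each column: u_k = v_k / ||v_k||
    n = A.shape[1]
    R = np.zeros((n, n))
    for k in range(n):
        for j in range(k + 1):             # entries below the diagonal stay 0
            R[j, k] = Q[:, j] @ A[:, k]    # the coefficient r_{jk} in x_k = sum_j r_{jk} u_j
    return Q, R
```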

QR FACTORIZATION OF MATRICES

Example 4. Find a QR factorization of

$$A = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix}$$

Solution. The columns of $A$ are the vectors $x_1$, $x_2$, and $x_3$ in Example 2. An orthogonal basis for $\mathrm{Col}\,A = \mathrm{Span}\{x_1, x_2, x_3\}$ was found in that example:

$$v_1 = \begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} -3 \\ 1 \\ 1 \\ 1 \end{bmatrix}, \qquad v_3 = \begin{bmatrix} 0 \\ -2/3 \\ 1/3 \\ 1/3 \end{bmatrix}$$

QR FACTORIZATION OF MATRICES

To simplify the arithmetic that follows, scale $v_3$ by letting $v_3' = 3v_3$. Then normalize the three vectors to obtain $u_1$, $u_2$, and $u_3$, and use these vectors as the columns of $Q$:

$$Q = \begin{bmatrix} 1/2 & -3/\sqrt{12} & 0 \\ 1/2 & 1/\sqrt{12} & -2/\sqrt{6} \\ 1/2 & 1/\sqrt{12} & 1/\sqrt{6} \\ 1/2 & 1/\sqrt{12} & 1/\sqrt{6} \end{bmatrix}$$

By construction, the first $k$ columns of $Q$ are an orthonormal basis of $\mathrm{Span}\{x_1, \dots, x_k\}$.
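A quick numeric check of this $Q$, using the vectors shown above (rescaling $v_3$ does not change $u_3$, since normalization divides out the factor of 3):

```python
import numpy as np

v1 = np.array([1.0, 1.0, 1.0, 1.0])
v2 = np.array([-3.0, 1.0, 1.0, 1.0])
v3p = np.array([0.0, -2.0, 1.0, 1.0])   # v3' = 3 * v3, to clear the fractions
Q = np.column_stack([v / np.linalg.norm(v) for v in (v1, v2, v3p)])
print(np.round(Q, 6))                   # matches the matrix Q above
print(np.round(Q.T @ Q, 12))            # identity: the columns are orthonormal
```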

QR FACTORIZATION OF MATRICES

From the proof of Theorem 12, $A = QR$ for some $R$. To find $R$, observe that $Q^T Q = I$, because the columns of $Q$ are orthonormal. Hence

$$Q^T A = Q^T (QR) = IR = R$$

and

$$R = \begin{bmatrix} 1/2 & 1/2 & 1/2 & 1/2 \\ -3/\sqrt{12} & 1/\sqrt{12} & 1/\sqrt{12} & 1/\sqrt{12} \\ 0 & -2/\sqrt{6} & 1/\sqrt{6} & 1/\sqrt{6} \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix} = \begin{bmatrix} 2 & 3/2 & 1 \\ 0 & 3/\sqrt{12} & 2/\sqrt{12} \\ 0 & 0 & 2/\sqrt{6} \end{bmatrix}$$
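Completing Example 4 numerically, with Q from the previous snippet: compute $R = Q^T A$ and confirm $A = QR$.

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
R = Q.T @ A                    # Q from the previous snippet
print(np.round(R, 6))          # [[2, 1.5, 1], [0, 3/sqrt(12), 2/sqrt(12)], [0, 0, 2/sqrt(6)]]
print(np.allclose(Q @ R, A))   # True: the factorization reproduces A
```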