Orthogonality and Least Squares


1 Orthogonality and Least Squares
THE GRAM-SCHMIDT PROCESS
© 2016 Pearson Education, Inc.

2 THE GRAM-SCHMIDT PROCESS
Theorem 11: The Gram-Schmidt Process
Given a basis {x1, …, xp} for a nonzero subspace W of ℝⁿ, define

  v1 = x1
  v2 = x2 − (x2·v1)/(v1·v1) v1
  v3 = x3 − (x3·v1)/(v1·v1) v1 − (x3·v2)/(v2·v2) v2
  ⋮
  vp = xp − (xp·v1)/(v1·v1) v1 − (xp·v2)/(v2·v2) v2 − ⋯ − (xp·vp−1)/(vp−1·vp−1) vp−1

Then {v1, …, vp} is an orthogonal basis for W. In addition,

  Span{v1, …, vk} = Span{x1, …, xk} for 1 ≤ k ≤ p   (1)
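The formulas of Theorem 11 translate directly into code. A minimal sketch in plain Python (the sample vectors x1 = (3, 6, 0) and x2 = (1, 2, 2) are assumptions for demonstration, not taken from this slide):

```python
# Gram-Schmidt as in Theorem 11 (an illustrative sketch; vectors are plain lists).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(xs):
    """Return an orthogonal basis {v1, ..., vp} with
    Span{v1, ..., vk} = Span{x1, ..., xk} for each k."""
    vs = []
    for x in xs:
        v = list(x)
        # subtract the component of x along each earlier v
        for w in vs:
            c = dot(x, w) / dot(w, w)
            v = [vi - c * wi for vi, wi in zip(v, w)]
        vs.append(v)
    return vs

# Assumed sample input: x1 = (3, 6, 0), x2 = (1, 2, 2)
v1, v2 = gram_schmidt([[3, 6, 0], [1, 2, 2]])
print(v1, v2)  # v2 comes out as (0, 0, 2), orthogonal to v1
```

Each vk is xk minus its projections onto the earlier vi, exactly as in the displayed formulas.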

3 THE GRAM-SCHMIDT PROCESS
Proof For 1 ≤ k ≤ p, let Wk = Span{x1, …, xk}. Set v1 = x1, so that Span{v1} = Span{x1}. Suppose, for some k < p, we have constructed v1, …, vk so that {v1, …, vk} is an orthogonal basis for Wk. Define

  vk+1 = xk+1 − proj_Wk xk+1   (2)

By the Orthogonal Decomposition Theorem, vk+1 is orthogonal to Wk. Furthermore, vk+1 ≠ 0 because xk+1 is not in Wk = Span{x1, …, xk}. Hence {v1, …, vk+1} is an orthogonal set of nonzero vectors in the (k + 1)-dimensional space Wk+1. By the Basis Theorem in Section 4.5, this set is an orthogonal basis for Wk+1. Hence Wk+1 = Span{v1, …, vk+1}. When k + 1 = p, the process stops.
© 2016 Pearson Education, Inc.
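The inductive step can be checked numerically: subtract the orthogonal projection onto Wk and confirm the residual is orthogonal to Wk. A sketch (the vectors here are illustrative assumptions):

```python
# The proof's inductive step: v_{k+1} = x_{k+1} - proj_{Wk} x_{k+1}
# (an illustrative sketch; `basis` must already be orthogonal).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(x, basis):
    """Orthogonal projection of x onto Span(basis)."""
    p = [0.0] * len(x)
    for v in basis:
        c = dot(x, v) / dot(v, v)
        p = [pi + c * vi for pi, vi in zip(p, v)]
    return p

# Assumed vectors: the residual of x2 after projecting onto W1 = Span{v1}
# is orthogonal to W1, as the Orthogonal Decomposition Theorem guarantees.
v1 = [3, 6, 0]
x2 = [1, 2, 2]
v2 = [a - b for a, b in zip(x2, proj(x2, [v1]))]
print(dot(v2, v1))  # 0 (up to rounding)
```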

4 ORTHONORMAL BASES
Example 3 Example 1 constructed the orthogonal basis

  v1 = (3, 6, 0),   v2 = (0, 0, 2)

An orthonormal basis is

  u1 = (1/‖v1‖) v1 = (1/√5, 2/√5, 0)
  u2 = (1/‖v2‖) v2 = (0, 0, 1)

© 2016 Pearson Education, Inc.
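Normalizing each vector by its length gives the orthonormal basis. A quick sketch (the vectors (3, 6, 0) and (0, 0, 2) are assumed from Example 1, which is not shown in this excerpt):

```python
import math

# Normalize an orthogonal basis into an orthonormal one (a sketch).

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))  # ||v||
    return [c / length for c in v]

u1 = normalize([3, 6, 0])   # (1/sqrt(5), 2/sqrt(5), 0)
u2 = normalize([0, 0, 2])   # (0, 0, 1)
print(u1, u2)
```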

5 QR FACTORIZATION OF MATRICES
Theorem 12: The QR Factorization If A is an m × n matrix with linearly independent columns, then A can be factored as A = QR, where Q is an m × n matrix whose columns form an orthonormal basis for Col A and R is an n × n upper triangular invertible matrix with positive entries on its diagonal.
Proof The columns of A form a basis {x1, …, xn} for Col A. Construct an orthonormal basis {u1, …, un} for W = Col A with property (1) in Theorem 11. This basis may be constructed by the Gram-Schmidt process or some other means.
© 2016 Pearson Education, Inc.

6 QR FACTORIZATION OF MATRICES
Let Q = [u1 u2 ⋯ un]
For k = 1, …, n, xk is in Span{x1, …, xk} = Span{u1, …, uk}. So there are constants r1k, …, rkk such that

  xk = r1k u1 + … + rkk uk + 0·uk+1 + … + 0·un

We may assume that rkk ≥ 0. (If rkk < 0, multiply both rkk and uk by −1.) This shows that xk is a linear combination of the columns of Q using as weights the entries in the vector
© 2016 Pearson Education, Inc.

7 QR FACTORIZATION OF MATRICES
π‘Ÿπ‘˜= π‘Ÿ1π‘˜ π‘Ÿπ‘˜π‘˜ That is, xk = Qrk for k = 1, , n. Let R = [r rn]. Then A = [x xn] = [Qr Qrn] = QR The fact that R is invertible follows easily from the fact that the columns of A are linearly independent. Since R is clearly upper triangular, its nonnegative diagonal entries must be positive. Β© 2016 Pearson Education, Inc.

8 QR FACTORIZATION OF MATRICES
Example 4 Find a QR factorization of

  A = [ 1 0 0 ]
      [ 1 1 0 ]
      [ 1 1 1 ]
      [ 1 1 1 ]

Solution The columns of A are the vectors x1, x2, and x3 in Example 2. An orthogonal basis for Col A = Span{x1, x2, x3} was found in that example:

  v1 = (1, 1, 1, 1),   v2 = (−3, 1, 1, 1),   v3 = (0, −2/3, 1/3, 1/3)

© 2016 Pearson Education, Inc.

9 QR FACTORIZATION OF MATRICES
To simplify the arithmetic that follows, scale v3 by letting v3′ = 3v3 = (0, −2, 1, 1). Then normalize the three vectors to obtain u1, u2, and u3, and use these vectors as the columns of Q:

  Q = [ 1/2   −3/√12    0     ]
      [ 1/2    1/√12   −2/√6  ]
      [ 1/2    1/√12    1/√6  ]
      [ 1/2    1/√12    1/√6  ]

By construction, the first k columns of Q are an orthonormal basis of Span{x1, …, xk}.
© 2016 Pearson Education, Inc.
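A quick numerical check, not part of the text, that the columns of Q above are indeed orthonormal:

```python
import math

# Verify Q^T Q = I for the Example 4 matrix Q (an illustrative check).

cols = [[0.5, 0.5, 0.5, 0.5],
        [c / math.sqrt(12) for c in [-3, 1, 1, 1]],
        [c / math.sqrt(6) for c in [0, -2, 1, 1]]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

for i, u in enumerate(cols):
    for j, v in enumerate(cols):
        expected = 1.0 if i == j else 0.0   # unit length, pairwise orthogonal
        assert abs(dot(u, v) - expected) < 1e-12, (i, j)
print("columns of Q are orthonormal")
```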

10 QR FACTORIZATION OF MATRICES
From the proof of Theorem 12, A = QR for some R. To find R, observe that QᵀQ = I, because the columns of Q are orthonormal. Hence

  QᵀA = Qᵀ(QR) = IR = R

and

  R = [ 1/2     1/2     1/2     1/2   ] [ 1 0 0 ]   [ 2   3/2     1     ]
      [ −3/√12  1/√12   1/√12   1/√12 ] [ 1 1 0 ] = [ 0   3/√12   2/√12 ]
      [ 0      −2/√6    1/√6    1/√6  ] [ 1 1 1 ]   [ 0   0       2/√6  ]
                                        [ 1 1 1 ]

© 2016 Pearson Education, Inc.
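The computation R = QᵀA, and the factorization A = QR itself, can be verified numerically. A sketch using the Example 4 matrices (A's entries are assumed from that example):

```python
import math

# Verify R = Q^T A and A = QR for Example 4 (an illustrative check).

A = [[1, 0, 0], [1, 1, 0], [1, 1, 1], [1, 1, 1]]
Q = [[0.5, -3 / math.sqrt(12), 0.0],
     [0.5, 1 / math.sqrt(12), -2 / math.sqrt(6)],
     [0.5, 1 / math.sqrt(12), 1 / math.sqrt(6)],
     [0.5, 1 / math.sqrt(12), 1 / math.sqrt(6)]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

QT = [list(row) for row in zip(*Q)]   # Q^T
R = matmul(QT, A)                     # R = Q^T A, since Q^T Q = I
QR = matmul(Q, R)
assert all(abs(QR[i][j] - A[i][j]) < 1e-12
           for i in range(4) for j in range(3))
print(R[0])  # first row of R: [2.0, 1.5, 1.0]
```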

