Linear Algebra Lecture 41.


Segment VI Orthogonality and Least Squares

Gram-Schmidt Process

Example 1
Let W = Span {x1, x2}, where x1 and x2 are as given. Construct an orthogonal basis {v1, v2} for W.

Example 2
Suppose the vectors x1, x2, x3 are as given. Then {x1, x2, x3} is linearly independent and thus a basis for a subspace W of R4. Construct an orthogonal basis for W.

Theorem (The Gram-Schmidt Process)
Given a basis {x1, …, xp} for a subspace W of Rn, define

v1 = x1
v2 = x2 − [(x2·v1)/(v1·v1)] v1
v3 = x3 − [(x3·v1)/(v1·v1)] v1 − [(x3·v2)/(v2·v2)] v2
…
vp = xp − [(xp·v1)/(v1·v1)] v1 − … − [(xp·vp−1)/(vp−1·vp−1)] vp−1

Then {v1, …, vp} is an orthogonal basis for W. In addition,
Span {v1, …, vk} = Span {x1, …, xk} for 1 ≤ k ≤ p.
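The process above translates directly into code. A minimal NumPy sketch (the vectors in X are illustrative, not the lecture's examples):

```python
import numpy as np

def gram_schmidt(X):
    """Orthogonalize the columns of X (assumed linearly independent).

    Returns V with orthogonal (not yet normalized) columns such that
    span(V[:, :k]) == span(X[:, :k]) for every k.
    """
    X = np.asarray(X, dtype=float)
    V = np.zeros_like(X)
    for j in range(X.shape[1]):
        v = X[:, j].copy()
        # Subtract the projection of x_j onto each previously built v_i.
        for i in range(j):
            vi = V[:, i]
            v -= (X[:, j] @ vi) / (vi @ vi) * vi
        V[:, j] = v
    return V

# Illustrative basis vectors as columns:
X = np.array([[3.0, 1.0],
              [6.0, 2.0],
              [0.0, 2.0]])
V = gram_schmidt(X)
print(V[:, 0] @ V[:, 1])  # ≈ 0: the columns are orthogonal
```

Note that v1 is taken unchanged, so the first column of V always equals the first column of X, matching the theorem's v1 = x1.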

Orthonormal Bases

Example 3

Theorem (The QR Factorization)
If A is an m x n matrix with linearly independent columns, then A can be factored as A = QR, where Q is an m x n matrix whose columns form an orthonormal basis for Col A, and R is an n x n upper triangular invertible matrix with positive entries on its diagonal.
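In practice the factorization can be computed with a library routine. A hedged NumPy sketch (the matrix A is illustrative; the sign fix at the end enforces the theorem's positive-diagonal convention, which np.linalg.qr does not guarantee):

```python
import numpy as np

# Illustrative 3x2 matrix with linearly independent columns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])

# Reduced QR: Q is 3x2 with orthonormal columns, R is 2x2 upper triangular.
Q, R = np.linalg.qr(A)

# Flip signs column-by-column so R has positive diagonal entries,
# matching the theorem; Q @ R is unchanged since the signs cancel.
signs = np.sign(np.diag(R))
Q, R = Q * signs, signs[:, None] * R

print(np.allclose(Q @ R, A))             # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: columns of Q are orthonormal
```

The columns of Q here are exactly what Gram-Schmidt followed by normalization would produce from the columns of A.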

Proof of the Theorem

Example 4
Find a QR factorization of the given matrix A.

Example 5
Let W = Span {x1, x2}, where x1 and x2 are as given. Construct an orthonormal basis for W.
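Once an orthogonal basis is in hand, an orthonormal one follows by dividing each vector by its length. A small sketch with illustrative vectors:

```python
import numpy as np

# An orthogonal basis for an illustrative plane W in R^3.
v1 = np.array([3.0, 6.0, 0.0])
v2 = np.array([0.0, 0.0, 2.0])

# Normalize: divide each basis vector by its norm.
u1 = v1 / np.linalg.norm(v1)
u2 = v2 / np.linalg.norm(v2)

print(u1 @ u1, u2 @ u2)  # ≈ 1 and ≈ 1: unit vectors
print(u1 @ u2)           # ≈ 0: still orthogonal
```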


Theorem (The Orthogonal Decomposition Theorem)
Let W be a subspace of Rn. Then each y in Rn can be written uniquely in the form y = ŷ + z, where ŷ is in W and z is in W⊥.

Continued
In fact, if {u1, …, up} is any orthogonal basis of W, then

ŷ = [(y·u1)/(u1·u1)] u1 + … + [(y·up)/(up·up)] up

and z = y − ŷ. The vector ŷ is called the orthogonal projection of y onto W and is often written as projW y.
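The decomposition can be checked numerically. A small sketch (the orthogonal basis u1, u2 and the vector y are illustrative):

```python
import numpy as np

# Illustrative orthogonal basis for a plane W in R^3.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y = np.array([2.0, 3.0, 5.0])

# Orthogonal projection of y onto W = Span{u1, u2}.
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat

print(y_hat)            # [2. 3. 0.] — the component of y in W
print(z @ u1, z @ u2)   # 0.0 0.0 — z lies in the orthogonal complement of W
```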

Theorem (The Best Approximation Theorem)
Let W be a subspace of Rn, y any vector in Rn, and ŷ the orthogonal projection of y onto W. Then ŷ is the closest point in W to y, in the sense that

||y − ŷ|| < ||y − v||

for all v in W distinct from ŷ.

Continued
The vector ŷ in this theorem is called the best approximation to y by elements of W.
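A quick numerical sanity check of this claim (illustrative data; the projection is computed via least squares, and ||y − ŷ|| is compared against random points of W):

```python
import numpy as np

rng = np.random.default_rng(0)

# W = Col B for an illustrative 3x2 matrix B; y is a point off that plane.
B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
y = np.array([1.0, 2.0, 7.0])

# The least-squares solution of Bc = y gives the projection of y onto Col B.
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
y_hat = B @ coef

# Every other point v = Bc of W should be at least as far from y as y_hat is.
best = np.linalg.norm(y - y_hat)
for _ in range(100):
    v = B @ rng.normal(size=2)
    assert best <= np.linalg.norm(y - v) + 1e-12

print(round(best, 6))  # 1.666667
```

This connects the theorem to the least-squares problems of the next lecture: minimizing ||y − Bc|| over all c is exactly finding the best approximation to y in Col B.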

Theorem
