Orthogonality and Least Squares

Orthogonality and Least Squares: LEAST-SQUARES PROBLEMS
© 2012 Pearson Education, Inc.

LEAST-SQUARES PROBLEMS
Definition: If $A$ is $m \times n$ and $\mathbf{b}$ is in $\mathbb{R}^m$, a least-squares solution of $A\mathbf{x} = \mathbf{b}$ is an $\hat{\mathbf{x}}$ in $\mathbb{R}^n$ such that
$$\|\mathbf{b} - A\hat{\mathbf{x}}\| \le \|\mathbf{b} - A\mathbf{x}\|$$
for all $\mathbf{x}$ in $\mathbb{R}^n$. The most important aspect of the least-squares problem is that no matter what $\mathbf{x}$ we select, the vector $A\mathbf{x}$ will necessarily be in the column space $\operatorname{Col} A$. So we seek an $\hat{\mathbf{x}}$ that makes $A\hat{\mathbf{x}}$ the closest point in $\operatorname{Col} A$ to $\mathbf{b}$.
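As a quick numerical illustration of this definition (a minimal sketch with a small hypothetical $A$ and $\mathbf{b}$, not taken from the slides), the residual norm $\|\mathbf{b} - A\mathbf{x}\|$ at the least-squares solution is no larger than at any other choice of $\mathbf{x}$:

```python
import numpy as np

# A small overdetermined system (hypothetical data, for illustration only).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

# Least-squares solution: minimizes ||b - Ax|| over all x.
xhat, *_ = np.linalg.lstsq(A, b, rcond=None)

# No other x should give a smaller residual norm than xhat.
for x in [xhat, np.array([1.0, 1.0]), np.array([0.0, 0.0])]:
    print(x, np.linalg.norm(b - A @ x))
```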

LEAST-SQUARES PROBLEMS
Solution of the General Least-Squares Problem: Given $A$ and $\mathbf{b}$, apply the Best Approximation Theorem to the subspace $\operatorname{Col} A$. Let
$$\hat{\mathbf{b}} = \operatorname{proj}_{\operatorname{Col} A} \mathbf{b}.$$

SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
Because $\hat{\mathbf{b}}$ is in the column space of $A$, the equation $A\mathbf{x} = \hat{\mathbf{b}}$ is consistent, and there is an $\hat{\mathbf{x}}$ in $\mathbb{R}^n$ such that
$$A\hat{\mathbf{x}} = \hat{\mathbf{b}}. \quad (1)$$
Since $\hat{\mathbf{b}}$ is the closest point in $\operatorname{Col} A$ to $\mathbf{b}$, a vector $\hat{\mathbf{x}}$ is a least-squares solution of $A\mathbf{x} = \mathbf{b}$ if and only if $\hat{\mathbf{x}}$ satisfies (1). Such an $\hat{\mathbf{x}}$ in $\mathbb{R}^n$ is a list of weights that will build $\hat{\mathbf{b}}$ out of the columns of $A$.

SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
Suppose $\hat{\mathbf{x}}$ satisfies $A\hat{\mathbf{x}} = \hat{\mathbf{b}}$. By the Orthogonal Decomposition Theorem, the projection $\hat{\mathbf{b}}$ has the property that $\mathbf{b} - \hat{\mathbf{b}}$ is orthogonal to $\operatorname{Col} A$, so $\mathbf{b} - A\hat{\mathbf{x}}$ is orthogonal to each column of $A$. If $\mathbf{a}_j$ is any column of $A$, then $\mathbf{a}_j \cdot (\mathbf{b} - A\hat{\mathbf{x}}) = 0$, and $\mathbf{a}_j^T (\mathbf{b} - A\hat{\mathbf{x}}) = 0$.

SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
Since each $\mathbf{a}_j^T$ is a row of $A^T$,
$$A^T (\mathbf{b} - A\hat{\mathbf{x}}) = \mathbf{0}. \quad (2)$$
Thus
$$A^T \mathbf{b} - A^T A \hat{\mathbf{x}} = \mathbf{0}, \qquad A^T A \hat{\mathbf{x}} = A^T \mathbf{b}.$$
These calculations show that each least-squares solution of $A\mathbf{x} = \mathbf{b}$ satisfies the equation
$$A^T A \mathbf{x} = A^T \mathbf{b}. \quad (3)$$
The matrix equation (3) represents a system of equations called the normal equations for $A\mathbf{x} = \mathbf{b}$. A solution of (3) is often denoted by $\hat{\mathbf{x}}$.
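A minimal sketch of solving the normal equations (3) numerically, using a hypothetical overdetermined system (the data below is illustrative, not from the slides); NumPy's built-in least-squares routine should return the same answer:

```python
import numpy as np

# Hypothetical overdetermined system (3 equations, 2 unknowns).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Form and solve the normal equations (3): A^T A x = A^T b.
xhat = np.linalg.solve(A.T @ A, A.T @ b)

# np.linalg.lstsq minimizes ||b - Ax|| directly and should agree.
xref, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(xhat, xref)
print(xhat)  # [0.66666667 0.5]
```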

SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
Theorem 13: The set of least-squares solutions of $A\mathbf{x} = \mathbf{b}$ coincides with the nonempty set of solutions of the normal equations $A^T A \mathbf{x} = A^T \mathbf{b}$.
Proof: The set of least-squares solutions is nonempty, and as shown above, each least-squares solution satisfies the normal equations. Conversely, suppose $\hat{\mathbf{x}}$ satisfies $A^T A \hat{\mathbf{x}} = A^T \mathbf{b}$. Then $\hat{\mathbf{x}}$ satisfies (2), which shows that $\mathbf{b} - A\hat{\mathbf{x}}$ is orthogonal to the rows of $A^T$ and hence is orthogonal to the columns of $A$. Since the columns of $A$ span $\operatorname{Col} A$, the vector $\mathbf{b} - A\hat{\mathbf{x}}$ is orthogonal to all of $\operatorname{Col} A$.

SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
Hence the equation
$$\mathbf{b} = A\hat{\mathbf{x}} + (\mathbf{b} - A\hat{\mathbf{x}})$$
is a decomposition of $\mathbf{b}$ into the sum of a vector in $\operatorname{Col} A$ and a vector orthogonal to $\operatorname{Col} A$. By the uniqueness of the orthogonal decomposition, $A\hat{\mathbf{x}}$ must be the orthogonal projection of $\mathbf{b}$ onto $\operatorname{Col} A$. That is, $A\hat{\mathbf{x}} = \hat{\mathbf{b}}$, and $\hat{\mathbf{x}}$ is a least-squares solution.

SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
Example 1: Find a least-squares solution of the inconsistent system $A\mathbf{x} = \mathbf{b}$ for
$$A = \begin{bmatrix} 4 & 0 \\ 0 & 2 \\ 1 & 1 \end{bmatrix}, \qquad \mathbf{b} = \begin{bmatrix} 2 \\ 0 \\ 11 \end{bmatrix}.$$
Solution: To use the normal equations (3), compute:
$$A^T A = \begin{bmatrix} 4 & 0 & 1 \\ 0 & 2 & 1 \end{bmatrix} \begin{bmatrix} 4 & 0 \\ 0 & 2 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 17 & 1 \\ 1 & 5 \end{bmatrix},$$
$$A^T \mathbf{b} = \begin{bmatrix} 4 & 0 & 1 \\ 0 & 2 & 1 \end{bmatrix} \begin{bmatrix} 2 \\ 0 \\ 11 \end{bmatrix} = \begin{bmatrix} 19 \\ 11 \end{bmatrix}.$$

SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
Then the equation $A^T A \mathbf{x} = A^T \mathbf{b}$ becomes
$$\begin{bmatrix} 17 & 1 \\ 1 & 5 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 19 \\ 11 \end{bmatrix}.$$

SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
Row operations can be used to solve this system, but since $A^T A$ is invertible and $2 \times 2$, it is probably faster to compute
$$(A^T A)^{-1} = \frac{1}{84} \begin{bmatrix} 5 & -1 \\ -1 & 17 \end{bmatrix}$$
and then solve $A^T A \hat{\mathbf{x}} = A^T \mathbf{b}$ as
$$\hat{\mathbf{x}} = (A^T A)^{-1} A^T \mathbf{b} = \frac{1}{84} \begin{bmatrix} 5 & -1 \\ -1 & 17 \end{bmatrix} \begin{bmatrix} 19 \\ 11 \end{bmatrix} = \frac{1}{84} \begin{bmatrix} 84 \\ 168 \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \end{bmatrix}.$$
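A quick numerical check of Example 1, assuming the matrices reconstructed above:

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([2.0, 0.0, 11.0])

# Solve the normal equations A^T A x = A^T b directly.
xhat = np.linalg.solve(A.T @ A, A.T @ b)
print(xhat)  # expected: [1. 2.]
```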

SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
Theorem 14: Let $A$ be an $m \times n$ matrix. The following statements are logically equivalent:
a. The equation $A\mathbf{x} = \mathbf{b}$ has a unique least-squares solution for each $\mathbf{b}$ in $\mathbb{R}^m$.
b. The columns of $A$ are linearly independent.
c. The matrix $A^T A$ is invertible.
When these statements are true, the least-squares solution $\hat{\mathbf{x}}$ is given by
$$\hat{\mathbf{x}} = (A^T A)^{-1} A^T \mathbf{b}. \quad (4)$$
When a least-squares solution $\hat{\mathbf{x}}$ is used to produce $A\hat{\mathbf{x}}$ as an approximation to $\mathbf{b}$, the distance from $\mathbf{b}$ to $A\hat{\mathbf{x}}$ is called the least-squares error of this approximation.
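Continuing the numerical check of Example 1 (same reconstructed data as above), the least-squares error is the norm of the residual $\mathbf{b} - A\hat{\mathbf{x}}$:

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([2.0, 0.0, 11.0])
xhat = np.array([1.0, 2.0])  # least-squares solution from Example 1

# Least-squares error: distance from b to A x̂.
error = np.linalg.norm(b - A @ xhat)
print(error, np.sqrt(84))  # both ≈ 9.165
```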

ALTERNATIVE CALCULATIONS OF LEAST-SQUARES SOLUTIONS
Example 2: Find a least-squares solution of $A\mathbf{x} = \mathbf{b}$ for
$$A = \begin{bmatrix} 1 & -6 \\ 1 & -2 \\ 1 & 1 \\ 1 & 7 \end{bmatrix}, \qquad \mathbf{b} = \begin{bmatrix} -1 \\ 2 \\ 1 \\ 6 \end{bmatrix}.$$
Solution: Because the columns $\mathbf{a}_1$ and $\mathbf{a}_2$ of $A$ are orthogonal, the orthogonal projection of $\mathbf{b}$ onto $\operatorname{Col} A$ is given by
$$\hat{\mathbf{b}} = \frac{\mathbf{b} \cdot \mathbf{a}_1}{\mathbf{a}_1 \cdot \mathbf{a}_1}\,\mathbf{a}_1 + \frac{\mathbf{b} \cdot \mathbf{a}_2}{\mathbf{a}_2 \cdot \mathbf{a}_2}\,\mathbf{a}_2 = \frac{8}{4}\,\mathbf{a}_1 + \frac{45}{90}\,\mathbf{a}_2 = 2\mathbf{a}_1 + \tfrac{1}{2}\mathbf{a}_2. \quad (5)$$

ALTERNATIVE CALCULATIONS OF LEAST-SQUARES SOLUTIONS
Now that $\hat{\mathbf{b}}$ is known, we can solve $A\hat{\mathbf{x}} = \hat{\mathbf{b}}$. But this is trivial, since we already know the weights to place on the columns of $A$ to produce $\hat{\mathbf{b}}$. It is clear from (5) that
$$\hat{\mathbf{x}} = \begin{bmatrix} 2 \\ 1/2 \end{bmatrix}.$$
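A sketch of the orthogonal-column shortcut in (5), using the Example 2 data reconstructed above: when the columns are orthogonal, each weight decouples into a ratio of dot products, with no linear system to solve.

```python
import numpy as np

A = np.array([[1.0, -6.0],
              [1.0, -2.0],
              [1.0,  1.0],
              [1.0,  7.0]])
b = np.array([-1.0, 2.0, 1.0, 6.0])
a1, a2 = A[:, 0], A[:, 1]

# The columns are orthogonal, so the weights decouple as in (5).
assert np.isclose(a1 @ a2, 0.0)
xhat = np.array([(b @ a1) / (a1 @ a1),   # = 8/4  = 2
                 (b @ a2) / (a2 @ a2)])  # = 45/90 = 1/2
print(xhat)  # expected: [2.  0.5]
```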

ALTERNATIVE CALCULATIONS OF LEAST-SQUARES SOLUTIONS
Theorem 15: Given an $m \times n$ matrix $A$ with linearly independent columns, let $A = QR$ be a QR factorization of $A$. Then, for each $\mathbf{b}$ in $\mathbb{R}^m$, the equation $A\mathbf{x} = \mathbf{b}$ has a unique least-squares solution, given by
$$\hat{\mathbf{x}} = R^{-1} Q^T \mathbf{b}. \quad (6)$$
Proof: Let $\hat{\mathbf{x}} = R^{-1} Q^T \mathbf{b}$. Then
$$A\hat{\mathbf{x}} = QR\hat{\mathbf{x}} = QRR^{-1}Q^T\mathbf{b} = QQ^T\mathbf{b}.$$

ALTERNATIVE CALCULATIONS OF LEAST-SQUARES SOLUTIONS
The columns of $Q$ form an orthonormal basis for $\operatorname{Col} A$. Hence, by Theorem 10, $QQ^T\mathbf{b}$ is the orthogonal projection $\hat{\mathbf{b}}$ of $\mathbf{b}$ onto $\operatorname{Col} A$. Then $A\hat{\mathbf{x}} = \hat{\mathbf{b}}$, which shows that $\hat{\mathbf{x}}$ is a least-squares solution of $A\mathbf{x} = \mathbf{b}$. The uniqueness of $\hat{\mathbf{x}}$ follows from Theorem 14.
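A minimal sketch of formula (6) with NumPy's QR factorization, reusing the Example 2 data from above. In practice one solves the triangular system $R\mathbf{x} = Q^T\mathbf{b}$ rather than forming $R^{-1}$ explicitly, which is cheaper and numerically more stable.

```python
import numpy as np

A = np.array([[1.0, -6.0],
              [1.0, -2.0],
              [1.0,  1.0],
              [1.0,  7.0]])
b = np.array([-1.0, 2.0, 1.0, 6.0])

# Reduced QR factorization: Q has orthonormal columns, R is upper triangular.
Q, R = np.linalg.qr(A)

# Formula (6): x̂ = R^{-1} Q^T b, computed by solving R x = Q^T b.
xhat = np.linalg.solve(R, Q.T @ b)
print(xhat)  # matches the unique least-squares solution of Example 2: [2. 0.5]
```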