6.5 Orthogonality and Least Squares: LEAST-SQUARES PROBLEMS © 2016 Pearson Education, Ltd.

Presentation transcript:

Slide: LEAST-SQUARES PROBLEMS
 Solution of the General Least-Squares Problem
 Given A and b, apply the Best Approximation Theorem to the subspace Col A.
 Let b̂ = proj_Col A b, the orthogonal projection of b onto Col A.

Slide: SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
 Since each aⱼᵀ is a row of Aᵀ, Aᵀ(b − Ax̂) = 0. (2)
 Thus AᵀAx̂ = Aᵀb.
 These calculations show that each least-squares solution of Ax = b satisfies the equation AᵀAx = Aᵀb. (3)
 The matrix equation (3) represents a system of equations called the normal equations for Ax = b.
 A solution of (3) is often denoted by x̂.
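The normal equations are easy to form and solve numerically. A minimal NumPy sketch, using made-up data (not the Example 1 system from these slides):

```python
import numpy as np

# Made-up inconsistent system: three equations, two unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations (3):  A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)  # least-squares solution (5, -3)
```

Here `np.linalg.solve` solves the small square system AᵀAx = Aᵀb directly, without forming an inverse.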

Slide: SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
 Theorem 13: The set of least-squares solutions of Ax = b coincides with the nonempty set of solutions of the normal equations AᵀAx = Aᵀb.
 Proof: The set of least-squares solutions is nonempty, and each least-squares solution satisfies the normal equations.
 Conversely, suppose x̂ satisfies AᵀAx̂ = Aᵀb.
 Then x̂ satisfies (2), which shows that b − Ax̂ is orthogonal to the rows of Aᵀ and hence is orthogonal to the columns of A.
 Since the columns of A span Col A, the vector b − Ax̂ is orthogonal to all of Col A.
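Theorem 13's conclusion can be checked numerically: the residual b − Ax̂ of a least-squares solution should be orthogonal to every column of A. A sketch with randomly generated data (purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))   # random 6x3 matrix, full column rank
b = rng.standard_normal(6)

# Least-squares solution of Ax = b.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

# The normal equations say A^T (b - A x_hat) = 0.
residual = b - A @ x_hat
print(np.allclose(A.T @ residual, 0.0))  # True
```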

Slide: SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
 Example 1: Find a least-squares solution of the inconsistent system Ax = b for the given A and b.
 Solution: To use the normal equations (3), compute AᵀA and Aᵀb:

Slide: SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
 Then the equation AᵀAx = Aᵀb becomes:

Slide: SOLUTION OF THE GENERAL LEAST-SQUARES PROBLEM
 Row operations can be used to solve the system on the previous slide, but since AᵀA is invertible, it is probably faster to compute (AᵀA)⁻¹ and then solve AᵀAx = Aᵀb as x̂ = (AᵀA)⁻¹Aᵀb.
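A sketch of the inverse formula x̂ = (AᵀA)⁻¹Aᵀb, using a small illustrative line-fitting system (not the slides' example):

```python
import numpy as np

# Fit y = c0 + c1*x to the points (1,1), (2,2), (3,2): an inconsistent system.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# x_hat = (A^T A)^(-1) A^T b; valid because the columns of A are
# linearly independent, so A^T A is invertible.
x_hat = np.linalg.inv(A.T @ A) @ (A.T @ b)
print(x_hat)  # intercept 2/3, slope 1/2
```

In floating-point practice, solving AᵀAx = Aᵀb directly (or using QR) is preferred to forming the inverse, but for a small invertible AᵀA the formula above matches the slide.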

Slide: ALTERNATIVE CALCULATIONS OF LEAST-SQUARES SOLUTIONS
 Now that b̂ is known, we can solve Ax = b̂.
 But this is trivial, since we already know what weights to place on the columns of A to produce b̂.
 It is clear from (5) that x̂ consists of exactly those weights.

Slide: ALTERNATIVE CALCULATIONS OF LEAST-SQUARES SOLUTIONS
 The columns of Q form an orthonormal basis for Col A (by Theorem 12).
 Hence, by Theorem 10, QQᵀb is the orthogonal projection b̂ of b onto Col A.
 Then, with x̂ = R⁻¹Qᵀb, we have Ax̂ = QR(R⁻¹Qᵀb) = QQᵀb = b̂, which shows that x̂ is a least-squares solution of Ax = b.
 The uniqueness of x̂ follows from Theorem 14.
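The QR route on this last slide can be sketched as follows, again with illustrative data; `np.linalg.qr` returns the reduced factorization A = QR with orthonormal columns in Q and an invertible upper triangular R:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

Q, R = np.linalg.qr(A)               # reduced QR: Q is 3x2, R is 2x2 upper triangular
x_hat = np.linalg.solve(R, Q.T @ b)  # x_hat = R^(-1) Q^T b, via back-substitution

# Same least-squares solution as the normal equations give.
print(np.allclose(x_hat, np.linalg.solve(A.T @ A, A.T @ b)))  # True
```

Solving Rx = Qᵀb by back-substitution avoids forming AᵀA, which is why the QR route is the numerically preferred alternative.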