Linear Least Squares QR Factorization

Systems of linear equations
Problem to solve: M x = b. Given M x = b: Is there a solution? Is the solution unique?

Systems of linear equations
Find a set of weights x so that the weighted sum of the columns of the matrix M equals the right-hand side b:
x_1 m_1 + x_2 m_2 + … + x_N m_N = b

Systems of linear equations – Existence
A solution exists when b is in the span of the columns of M, i.e. if there exist weights x_1, …, x_N such that:
x_1 m_1 + … + x_N m_N = b
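
A quick numerical existence test (not on the original slides) follows from this span condition: M x = b is solvable exactly when appending b to M does not increase the rank. A minimal Matlab sketch, with hypothetical data:

    % Existence: M*x = b is solvable iff rank([M b]) == rank(M).
    M = [1 2; 2 4; 3 6];                % rank-1 matrix (dependent columns)
    b = [1; 2; 3];                      % b lies in the span of M's columns
    solvable = rank([M, b]) == rank(M)  % logical 1 (true)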

Systems of linear equations – Uniqueness
A solution is unique only if the columns of M are linearly independent. Suppose instead that there exist weights y_1, …, y_N, not all zero, such that y_1 m_1 + … + y_N m_N = 0, i.e. M y = 0. Then:
M x = b  ⇒  M x + M y = b  ⇒  M (x + y) = b
so x + y is a second, distinct solution.
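
A tiny numerical illustration of this non-uniqueness argument, again with hypothetical data:

    % Non-uniqueness: y is a nonzero null-space vector of M.
    M = [1 2; 2 4];                     % dependent columns
    b = [3; 6];
    x = [3; 0];                         % one solution: M*x = b
    y = [-2; 1];                        % M*y = 0, y not zero
    [M*x, M*(x + y)]                    % both columns equal b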

QR factorization 1
A matrix Q is said to be orthogonal if its columns are orthonormal, i.e. Q^T·Q = I. Orthogonal transformations preserve the Euclidean norm, since
||Q·v||^2 = (Q·v)^T·(Q·v) = v^T·Q^T·Q·v = v^T·v = ||v||^2
Orthogonal matrices can transform vectors in various ways, such as rotations or reflections, but they do not change the Euclidean length of the vector. Hence, they preserve the solution to a linear least squares problem.
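
A short numerical check of this norm-preservation property, using a 2-D rotation as the orthogonal matrix:

    % An orthogonal matrix (here a rotation) preserves the 2-norm.
    theta = pi/3;
    Q = [cos(theta) -sin(theta); sin(theta) cos(theta)];  % Q'*Q = I
    v = [3; 4];
    [norm(v), norm(Q*v)]                % both equal 5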

QR factorization 2
Any matrix A (m×n) can be represented as A = Q·R, where Q (m×n) has orthonormal columns and R (n×n) is upper triangular.

QR factorization 2 (continued)
Given A, let its QR decomposition be A = Q·R, where Q is an m×n matrix with orthonormal columns and R is upper triangular. The QR factorization transforms the linear least squares problem into a triangular least squares problem:
Q·R·x = b  ⇒  R·x = Q^T·b  ⇒  x = R^-1·Q^T·b
Matlab code:
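The code on the original slide is not reproduced in this transcript; a minimal sketch of the standard approach, using Matlab's built-in economy-size qr, might look like:

    % Least squares via QR: minimize ||A*x - b||.
    [Q, R] = qr(A, 0);                  % economy size: Q is m-by-n, R is n-by-n
    x = R \ (Q' * b);                   % back-substitution on R*x = Q'*b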

Normal Equations
Consider the overdetermined system A·x = b (more equations than unknowns). It can be the result of physical measurements, which usually incorporate some errors. Since we cannot solve it exactly, we would like to minimize the error r = b - A·x:
||r||^2 = r^T·r = (b - A·x)^T·(b - A·x) = b^T·b - 2·x^T·A^T·b + x^T·A^T·A·x
Setting the derivative with respect to x to zero, a necessary condition for a minimum:
-2·A^T·b + 2·A^T·A·x = 0  ⇒  A^T·A·x = A^T·b  – the Normal Equations

Normal Equations 2
A^T·A·x = A^T·b  – the Normal Equations
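
A one-line Matlab sketch of solving the normal equations directly (note that forming A^T·A squares the condition number, which is one reason the QR route is generally preferred):

    % Solve the normal equations A'*A*x = A'*b.
    x = (A' * A) \ (A' * b);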

Least squares via A = QR decomposition
A (m×n) = Q (m×n) · R (n×n), where Q is orthonormal, therefore Q^T·Q = I.
Q·R·x = b  ⇒  R (n×n) · x = Q^T (n×m) · b (m×1)  – a well-defined linear system  ⇒  x = R^-1·Q^T·b
Q is found by Gram–Schmidt orthogonalization of the columns of A. How to find R?
Q·R = A  ⇒  Q^T·Q·R = Q^T·A, but Q is orthonormal, therefore Q^T·Q = I: R = Q^T·A
R is upper triangular, since in the orthogonalization procedure only a_1, …, a_k (without a_k+1, …) are used to produce q_k.
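
A minimal Matlab sketch of classical Gram–Schmidt following this construction (it accumulates R on the fly rather than forming Q^T·A afterwards, but the result is the same for a full-column-rank A):

    function [Q, R] = cgs_qr(A)
    % Classical Gram-Schmidt QR: A (m-by-n, full column rank) -> Q, R.
    [m, n] = size(A);
    Q = zeros(m, n);
    R = zeros(n, n);
    for k = 1:n
        v = A(:, k);
        for j = 1:k-1
            R(j, k) = Q(:, j)' * A(:, k);   % projection onto q_j
            v = v - R(j, k) * Q(:, j);      % remove that component
        end
        R(k, k) = norm(v);
        Q(:, k) = v / R(k, k);              % normalize to get q_k
    end
    end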

Least squares via A = QR decomposition 2
Let us check the correctness:
Q·R·x = b  ⇒  R·x = Q^T·b  ⇒  x = R^-1·Q^T·b
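
A quick numerical check (hypothetical data) that this QR route matches Matlab's built-in least squares solver:

    % Compare the QR solution against backslash on a small problem.
    A = [1 1; 1 2; 1 3; 1 4];           % hypothetical 4-by-2 design matrix
    b = [6; 5; 7; 10];                  % hypothetical measurements
    [Q, R] = qr(A, 0);
    x_qr  = R \ (Q' * b);
    x_ref = A \ b;                      % backslash also solves least squares
    max(abs(x_qr - x_ref))              % agreement to machine precision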

Last lecture reminder: QR Factorization – by picture

QR Factorization – Minimization View: Minimization Algorithm
For i = 1 to N                "For each Target Column"
    For j = 1 to i-1          "For each Source Column left of target"
        Orthogonalize against the search direction q_j
    end
    Normalize
end
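
A runnable Matlab sketch of this loop structure (this is modified Gram–Schmidt, updating the target column in place):

    function [Q, R] = mgs_qr(A)
    % Modified Gram-Schmidt QR, following the minimization-view loops.
    [m, N] = size(A);
    Q = A;                              % target columns, updated in place
    R = zeros(N, N);
    for i = 1:N                         % for each target column
        for j = 1:i-1                   % for each source column left of it
            R(j, i) = Q(:, j)' * Q(:, i);           % component along q_j
            Q(:, i) = Q(:, i) - R(j, i) * Q(:, j);  % orthogonalize
        end
        R(i, i) = norm(Q(:, i));
        Q(:, i) = Q(:, i) / R(i, i);    % normalize
    end
    end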