Lecture 12: Projection and Least Squares Approximation. Shang-Hua Teng

Line Fitting and Prediction
Input: a table of paired data values (x, y), with some connection between x and y.
–Example: height → weight
–Example: revenue → stock price
–Example: yesterday's temperature in Pittsburgh → today's temperature in Boston
Output: a and b that best predict y from x: y = ax + b
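A minimal sketch of this setup in NumPy; the data values below are made up for illustration:

```python
import numpy as np

# Hypothetical paired data (x = revenue, y = stock price); the values
# are invented for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1, 5.9, 7.2])

# Fit y = a*x + b by least squares; for degree 1, polyfit returns [a, b].
a, b = np.polyfit(x, y, deg=1)
print(f"y = {a:.3f} x + {b:.3f}")

# Predict y for a new x.
x_new = 8.0
print("predicted y:", a * x_new + b)
```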

Scatter Plot of Data
[scatter plot: Revenue (x-axis) vs. Stock Price (y-axis)]

Regression Line y = ax + b
[scatter plot with fitted line: Revenue vs. Stock Price]

Prediction with Regression Line y = ax + b
[scatter plot with fitted line used to predict Stock Price from Revenue]

When Life Is Perfect
[plot: seven data points (x_1, y_1), …, (x_7, y_7) lying exactly on the line y = ax + b; Revenue vs. Stock Price]
How do we find a and b?

When Life Is Perfect
[plot: the same seven points on the line y = ax + b]
Each point gives one equation y_i = a x_i + b, so stacking i = 1, …, 7 produces an overdetermined 7 × 2 linear system in the unknowns a and b.

How to Solve It? By elimination. What will happen?

Another Method: Try to Solve A^T A x = A^T b
In general: if Ax = b has a solution, then A^T A x = A^T b has the same solution.
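A quick numerical check of this claim, with an arbitrary consistent system:

```python
import numpy as np

# A consistent system: b is constructed to lie in the column space of A.
A = np.array([[1.0, 1.0],
              [2.0, 1.0],
              [3.0, 1.0]])
x_true = np.array([2.0, -1.0])
b = A @ x_true                  # guarantees Ax = b is solvable

# Solve the square system A^T A x = A^T b instead.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
print(x_normal)                 # recovers x_true: [ 2. -1.]
```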

When Life Is Not Perfect
No perfect regression line y = ax + b exists.
[scatter plot: Revenue vs. Stock Price, points not collinear]

When Life Is Not Perfect
No perfect regression line y = ax + b exists.
[scatter plot: Revenue vs. Stock Price]
The system has no solution! What happens during elimination?

Linear Algebra Magic
In general: if Ax = b has no solution, then A^T A x = A^T b gives the best approximation (in the least-squares sense).

Least Squares
Assume no errors in x; the errors are in y. Best fit: find the line that minimizes the norm of the y errors (the sum of the squares).

When Life Is Not Perfect: Least Squares Approximation
[scatter plot with best-fit line: Revenue vs. Stock Price]

In General: When Ax = b Does Not Have a Solution
Residual error: r = b − Ax. Least squares approximation: find the best x̂, the one that minimizes ‖r‖ = ‖b − Ax̂‖.

One Dimension
Project b onto the line through a: minimize ‖b − xa‖² over the scalar x. Setting the derivative to zero gives x̂ = (a^T b)/(a^T a).
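A minimal sketch of the one-dimensional formula, with arbitrarily chosen vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])   # direction of the line (arbitrary example)
b = np.array([3.0, 1.0, 2.0])

x_hat = (a @ b) / (a @ a)       # scalar least-squares solution
p = x_hat * a                   # projection of b onto the line through a
print(x_hat, p)
print(a @ (b - p))              # error b - p is orthogonal to a: ~0
```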

In General
Minimize ‖b − Ax‖². Setting the gradient to zero gives the normal equations A^T A x̂ = A^T b, so x̂ = (A^T A)^{-1} A^T b when A has independent columns.

Least Squares Approximation
In general: if Ax = b has no solution, then solving A^T A x = A^T b produces the least squares approximation.
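A sketch of this recipe in NumPy; the matrix and right-hand side are invented, and in practice np.linalg.lstsq is preferred over forming A^T A explicitly:

```python
import numpy as np

# Overdetermined, inconsistent system: more equations than unknowns.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])   # not in the column space of A

# Least-squares solution via the normal equations A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Same answer from the library routine (numerically more stable).
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat, x_lstsq)
```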

Polynomial Regression
Minimize the residual between the data points and the curve: least-squares regression. Given data (x_i, y_i), find values of a_0, a_1, a_2, …, a_m for a model such as:
–Linear: y = a_0 + a_1 x
–Quadratic (general parabola): y = a_0 + a_1 x + a_2 x^2
–Cubic: y = a_0 + a_1 x + a_2 x^2 + a_3 x^3
–General: y = a_0 + a_1 x + … + a_m x^m

Polynomial Regression
Residual: e_i = y_i − (a_0 + a_1 x_i + … + a_m x_i^m). Sum of squared residuals: S = Σ_i e_i^2. Setting ∂S/∂a_j = 0 for each j yields a system of linear equations in the coefficients.

Least Squares Solution: Normal Equations
With the Vandermonde matrix A (row i is [1, x_i, x_i^2, …, x_i^m]), the normal equations are A^T A a = A^T y.
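A sketch of polynomial least squares via the normal equations; the data here is invented, since the slide's numbers were not preserved:

```python
import numpy as np

# Hypothetical data for a cubic fit (the slide's actual numbers
# were lost in the transcript).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

# Vandermonde matrix: columns 1, x, x^2, x^3.
A = np.vander(x, N=4, increasing=True)

# Normal equations A^T A a = A^T y give the coefficients a0..a3.
coeffs = np.linalg.solve(A.T @ A, A.T @ y)
print(coeffs)                   # [a0, a1, a2, a3]
```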

Example
[table of paired (x, y) data values; the numbers were not preserved in this transcript]

Example (continued)
[worked solution of the normal equations; not preserved in this transcript]

Regression Equation
y = a_0 + a_1 x + a_2 x^2 + a_3 x^3 (the fitted cubic; the numeric coefficients were not preserved in this transcript)

Projection
Projection onto an axis: the point (a, b) projects onto the x-axis at (a, 0). The x-axis is a vector subspace.

Projection onto an Arbitrary Line Passing Through 0
Project the point (a, b) onto a line through the origin.

Projection onto a Plane

Projection onto a Subspace
Input:
1. A vector subspace V in R^m
2. A vector b in R^m
Desired output, stated three equivalent ways:
–The vector x in V that is closest to b
–The projection x of b onto V
–A vector x in V such that (b − x) is orthogonal to V

How to Describe a Vector Subspace V in R^m
If dim(V) = n, then V has n basis vectors a_1, a_2, …, a_n; they are independent. V = C(A) where A = [a_1 a_2 … a_n].

Projection onto a Subspace
Input:
1. n independent vectors a_1, a_2, …, a_n in R^m, forming A = [a_1 a_2 … a_n]
2. A vector b in R^m
Desired output, stated three equivalent ways (see the sketch below):
–The vector x in C(A) that is closest to b
–The projection x of b onto C(A)
–A vector x in C(A) such that (b − x) is orthogonal to C(A)
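A numerical sketch of subspace projection, using an arbitrary plane in R^3:

```python
import numpy as np

# Two independent vectors spanning a plane V = C(A) in R^3.
a1 = np.array([1.0, 0.0, 1.0])
a2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([a1, a2])
b = np.array([1.0, 2.0, 6.0])

# Projection of b onto C(A): p = A (A^T A)^{-1} A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x_hat
print(p)
print(A.T @ (b - p))            # b - p is orthogonal to both columns: ~0
```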

Think About This Picture: The Four Fundamental Subspaces
[diagram: in R^n, the row space C(A^T) (dim r) and the nullspace N(A) (dim n − r); in R^m, the column space C(A) (dim r) and the left nullspace N(A^T) (dim m − r). Any x splits as x = x_r + x_n with A x_n = 0 and A x_r = A x = b.]
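A quick check of the dimension count, using an arbitrary rank-2 matrix:

```python
import numpy as np

# An example 3x4 matrix of rank 2 (third row = first row + second row).
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 2.0, 1.0, 2.0]])
m, n = A.shape
r = np.linalg.matrix_rank(A)

print("dim C(A^T) =", r)        # row space: r
print("dim N(A)   =", n - r)    # nullspace: n - r
print("dim C(A)   =", r)        # column space: r
print("dim N(A^T) =", m - r)    # left nullspace: m - r
```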

Projection onto a Line
[figure: vector b, a line through a, projection p, angle θ between a and b]
p = x̂ a, where x̂ = (a^T b)/(a^T a).

Projection Matrix onto a Line
[figure: b, a, and p as above]
What matrix P has the property p = Pb?
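The transcript ends here. The standard answer is the rank-one matrix P = a a^T / (a^T a), verified numerically in this sketch:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 1.0, 2.0])

# Rank-one projection matrix onto the line through a: P = a a^T / (a^T a).
P = np.outer(a, a) / (a @ a)

p = P @ b
print(p)                        # same as (a.b / a.a) * a
print(np.allclose(P @ P, P))    # True: projecting twice changes nothing
print(np.allclose(P, P.T))      # True: P is symmetric
```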