Lecture 12: Projection and Least Squares Approximation
Shang-Hua Teng
Line Fitting and Prediction
Input: a table of paired data values (x, y), where there is some connection between x and y.
– Example: height and weight
– Example: revenue and stock price
– Example: yesterday's temperature in Pittsburgh and today's temperature in Boston
Output: a and b that best predict y from x: y = ax + b
Scatter Plot of Data (axes: Revenue, Stock Price)
Regression Line y = ax + b (axes: Revenue, Stock Price)
Prediction with the Regression Line y = ax + b (axes: Revenue, Stock Price)
When Life is Perfect: y = ax + b (axes: Revenue, Stock Price). All seven data points (x1, y1), (x2, y2), ..., (x7, y7) lie exactly on one line. How do we find a and b?
When Life is Perfect: y = ax + b. Each data point gives one linear equation in the two unknowns a and b: a·xi + b = yi for i = 1, ..., 7.
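The equations on this slide were an image that did not survive extraction. The display below is a standard way of writing the seven point-equations as a single matrix system, using the names A, x, and b that appear on the later slides (the exact notation on the original slide is an assumption).

```latex
\underbrace{\begin{pmatrix} x_1 & 1 \\ x_2 & 1 \\ \vdots & \vdots \\ x_7 & 1 \end{pmatrix}}_{A}
\underbrace{\begin{pmatrix} a \\ b \end{pmatrix}}_{x}
=
\underbrace{\begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_7 \end{pmatrix}}_{b}
```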
How to Solve It? By elimination. What will happen?
Another Method: Try to Solve
In general: if Ax = b has a solution, then A^T A x = A^T b has the same solution.
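A quick numerical check of this claim, assuming numpy; the seven data points are made up and chosen to lie exactly on one line, so Ax = b is consistent.

```python
import numpy as np

# Seven points that lie exactly on the line y = 2x + 1 (made-up data).
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
ys = 2.0 * xs + 1.0

# Overdetermined system A [a, b]^T = y, with one row (x_i, 1) per point.
A = np.column_stack([xs, np.ones_like(xs)])

# Because the data are perfect, solving A^T A x = A^T y recovers the
# same (a, b) that solves the original system exactly.
sol = np.linalg.solve(A.T @ A, A.T @ ys)
print(sol)                       # -> [2. 1.]
print(np.allclose(A @ sol, ys))  # -> True: every original equation holds
```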
When Life is not Perfect: there is no perfect regression line y = ax + b (axes: Revenue, Stock Price).
When Life is not Perfect: there is no perfect regression line y = ax + b. The system has no solution! What happens during elimination?
Linear Algebra Magic
In general: if Ax = b has no solution, then A^T A x = A^T b gives the best approximation.
Least Squares
Assume there are no errors in x, only errors in y. Best fit: find the line that minimizes the norm of the y errors (the sum of the squares).
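Written out, the quantity being minimized is the sum of squared vertical errors (a standard formulation; the slide's own formula was an image that did not survive extraction):

```latex
E(a, b) \;=\; \sum_{i=1}^{n} e_i^{\,2} \;=\; \sum_{i=1}^{n} \bigl(y_i - (a x_i + b)\bigr)^2 ,
\qquad \text{choose } a, b \text{ to minimize } E.
```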
When Life is not Perfect: Least Squares Approximation (axes: Revenue, Stock Price).
In General: When Ax = b Does Not Have a Solution
Residual error: r = b - Ax. Least squares approximation: find the best x, the one that minimizes the norm of the residual ||Ax - b||.
One Dimension
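The formulas for this slide did not survive extraction. A minimal sketch of the one-dimensional case, where A is a single column a, so the normal equation collapses to the scalar equation (a^T a) x = a^T b; the vectors here are made up for illustration, assuming numpy.

```python
import numpy as np

# One-dimensional case: find the multiple x of a that is closest to b.
a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 1.0, 4.0])

# Normal equation: (a^T a) x = a^T b  =>  x = (a^T b) / (a^T a)
x_hat = a.dot(b) / a.dot(a)
p = x_hat * a                          # projection of b onto the line through a
print(x_hat, p)
print(np.isclose(a.dot(b - p), 0.0))   # the error b - p is orthogonal to a
```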
In General
Least Squares Approximation
In general: if Ax = b has no solution, then solving A^T A x = A^T b produces the least squares approximation.
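A minimal numerical sketch of this recipe, assuming numpy; the data are made up and chosen so that no single line y = ax + b fits them exactly. The result is checked against numpy's built-in least-squares solver.

```python
import numpy as np

# Made-up data that no single line fits exactly.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.1, 1.9, 3.2, 3.8, 5.3])

A = np.column_stack([xs, np.ones_like(xs)])   # rows (x_i, 1)

# Ax = y has no solution, so solve the normal equations instead.
x_hat = np.linalg.solve(A.T @ A, A.T @ ys)
a, b = x_hat
print(f"best line: y = {a:.3f} x + {b:.3f}")

# numpy's least-squares routine gives the same answer.
ref, *_ = np.linalg.lstsq(A, ys, rcond=None)
print(np.allclose(x_hat, ref))                # -> True
```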
Polynomial Regression
Minimize the residual between the data points and the curve (least-squares regression). Given the data, find the values of the coefficients a0, a1, a2, ..., am. Models: linear, quadratic (parabola), cubic, and the general degree-m polynomial.
Polynomial Regression
The residual at each data point, the sum of squared residuals, and the linear equations obtained by minimizing that sum.
Least Squares Solution: the Normal Equations
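A minimal sketch of the normal equations for a degree-m polynomial fit, assuming numpy; the function name poly_fit_normal_equations and the example data are just for this illustration.

```python
import numpy as np

def poly_fit_normal_equations(x, y, m):
    """Fit y ~ a0 + a1*x + ... + am*x^m by solving the normal equations."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Design matrix with one row (1, x_i, x_i^2, ..., x_i^m) per data point.
    A = np.vander(x, m + 1, increasing=True)
    # Normal equations: (A^T A) a = A^T y -- (m+1) equations, (m+1) unknowns.
    return np.linalg.solve(A.T @ A, A.T @ y)

# Example: cubic (m = 3) fit to made-up data.
coeffs = poly_fit_normal_equations([0, 1, 2, 3, 4, 5],
                                   [0.1, 0.9, 4.2, 8.8, 16.1, 24.9], 3)
print(coeffs)   # a0, a1, a2, a3
```

numpy.polyfit minimizes the same sum of squared residuals by a more numerically stable route; the explicit normal equations above are shown only because they mirror the derivation on these slides.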
Example
x:   0    1.0  1.5  2.3  2.5  4.0  5.1  6.0  6.5  7.0  8.1  9.0
y:   0.2  0.8  2.5  ?    3.5  4.3  3.0  5.0  3.5  2.4  1.3  2.0
x:   9.3  11.0 11.3 12.1 13.1 14.0 15.5 16.0 17.5 17.8 19.0 20.0
y:  -0.3 -1.3 -3.0 -4.0 -4.9 -4.0 -5.2 -3.0 -3.5 -1.6 -1.4 -0.1
Regression Equation: y = -0.359 + 2.305x - 0.353x^2 + 0.012x^3
Projection
Projection onto an axis: the point (a, b) projects onto the x-axis, which is a vector subspace.
Projection onto an Arbitrary Line Passing Through 0: the point (a, b) is projected onto a line through the origin.
Projection onto a Plane
Projection onto a Subspace
Input:
1. A vector subspace V of R^m
2. A vector b in R^m
Desired output (three equivalent descriptions):
– A vector x in V that is closest to b
– The projection x of b onto V
– A vector x in V such that (b - x) is orthogonal to V
How to Describe a Vector Subspace V in R^m
If dim(V) = n, then V has n basis vectors a1, a2, ..., an; they are independent, and V = C(A), where A = [a1 a2 ... an].
Projection onto a Subspace
Input:
1. n independent vectors a1, a2, ..., an in R^m
2. A vector b in R^m
Desired output (three equivalent descriptions):
– A vector x in C([a1, a2, ..., an]) that is closest to b
– The projection x of b onto C([a1, a2, ..., an])
– A vector x in C([a1, a2, ..., an]) such that (b - x) is orthogonal to C([a1, a2, ..., an])
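A minimal sketch of this computation using the normal equations, assuming numpy. Here A has the independent vectors as its columns; x̂ solves A^T A x = A^T b, and the projection is p = A x̂. The specific vectors are made up for illustration.

```python
import numpy as np

# Two independent vectors a1, a2 in R^3 (made up) and a target vector b.
A = np.column_stack([[1.0, 0.0, 1.0],    # a1
                     [1.0, 1.0, 0.0]])   # a2
b = np.array([2.0, 3.0, 4.0])

# Solve the normal equations A^T A x = A^T b, then project: p = A x.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x_hat

print(p)                                 # closest vector to b in C(A)
print(np.allclose(A.T @ (b - p), 0.0))   # b - p is orthogonal to C(A)
```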
Think about this Picture
The four fundamental subspaces: in R^n, the row space C(A^T) (dimension r) and the nullspace N(A) (dimension n - r); in R^m, the column space C(A) (dimension r) and the left nullspace N(A^T) (dimension m - r). A vector x splits as x = xr + xn, where A xn = 0 and A xr = Ax = b.
Projection onto a Line: project the vector b onto the line through a; call the result p.
Projection Matrix onto a Line
What matrix P has the property p = Pb, where p is the projection of b onto the line through a?
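The standard answer is the rank-one matrix P = a a^T / (a^T a); a small numerical check, assuming numpy, with made-up vectors.

```python
import numpy as np

a = np.array([2.0, 1.0, 2.0])
b = np.array([1.0, 1.0, 1.0])

# Projection matrix onto the line through a: P = a a^T / (a^T a).
P = np.outer(a, a) / a.dot(a)

p = P @ b
print(p)                                             # projection of b onto a
print(np.allclose(P @ P, P), np.allclose(P.T, P))    # P^2 = P and P is symmetric
```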