1 Lecture 12 Least Square Approximation Shang-Hua Teng

2 Line Fitting and Prediction
Input: a table of paired data values (x, y)
– Some connection between x and y.
– Example: height → weight
– Example: revenue → stock price
– Example: yesterday's temperature at Pittsburgh → today's temperature at Boston
Output: a and b that best predict y from x: y = ax + b

3 Scatter Plot of Data [scatter plot: Revenue vs. Stock Price]

4 Regression Line y = ax + b [plot: regression line through the Revenue vs. Stock Price scatter]

5 Prediction with Regression Line y = ax + b [plot: using the fitted line to predict Stock Price from Revenue]

6 When Life is Perfect [plot: data points (x1, y1), …, (x7, y7) lying exactly on the line y = ax + b; axes Revenue vs. Stock Price] How do we find a and b?

7 When Life is Perfect [same plot] Each point gives one equation: ax1 + b = y1, ax2 + b = y2, …, ax7 + b = y7, an overdetermined system of linear equations in the two unknowns a and b.

8 How to Solve It? By elimination. What will happen?

9 Another Method: Try to Solve AᵀAx = Aᵀb In general: if Ax = b has a solution, then AᵀAx = Aᵀb has the same solution
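
A quick numerical check of this claim, as a sketch in Python/NumPy (the lecture itself prescribes no code, and the matrix here is made up):

```python
import numpy as np

# A consistent system: b is built to lie in the column space of A.
A = np.array([[1.0, 1.0],
              [2.0, 1.0],
              [3.0, 1.0]])
x_true = np.array([2.0, -1.0])      # hypothetical exact solution (a, b)
b = A @ x_true                      # so Ax = b is solvable

# The normal equations A^T A x = A^T b recover the same solution.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
print(x_normal)                     # [ 2. -1.]
```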

10 When Life is not Perfect [plot: scattered Revenue vs. Stock Price data] No perfect regression line y = ax + b passes through all the points.

11 When Life is not Perfect [same plot] No perfect regression line y = ax + b. No solution! What happens during elimination?

12 Linear Algebra Magic In general: if Ax = b has no solution, then AᵀAx = Aᵀb gives the best approximation
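
The same recipe in code, now on inconsistent data; a minimal sketch (the numbers are invented for illustration):

```python
import numpy as np

# Noisy (x, y) pairs: no single line passes through all of them.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Columns of A multiply the unknowns (a, b) in y ≈ ax + b.
A = np.column_stack([x, np.ones_like(x)])

# The normal equations give the least-squares line.
a, b = np.linalg.solve(A.T @ A, A.T @ y)
print(f"best fit: y = {a:.3f} x + {b:.3f}")
```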

13 Least Squares Assume there are no errors in x; the errors are in y. Best fit: find the line that minimizes the norm of the y errors (the sum of the squares of the residuals).

14 When Life is not Perfect: Least Square Approximation [plot: the least-squares line through the Revenue vs. Stock Price scatter]

15 In General: When Ax = b Does Not Have a Solution Residual error: r = b − Ax. Least square approximation: find the x̂ that minimizes ‖b − Ax̂‖.

16 One Dimension Project b onto a single vector a: the best multiple x̂a must leave a residual orthogonal to a, so aᵀ(b − x̂a) = 0 and x̂ = aᵀb / aᵀa.

17

18 In General Require the residual to be orthogonal to every column of A: Aᵀ(b − Ax̂) = 0, i.e., AᵀAx̂ = Aᵀb.

19

20 Least Square Approximation In general: if Ax = b has no solution, then solving AᵀAx = Aᵀb produces the least square approximation

21 Polynomial Regression
Minimize the residual between the data points and the curve (least-squares regression).
Data: points (xᵢ, yᵢ); find values of a0, a1, a2, …, am for
– Linear: y = a0 + a1x
– Quadratic (general parabola): y = a0 + a1x + a2x²
– Cubic: y = a0 + a1x + a2x² + a3x³

22 Polynomial Regression
Residual: eᵢ = yᵢ − (a0 + a1xᵢ + … + amxᵢᵐ)
Sum of squared residuals: E = Σᵢ eᵢ²
Setting ∂E/∂aⱼ = 0 for each j yields a system of linear equations in a0, …, am.

23 Least Square Solution Writing the system as Aa = y, where row i of A is (1, xᵢ, xᵢ², …, xᵢᵐ), the normal equations are AᵀAa = Aᵀy.
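
As a sketch of that recipe in Python/NumPy (the lecture gives no code; the function name is mine):

```python
import numpy as np

def polyfit_normal(x, y, degree):
    """Least-squares polynomial fit via the normal equations A^T A a = A^T y."""
    # Vandermonde matrix with columns 1, x, x^2, ..., x^degree.
    A = np.vander(x, degree + 1, increasing=True)
    return np.linalg.solve(A.T @ A, A.T @ y)
```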

24 Example

x:    0     1.0   1.5   2.3   2.5   4.0   5.1   6.0   6.5   7.0   8.1   9.0
y:    0.2   0.8   2.5   …     3.5   4.3   3.0   5.0   3.5   2.4   1.3   2.0

x:    9.3   11.0  11.3  12.1  13.1  14.0  15.5  16.0  17.5  17.8  19.0  20.0
y:   -0.3  -1.3  -3.0  -4.0  -4.9  -4.0  -5.2  -3.0  -3.5  -1.6  -1.4  -0.1

25 Example

26 Regression Equation y = −0.359 + 2.305x − 0.353x² + 0.012x³
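
Running the polyfit_normal sketch above on the tabulated data (omitting the one point whose y value did not survive transcription) illustrates the workflow; the printed coefficients need not match the slide exactly:

```python
import numpy as np

x = np.array([0.0, 1.0, 1.5, 2.5, 4.0, 5.1, 6.0, 6.5, 7.0, 8.1, 9.0, 9.3,
              11.0, 11.3, 12.1, 13.1, 14.0, 15.5, 16.0, 17.5, 17.8, 19.0, 20.0])
y = np.array([0.2, 0.8, 2.5, 3.5, 4.3, 3.0, 5.0, 3.5, 2.4, 1.3, 2.0, -0.3,
              -1.3, -3.0, -4.0, -4.9, -4.0, -5.2, -3.0, -3.5, -1.6, -1.4, -0.1])

A = np.vander(x, 4, increasing=True)      # columns 1, x, x^2, x^3
a0, a1, a2, a3 = np.linalg.solve(A.T @ A, A.T @ y)
print(f"y = {a0:.3f} + {a1:.3f}x + {a2:.3f}x^2 + {a3:.3f}x^3")
```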

27 Projection onto a Subspace
Input:
1. n independent vectors a1, a2, …, an in Rᵐ
2. a vector b in Rᵐ
Desirable output (three equivalent descriptions):
– the vector x in C([a1, a2, …, an]) that is closest to b
– the projection x of b onto C([a1, a2, …, an])
– the vector x in C([a1, a2, …, an]) such that (b − x) is orthogonal to C([a1, a2, …, an])
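
A minimal sketch of this computation: the projection is Ax̂, where x̂ solves the normal equations (the example matrix is invented):

```python
import numpy as np

def project_onto_columns(A, b):
    """Project b onto C(A); the columns of A are assumed independent."""
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)  # least-squares coefficients
    return A @ x_hat                           # the projection itself

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])
p = project_onto_columns(A, b)
print(p)               # [ 5.  2. -1.]
print(A.T @ (b - p))   # ~[0, 0]: the residual is orthogonal to C(A)
```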

28 Connection to Least Square Approximation The projection of b onto C(A) is Ax̂, where x̂ solves AᵀAx̂ = Aᵀb: projection and least squares are the same computation.

29 Rotation [figure: a vector rotated through an angle θ]

30 Properties of the Rotation Matrix The 2-D rotation through angle θ is Q = [cos θ, −sin θ; sin θ, cos θ].

31 Q is an orthonormal matrix: QᵀQ = I

32 Rotation Matrix in High Dimensions Q is an orthonormal matrix: QᵀQ = I

33 Rotation Matrix in High Dimensions Q is an orthonormal matrix: QᵀQ = I
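
A quick numerical check of QᵀQ = I for the 2-D rotation (a sketch; the angle is arbitrary):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: rotations preserve lengths and angles
```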

34 Reflection [figure: a vector b and its mirror image across a line; u marks the mirror]

35 Reflection [figure continued]

36 [figure continued]

37 Reflection is Symmetric and Orthonormal [same figure]
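
One standard way to realize such a reflection is the Householder form H = I − 2uuᵀ with u a unit vector normal to the mirror; the slide's own notation is not recoverable, so this convention is an assumption:

```python
import numpy as np

u = np.array([1.0, 2.0])
u = u / np.linalg.norm(u)             # unit normal to the mirror (assumed convention)
H = np.eye(2) - 2.0 * np.outer(u, u)  # Householder reflection

print(np.allclose(H, H.T))               # symmetric
print(np.allclose(H.T @ H, np.eye(2)))   # orthonormal: H^T H = I
```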

38 Orthonormal Vectors q1, …, qn are orthonormal if qᵢᵀqⱼ = 0 for i ≠ j and qᵢᵀqᵢ = 1 for every i.

39 Orthonormal Matrices Q is orthonormal if QᵀQ = I; equivalently, the columns of Q are orthonormal vectors. Theorem: For any vectors x and y, (Qx)ᵀ(Qy) = xᵀy; in particular, ‖Qx‖ = ‖x‖.

40 Products of Orthonormal Matrices Theorem: If Q and P are both orthonormal matrices then QP is also an orthonormal matrix. Proof: (QP)ᵀ(QP) = PᵀQᵀQP = PᵀP = I.

41 Orthonormal Basis and Gram-Schmidt
Input: an m by n matrix A
Desirable output: Q such that
– C(A) = C(Q), and
– Q is orthonormal

42 Basic Idea
Suppose A = [a1 … an]
If n = 1, then Q = [a1/‖a1‖]
If n = 2:
– q1 = a1/‖a1‖
– start with a2 and subtract its projection along q1: a2′ = a2 − (q1ᵀa2)q1
– normalize: q2 = a2′/‖a2′‖

43 Gram-Schmidt
Suppose A = [a1 … an]
– q1 = a1/‖a1‖
– For i = 2 to n: subtract from aᵢ its projections along q1, …, qᵢ₋₁, then normalize:
  vᵢ = aᵢ − (q1ᵀaᵢ)q1 − … − (qᵢ₋₁ᵀaᵢ)qᵢ₋₁,  qᵢ = vᵢ/‖vᵢ‖
What is the complexity? O(mn²)
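
A direct transcription of the procedure into Python (classical Gram-Schmidt; a sketch with no handling of dependent columns):

```python
import numpy as np

def gram_schmidt(A):
    """Return Q with orthonormal columns spanning C(A).

    Classical Gram-Schmidt; assumes the columns of A are independent.
    """
    m, n = A.shape
    Q = np.zeros((m, n))
    for i in range(n):
        v = A[:, i].astype(float)
        for j in range(i):                # subtract projections on earlier q_j
            v -= (Q[:, j] @ A[:, i]) * Q[:, j]
        Q[:, i] = v / np.linalg.norm(v)   # normalize
    return Q

# Two nested loops over columns, each touching length-m vectors: O(mn^2) work.
```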

44 Theorem: QR-Decomposition Suppose A = [a1 … an]. There exists an upper triangular matrix R such that A = QR, where Q is the orthonormal matrix produced by Gram-Schmidt.

45 Using QR to Find the Least Square Approximation Substituting A = QR into AᵀAx = Aᵀb gives RᵀQᵀQRx = RᵀQᵀb, which reduces to Rx = Qᵀb and can be solved by back substitution.
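
Putting the pieces together, a sketch with explicit back substitution (NumPy's qr stands in for Gram-Schmidt):

```python
import numpy as np

def back_substitute(R, c):
    """Solve Rx = c for upper-triangular R by back substitution."""
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (c[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
    return x

def lstsq_qr(A, b):
    """Least squares via A = QR: the normal equations reduce to Rx = Q^T b."""
    Q, R = np.linalg.qr(A)            # reduced QR, as Gram-Schmidt would produce
    return back_substitute(R, Q.T @ b)

# Example: the earlier line fit, now solved through QR.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
A = np.column_stack([x, np.ones_like(x)])
print(lstsq_qr(A, y))                 # same (a, b) as the normal equations
```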

