Discrete Least Squares Approximation


Sec:8.1 Discrete Least Squares Approximation

Sec:8.1 Discrete Least Squares Approximation

Find a relationship between x and y. From the graph of the data, it appears that the actual relationship between x and y is linear; the likely reason that no line fits the data precisely is error in the data. Two candidate models are

$p_1(x) = a_0 + a_1 x \qquad\text{and}\qquad p_2(x) = a_0 + a_1 x + a_2 x^2$

Sec:8.1 Discrete Least Squares Approximation

Fit the data in the table with the discrete least squares polynomial of degree at most 2, $p_2(x) = a_0 + a_1 x + a_2 x^2$. Each data point gives one equation:

$1.0 = a_0 + 0\cdot a_1 + 0^2 a_2$
$1.284 = a_0 + 0.25\, a_1 + 0.25^2 a_2$
$1.6487 = a_0 + 0.5\, a_1 + 0.5^2 a_2$
$2.117 = a_0 + 0.75\, a_1 + 0.75^2 a_2$
$2.7183 = a_0 + 1\cdot a_1 + 1^2 a_2$

Matrix form $Y = X a$ (sizes $5\times1 = (5\times3)(3\times1)$):

$\begin{bmatrix} 1.0 \\ 1.284 \\ 1.6487 \\ 2.117 \\ 2.7183 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0^2 \\ 1 & 0.25 & 0.25^2 \\ 1 & 0.5 & 0.5^2 \\ 1 & 0.75 & 0.75^2 \\ 1 & 1 & 1^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}$

Multiplying both sides by $X^T$ gives the normal equations $X^T Y = X^T X\, a$, written $Z = B\, a$: a linear system of 3 equations in 3 unknowns (sizes $3\times1 = (3\times3)(3\times1)$):

$\begin{bmatrix} 8.7680 \\ 5.4514 \\ 4.4015 \end{bmatrix} = \begin{bmatrix} 5 & 2.5 & 1.875 \\ 2.5 & 1.875 & 1.5625 \\ 1.875 & 1.5625 & 1.3828 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}$

Solving gives $a = \begin{bmatrix} 1.0051 \\ 0.8647 \\ 0.8432 \end{bmatrix}$, so

$p_2(x) = 1.0051 + 0.8647\,x + 0.8432\,x^2$
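A minimal MATLAB sketch of this computation (assuming the five data points above; it should reproduce a ≈ [1.0051; 0.8647; 0.8432]):

x = [0 0.25 0.5 0.75 1]';
Y = [1.0 1.284 1.6487 2.117 2.7183]';
X = [ones(5,1) x x.^2];   % 5x3 design matrix, one row per data point
B = X'*X;                 % 3x3 normal-equations matrix
Z = X'*Y;                 % 3x1 right-hand side
a = B\Z                   % coefficients a0, a1, a2
err = abs(Y - X*a)        % pointwise errors, tabulated on the next slide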

Sec:8.1 Discrete Least Squares Approximation

Fit the data in the table with the discrete least squares polynomial of degree at most 2: $p_2(x) = 1.0051 + 0.8647\,x + 0.8432\,x^2$.

x        y        p_2(x)   Error
0.00     1.0000   1.0051   0.0051
0.25     1.2840   1.2739   0.0101
0.50     1.6487   1.6481   0.0006
0.75     2.1170   2.1278   0.0108
1.00     2.7183   2.7130   0.0053

Sec:8.1 Discrete Least Squares Approximation

EXAMPLE: Use least-squares regression to fit a straight line to

x = [ 0 2 4 6 9 11 12 15 17 19 ]
y = [ 5 6 7 6 9 8 7 10 12 12 ]

Plot the data and the regression line. The result is $p_1(x) = 4.8515 + 0.3525\,x$, with residuals $e_i = y_i - (a_0 + a_1 x_i)$.

clear
x = [ 0 2 4 6 9 11 12 15 17 19 ]';
Y = [ 5 6 7 6 9 8 7 10 12 12 ]';
n = length(x);
X = [ones(n,1) x];          % design matrix for Y = X*a
B = X'*X;                   % normal-equations matrix
Z = X'*Y;                   % right-hand side
a = B\Z;                    % solve the normal equations
a = X\Y                     % equivalently, solve the least-squares problem directly
xx = [0:0.1:19]';
yy = a(1) + a(2)*xx;        % evaluate the fitted line
plot(x,Y,'xr',xx,yy,'-b'); grid on
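For comparison, a sketch using MATLAB's built-in polyfit, which returns the coefficients with the highest power first:

p = polyfit(x, Y, 1)   % expect p ≈ [0.3525 4.8515], i.e. slope then intercept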

Sec:8.1 Discrete Least Squares Approximation

Derivation for the formula. EXAMPLE: Use least-squares regression to fit a straight line to

x = [ 0 2 4 6 9 11 12 15 17 19 ]
y = [ 5 6 7 6 9 8 7 10 12 12 ]

The residuals are $e_i = y_i - (a_0 + a_1 x_i)$, i.e.

$e_1 = y_1 - a_0 - a_1 x_1,\quad e_2 = y_2 - a_0 - a_1 x_2,\quad \ldots,\quad e_{10} = y_{10} - a_0 - a_1 x_{10}$

One strategy for fitting a "best" line through the data would be to minimize the sum of the absolute values of the residuals over all the available data: find values of $a_0$ and $a_1$ that achieve

$\min_{a_0,\,a_1} \sum_{i=1}^{10} |e_i| = \min_{a_0,\,a_1} \sum_{i=1}^{10} \left| y_i - a_0 - a_1 x_i \right|$

To minimize a function of two variables, we set its partial derivatives to zero and solve the resulting equations simultaneously. The problem is that the absolute-value function is not differentiable at zero, so we might not be able to find solutions to this pair of equations. (For reference, the least-squares line is $p_1(x) = 4.8515 + 0.3525\,x$.)

Sec:8.1 Discrete Least Squares Approximation

Least Squares. The least squares approach to this problem determines the best approximating line by minimizing the sum of the squares of the residuals $e_i = y_i - (a_0 + a_1 x_i)$:

$\min_{a_0,\,a_1} \sum_{i=1}^{10} \left( y_i - a_0 - a_1 x_i \right)^2$

The least squares method is the most convenient procedure for determining best linear approximations. Define

$E(a_0, a_1) = \sum_{i=1}^{10} e_i^2 = \sum_{i=1}^{10} \left( y_i - a_0 - a_1 x_i \right)^2$

For a minimum to occur, we need both

$\frac{\partial E}{\partial a_0} = 0 \qquad\text{and}\qquad \frac{\partial E}{\partial a_1} = 0$

(For this data the minimizer is $p_1(x) = 4.8515 + 0.3525\,x$.)
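A quick numerical check of this criterion in MATLAB (a sketch, using x and y from the example; the anonymous function E evaluates the sum of squared residuals for any candidate line):

E = @(a0, a1) sum((y - a0 - a1*x).^2);  % sum of squared residuals
E(4.8515, 0.3525)   % about 9.07 for the least-squares line
E(5, 0.3)           % about 11.33; any other line gives a larger value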

Sec:8.1 Discrete Least Squares Approximation

With $E(a_0, a_1) = \sum_{i=1}^{m} \left( y_i - a_0 - a_1 x_i \right)^2$, a minimum requires both $\partial E / \partial a_0 = 0$ and $\partial E / \partial a_1 = 0$.

From $\partial E / \partial a_0 = 0$:

$\sum_{i=1}^{m} -2\left( y_i - a_0 - a_1 x_i \right) = 0
\;\Rightarrow\; -\sum_{i=1}^{m} y_i + \sum_{i=1}^{m} a_0 + \sum_{i=1}^{m} a_1 x_i = 0
\;\Rightarrow\; m\, a_0 + \left( \sum_{i=1}^{m} x_i \right) a_1 = \sum_{i=1}^{m} y_i$

From $\partial E / \partial a_1 = 0$:

$\sum_{i=1}^{m} -2 x_i \left( y_i - a_0 - a_1 x_i \right) = 0
\;\Rightarrow\; -\sum_{i=1}^{m} x_i y_i + \sum_{i=1}^{m} a_0 x_i + \sum_{i=1}^{m} a_1 x_i^2 = 0
\;\Rightarrow\; \left( \sum_{i=1}^{m} x_i \right) a_0 + \left( \sum_{i=1}^{m} x_i^2 \right) a_1 = \sum_{i=1}^{m} x_i y_i$

These are called the normal equations.
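As a sketch, the two normal equations can be assembled from the sums and solved directly in MATLAB (using the 10-point data from the earlier example; expect a ≈ [4.8515; 0.3525]):

x = [0 2 4 6 9 11 12 15 17 19]';
y = [5 6 7 6 9 8 7 10 12 12]';
m = length(x);
N = [m      sum(x);
     sum(x) sum(x.^2)];      % normal-equations matrix
r = [sum(y); sum(x.*y)];     % right-hand side
a = N\r                      % a(1) = a0 (intercept), a(2) = a1 (slope)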

Sec:8.1 Discrete Least Squares Approximation

Least Squares: fit a straight line $p_1(x) = a_0 + a_1 x$ to data $(x_1, y_1), \ldots, (x_m, y_m)$. The normal equations

$m\, a_0 + \left( \sum x_i \right) a_1 = \sum y_i, \qquad \left( \sum x_i \right) a_0 + \left( \sum x_i^2 \right) a_1 = \sum x_i y_i$

have the matrix form

$\begin{bmatrix} m & \sum x_i \\ \sum x_i & \sum x_i^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \end{bmatrix}$

They arise from the overdetermined system $Y = X a$:

$\underbrace{\begin{bmatrix} y_1 \\ \vdots \\ y_m \end{bmatrix}}_{m\times1} = \underbrace{\begin{bmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_m \end{bmatrix}}_{m\times2} \underbrace{\begin{bmatrix} a_0 \\ a_1 \end{bmatrix}}_{2\times1}$

Multiplying both sides by $X^T$ gives $X^T Y = X^T X\, a$:

$\begin{bmatrix} 1 & \cdots & 1 \\ x_1 & \cdots & x_m \end{bmatrix} \begin{bmatrix} y_1 \\ \vdots \\ y_m \end{bmatrix} = \begin{bmatrix} 1 & \cdots & 1 \\ x_1 & \cdots & x_m \end{bmatrix} \begin{bmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_m \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \end{bmatrix}$

which expands to

$\begin{bmatrix} y_1 + \cdots + y_m \\ x_1 y_1 + \cdots + x_m y_m \end{bmatrix} = \begin{bmatrix} 1 + \cdots + 1 & x_1 + \cdots + x_m \\ x_1 + \cdots + x_m & x_1^2 + \cdots + x_m^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \end{bmatrix}$

Sec:8.1 Discrete Least Squares Approximation

Derivation for the formula (polynomial of degree 2)

Sec:8.1 Discrete Least Squares Approximation

Use least-squares regression to fit a polynomial of degree 2, $p_2(x) = a_0 + a_1 x + a_2 x^2$, to data $(x_1, y_1), \ldots, (x_m, y_m)$. Define

$E(a_0, a_1, a_2) = \sum_{i=1}^{m} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right)^2$

For a minimum to occur, we need $\partial E / \partial a_0 = 0$, $\partial E / \partial a_1 = 0$, and $\partial E / \partial a_2 = 0$.

From $\partial E / \partial a_0 = 0$:

$\sum_{i=1}^{m} -2\left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right) = 0
\;\Rightarrow\; -\sum y_i + \sum a_0 + \sum a_1 x_i + \sum a_2 x_i^2 = 0
\;\Rightarrow\; m\, a_0 + \left( \sum x_i \right) a_1 + \left( \sum x_i^2 \right) a_2 = \sum y_i$

Sec:8.1 Discrete Least Squares Approximation

Similarly, from $\partial E / \partial a_1 = 0$:

$\sum_{i=1}^{m} -2 x_i \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right) = 0
\;\Rightarrow\; -\sum x_i y_i + \sum a_0 x_i + \sum a_1 x_i^2 + \sum a_2 x_i^3 = 0
\;\Rightarrow\; \left( \sum x_i \right) a_0 + \left( \sum x_i^2 \right) a_1 + \left( \sum x_i^3 \right) a_2 = \sum x_i y_i$

Sec:8.1 Discrete Least Squares Approximation

And from $\partial E / \partial a_2 = 0$:

$\sum_{i=1}^{m} -2 x_i^2 \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right) = 0
\;\Rightarrow\; -\sum x_i^2 y_i + \sum a_0 x_i^2 + \sum a_1 x_i^3 + \sum a_2 x_i^4 = 0
\;\Rightarrow\; \left( \sum x_i^2 \right) a_0 + \left( \sum x_i^3 \right) a_1 + \left( \sum x_i^4 \right) a_2 = \sum x_i^2 y_i$

Sec:8.1 Discrete Least Squares Approximation

Collecting the three conditions gives the normal equations for $p_2(x) = a_0 + a_1 x + a_2 x^2$:

$m\, a_0 + \left( \sum x_i \right) a_1 + \left( \sum x_i^2 \right) a_2 = \sum y_i$
$\left( \sum x_i \right) a_0 + \left( \sum x_i^2 \right) a_1 + \left( \sum x_i^3 \right) a_2 = \sum x_i y_i$
$\left( \sum x_i^2 \right) a_0 + \left( \sum x_i^3 \right) a_1 + \left( \sum x_i^4 \right) a_2 = \sum x_i^2 y_i$

The normal equations in matrix form:

$\begin{bmatrix} m & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix}$

Sec:8.1 Discrete Least Squares Approximation

Least Squares: fit $p_2(x) = a_0 + a_1 x + a_2 x^2$ to data $(x_1, y_1), \ldots, (x_m, y_m)$. The normal equations above arise from the overdetermined system $Y = X a$:

$\underbrace{\begin{bmatrix} y_1 \\ \vdots \\ y_m \end{bmatrix}}_{m\times1} = \underbrace{\begin{bmatrix} 1 & x_1 & x_1^2 \\ \vdots & \vdots & \vdots \\ 1 & x_m & x_m^2 \end{bmatrix}}_{m\times3} \underbrace{\begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}}_{3\times1}$

Multiplying both sides by $X^T$ gives the matrix form of the normal equations, $X^T Y = X^T X\, a$:

$\begin{bmatrix} 1 & \cdots & 1 \\ x_1 & \cdots & x_m \\ x_1^2 & \cdots & x_m^2 \end{bmatrix} \begin{bmatrix} y_1 \\ \vdots \\ y_m \end{bmatrix} = \begin{bmatrix} 1 & \cdots & 1 \\ x_1 & \cdots & x_m \\ x_1^2 & \cdots & x_m^2 \end{bmatrix} \begin{bmatrix} 1 & x_1 & x_1^2 \\ \vdots & \vdots & \vdots \\ 1 & x_m & x_m^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}$

which expands to

$\begin{bmatrix} y_1 + \cdots + y_m \\ x_1 y_1 + \cdots + x_m y_m \\ x_1^2 y_1 + \cdots + x_m^2 y_m \end{bmatrix} = \begin{bmatrix} m & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}$

Sec:8.1 Discrete Least Squares Approximation

General case: use least-squares regression to fit a polynomial of degree $n$, $p_n(x) = a_0 + a_1 x + \cdots + a_n x^n$, to $m$ data points $(x_1, y_1), \ldots, (x_m, y_m)$. Define

$E(a_0, a_1, \ldots, a_n) = \sum_{i=1}^{m} \left( y_i - a_0 - a_1 x_i - \cdots - a_n x_i^n \right)^2$

For a minimum to occur, we need $\partial E / \partial a_0 = 0$, $\partial E / \partial a_1 = 0$, $\ldots$, $\partial E / \partial a_n = 0$. The resulting normal equations in matrix form are

$\underbrace{\begin{bmatrix} m & \sum x_i & \cdots & \sum x_i^n \\ \sum x_i & \sum x_i^2 & \cdots & \sum x_i^{n+1} \\ \vdots & \vdots & \ddots & \vdots \\ \sum x_i^n & \sum x_i^{n+1} & \cdots & \sum x_i^{2n} \end{bmatrix}}_{(n+1)\times(n+1)} \begin{bmatrix} a_0 \\ a_1 \\ \vdots \\ a_n \end{bmatrix} = \underbrace{\begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \vdots \\ \sum x_i^n y_i \end{bmatrix}}_{(n+1)\times1}$

where $m$ is the number of data points, $n$ is the degree of the polynomial, and the top-left entry is $m = \sum x_i^0$.
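A general-degree MATLAB sketch (lsqpoly is a hypothetical helper name; the loop builds the m×(n+1) design matrix, and the backslash solves the (n+1)×(n+1) normal equations, which become ill-conditioned as n grows):

function a = lsqpoly(x, y, n)
% LSQPOLY  Coefficients a0..an of the least-squares polynomial of degree n.
%   x, y : column vectors of m data points;  n : degree (n+1 <= m)
    m = length(x);
    X = ones(m, n+1);
    for j = 1:n
        X(:, j+1) = x.^j;       % columns 1, x, x^2, ..., x^n
    end
    a = (X'*X) \ (X'*y);        % solve the normal equations
end

For example, a = lsqpoly([0 0.25 0.5 0.75 1]', [1.0 1.284 1.6487 2.117 2.7183]', 2) should reproduce the quadratic fit from the earlier slide.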
