Math 4030 – 11b Method of Least Squares
Model: $Y = \beta_0 + \beta_1 x + \varepsilon$, where Y is the dependent (response) variable, x is the independent (control) variable, and $\varepsilon$ is the random error.
Objectives:
- Find estimated values of the coefficients $\beta_0$ and $\beta_1$ (based on the sample data);
- Evaluate the model efficiency;
- Predict or estimate the Y values for "un-tested" x values.
Raw data:
  x: $x_1, x_2, \ldots, x_n$
  y: $y_1, y_2, \ldots, y_n$
Regression Line (Best Fitting Line): $\hat{y} = a + bx$ gives the estimated Y value for a given x, where a and b are the estimated coefficients for $\beta_0$ and $\beta_1$. The error (residual) at each data point is $e_i = y_i - \hat{y}_i$. Finding the equation of the best fitting line means finding the coefficients a and b, the estimated values of $\beta_0$ and $\beta_1$. How?
The Method of Least Squares finds the line (i.e., the values of a and b) such that the sum of squared errors
  $\sum_{i=1}^{n} (y_i - a - b x_i)^2$
is minimized among all choices of a and b.
Calculation: Solve a system of 2 linear equations (the normal equations):
  $\sum y_i = na + b \sum x_i$
  $\sum x_i y_i = a \sum x_i + b \sum x_i^2$
Solution by Cramer's Rule:
  $a = \dfrac{(\sum y_i)(\sum x_i^2) - (\sum x_i)(\sum x_i y_i)}{n\sum x_i^2 - (\sum x_i)^2}$,
  $b = \dfrac{n\sum x_i y_i - (\sum x_i)(\sum y_i)}{n\sum x_i^2 - (\sum x_i)^2}$
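As a sketch of this computation (the function name is my own), the 2×2 normal equations can be solved directly with Cramer's rule:

```python
def fit_line_cramer(x, y):
    """Solve the 2x2 normal equations for a and b via Cramer's rule."""
    n = len(x)
    sx = sum(x)
    sy = sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    det = n * sxx - sx * sx          # determinant of the coefficient matrix
    a = (sy * sxx - sx * sxy) / det  # intercept estimate
    b = (n * sxy - sx * sy) / det    # slope estimate
    return a, b

# Points lying exactly on y = 1 + 2x recover a = 1, b = 2:
print(fit_line_cramer([0, 1, 2, 3], [1, 3, 5, 7]))  # (1.0, 2.0)
```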
Square-Sum Notations:
  $S_{xx} = \sum (x_i - \bar{x})^2 = \sum x_i^2 - \frac{(\sum x_i)^2}{n}$
  $S_{yy} = \sum (y_i - \bar{y})^2 = \sum y_i^2 - \frac{(\sum y_i)^2}{n}$
  $S_{xy} = \sum (x_i - \bar{x})(y_i - \bar{y}) = \sum x_i y_i - \frac{(\sum x_i)(\sum y_i)}{n}$
Solutions:
  Estimate of $\beta_1$: $b = S_{xy}/S_{xx}$
  Estimate of $\beta_0$: $a = \bar{y} - b\bar{x}$
  Equation for the Regression Line: $\hat{y} = a + bx$
  Residual sum of squares (or error sum of squares): $SSE = S_{yy} - b\,S_{xy}$
Note: exchanging the roles of x and y will result in a different regression line.
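The square-sum shortcuts above lead directly to a small routine (a sketch; the names are my own) that returns the intercept, slope, and residual sum of squares:

```python
def regression_line(x, y):
    """Least-squares line via the square-sum shortcuts S_xx, S_yy, S_xy."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum(xi**2 for xi in x) - sum(x)**2 / n
    syy = sum(yi**2 for yi in y) - sum(y)**2 / n
    sxy = sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y) / n
    b = sxy / sxx        # slope estimate
    a = ybar - b * xbar  # intercept estimate
    sse = syy - b * sxy  # residual (error) sum of squares
    return a, b, sse

# Data exactly on y = 1 + 2x: slope 2, intercept 1, and SSE = 0.
print(regression_line([0, 1, 2, 3], [1, 3, 5, 7]))  # (1.0, 2.0, 0.0)
```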
Curvilinear Regression (Sec. 11.3): models that become linear after a transformation:
  Exponential: $y = \alpha e^{\beta x}$, so $\ln y = \ln\alpha + \beta x$;
  Reciprocal: $y = \dfrac{1}{\alpha + \beta x}$, so $1/y = \alpha + \beta x$;
  Power: $y = \alpha x^{\beta}$, so $\ln y = \ln\alpha + \beta \ln x$.
Polynomial Regression:
  $Y = \beta_0 + \beta_1 x + \beta_2 x^2 + \cdots + \beta_p x^p + \varepsilon$
Objectives:
- Find estimated values of the coefficients $\beta_i$'s (based on the sample data);
- Predict or estimate the Y values for "un-tested" x values.
Raw data:
  x: $x_1, x_2, \ldots, x_n$
  y: $y_1, y_2, \ldots, y_n$
Calculation: Solve a system of p + 1 linear equations (the normal equations) in the p + 1 unknown coefficients. If at least p + 1 of the $x_i$ values are distinct, the system has a unique solution.
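The normal-equations approach for polynomial regression can be sketched as follows (using NumPy for the linear solve; the function name and example data are my own):

```python
import numpy as np

def fit_polynomial(x, y, p):
    """Least-squares polynomial of degree p by solving the normal equations."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    X = np.vander(x, p + 1, increasing=True)  # columns 1, x, x^2, ..., x^p
    # Normal equations: (X^T X) b = X^T y -- p + 1 equations, p + 1 unknowns
    return np.linalg.solve(X.T @ X, X.T @ y)  # coefficients b0, b1, ..., bp

# Data exactly on y = 1 + 2x + 3x^2 recovers coefficients [1, 2, 3].
coeffs = fit_polynomial([0, 1, 2, 3], [1, 6, 17, 34], p=2)
print(coeffs)
```

In practice, solving $X^{T}X\,b = X^{T}y$ directly can be numerically ill-conditioned for high degrees; libraries typically use a QR or SVD factorization instead, but the system above is the one described in the text.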