Math 4030 – 11b Method of Least Squares

Model: Y = α + βx + ε, where Y is the dependent (response) variable, x is the independent (control) variable, and ε is the random error.

Raw data: n observed pairs (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ). Assumption: the εᵢ's are independent.

Objectives:
- Find (estimated) values of the coefficients α and β, based on the sample data;
- Evaluate the model efficiency;
- Predict or estimate the Y values for "un-tested" x values.

Regression Line (Best Fitting Line): ŷ = a + bx, where ŷ is the estimated Y value for a given x, and a and b are the estimated coefficients for α and β. The error for the i-th observation is eᵢ = yᵢ − ŷᵢ. Finding the equation of the best fitting line means finding the coefficients a and b, the estimated values of α and β. How?

The Method of Least Squares: the method of least squares finds the line (i.e., the values of a and b) that minimizes the sum of squared errors,

  SSE = Σᵢ (yᵢ − a − bxᵢ)²,

among all choices of a and b.

Calculation: setting the partial derivatives of SSE with respect to a and b equal to zero gives a system of 2 linear equations (the normal equations):

  n·a + (Σxᵢ)·b = Σyᵢ
  (Σxᵢ)·a + (Σxᵢ²)·b = Σxᵢyᵢ

Solution by Cramer's Rule:

  a = (Σyᵢ · Σxᵢ² − Σxᵢ · Σxᵢyᵢ) / (n·Σxᵢ² − (Σxᵢ)²)
  b = (n·Σxᵢyᵢ − Σxᵢ · Σyᵢ) / (n·Σxᵢ² − (Σxᵢ)²)
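As a sketch, the normal equations can be solved directly with Cramer's rule; the data values below are made up for illustration.

```python
# Fit y = a + b*x by solving the two normal equations with Cramer's rule.
# Illustrative data (not from the lecture).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(xs)

Sx  = sum(xs)                               # sum of x_i
Sy  = sum(ys)                               # sum of y_i
Sxx = sum(x * x for x in xs)                # sum of x_i^2
Sxy = sum(x * y for x, y in zip(xs, ys))    # sum of x_i * y_i

# Determinant of the coefficient matrix [[n, Sx], [Sx, Sxx]]
D = n * Sxx - Sx * Sx
a = (Sy * Sxx - Sx * Sxy) / D   # intercept (estimate of alpha)
b = (n * Sxy - Sx * Sy) / D     # slope (estimate of beta)
```

For these data the fitted line is ŷ = 0.14 + 1.96x.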

Square-Sum Notations:

  Sxx = Σ(xᵢ − x̄)² = Σxᵢ² − (Σxᵢ)²/n
  Syy = Σ(yᵢ − ȳ)² = Σyᵢ² − (Σyᵢ)²/n
  Sxy = Σ(xᵢ − x̄)(yᵢ − ȳ) = Σxᵢyᵢ − (Σxᵢ)(Σyᵢ)/n

Solutions:

  Estimate of β:  b = Sxy / Sxx
  Estimate of α:  a = ȳ − b·x̄
  Equation for the Regression Line:  ŷ = a + bx
  Residual sum of squares (or error sum of squares):  SSE = Syy − Sxy²/Sxx

Note: exchanging the roles of x and y will end in a different regression line.

Curvilinear Regression (Sec. 11.3): some nonlinear models can be fitted by least squares after a linearizing transformation:

  Exponential:  y = α·e^(βx), so ln y = ln α + βx
  Reciprocal:   y = 1/(α + βx), so 1/y = α + βx
  Power:        y = α·x^β, so ln y = ln α + β·ln x
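For instance, the exponential model can be fitted by running ordinary least squares on (x, ln y) and back-transforming the intercept; this is a sketch with illustrative data, not the lecture's example.

```python
import math

# Fit y = alpha * exp(beta * x) by linearizing: ln y = ln(alpha) + beta*x.
# Illustrative data, roughly following y = 1.5 * e^x.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.5, 4.1, 11.0, 30.0]

zs = [math.log(y) for y in ys]   # transformed response z = ln y
n = len(xs)
xbar = sum(xs) / n
zbar = sum(zs) / n
Sxx = sum((x - xbar) ** 2 for x in xs)
Sxz = sum((x - xbar) * (z - zbar) for x, z in zip(xs, zs))

beta = Sxz / Sxx                       # slope of the linearized fit
alpha = math.exp(zbar - beta * xbar)   # back-transform the intercept
```

Note that this minimizes squared errors in ln y, not in y itself, so it is not identical to a direct nonlinear least-squares fit.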

Polynomial Regression: Y = β₀ + β₁x + β₂x² + … + βₚxᵖ + ε.

Raw data: n observed pairs (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ).

Objectives:
- Find (estimated) values of the coefficients βᵢ, based on the sample data;
- Predict or estimate the Y values for "un-tested" x values.

Calculation: solve a system of p + 1 linear (normal) equations in the p + 1 unknown coefficients. If the data contain at least p + 1 distinct xᵢ values, the system has a unique solution.
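As a sketch of this calculation for a quadratic (p = 2), the normal equations Σₖ bₖ·Σᵢ xᵢ^(j+k) = Σᵢ xᵢʲ·yᵢ for j = 0, …, p can be built and solved by Gaussian elimination; the data and degree below are illustrative.

```python
# Quadratic least-squares fit y = b0 + b1*x + b2*x^2 via the normal equations.
# Illustrative data, roughly following y = 1 + x^2.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 1.8, 5.2, 10.1, 17.0]
p = 2
m = p + 1

# Normal equations: A[j][k] = sum of x_i^(j+k), rhs[j] = sum of x_i^j * y_i
A = [[sum(x ** (j + k) for x in xs) for k in range(m)] for j in range(m)]
rhs = [sum(x ** j * y for x, y in zip(xs, ys)) for j in range(m)]

# Gaussian elimination with partial pivoting
for col in range(m):
    piv = max(range(col, m), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    rhs[col], rhs[piv] = rhs[piv], rhs[col]
    for r in range(col + 1, m):
        f = A[r][col] / A[col][col]
        for c in range(col, m):
            A[r][c] -= f * A[col][c]
        rhs[r] -= f * rhs[col]

# Back substitution
coef = [0.0] * m
for r in range(m - 1, -1, -1):
    coef[r] = (rhs[r] - sum(A[r][c] * coef[c] for c in range(r + 1, m))) / A[r][r]
```

Here `coef` holds the estimates [b₀, b₁, b₂]; since the five xᵢ values are distinct (more than p + 1 of them), the system has a unique solution.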