
1 Chapter 6 (cont.) Regression Estimation

2 Simple Linear Regression: review of least squares procedure

3 Introduction
• We will examine the relationship between quantitative variables x and y via a mathematical equation.
• x: explanatory variable
• y: response variable
• Data: n pairs (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ)

4 The Model
[Figure: House Cost vs. House Size]
• Most lots sell for $25,000.
• Building a house costs about $75 per square foot.
• House cost = 25000 + 75(Size)
The model has a deterministic and a probabilistic component.

5 The Model
House cost = 25000 + 75(Size)
[Figure: House Cost vs. House Size]
However, house costs vary even among same-size houses! Since costs behave unpredictably, we add a random component.
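To make the deterministic and random components concrete, here is a minimal Python sketch (not part of the original slides) that simulates house costs from the fixed part 25000 + 75·Size plus a random error; the error standard deviation of $10,000 is an assumed value chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

size = np.array([1500, 1800, 2000, 2400, 3000])   # house sizes in square feet
# Deterministic component: $25,000 lot price plus $75 per square foot
deterministic = 25000 + 75 * size
# Probabilistic (random) component: unpredictable variation among same-size houses
# (the $10,000 standard deviation is assumed, for illustration only)
epsilon = rng.normal(loc=0, scale=10_000, size=size.shape)

cost = deterministic + epsilon
print(np.column_stack((size, cost.round())))
```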

6 The Model
• The first-order linear model: y = β₀ + β₁x + ε
  y = response variable
  x = explanatory variable
  β₀ = y-intercept
  β₁ = slope of the line (rise/run)
  ε = error variable
• β₀ and β₁ are unknown population parameters and are therefore estimated from the data.
[Figure: the line y = β₀ + β₁x, showing the intercept β₀ and the slope β₁ = rise/run]

7 Estimating the Coefficients
• The estimates are determined by
  – drawing a sample from the population of interest,
  – calculating sample statistics,
  – producing a straight line that cuts through the data.
[Figure: scatterplot of sample points with a candidate line]
Question: What should be considered a good line?

8 The Least Squares (Regression) Line
The least squares line is the line that minimizes the sum of squared differences between the observed y-values and the values predicted by the line.

9 Let us compare two lines for the data points (1, 2), (2, 4), (3, 1.5) and (4, 3.2). The first line gives predicted values 1, 2, 3 and 4 at the four points; the second line is horizontal at y = 2.5.
Line 1: sum of squared differences = (2 − 1)² + (4 − 2)² + (1.5 − 3)² + (3.2 − 4)² = 7.89
Line 2: sum of squared differences = (2 − 2.5)² + (4 − 2.5)² + (1.5 − 2.5)² + (3.2 − 2.5)² = 3.99
The smaller the sum of squared differences, the better the fit of the line to the data.
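A quick Python check of the two sums of squared differences above (a sketch assuming, as the listed residuals indicate, that the first line predicts ŷ = x at these points and the second is the horizontal line y = 2.5):

```python
# Four data points from the slide
x = [1, 2, 3, 4]
y = [2, 4, 1.5, 3.2]

# Line 1 predicts y_hat = x at each point
sse_line1 = sum((yi - xi) ** 2 for xi, yi in zip(x, y))

# Line 2 is the horizontal line y = 2.5
sse_line2 = sum((yi - 2.5) ** 2 for yi in y)

print(round(sse_line1, 2))  # 7.89
print(round(sse_line2, 2))  # 3.99 -- the smaller sum indicates the better fit
```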

10 The Estimated Coefficients
To calculate the estimates of the slope and intercept of the least squares line, use the formulas:
  b₁ = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)²
  b₀ = ȳ − b₁x̄
The least squares prediction equation that estimates the mean value of y for a particular value of x is:
  ŷ = b₀ + b₁x
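As a sketch, the same formulas in Python, applied to the four illustrative points from the previous slide (not the pizza data):

```python
import numpy as np

x = np.array([1, 2, 3, 4], dtype=float)
y = np.array([2, 4, 1.5, 3.2])

x_bar, y_bar = x.mean(), y.mean()

# Slope: b1 = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
# Intercept: b0 = y_bar - b1 * x_bar
b0 = y_bar - b1 * x_bar

print(f"y_hat = {b0:.3f} + {b1:.3f} x")   # y_hat = 2.400 + 0.110 x for these points
```

For these four points the least squares line has a sum of squared differences of about 3.81, smaller than either line compared on the previous slide, as the least squares criterion guarantees.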

11 Simple Linear Regression
Example:
• Consumers Union recently evaluated 26 brands of frozen pizza based on taste (y).
• We will examine the taste scores (y) and the corresponding fat content (x).

12 The Simple Linear Regression Line (example, cont.)
Solution
– Solving by hand: calculate the summary statistics needed by the formulas on slide 10 (the sample means x̄ and ȳ and the sums Σ(xᵢ − x̄)(yᵢ − ȳ) and Σ(xᵢ − x̄)²), with n = 26, then substitute them to obtain b₁ and b₀.

13 The Simple Linear Regression Line (example, cont.)
Solution – continued
– Using the computer (Excel):
  1. Scatterplot
  2. Trend function
  3. Data tab > Data Analysis > Regression

14 The Simple Linear Regression Line (example, cont.)

Regression Statistics
  Multiple R          0.723546339
  R Square            0.523519305
  Adjusted R Square   0.503665943
  Standard Error      8.785081398
  Observations        26

ANOVA
              df   SS            MS           F          Significance F
  Regression   1   2035.12089    2035.121     26.3693    2.95293E-05
  Residual    24   1852.263724     77.17766
  Total       25   3887.384615

             Coefficients   Standard Error   t Stat     P-value    Lower 95%     Upper 95%
  Intercept  39.00208322    5.561098219      7.013378   2.99E-07   27.52454065   50.47962583
  Fat         1.72602894    0.336123407      5.135105   2.95E-05    1.032304324   2.419753555
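For readers not using Excel, a minimal Python/statsmodels sketch that produces the same kind of output; the 26 actual fat and taste values are not reproduced in this transcript, so the data below are synthetic stand-ins generated around the reported fit.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic stand-in data (the real 26 observations are not in the transcript),
# generated roughly around the reported relationship taste ≈ 39.0 + 1.73 * fat
fat = rng.uniform(5, 30, size=26)
taste = 39.0 + 1.73 * fat + rng.normal(scale=8.8, size=26)

X = sm.add_constant(fat)           # adds the intercept column
model = sm.OLS(taste, X).fit()     # ordinary least squares
print(model.summary())             # coefficients, standard errors, t, P-values, 95% CIs, R Square, F
```

With the real data this reproduces the table above: intercept ≈ 39.0, slope ≈ 1.73, R Square ≈ 0.52.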

15 The Simple Linear Regression Line (example, cont.)
From the output, the estimated least squares line is ŷ = 39.0 + 1.73x; that is, predicted taste score ≈ 39.0 + 1.73 (fat content).

16 Regression estimator of a population mean μy
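Only the title of this slide survives in the transcript; its formulas did not. As a hedged sketch of the standard survey-sampling regression estimator this heading refers to (assuming the usual textbook form, with μx the known population mean of the auxiliary variable x):

```python
import numpy as np

def regression_estimator(x_sample, y_sample, mu_x):
    """Regression estimator of the population mean of y:
    mu_y_hat = y_bar + b1 * (mu_x - x_bar),
    where b1 is the least squares slope from the sample.
    (Standard textbook form; not taken verbatim from the slides.)"""
    x = np.asarray(x_sample, dtype=float)
    y = np.asarray(y_sample, dtype=float)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    return y.mean() + b1 * (mu_x - x.mean())

# Usage with the four illustrative points and a hypothetical known mean mu_x = 2.0
print(round(regression_estimator([1, 2, 3, 4], [2, 4, 1.5, 3.2], mu_x=2.0), 2))  # 2.62
```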

