

Chapter 12 Simple Linear Regression
- Simple Linear Regression Model
- Least Squares Method
- Coefficient of Determination
- Model Assumptions
- Testing for Significance
- Using the Estimated Regression Equation for Estimation and Prediction
- Computer Solution
- Residual Analysis: Validating Model Assumptions

Simple Linear Regression Model
- The equation that describes how y is related to x and an error term is called the regression model.
- The simple linear regression model is: y = β0 + β1x + ε
- β0 and β1 are called parameters of the model.
- ε is a random variable called the error term.

Simple Linear Regression Equation
- The simple linear regression equation is: E(y) = β0 + β1x
- The graph of the regression equation is a straight line.
- β0 is the y-intercept of the regression line.
- β1 is the slope of the regression line.
- E(y) is the expected value of y for a given x value.

Simple Linear Regression Equation: Positive Linear Relationship
Graph: E(y) versus x; the regression line has y-intercept β0 and a positive slope β1.

Simple Linear Regression Equation: Negative Linear Relationship
Graph: E(y) versus x; the regression line has y-intercept β0 and a negative slope β1.

Simple Linear Regression Equation: No Relationship
Graph: E(y) versus x; the regression line is horizontal with y-intercept β0 and slope β1 = 0.

Estimated Simple Linear Regression Equation
- The estimated simple linear regression equation is: ŷ = b0 + b1x
- The graph is called the estimated regression line.
- b0 is the y-intercept of the line.
- b1 is the slope of the line.
- ŷ is the estimated value of y for a given x value.

Estimation Process
- Regression model: y = β0 + β1x + ε
- Regression equation: E(y) = β0 + β1x
- Unknown parameters: β0, β1
- Sample data: (x1, y1), ..., (xn, yn)
- Estimated regression equation: ŷ = b0 + b1x
- Sample statistics: b0, b1
- b0 and b1 provide estimates of β0 and β1.

Least Squares Method
- Least Squares Criterion: choose b0 and b1 to minimize Σ(yi - ŷi)²
where:
  yi = observed value of the dependent variable for the ith observation
  ŷi = estimated value of the dependent variable for the ith observation

The Least Squares Method
- Slope for the Estimated Regression Equation:
  b1 = Σ(xi - x̄)(yi - ȳ) / Σ(xi - x̄)²

The Least Squares Method
- y-Intercept for the Estimated Regression Equation:
  b0 = ȳ - b1x̄
where:
  xi = value of the independent variable for the ith observation
  yi = value of the dependent variable for the ith observation
  x̄ = mean value of the independent variable
  ȳ = mean value of the dependent variable
  n = total number of observations

Example: Reed Auto Sales
- Simple Linear Regression
  Reed Auto periodically has a special week-long sale. As part of the advertising campaign, Reed runs one or more television commercials during the weekend preceding the sale. Data from a sample of 5 previous sales are shown on the next slide.

Example: Reed Auto Sales
- Simple Linear Regression
  Number of TV Ads   Number of Cars Sold
  1                  14
  3                  24
  2                  18
  1                  17
  3                  27
  Σx = 10            Σy = 100

Example: Reed Auto Sales
- Slope for the Estimated Regression Equation:
  b1 = (Σxiyi - (Σxi)(Σyi)/n) / (Σxi² - (Σxi)²/n) = (220 - (10)(100)/5) / (24 - (10)²/5) = 20/4 = 5
- y-Intercept for the Estimated Regression Equation:
  b0 = ȳ - b1x̄ = 20 - 5(2) = 10
- Estimated Regression Equation:
  ŷ = 10 + 5x
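
The least squares fit can be reproduced with a minimal Python sketch. The five observations are those assumed on the data slide above (their totals match Σx = 10 and Σy = 100); the calculation follows the slope and intercept formulas just shown.

```python
# Least squares fit for the Reed Auto Sales example.
# Data assumed from the example's data slide (sum(x) = 10, sum(y) = 100).
x = [1, 3, 2, 1, 3]        # number of TV ads
y = [14, 24, 18, 17, 27]   # number of cars sold
n = len(x)

x_bar = sum(x) / n          # 2.0
y_bar = sum(y) / n          # 20.0

# b1 = sum (xi - x_bar)(yi - y_bar) / sum (xi - x_bar)^2
sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))   # 20.0
sxx = sum((xi - x_bar) ** 2 for xi in x)                         # 4.0
b1 = sxy / sxx              # 5.0
b0 = y_bar - b1 * x_bar     # 10.0

print(f"y-hat = {b0:.0f} + {b1:.0f}x")   # y-hat = 10 + 5x
```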

Example: Reed Auto Sales
- Scatter Diagram: cars sold plotted against number of TV ads, with the estimated regression line ŷ = 10 + 5x.

The Coefficient of Determination
- Relationship Among SST, SSR, SSE:
  SST = SSR + SSE
  Σ(yi - ȳ)² = Σ(ŷi - ȳ)² + Σ(yi - ŷi)²
where:
  SST = total sum of squares
  SSR = sum of squares due to regression
  SSE = sum of squares due to error

The Coefficient of Determination
- The coefficient of determination is: r² = SSR/SST
where:
  SST = total sum of squares
  SSR = sum of squares due to regression

Example: Reed Auto Sales
- Coefficient of Determination:
  r² = SSR/SST = 100/114 = .8772
  The regression relationship is very strong: about 88% of the variation in the number of cars sold can be explained by the linear relationship between the number of TV ads and the number of cars sold.
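
A brief continuation of the Python sketch above checks the sums of squares and r²; it reuses x, y, y_bar, b0, and b1 from that sketch.

```python
# Sums of squares and coefficient of determination (continues the sketch above).
y_hat = [b0 + b1 * xi for xi in x]                        # fitted values: 15, 25, 20, 15, 25
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))     # 14.0
sst = sum((yi - y_bar) ** 2 for yi in y)                  # 114.0
ssr = sst - sse                                           # 100.0

r2 = ssr / sst
print(f"r^2 = {r2:.4f}")    # r^2 = 0.8772
```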

The Correlation Coefficient
- Sample Correlation Coefficient:
  r_xy = (sign of b1) √r²
where:
  b1 = the slope of the estimated regression equation ŷ = b0 + b1x

Example: Reed Auto Sales
- Sample Correlation Coefficient:
  The sign of b1 in the equation ŷ = 10 + 5x is "+".
  r_xy = +√.8772 = +.9366
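
Continuing the same sketch, the sample correlation coefficient follows directly from r² and the sign of b1.

```python
import math

# Sample correlation coefficient: square root of r^2, carrying the sign of b1.
r_xy = math.copysign(math.sqrt(r2), b1)
print(f"r_xy = {r_xy:+.4f}")    # r_xy = +0.9366
```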

Model Assumptions
- Assumptions About the Error Term ε:
  1. The error ε is a random variable with mean of zero.
  2. The variance of ε, denoted by σ², is the same for all values of the independent variable.
  3. The values of ε are independent.
  4. The error ε is a normally distributed random variable.

Testing for Significance
- To test for a significant regression relationship, we must conduct a hypothesis test to determine whether the value of β1 is zero.
- Two tests are commonly used: the t test and the F test.
- Both tests require an estimate of σ², the variance of ε in the regression model.

Testing for Significance
- An Estimate of σ²:
  The mean square error (MSE) provides the estimate of σ²; the notation s² is also used.
  s² = MSE = SSE/(n - 2)
where:
  SSE = Σ(yi - ŷi)²

Testing for Significance
- An Estimate of σ:
  To estimate σ we take the square root of s².
  The resulting s = √MSE = √(SSE/(n - 2)) is called the standard error of the estimate.

Testing for Significance: t Test
- Hypotheses:
  H0: β1 = 0
  Ha: β1 ≠ 0
- Test Statistic:
  t = b1/s_b1, where s_b1 = s/√Σ(xi - x̄)²

Testing for Significance: t Test
- Rejection Rule:
  Reject H0 if t < -tα/2 or t > tα/2
where:
  tα/2 is based on a t distribution with n - 2 degrees of freedom

Example: Reed Auto Sales
- t Test
  Hypotheses:
    H0: β1 = 0
    Ha: β1 ≠ 0
  Rejection Rule:
    For α = .05 and d.f. = 3, t.025 = 3.182
    Reject H0 if t < -3.182 or t > 3.182

Example: Reed Auto Sales
- t Test
  Test Statistic:
    t = 5/1.08 = 4.63
  Conclusion:
    t = 4.63 > 3.182, so reject H0.
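
The t statistic can be checked with a short continuation of the Python sketch; the critical value comes from scipy.stats, which is assumed to be available.

```python
from scipy import stats   # used only for critical values

# t test for the significance of b1 (continues the sketch above).
mse = sse / (n - 2)                  # s^2 = 14/3 ≈ 4.667
s = mse ** 0.5                       # standard error of the estimate ≈ 2.1602
s_b1 = s / sxx ** 0.5                # estimated std. deviation of b1 ≈ 1.0801

t_stat = b1 / s_b1                   # ≈ 4.63
t_crit = stats.t.ppf(0.975, n - 2)   # ≈ 3.182 for alpha = .05 and 3 d.f.
print(t_stat > t_crit)               # True -> reject H0: beta1 = 0
```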

Confidence Interval for β1
- We can use a 95% confidence interval for β1 to test the hypotheses just used in the t test.
- H0 is rejected if the hypothesized value of β1 is not included in the confidence interval for β1.

Confidence Interval for β1
- The form of a confidence interval for β1 is:
  b1 ± tα/2 s_b1
where:
  b1 is the point estimate
  tα/2 s_b1 is the margin of error
  tα/2 is the t value providing an area of α/2 in the upper tail of a t distribution with n - 2 degrees of freedom

Example: Reed Auto Sales
- Rejection Rule:
  Reject H0 if 0 is not included in the confidence interval for β1.
- 95% Confidence Interval for β1:
  b1 ± t.025 s_b1 = 5 ± 3.182(1.08) = 5 ± 3.44, or 1.56 to 8.44
- Conclusion:
  0 is not included in the confidence interval, so reject H0.
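
Reusing t_crit and s_b1 from the t-test sketch, the 95% confidence interval is a two-line addition.

```python
# 95% confidence interval for beta1 (continues the t-test sketch).
margin = t_crit * s_b1                     # ≈ 3.182 * 1.0801 ≈ 3.44
ci_low, ci_high = b1 - margin, b1 + margin
print(f"{ci_low:.2f} to {ci_high:.2f}")    # 1.56 to 8.44
```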

Testing for Significance: F Test
- Hypotheses:
  H0: β1 = 0
  Ha: β1 ≠ 0
- Test Statistic:
  F = MSR/MSE

Testing for Significance: F Test
- Rejection Rule:
  Reject H0 if F > Fα
where:
  Fα is based on an F distribution with 1 d.f. in the numerator and n - 2 d.f. in the denominator

Example: Reed Auto Sales
- F Test
  Hypotheses:
    H0: β1 = 0
    Ha: β1 ≠ 0
  Rejection Rule:
    For α = .05 and d.f. = (1, 3): F.05 = 10.13
    Reject H0 if F > 10.13

Example: Reed Auto Sales
- F Test
  Test Statistic:
    F = MSR/MSE = 100/4.667 = 21.43
  Conclusion:
    F = 21.43 > 10.13, so we reject H0.
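
The F statistic uses the same quantities already computed; a brief continuation of the sketch (stats was imported in the t-test sketch).

```python
# F test for overall significance (continues the sketch above).
msr = ssr / 1                            # one independent variable, so MSR = SSR/1
f_stat = msr / mse                       # ≈ 100 / 4.667 ≈ 21.43
f_crit = stats.f.ppf(0.95, 1, n - 2)     # ≈ 10.13 for alpha = .05 and (1, 3) d.f.
print(f_stat > f_crit)                   # True -> reject H0: beta1 = 0
```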

Some Cautions about the Interpretation of Significance Tests
- Rejecting H0: β1 = 0 and concluding that the relationship between x and y is significant does not enable us to conclude that a cause-and-effect relationship is present between x and y.
- Just because we are able to reject H0: β1 = 0 and demonstrate statistical significance does not enable us to conclude that there is a linear relationship between x and y.

Using the Estimated Regression Equation for Estimation and Prediction
- Confidence Interval Estimate of E(yp):
  ŷp ± tα/2 s_ŷp
- Prediction Interval Estimate of yp:
  ŷp ± tα/2 s_ind
where:
  the confidence coefficient is 1 - α and tα/2 is based on a t distribution with n - 2 degrees of freedom

Example: Reed Auto Sales
- Point Estimation
  If 3 TV ads are run prior to a sale, we expect the mean number of cars sold to be:
  ŷ = 10 + 5(3) = 25 cars

Example: Reed Auto Sales
- Confidence Interval for E(yp)
  The 95% confidence interval estimate of the mean number of cars sold when 3 TV ads are run is:
  25 ± 4.61 = 20.39 to 29.61 cars

Example: Reed Auto Sales
- Prediction Interval for yp
  The 95% prediction interval estimate of the number of cars sold in one particular week when 3 TV ads are run is:
  25 ± 8.28 = 16.72 to 33.28 cars
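
Both interval estimates at xp = 3 can be reproduced by continuing the sketch; it reuses s, t_crit, sxx, and x_bar computed earlier, and the standard-error expressions shown here are the usual simple-regression formulas.

```python
# Interval estimates at x_p = 3 TV ads (continues the sketch above).
x_p = 3
y_p = b0 + b1 * x_p                          # point estimate: 25 cars

h_p = 1 / n + (x_p - x_bar) ** 2 / sxx       # 0.45
s_yhat = s * h_p ** 0.5                      # ≈ 1.4491, std. error of the estimate of E(y_p)
s_ind = s * (1 + h_p) ** 0.5                 # ≈ 2.6013, std. error of an individual prediction

ci = (y_p - t_crit * s_yhat, y_p + t_crit * s_yhat)   # ≈ (20.39, 29.61)
pi = (y_p - t_crit * s_ind, y_p + t_crit * s_ind)     # ≈ (16.72, 33.28)
print(ci, pi)
```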

Residual Analysis
- Residual for Observation i: yi - ŷi
- Standardized Residual for Observation i: (yi - ŷi) / s_(yi - ŷi)
where:
  s_(yi - ŷi) = s√(1 - hi) and hi = 1/n + (xi - x̄)²/Σ(xi - x̄)²

Example: Reed Auto Sales
- Residuals
  TV Ads (x)   Cars Sold (y)   Predicted (ŷ = 10 + 5x)   Residual (y - ŷ)
  1            14              15                         -1
  3            24              25                         -1
  2            18              20                         -2
  1            17              15                          2
  3            27              25                          2
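
A short continuation of the sketch computes each residual and its standardized value, using the leverage expression hi = 1/n + (xi - x̄)²/Σ(xi - x̄)² from the previous slide.

```python
# Residuals and standardized residuals (continues the sketch above).
for xi, yi, yh in zip(x, y, y_hat):
    resid = yi - yh                               # y_i - y_hat_i
    h_i = 1 / n + (xi - x_bar) ** 2 / sxx         # leverage of observation i
    std_resid = resid / (s * (1 - h_i) ** 0.5)    # standardized residual
    print(f"x={xi}  y={yi}  y_hat={yh:.0f}  residual={resid:+.0f}  std={std_resid:+.2f}")
```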

Example: Reed Auto Sales
- Residual Plot: residuals (y - ŷ) plotted against the number of TV ads x.

Residual Analysis
- Residual Plot, Good Pattern: residuals scattered randomly in a horizontal band around zero across the range of x.

Residual Analysis
- Residual Plot, Nonconstant Variance: the spread of the residuals changes as x increases.

Residual Analysis
- Residual Plot, Model Form Not Adequate: the residuals show a systematic curved pattern rather than random scatter.