Essentials of Modern Business Statistics (7e)

Essentials of Modern Business Statistics (7e) Anderson, Sweeney, Williams, Camm, Cochran © 2018 Cengage Learning

Chapter 15 Multiple Regression
Multiple Regression Model
Least Squares Method
Multiple Coefficient of Determination
Model Assumptions
Testing for Significance
Using the Estimated Regression Equation for Estimation and Prediction
Categorical Independent Variables
Residual Analysis

Multiple Regression In this chapter we continue our study of regression analysis by considering situations involving two or more independent variables. This subject area, called multiple regression analysis, enables us to consider more factors and thus obtain better estimates than are possible with simple linear regression.

Multiple Regression Model The equation that describes how the dependent variable y is related to the independent variables x1, x2, . . . , xp and an error term is:
y = β0 + β1x1 + β2x2 + . . . + βpxp + ε
where: β0, β1, β2, . . . , βp are the parameters, and ε is a random variable called the error term

Multiple Regression Equation The equation that describes how the mean value of y is related to x1, x2, . . . , xp is:
E(y) = β0 + β1x1 + β2x2 + . . . + βpxp

Estimated Multiple Regression Equation
ŷ = b0 + b1x1 + b2x2 + . . . + bpxp
A simple random sample is used to compute the sample statistics b0, b1, b2, . . . , bp that are used as the point estimators of the parameters β0, β1, β2, . . . , βp.

Estimation Process

Least Squares Method Least Squares Criterion:
min Σ(yi − ŷi)²
Computation of Coefficient Values: The formulas for the regression coefficients b0, b1, b2, . . . , bp involve the use of matrix algebra. We will rely on computer software packages to perform the calculations. The emphasis will be on how to interpret the computer output rather than on how to make the multiple regression computations.
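As a sketch of what such software does internally, the coefficients solve the normal equations b = (XᵀX)⁻¹Xᵀy, where X is the design matrix (a column of 1s followed by the columns of independent variables). The small dataset below is invented purely for illustration and is not from the textbook:

```python
import numpy as np

# Illustrative data (not from the textbook): y = 2 + 3*x1 - 1*x2 exactly,
# so least squares should recover these coefficients.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = 2 + 3 * x1 - 1 * x2

# Design matrix: intercept column, then x1 and x2.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Solve the normal equations (X'X) b = X'y with a stable linear solver.
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # approximately [2., 3., -1.]
```

In practice libraries use more numerically stable factorizations than forming XᵀX directly, which is one reason the textbook leaves the computation to software.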

Multiple Regression Model Example: Butler Trucking Company Managers at Butler Trucking Company want to develop better work schedules for their drivers. They believe that total daily travel time is closely related to the number of miles travelled in making the daily deliveries and also to the number of deliveries made. A simple random sample of 10 driving assignments was taken.

Multiple Regression Model Example: Butler Trucking Company

Driving Assignment   Miles Travelled (x1)   Deliveries (x2)   Travel Time in Hours (y)
 1   100   4   9.3
 2    50   3   4.8
 3   100   4   8.9
 4   100   2   6.5
 5    50   2   4.2
 6    80   2   6.2
 7    75   3   7.4
 8    65   4   6.0
 9    90   3   7.6
10    90   2   6.1

Multiple Regression Model Example: Butler Trucking Company Suppose we believe that total daily travel time (y) is related to the miles travelled (x1) and the number of deliveries made (x2) by the following regression model:
y = β0 + β1x1 + β2x2 + ε
where: y = total travel time, x1 = miles travelled, x2 = deliveries made

Solving for the Estimates of β0, β1, β2 Example: Butler Trucking Company The input data (the x1, x2, and y values for the 10 driving assignments) are entered into a computer package for solving multiple regression problems, and the least squares output reports b0, b1, b2, R², etc.

Solving for the Estimates of β0, β1, β2 Example: Butler Trucking Company

Estimated Regression Equation Example: Butler Trucking Company
ŷ = -.8687 + .0611x1 + .9234x2
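As a check, fitting the sample data shown earlier by ordinary least squares reproduces these coefficients. This NumPy sketch stands in for the spreadsheet output the textbook uses; the data are the sample values from the table above:

```python
import numpy as np

# Butler Trucking sample data: miles travelled, deliveries, travel time in hours.
miles      = np.array([100, 50, 100, 100, 50, 80, 75, 65, 90, 90], dtype=float)
deliveries = np.array([  4,  3,   4,   2,  2,  2,  3,  4,  3,  2], dtype=float)
time       = np.array([9.3, 4.8, 8.9, 6.5, 4.2, 6.2, 7.4, 6.0, 7.6, 6.1])

# Design matrix with an intercept column, then least squares fit.
X = np.column_stack([np.ones_like(miles), miles, deliveries])
b, *_ = np.linalg.lstsq(X, time, rcond=None)

print(np.round(b, 4))  # approximately [-0.8687, 0.0611, 0.9234], the slide's values
```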

Interpreting the Coefficients In multiple regression analysis, we interpret each regression coefficient as follows: bi represents an estimate of the change in y corresponding to a 1-unit change in xi when all other independent variables are held constant.

Interpreting the Coefficients Example: Butler Trucking Company b1 = .0611 .0611 is an estimate of the expected increase in travel time corresponding to an increase of one mile in the distance travelled when the number of deliveries is held constant.

Interpreting the Coefficients Example: Butler Trucking Company b2 = .9234 .9234 is an estimate of the expected increase in travel time corresponding to an increase of one delivery when the number of miles travelled is held constant.

Multiple Coefficient of Determination Relationship Among SST, SSR, SSE:
SST = SSR + SSE
Σ(yi − ȳ)² = Σ(ŷi − ȳ)² + Σ(yi − ŷi)²
where: SST = total sum of squares, SSR = sum of squares due to regression, SSE = sum of squares due to error

Multiple Coefficient of Determination Example: Butler Trucking Company ANOVA Output

Multiple Coefficient of Determination Example: Butler Trucking Company R2 = SSR/SST R2 = 21.6006 / 23.9 = .9038

Adjusted Multiple Coefficient of Determination Adding independent variables, even ones that are not statistically significant, causes the prediction errors to become smaller, thus reducing the sum of squares due to error, SSE. Because SSR = SST – SSE, when SSE becomes smaller, SSR becomes larger, causing R2 = SSR/SST to increase. The adjusted multiple coefficient of determination compensates for the number of independent variables in the model.

Adjusted Multiple Coefficient of Determination Example: Butler Trucking Company
Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1)
Ra² = 1 − (1 − .9038)(10 − 1)/(10 − 2 − 1) = .8763
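Both calculations are simple enough to reproduce directly. A minimal sketch in Python, using the SSR and SST values from the ANOVA output above:

```python
# ANOVA values from the Butler Trucking output (see the slides above).
SSR, SST = 21.6006, 23.9
n, p = 10, 2  # sample size and number of independent variables

r_squared = SSR / SST
adj_r_squared = 1 - (1 - r_squared) * (n - 1) / (n - p - 1)

print(round(r_squared, 4))      # 0.9038
print(round(adj_r_squared, 4))  # 0.8763
```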

Assumptions About the Error Term ε
1. The error ε is a random variable with mean of zero.
2. The variance of ε, denoted by σ², is the same for all values of the independent variables.
3. The values of ε are independent.
4. The error ε is a normally distributed random variable reflecting the deviation between the y value and the expected value of y given by β0 + β1x1 + β2x2 + . . . + βpxp.

Testing for Significance In simple linear regression, the F and t tests provide the same conclusion. In multiple regression, the F and t tests have different purposes.

Testing for Significance: F Test The F test is used to determine whether a significant relationship exists between the dependent variable and the set of all the independent variables. The F test is referred to as the test for overall significance.

Testing for Significance: t Test If the F test shows an overall significance, the t test is used to determine whether each of the individual independent variables is significant. A separate t test is conducted for each of the independent variables in the model. We refer to each of these t tests as a test for individual significance.

Testing for Significance: F Test
Hypotheses:
H0: β1 = β2 = . . . = βp = 0
Ha: One or more of the parameters is not equal to zero
Test Statistic: F = MSR/MSE
Rejection Rule: Reject H0 if p-value < α or if F ≥ Fα, where Fα is based on an F distribution with p degrees of freedom in the numerator and n − p − 1 degrees of freedom in the denominator.

Testing for Significance: F Test Example: Butler Trucking Company
Hypotheses:
H0: β1 = β2 = 0
Ha: One or both of the parameters is not equal to zero
Rejection Rule: For α = .01 and d.f. = 2, 7; F.01 = 9.55. Reject H0 if p-value < .01 or F ≥ 9.55.

F Test for Overall Significance Example: Butler Trucking Company ANOVA Output p-value used to test for overall significance

F Test for Overall Significance Example: Butler Trucking Company
Test Statistic: F = MSR/MSE = 10.8003/.3285 = 32.9
Conclusion: p-value < .01, so we can reject H0. (Also, F = 32.9 > 9.55.)
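The F statistic can be reproduced from the sums of squares alone. A pure-Python sketch using the ANOVA values from the slides above:

```python
# ANOVA values for Butler Trucking (from the slides above).
SSR, SST = 21.6006, 23.9
n, p = 10, 2

SSE = SST - SSR          # sum of squares due to error
MSR = SSR / p            # mean square due to regression: 10.8003
MSE = SSE / (n - p - 1)  # mean square error: about .3285
F = MSR / MSE

print(round(F, 1))  # 32.9
F_crit = 9.55       # F.01 with (2, 7) degrees of freedom, from the F table
print(F > F_crit)   # True, so H0 is rejected
```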

Testing for Significance: t Test
Hypotheses: H0: βi = 0, Ha: βi ≠ 0
Test Statistic: t = bi/sbi, where sbi is the estimated standard deviation of bi
Rejection Rule: Reject H0 if p-value < α or if t < −tα/2 or t > tα/2, where tα/2 is based on a t distribution with n − p − 1 degrees of freedom.

Testing for Significance: t Test Example: Butler Trucking Company
Hypotheses: H0: βi = 0, Ha: βi ≠ 0
Rejection Rule: For α = .01 and d.f. = 7, t.005 = 3.499. Reject H0 if p-value < .01, or if t < −3.499 or t > 3.499.

t Test for Significance of Individual Parameters Example: Butler Trucking Company Regression Equation Output

t Test for Significance of Individual Parameters
Test Statistics:
t = b1/sb1 = .0611/.0099 = 6.1717
t = b2/sb2 = .9234/.2211 = 4.1764
Conclusions: Reject both H0: β1 = 0 and H0: β2 = 0. Both independent variables are significant.
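The two t statistics follow directly from the coefficient estimates and their standard errors reported in the regression output above:

```python
# Coefficient estimates and standard errors from the regression output above.
b1, se_b1 = 0.0611, 0.0099
b2, se_b2 = 0.9234, 0.2211
t_crit = 3.499  # t.005 with 7 degrees of freedom

t1 = b1 / se_b1
t2 = b2 / se_b2

print(round(t1, 4))  # 6.1717
print(round(t2, 4))  # 4.1764
print(abs(t1) > t_crit and abs(t2) > t_crit)  # True: both significant
```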

Testing for Significance: Multicollinearity The term multicollinearity refers to the correlation among the independent variables. When the independent variables are highly correlated (say, |r | > .7), it is not possible to determine the separate effect of any particular independent variable on the dependent variable.

Testing for Significance: Multicollinearity If the estimated regression equation is to be used only for predictive purposes, multicollinearity is usually not a serious problem. Every attempt should be made to avoid including independent variables that are highly correlated.

Using the Estimated Regression Equation for Estimation and Prediction The procedures for estimating the mean value of y and predicting an individual value of y in multiple regression are similar to those in simple regression. We substitute the given values of x1, x2, . . . , xp into the estimated regression equation and use the corresponding value of ŷ as the point estimate.

Using the Estimated Regression Equation for Estimation and Prediction The formulas required to develop interval estimates for the mean value of y and for an individual value of y are beyond the scope of the textbook. Software packages for multiple regression will often provide these interval estimates.
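For instance, a point estimate follows directly from the estimated Butler Trucking equation. The assignment below (100 miles, 2 deliveries) is a hypothetical input chosen here for illustration:

```python
# Estimated regression equation for Butler Trucking (from the slides above).
b0, b1, b2 = -0.8687, 0.0611, 0.9234

def predicted_travel_time(miles, deliveries):
    """Point estimate of travel time (hours) from the estimated equation."""
    return b0 + b1 * miles + b2 * deliveries

# Hypothetical assignment: 100 miles travelled, 2 deliveries.
print(round(predicted_travel_time(100, 2), 4))  # 7.0881 hours
```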

Categorical Independent Variables In many situations we must work with categorical independent variables such as gender (male, female), method of payment (cash, check, credit card), etc. For example, x2 might represent gender where x2 = 0 indicates male and x2 = 1 indicates female. In this case, x2 is called a dummy or indicator variable.

Categorical Independent Variables Example: Johnson Filtration, Inc. Managers of Johnson Filtration, Inc. want to predict the repair time necessary for processing its maintenance requests. Repair time is believed to be related to two factors, the number of months since last service and the type of repair problem (mechanical or electrical). Data for a sample of 10 service calls are reported in the table below.

Categorical Independent Variables Example: Johnson Filtration, Inc.

Service Call   Months Since Last Service   Type of Repair   Repair Time (hours)
 1   2   Electrical   2.9
 2   6   Mechanical   3.0
 3   8   Electrical   4.8
 4   3   Mechanical   1.8
 5   2   Electrical   2.9
 6   7   Electrical   4.9
 7   9   Mechanical   4.2
 8   8   Mechanical   4.8
 9   4   Electrical   4.4
10   6   Electrical   4.5

Categorical Independent Variables Example: Johnson Filtration, Inc. The regression model is:
y = β0 + β1x1 + β2x2 + ε
where: y = repair time in hours, x1 = number of months since last maintenance service, x2 = 0 if the type of repair is mechanical and 1 if the type of repair is electrical (x2 is a dummy variable)
Regression Equation Output

Categorical Independent Variables Example: Johnson Filtration, Inc. ANOVA Output
R² = SSR/SST = 9.0009/10.476 = .8592
Ra² = 1 − (1 − .8592)(10 − 1)/(10 − 2 − 1) = .8190
This indicates the estimated regression equation does a good job of explaining the variability in repair times.
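Fitting the Johnson Filtration data with electrical repairs coded as x2 = 1 reproduces this R². The NumPy sketch below stands in for the spreadsheet output, using the sample data from the table above:

```python
import numpy as np

# Johnson Filtration sample data; repair type coded 1 = electrical, 0 = mechanical.
months      = np.array([2, 6, 8, 3, 2, 7, 9, 8, 4, 6], dtype=float)
repair_type = np.array([1, 0, 1, 0, 1, 1, 0, 0, 1, 1], dtype=float)
hours       = np.array([2.9, 3.0, 4.8, 1.8, 2.9, 4.9, 4.2, 4.8, 4.4, 4.5])

X = np.column_stack([np.ones_like(months), months, repair_type])
b, *_ = np.linalg.lstsq(X, hours, rcond=None)

residuals = hours - X @ b
sse = np.sum(residuals ** 2)
sst = np.sum((hours - hours.mean()) ** 2)
r_squared = 1 - sse / sst
print(round(r_squared, 4))  # approximately 0.8592
```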

Categorical Independent Variables Example: Johnson Filtration, Inc. Conclusion: Both months since service and the type of repair are significant.

More Complex Categorical Variables If a categorical variable has k levels, k - 1 dummy variables are required, with each dummy variable being coded as 0 or 1. For example, a variable with levels A, B, and C could be represented by x1 and x2 values of (0, 0) for A, (1, 0) for B, and (0, 1) for C. Care must be taken in defining and interpreting the dummy variables.

More Complex Categorical Variables Example: Sales regions A variable indicating sales regions could be represented by x1 and x2 values as follows: Region x1 x2 A 0 0 B 1 0 C 0 1
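The k − 1 coding scheme in the table above can be sketched directly; region A serves as the baseline level with all dummies set to zero:

```python
# Dummy coding for a 3-level categorical variable (k = 3, so k - 1 = 2 dummies).
# Region A is the baseline level (all dummies zero), matching the table above.
def region_dummies(region):
    """Map a sales region to its (x1, x2) dummy coding."""
    coding = {"A": (0, 0), "B": (1, 0), "C": (0, 1)}
    return coding[region]

print(region_dummies("B"))  # (1, 0)
```

Using k dummies instead of k − 1 would make the design matrix columns linearly dependent with the intercept, which is why one level is always left as the baseline.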

Residual Analysis For simple linear regression the residual plot against ŷ and the residual plot against x provide the same information. In multiple regression analysis it is preferable to use the residual plot against ŷ to determine if the model assumptions are satisfied.

Standardized Residual Plot Against ŷ Standardized residuals are frequently used in residual plots for purposes of:
Identifying outliers (typically, standardized residuals < −2 or > +2)
Providing insight about the assumption that the error term ε has a normal distribution
The computation of the standardized residuals in multiple regression analysis is too complex to be done by hand, but Excel's Regression tool can be used.
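One common definition divides each residual by its estimated standard deviation, which involves the leverage values hii from the hat matrix. The NumPy sketch below applies this to the Butler Trucking data; note that software packages differ slightly in how they define standardized residuals, so treat this as an approximation of what a given tool reports:

```python
import numpy as np

# Butler Trucking data (from the example above).
miles      = np.array([100, 50, 100, 100, 50, 80, 75, 65, 90, 90], dtype=float)
deliveries = np.array([  4,  3,   4,   2,  2,  2,  3,  4,  3,  2], dtype=float)
time       = np.array([9.3, 4.8, 8.9, 6.5, 4.2, 6.2, 7.4, 6.0, 7.6, 6.1])

X = np.column_stack([np.ones_like(miles), miles, deliveries])
n, k = X.shape  # k counts the intercept plus the p = 2 independent variables

b, *_ = np.linalg.lstsq(X, time, rcond=None)
residuals = time - X @ b

# Leverage values: diagonal of the hat matrix H = X (X'X)^{-1} X'.
H = X @ np.linalg.inv(X.T @ X) @ X.T
leverage = np.diag(H)

s = np.sqrt(np.sum(residuals ** 2) / (n - k))  # standard error of the estimate
standardized = residuals / (s * np.sqrt(1 - leverage))

# Flag potential outliers: standardized residuals below -2 or above +2.
print(np.where(np.abs(standardized) > 2)[0])
```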

Standardized Residual Plot Against 𝑦 Example: Butler Trucking Company – Residual output

Standardized Residual Plot Against 𝑦 Example: Butler Trucking Company

End of Chapter 15