
Lecture 23 Multiple Regression (Sections 19.3-19.4)


1 Lecture 23 Multiple Regression (Sections 19.3-19.4)

2 Multiple Regression Model Multiple regression model: y = β0 + β1x1 + β2x2 + … + βkxk + ε. Required conditions:
–The regression function is a linear function of the independent variables x1,…,xk (the multiple regression line does not systematically overestimate or underestimate y for any combination of x1,…,xk).
–The error ε is normally distributed.
–The standard deviation of the error is a constant (σε) for all values of the x's.
–The errors are independent.
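As a quick illustration of the model y = β0 + β1x1 + … + βkxk + ε, the sketch below simulates data satisfying the four conditions (linearity, normal errors, constant σε, independence) and recovers the coefficients by least squares. All names and values here are hypothetical, not from the La Quinta data.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 2                          # 100 observations, 2 predictors (hypothetical)
X = rng.normal(size=(n, k))
beta = np.array([5.0, 1.5, -2.0])      # true beta0, beta1, beta2 (hypothetical)
eps = rng.normal(scale=1.0, size=n)    # normal, independent errors with constant sigma
y = beta[0] + X @ beta[1:] + eps

# Fit by least squares: prepend an intercept column and solve for b
Xd = np.column_stack([np.ones(n), X])
b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
print(b)   # estimates close to (5.0, 1.5, -2.0)
```

With n = 100 and error standard deviation 1, the estimates land close to the true coefficients, which is the behavior the required conditions guarantee.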

3 Data were collected from 100 randomly selected inns that belong to La Quinta, and the following model was run: Margin = β0 + β1Rooms + β2Nearest + β3Office + β4College + β5Income + β6Disttwn + ε. Estimating the Coefficients and Assessing the Model, Example (Xm19-01)

4 Model Assessment The model is assessed using three tools: –The standard error of estimate –The coefficient of determination –The F-test of the analysis of variance

5 We pose the question: Is there at least one independent variable linearly related to the dependent variable (are any of the x's useful in predicting y)? To answer the question we test the hypotheses H0: β1 = β2 = … = βk = 0 versus H1: at least one βi is not equal to zero. If at least one βi is not equal to zero, the model has some validity. Testing the Validity of the Model

6 The hypotheses are tested by an ANOVA procedure. Testing the Validity of the La Quinta Inns Regression Model

7 [Total variation in y] = SSR + SSE. If SSR is large relative to SSE, much of the variation in y is explained by the regression model; the model is useful and thus the null hypothesis should be rejected. Thus, we reject H0 for large F. Rejection region: F > Fα,k,n−k−1. Testing the Validity of the La Quinta Inns Regression Model
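The ANOVA F statistic behind this test is F = (SSR/k) / (SSE/(n − k − 1)). A minimal sketch with hypothetical sums of squares (not the La Quinta values), using the critical value 2.17 quoted on the next slide:

```python
# Hypothetical sums of squares for illustration only
SSR, SSE = 3000.0, 2700.0
n, k = 100, 6

MSR = SSR / k                # mean square due to regression
MSE = SSE / (n - k - 1)      # mean square error, d.f. = 93
F = MSR / MSE

# Reject H0 at alpha = 0.05 if F exceeds F_{0.05,6,93} = 2.17 (from the slides)
reject = F > 2.17
print(round(F, 2), reject)
```

Here F is far above 2.17, so this hypothetical model would be judged valid, exactly as in the La Quinta test that follows.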

8 F ,k,n-k-1 = F 0.05,6,100-6-1 =2.17 F = 17.14 > 2.17 Also, the p-value (Significance F) = 0.0000 Reject the null hypothesis. Testing the Validity of the La Quinta Inns Regression Model Conclusion: There is sufficient evidence to reject the null hypothesis in favor of the alternative hypothesis. At least one of the  i is not equal to zero. Thus, at least one independent variable is linearly related to y. This linear regression model is valid

9 Relationships among SSE, R², and F
SSE    R²      F       Assessment of model
Large  0       0       Model has no explanatory power
Small  Large   Large   Model explains much of the variation in y
0      1       Very large  Model fits the data perfectly
As SSE decreases, R² increases toward 1 and F increases, so a large F signals a useful model.
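The relationship can be checked numerically: F = (R²/k) / ((1 − R²)/(n − k − 1)), so the slide's F = 17.14 with k = 6 and n = 100 pins down the R² of the La Quinta fit.

```python
# Invert the F / R^2 relationship: from F = 17.14 (slide 8), k = 6, n = 100,
# recover the implied R^2, then recompute F from it as a consistency check.
F, k, n = 17.14, 6, 100
df2 = n - k - 1                        # 93 error degrees of freedom

R2 = F * k / (F * k + df2)             # implied R^2 = SSR/SST
F_back = (R2 / k) / ((1 - R2) / df2)   # recomputing F from R^2

print(round(R2, 4), round(F_back, 2))  # R^2 is about 0.525
```

So the printout's F of 17.14 corresponds to the model explaining roughly half the variation in operating margin.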

10 b0 = 38.14. This is the intercept, the value of y when all the variables take the value zero. Since the data ranges of the independent variables do not include zero, do not interpret the intercept. b1 = −0.0076. In this model, for each additional room within 3 miles of the La Quinta inn, the operating margin decreases on average by 0.0076% (assuming the other variables are held constant). Interpreting the Coefficients

11 b2 = 1.65. In this model, for each additional mile between a La Quinta inn and its nearest competitor, the operating margin increases on average by 1.65% when the other variables are held constant. b3 = 0.020. For each additional 1,000 sq ft of office space, the operating margin increases on average by 0.02% when the other variables are held constant. b4 = 0.21. For each additional thousand students, the operating margin increases on average by 0.21% when the other variables are held constant. Interpreting the Coefficients

12 b5 = 0.41. For each additional $1,000 of median household income, the operating margin increases on average by 0.41% when the other variables remain constant. b6 = −0.23. For each additional mile to the downtown center, the operating margin decreases on average by 0.23% when the other variables are held constant. Interpreting the Coefficients

13 Testing the Coefficients The hypotheses for each βi are H0: βi = 0 versus H1: βi ≠ 0. Test statistic: t = (bi − 0) / s_bi = bi / s_bi, with d.f. = n − k − 1. (See the JMP printout.)
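A sketch of the t test for one coefficient. The coefficient −0.0076 is the slide's b1, but the standard error s_b1 below is an assumed value for illustration, not read from the JMP printout.

```python
# b1 from the slides; s_b1 is a hypothetical standard error for illustration
b1, s_b1 = -0.0076, 0.0034
n, k = 100, 6

t = (b1 - 0) / s_b1     # test statistic for H0: beta_1 = 0 vs H1: beta_1 != 0
df = n - k - 1          # 93 degrees of freedom
print(round(t, 3), df)
```

A t around −2.2 on 93 degrees of freedom lies outside ±1.986 (the two-sided 5% cutoff), so under these assumed numbers Rooms would be judged significant.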

14 Confidence Intervals for Coefficients Note that the test of H0: βi = 0 is a test of whether xi helps to predict y given x1,…,xi−1, xi+1,…,xk. Results of the test might change as we change the other independent variables in the model. A confidence interval for βi is bi ± tα/2,n−k−1 s_bi. In the La Quinta data, a 95% confidence interval for β1 (the coefficient on number of rooms) is b1 ± t0.025,93 s_b1, computed from the printout values.
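The interval bi ± tα/2 s_bi can be computed directly. As above, s_b1 is an assumed value for illustration, and t0.025,93 ≈ 1.986 comes from a t table.

```python
b1, s_b1 = -0.0076, 0.0034   # b1 from the slides; s_b1 assumed for illustration
t_crit = 1.986               # t_{0.025, 93} from a t table

lo = b1 - t_crit * s_b1
hi = b1 + t_crit * s_b1
print(round(lo, 4), round(hi, 4))
# The whole interval lies below zero, consistent with a significantly
# negative rooms coefficient (under the assumed standard error).
```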

15 The model can be used for making predictions by
–producing a prediction interval estimate for a particular value of y, for given values of the xi;
–producing a confidence interval estimate for the expected value of y, for given values of the xi.
The model can also be used to learn about relationships between the independent variables xi and the dependent variable y, by interpreting the coefficients βi. Using the Linear Regression Equation

16 Predict the average operating margin of an inn at a site with the following characteristics: –3815 rooms within 3 miles, –closest competitor 0.9 miles away, –476,000 sq ft of office space, –24,500 college students, –$35,000 median household income, –11.2 miles to the downtown center. MARGIN = 38.14 − 0.0076(3815) + 1.65(0.9) + 0.020(476) + 0.21(24.5) + 0.41(35) − 0.23(11.2) = 37.1% (Xm19-01) La Quinta Inns, Predictions
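The slide's arithmetic, with the coefficients from slides 10–12 and the site characteristics expressed in the model's units (office space and students in thousands, income in $1,000s):

```python
# Coefficients b0..b6 from the slides
b = [38.14, -0.0076, 1.65, 0.020, 0.21, 0.41, -0.23]
# Site values: Rooms, Nearest, Office, College, Income, Disttwn (model units)
x = [3815, 0.9, 476, 24.5, 35, 11.2]

margin = b[0] + sum(bi * xi for bi, xi in zip(b[1:], x))
print(round(margin, 1))   # 37.1
```

Note the unit conversions: 476,000 sq ft enters as 476, 24,500 students as 24.5, and $35,000 as 35.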

17 Prediction Intervals and Confidence Intervals for the Mean Prediction interval for y given x1,…,xk: ŷ ± tα/2,n−k−1 × (standard error of prediction). Confidence interval for the mean of y given x1,…,xk: ŷ ± tα/2,n−k−1 × (standard error of ŷ). For the inn with the characteristics on the previous slide: Confidence interval for the mean = (32.970, 41.213). Prediction interval = (25.395, 48.788).
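Using the two intervals from the slide, a quick check shows both are centered at the same point prediction while the prediction interval is much wider, since predicting one inn's margin carries the extra error variance on top of the uncertainty in the estimated mean.

```python
ci = (32.970, 41.213)   # confidence interval for the mean margin (slide values)
pi = (25.395, 48.788)   # prediction interval for an individual inn (slide values)

ci_width = ci[1] - ci[0]
pi_width = pi[1] - pi[0]
ci_mid = sum(ci) / 2
pi_mid = sum(pi) / 2

print(round(ci_width, 3), round(pi_width, 3))   # 8.243 vs 23.393
```

Both midpoints sit at about 37.09, matching the 37.1% point prediction from the previous slide.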

18 19.4 Regression Diagnostics - II The conditions required for the model assessment to apply must be checked.
–Is the error variable normally distributed? Draw a histogram of the residuals.
–Is the regression function correctly specified as a linear function of x1,…,xk? Plot the residuals versus the x's and versus ŷ.
–Is the error variance constant? Plot the residuals versus ŷ.
–Are the errors independent? Plot the residuals versus the time periods.
–Can we identify outliers?
–Is multicollinearity a problem?

19 Multicollinearity Condition in which independent variables are highly correlated. Multicollinearity causes two kinds of difficulties:
–The t statistics appear to be too small.
–The β coefficients cannot be interpreted as “slopes”.
Diagnostics:
–High correlation between independent variables
–Counterintuitive signs on regression coefficients
–Low values for t statistics despite a significant overall fit, as measured by the F statistic
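A synthetic sketch of the first diagnostic: when one predictor nearly duplicates another, their pairwise correlation (and the variance inflation factor 1/(1 − r²)) flags the problem before any coefficients are interpreted. All data here are simulated, not from the examples.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # x2 nearly duplicates x1 (synthetic)

r = np.corrcoef(x1, x2)[0, 1]   # correlation between the two predictors
vif = 1 / (1 - r**2)            # variance inflation factor for this pair
print(round(r, 3), round(vif, 1))
```

A correlation near 1 and a VIF far above the common rule-of-thumb cutoff of 10 are exactly the pattern behind the house-price example that follows: a significant F with no significant individual t statistics.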

20 Diagnostics: Multicollinearity Example 19.2: Predicting house price (Xm19-02) –A real estate agent believes that a house's selling price can be predicted using the house size, number of bedrooms, and lot size. –A random sample of 100 houses was drawn and the data recorded. –Analyze the relationship among the four variables.

21 The proposed model is PRICE = β0 + β1BEDROOMS + β2H-SIZE + β3LOTSIZE + ε. The model is valid, but no variable is significantly related to the selling price?! Diagnostics: Multicollinearity

22 Multicollinearity is found to be the problem. Diagnostics: Multicollinearity Multicollinearity causes two kinds of difficulties:
–The t statistics appear to be too small.
–The β coefficients cannot be interpreted as “slopes”.

