Chapter 14: Multiple Regression
Copyright © 2011 by The McGraw-Hill Companies, Inc. All rights reserved. McGraw-Hill/Irwin

Multiple Regression
14.1 The Multiple Regression Model and the Least Squares Point Estimate
14.2 Model Assumptions and the Standard Error
14.3 R² and Adjusted R²
14.4 The Overall F Test
14.5 Testing the Significance of an Independent Variable

Multiple Regression (Continued)
14.6 Confidence and Prediction Intervals
14.7 The Sales Territory Performance Case
14.8 Using Dummy Variables to Model Qualitative Independent Variables
14.9 The Partial F Test: Testing the Significance of a Portion of a Regression Model
14.10 Residual Analysis in Multiple Regression

The Multiple Regression Model and the Least Squares Point Estimate
Simple linear regression used one independent variable to explain the dependent variable
Some relationships are too complex to be described using a single independent variable
Multiple regression uses two or more independent variables to describe the dependent variable
This allows multiple regression models to handle more complex situations
There is no limit to the number of independent variables a model can use
Multiple regression has only one dependent variable
LO 1: Explain the multiple regression model and the related least squares point estimates.

The Multiple Regression Model
The linear regression model relating y to x1, x2, …, xk is y = β0 + β1x1 + β2x2 + … + βkxk + ε
µy = β0 + β1x1 + β2x2 + … + βkxk is the mean value of the dependent variable y when the values of the independent variables are x1, x2, …, xk
β0, β1, β2, …, βk are the unknown regression parameters relating the mean value of y to x1, x2, …, xk
ε is an error term that describes the effects on y of all factors other than the independent variables x1, x2, …, xk
LO1
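The deck itself contains no code, but a minimal numpy sketch may make the least squares point estimates b0, b1, …, bk concrete. The tiny dataset, the variable names, and the use of numpy are illustrative assumptions, not part of the original slides.

```python
import numpy as np

# Illustrative data (made up for demonstration): n = 6 observations,
# k = 2 independent variables x1 and x2, one dependent variable y.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.1, 4.0, 6.2, 6.8, 9.1, 9.7])

# Design matrix with a leading column of ones for the intercept b0.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least squares point estimates b0, b1, b2.
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("b0, b1, b2 =", b)

# Point prediction y-hat = b0 + b1*x1 + b2*x2 for each observation.
y_hat = X @ b
print("fitted values:", y_hat)
```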

Model Assumptions and the Standard Error
The model is y = β0 + β1x1 + β2x2 + … + βkxk + ε
The assumptions for multiple regression are stated about the model error terms, the ε's
LO 2: Explain the assumptions behind multiple regression and calculate the standard error.

The Regression Model Assumptions (Continued)
1. Mean of Zero Assumption: The mean of the error terms is equal to 0
2. Constant Variance Assumption: The variance of the error terms, σ², is the same for every combination of values of x1, x2, …, xk
3. Normality Assumption: The error terms follow a normal distribution for every combination of values of x1, x2, …, xk
4. Independence Assumption: The values of the error terms are statistically independent of each other
LO2
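LO 2 also asks for the standard error, but the transcript does not include the formula slide. A minimal sketch of the usual estimate, s = sqrt(SSE / (n − (k + 1))), follows; the data are the same illustrative numbers as in the earlier sketch and are not from the original deck.

```python
import numpy as np

# Same illustrative data as before: n = 6 observations, k = 2 predictors.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.1, 4.0, 6.2, 6.8, 9.1, 9.7])
X  = np.column_stack([np.ones_like(x1), x1, x2])

b, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ b

n, k = len(y), 2
SSE = np.sum(residuals**2)        # unexplained variation (sum of squared errors)
mse = SSE / (n - (k + 1))         # mean square error, a point estimate of sigma^2
s = np.sqrt(mse)                  # standard error, a point estimate of sigma
print("SSE =", SSE, " s =", s)
```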

R² and Adjusted R²
1. Total variation is given by the formula Σ(yi − ȳ)²
2. Explained variation is given by the formula Σ(ŷi − ȳ)²
3. Unexplained variation is given by the formula Σ(yi − ŷi)²
4. Total variation is the sum of explained and unexplained variation
This section can be read any time after reading Section 14.1
LO 3: Calculate and interpret the multiple and adjusted multiple coefficients of determination.
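The three variations above translate directly into code. A minimal sketch, again with illustrative data, computes R² = explained variation / total variation and the adjusted R², which penalizes for the number of independent variables k.

```python
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.1, 4.0, 6.2, 6.8, 9.1, 9.7])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b

n, k = len(y), 2
total_var       = np.sum((y - y.mean())**2)        # total variation
explained_var   = np.sum((y_hat - y.mean())**2)    # explained variation
unexplained_var = np.sum((y - y_hat)**2)           # unexplained variation

r2 = explained_var / total_var
adj_r2 = 1 - (unexplained_var / (n - (k + 1))) / (total_var / (n - 1))
print("R^2 =", r2, " adjusted R^2 =", adj_r2)
```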

The Overall F Test
To test H0: β1 = β2 = … = βk = 0 versus Ha: At least one of β1, β2, …, βk ≠ 0
The test statistic is F(model) = (Explained variation / k) / (Unexplained variation / (n − (k + 1)))
Reject H0 in favor of Ha if F(model) > Fα or p-value < α
Fα is based on k numerator and n − (k + 1) denominator degrees of freedom
LO 4: Test the significance of a multiple regression model by using an F test.
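A short sketch of the overall F test, using scipy for the F distribution; the data and the choice of α = 0.05 are illustrative assumptions.

```python
import numpy as np
from scipy import stats

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.1, 4.0, 6.2, 6.8, 9.1, 9.7])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b

n, k = len(y), 2
explained   = np.sum((y_hat - y.mean())**2)
unexplained = np.sum((y - y_hat)**2)

# F(model) = (explained variation / k) / (unexplained variation / (n - (k + 1)))
F = (explained / k) / (unexplained / (n - (k + 1)))
p_value = stats.f.sf(F, k, n - (k + 1))        # area to the right of F(model)
F_crit = stats.f.ppf(0.95, k, n - (k + 1))     # F_alpha for alpha = 0.05
print("F =", F, " p-value =", p_value, " reject H0:", F > F_crit)
```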

Testing the Significance of an Independent Variable
A variable in a multiple regression model is not likely to be useful unless there is a significant relationship between it and y
To test significance, we use the null hypothesis H0: βj = 0 versus the alternative hypothesis Ha: βj ≠ 0
LO 5: Test the significance of a single independent variable.
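The usual test statistic is t = bj / s_bj, where s_bj is the standard error of bj. A minimal sketch is below; the data are illustrative, and the standard errors are taken from the diagonal of s²(X'X)⁻¹.

```python
import numpy as np
from scipy import stats

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.1, 4.0, 6.2, 6.8, 9.1, 9.7])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

n, k = len(y), 2
s2 = np.sum((y - X @ b)**2) / (n - (k + 1))     # mean square error
XtX_inv = np.linalg.inv(X.T @ X)
se_b = np.sqrt(s2 * np.diag(XtX_inv))           # standard error of each b_j

# Test H0: beta_j = 0 versus Ha: beta_j != 0 for each coefficient.
t = b / se_b
p = 2 * stats.t.sf(np.abs(t), n - (k + 1))      # two-sided p-values
for j in range(k + 1):
    print(f"b{j} = {b[j]:.4f}  t = {t[j]:.3f}  p = {p[j]:.4f}")
```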

Confidence and Prediction Intervals
The point on the regression line corresponding to particular values x01, x02, …, x0k of the independent variables is ŷ = b0 + b1x01 + b2x02 + … + bkx0k
It is unlikely that this value will equal the mean value of y for these x values
Therefore, we need to place bounds on how far the predicted value might be from the actual value
We can do this by calculating a confidence interval for the mean value of y and a prediction interval for an individual value of y
LO 6: Find and interpret a confidence interval for a mean value and a prediction interval for an individual value.
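A minimal sketch of both intervals, using the standard forms ŷ ± t(α/2) · s · sqrt(distance value) for the mean value and ŷ ± t(α/2) · s · sqrt(1 + distance value) for an individual value. The point x01 = 3.5, x02 = 4.0, the 95% level, and the data are illustrative assumptions.

```python
import numpy as np
from scipy import stats

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.1, 4.0, 6.2, 6.8, 9.1, 9.7])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

n, k = len(y), 2
s = np.sqrt(np.sum((y - X @ b)**2) / (n - (k + 1)))
t_crit = stats.t.ppf(0.975, n - (k + 1))         # t(alpha/2) for 95% intervals

# Point prediction at the illustrative values x01 = 3.5, x02 = 4.0.
x0 = np.array([1.0, 3.5, 4.0])                   # leading 1 for the intercept
y_hat0 = x0 @ b
dist = x0 @ np.linalg.inv(X.T @ X) @ x0          # "distance value"

half_ci = t_crit * s * np.sqrt(dist)             # for the mean value of y
half_pi = t_crit * s * np.sqrt(1 + dist)         # for an individual value of y
print("CI for the mean value:      ", (y_hat0 - half_ci, y_hat0 + half_ci))
print("PI for an individual value: ", (y_hat0 - half_pi, y_hat0 + half_pi))
```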

Using Dummy Variables to Model Qualitative Independent Variables
So far, we have only looked at including quantitative data in a regression model
However, we may wish to include descriptive qualitative data as well
For example, we might want to include the gender of respondents
We can model the effects of different levels of a qualitative variable by using what are called dummy variables
Also known as indicator variables
LO 7: Use dummy variables to model qualitative independent variables.
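A minimal sketch of a dummy (indicator) variable in code: gender is coded 1 for one level and 0 for the other, then included alongside a quantitative variable. The data and the coding direction are illustrative assumptions.

```python
import numpy as np

# Illustrative qualitative data: respondent gender alongside a quantitative x1.
gender = np.array(["F", "M", "F", "M", "M", "F"])
x1     = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y      = np.array([3.1, 4.0, 6.2, 6.8, 9.1, 9.7])

# Dummy (indicator) variable: 1 if male, 0 if female.  With two levels one
# dummy suffices; a qualitative variable with m levels needs m - 1 dummies.
d_male = (gender == "M").astype(float)

X = np.column_stack([np.ones_like(x1), x1, d_male])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("b0 (intercept for females), b1, b2 (intercept shift for males):", b)
```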

The Partial F Test: Testing the Significance of a Portion of a Regression Model
So far, we have looked at testing single slope coefficients using a t test
We have also looked at testing all the coefficients at once using an F test
The partial F test allows us to test the significance of any set of independent variables in a regression model
LO 8: Test the significance of a portion of a regression model by using an F test.
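A minimal sketch of the partial F test: fit the full model and a reduced model that drops the set of variables being tested, then compare their unexplained variations. The third predictor x3 and the data are illustrative assumptions.

```python
import numpy as np
from scipy import stats

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
x3 = np.array([0.5, 1.5, 1.0, 2.5, 2.0, 3.0])
y  = np.array([3.1, 4.0, 6.2, 6.8, 9.1, 9.7])

def sse(X, y):
    """Unexplained variation (SSE) of the least squares fit of y on X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ b)**2)

n, k = len(y), 3                     # full model uses x1, x2 and x3
ones = np.ones_like(x1)
sse_full    = sse(np.column_stack([ones, x1, x2, x3]), y)
sse_reduced = sse(np.column_stack([ones, x1]), y)      # drops x2 and x3

# Partial F test of H0: beta2 = beta3 = 0 (the dropped set is not significant).
dropped = 2
F = ((sse_reduced - sse_full) / dropped) / (sse_full / (n - (k + 1)))
p = stats.f.sf(F, dropped, n - (k + 1))
print("partial F =", F, " p-value =", p)
```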

Residual Analysis in Multiple Regression
For an observed value of yi, the residual is ei = yi − ŷi = yi − (b0 + b1xi1 + … + bkxik)
If the regression assumptions hold, the residuals should look like a random sample from a normal distribution with mean 0 and variance σ²
LO 9: Use residual analysis to check the assumptions of multiple regression.
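A minimal sketch of residual analysis in code: compute the residuals, confirm their mean is essentially 0, and run a rough normality check. The data are illustrative, and the Shapiro-Wilk test is just one convenient check, not something the slides prescribe.

```python
import numpy as np
from scipy import stats

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.1, 4.0, 6.2, 6.8, 9.1, 9.7])
X  = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Residuals e_i = y_i - y_hat_i; under the assumptions they should look like a
# random sample from a normal distribution with mean 0 and variance sigma^2.
e = y - X @ b
print("mean of residuals (always ~0 when an intercept is fitted):", e.mean())

# Rough checks: scale the residuals by s and test normality.  With an
# intercept, e.std(ddof=k+1) equals s = sqrt(SSE / (n - (k + 1))).
scaled = e / e.std(ddof=3)                    # ddof = k + 1 = 3 here
print("scaled residuals:", scaled)
stat, p = stats.shapiro(e)                    # small p suggests non-normality
print("Shapiro-Wilk p-value:", p)
```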