Ch. 14: The Multiple Regression Model and Model Building


The Multiple Regression Model
Idea: examine the linear relationship between one dependent variable (Y) and two or more independent variables (Xi).
Multiple regression model with k independent variables:
Y = β0 + β1X1 + β2X2 + … + βkXk + ε
where β0 is the Y-intercept, β1, …, βk are the population slopes, and ε is the random error.

The coefficients of the multiple regression model are estimated using sample data with k independent variables:
Ŷ = b0 + b1X1 + b2X2 + … + bkXk
where Ŷ is the estimated (or predicted) value of Y, b0 is the estimated intercept, and b1, …, bk are the estimated slope coefficients.
Interpretation of the slopes (referred to as net regression coefficients):
b1 = the change in the mean of Y per unit change in X1, taking into account the effect of X2 (i.e., net of X2).
b0 = the Y-intercept; it is interpreted the same as in simple regression.
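The estimation described above can be sketched numerically. This is a minimal illustration with made-up data (the variable names and true coefficients are assumptions, not taken from the slides); the least-squares estimates come from solving the normal equations via `numpy.linalg.lstsq`.

```python
import numpy as np

# Hypothetical data: Y depends on two predictors X1 and X2 (illustrative only).
rng = np.random.default_rng(0)
n = 50
X1 = rng.uniform(0, 10, n)
X2 = rng.uniform(0, 10, n)
Y = 5 + 2.0 * X1 - 1.5 * X2 + rng.normal(0, 1, n)

# Design matrix with a leading column of 1s for the intercept b0.
X = np.column_stack([np.ones(n), X1, X2])

# Least-squares estimates of (b0, b1, b2).
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
b0, b1, b2 = b
print(b0, b1, b2)  # b1 ≈ 2 and b2 ≈ -1.5: each slope is "net of" the other X
```

Each estimated slope recovers the effect of its own variable while holding the other constant, which is exactly the "net regression coefficient" interpretation above.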

Graph of a Two-Variable Model
In three dimensions, the fitted equation is a plane over the (X1, X2) space, with one slope for variable X1 and another slope for variable X2; Y is measured on the vertical axis.

Example: Simple Regression Results vs. Multiple Regression Results
Compare the size and significance level of the coefficients, the F-value, the r², etc., across the two models: the multiple regression shows what the "net of" effects are.

Using the Equation to Make Predictions
Predict the appraised value at the average lot size (7.24) and the average number of rooms (7.12). What is the total effect of a 2,000 sq ft increase in lot size plus 2 additional rooms?
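The two prediction questions above can be answered mechanically once the fitted equation is in hand. The coefficients below are hypothetical placeholders (the slides' actual estimates are not reproduced here); the point is only that predictions plug values into Ŷ = b0 + b1X1 + b2X2, and that the total effect of simultaneous changes is additive in the slopes.

```python
# Hypothetical fitted coefficients (illustrative only, not the slides' values).
b0, b1, b2 = 50.0, 6.0, 10.0   # intercept, lot-size slope, rooms slope

def predict(lot_size, rooms):
    """Predicted appraised value: Y-hat = b0 + b1*lot_size + b2*rooms."""
    return b0 + b1 * lot_size + b2 * rooms

# Prediction at the average lot size (7.24, thousands of sq ft) and rooms (7.12).
y_avg = predict(7.24, 7.12)

# Total effect of a 2,000 sq ft (2-unit) lot-size increase plus 2 extra rooms:
# the slopes add, so the change is b1*2 + b2*2.
total_effect = predict(7.24 + 2, 7.12 + 2) - y_avg
print(y_avg, total_effect)  # total_effect = 2*b1 + 2*b2 = 32.0
```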

Coefficient of Multiple Determination, r², and Adjusted r²
r² reports the proportion of total variation in Y explained by all X variables taken together (the model).
r² never decreases when a new X variable is added to the model, which can be a disadvantage when comparing models.

What is the net effect of adding a new variable? We lose a degree of freedom when a new X variable is added. Did the new X variable add enough explanatory power to offset the loss of one degree of freedom?
Adjusted r² shows the proportion of variation in Y explained by all X variables, adjusted for the number of X variables used:
adjusted r² = 1 − (1 − r²)(n − 1) / (n − k − 1)
(where n = sample size, k = number of independent variables)
It penalizes excessive use of unimportant independent variables, is smaller than r², and is useful in comparing among models.
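The "r² never decreases" point is easy to demonstrate. In this sketch (data and seed are arbitrary assumptions), X2 is pure noise with no relationship to Y, yet adding it still nudges r² up; the adjusted r² formula above offsets that by charging for the lost degree of freedom.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 40, 2
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)                 # X2 is pure noise relative to Y
Y = 3 + 2 * X1 + rng.normal(size=n)

def fit_r2(*cols):
    """r-squared of an OLS fit of Y on an intercept plus the given columns."""
    X = np.column_stack([np.ones(n), *cols])
    b, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ b
    sst = np.sum((Y - Y.mean()) ** 2)
    return 1 - np.sum(resid ** 2) / sst

r2_one = fit_r2(X1)                     # model with X1 only
r2_two = fit_r2(X1, X2)                 # adding the useless X2 cannot lower r2
adj_two = 1 - (1 - r2_two) * (n - 1) / (n - k - 1)   # adjusted r2 for k = 2
print(r2_one, r2_two, adj_two)
```

Because (n − 1)/(n − k − 1) ≥ 1, adjusted r² is always at most r², so an unimportant added variable can make adjusted r² fall even while plain r² rises.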

Multiple Regression Assumptions
The errors are normally distributed, have a constant variance, and are independent.
Errors (residuals) from the regression model: ei = Yi − Ŷi
These residual plots are used in multiple regression:
Residuals vs. Ŷi
Residuals vs. X1i
Residuals vs. X2i
Residuals vs. time (if time-series data)

Two-Variable Model (3-D view)
Each sample observation Yi lies above or below the fitted plane at the point (x1i, x2i); its residual is ei = Yi − Ŷi. The best-fit equation, Ŷ, is found by minimizing the sum of squared errors, Σe².

Are Individual Variables Significant?
Use t-tests of the individual variable slopes. Each test shows whether there is a linear relationship between the variable Xi and Y.
Hypotheses:
H0: βi = 0 (no linear relationship)
H1: βi ≠ 0 (a linear relationship does exist between Xi and Y)
Test statistic: t = bi / S(bi), with n − k − 1 degrees of freedom.
Confidence interval for the population slope βi: bi ± t(n−k−1) · S(bi)
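The slope t-test can be computed by hand from the normal equations. This sketch (simulated data, names assumed) builds a model where X2 truly has no effect, then forms t = bi / S(bi) for each coefficient with df = n − k − 1.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, k = 60, 2
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 1 + 3 * X1 + 0 * X2 + rng.normal(size=n)   # X2 truly has no effect

X = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.solve(X.T @ X, X.T @ Y)          # OLS estimates
resid = Y - X @ b
mse = resid @ resid / (n - k - 1)              # s^2 with df = n - k - 1
se = np.sqrt(mse * np.diag(np.linalg.inv(X.T @ X)))  # standard errors S(bi)

t_stats = b / se                               # t = b_i / S(b_i)
p_vals = 2 * stats.t.sf(np.abs(t_stats), df=n - k - 1)
print(t_stats, p_vals)  # the X1 slope is highly significant
```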

Is the Overall Model Significant?
The F-test for overall significance of the model shows whether there is a linear relationship between all of the X variables considered together and Y. Use the F test statistic.
Hypotheses:
H0: β1 = β2 = … = βk = 0 (no linear relationship)
H1: at least one βi ≠ 0 (at least one independent variable affects Y)
Test statistic: F = MSR / MSE = (SSR / k) / (SSE / (n − k − 1))
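The overall F statistic above can be assembled directly from the sums of squares. A minimal sketch on simulated data (seed and true coefficients are assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, k = 50, 2
X1, X2 = rng.normal(size=n), rng.normal(size=n)
Y = 2 + 1.5 * X1 + 0.8 * X2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.solve(X.T @ X, X.T @ Y)
yhat = X @ b

sst = np.sum((Y - Y.mean()) ** 2)       # total variation in Y
ssr = np.sum((yhat - Y.mean()) ** 2)    # variation explained by the model
sse = sst - ssr                         # unexplained variation

F = (ssr / k) / (sse / (n - k - 1))     # F = MSR / MSE
p = stats.f.sf(F, k, n - k - 1)
print(F, p)  # large F and small p: reject H0 that all slopes are zero
```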

Testing Portions of the Multiple Regression Model
To find out whether including an individual Xj, or a set of Xs, significantly improves the model, given that the other independent variables are already included, use two measures:
the partial F-test criterion, and
the coefficient of partial determination.

Contribution of a Single Independent Variable Xj
SSR(Xj | all variables except Xj) = SSR(all variables) − SSR(all variables except Xj)
This measures the contribution of Xj in explaining the total variation in Y (SST). For a three-variable model:
SSR(X1 | X2 and X3) = SSR(X1, X2, X3) − SSR(X2, X3)
i.e., the SSR of the unrestricted (UR) model minus the SSR of the restricted (R) model.

The Partial F-Test Statistic
Consider the hypothesis test:
H0: variable Xj does not significantly improve the model after all other variables are included
H1: variable Xj significantly improves the model after all other variables are included
Test statistic (the numerator is the contribution of Xj to the regression):
F = SSR(Xj | all others) / MSE
If the actual F statistic is greater than the critical F, then the conclusion is: reject H0; adding Xj does improve the model.
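The partial F-test amounts to fitting the model twice, with and without Xj, and comparing sums of squares. This sketch (simulated three-variable model; data and seed are assumptions) tests the contribution of X1 given X2 and X3:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 80
X1, X2, X3 = rng.normal(size=(3, n))
Y = 1 + 2 * X1 + 1 * X2 + 0.5 * X3 + rng.normal(size=n)

def ssr_of(*cols):
    """Regression sum of squares for Y on an intercept plus the given columns."""
    X = np.column_stack([np.ones(n), *cols])
    b, *_ = np.linalg.lstsq(X, Y, rcond=None)
    yhat = X @ b
    return np.sum((yhat - Y.mean()) ** 2)

# Contribution of X1 given X2 and X3: unrestricted SSR minus restricted SSR.
ssr_full = ssr_of(X1, X2, X3)
contrib = ssr_full - ssr_of(X2, X3)

sst = np.sum((Y - Y.mean()) ** 2)
mse_full = (sst - ssr_full) / (n - 3 - 1)    # MSE of the full model, k = 3

F_partial = contrib / mse_full               # one numerator df (one added variable)
p = stats.f.sf(F_partial, 1, n - 3 - 1)
print(F_partial, p)  # significant: X1 improves the model given X2 and X3
```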

Coefficient of Partial Determination
For one variable (or a set of variables), this measures the proportion of total variation in the dependent variable (SST) that is explained by Xj while controlling for (holding constant) the other explanatory variables.

Using Dummy Variables
A dummy variable is a categorical explanatory variable with two levels (yes/no, on/off, male/female), coded as 0 or 1.
If the dummy variable is significant, the regression intercepts differ between the two groups; the model assumes equal slopes for the other variables.
If a categorical variable has more than two levels, the number of dummy variables needed is (number of levels − 1).

Example: Different Intercepts, Same Slope
Let X2 = 1 for a fireplace and X2 = 0 for no fireplace, with Y = sales. If H0: β2 = 0 is rejected, then "fireplace" has a significant effect on sales: the intercept is b0 + b2 with a fireplace and b0 without, and the two fitted lines are parallel.
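A dummy-variable fit like the fireplace example can be sketched on simulated data (the variable names, units, and true coefficients below are all assumptions). The estimated b2 is the vertical shift between the two parallel lines:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
sqft = rng.uniform(1.0, 3.0, n)          # X1: size, in thousands of sq ft (illustrative)
fireplace = rng.integers(0, 2, n)        # X2: dummy, 1 = has a fireplace
# True model: same slope for both groups, intercept shifted up by 20 with a fireplace.
sales = 50 + 30 * sqft + 20 * fireplace + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), sqft, fireplace])
b0, b1, b2 = np.linalg.lstsq(X, sales, rcond=None)[0]

# Two parallel lines: intercept b0 + b2 with a fireplace, b0 without.
print(b0, b0 + b2, b1)  # b2 ≈ 20 is the estimated fireplace premium
```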

Interaction Between Explanatory Variables
An interaction model hypothesizes interaction between pairs of X variables: the response to one X variable may vary at different levels of another X variable. The model contains two-way cross-product terms.
Effect of interaction: without an interaction term, the effect of X1 on Y is measured by β1; with an interaction term, the effect of X1 on Y is measured by β1 + β3X2, so the effect changes as X2 changes.

Example: Suppose X2 is a dummy variable and the estimated regression equation is
Ŷ = 1 + 2X1 + 3X2 + 4X1X2
X2 = 1: Ŷ = 1 + 2X1 + 3(1) + 4X1(1) = 4 + 6X1
X2 = 0: Ŷ = 1 + 2X1 + 3(0) + 4X1(0) = 1 + 2X1
The slopes are different because the effect of X1 on Y depends on the value of X2.
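The two lines derived above can be checked numerically from the slide's estimated equation Ŷ = 1 + 2X1 + 3X2 + 4X1X2 (the helper function name is just for illustration):

```python
# The slide's estimated interaction equation, with X2 a 0/1 dummy.
def y_hat(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# Slope of Y-hat in X1 at each level of X2 is beta1 + beta3 * X2:
slope_x2_0 = y_hat(1, 0) - y_hat(0, 0)   # 2      (beta1)
slope_x2_1 = y_hat(1, 1) - y_hat(0, 1)   # 2 + 4  (beta1 + beta3)
intercept_x2_1 = y_hat(0, 1)             # 1 + 3  (beta0 + beta2)
print(slope_x2_0, slope_x2_1, intercept_x2_1)  # 2 6 4
```

The slope jumps from 2 to 6 when X2 switches from 0 to 1, which is exactly what "the effect of X1 depends on X2" means.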