Lesson 14 - 3 Multiple Regression Models. Objectives: obtain the correlation matrix; use technology to find a multiple regression equation; interpret the coefficients of a multiple regression equation.

Presentation transcript:

Lesson 14 - 3 Multiple Regression Models

Objectives
- Obtain the correlation matrix
- Use technology to find a multiple regression equation
- Interpret the coefficients of a multiple regression equation
- Determine R² and adjusted R²
- Perform an F-test for lack of fit
- Test individual regression coefficients for significance
- Construct confidence and prediction intervals
- Build a regression model

Vocabulary
- Correlation matrix – shows the linear correlation among all variables under consideration in a multiple regression model
- Multicollinearity – when two explanatory variables have a high linear correlation between themselves
- Additive effect – explanatory variables do not interact
- Adjusted R² – modifies the value of R² based on the sample size, n, and the number of explanatory variables, k; it will decrease if an explanatory variable is added to the model that does little to explain the variation in the response variable

Multiple Regression Model

yi = β0 + β1 x1i + β2 x2i + … + βk xki + εi

where
- yi is the value of the response variable for the ith individual
- β0, β1, β2, …, βk are the parameters to be estimated from the sample data
- x1i is the ith observation of the first explanatory variable, x2i is the ith observation of the second explanatory variable, and so on
- εi is an independent random error term that is normally distributed with mean 0 and variance σ²
- i = 1, 2, 3, …, n, where n is the sample size

Note: although formulas exist to estimate β0, β1, β2, …, βk by hand, we will use Excel to obtain the estimates.
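The lesson uses Excel for the estimates, but the same least-squares fit can be sketched in a few lines of NumPy. The data below are invented for illustration: y was generated exactly as y = 1 + 2·x1 + 0.5·x2, so we know which coefficients the fit should recover.

```python
import numpy as np

# Hypothetical data (not from the lesson): y = 1 + 2*x1 + 0.5*x2 exactly
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([4.0, 5.5, 9.0, 10.5, 14.0, 15.5])

# Design matrix: a leading column of ones estimates the intercept beta0
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares estimates of beta0, beta1, beta2
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # approximately [1.0, 2.0, 0.5]
```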

Correlation Matrix
- It is good for explanatory variables to be highly correlated (either positively or negatively) with the response variable
- There may be problems if the explanatory variables are highly correlated with each other (multicollinearity)
- General rule: if |correlation| > 0.7, then multicollinearity may be a problem

(The slide's example correlation matrix, with rows and columns X1, X2, X3, and Response, lost its numeric entries in transcription.)
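As a sketch of obtaining the correlation matrix with technology (the variable names and values are made up), NumPy's corrcoef builds the matrix directly; here x1 and x2 happen to be correlated above the 0.7 rule of thumb:

```python
import numpy as np

# Hypothetical data: two explanatory variables and a response
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([4.0, 5.5, 9.0, 10.5, 14.0, 15.5])

# Each row passed to corrcoef is one variable; the result is the
# correlation matrix in the same order (x1, x2, y)
R = np.corrcoef([x1, x2, y])
print(np.round(R, 3))

# |r| > 0.7 between explanatory variables hints at multicollinearity
if abs(R[0, 1]) > 0.7:
    print("possible multicollinearity between x1 and x2")
```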

R² and Adjusted R² Values

R² = explained variation / total variation = 1 – unexplained variation / total variation

R²adj = 1 – (1 – R²) · (n – 1) / (n – k – 1)

Note: adjusted R² modifies R² based on the sample size, n, and the number of explanatory variables, k, to compensate for adding more variables to the model.
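The adjusted R² formula above is easy to compute directly. The numbers below are arbitrary, chosen only to show the penalty for adding a variable that barely helps:

```python
def adjusted_r2(r2, n, k):
    """Adjusted R^2 for n observations and k explanatory variables."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Hypothetical numbers: R^2 = 0.90 from n = 30 observations, k = 4 variables
print(round(adjusted_r2(0.90, 30, 4), 4))    # 0.884

# Adding a 5th variable that barely helps (R^2 rises only to 0.902)
# actually lowers the adjusted value
print(round(adjusted_r2(0.902, 30, 5), 4))   # 0.8816
```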

Adjusted R²
- The adjusted R² is used in multiple regression models
- The adjusted R² will decrease if a variable is added to the model that does little to explain the variation in the response variable
- The adjusted R² will increase if a variable that does little to explain the variation in the response variable is removed from the model

Hypothesis Test in Multiple Regression
- The null hypothesis is that none of the explanatory variables has a significant linear relation with the response variable
- The alternative hypothesis is that at least one of the explanatory variables has a significant linear relation with the response variable

F Test Statistic for Multiple Regression

H0: β1 = β2 = … = βk = 0

F = MSR / MSE = (Mean Square due to Regression) / (Mean Square Error)

F test statistic using R²:

F = [R² / (1 – R²)] · [(n – (k + 1)) / k]

with k degrees of freedom in the numerator and n – (k + 1) degrees of freedom in the denominator, where k is the number of explanatory variables and n is the sample size.

NOTE: use the P-value compared to the level of significance, α, for the decision rule.
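The R²-based form of the F statistic translates directly into code (the numbers are again arbitrary):

```python
def f_statistic(r2, n, k):
    """Overall F test statistic computed from R^2, with k numerator and
    n - (k + 1) denominator degrees of freedom."""
    return (r2 / (1 - r2)) * ((n - (k + 1)) / k)

# Hypothetical: R^2 = 0.90, n = 30 observations, k = 4 explanatory variables
F = f_statistic(0.90, 30, 4)
print(F)   # about 56.25; compare its P-value to alpha for the decision rule
```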

Guidelines in Developing a Multiple Regression Model (backwards step-wise regression)
1. Construct a correlation matrix to help identify the explanatory variables that have a high correlation with the response variable. In addition, look for any indication that the explanatory variables are correlated with each other. If two explanatory variables have high correlation, that is a tip-off to watch out for multicollinearity – but not conclusive evidence.
2. Fit the multiple regression model using all the explanatory variables that have been identified by the researcher.
3. If the null hypothesis that all the slope coefficients are zero has been rejected, proceed to look at the individual slope coefficients. Identify those slope coefficients that have small t-test statistics (hence large P-values). These are candidate explanatory variables that could be removed from the model. Remove them one at a time and then recompute the regression model.
4. Repeat Step 3 until all slope coefficients are significantly different from zero.
5. Use residual plots to check model appropriateness.

Backwards Step-wise Regression
- Put all possible variables into the model
- Run the regression model (focus on adjusted R²)
- Pull out the variable with the highest P-value – the one least likely to have a linear relationship with the response variable
- Rerun the model
  – if adjusted R² goes up, repeat the procedure
  – if adjusted R² goes down, stop
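The loop above can be sketched with NumPy, using adjusted R² as the stopping criterion. All names and data are invented: y depends only on x1, so the irrelevant x2 should be dropped.

```python
import numpy as np

def adj_r2(X, y):
    """Adjusted R^2 of a least-squares fit; X already includes the
    intercept column, so p counts all fitted parameters."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
    return 1 - (1 - r2) * (n - 1) / (n - p)

def backward_stepwise(X_vars, y, names):
    """Remove a variable whenever doing so raises adjusted R^2;
    stop when no removal improves it."""
    keep = list(range(len(X_vars)))

    def design(idx):
        return np.column_stack([np.ones_like(y)] + [X_vars[i] for i in idx])

    best = adj_r2(design(keep), y)
    improved = True
    while improved and len(keep) > 1:
        improved = False
        for i in list(keep):
            trial = [j for j in keep if j != i]
            score = adj_r2(design(trial), y)
            if score > best:
                best, keep, improved = score, trial, True
                break
    return [names[i] for i in keep], best

# Invented data: y is 1 + 2*x1 plus a little noise; x2 is irrelevant
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.5, 4.5, 6.5, 9.5, 11.0, 13.0])

selected, best = backward_stepwise([x1, x2], y, ["x1", "x2"])
print(selected, round(best, 4))   # x2 is dropped, raising adjusted R^2
```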

Example 9 on page

Summary and Homework Summary –Given the appropriate conditions, we can perform inference on whether the slope and intercept are significantly different from 0 –We can also calculate confidence and prediction intervals to quantify the accuracy of our predictions of the response variable y –Multiple regression models are models where more than one explanatory variable is considered Homework –pg : 1, 3, 4, 6, 8, 17