Chapter 13 Additional Topics in Regression Analysis


Chapter 13: Additional Topics in Regression Analysis
Statistics for Business and Economics, 7th Edition
Copyright © 2010 Pearson Education, Inc. Publishing as Prentice Hall

Chapter Goals
After completing this chapter, you should be able to:
- Explain regression model-building methodology
- Apply dummy variables for categorical variables with more than two categories
- Explain how dummy variables can be used in experimental design models
- Incorporate lagged values of the dependent variable as regressors
- Describe specification bias and multicollinearity
- Examine residuals for heteroscedasticity and autocorrelation

The Stages of Model Building (13.1): Model Specification
- Understand the problem to be studied
- Select the dependent and independent variables
- Identify the model form (linear, quadratic, ...)
- Determine the data required for the study
Remaining stages: Coefficient Estimation, Model Verification, Interpretation and Inference

The Stages of Model Building (continued): Coefficient Estimation
- Estimate the regression coefficients using the available data
- Form confidence intervals for the regression coefficients
- For prediction, the goal is the smallest standard error of the estimate, se
- If estimating individual slope coefficients, examine the model for multicollinearity and specification bias

The Stages of Model Building (continued): Model Verification
- Logically evaluate the regression results in light of the model (e.g., are the coefficient signs correct?)
- Are any coefficients biased or illogical?
- Evaluate the regression assumptions (e.g., are the residuals random and independent?)
- If any problems are suspected, return to model specification and adjust the model

The Stages of Model Building (continued): Interpretation and Inference
- Interpret the regression results in the setting and units of your study
- Form confidence intervals or test hypotheses about the regression coefficients
- Use the model for forecasting or prediction

Dummy Variable Models (More than 2 Levels) (13.2)
- Dummy variables can be used when the categorical variable of interest has more than two categories
- Dummy variables are also useful in experimental design
- Experimental design is used to identify possible causes of variation in the value of the dependent variable
- Y outcomes are measured at specific combinations of levels for treatment and blocking variables
- The goal is to determine how the different treatments influence the Y outcome

Dummy Variable Models (More than 2 Levels)
- Consider a categorical variable with K levels
- The number of dummy variables needed is one less than the number of levels: K – 1
- Example: y = house price; x1 = square feet
- If the style of the house is also thought to matter (style = ranch, split level, or condo): three levels, so two dummy variables are needed

Dummy Variable Models (More than 2 Levels) (continued)
Example: let "condo" be the default category, and let x2 and x3 be used for the other two categories:
- y = house price
- x1 = square feet
- x2 = 1 if ranch, 0 otherwise
- x3 = 1 if split level, 0 otherwise
The multiple regression equation is:
ŷ = b0 + b1x1 + b2x2 + b3x3

Interpreting the Dummy Variable Coefficients (with 3 Levels)
Consider the estimated regression equation, with dummy coefficients 23.53 (ranch) and 18.84 (split level):
ŷ = b0 + b1x1 + 23.53x2 + 18.84x3
- For a condo: x2 = x3 = 0, so the equation reduces to the baseline
- For a ranch: x2 = 1, x3 = 0. With the same square feet, a ranch will have an estimated average price 23.53 thousand dollars higher than a condo
- For a split level: x2 = 0, x3 = 1. With the same square feet, a split level will have an estimated average price 18.84 thousand dollars higher than a condo
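The dummy coding above can be sketched numerically. This is a minimal illustration, not the textbook's data set: the house prices are simulated with the slide's dummy coefficients (23.53 and 18.84) built in, and least squares should recover them approximately.

```python
# Hypothetical illustration of three-level dummy coding
# (condo = baseline, x2 = ranch, x3 = split level). All data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 30
sqft = rng.uniform(1.0, 3.0, n) * 1000           # square feet
style = rng.integers(0, 3, n)                    # 0=condo, 1=ranch, 2=split level

x2 = (style == 1).astype(float)                  # 1 if ranch, 0 otherwise
x3 = (style == 2).astype(float)                  # 1 if split level, 0 otherwise

# Assumed "true" model: baseline 50, 0.1 per sq ft, plus the style premiums
price = 50 + 0.1 * sqft + 23.53 * x2 + 18.84 * x3 + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), sqft, x2, x3])  # design matrix with intercept
b, *_ = np.linalg.lstsq(X, price, rcond=None)    # least-squares coefficients

# b[2] and b[3] estimate the ranch and split-level premiums over a condo
print(b)
```

Because "condo" is the omitted category, each dummy coefficient is read as a price difference relative to a condo with the same square footage.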

Experimental Design
- Consider an experiment in which four treatments will be used, and the outcome also depends on three environmental factors that cannot be controlled by the experimenter
- Let z1 denote the treatment, where z1 = 1, 2, 3, or 4
- Let z2 denote the environmental factor (the "blocking variable"), where z2 = 1, 2, or 3
- To model the four treatments, three dummy variables are needed
- To model the three environmental factors, two dummy variables are needed

Experimental Design (continued)
Define five dummy variables: x1, x2, x3, x4, and x5
- Let treatment level 1 be the default (z1 = 1)
- x1 = 1 if z1 = 2, 0 otherwise
- x2 = 1 if z1 = 3, 0 otherwise
- x3 = 1 if z1 = 4, 0 otherwise
- Let environment level 1 be the default (z2 = 1)
- x4 = 1 if z2 = 2, 0 otherwise
- x5 = 1 if z2 = 3, 0 otherwise

Experimental Design: Dummy Variable Tables
The dummy variable values can be summarized in a table:

z1 | x1 x2 x3
 1 |  0  0  0
 2 |  1  0  0
 3 |  0  1  0
 4 |  0  0  1

z2 | x4 x5
 1 |  0  0
 2 |  1  0
 3 |  0  1

Experimental Design Model
The experimental design model can be estimated using the equation
y = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 + ε
The estimated value of β2, for example, shows the amount by which the y value for treatment 3 exceeds the value for treatment 1
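The dummy coding in the tables above can be written as a small helper. This is a sketch of the encoding only (not an estimation of the model); the function name `design_row` is my own.

```python
# Build the dummy-variable design matrix for 4 treatments (z1) and
# 3 blocking levels (z2), with level 1 as the default in each case.
import numpy as np

def design_row(z1, z2):
    """Map (treatment z1 in 1..4, block z2 in 1..3) to [x1, x2, x3, x4, x5]."""
    x1, x2, x3 = (z1 == 2), (z1 == 3), (z1 == 4)   # treatment dummies
    x4, x5 = (z2 == 2), (z2 == 3)                  # blocking dummies
    return [int(v) for v in (x1, x2, x3, x4, x5)]

# One row per treatment/block combination: 4 * 3 = 12 rows, 5 dummy columns
rows = [design_row(z1, z2) for z1 in range(1, 5) for z2 in range(1, 4)]
X = np.array(rows)
print(X.shape)   # (12, 5)
```

Prepending a column of ones to `X` and regressing the observed outcomes on it estimates β0 through β5 as in the equation above.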

Lagged Values of the Dependent Variable (13.3)
- In time-series models, data are collected over time (weekly, quarterly, etc.)
- The value of y in time period t is denoted yt
- The value of yt often depends on the value yt-1, as well as on other independent variables xj:
yt = β0 + β1x1t + ... + βKxKt + γyt-1 + εt
- A lagged value of the dependent variable is included as an explanatory variable

Interpreting Results in Lagged Models
An increase of 1 unit in the independent variable xj in time period t (all other variables held fixed) will lead to an expected increase in the dependent variable of
- βj in period t
- βjγ in period (t+1)
- βjγ² in period (t+2)
- βjγ³ in period (t+3), and so on
The total expected increase over all current and future time periods is βj/(1 – γ)
The coefficients β0, β1, . . . , βK, γ are estimated by least squares in the usual manner
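The geometric decay of the lag effect can be checked numerically. A quick sketch with assumed values βj = 2 and γ = 0.6 (arbitrary choices for illustration):

```python
# Numeric check of the lag-model interpretation: a one-unit change in x_j
# adds beta_j * gamma**k to y in period t+k, and the effects sum to
# beta_j / (1 - gamma) over all current and future periods.
beta_j = 2.0     # assumed slope on x_j
gamma = 0.6      # assumed coefficient on the lagged dependent variable

effects = [beta_j * gamma**k for k in range(200)]  # periods t, t+1, t+2, ...
total = sum(effects)

print(round(total, 4))   # 5.0, matching beta_j / (1 - gamma) = 2 / 0.4
```

The truncation at 200 periods is harmless here because γ < 1 makes the tail negligible; the closed form βj/(1 – γ) requires |γ| < 1.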

Interpreting Results in Lagged Models (continued)
Confidence intervals and hypothesis tests for the regression coefficients are computed the same way as in ordinary multiple regression
(When the regression equation contains lagged variables, these procedures are only approximately valid. The approximation quality improves as the number of sample observations increases.)

Interpreting Results in Lagged Models (continued)
- Caution should be used when applying confidence intervals and hypothesis tests to time-series data
- There is a possibility that the equation errors εi are no longer independent of one another
- When the errors are correlated, the coefficient estimates are unbiased but not efficient, so confidence intervals and hypothesis tests are no longer valid

Specification Bias (13.4)
- Suppose an important independent variable z is omitted from a regression model
- If z is uncorrelated with all the other included independent variables, the influence of z is left unexplained and is absorbed by the error term, ε
- But if there is any correlation between z and any of the included independent variables, some of the influence of z is captured in the coefficients of the included variables

Specification Bias (continued)
- If some of the influence of the omitted variable z is captured in the coefficients of the included independent variables, then those coefficients are biased...
- ...and the usual inferential statements from hypothesis tests or confidence intervals can be seriously misleading
- In addition, the estimated model error will include the effect of the missing variable(s) and will therefore be larger

Multicollinearity (13.5)
- Collinearity: high correlation exists among two or more independent variables
- This means the correlated variables contribute redundant information to the multiple regression model

Multicollinearity (continued)
Including two highly correlated explanatory variables can adversely affect the regression results:
- No new information is provided
- It can lead to unstable coefficients (large standard errors and low t-values)
- Coefficient signs may not match prior expectations

Some Indications of Strong Multicollinearity
- Incorrect signs on the coefficients
- A large change in the value of a previous coefficient when a new variable is added to the model
- A previously significant variable becomes insignificant when a new independent variable is added
- The estimate of the standard deviation of the model increases when a variable is added to the model

Detecting Multicollinearity
- Examine the simple correlation matrix to determine whether strong correlation exists between any of the model's independent variables
- Multicollinearity may be present if the model appears to explain the dependent variable well (high F statistic and low se) but the individual coefficient t statistics are insignificant
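The correlation-matrix screen described above can be sketched as follows. The |r| > 0.8 cutoff is an arbitrary rule of thumb for illustration, not a formal test, and the data are simulated so that two of the three explanatory variables are nearly collinear.

```python
# Screen the simple correlation matrix of the explanatory variables
# for highly correlated pairs (a common first check for multicollinearity).
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)                   # unrelated variable

X = np.column_stack([x1, x2, x3])
corr = np.corrcoef(X, rowvar=False)       # 3x3 simple correlation matrix

# Flag pairs with |r| above an (arbitrary) screening threshold of 0.8
suspect = [(i, j) for i in range(3) for j in range(i + 1, 3)
           if abs(corr[i, j]) > 0.8]
print(suspect)                            # [(0, 1)]: x1 and x2 flagged
```

A low pairwise correlation does not rule out multicollinearity (one variable can be a near-linear combination of several others), which is why the F-versus-t symptom on the slide is also worth checking.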

Assumptions of Regression
- Normality of error: the error values (ε) are normally distributed for any given value of X
- Homoscedasticity: the probability distribution of the errors has constant variance
- Independence of errors: the error values are statistically independent

Residual Analysis
The residual for observation i, ei, is the difference between its observed and predicted value. Check the assumptions of regression by examining the residuals:
- Examine the linearity assumption
- Examine for constant variance at all levels of X (homoscedasticity)
- Evaluate the normal distribution assumption
- Evaluate the independence assumption
Graphical analysis of residuals: plot the residuals vs. X

Residual Analysis for Linearity
(Figure: residual plots vs. x. A random scatter of residuals indicates linearity; a curved, systematic pattern indicates a nonlinear relationship.)

Residual Analysis for Homoscedasticity
(Figure: residual plots vs. x. A band of roughly equal spread indicates constant variance; a fan-shaped pattern indicates non-constant variance.)

Residual Analysis for Independence
(Figure: residual plots vs. X. A random scatter indicates independent errors; a systematic pattern over the ordering of the observations indicates dependence.)

Excel Residual Output

Observation  Predicted House Price  Residuals
 1           251.92316               -6.923162
 2           273.87671               38.12329
 3           284.85348               -5.853484
 4           304.06284                3.937162
 5           218.99284              -19.99284
 6           268.38832              -49.38832
 7           356.20251               48.79749
 8           367.17929              -43.17929
 9           254.6674                64.33264
10           (not shown)            -29.85348

The residuals do not appear to violate any regression assumptions

Heteroscedasticity (13.6)
- Homoscedasticity: the probability distribution of the errors has constant variance
- Heteroscedasticity: the error terms do not all have the same variance
- The size of the error variances may depend, for example, on the size of the dependent variable value

Heteroscedasticity (continued)
When heteroscedasticity is present:
- Least squares is not the most efficient procedure for estimating the regression coefficients
- The usual procedures for deriving confidence intervals and tests of hypotheses are not valid

Tests for Heteroscedasticity
To test the null hypothesis that the error terms εi all have the same variance against the alternative that their variances depend on the expected values:
- Estimate the simple regression of the squared residuals on the predicted values:
ei² = a0 + a1ŷi
- Let R² be the coefficient of determination of this new regression
- The null hypothesis is rejected if nR² is greater than χ²1,α, the critical value of the chi-square random variable with 1 degree of freedom and significance level α
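The test above can be sketched end to end. This is a simulation under assumed parameters (error spread growing with x), using plain least squares throughout; 3.841 is the 5% chi-square critical value with 1 degree of freedom.

```python
# Sketch of the heteroscedasticity test: regress the squared residuals on
# the fitted values and reject homoscedasticity if n * R**2 > 3.841
# (chi-square critical value, 1 df, 5% level). Data are simulated so that
# the error variance grows with x.
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(1, 10, n)
y = 3 + 2 * x + rng.normal(scale=x, size=n)      # heteroscedastic errors

# Fit the original regression and obtain the residuals
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b
e = y - y_hat

# Auxiliary regression: e**2 on y_hat (with intercept); compute its R**2
Z = np.column_stack([np.ones(n), y_hat])
a, *_ = np.linalg.lstsq(Z, e**2, rcond=None)
fitted = Z @ a
r2 = 1 - np.sum((e**2 - fitted) ** 2) / np.sum((e**2 - np.mean(e**2)) ** 2)

stat = n * r2
print(stat > 3.841)   # heteroscedasticity detected in this simulation
```

With homoscedastic errors (e.g., `scale=1`), nR² would usually fall below the critical value and the null would not be rejected.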

Autocorrelated Errors (13.7)
- Independence of errors: the error values are statistically independent
- Autocorrelated errors: residuals in one time period are related to residuals in another period

Autocorrelated Errors (continued)
Autocorrelation violates a least squares regression assumption:
- It leads to sb estimates that are too small (i.e., biased)
- Thus t-values are too large, and some variables may appear significant when they are not

Autocorrelation
- Autocorrelation is correlation of the errors (residuals) over time
- (Figure: a residual plot over time showing a cyclic, non-random pattern)
- Such a pattern violates the regression assumption that residuals are random and independent

The Durbin-Watson Statistic
The Durbin-Watson statistic is used to test for autocorrelation:
- H0: successive residuals are not correlated (i.e., Corr(εt, εt-1) = 0)
- H1: autocorrelation is present

The Durbin-Watson Statistic (continued)
H0: ρ = 0 (no autocorrelation)
H1: autocorrelation is present
The Durbin-Watson test statistic is
d = Σ(t=2..n) (et – et-1)² / Σ(t=1..n) et²
- The possible range is 0 ≤ d ≤ 4
- d should be close to 2 if H0 is true
- d less than 2 may signal positive autocorrelation; d greater than 2 may signal negative autocorrelation

Testing for Positive Autocorrelation
H0: positive autocorrelation does not exist
H1: positive autocorrelation is present
- Calculate the Durbin-Watson test statistic d; d can be approximated by d = 2(1 – r), where r is the sample correlation of successive errors
- Find the values dL and dU from the Durbin-Watson table (for sample size n and number of independent variables K)
Decision rule:
- Reject H0 if d < dL
- Inconclusive if dL ≤ d ≤ dU
- Do not reject H0 if d > dU
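The statistic and its 2(1 – r) approximation can be computed directly from a residual series. A sketch on simulated AR(1) residuals (ρ = 0.5, an assumed value chosen so that positive autocorrelation is present):

```python
# Compute the Durbin-Watson statistic d from a residual series and compare
# it with the approximation d ≈ 2(1 - r), where r is the sample correlation
# between successive residuals. Residuals are simulated AR(1) with rho = 0.5.
import numpy as np

rng = np.random.default_rng(3)
n = 100
u = rng.normal(size=n)
e = np.empty(n)
e[0] = u[0]
for t in range(1, n):
    e[t] = 0.5 * e[t - 1] + u[t]      # positively autocorrelated residuals

d = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)   # Durbin-Watson statistic
r = np.corrcoef(e[1:], e[:-1])[0, 1]           # correlation of successive e's

print(round(d, 3))             # well below 2: suggests positive autocorrelation
print(round(2 * (1 - r), 3))   # close to d
```

With ρ = 0.5 the statistic lands near 2(1 – 0.5) = 1; comparing d to the tabulated dL and dU for the given n and K completes the test.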

Negative Autocorrelation
- Negative autocorrelation exists if successive errors are negatively correlated
- This can occur if successive errors alternate in sign
Decision rule for negative autocorrelation: reject H0 if d > 4 – dL
Full decision regions over 0 ≤ d ≤ 4:
- Reject H0 (positive autocorrelation) if d < dL
- Inconclusive if dL ≤ d ≤ dU or 4 – dU ≤ d ≤ 4 – dL
- Do not reject H0 if dU < d < 4 – dU
- Reject H0 (negative autocorrelation) if d > 4 – dL

Testing for Positive Autocorrelation (continued)
Example with n = 25:

Durbin-Watson Calculations
Sum of Squared Differences of Residuals   3296.18
Sum of Squared Residuals                  3279.98
Durbin-Watson Statistic                   1.00494

d = 3296.18 / 3279.98 = 1.00494

Testing for Positive Autocorrelation (continued)
- Here, n = 25 and there is K = 1 independent variable
- Using the Durbin-Watson table, dL = 1.29 and dU = 1.45
- d = 1.00494 < dL = 1.29, so reject H0 and conclude that significant positive autocorrelation exists
- Therefore the linear model is not an appropriate model for forecasting sales

Dealing with Autocorrelation
Suppose that we want to estimate the coefficients of the regression model
yt = β0 + β1x1t + ... + βKxKt + εt
where the error term εt is autocorrelated. Two steps:
(i) Estimate the model by least squares, obtaining the Durbin-Watson statistic d, and then estimate the autocorrelation parameter using
r = 1 – d/2

Dealing with Autocorrelation (continued)
(ii) Estimate by least squares a second regression with
- dependent variable (yt – ryt-1)
- independent variables (x1t – rx1,t-1), (x2t – rx2,t-1), . . . , (xKt – rxK,t-1)
- The parameters β1, β2, . . . , βK are the estimated regression coefficients from the second model
- An estimate of β0 is obtained by dividing the estimated intercept of the second model by (1 – r)
- Hypothesis tests and confidence intervals for the regression coefficients can be carried out using the output from the second model
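The two-step procedure above (a Cochrane-Orcutt-style transformation) can be sketched for a single explanatory variable. All data here are simulated under assumed parameters (β0 = 1, β1 = 2, AR(1) errors with ρ = 0.7), and r is estimated from the Durbin-Watson statistic as r = 1 – d/2.

```python
# Two-step estimation with autocorrelated errors, single regressor.
# Step (i): OLS, Durbin-Watson d, r = 1 - d/2.
# Step (ii): OLS on the quasi-differenced variables.
import numpy as np

rng = np.random.default_rng(4)
n = 200
x = rng.uniform(0, 10, n)
eps = np.empty(n)
eps[0] = rng.normal()
for t in range(1, n):
    eps[t] = 0.7 * eps[t - 1] + rng.normal()   # AR(1) errors, rho = 0.7
y = 1.0 + 2.0 * x + eps

# Step (i): ordinary least squares, then r = 1 - d/2
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b
d = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
r = 1 - d / 2

# Step (ii): regress (y_t - r*y_{t-1}) on (x_t - r*x_{t-1})
y_star = y[1:] - r * y[:-1]
x_star = x[1:] - r * x[:-1]
Xs = np.column_stack([np.ones(n - 1), x_star])
bs, *_ = np.linalg.lstsq(Xs, y_star, rcond=None)

beta1 = bs[1]                # slope estimate from the transformed model
beta0 = bs[0] / (1 - r)      # intercept recovered by dividing by (1 - r)
print(round(beta1, 2), round(beta0, 2))
```

The estimates should land near the assumed values β1 = 2 and β0 = 1, and inference on the coefficients proceeds from the second regression's output as stated on the slide.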

Chapter Summary
- Discussed regression model building
- Introduced dummy variables for more than two categories and for experimental design
- Used lagged values of the dependent variable as regressors
- Discussed specification bias and multicollinearity
- Described heteroscedasticity
- Defined autocorrelation and used the Durbin-Watson test to detect positive and negative autocorrelation