
1 Multicollinearity

2 Overview
This set of slides:
– What is multicollinearity?
– What are its effects on regression analyses?
Next set of slides:
– How to detect multicollinearity?
– How to reduce some types of multicollinearity?

3 Multicollinearity
Multicollinearity exists when two or more of the predictors in a regression model are moderately or highly correlated.
Regression analyses most often take place on data obtained from observational studies.
In observational studies, multicollinearity happens more often than not.
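As a concrete illustration of the definition, here is a minimal sketch (Python, numpy) that inspects the sample correlation matrix of a set of predictors. The data are synthetic stand-ins; the variable names echo the slides' dataset, whose numeric values were not preserved in this transcript.

```python
# Minimal sketch: flag moderately/highly correlated predictor pairs
# by inspecting the sample correlation matrix. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
n = 20
age = rng.normal(48, 4, n)                     # assumed values
weight = rng.normal(93, 2, n)                  # assumed values
bsa = 0.02 * weight + rng.normal(0, 0.02, n)   # built to track weight

X = np.column_stack([age, weight, bsa])
print(np.round(np.corrcoef(X, rowvar=False), 3))
# A large off-diagonal entry (here, weight vs. bsa) signals multicollinearity.
```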

4 Types of multicollinearity
Structural multicollinearity – a mathematical artifact caused by creating new predictors from other predictors, such as creating the predictor x² from the predictor x.
Sample-based multicollinearity – a result of a poorly designed experiment, reliance on purely observational data, or the inability to manipulate the system on which the data are collected.
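A minimal sketch of the structural case, under assumed values: a predictor and its square are usually highly correlated simply because one is computed from the other.

```python
# Structural multicollinearity: x and x^2 are highly correlated
# whenever x does not straddle zero symmetrically. Assumed values.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(20, 60, size=100)   # e.g., a positive-valued predictor
x_sq = x ** 2                       # new predictor created from x

print(np.corrcoef(x, x_sq)[0, 1])   # close to 1
```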

5 Example
n = 20 hypertensive individuals
p - 1 = 6 predictor variables

6 Example
[Table: pairwise correlations among BP, Age, Weight, BSA, Duration, Pulse, and Stress; numeric values not preserved.]
Blood pressure (BP) is the response.

7 What is the effect on regression analyses if the predictors are perfectly uncorrelated?

8 Example
[Data table with columns x1, x2, y; values not preserved.]
Pearson correlation of x1 and x2 = 0.000
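Perfectly uncorrelated predictors typically come from a balanced designed experiment. The sketch below builds a hypothetical balanced 2×2 design (the slide's actual x1, x2, y columns were not preserved) and confirms the zero correlation.

```python
# A balanced 2x2 design: every level of x1 is paired with every level
# of x2 equally often, so their sample correlation is exactly zero.
# Hypothetical values, not the slide's data.
import numpy as np

x1 = np.array([2., 2., 2., 2., 4., 4., 4., 4.])
x2 = np.array([5., 5., 10., 10., 5., 5., 10., 10.])
print(np.corrcoef(x1, x2)[0, 1])   # 0.0
```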

9 Example

10 Regress y on x1
The regression equation is y = … + … x1
[Coefficient table (Predictor, Coef, SE Coef, T, P) and analysis-of-variance table (Source, DF, SS, MS, F, P); numeric values not preserved.]

11 Regress y on x2
The regression equation is y = … + … x2
[Coefficient table and analysis-of-variance table; numeric values not preserved.]

12 Regress y on x1 and x2
The regression equation is y = … + … x1 + … x2
[Coefficient table, analysis-of-variance table, and sequential sums of squares (x1 entered first, then x2); numeric values not preserved.]

13 Regress y on x2 and x1
The regression equation is y = … + … x2 + … x1
[Coefficient table, analysis-of-variance table, and sequential sums of squares (x2 entered first, then x1); numeric values not preserved.]

14 Summary of results

Model               b1   se(b1)   b2   se(b2)   Seq SS
x1 only             …    …        –    –        SSR(X1) = 3.13
x2 only             –    –        …    …        SSR(X2) = …
x1, x2 (in order)   …    …        …    …        SSR(X2|X1) = …
x2, x1 (in order)   …    …        …    …        SSR(X1|X2) = 3.13

15 If predictors are perfectly uncorrelated, then… You get the same slope estimates regardless of the first-order regression model used. That is, the effect on the response ascribed to a predictor doesn’t depend on the other predictors in the model.

16 If predictors are perfectly uncorrelated, then… The sum of squares SSR(X 1 ) is the same as the sequential sum of squares SSR(X 1 |X 2 ). The sum of squares SSR(X 2 ) is the same as the sequential sum of squares SSR(X 2 |X 1 ). That is, the marginal contribution of one predictor variable in reducing the error sum of squares doesn’t depend on the other predictors in the model.
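Both claims on the last two slides can be checked numerically. A sketch using statsmodels, with the hypothetical balanced design from earlier and an assumed synthetic response (the slide's y values were not preserved):

```python
# With perfectly uncorrelated predictors: identical slope estimates in
# every first-order model, and SSR(X1) equals SSR(X1|X2) exactly.
import numpy as np
import statsmodels.api as sm

x1 = np.array([2., 2., 2., 2., 4., 4., 4., 4.])
x2 = np.array([5., 5., 10., 10., 5., 5., 10., 10.])
rng = np.random.default_rng(1)
y = 40 + 3 * x1 - 0.6 * x2 + rng.normal(0, 1, 8)   # assumed response

def fit(*cols):
    return sm.OLS(y, sm.add_constant(np.column_stack(cols))).fit()

m1, m2, m12 = fit(x1), fit(x2), fit(x1, x2)
print(m1.params[1], m12.params[1])   # same b1, with or without x2
print(m2.params[1], m12.params[2])   # same b2, with or without x1
print(m1.ess, m12.ess - m2.ess)      # SSR(X1) equals SSR(X1|X2)
```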

17 Do we see similar results for “real data” with nearly uncorrelated predictors?

18 Example
[Table: pairwise correlations among BP, Age, Weight, BSA, Duration, Pulse, and Stress; numeric values not preserved.]

19 Example

20 Regress BP on x1 = Stress
The regression equation is BP = … + … Stress
S = …   R-Sq = 2.7%   R-Sq(adj) = 0.0%
[Coefficient table and analysis-of-variance table; numeric values not preserved.]

21 Regress BP on x2 = BSA
The regression equation is BP = … + … BSA
S = …   R-Sq = 75.0%   R-Sq(adj) = 73.6%
[Coefficient table and analysis-of-variance table; numeric values not preserved.]

22 Regress BP on x1 = Stress and x2 = BSA
The regression equation is BP = … + … Stress + … BSA
[Coefficient table, analysis-of-variance table, and sequential sums of squares (Stress entered first, then BSA); numeric values not preserved.]

23 Regress BP on x2 = BSA and x1 = Stress
The regression equation is BP = … + … BSA + … Stress
[Coefficient table, analysis-of-variance table, and sequential sums of squares (BSA entered first, then Stress); numeric values not preserved.]

24 Summary of results

Model               b1   se(b1)   b2   se(b2)   Seq SS
x1 only             …    …        –    –        SSR(X1) = …
x2 only             –    –        …    …        SSR(X2) = …
x1, x2 (in order)   …    …        …    …        SSR(X2|X1) = …
x2, x1 (in order)   …    …        …    …        SSR(X1|X2) = 12.26

25 If predictors are nearly uncorrelated, then… You get similar slope estimates regardless of the first-order regression model used. The sum of squares SSR(X 1 ) is similar to the sequential sum of squares SSR(X 1 |X 2 ). The sum of squares SSR(X 2 ) is similar to the sequential sum of squares SSR(X 2 |X 1 ).

26 What happens if the predictor variables are highly correlated?

27 Example
[Table: pairwise correlations among BP, Age, Weight, BSA, Duration, Pulse, and Stress; numeric values not preserved.]

28 Example

29 Regress BP on x1 = Weight
The regression equation is BP = … + … Weight
S = …   R-Sq = 90.3%   R-Sq(adj) = 89.7%
[Coefficient table and analysis-of-variance table; numeric values not preserved.]

30 Regress BP on x2 = BSA
The regression equation is BP = … + … BSA
S = …   R-Sq = 75.0%   R-Sq(adj) = 73.6%
[Coefficient table and analysis-of-variance table; numeric values not preserved.]

31 Regress BP on x1 = Weight and x2 = BSA
The regression equation is BP = … + … Weight + … BSA
[Coefficient table, analysis-of-variance table, and sequential sums of squares (Weight entered first, then BSA); numeric values not preserved.]

32 Regress BP on x2 = BSA and x1 = Weight
The regression equation is BP = … + … BSA + … Weight
[Coefficient table, analysis-of-variance table, and sequential sums of squares (BSA entered first, then Weight); numeric values not preserved.]

33 Effect #1 of multicollinearity
When predictor variables are correlated, the regression coefficient of any one variable depends on which other predictor variables are included in the model.

Variables in model   b1   b2
x1                   …    –
x2                   –    …
x1, x2               …    …
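A sketch of Effect #1 on synthetic data: x2 is constructed to be strongly correlated with x1 (loosely standing in for the Weight/BSA pair), and the estimated coefficient of x1 shifts once x2 enters the model. All numbers are assumptions, not the slides' data.

```python
# Effect #1: the coefficient of x1 depends on whether the correlated
# predictor x2 is in the model. Synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 20
x1 = rng.normal(93, 2, n)                  # stand-in for Weight
x2 = 0.02 * x1 + rng.normal(0, 0.01, n)    # stand-in for BSA; corr ~ 0.97
y = 2 + 1.2 * x1 + 30 * x2 + rng.normal(0, 1, n)

m1 = sm.OLS(y, sm.add_constant(x1)).fit()
m12 = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
print("b1, x1 alone:   ", m1.params[1])
print("b1, x2 in model:", m12.params[1])   # noticeably different
```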

34 Even correlated predictors not in the model can have an impact!
Regression of territory sales on territory population, per capita income, etc.
Against expectation, the coefficient of territory population turned out to be negative.
The competitor's market penetration, which was strongly positively correlated with territory population, was not included in the model.
But the competitor kept sales down in territories with large populations.

35 Effect #2 of multicollinearity
When predictor variables are correlated, the precision of the estimated regression coefficients decreases as more predictor variables are added to the model.

Variables in model   se(b1)   se(b2)
x1                   …        –
x2                   –        …
x1, x2               …        …
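Effect #2 can be seen on the same synthetic setup: the standard error of b1 inflates once the correlated predictor joins the model.

```python
# Effect #2: se(b1) grows when the correlated predictor x2 is added.
# Same synthetic data as the Effect #1 sketch.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 20
x1 = rng.normal(93, 2, n)
x2 = 0.02 * x1 + rng.normal(0, 0.01, n)
y = 2 + 1.2 * x1 + 30 * x2 + rng.normal(0, 1, n)

m1 = sm.OLS(y, sm.add_constant(x1)).fit()
m12 = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
print("se(b1), x1 alone:   ", m1.bse[1])
print("se(b1), x2 in model:", m12.bse[1])   # several times larger
```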

36 Why don't effects #1 and #2 occur when the predictors are uncorrelated?

37 [Figure slide; image not preserved.]

38 Why do effects #1 and #2 occur when the predictors are correlated?

39 Effect #3 of multicollinearity
When predictor variables are correlated, the marginal contribution of any one predictor variable in reducing the error sum of squares varies depending on which other variables are already in the model.
SSR(X1) = …   SSR(X1|X2) = …
SSR(X2) = …   SSR(X2|X1) = 2.81
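Effect #3 on the same synthetic data: the sequential sum of squares for x1 shrinks drastically once x2 is already in the model. SSR here is computed as the model's explained (regression) sum of squares, available as .ess in statsmodels.

```python
# Effect #3: the sequential sum of squares for x1 depends on whether
# x2 is already in the model. Same synthetic data as above.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 20
x1 = rng.normal(93, 2, n)
x2 = 0.02 * x1 + rng.normal(0, 0.01, n)
y = 2 + 1.2 * x1 + 30 * x2 + rng.normal(0, 1, n)

def ssr(*cols):
    return sm.OLS(y, sm.add_constant(np.column_stack(cols))).fit().ess

print("SSR(X1)    =", ssr(x1))
print("SSR(X1|X2) =", ssr(x2, x1) - ssr(x2))   # far smaller than SSR(X1)
```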

40 What is the effect on estimating the mean response or predicting a new response?

41 Effect #4 of multicollinearity on estimating the mean or predicting Y
High multicollinearity among predictor variables does not prevent good, precise predictions of the response (within the scope of the model).

Model         Fit   SE Fit   95.0% CI            95.0% PI
Weight        …     …        (111.85, 113.54)    (108.94, 116.44)
BSA           …     …        (112.76, 115.38)    (108.06, 120.08)
BSA, Weight   …     …        (111.93, 113.83)    (109.08, …)
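Effect #4 on the same synthetic data: despite the strong x1–x2 correlation, a prediction at a point inside the data's scope still comes with narrow intervals. The new point's values are assumptions chosen to lie within the observed cloud of (x1, x2) pairs.

```python
# Effect #4: prediction within scope stays precise even under high
# multicollinearity. Synthetic data; new point (92, 1.85) is assumed.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 20
x1 = rng.normal(93, 2, n)
x2 = 0.02 * x1 + rng.normal(0, 0.01, n)
y = 2 + 1.2 * x1 + 30 * x2 + rng.normal(0, 1, n)

full = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
new = [[1.0, 92.0, 1.85]]                  # constant, x1, x2
pred = full.get_prediction(new)
print(pred.summary_frame(alpha=0.05))      # narrow mean CI and PI
```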

42 What is the effect on tests of individual slopes?
Regress BP on BSA:
The regression equation is BP = … + … BSA
S = …   R-Sq = 75.0%   R-Sq(adj) = 73.6%
[Coefficient table and analysis-of-variance table; numeric values not preserved.]

43 What is the effect on tests of individual slopes?
Regress BP on Weight:
The regression equation is BP = … + … Weight
S = …   R-Sq = 90.3%   R-Sq(adj) = 89.7%
[Coefficient table and analysis-of-variance table; numeric values not preserved.]

44 Regress BP on Weight and BSA
The regression equation is BP = … + … Weight + … BSA
[Coefficient table, analysis-of-variance table, and sequential sums of squares (Weight entered first, then BSA); numeric values not preserved.]

45 Effect #5 of multicollinearity on slope tests
When predictor variables are correlated, hypothesis tests for βk = 0 may yield different conclusions depending on which predictor variables are in the model.

Variables in model   b2   se(b2)   t   P-value
x2                   …    …        …   …
x1, x2               …    …        …   …
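Effect #5 on the same synthetic data: the p-value for testing β2 = 0 is typically tiny when x2 is the only predictor, yet much larger once x1 joins the model.

```python
# Effect #5: the test of beta_2 = 0 can reach different conclusions
# depending on which predictors are in the model. Same synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 20
x1 = rng.normal(93, 2, n)
x2 = 0.02 * x1 + rng.normal(0, 0.01, n)
y = 2 + 1.2 * x1 + 30 * x2 + rng.normal(0, 1, n)

m2 = sm.OLS(y, sm.add_constant(x2)).fit()
m12 = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
print("p-value for x2, alone:  ", m2.pvalues[1])    # typically tiny
print("p-value for x2, with x1:", m12.pvalues[2])   # typically much larger
```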

46 The major impacts on model use
In the presence of multicollinearity, it is okay to use an estimated regression model to predict y or estimate μY within the scope of the model.
In the presence of multicollinearity, we can no longer interpret a slope coefficient as the change in the mean response for each additional unit increase in xk when all the other predictors are held constant.
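The interpretation caveat can also be seen in the data itself: with highly correlated predictors there are almost no observations where x2 stays (nearly) fixed while x1 varies, so "increase x1 by one unit holding x2 constant" describes a move the data barely support. Same synthetic predictors as the sketches above.

```python
# Why "holding the other predictor constant" breaks down: few points
# vary x1 while x2 stays (nearly) fixed. Same synthetic predictors.
import numpy as np

rng = np.random.default_rng(2)
n = 20
x1 = rng.normal(93, 2, n)
x2 = 0.02 * x1 + rng.normal(0, 0.01, n)

band = np.abs(x2 - x2.mean()) < 0.25 * x2.std()
print(band.sum(), "of", n, "points have x2 pinned near its mean")
if band.sum() >= 2:
    print("x1 range there:", round(x1[band].max() - x1[band].min(), 2),
          "vs overall:", round(x1.max() - x1.min(), 2))
```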

