

1
Redundancy and Suppression Trivariate Regression

2
Predictors Independent of Each Other
[Venn diagram: Y overlaps X1 (area a) and X2 (area c); X1 and X2 do not overlap; area b = error, the variance in Y explained by neither predictor.]

3
Redundancy: For each X, sr_i and β_i will be smaller in absolute value than r_yi, and the sum of the squared semipartial r's (a + c) will be less than the multiple R² (a + b + c).

4
Formulas Used Here
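The formulas themselves did not survive extraction on this slide; for a standardized trivariate (two-predictor) regression they are presumably the standard ones (a reconstruction, not the slide's own image):

```latex
\beta_1 = \frac{r_{y1} - r_{y2}\,r_{12}}{1 - r_{12}^2}, \qquad
\beta_2 = \frac{r_{y2} - r_{y1}\,r_{12}}{1 - r_{12}^2}, \qquad
sr_1 = \frac{r_{y1} - r_{y2}\,r_{12}}{\sqrt{1 - r_{12}^2}}, \qquad
R^2 = \frac{r_{y1}^2 + r_{y2}^2 - 2\,r_{y1}\,r_{y2}\,r_{12}}{1 - r_{12}^2}
```

All of the worked examples on the following slides can be checked with these four expressions.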

5
Classical Suppression: r_y1 = .38, r_y2 = 0, r_12 = .45. The sign of β and sr for the classical suppressor variable may be opposite that of its zero-order r_12. Notice also that for both predictor variables the absolute value of β exceeds that of the predictor's r with Y.
[Venn diagram: Y, X1, X2]
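A quick numeric check of this example, using the standard trivariate formulas (plain Python; the variable names are mine, not the slide's):

```python
from math import sqrt

# Slide's classical-suppression example: r_y1 = .38, r_y2 = 0, r_12 = .45
ry1, ry2, r12 = 0.38, 0.0, 0.45

denom = 1 - r12 ** 2                     # 1 - r_12^2
beta1 = (ry1 - ry2 * r12) / denom        # standardized coefficient for X1
beta2 = (ry2 - ry1 * r12) / denom        # standardized coefficient for X2
sr1 = (ry1 - ry2 * r12) / sqrt(denom)    # semipartial correlation for X1
sr2 = (ry2 - ry1 * r12) / sqrt(denom)    # semipartial correlation for X2
R2 = (ry1**2 + ry2**2 - 2 * ry1 * ry2 * r12) / denom

print(round(beta1, 4), round(beta2, 4))  # 0.4765 -0.2144: beta2's sign is opposite r_12's
print(round(R2, 4), round(ry1**2, 4))    # 0.1811 0.1444: R^2 exceeds r_y1^2
```

Both |β|'s exceed the predictors' zero-order r's with Y, and R² beats r²_y1 even though X2 alone predicts nothing — the signature of classical suppression.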

6
Classical Suppression, WTF? Adding a predictor that is uncorrelated with Y (for practical purposes, one whose r with Y is close to zero) increased our ability to predict Y? X_2 suppresses the variance in X_1 that is irrelevant to Y (area d).

7
Classical Suppression Math: sr²_1, the squared semipartial for X_1 (r²_y(1·2)), is the r² between Y and the residual (X_1 − X̂_1·2, i.e., X_1 minus X_1 as predicted from X_2). It is increased (relative to r²_y1) by removing from X_1 the irrelevant variance due to X_2: what variance is left in partialed X_1 is better correlated with Y than is unpartialed X_1.
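In symbols, substituting r_y2 = 0 (as in this classical-suppression example) into the standard semipartial formula:

```latex
sr_1 \;=\; \frac{r_{y1} - r_{y2}\,r_{12}}{\sqrt{1 - r_{12}^2}}
      \;=\; \frac{r_{y1}}{\sqrt{1 - r_{12}^2}}
\quad\Longrightarrow\quad
sr_1^2 \;=\; \frac{r_{y1}^2}{1 - r_{12}^2} \;>\; r_{y1}^2 .
```

The denominator 1 − r²_12 is less than 1 whenever r_12 ≠ 0, so partialing X_2 out of X_1 can only inflate the semipartial.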

8
Classical Suppression Math: r²_y1 is less than sr²_1.
[Venn diagram: Y, X1, X2]

9
Net Suppression: r_y1 = .65, r_y2 = .25, and r_12 = .70. Note that β_2 has a sign opposite that of r_y2. It is always the X with the smaller r_yi that ends up with a β of opposite sign. Each β falls outside the range from 0 to r_yi, which is always true with any sort of suppression.
[Venn diagram: Y, X1, X2]
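Checking the slide's net-suppression numbers with the same standard formulas (variable names are mine):

```python
# Slide's net-suppression example: r_y1 = .65, r_y2 = .25, r_12 = .70
ry1, ry2, r12 = 0.65, 0.25, 0.70

denom = 1 - r12 ** 2                     # 0.51
beta1 = (ry1 - ry2 * r12) / denom        # standardized coefficient for X1
beta2 = (ry2 - ry1 * r12) / denom        # standardized coefficient for X2

print(round(beta1, 4), round(beta2, 4))  # 0.9314 -0.402: beta2 flips sign; beta1 > r_y1
```

X2 (the smaller r_yi) gets the flipped sign, and β_1 is pushed above r_y1 — both β's land outside the 0-to-r_yi interval.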

10
Reversal Paradox: a.k.a. Simpson's Paradox. Treating severity of fire as the covariate: when we control for severity of fire, the more fire fighters we send, the less damage is suffered in the fire. That is, for the conditional distributions (where severity of fire is held constant at some set value), sending more fire fighters reduces the amount of damage.

11
Cooperative Suppression: Two X's correlate negatively with one another but positively with Y (or positively with one another and negatively with Y). Each predictor suppresses variance in the other that is irrelevant to Y. Both predictors' β, pr, and sr increase in absolute magnitude (and retain the same sign as r_yi).

12
Cooperative Suppression: Y = how much the students in an introductory psychology class will learn. Subjects are graduate teaching assistants. X_1 is a measure of the graduate student's level of mastery of general psychology. X_2 is an SOIS rating of how well the teacher presents simple, easy-to-understand explanations.

13
Cooperative Suppression: r_y1 = .30, r_y2 = .25, and r_12 = −.35 (the two predictors correlate negatively with one another, as cooperative suppression requires).
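Working this example through the same formulas (assuming r_12 is negative, consistent with the definition of cooperative suppression on the earlier slide; variable names are mine):

```python
# Cooperative-suppression example: r_y1 = .30, r_y2 = .25, r_12 = -.35
ry1, ry2, r12 = 0.30, 0.25, -0.35

denom = 1 - r12 ** 2                     # 0.8775
beta1 = (ry1 - ry2 * r12) / denom        # standardized coefficient for X1
beta2 = (ry2 - ry1 * r12) / denom        # standardized coefficient for X2
R2 = (ry1**2 + ry2**2 - 2 * ry1 * ry2 * r12) / denom

print(round(beta1, 4), round(beta2, 4))  # 0.4416 0.4046: both exceed their r_yi, same sign
print(round(R2, 4))                      # 0.2336: more than r_y1^2 + r_y2^2 = 0.1525
```

Both β's grow in magnitude while keeping the sign of their r_yi, and R² exceeds even the sum of the two squared zero-order correlations.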

14
Summary: When β_i falls outside the range from 0 to r_yi, suppression is taking place. If one r_yi is zero or close to zero, it is classical suppression, and the sign of the β for the X with a nearly zero r_yi may be opposite the sign of its r_yi.

15
Summary: When neither X has r_yi close to zero, but one has a β opposite in sign from its r_yi and the other a β greater in absolute magnitude but of the same sign as its r_yi, net suppression is taking place. If both X's have |β_i| > |r_yi|, with each β of the same sign as its r_yi, then cooperative suppression is taking place.
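The summary rules above can be sketched as a small classifier. This is my own helper, not part of the lecture; the tolerance for "close to zero" is an arbitrary assumption:

```python
def betas(ry1, ry2, r12):
    """Standardized regression coefficients for the two-predictor case."""
    denom = 1 - r12 ** 2
    return (ry1 - ry2 * r12) / denom, (ry2 - ry1 * r12) / denom

def suppression_type(ry1, ry2, r12, tol=0.05):
    """Classify a trivariate pattern using the summary rules on these slides."""
    b1, b2 = betas(ry1, ry2, r12)
    # Suppression: some beta_i falls outside the interval from 0 to r_yi.
    outside = [b < min(0, r) or b > max(0, r) for b, r in ((b1, ry1), (b2, ry2))]
    if not any(outside):
        return "redundancy"                  # no suppression at all
    if abs(ry1) < tol or abs(ry2) < tol:
        return "classical"                   # one r_yi at or near zero
    same_sign = b1 * ry1 > 0 and b2 * ry2 > 0
    if same_sign and abs(b1) > abs(ry1) and abs(b2) > abs(ry2):
        return "cooperative"                 # both betas grow, signs kept
    return "net"                             # one beta flips sign

print(suppression_type(0.38, 0.00, 0.45))    # classical
print(suppression_type(0.65, 0.25, 0.70))    # net
print(suppression_type(0.30, 0.25, -0.35))   # cooperative
```

Fed the three worked examples from the earlier slides, it reproduces the labels the lecture assigns to each.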

16
Psychologist Investigating Suppressor Effects in a Five-Predictor Model
