Interaction Lyytinen & Gaskin.

Interaction – Definition In factorial designs, interaction effects are the joint effects of two predictor variables in addition to their individual main effects. Interaction is another form of moderation (along with multi-group analysis): the X→Y relationship changes form (gets stronger, gets weaker, or changes sign) depending on the value of another explanatory variable, the moderator (Hair et al. 2010, p. 347).

Interaction Effects Interactions represent non-additive effects, i.e., cases where the joint effect of X and Z on Y is more or less than the sum of their individual effects. e.g., Diet × Exercise = greater weight loss; e.g., Chocolate × Cheese = yucky!

Additive versus Interaction effects (cell entries are values of Y)

Additive Effect (X effect equal across Z):
        X=0  X=1  Diff
  Z=0    1    3    2
  Z=1    3    5    2

Negative Interaction (X effect smaller at Z=1):
        X=0  X=1  Diff
  Z=0    1    3    2
  Z=1    3    4    1

Positive Interaction (X effect larger at Z=1):
        X=0  X=1  Diff
  Z=0    1    3    2
  Z=1    3    6    3

Simple interaction example A path model with Diet (X) and Exercise (Z) predicting Weight Loss (Y). The product term Diet × Exercise is added as a third predictor; this interaction term is used for testing the moderating effect of Exercise on the Diet → Weight Loss relationship.

[Path diagrams: Diet → Weight loss, Exercise → Weight loss, and Diet × Exercise → Weight loss (with H2O as an additional predictor), each shown with plots at low and high values of the moderator]

Weight loss example (cell entries are weight loss, WL)

Additive Effect:
          Diet=0  Diet=1  Diff
  Exer=0     1       3      2
  Exer=1     3       5      2
i.e., exercising does not alter the effectiveness of dieting.

Negative Interaction:
          Diet=0  Diet=1  Diff
  Exer=0     1       3      2
  Exer=1     3       4      1
i.e., exercising makes dieting less effective.

Positive Interaction:
          Diet=0  Diet=1  Diff
  Exer=0     1       3      2
  Exer=1     3       6      3
i.e., exercising makes dieting more effective.
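The pattern in these 2×2 tables can be checked numerically: the interaction is the difference-in-differences of the cell means. A minimal Python sketch (the cell values are the illustrative ones from the weight-loss example, not real data):

```python
# Illustrative cell means (rows: Exercise = 0/1, columns: Diet = 0/1).
additive = [[1, 3], [3, 5]]
negative = [[1, 3], [3, 4]]
positive = [[1, 3], [3, 6]]

def interaction(cells):
    """Difference-in-differences: how much the Diet effect
    changes when moving from Exer=0 to Exer=1."""
    diet_effect_no_exer = cells[0][1] - cells[0][0]
    diet_effect_exer = cells[1][1] - cells[1][0]
    return diet_effect_exer - diet_effect_no_exer

print(interaction(additive))  # 0  -> no interaction
print(interaction(negative))  # -1 -> exercising makes dieting less effective
print(interaction(positive))  # 1  -> exercising makes dieting more effective
```

A zero difference-in-differences corresponds to a purely additive model (b3 = 0 in the regression below).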

Other examples Interaction between adding sugar to coffee and stirring the coffee: neither variable on its own has much effect on sweetness, but the combination of the two does. Interaction between adding carbon to steel and quenching: neither on its own has much effect on strength, but the combination has a dramatic effect. Interaction between smoking and inhaling asbestos fibers: both raise lung carcinoma risk on their own, but the combined exposure multiplies the risk well beyond the sum of the individual risks. Interaction between genetic risk factors for type 2 diabetes and diet (specifically, a "western" diet): the western dietary pattern has been shown to increase diabetes risk for subjects with a high "genetic risk score", but not for other subjects.

Interaction in literature Interaction (a form of moderation) is central to research in the organizational and social sciences. Interaction is involved in research demonstrating that: the effects of motivation on job performance are stronger among employees with high ability (Locke & Latham, 1990); the effects of distributive justice on employee reactions are greater when procedural justice is low (Brockner & Wiesenfeld, 1996); and the effects of job demands on illness are weaker when employees have control over their work environment (Karasek, 1979; Karasek & Theorell, 1990). (Edwards 2009)

Why interaction Interactions enable more precise explanation of causal effects: they provide a method for explaining not only how X affects Y, but also under what circumstances the effect of X on Y changes, depending on the value of the moderating variable Z.

Interaction vs. Multi-group The literature makes little distinction between interaction and multi-group moderation; interaction is often treated like multi-group analysis (high vs. low values of age, income, size, etc.). The interpretation is much the same, but the method is different. Multi-group: categorical moderator, split dataset, constrained paths. Interaction: continuous moderator, whole dataset, interaction variables.

The statistical side of it A regular regression equation involving two independent variables: Y = b0 + b1X + b2Z + e. In ordinary least squares (OLS) regression, the product of the two variables can be added to represent the interactive effect: Y = b0 + b1X + b2Z + b3XZ + e, where XZ is the product term representing the interaction effect, and b3 is the change in the slope of the regression of Y on X when Z increases by one unit.
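The interaction regression above can be illustrated outside of AMOS/SPSS. A minimal Python sketch with simulated data (the coefficient values 1.0, 0.5, -0.3, 0.8 are arbitrary choices for the demonstration):

```python
import numpy as np

# Simulate data from Y = b0 + b1*X + b2*Z + b3*X*Z + e
# and recover the coefficients with ordinary least squares.
rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=n)
Z = rng.normal(size=n)
e = rng.normal(scale=0.5, size=n)
Y = 1.0 + 0.5 * X - 0.3 * Z + 0.8 * X * Z + e

# Design matrix: intercept, X, Z, and the product term XZ.
design = np.column_stack([np.ones(n), X, Z, X * Z])
b, *_ = np.linalg.lstsq(design, Y, rcond=None)
b0, b1, b2, b3 = b
print(np.round(b, 2))  # approximately [1.0, 0.5, -0.3, 0.8]
```

Note that the product term XZ is just another column in the design matrix; no special estimator is needed.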

The statistical side of it (cont) Essentially, the interaction regression equation specifies that the slope of the line relating X to Y changes at different levels of Z or, equivalently, that the slope of the line relating Z to Y changes at different levels of X. Saunders (1956) first demonstrated that a product term accurately reflects a continuous-variable interaction. Similarly, powered variables (X², X³, etc.) can be used to represent higher-order nonlinear effects, such as a quadratic or cubic trend of age or time.

Significance of interaction effects Are the slopes of the X → Y regression lines significantly different at differing values of Z? e.g., is the slope of the relationship between diet and weight loss significantly different between those who exercise very little and those who exercise a lot? The way to determine this is to calculate a p-value for the XZ → Y path (AMOS handles this).
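What AMOS reports as the critical ratio for the XZ → Y path is the familiar OLS t statistic for b3. A hedged sketch of that computation in plain NumPy (simulated data; the true interaction here is set to 0.4, so the test should come out significant):

```python
import numpy as np

# Test whether the interaction slope b3 differs from zero
# using the standard OLS t statistic.
rng = np.random.default_rng(1)
n = 1000
X, Z = rng.normal(size=(2, n))
Y = 0.5 * X + 0.2 * Z + 0.4 * X * Z + rng.normal(size=n)

D = np.column_stack([np.ones(n), X, Z, X * Z])
b = np.linalg.lstsq(D, Y, rcond=None)[0]
resid = Y - D @ b
sigma2 = resid @ resid / (n - D.shape[1])   # residual variance
cov_b = sigma2 * np.linalg.inv(D.T @ D)     # coefficient covariance matrix
t_b3 = b[3] / np.sqrt(cov_b[3, 3])          # t statistic for the XZ term
print(round(t_b3, 1))  # well above 1.96, so b3 is significant
```

With a true interaction of zero, t_b3 would fall inside roughly ±1.96 about 95% of the time.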

Significance of interaction part 2 One may also want to know whether the change in the X → Y relationship with and without the interaction effect is significant. This can be done through a rather complex method, for which Preacher provides a tool: http://www.people.ku.edu/~preacher/interact/mlr2.htm

Range of Significance The region of significance defines the specific values of Z at which the regression of Y on X moves from non-significance to significance (see Preacher 2007). The region has lower and upper bounds. In many cases, the regression of Y on the focal predictor is significant at moderator values below the lower bound or above the upper bound, and non-significant at moderator values falling within the region; in some cases, however, the opposite holds (the significant slopes fall within the region). We will not calculate this here, but there are ways to do so (see Preacher 2007).
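The region-of-significance bounds can be sketched directly: the simple slope of Y on X at moderator value z is b1 + b3·z, and its crossing of the significance boundary is found by solving (b1 + b3·z)² = t²·Var(b1 + b3·z), a quadratic in z. A minimal Python sketch on simulated data (this is a hand-rolled illustration of the calculation behind Preacher's online tool, not Preacher's code; the large-sample critical value 1.96 is assumed):

```python
import numpy as np

# Johnson-Neyman style region of significance for the simple
# slope of y on x as a function of the moderator z.
rng = np.random.default_rng(2)
n = 400
x, z = rng.normal(size=(2, n))
y = 0.2 * x + 0.1 * z + 0.3 * x * z + rng.normal(size=n)

D = np.column_stack([np.ones(n), x, z, x * z])
b = np.linalg.lstsq(D, y, rcond=None)[0]
resid = y - D @ b
cov = (resid @ resid / (n - 4)) * np.linalg.inv(D.T @ D)

t2 = 1.96 ** 2  # squared large-sample critical value
# Solve (b1 + b3*z)^2 = t2 * Var(b1 + b3*z), a quadratic in z.
A = b[3] ** 2 - t2 * cov[3, 3]
B = 2 * (b[1] * b[3] - t2 * cov[1, 3])
C = b[1] ** 2 - t2 * cov[1, 1]
lower, upper = sorted(np.roots([A, B, C]).real)
print(lower, upper)  # simple slope is non-significant between these bounds
```

Here the simple slope is zero at z = -b1/b3, so the non-significance region brackets that point and the slope is significant outside the bounds; as the slide notes, the opposite configuration can also occur.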

Statistical Interaction Considerations Multicollinearity: interaction terms can be highly collinear with their constituent IVs; this can be addressed by mean-centering the variables (Edwards 2009). Non-normal distributions: if the IVs are not normally distributed, the product (interaction) term will likely yield biased estimates; nor is it true that the product of normally distributed IVs will itself be normally distributed. This can sometimes be fixed by transforming the IVs and/or the interaction term.

Statistical Interaction Considerations Reliability: low reliability of the IVs or (especially) the moderator is likely to increase type I or type II errors (Edwards 2009). Isolating unique effects: if the XZ → Y effect is significant, both the X → Y and Z → Y paths must remain in the path model, even if non-significant, in order to isolate the unique effect of the interaction term.

Mean Centering One way to reduce the multicollinearity inherent in the use of interaction terms is to mean-center all involved variables: subtract the variable's mean from each response, placing the new mean at zero. Alternatively, you can standardize the variable, replacing its values with the corresponding z-scores (mean = 0, SD = 1).
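The collinearity-reduction effect of centering is easy to demonstrate: when X and Z have non-zero means, the product X·Z is strongly correlated with X itself, and centering before forming the product removes most of that correlation. A minimal sketch (the means 10 and 5 are arbitrary illustrative values):

```python
import numpy as np

# Compare the X vs X*Z correlation before and after mean-centering.
rng = np.random.default_rng(3)
n = 1000
X = rng.normal(loc=10, scale=2, size=n)  # non-zero mean
Z = rng.normal(loc=5, scale=1, size=n)   # non-zero mean

r_raw = np.corrcoef(X, X * Z)[0, 1]      # high: product tracks X

Xc, Zc = X - X.mean(), Z - Z.mean()      # mean-centered copies
r_centered = np.corrcoef(Xc, Xc * Zc)[0, 1]  # near zero

print(round(r_raw, 2), round(r_centered, 2))
```

Centering does not change the interaction coefficient b3 itself; it only reduces the collinearity between the product term and its constituents.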

Benefits of Centering Centering can make otherwise uninterpretable regression coefficients meaningful, and it reduces multicollinearity among the predictor variables. In a model without interaction terms, centering affects only the intercept b0; the slope coefficients are unchanged.

Mean Centering vs. Standardizing (Mean = 5, Std Dev = 2.73)

     Original  Mean Centered  Standardized
  A     1          -4           -1.46
  B     2          -3           -1.10
  C     3          -2           -0.73
  D     4          -1           -0.37
  E     5           0            0.00
  F     6           1            0.37
  G     7           2            0.73
  H     8           3            1.10
  I     9           4            1.46
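The table's values can be reproduced with a few lines of Python (using the sample standard deviation, which comes out to about 2.74; the slide truncates it to 2.73):

```python
# Values 1..9 (cases A..I), as in the table.
values = list(range(1, 10))
mean = sum(values) / len(values)                         # 5.0
var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
sd = var ** 0.5                                          # ~2.7386

centered = [v - mean for v in values]                    # -4 .. +4
standardized = [round((v - mean) / sd, 2) for v in values]
print(centered[0], standardized[0])    # -4.0 -1.46
print(centered[-1], standardized[-1])  # 4.0 1.46
```

Both transformations shift the mean to zero; only standardizing also rescales the spread to SD = 1.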

Higher order interactions We work with 2-way interactions, but interactions are not limited to X and Z. A three-way interaction looks like this: Y = b0 + b1X + b2Z + b3W + b4XZ + b5XW + b6ZW + b7XZW + e, where W is a third predictor and XZW is the three-way interaction term. There are no mathematical limits on the number of interacting terms, but there are certainly practical limitations. One challenge is testing significance: there are approaches for doing this with three variables but not with more. Another challenge is interpreting the interactions and what they truly mean theoretically.

How To 1. In SPSS, create new variables by standardizing all variables in the model (except categorical ones) and then computing a product variable. 2. Use the product variable like an IV in AMOS. 3. Trim the model, starting with interaction effects; don't trim paths from constituent IVs unless the parent interaction is deleted due to insignificance (all the paths and their coefficients need to be interpreted together with the interaction term). 4. Adjust for model fit issues. 5. If interactions are significant, plot them (there is software for this). 6. Interpret the interaction/moderation effects.

1a. Standardize Standardize all variables in the model (unless categorical). [SPSS screenshots showing the standardization step and the resulting variables]

1b. Compute product variable In the SPSS Compute Variable dialog, type a new name (no spaces or mathematical symbols allowed), then type or click the product expression and hit OK. [SPSS screenshots showing the dialog and the resulting product variable]

2. Put in AMOS

3. Trim Trim the model, starting with the interaction effects. [AMOS screenshot of the estimates to examine]

4. Attend to Model Fit Just the normal model fit checks. Don't forget about: SRMR, chi-square/df, CFI, RMSEA, and modification indices.

5. Plot interaction (if the interaction is significant)

6. Interpret In this example, atrust is a stronger predictor of vallong (i.e., has a larger slope value, Beta) for cases of high ctrust. Thus, ctrust positively moderates (amplifies) the effect of atrust on vallong. For small differences like this one, the moderation is significant if the interaction term's Beta is significant (in the AMOS output).

Plotting approach To ease interpretation of interactions, we treat them somewhat like multi-group variables: select high and low values of the moderator. High: one standard deviation above the mean (+1 SD). Low: one standard deviation below the mean (-1 SD).
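The quantities behind such a plot are the simple slopes at -1 SD and +1 SD of the moderator: at moderator value z, the predicted line is Y = (b0 + b2·z) + (b1 + b3·z)·X. A minimal Python sketch computing those two lines from fitted coefficients (simulated data, arbitrary true coefficients; plotting software would then draw the two lines):

```python
import numpy as np

# Compute the low (-1 SD) and high (+1 SD) moderator lines
# for a fitted interaction regression.
rng = np.random.default_rng(4)
n = 2000
X, Z = rng.normal(size=(2, n))
Y = 1.0 + 0.4 * X + 0.2 * Z + 0.5 * X * Z + rng.normal(size=n)

D = np.column_stack([np.ones(n), X, Z, X * Z])
b0, b1, b2, b3 = np.linalg.lstsq(D, Y, rcond=None)[0]

for z in (-Z.std(), Z.std()):       # low / high moderator values
    slope = b1 + b3 * z             # simple slope of Y on X at z
    intercept = b0 + b2 * z
    print(f"z = {z:+.2f}: Y = {intercept:.2f} + {slope:.2f} * X")
```

A positive b3 shows up as the high-moderator line being steeper than the low-moderator line, which is exactly what the plot visualizes.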

More on Interpretation Exercise positively moderates (amplifies) the relationship between diet and weight loss; i.e., exercising makes dieting more effective. Exercise does not moderate the relationship between diet and weight loss; i.e., exercising does not alter the effectiveness of dieting. Exercise negatively moderates (dampens) the relationship between diet and weight loss; i.e., exercising makes dieting less effective. Exercise inversely moderates the relationship between diet and weight loss; i.e., all-or-nothing moderation: do both or do neither.

Additional Resources This site, hosted by Kristopher Preacher (as in the Preacher and Hayes articles), is very informative regarding interactions: http://www.people.ku.edu/~preacher/interact/interactions.htm If you are interested in calculating the region of significance for interaction values, refer to this somewhat complex (yet simplified) tool: http://www.people.ku.edu/~preacher/interact/mlr2.htm