Multiple Regression Review
Sociology 229A
Copyright © 2008 by Evan Schofer. Do not copy or distribute without permission.


Multiple Regression
Question: What if a dependent variable is affected by more than one independent variable?
Strategy #1: Do separate bivariate regressions
– One regression for each independent variable
– This yields separate slope estimates for each independent variable
– Bivariate slope estimates implicitly assume that neither independent variable mediates the other
– In reality, there might be no effect of family wealth over and above education

Multiple Regression
Job Prestige: two separate regression models
– Both variables have positive, significant slopes

Multiple Regression
Strategy #2: Use multiple regression
Multiple regression can examine "partial" relationships
– Partial = relationships after the effects of other variables have been "controlled" (taken into account)
This lets you determine the effects of variables "over and above" other variables
– And shows the relative impact of different factors on a dependent variable
And, you can use several independent variables to improve your predictions of the dependent variable

Multiple Regression
Job Prestige: two-variable multiple regression
– The education slope is basically unchanged
– The family income slope decreases compared to the bivariate analysis (bivariate: b = 2.07)
– And the outcome of the hypothesis test changes: t < 1.96

Multiple Regression
Ex: Job Prestige, two-variable multiple regression
1. Education has a large effect controlling for (i.e., "over and above") family income
2. Family income does not have much effect controlling for education, despite a strong bivariate relationship
Possible interpretations:
– Family income may lead to education, but education is the critical predictor of job prestige
– Or, family income is wholly unrelated to job prestige... but is coincidentally correlated with a variable that is related (education), which generated a spurious "effect"

The Multiple Regression Model
A two-independent-variable regression model:
Y = a + b1X1 + b2X2 + e
Note: There are now two X variables, and a slope (b) is estimated for each one
The full multiple regression model, for k independent variables, is:
Y = a + b1X1 + b2X2 + ... + bkXk + e
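
As a sketch, the two-variable model above can be estimated by ordinary least squares. The data here are simulated (the variable names and true coefficients are hypothetical, not the job-prestige example from the slides):

```python
import numpy as np

# Hypothetical data: Y built from two predictors plus noise
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)          # e.g., years of education
x2 = rng.normal(size=200)          # e.g., family income (scaled)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(size=200)

# Design matrix with a column of ones for the constant a
X = np.column_stack([np.ones_like(x1), x1, x2])
a, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
print(a, b1, b2)  # estimates should land near 1.0, 2.0, 0.5
```

The column of ones is what lets least squares estimate the constant a alongside the two slopes.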

Multiple Regression: Slopes
Regression slope for the two-variable case:
b1 = [(r(Y,X1) − r(Y,X2)·r(X1,X2)) / (1 − r(X1,X2)²)] × (sY / sX1)
b1 = slope for X1, controlling for the other independent variable X2
b2 is computed symmetrically: swap the X1s and X2s
Compare to the bivariate slope: b = r(Y,X1) × (sY / sX1)

Multiple Regression Slopes Let’s look more closely at the formulas: What happens to b 1 if X 1 and X 2 are totally uncorrelated? Answer: The formula reduces to the bivariate What if X 1 and X 2 are correlated with each other AND X 2 is more correlated with Y than X 1 ? Answer: b 1 gets smaller (compared to bivariate)

Regression Slopes
So, if two variables (X1, X2) are correlated and both predict Y:
– The X variable that is more correlated with Y will have a higher slope in the multivariate regression
– The slope of the less-correlated variable will shrink
Thus, the slope for each variable is adjusted according to how well the other variable predicts Y
– It is the slope "controlling" for other variables

Multiple Regression Slopes
One last thing to keep in mind...
What happens to b1 if X1 and X2 are almost perfectly correlated?
Answer: the denominator (1 − r(X1,X2)²) approaches zero, and the slope "blows up", approaching infinity
Highly correlated independent variables can cause trouble for regression models... watch out
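
A quick way to see the blow-up: the factor 1/(1 − r²) in the slope formula (known in diagnostics as the variance inflation factor) grows without bound as the correlation between the two predictors approaches 1:

```python
import numpy as np

# As r(X1, X2) -> 1, the 1/(1 - r^2) factor in the slope formula explodes
for r12 in (0.0, 0.5, 0.9, 0.99, 0.999):
    vif = 1.0 / (1.0 - r12 ** 2)
    print(r12, round(vif, 1))  # r12 = 0.999 gives a factor over 500
```

This is why near-duplicate predictors produce wildly unstable slope estimates.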

Interpreting Results
(Over)simplified rules for interpretation
– Assumes good sample, measures, models, etc.
Multivariate regression with two variables: A, B
– If the slopes of A and B are the same as in the bivariate regressions, then each has an independent effect
– If A remains large and B shrinks to zero, we typically conclude that the effect of B was spurious, or operates through A
– If both A and B shrink a little, each has an effect, but some overlap or mediation is occurring

Interpreting Multivariate Results
Things to watch out for:
1. Remember: correlation is not causation
– The ability to "control" for many variables can help detect spurious relationships... but it isn't perfect
– Be aware that other (omitted) variables may be affecting your model. Don't over-interpret results
2. Reverse causality
– Many sociological processes involve bi-directional causality. Regression slopes (and correlations) do not identify which variable "causes" the other. Ex: self-esteem and test scores

Standardized Regression Coefficients
Regression slopes reflect the units of the independent variables
Question: How do you compare how "strong" the effects of two variables are if they have totally different units?
Example: education, family wealth, job prestige
– Education measured in years, b = 2.5
– Family wealth measured on a 1-5 scale, b = .18
– Which is a "bigger" effect? The units aren't comparable!
Answer: create "standardized" coefficients

Standardized Regression Coefficients
Standardized coefficients
– Also called "betas" or "beta weights"
– Symbol: Greek beta with an asterisk: β*
– Equivalent to Z-scoring (standardizing) all variables before doing the regression
Formula of the coefficient for Xj:
β*j = bj × (sXj / sY)
Result: the unit is standard deviations
– Betas indicate the effect of a one standard deviation change in Xj on Y
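
A sketch of the equivalence claimed above, using simulated stand-ins for the education/wealth/prestige example (the numbers are invented): the beta from the rescaling formula matches the slope you get by regressing z-scores on z-scores.

```python
import numpy as np

rng = np.random.default_rng(1)
educ = rng.normal(12, 3, 300)       # hypothetical years of education
wealth = rng.normal(3, 1, 300)      # hypothetical 1-5 wealth scale
prestige = 5 + 2.5 * educ + 0.18 * wealth + rng.normal(0, 4, 300)

X = np.column_stack([np.ones(300), educ, wealth])
_, b_educ, b_wealth = np.linalg.lstsq(X, prestige, rcond=None)[0]

# Standardized coefficient via the formula: beta = b * sd(X) / sd(Y)
beta_educ = b_educ * educ.std(ddof=1) / prestige.std(ddof=1)

# Same answer from regressing z-scores on z-scores
def z(v):
    return (v - v.mean()) / v.std(ddof=1)

Zx = np.column_stack([z(educ), z(wealth)])
beta_educ_z = np.linalg.lstsq(Zx, z(prestige), rcond=None)[0][0]
print(beta_educ, beta_educ_z)  # the two computations agree
```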

Standardized Regression Coefficients
Ex: education, family income, and job prestige:
– An increase of 1 standard deviation in education results in a .52 standard deviation increase in job prestige
Betas give you a sense of which variables "matter most"
What is the interpretation of the "family income" beta?

R-Square in Multiple Regression
Multivariate R-square is much like bivariate:
R² = SSregression / SStotal
But SSregression is based on the multivariate regression
The addition of new variables results in better prediction of Y, less error (e), and higher R-square

R-Square in Multiple Regression
Example: an R-square of .272 indicates that education and parents' wealth explain about 27% of the variance in job prestige
"Adjusted R-square" is a more conservative, more accurate measure in multiple regression
– Generally, you should report adjusted R-square
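
A minimal sketch of both quantities on simulated data, using the standard adjustment formula (n = cases, k = predictors):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 200, 2
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1 + x1 + 0.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
coef = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ coef

ss_residual = (resid ** 2).sum()
ss_total = ((y - y.mean()) ** 2).sum()
r2 = 1 - ss_residual / ss_total

# Adjusted R-square penalizes for the number of predictors k
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(r2, adj_r2)  # adjusted is always slightly smaller
```

Because adjusted R-square charges a price for each added predictor, it only rises when a new variable earns its keep.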

Dummy Variables
Question: How can we incorporate nominal variables (e.g., race, gender) into regression?
Option 1: Analyze each sub-group separately
– Generates a different slope and constant for each group
Option 2: Dummy variables
– "Dummy" = a dichotomous variable coded to indicate the presence or absence of something
– Absence coded as zero, presence coded as 1

Dummy Variables
Strategy: create a separate dummy variable for all nominal categories
Ex: Gender: make female & male variables
– DFEMALE: coded as 1 for all women, zero for men
– DMALE: coded as 1 for all men, zero for women
Next: include all but one of the dummy variables in a multiple regression model
– If two dummies, include 1; if 5 dummies, include 4

Dummy Variables
Question: Why can't you include DFEMALE and DMALE in the same regression model?
Answer: They are perfectly (negatively) correlated: r = −1
– Result: the regression model "blows up"
For any set of nominal categories, a full set of dummies contains redundant information
– DMALE and DFEMALE contain the same information
– Dropping one removes the redundant information

Dummy Variables: Interpretation
Consider the following regression equation:
Y = a + b1X + b2(DFEMALE) + e
Question: What if the case is a male?
Answer: DFEMALE is 0, so the entire b2 term becomes zero
– Result: males are modeled using the familiar regression model: a + b1X + e

Dummy Variables: Interpretation
Consider the following regression equation:
Y = a + b1X + b2(DFEMALE) + e
Question: What if the case is a female?
Answer: DFEMALE is 1, so b2(1) stays in the equation (and is added to the constant)
– Result: females are modeled using a different regression line: (a + b2) + b1X + e
– Thus, the coefficient b2 reflects the difference in the constant for women

Dummy Variables: Interpretation
Remember, a different constant generates a different line, either higher or lower
– Variable: DFEMALE (women = 1, men = 0)
– A positive coefficient (b) indicates that women are consistently higher on the dependent variable compared to men
– A negative coefficient indicates women are lower
Example: if the DFEMALE coefficient = 1.2:
– "Women are on average 1.2 points higher than men"
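
A sketch of this interpretation on simulated data: women are built to sit 1.2 points higher than men at every income level (same slope, shifted constant), and the dummy coefficient recovers that shift. All numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
income = rng.normal(50, 10, n)
dfemale = (rng.random(n) < 0.5).astype(float)  # 1 = woman, 0 = man
# Women 1.2 points happier at every income level (same slope for both)
happy = 2.0 + 0.05 * income + 1.2 * dfemale + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), income, dfemale])
a, b1, b2 = np.linalg.lstsq(X, happy, rcond=None)[0]
print(b2)  # near 1.2: the shift in the constant for women
```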

Dummy Variables: Interpretation
[Scatterplot of HAPPY vs. INCOME: women in blue, men in red, with an overall slope drawn through all data points.]
Note: the lines for men and women have the same slope... but one is higher and the other is lower. The constant differs!
If women = 1, men = 0: the constant (a) reflects men only. The dummy coefficient (b) reflects the increase for women (relative to men)

Dummy Variables
What if you want to compare more than 2 groups?
Example: race, coded 1 = white, 2 = black, 3 = other (like the GSS)
Make 3 dummy variables:
– "DWHITE" is 1 for whites, 0 for everyone else
– "DBLACK" is 1 for African Americans, 0 for everyone else
– "DOTHER" is 1 for "others", 0 for everyone else
Then, include two of the three variables in the multiple regression model
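
A minimal sketch of this coding step (the tiny race vector is made up for illustration):

```python
import numpy as np

# GSS-style coding: 1 = white, 2 = black, 3 = other
race = np.array([1, 2, 3, 1, 2, 1, 3, 1])
dwhite = (race == 1).astype(int)
dblack = (race == 2).astype(int)
dother = (race == 3).astype(int)

# The three dummies always sum to 1 for every case -- that redundancy
# is why only two of the three go into the regression. Here whites
# would be the omitted reference category:
print(dblack, dother)
```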

Dummy Variables: Interpretation
Ex: Job Prestige
The negative coefficient for DBLACK indicates a lower level of job prestige compared to whites
– T- and P-values indicate whether the difference is significant

Dummy Variables: Interpretation
Comments:
1. Dummy coefficients shouldn't be called slopes
– Referring to the "slope" of gender doesn't make sense
– Rather, it is the difference in the constant (or "level")
2. The contrast is always with the nominal category that was left out of the equation
– If DFEMALE is included, the contrast is with males
– If DBLACK and DOTHER are included, the coefficients reflect differences in the constant compared to whites

Interaction Terms
Question: What if you suspect that a variable has a totally different slope for two different sub-groups in your data?
Example: income and happiness
– Perhaps men are more materialistic: an extra dollar increases their happiness a lot
– If women are less materialistic, each dollar has a smaller effect on happiness (compared to men)
The issue isn't that men are "more" or "less" happy than women
– Rather, the slope of a variable (income) differs across groups

Interaction Terms Issue isn’t men = “more” or “less” than women –Rather, the slope of a variable coefficient (for income) differs across groups Again, we want to specify a different regression line for each group –We want lines with different slopes, not parallel lines that are higher or lower.

Interaction Terms
[Scatterplot of HAPPY vs. INCOME: women in blue, men in red, with an overall slope drawn through all data points.]
Note: here, the slope for men and women differs. The effect of income on happiness (X1 on Y) varies with gender (X2). This is called an "interaction effect"

Interaction Terms
Examples of interaction:
– The effect of education on income may interact with the type of school attended (public vs. private): private schooling has a bigger effect on income
– The effect of aspirations on educational attainment interacts with poverty: aspirations matter less if you don't have money to pay for college
Question: Can you think of examples of two variables that might interact? Either from your final project, or anything else?

Interaction Terms
Interaction effects: differences in the relationship (slope) between two variables for each category of a third variable
Option #1: Analyze each group separately
– Look for different-sized slopes in each group
Option #2: Multiply the two variables of interest (DFEMALE, INCOME) to create a new variable
– Called: DFEMALE*INCOME
– Add that variable to the multiple regression model

Interaction Terms
Consider the following regression equation:
Y = a + b1(INCOME) + b2(DFEMALE*INCOME) + e
Question: What if the case is male?
Answer: DFEMALE is 0, so the b2(DFEM*INC) term drops out of the equation
– Result: males are modeled using the ordinary regression equation: a + b1X + e

Interaction Terms
Consider the following regression equation:
Y = a + b1(INCOME) + b2(DFEMALE*INCOME) + e
Question: What if the case is female?
Answer: DFEMALE is 1, so b2(DFEM*INC) becomes b2*INCOME, which is added to b1
– Result: females are modeled using a different regression line: a + (b1 + b2)X + e
– Thus, the coefficient b2 reflects the difference in the slope of INCOME for women

Interpreting Interaction Terms
Interpreting interaction terms:
– A positive b for DFEMALE*INCOME indicates the slope for income is higher for women vs. men
– A negative effect indicates the slope is lower
– The size of the coefficient indicates the actual difference in slope
Example: DFEMALE*INCOME. Observed b's:
– INCOME: b = .5
– DFEMALE*INCOME: b = −.2
Interpretation: the slope is .5 for men, .3 for women
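
A sketch of exactly this example on simulated data: income is built with a slope of .5 for men and .3 for women (so the true interaction coefficient is −.2), and the fitted model recovers the two group slopes. The data are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
income = rng.normal(0, 1, n)
dfemale = (rng.random(n) < 0.5).astype(float)
# Slope .5 for men, .3 for women -> interaction coefficient of -.2
happy = 1.0 + 0.5 * income - 0.2 * dfemale * income + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), income, dfemale, dfemale * income])
a, b_inc, b_fem, b_int = np.linalg.lstsq(X, happy, rcond=None)[0]

slope_men = b_inc            # slope when DFEMALE = 0
slope_women = b_inc + b_int  # slope when DFEMALE = 1
print(slope_men, slope_women)  # near .5 and .3
```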

Interpreting Interaction Terms
Example: interaction of race and education affecting job prestige:
DBLACK*EDUC has a negative effect (nearly significant)
The coefficient of −.576 indicates that the slope of education and job prestige is .576 points lower for Blacks than for non-Blacks

Continuous Interaction Terms
Two continuous variables can also interact
Example: the effect of education and income on happiness
– Perhaps highly educated people are less materialistic
– As education increases, the slope between income and happiness would decrease
Simply multiply education and income to create the interaction term EDUCATION*INCOME, and add it to the model

Interpreting Interaction Terms
How do you interpret continuous variable interactions?
Example: EDUCATION*INCOME, coefficient = 2.0
Answer: for each unit change in education, the slope of income vs. happiness increases by 2
– Note: the coefficient is symmetrical: for each unit change in income, the education slope increases by 2
Dummy interactions effectively estimate 2 slopes: one for each group
Continuous interactions result in many slopes: each value of education*income yields a different slope
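
A sketch on simulated data (all coefficients invented): the data are built with a true interaction coefficient of 2.0, so each unit of education adds about 2 to the income slope.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 600
educ = rng.normal(0, 1, n)
income = rng.normal(0, 1, n)
happy = 1 + 0.5 * income + 0.3 * educ + 2.0 * educ * income + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), income, educ, educ * income])
a, b_inc, b_educ, b_int = np.linalg.lstsq(X, happy, rcond=None)[0]

# The slope of income on happiness at a given education level:
def income_slope_at(e):
    return b_inc + b_int * e

print(income_slope_at(0), income_slope_at(1))  # roughly .5 and 2.5
```

Note how the "main effect" b_inc is just the income slope at educ = 0, as the next slide spells out.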

Interpreting Interaction Terms
Interaction terms alter the interpretation of the "main effect" coefficients
– Including EDUC*INCOME changes the interpretation of EDUC and of INCOME (see Allison)
– Specifically, the coefficient for EDUC represents the slope of EDUC when INCOME = 0
– Likewise, INCOME shows the slope when EDUC = 0
– Thus, main effects are like "baseline" slopes
And the interaction effect coefficient shows how the slope grows (or shrinks) for a given unit change

Dummy Interactions
It is also possible to construct interaction terms based on two dummy variables
– Instead of a "slope" interaction, dummy interactions show a difference in constants
– The constant (not the slope) differs across values of a third variable
Example: the effect of race on school success varies by gender
– African Americans do less well in school, but the difference is much larger for black males

Dummy Interactions
The strategy for a dummy interaction is the same: multiply both variables
– Example: multiply DBLACK and DMALE to create DBLACK*DMALE
Then, include all 3 variables in the model
– The effect of DBLACK*DMALE reflects the difference in the constant (level) for black males, compared to white males and black females
You would observe a negative coefficient, indicating that black males fare worse in school than black females or white males
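
A sketch with simulated data (group sizes and effect sizes are invented): black males are built to score an extra 1.5 points lower beyond the separate race and gender shifts, and the dummy-dummy interaction coefficient recovers that extra shift in the constant.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 800
dblack = (rng.random(n) < 0.5).astype(float)
dmale = (rng.random(n) < 0.5).astype(float)
# Black males get an extra -1.5 beyond the two separate main effects
score = (10 - 1.0 * dblack - 0.5 * dmale
         - 1.5 * dblack * dmale + rng.normal(0, 1, n))

X = np.column_stack([np.ones(n), dblack, dmale, dblack * dmale])
a, b_b, b_m, b_bm = np.linalg.lstsq(X, score, rcond=None)[0]
print(b_bm)  # near -1.5: the extra shift in the constant for black males
```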

Interaction Terms: Remarks
1. If you make an interaction, you should also include the component variables in the model:
– A model with DFEMALE*INCOME should also include DFEMALE and INCOME
– There are rare exceptions. But when in doubt, include them
2. Sometimes an interaction term is highly correlated with its components
– That can cause problems (multicollinearity, which we'll discuss more soon)

Interaction Terms: Remarks
3. Make sure you have enough cases in each group for your interaction terms
– Interaction terms involve estimating slopes for sub-groups (e.g., black females vs. black males). If there are hardly any black females in the dataset, you can have problems
4. "Three-way" interactions are also possible!
– An interaction effect that varies across categories of yet another variable
– Ex: the DMALE*DBLACK interaction may vary across class
– They are mainly used in experimental research settings with large sample sizes... but they are possible