MULTIPLE REGRESSION

OVERVIEW
- What Makes it Multiple?
- Additional Assumptions
- Methods of Entering Variables
- Adjusted R²
- Using z-Scores

WHAT MAKES IT MULTIPLE?
- Predict Y from a combination of two or more predictor (X) variables.
- The regression model may account for more variance with more predictors.
- Look for predictor variables with low intercorrelations.

Multiple Regression Equation
- Like simple regression, use a linear equation to predict Y scores: Y' = b0 + b1X1 + b2X2 + ... + bkXk.
- Use the least squares solution.
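The least-squares solution can be sketched in pure Python for the two-predictor case, where the weights have a closed form built from the pairwise correlations. The function names and data below are illustrative, not from the slides.

```python
from statistics import mean, stdev

def corr(a, b):
    """Pearson correlation (sample formula, matching stdev's n - 1 divisor)."""
    n, ma, mb = len(a), mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / ((n - 1) * stdev(a) * stdev(b))

def two_predictor_ols(x1, x2, y):
    """Least-squares weights for Y' = b0 + b1*X1 + b2*X2 (two predictors only)."""
    r_y1, r_y2, r_12 = corr(y, x1), corr(y, x2), corr(x1, x2)
    # Closed-form two-predictor solution: each weight is a partial effect,
    # correcting the Y-predictor correlation for the predictor intercorrelation.
    b1 = (r_y1 - r_y2 * r_12) / (1 - r_12 ** 2) * (stdev(y) / stdev(x1))
    b2 = (r_y2 - r_y1 * r_12) / (1 - r_12 ** 2) * (stdev(y) / stdev(x2))
    b0 = mean(y) - b1 * mean(x1) - b2 * mean(x2)
    return b0, b1, b2

# Hypothetical data that follow Y = 2 + 3*X1 + 0.5*X2 exactly
x1 = [1, 2, 3, 4, 5]
x2 = [2, 1, 4, 3, 5]
y = [2 + 3 * a + 0.5 * b for a, b in zip(x1, x2)]
b0, b1, b2 = two_predictor_ols(x1, x2, y)  # recovers (2.0, 3.0, 0.5)
```

With more than two predictors the same least-squares idea applies, but the weights come from solving the normal equations — in practice, from software such as SPSS.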

Assumptions for Regression
- Quantitative data (or dichotomous)
- Independent observations
- Predict for the same population that was sampled
- Linear relationship

Assumptions for Regression (continued)
- Homoscedasticity
- Independent errors
- Normality of errors

ADDITIONAL ASSUMPTIONS
- Large ratio of sample size to number of predictor variables: a minimum of 15 subjects per predictor variable
- Predictor variables are not strongly intercorrelated (no multicollinearity): examine the VIF, which should be close to 1
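The VIF check is easy to illustrate in the two-predictor case, where each predictor's VIF reduces to 1 / (1 - r12²); with more predictors, the R² comes from regressing each predictor on all the others. The helper and data below are a sketch, not part of the original slides.

```python
from statistics import mean, stdev

def corr(a, b):
    """Pearson correlation (sample formula)."""
    n, ma, mb = len(a), mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / ((n - 1) * stdev(a) * stdev(b))

def vif_two_predictors(x1, x2):
    """Variance inflation factor for either predictor in a two-predictor model.
    VIF = 1 / (1 - R^2), and with two predictors that R^2 is just r12 squared."""
    r12 = corr(x1, x2)
    return 1.0 / (1.0 - r12 ** 2)

x1 = [1, 2, 3, 4, 5]
x2 = [2, 1, 4, 3, 5]              # r12 = 0.8 for these made-up scores
vif = vif_two_predictors(x1, x2)  # about 2.78 -- noticeably above 1
```

A VIF near 1 means the predictor shares little variance with the other predictors; the farther it climbs above 1, the more its coefficient's sampling variance is inflated by collinearity.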

Multicollinearity
- When predictor variables are highly intercorrelated with each other, prediction accuracy suffers.
- Be cautious about deciding which predictor variable predicts best when collinearity among the predictors is high.

METHODS OF ENTERING VARIABLES
- Simultaneous
- Hierarchical/Block Entry
- Stepwise: Forward, Backward, Stepwise

Simultaneous Multiple Regression
- All predictor variables are entered into the regression at the same time.
- Allows you to determine the portion of variance explained by each predictor with the others statistically controlled (part correlation).

Hierarchical Multiple Regression
- Enter variables in a particular order based on theory or prior research.
- Can be done with blocks of variables.

Stepwise Multiple Regression
- Enter or remove predictor variables one at a time based on whether they explain significant portions of variance in the criterion.
- Variants: Forward, Backward, Stepwise.

Forward Stepwise
- Begin with no predictor variables.
- Add predictors one at a time, at each step choosing the one that produces the largest increase in R².
- Stop when R² cannot be significantly increased.
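The forward procedure can be sketched as a greedy loop. The code below is a simplified illustration: a fixed R² gain threshold (`min_gain`) stands in for the significance test a real stepwise routine would run at each step, and all function names and data are hypothetical.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def r_squared(preds, y):
    """R^2 from regressing y on the given predictor columns (normal equations).
    Assumes the predictors are not perfectly collinear."""
    n, k = len(y), len(preds) + 1
    X = [[1.0] + [p[i] for p in preds] for i in range(n)]
    XtX = [[sum(X[i][a] * X[i][c] for i in range(n)) for c in range(k)] for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    coef = solve(XtX, Xty)
    yhat = [sum(w * X[i][j] for j, w in enumerate(coef)) for i in range(n)]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def forward_select(candidates, y, min_gain=0.02):
    """Greedy forward selection: repeatedly add the candidate predictor that
    most increases R^2, stopping when the best gain falls below min_gain.
    (A real stepwise routine tests each increment with an F-test instead.)"""
    chosen, current_r2 = [], 0.0
    remaining = dict(candidates)
    while remaining:
        name, best_r2 = max(
            ((nm, r_squared([col] + [candidates[c] for c in chosen], y))
             for nm, col in remaining.items()),
            key=lambda t: t[1])
        if best_r2 - current_r2 < min_gain:
            break
        chosen.append(name)
        current_r2 = best_r2
        del remaining[name]
    return chosen, current_r2
```

For example, if y is built exactly from x1 and x2, `forward_select` picks x1 first (largest single-predictor R²), then x2, and stops once further gains fall below the threshold.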

Backward Stepwise
- Begin with all predictor variables.
- Remove predictors one at a time, at each step choosing the one that produces the smallest decrease in R².
- Stop when removing any predictor would significantly decrease R².
- May uncover suppressor variables.

Suppressor Variable
- A predictor variable that, when entered into the equation, increases the amount of variance explained by another predictor variable.
- In backward regression, removing the suppressor would likely produce a significant decrease in R², so it is left in the equation.

Suppressor Variable Example
- Y = Job Performance Rating
- X1 = College GPA
- X2 = Writing Test Score

Suppressor Variable Example (continued)
- Suppose Writing Score is not correlated with Job Performance, because the job doesn't require much writing.
- Suppose GPA is only a weak predictor of Job Performance, even though it seems like it should be a good one.

Suppressor Variable Example (continued)
- Suppose GPA is "contaminated" by differences in writing ability: very good writers can fake their way to higher grades.
- With Writing Score in the equation, that contamination is removed, and we get a clearer picture of the GPA-Job Performance relationship.

Stepwise
- Begin with no predictor variables.
- Add predictors one at a time, at each step choosing the one that produces the largest increase in R².
- At each step, remove any variable that no longer explains a significant portion of variance.
- Stop when R² cannot be significantly increased.

Choosing a Stepwise Method
- Forward: easier to conceptualize; provides an efficient model for predicting Y.
- Backward: can uncover suppressor effects.
- Stepwise: can uncover suppressor effects, but tends to be unstable with smaller Ns.

ADJUSTED R²
- R² may overestimate the true amount of variance explained.
- Adjusted R² compensates by reducing R² according to the ratio of subjects to predictor variables.
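One common version of this adjustment (the formula most packages, including SPSS, report) shrinks R² using the sample size n and the number of predictors k. The numbers below are made up for illustration.

```python
def adjusted_r2(r2, n, k):
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1),
    with n subjects and k predictor variables."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# The same R^2 shrinks more when it was bought with more predictors:
adjusted_r2(0.64, 30, 3)  # about 0.598
adjusted_r2(0.64, 30, 6)  # about 0.546
```

Note how the penalty grows as k approaches n: this is the "ratio of subjects per predictor" doing the work.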

BETA WEIGHTS
- The raw regression weights can be standardized into beta weights.
- Beta weights do not depend on the scales of the variables.
- A beta weight indicates how many standard deviations Y changes for each standard-deviation change in that predictor, with the other predictors held constant.
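The standardization itself is just a rescaling of each raw weight by the ratio of standard deviations; a minimal sketch, with an illustrative check on perfectly linear data:

```python
from statistics import stdev

def beta_weight(b, x, y):
    """Convert a raw regression weight b for predictor x into a beta weight:
    beta = b * SD(X) / SD(Y), i.e., SD units of Y per SD of X."""
    return b * stdev(x) / stdev(y)

# If Y = 3*X exactly, the raw weight is 3 but the beta weight is 1:
x = [1, 2, 3, 4, 5]
y = [3 * xi for xi in x]
beta_weight(3, x, y)  # 1.0
```

Because each beta is in SD units, betas are comparable across predictors measured on different scales, which raw weights are not.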

Example of Reporting Results of Multiple Regression
We performed a simultaneous multiple regression with vocabulary score, abstraction score, and age as predictors and preference for intense music as the dependent variable. The equation accounted for a significant portion of variance, F(3, 66) = 4.47, p = .006. As shown in Table 1, the only significant predictor was abstraction score.

Take-Home Points
- Multiple regression is a useful, flexible method.
- Find the right procedure for your purpose.