Multiple Linear Regression

Multiple Linear Regression Review, January 2018

Squared Correlation Coefficients

Squared Semipartial Correlation
The squared semipartial correlation (sr²) is the proportion of all the variance in Y that is associated with one predictor but not with any of the other predictors. Equivalently, it is the decrease in R² that results from removing that predictor from the model.
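A minimal sketch (toy data, all values assumed, not from the presentation) verifying that sr1² equals the drop in R² when X1 is removed from the model:

```python
import numpy as np

# Toy data: X1 and X2 are correlated predictors of Y.
rng = np.random.default_rng(0)
n = 5000
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=n)

def r2(y, *xs):
    """R-squared from an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), *xs])
    e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - (e**2).sum() / ((y - y.mean())**2).sum()

print(r2(y, x1, x2) - r2(y, x2))   # this difference is sr1 squared
```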

sri
To find sr1, first predict X1 from X2 (and any other predictors) and keep the residual. sri is the simple correlation between ALL of Y and that part of X1 that is not related to any of the other predictors.

Squared Partial Correlation
Of the variance in Y that is not associated with any of the other predictors, what proportion is associated with the variance in Xi?

sr² Related to pr²
With R²(w/o i) denoting the R² from the model that omits Xi: pr² = sr² / (1 − R²(w/o i)), and so sr² = pr² × (1 − R²(w/o i)). Since 1 − R²(w/o i) ≤ 1, pr² is at least as large as sr².

pri
To find pr1, predict Y from X2 and also predict X1 from X2, keeping both sets of residuals. pri is the r between Y partialled for all of the other predictors and Xi partialled for all of the other predictors.
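As a concrete illustration, a short sketch (toy data, assumed values) computing sr1 and pr1 from residuals and checking the relation above:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=n)

def resid(v, w):
    """Residual of v after a simple regression on w."""
    return v - np.polyval(np.polyfit(w, v, 1), w)

r = lambda a, b: np.corrcoef(a, b)[0, 1]

sr1 = r(y, resid(x1, x2))               # all of Y vs. residualized X1
pr1 = r(resid(y, x2), resid(x1, x2))    # residualized Y vs. residualized X1

# With only one other predictor, R-squared without X1 is just ry2 squared.
print(pr1**2, sr1**2 / (1 - r(y, x2)**2))   # the two values agree
```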

A Demonstration
Partial.sas – run this SAS program to obtain an illustration of the partial nature of the coefficients obtained in a multiple regression analysis.
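For readers without SAS, here is a sketch of the same idea in Python (toy data, assumed values; Partial.sas uses its own data set): the multiple-regression weight for X1 equals the slope obtained by first residualizing X1 on X2 and then regressing Y on that residual.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)
y = 1.0 + 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)

def ols(X, y):
    """Least-squares coefficients, intercept first."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_full = ols(np.column_stack([x1, x2]), y)   # full model: b0, b1, b2

a = ols(x2, x1)                              # residualize X1 on X2
x1_resid = x1 - (a[0] + a[1] * x2)
b_part = ols(x1_resid, y)                    # regress Y on the residual

print(b_full[1], b_part[1])                  # identical slopes for X1
```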

More Details: Multiple R² and Partial Correlation/Regression Coefficients

Relative Weights Analysis
Partial regression coefficients exclude variance that is shared among predictors, so it is possible to have a large R² even though none of the predictors has a substantial partial coefficient. There are now methods by which one can partition R² into pseudo-orthogonal portions, each portion representing the relative contribution of one predictor variable.
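The raw and rescaled weights in the table below are consistent with Johnson's (2000) relative-weights procedure; here is a compact sketch of that procedure (my own implementation under that assumption, not code from the presentation). Raw weights sum to R², rescaled weights to 1.

```python
import numpy as np

def relative_weights(X, y):
    """Johnson-style relative weights from a case-by-variable array X and outcome y.
    Returns raw weights (summing to R-squared) and rescaled weights (summing to 1)."""
    p = X.shape[1]
    Rxx = np.corrcoef(X, rowvar=False)                       # predictor intercorrelations
    rxy = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
    # Lambda = Rxx^(1/2): correlations between X and its orthogonal counterpart Z.
    vals, vecs = np.linalg.eigh(Rxx)
    lam = vecs @ np.diag(np.sqrt(np.clip(vals, 0, None))) @ vecs.T
    beta = np.linalg.solve(lam, rxy)                         # weights of Y on Z
    raw = (lam**2) @ (beta**2)                               # one raw weight per predictor
    return raw, raw / raw.sum()
```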

Proportions of Variance

Predictor    r²      sr²     Raw Relative Weight    Rescaled
Teach        .646*   .183*   .344*                  .456
Knowledge    .465*   .071*   .238*                  .316
Exam         .355*   .004    .124*                  .164
Grade        .090*   .007    .027                   .035
Enroll       .057    .010    .022                   .029

(* statistically significant)

If the predictors were orthogonal, the sum of the r² values would equal R², and the values of r² would be identical to the values of sr². Here the sum of the sr² values is .275 while R² = .755, so .755 − .275 = .480: 48% of the variance in Overall is excluded from the squared semipartials due to redundancy among the predictors.

Notice That
The sum of the raw relative weights = .755 = the value of R². The sum of the rescaled relative weights is 100%. The sr² for Exam is not significant, but its raw relative weight is significant.

Predictors Independent of Each Other
[Venn diagram: X1 and X2 each overlap Y but not each other. Area a = variance shared by Y and X1, area c = variance shared by Y and X2, area b = error, the variance in Y shared with neither predictor. Here R² = a + c.]

Redundancy
When the predictors are correlated with each other, then for each X, sri and βi will be smaller than ryi, and the sum of the squared semipartial r's (a + c) will be less than the multiple R² (a + b + c), where b is now the variance in Y shared redundantly with both predictors.

Classical Suppression
Example: ry1 = .38, ry2 = 0, r12 = .45. The sign of β and sr for the classical suppressor variable (here X2) will be opposite that of its non-zero zero-order correlation with the other predictor, r12. Notice also that for both predictor variables the absolute value of β exceeds that of the predictor's r with Y.

Net Suppression
Example: ry1 = .65, ry2 = .25, and r12 = .70. Note that β2 has a sign opposite that of ry2. It is always the X which has the smaller ryi that ends up with a β of opposite sign. Each β falls outside the range from 0 to ryi, which is always true with any sort of suppression.

Net Suppression
If X1 and X2 were independent, R² would be ry1² + ry2² = .4225 + .0625 = .485. With r12 = .70, the weights are β1 = .93 and β2 = −.40, and R² = β1ry1 + β2ry2 = .505: the suppression has actually increased R².
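A quick check of those numbers using the standard two-predictor formulas (this also reproduces the classical-suppression example above):

```python
# Standardized weights and R-squared for two predictors, from the three correlations.
def betas(ry1, ry2, r12):
    b1 = (ry1 - ry2 * r12) / (1 - r12**2)
    b2 = (ry2 - ry1 * r12) / (1 - r12**2)
    return b1, b2

def r_squared(ry1, ry2, r12):
    b1, b2 = betas(ry1, ry2, r12)
    return b1 * ry1 + b2 * ry2

print(betas(.38, .00, .45))       # classical: ( 0.477, -0.214)
print(betas(.65, .25, .70))       # net:       ( 0.931, -0.402)
print(r_squared(.65, .25, .70))   # 0.505, larger than .485
```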

Reversal Paradox
Also known as Simpson's Paradox. Treating severity of fire as the covariate: when we control for severity of fire, the more fire fighters we send, the less the amount of damage suffered in the fire. That is, for the conditional distributions (where severity of fire is held constant at some set value), sending more fire fighters reduces the amount of damage, even though the zero-order association runs the other way.
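A small simulation of this reversal (all numbers assumed, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
severity = rng.normal(size=n)
firefighters = 2.0 * severity + rng.normal(size=n)    # dispatch scales with severity
damage = 3.0 * severity - 0.5 * firefighters + rng.normal(size=n)

# Zero-order: more fire fighters goes with MORE damage (severity confounds).
print(np.polyfit(firefighters, damage, 1)[0])         # positive slope

# Controlling for severity, the partial slope is negative.
X = np.column_stack([np.ones(n), firefighters, severity])
print(np.linalg.lstsq(X, damage, rcond=None)[0][1])   # about -0.5
```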

Cooperative Suppression
Two X's correlate negatively with one another but positively with Y (or positively with one another and negatively with Y). Each predictor suppresses variance in the other that is irrelevant to Y, so both predictors' β, pr, and sr increase in absolute magnitude (and retain the same sign as ryi).

Cooperative Suppression
Y = how much the students in an introductory psychology class will learn. Subjects are graduate teaching assistants. X1 is a measure of the graduate student's level of mastery of general psychology. X2 is an SOIS rating of how well the teacher presents simple, easy-to-understand explanations.

Cooperative Suppression
Example: ry1 = .30, ry2 = .25, and r12 = −.35 (negative, as cooperative suppression requires here). If X1 and X2 were independent, R² would be .09 + .0625 = .1525; with the negative intercorrelation, β1 = .44, β2 = .40, and R² rises to about .234.

Summary When i falls outside the range of 0  ryi, suppression is taking place If one ryi is zero or close to zero, it is classic suppression, and the sign of the  for the X with a nearly zero ryi may be opposite the sign of ryi.

Summary
When neither X has ryi close to zero, but one has a β opposite in sign from its ryi and the other a β greater in absolute magnitude but of the same sign as its ryi, net suppression is taking place. If both X's have |βi| > |ryi|, with each β of the same sign as its ryi, then cooperative suppression is taking place.
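These rules are mechanical enough to code. A sketch for the two-predictor case (the "close to zero" threshold eps is my assumption):

```python
def suppression_type(ry1, ry2, r12, eps=0.05):
    """Classify two-predictor suppression per the summary slides above."""
    b1 = (ry1 - ry2 * r12) / (1 - r12**2)
    b2 = (ry2 - ry1 * r12) / (1 - r12**2)

    def outside(b, r):                 # does beta fall outside the range 0..ryi?
        return b * r < 0 or abs(b) > abs(r)

    if not (outside(b1, ry1) or outside(b2, ry2)):
        return "no suppression"
    if min(abs(ry1), abs(ry2)) < eps:
        return "classical"
    if b1 * ry1 < 0 or b2 * ry2 < 0:
        return "net"
    return "cooperative"

print(suppression_type(.38, .00, .45))    # classical
print(suppression_type(.65, .25, .70))    # net
print(suppression_type(.30, .25, -.35))   # cooperative
```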

Psychologist Investigating Suppressor Effects in a Five-Predictor Model