LECTURE 5 MULTIPLE REGRESSION TOPICS –SQUARED MULTIPLE CORRELATION –B AND BETA WEIGHTS –HIERARCHICAL REGRESSION MODELS –SETS OF INDEPENDENT VARIABLES –SIGNIFICANCE TESTING SETS –POWER –ERROR RATES


SQUARED MULTIPLE CORRELATION –Measure of the variance in y accounted for by the predictors –Never decreases (increases or stays the same) as predictors are added –Always >= 0 in OLS –More stable across samples than the individual predictor weights (compensatory effect across samples)
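These properties can be checked numerically. The following is an illustrative numpy sketch (not part of the original slides; the data and variable names are invented for the example) showing that R² stays in [0, 1] and never decreases when a predictor is added, even a pure-noise one:

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an OLS fit of y on X, with an intercept column added."""
    Xc = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    ss_e = np.sum((y - Xc @ b) ** 2)
    ss_y = np.sum((y - y.mean()) ** 2)
    return 1 - ss_e / ss_y

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
noise_pred = rng.normal(size=n)          # unrelated to y by construction
y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)

r2_one = r_squared(x1[:, None], y)
r2_two = r_squared(np.column_stack([x1, x2]), y)
r2_junk = r_squared(np.column_stack([x1, x2, noise_pred]), y)

# In-sample R^2 can only rise (or stay the same) as predictors are added
assert 0.0 <= r2_one <= r2_two <= r2_junk <= 1.0
```

The junk predictor still nudges R² upward in the sample, which is exactly why adjusted R² or a significance test for the increment (later slides) is needed.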

Multiple regression analysis The test of the overall hypothesis that y is unrelated to all p predictors, H0: ρ²y·12…p = 0 vs. H1: ρ²y·12…p > 0, is tested by F = [R²y·12…p / p] / [(1 – R²y·12…p) / (n – p – 1)] = [SSreg / p] / [SSe / (n – p – 1)]
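The two forms of the overall F are algebraically identical, since dividing SSreg and SSe by SSy converts them to R² and 1 – R². A small numpy check (not from the slides; data simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 60, 3
X = rng.normal(size=(n, p))
y = X @ np.array([0.6, 0.4, 0.0]) + rng.normal(size=n)

# OLS fit with intercept
Xc = np.column_stack([np.ones(n), X])
b, *_ = np.linalg.lstsq(Xc, y, rcond=None)
ss_e = np.sum((y - Xc @ b) ** 2)
ss_y = np.sum((y - y.mean()) ** 2)
ss_reg = ss_y - ss_e
r2 = ss_reg / ss_y

# F from the R^2 form and from the SS form agree
f_from_r2 = (r2 / p) / ((1 - r2) / (n - p - 1))
f_from_ss = (ss_reg / p) / (ss_e / (n - p - 1))
assert abs(f_from_r2 - f_from_ss) < 1e-8
```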

[Fig. 8.4: Venn diagram for multiple regression with two predictors and one outcome measure – overlapping regions SSx1, SSx2, SSreg, SSy, and SSe]


[Fig. 8.5: Type I and Type III contributions of SSx1 and SSx2 to SSy, with error SSe]

B and Beta Weights B weights –are t-distributed under multinormality –give the change in y per unit change in predictor x –are the “raw” or “unstandardized” coefficients

B and Beta Weights Beta weights –are NOT t-distributed, so there is no exact significance test for them –give the change in y in standard-deviation units per standard-deviation change in predictor x –are the “standardized” coefficients –are more easily interpreted
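The relation between the two kinds of weights is beta_j = b_j · sd(x_j) / sd(y): fitting z-scored variables gives the beta weights directly. A numpy sketch (not from the slides; the data are simulated, with deliberately different predictor scales):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(scale=2.0, size=n)     # large-scale predictor
x2 = rng.normal(scale=0.5, size=n)     # small-scale predictor
y = 1.5 * x1 + 4.0 * x2 + rng.normal(size=n)
X = np.column_stack([x1, x2])

# Raw (b) weights from an OLS fit with intercept
Xc = np.column_stack([np.ones(n), X])
b = np.linalg.lstsq(Xc, y, rcond=None)[0][1:]

# Beta weights: refit after z-scoring every variable (no intercept needed)
z = lambda v: (v - v.mean()) / v.std()
Zx = np.column_stack([z(x1), z(x2)])
beta = np.linalg.lstsq(Zx, z(y), rcond=None)[0]

# Conversion between the two: beta_j = b_j * sd(x_j) / sd(y)
assert np.allclose(beta, b * X.std(axis=0) / y.std())
```

Note how the b weights reflect the arbitrary measurement scales, while the betas are comparable across predictors.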

PATH DIAGRAM FOR REGRESSION – Beta weight form [diagram: X1 → Y with β = .5, X2 → Y with β = .6, r(X1, X2) = .4, error e] Implied validities: r(Y, X1) = .5 + .6(.4) = .74, r(Y, X2) = .6 + .5(.4) = .8 R² = .5(.74) + .6(.8) = .85
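The arithmetic for this diagram can be verified two ways, since R² = Σ βj·r(Y, Xj) equals the expanded form β1² + β2² + 2β1β2r12 for two predictors (a quick check, not part of the slides):

```python
b1, b2, r12 = 0.5, 0.6, 0.4

# Validities implied by the path model (standardized tracing rules)
r_y1 = b1 + b2 * r12   # .5 + .6(.4) = .74
r_y2 = b2 + b1 * r12   # .6 + .5(.4) = .80

# Two equivalent expressions for R^2
r2_a = b1 * r_y1 + b2 * r_y2
r2_b = b1**2 + b2**2 + 2 * b1 * b2 * r12
assert abs(r2_a - 0.85) < 1e-12
assert abs(r2_a - r2_b) < 1e-12
```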

[Path diagram example: LOC. CON., SELF-EST, and SELF-REL predicting DEPRESSION, R² = .60, with error term e]

PATH DIAGRAM FOR REGRESSIONS – Beta weight form [diagram: X1 and X2 predict Y1 (β = .2 and β = .3, r(X1, X2) = .35*, error e1); X1, X2, and Y1 predict Y2 (β = .2, .5, and .3, error e2), R²y2 = .2]

HIERARCHICAL REGRESSION –Predictors entered in SETS –First set: causally prior variables, existing conditions, or a theoretically/empirically established structure –Next set added to decide if the model changes –Tests for mediation effects –Gives the independent contribution of the new set to R-square
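The two-step entry can be sketched as follows (an illustrative numpy example, not from the slides; the "background" and "new" sets are simulated): fit the prior set first, then add the second set and read off the R² change as its independent contribution.

```python
import numpy as np

def r2(X, y):
    """R^2 of an OLS fit with intercept."""
    Xc = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return 1 - np.sum((y - Xc @ b) ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(3)
n = 150
background = rng.normal(size=(n, 2))   # set 1: causally prior conditions
new_set = rng.normal(size=(n, 2))      # set 2: predictors under study
y = (background @ np.array([0.5, 0.3])
     + new_set @ np.array([0.4, 0.0])
     + rng.normal(size=n))

r2_step1 = r2(background, y)
r2_step2 = r2(np.column_stack([background, new_set]), y)
delta_r2 = r2_step2 - r2_step1         # independent contribution of set 2
assert delta_r2 >= 0
```

Whether delta_r2 is significantly greater than zero is the set-testing question taken up two slides below.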

HIERARCHICAL REGRESSION Sample-focused procedures: –Forward regression –Backward regression –Stepwise regression Criteria may include: R-square change in the sample, error reduction

STATISTICAL TESTING – Single additional predictor R-square change: F-test for the increase in SS from one added predictor B, in relation to MSerror for the complete model: F(1, dfe) = (SSA+B – SSA) / MSeAB Equivalently, t = byB / se(byB), with t² = F [Venn diagrams: regions A and B of the variance in Y]
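The F for one added predictor and the squared t for its b weight in the full model are the same statistic. A numpy check (not from the slides; data simulated for illustration):

```python
import numpy as np

def fit(X, y):
    """Return (SSreg, SSe, coefficients) from an OLS fit with intercept."""
    Xc = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    ss_e = np.sum((y - Xc @ b) ** 2)
    ss_y = np.sum((y - y.mean()) ** 2)
    return ss_y - ss_e, ss_e, b

rng = np.random.default_rng(4)
n = 120
A = rng.normal(size=(n, 2))            # predictors already in the model
xb = rng.normal(size=n)                # one additional predictor B
y = A @ np.array([0.5, 0.2]) + 0.4 * xb + rng.normal(size=n)

ss_reg_A, _, _ = fit(A, y)
AB = np.column_stack([A, xb])
ss_reg_AB, ss_e_AB, b = fit(AB, y)

dfe = n - AB.shape[1] - 1
mse_AB = ss_e_AB / dfe
F = (ss_reg_AB - ss_reg_A) / mse_AB    # F(1, dfe) for the added predictor

# t for b_yB in the full model: b / se(b), with se from MSe * (X'X)^-1
Xc = np.column_stack([np.ones(n), AB])
cov_b = mse_AB * np.linalg.inv(Xc.T @ Xc)
t = b[-1] / np.sqrt(cov_b[-1, -1])
assert np.isclose(t**2, F)
```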

STATISTICAL TESTING – Sets of predictors R-square change: F-test for the increase in SS from a set B of p added predictors, in relation to MSerror for the complete model: F(p, dfe) = ((SSA+B – SSA) / p) / MSeAB [Venn diagram: region A and the added set B of p predictors in Y]
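Dividing the SS form through by SSy gives the equivalent R²-change form, F(p, dfe) = ((R²A+B – R²A)/p) / ((1 – R²A+B)/dfe). A numpy sketch of the set test (not from the slides; the sets are simulated):

```python
import numpy as np

def r2(X, y):
    """R^2 of an OLS fit with intercept."""
    Xc = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return 1 - np.sum((y - Xc @ b) ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(5)
n = 100
A = rng.normal(size=(n, 3))            # set A already in the model
B = rng.normal(size=(n, 2))            # set B of p = 2 added predictors
y = (A @ np.array([0.4, 0.3, 0.0])
     + B @ np.array([0.5, 0.2])
     + rng.normal(size=n))

p = B.shape[1]
k = A.shape[1] + p                     # total predictors in the full model
r2_A = r2(A, y)
r2_AB = r2(np.column_stack([A, B]), y)

dfe = n - k - 1
F = ((r2_AB - r2_A) / p) / ((1 - r2_AB) / dfe)   # F(p, dfe) for the set
assert F >= 0
```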

Experimentwise Error Rate Bonferroni inequality: ptotal <= p1 + p2 + p3 + … Allocate error differentially according to theory: –Predicted variables should get a liberal error rate for deletion (e.g., .05 to retain in the model) –Unpredicted additional variables should get a conservative error rate for entry (e.g., .01 to add to the model)
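The Bonferroni bound holds for any mix of per-test alphas, so unequal allocation across predicted and unpredicted variables is legitimate. A quick numerical check for independent tests (illustrative only; the alpha values follow the slide's example):

```python
# Unequal allocation: liberal .05 for predicted variables,
# conservative .01 for unpredicted additions
alphas = [0.05, 0.05, 0.01, 0.01]
bonferroni_bound = sum(alphas)         # 0.12

# Exact experimentwise rate if the tests were independent:
# 1 - product of (1 - alpha_i), which never exceeds the Bonferroni sum
exact = 1.0
for a in alphas:
    exact *= (1 - a)
exact = 1 - exact
assert exact <= bonferroni_bound
```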