Topics: Multiple Regression Analysis (MRA)


Topics: Multiple Regression Analysis (MRA)
Review of simple regression analysis
Multiple regression analysis
Design requirements
The multiple regression model
R2
Testing R2 and the b's
Comparing models
Comparing standardized regression coefficients

Multiple Regression Analysis (MRA)
A method for studying the relationship between a dependent variable and two or more independent variables.
Purposes:
Prediction
Explanation
Theory building

Design Requirements
One dependent variable (criterion)
Two or more independent variables (predictor variables)
Sample size: at least 50 cases (and at least 10 times as many cases as independent variables)

Assumptions
Independence: the scores of any particular subject are independent of the scores of all other subjects.
Normality: in the population, the scores on the dependent variable are normally distributed for each possible combination of levels of the X variables; each of the variables is normally distributed.
Homoscedasticity: in the population, the variances of the dependent variable for each possible combination of levels of the X variables are equal.
Linearity: in the population, the relation between the dependent variable and each independent variable is linear when all the other independent variables are held constant.

Simple vs. Multiple Regression
Simple regression: one dependent variable Y predicted from one independent variable X; one regression coefficient; r2 = proportion of variation in Y predictable from X.
Multiple regression: one dependent variable Y predicted from a set of independent variables (X1, X2, ..., Xk); one regression coefficient for each independent variable; R2 = proportion of variation in Y predictable by the set of independent variables (X's).

Example: Self-Concept and Academic Achievement (N = 103)
Shavelson, text example p. 530 (Table 18.1); Example 18.1, p. 538. A study of the relation between academic achievement (AA), grades, and general and academic self-concept.
General self-concept (GSC): one's perception of oneself. It is multifaceted: perceptions of behavior form the base, moving to inferences about the self in sub-areas (e.g., physical (appearance, ability), academic (math, history), social), then to inferences about the self in non-academic and academic areas, and finally to inferences about the self in general.
Academic self-concept (ASC): one's perception of oneself in academic areas.
The theory leads us to predict that AA and ASC will be more closely related than AA and GSC, but the relationship of grades to these other variables is less clear. Grades certainly reflect academic achievement, but also students' effort, improvement, and participation. So we hypothesize that the best predictors of grades might be AA and GSC. Once AA and GSC have been used to predict grades, academic self-concept is not expected to improve the prediction (i.e., ASC is not expected to account for any additional variation in grades). MRA allows us to statistically examine these predictions from self-concept theory.

Example: The Model
Y' = a + b1X1 + b2X2 + ... + bkXk
The b's are called partial regression coefficients.
Our example, predicting AA: Y' = 36.83 + (3.52)XASC + (-.44)XGSC
Predicted AA for a person with ASC of 6 and GSC of 4:
Y' = 36.83 + (3.52)(6) + (-.44)(4) = 56.19
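The slide's prediction can be reproduced with a short sketch. The coefficient values are those printed on the slide; because they are rounded to two decimals, the arithmetic gives about 56.19, so the second decimal can differ from a prediction computed with unrounded coefficients.

```python
# Sketch: prediction from the fitted model, using the slide's
# (rounded) coefficients: a = 36.83, b_ASC = 3.52, b_GSC = -0.44.
def predict_aa(asc, gsc, a=36.83, b_asc=3.52, b_gsc=-0.44):
    """Y' = a + b_ASC * X_ASC + b_GSC * X_GSC"""
    return a + b_asc * asc + b_gsc * gsc

y_hat = predict_aa(asc=6, gsc=4)
print(round(y_hat, 2))  # 56.19 with these rounded coefficients
```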

Multiple Correlation Coefficient (R) and Coefficient of Multiple Determination (R2) R = the magnitude of the relationship between the dependent variable and the best linear combination of the predictor variables R2 = the proportion of variation in Y accounted for by the set of independent variables (X’s).

Explaining Variation: How Much?
The total variation in Y (SSy) partitions into predicted (explained) variance (SS regression) and unpredicted (residual) variance (SS residual).
SSreg/SSy = proportion of variation in Y predictable from the X's; this is R2, the coefficient of determination.
SSres/SSy = proportion of variation in Y unpredictable from the X's (1 - R2).
In our example: R = .41 and R2 = .161, so 16% of the variability in academic achievement is accounted for by the weighted composite of the independent variables, general and academic self-concept. (In practice the formula for R2 is computed first; R is then taken as its square root.)
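The partition SSy = SSreg + SSres and the ratio R2 = SSreg/SSy can be sketched directly. The y and predicted-y values below are hypothetical illustration data, not from the textbook example.

```python
# Sketch: partitioning the variation in Y into explained and
# residual sums of squares, then forming R^2 = SS_reg / SS_y.
def r_squared(y, y_hat):
    mean_y = sum(y) / len(y)
    ss_total = sum((yi - mean_y) ** 2 for yi in y)            # SS_y
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # SS_residual
    ss_reg = ss_total - ss_res                                # SS_regression
    return ss_reg / ss_total

# Hypothetical observed and predicted scores:
r2 = r_squared(y=[1, 2, 3, 4], y_hat=[1.1, 1.9, 3.2, 3.8])
print(round(r2, 2))  # 0.98
```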

Proportion of Predictable and Unpredictable Variation
[Venn diagram: Y = AA, X1 = ASC, X2 = GSC]
R2 = predictable (explained) variation in Y
(1 - R2) = unpredictable (unexplained) variation in Y

Various Significance Tests
Testing R2: test R2 through an F test; test competing models (the difference between their R2s) through an F test of the difference between R2s.
Testing b: test each partial regression coefficient (b) with a t test; compare partial regression coefficients with each other via a t test of the difference between standardized partial regression coefficients (β's).

Example: Testing R2
What proportion of variation in AA can be predicted from GSC and ASC? Compute R2: R2 = .16 (R = .41); 16% of the variance in AA can be accounted for by the composite of GSC and ASC.
Is R2 significantly different from 0? F test: Fobserved = 9.52, Fcritical(.05; 2, 100) = 3.09. Reject H0: in the population there is a significant relationship between AA and the linear composite of GSC and ASC.
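The overall F statistic for R2 follows the standard formula F = (R2/k) / ((1 - R2)/(N - k - 1)), with k and N - k - 1 degrees of freedom. A sketch with the slide's numbers (R2 = .16, N = 103, k = 2 predictors):

```python
# Sketch: overall F test of R^2 against 0.
def f_for_r2(r2, n, k):
    """F = (R^2 / k) / ((1 - R^2) / (n - k - 1)), df = (k, n - k - 1)."""
    return (r2 / k) / ((1 - r2) / (n - k - 1))

f_obs = f_for_r2(r2=0.16, n=103, k=2)
print(round(f_obs, 2))  # 9.52, matching the slide's F_observed
```

Since 9.52 exceeds the critical value 3.09, H0 is rejected.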

Example: Comparing Models - Testing the Difference Between R2s
Model 1: Y' = 35.37 + (3.38)XASC
Model 2: Y' = 36.83 + (3.52)XASC + (-.44)XGSC
Compute R2 for each model: Model 1: R2 = r2 = .160; Model 2: R2 = .161.
Test the difference between the R2s: Fobserved = .119, Fcritical(.05; 1, 100) = 3.94. Conclude that GSC does not add significantly to ASC in predicting AA.
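The F test for the change in R2 between two nested models divides the R2 increase per added predictor by the full model's residual variance per degree of freedom. A sketch with the slide's numbers:

```python
# Sketch: F test of the difference between nested models' R^2 values.
def f_change(r2_full, r2_reduced, n, k_full, k_reduced):
    """df = (k_full - k_reduced, n - k_full - 1)."""
    num = (r2_full - r2_reduced) / (k_full - k_reduced)
    den = (1 - r2_full) / (n - k_full - 1)
    return num / den

f_obs = f_change(r2_full=0.161, r2_reduced=0.160,
                 n=103, k_full=2, k_reduced=1)
print(round(f_obs, 3))  # 0.119, matching the slide's F_observed
```

Since .119 is far below the critical value 3.94, adding GSC does not significantly improve the model.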

Testing the Significance of the b's
H0: β = 0
tobserved = (b - β) / (standard error of b), with N - k - 1 df

Example: t Test of b
tobserved = (-.44 - 0) / 14.24 = -.03
tcritical(.05, two-tailed, 100 df) = 1.97
Decision: cannot reject the null hypothesis.
Conclusion: the population β for GSC is not significantly different from 0.
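The t test above can be sketched in a few lines, using the slide's values (bGSC = -.44, standard error = 14.24, null value β = 0):

```python
# Sketch: t test of a single partial regression coefficient.
def t_for_b(b, se, beta_null=0.0):
    """t = (b - beta_0) / SE_b, evaluated with N - k - 1 df."""
    return (b - beta_null) / se

t_obs = t_for_b(b=-0.44, se=14.24)
print(round(t_obs, 2))  # -0.03, well inside the +/-1.97 critical bounds
```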

Comparing Partial Regression Coefficients
Which is the stronger predictor? Comparing bGSC and bASC directly is misleading because the b's depend on each variable's scale, so convert them to standardized partial regression coefficients (beta weights, β's): βGSC = -.038, βASC = .417. These are on the same scale, so they can be compared: ASC is a stronger predictor than GSC. Beta weights (β's) can also be tested for significance with t tests.
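Converting a b to a beta weight only requires the two standard deviations, via β_j = b_j · (s_Xj / s_Y). A minimal sketch; the standard deviation values below are hypothetical placeholders, not the textbook example's actual values:

```python
# Sketch: standardizing a partial regression coefficient.
# beta_j = b_j * (s_Xj / s_Y), putting all predictors on the same scale.
def beta_weight(b, s_x, s_y):
    return b * (s_x / s_y)

# Hypothetical standard deviations, for illustration only:
s_y = 10.0                                  # SD of the criterion (AA)
print(beta_weight(b=3.52, s_x=1.2, s_y=s_y))   # beta for ASC
print(beta_weight(b=-0.44, s_x=0.9, s_y=s_y))  # beta for GSC
```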

Different Ways of Building Regression Models
Simultaneous: all independent variables entered together.
Stepwise: independent variables entered according to some order, e.g., by size of correlation with the dependent variable, or in order of significance.
Hierarchical: independent variables entered in stages.

Practice: Grades reflect academic achievement, but also students' effort, improvement, participation, etc. We therefore hypothesize that the best predictors of grades might be academic achievement and general self-concept. Once AA and GSC have been used to predict grades, academic self-concept (ASC) is not expected to improve the prediction of grades (i.e., it is not expected to account for any additional variation in grades).