Chapter 11 REGRESSION

Multiple Regression
- Uses:
  - Explanation
  - Prediction

Multiple Regression
- Based on:
  - Correlations
  - Characteristics of a straight line

Regression vs. Multiple Regression
- One independent variable vs. more than one independent variable
- One dependent variable in either case

Multiple Regression TType of Data Required Independent variables CCategorical - can be coded for entry CContinuous - meet assumptions Dependent variable CContinuous - should be normally distributed

AAssumptions Sample representative of population Variables should have normal distribution Homoscedasticity Linear relationship between variables

Power analysis
- If the sample size equals the number of variables, R squared will equal 1.0 regardless of the data
- Plan on more than 10 subjects per independent variable; fewer than 10 subjects per independent variable leads to serious error
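A quick numerical illustration of the first point, as a minimal numpy sketch with arbitrary made-up numbers: once the number of estimated coefficients (the constant plus the predictors) reaches the sample size, the fit is perfect no matter what the data are.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4                                        # only 4 cases
y = rng.normal(size=n)                       # arbitrary dependent variable
X = np.column_stack([np.ones(n),             # constant term
                     rng.normal(size=(n, 3))])  # 3 predictors -> 4 coefficients

b, *_ = np.linalg.lstsq(X, y, rcond=None)    # least-squares fit
y_hat = X @ b

ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(r_squared)                             # ~1.0: a perfect but meaningless fit
```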

Relationship of correlation to regression
- Perfect correlation?
- No correlation?
- Imperfect correlation?

Regression Equation  Formula for a straight line  Predicted score = constant plus regression weight times score  Y’ = a + bX  Y’ = a + b1X1 + b2X2 = B3X3

Regression Equation  Predicted score Y’  Constant a value of Y when X = 0 point where regression line intercepts the Y axis  regression coefficient/s b or beta rate of change in Y with a unit change in X measure of slope of regression line

Regression Equation  Constant or a is based on means of variables involved  Regression coeffients, b or beta, based on correlation between two variables

Regression coefficients
- b-weights are based on raw scores
- Beta-weights are based on standardized (z) scores; in multiple regression they are standardized partial regression coefficients
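One way to see the distinction: the beta-weight is simply the b-weight after both variables are converted to standard (z) scores, so beta = b(sX/sY) in the two-variable case. A small numpy sketch, reusing the made-up data from above:

```python
import numpy as np

x = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
y = np.array([3.0, 5.0, 4.0, 8.0, 9.0])

b, a = np.polyfit(x, y, 1)                 # raw-score slope and constant

zx = (x - x.mean()) / x.std(ddof=1)        # standardize X
zy = (y - y.mean()) / y.std(ddof=1)        # standardize Y
beta, _ = np.polyfit(zx, zy, 1)            # slope on standardized scores

print(beta)                                 # beta-weight
print(b * x.std(ddof=1) / y.std(ddof=1))    # same value: beta = b * (sx / sy)
```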

Regression line
- Fitted by least squares: the “line of best fit”
- Deviations around this line sum to zero
- Deviations are the differences between the actual and predicted scores
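Both properties are easy to verify numerically. A minimal numpy sketch with made-up data: the residuals of a least-squares line sum to (essentially) zero, and their sum of squares is the quantity being minimized.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 9.9, 12.3])

b, a = np.polyfit(x, y, 1)        # least-squares slope and constant
y_hat = a + b * x                 # predicted scores
residuals = y - y_hat             # actual minus predicted

print(residuals.sum())            # ~0 (up to rounding error)
print((residuals ** 2).sum())     # the quantity least squares minimizes
```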

Computer Example
- What is the multiple correlation between a group of independent variables, entered in three blocks, and the dependent variable, total positive psychological attitudes?
- Block 1: age and education
- Block 2: smoking history and exercise
- Block 3: satisfaction with weight and health

SPSS - Multiple Regression
- Analyze > Regression > Linear
- Statistics: Confidence intervals, R squared change, Descriptives, Part and partial correlations
- Options: Exclude cases pairwise
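Outside SPSS, the block-wise entry in the computer example above can be reproduced by fitting the model once per block and looking at the change in R squared. A rough numpy sketch with hypothetical stand-in variables (the data here are simulated, not the example's actual data):

```python
import numpy as np

def r_squared(X, y):
    """R^2 for an OLS fit of y on X (X should include a constant column)."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(1)
n = 100
# hypothetical stand-ins for the six predictors in the example
age, educ, smoke, exer, satwt, health = rng.normal(size=(6, n))
y = 0.3 * educ + 0.5 * exer + 0.4 * health + rng.normal(size=n)  # simulated outcome

const = np.ones(n)
block1 = np.column_stack([const, age, educ])
block2 = np.column_stack([block1, smoke, exer])
block3 = np.column_stack([block2, satwt, health])

r2 = [r_squared(B, y) for B in (block1, block2, block3)]
print(r2)             # cumulative R^2 after each block
print(np.diff(r2))    # R^2 change contributed by blocks 2 and 3
```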

Dummy coding  Uses 1s and 0s  a = mean of dependent variable for group assigned 0s throughout  b - tests the difference between the group assigned 1 on the variable and the group assigned 0s throughout

Dummy coding  Vector 1 Republicans = 1 Democrats = 0 Independents = 0  Vector 2 Republicans = 0 Democrats = 1 Independents = 0

SPSS - creating dummy variables
- Transform > Compute: new variable with new value
- IF: create a conditional expression

Effect Coding  Uses 1s, 0s, and -1s  a = grand mean of dependent variable  b tests the difference between the mean of the group assigned 1 and the grand mean  the b-weights add up to zero

Effect Coding  Vector 1 Republicans 1 Democrats 0 Independents -1  Vector 2 Republicans 0 Democrats 1 Independents -1

MULTIPLE REGRESSION
- Selecting variables for the equation:
  - Standard (ENTER)
  - Hierarchical (setwise)
  - Stepwise: forward, backward, stepwise

Mediator and Moderator Variables

Mediator Variable
- A variable that lies “between” the independent and dependent variables, carrying part of the effect of one on the other
- Tested with multiple regression / path analysis

Moderator Variable
- Affects the strength or direction of the association between an independent and a dependent variable
- Tested as an interaction between the moderator and the independent variable, using hierarchical multiple regression
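In regression terms the moderation test adds the product of the (centered) independent variable and moderator in a second step and asks whether R squared increases. A rough numpy sketch with simulated data; in practice the significance test of the R squared change would come from SPSS or another statistics package:

```python
import numpy as np

def r_squared(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)                                # independent variable
m = rng.normal(size=n)                                # moderator
y = x + 0.5 * m + 0.8 * x * m + rng.normal(size=n)    # simulated data with a real interaction

# centering x and m before forming the product reduces collinearity
xc, mc = x - x.mean(), m - m.mean()
const = np.ones(n)
step1 = np.column_stack([const, xc, mc])              # main effects only
step2 = np.column_stack([step1, xc * mc])             # add the interaction term

print(r_squared(step2, y) - r_squared(step1, y))      # R^2 change due to moderation
```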

Example from the literature