Multiple Regression.

What is multiple regression? Predicting a score on Y based upon several predictors. Why is this important? Behavior is rarely a function of just one variable, but is instead influenced by many variables. So the idea is that we should be able to obtain a more accurate predicted score by using multiple variables to predict the outcome.

Simple Linear Regression For simple linear regression, the function is y = β0 + β1x + ε, where β0 is the intercept and β1 is the slope, the coefficient associated with the predictor variable X.

The Model Multiple Linear Regression refers to regression applications in which there are several independent variables, x1, x2, … , xp. A multiple linear regression model with p independent variables has the equation y = β0 + β1x1 + β2x2 + … + βpxp + ε, where ε is a random variable with mean 0 and variance σ2.

The prediction equation A prediction equation for this model fitted to data is ŷ = b0 + b1x1 + b2x2 + … + bpxp, where ŷ denotes the "predicted" value computed from the equation, and bi denotes an estimate of βi. These estimates are usually obtained by the method of least squares: among all possible values for the parameter estimates, we find the ones that minimize the sum of squared residuals.
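As a minimal sketch of the least squares method described above, NumPy's np.linalg.lstsq can fit the prediction equation directly; the data below is made up purely for illustration:

```python
import numpy as np

# Hypothetical data: n = 6 observations of p = 2 predictor variables.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 6.0],
              [6.0, 5.0]])
y = np.array([3.1, 3.9, 7.2, 8.1, 11.0, 11.8])

# Prepend a column of ones so the intercept b0 is estimated too.
X1 = np.column_stack([np.ones(len(y)), X])

# Least squares: choose b to minimize the sum of squared residuals.
b, *_ = np.linalg.lstsq(X1, y, rcond=None)

y_hat = X1 @ b          # predicted values from the fitted equation
residuals = y - y_hat
print(b)                # the estimates [b0, b1, b2]
```

A useful check of the least squares solution is that the residuals are orthogonal to every column of the design matrix (the "normal equations").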

A few concepts… In multiple regression, a dependent variable (the criterion variable) is predicted from several independent variables (predictor variables) simultaneously. Thus, we form a 'linear combination' of these variables to best predict an outcome, and then we assess the contribution that each predictor variable makes to the equation.

The correlation between the criterion variable (Y) and the set of predictor variables (Xs) is called Multiple R. When we speak of multiple regression we use R instead of r. So Multiple R squared is the proportion of variation in Y that can be accounted for by the combination of predictor variables.

Minitab, and other statistical software, will show a value of adjusted R squared. This is really a correction factor: values of R squared tend to be larger in samples than they are in populations, and the adjusted R squared is an attempt to correct for this. The correction is especially important when there are many variables and few participants, since that scenario gives an inflated value of R squared just by chance. The equation substitutes a common slope for the multiple variables; it is not meaningfully interpreted on its own but is used in the overall calculation.
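The usual form of the adjustment, for n observations and p predictors, is R²adj = 1 − (1 − R²)(n − 1)/(n − p − 1); a minimal sketch in Python (the sample values are made up):

```python
import numpy as np

def adjusted_r2(r2, n, p):
    """Adjusted R^2 for n observations and p predictor variables."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Example: R^2 = 0.90 from n = 20 observations with p = 5 predictors.
print(adjusted_r2(0.90, 20, 5))  # about 0.864, somewhat below 0.90
```

Note how the penalty grows as p approaches n, which is exactly the "many variables, few participants" scenario described above.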

Doing the calculations Computation of the estimates by hand is tedious; they are ordinarily obtained using a regression computer program. Standard errors are also usually part of the output from a regression program.

ANOVA An analysis of variance table for a multiple linear regression model with p independent variables fitted to a data set with n observations is:

Source of Variation    DF         SS     MS
Model                  p          SSR    MSR
Error                  n - p - 1  SSE    MSE
Total                  n - 1      SST

Sums of squares The sums of squares SSR, SSE, and SST have the same definitions in relation to the model as in simple linear regression:

SST = Σ(yi − ȳ)²   (total sum of squares)
SSR = Σ(ŷi − ȳ)²   (regression sum of squares)
SSE = Σ(yi − ŷi)²  (error sum of squares)

SST = SSR + SSE The value of SST does not change with the model. It depends only on the values of the dependent variable y. SSE decreases as variables are added to a model, and SSR increases by the same amount. This amount of increase in SSR is the amount of variation due to variables in the larger model that was not accounted for by variables in the smaller model.
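The decomposition SST = SSR + SSE can be verified numerically; a sketch using the same kind of made-up data as before (the identity holds for least squares fits that include an intercept):

```python
import numpy as np

# Hypothetical data, fitted by least squares with an intercept.
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0],
              [4.0, 3.0], [5.0, 6.0], [6.0, 5.0]])
y = np.array([3.1, 3.9, 7.2, 8.1, 11.0, 11.8])
X1 = np.column_stack([np.ones(len(y)), X])
b, *_ = np.linalg.lstsq(X1, y, rcond=None)
y_hat = X1 @ b

sst = np.sum((y - y.mean()) ** 2)      # total variation in y
ssr = np.sum((y_hat - y.mean()) ** 2)  # variation explained by the model
sse = np.sum((y - y_hat) ** 2)         # unexplained (residual) variation

print(np.isclose(sst, ssr + sse))      # True: the decomposition holds
```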

This increase in the regression sum of squares is sometimes denoted SSR(added variables | original variables), where "original variables" represents the list of independent variables that were in the model prior to adding new variables, and "added variables" represents the list of variables that were added to obtain the new model.

The overall SSR for the new model can be partitioned into the variation attributable to the original variables plus the variation due to the added variables that is not due to the original variables, SSR(all variables) = SSR(original variables) + SSR(added variables | original variables).

R2 Generally speaking, larger values of the coefficient of determination R2 = SSR/SST indicate a better-fitting model. The value of R2 can never decrease, and will typically increase, as variables are added to the model. However, this does not necessarily mean that the model has actually been improved: the increase in R2 can be a mathematical artifact rather than a meaningful indication of an improved model. Sometimes an adjusted R2 is used to overcome this shortcoming of the usual R2.
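This behavior is easy to demonstrate: in the sketch below (simulated data, made up for illustration), a pure-noise predictor x2 that has nothing to do with y still fails to lower R2 when it is added to the model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                 # pure noise, unrelated to y
y = 2.0 + 1.5 * x1 + rng.normal(scale=0.5, size=n)

def r2(X, y):
    """R^2 for a least squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ b
    return 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

r2_small = r2(x1.reshape(-1, 1), y)          # model with x1 only
r2_large = r2(np.column_stack([x1, x2]), y)  # model with x1 and x2
print(r2_large >= r2_small)                  # adding x2 cannot lower R^2
```

The small bump in R2 from x2 is exactly the "mathematical artifact" the slide warns about, and it is what the adjusted R2 penalizes.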