Regression Analysis.

Scatter plots Regression analysis requires interval- or ratio-level data. To see whether your data fit a regression model, it is wise to start with a scatter plot. The reason? Regression analysis assumes a linear relationship. If you have a curvilinear relationship or no relationship, regression analysis is of little use.

Types of Lines

Scatter plot This scatter plot shows a linear, positive relationship: as the percentage of the population with BAs increases, so does personal income per capita.

Regression Line The regression line is the best straight-line description of the plotted points, and you can use it to describe the association between the variables. If all the points fall exactly on the line, the error is 0 and you have a perfect relationship.

Things to remember Regression still focuses on association, not causation. Association is a necessary prerequisite for inferring causation, but in addition: The independent variable must precede the dependent variable in time. The two variables must be plausibly linked by a theory. Competing independent variables must be eliminated.

Regression Table The regression coefficient is not a good indicator of the strength of the relationship. Two scatter plots with very different dispersions could produce the same regression line.

Reading the tables When you run regression analysis in SPSS you get three tables. Each tells you something about the relationship. The first is the model summary. The R is the Pearson Product Moment Correlation Coefficient. In this case R is .736. R is the square root of R-Squared and is the correlation between the observed and predicted values of the dependent variable.

R-Square R-Square is the proportion of variance in the dependent variable (income per capita) which can be predicted from the independent variable (level of education).  This value indicates that 54.2% of the variance in income can be predicted from the variable education.  Note that this is an overall measure of the strength of association, and does not reflect the extent to which any particular independent variable is associated with the dependent variable.  R-Square is also called the coefficient of determination.

Adjusted R-square As predictors are added to the model, each predictor will explain some of the variance in the dependent variable simply due to chance.  One could continue to add predictors to the model which would continue to improve the ability of the predictors to explain the dependent variable, although some of this increase in R-square would be simply due to chance variation in that particular sample.  The adjusted R-square attempts to yield a more honest value to estimate the R-squared for the population.   The value of R-square was .542, while the value of Adjusted R-square was .532. There isn’t much difference because we are dealing with only one variable.  When the number of observations is small and the number of predictors is large, there will be a much greater difference between R-square and adjusted R-square. By contrast, when the number of observations is very large compared to the number of predictors, the value of R-square and adjusted R-square will be much closer.
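The adjustment has a standard formula: it shrinks R-square by a factor that grows with the number of predictors k relative to the sample size n. The slide reports R-square = .542 and adjusted R-square = .532 but not n; the n = 48 below is a hypothetical value chosen only to illustrate that the formula reproduces the reported figure:

```python
def adjusted_r_squared(r2: float, n: int, k: int) -> float:
    """Adjusted R-square: penalizes R-square for the k predictors in the model."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# n = 48 is a hypothetical sample size (not given on the slide).
print(round(adjusted_r_squared(0.542, n=48, k=1), 3))  # 0.532
```

The penalty term (n - 1)/(n - k - 1) shows both behaviors the slide describes: it balloons when k is large relative to n, and approaches 1 (no adjustment) when n is much larger than k.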

ANOVA The p-value associated with this F value is very small (0.0000). These values are used to answer the question "Do the independent variables reliably predict the dependent variable?".  The p-value is compared to your alpha level (typically 0.05) and, if smaller, you can conclude "Yes, the independent variables reliably predict the dependent variable".  If the p-value were greater than 0.05, you would say that the group of independent variables does not show a statistically significant relationship with the dependent variable, or that the group of independent variables does not reliably predict the dependent variable. 

Coefficients B - These are the values for the regression equation for predicting the dependent variable from the independent variable.  These are called unstandardized coefficients because they are measured in their natural units.  As such, the coefficients cannot be compared with one another to determine which one is more influential in the model, because they can be measured on different scales. 

Coefficients This chart looks at two variables and shows how their different measurement scales affect the B values. That is why you need to look at the standardized Beta to compare them.

Coefficients Beta - These are the standardized coefficients. These are the coefficients that you would obtain if you standardized all of the variables in the regression, including the dependent and all of the independent variables, and ran the regression. By standardizing the variables before running the regression, you have put all of the variables on the same scale, and you can compare the magnitude of the coefficients to see which one has more of an effect. You will also notice that the larger betas are associated with the larger t-values.

How to translate a typical table Regression Analysis: Level of Education by Income per Capita

Simple vs. Multiple Regression