Linear Regression

 Uses correlations
 Predicts the value of one variable from the value of another
 ***Computes UNKNOWN outcomes from present, known outcomes
 If we know the correlation between two variables and one value, we can predict the other value
 In other words, what value on Y would be predicted by a score on X?
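
A minimal sketch of this idea in Python (not from the slides; the paired scores and the new X value are made up). It uses the standard result that the best linear prediction of Y from X can be written from the correlation r as Y' = Ȳ + r(sY/sX)(X − X̄).

```python
import numpy as np

# Hypothetical paired scores: X = predictor, Y = outcome
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

r = np.corrcoef(x, y)[0, 1]                         # correlation between X and Y
slope = r * np.std(y, ddof=1) / np.std(x, ddof=1)   # slope implied by the correlation

# Predict Y for a new, known X value
x_new = 3.5
y_pred = y.mean() + slope * (x_new - x.mean())
print(f"r = {r:.3f}, predicted Y at X = {x_new}: {y_pred:.2f}")
```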

 You are examining a relationship between continuous variables
 You wish to predict scores on one variable from scores on the other

 Fit a line between the two variables that best captures the scores
◦ Minimal distance between each data point and the line
◦ Allows for the best guess at a score on the second variable given some data point on the first
◦ Error in prediction: the distance from each point to the regression line
◦ If the correlation were perfect, every data point would fall exactly on the line (a 45-degree line when both variables are standardized)
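
A brief sketch, on made-up data, of what "best-fitting line" means computationally: the least-squares line minimizes the summed squared vertical distances (the prediction errors) from the points to the line.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least-squares fit: minimizes the sum of squared vertical distances
b, a = np.polyfit(x, y, 1)        # slope, intercept
y_hat = b * x + a                 # predicted scores on the line

residuals = y - y_hat             # error in prediction for each data point
print("slope:", round(b, 3), "intercept:", round(a, 3))
print("sum of squared errors:", round(np.sum(residuals**2), 3))
```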

Y’ = bX + a
Y’ = predicted score of Y based on X
b = slope of the line
a = point where the line crosses the Y-axis
X = score used as the predictor

 b
◦ The value of b is the slope
◦ From this we can tell how much the Y variable will change when X increases by 1 point
 a
◦ The Y-intercept
◦ This tells us what Y would be if X = 0
◦ This is where the line crosses the Y axis
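
A tiny illustration of these two interpretations, using hypothetical values of b and a: increasing X by 1 changes the prediction by exactly b, and the prediction at X = 0 equals a.

```python
b, a = 3.54, 10.0                         # hypothetical slope and intercept

def predict(x):
    return b * x + a

print(round(predict(6) - predict(5), 2))  # 3.54 -> change in Y' per 1-unit increase in X
print(predict(0))                         # 10.0 -> value of Y' when X = 0 (the intercept)
```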

b = [ΣXY - (ΣX)(ΣY)/n] / [ΣX² - (ΣX)²/n]

a = (ΣY - bΣX) / n
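
A short sketch applying the two computational formulas above to made-up data, cross-checked against numpy's least-squares fit; the two should agree up to floating-point rounding.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(x)

# Computational formulas from the slides
b = (np.sum(x * y) - np.sum(x) * np.sum(y) / n) / (np.sum(x**2) - np.sum(x)**2 / n)
a = (np.sum(y) - b * np.sum(x)) / n

# Cross-check with numpy's least-squares fit
b_np, a_np = np.polyfit(x, y, 1)
print(b, a)          # same values as b_np, a_np (up to rounding)
print(b_np, a_np)
```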

 Can examine how closely the actual Y values approximate the predicted Y values
 Summarizing these prediction errors across all data points gives the standard error of the estimate
◦ Estimates the imprecision of the line
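
A sketch of the standard error of the estimate on hypothetical data. A common definitional form divides the summed squared prediction errors by n - 2 before taking the square root; the slides describe it more loosely as an average of the prediction errors.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(x)

b, a = np.polyfit(x, y, 1)
y_hat = b * x + a

# Standard error of the estimate: typical size of a prediction error
see = np.sqrt(np.sum((y - y_hat) ** 2) / (n - 2))
print(f"standard error of the estimate: {see:.3f}")
```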

 1. State hypotheses
◦ Null hypothesis: no relationship between years of education and income
 H₀: β = 0
◦ Research hypothesis: years of education predicts income
 H₁: β ≠ 0

 We’ll use SPSS output to test whether X significantly predicts changes in Y
 The analysis partitions variance into variance accounted for by the predictors
◦ And variance unaccounted for by the predictors (the residual)
◦ The output will include a significance test of whether the variance accounted for significantly differs from zero (an F-statistic)
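
A sketch of this variance partitioning on hypothetical data with one predictor, computing the regression and residual sums of squares and the F-statistic that the SPSS ANOVA table would report (scipy used here as a stand-in; the data are invented).

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # e.g., years of education (hypothetical)
y = np.array([2.0, 3.5, 5.1, 6.8, 8.2, 9.9])   # e.g., income in $10,000s (hypothetical)
n = len(x)

b, a = np.polyfit(x, y, 1)
y_hat = b * x + a

ss_regression = np.sum((y_hat - y.mean()) ** 2)   # variance accounted for by the predictor
ss_residual = np.sum((y - y_hat) ** 2)            # variance unaccounted for (residual)
df_reg, df_res = 1, n - 2

F = (ss_regression / df_reg) / (ss_residual / df_res)
p = stats.f.sf(F, df_reg, df_res)                 # significance of the F-statistic
print(f"F({df_reg}, {df_res}) = {F:.2f}, p = {p:.4f}")
```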

 5. Use SPSS output: the ANOVA table
[SPSS ANOVA output — columns: Model, Sum of Squares, df, Mean Square, F, Sig.; rows: Regression, Residual, Total]
a. Predictors: (Constant), yrsed
b. Dependent Variable: income

 5. Use SPSS output for the standardized beta and the test statistic
[SPSS Coefficients table — columns: Model, Unstandardized Coefficients (B, Std. Error), Standardized Coefficients (Beta), t, Sig.; rows: (Constant), yrsed]
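
A sketch of how the same two tables (ANOVA and Coefficients) could be reproduced outside SPSS, here with the statsmodels package on made-up yrsed/income data; the standardized beta is obtained by refitting on z-scores. The variable values are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

yrsed = np.array([10, 12, 12, 14, 16, 16, 18, 20], dtype=float)   # hypothetical
income = np.array([2.5, 3.0, 3.4, 4.1, 5.0, 5.3, 6.2, 7.1])       # hypothetical, in $10,000s

X = sm.add_constant(yrsed)
model = sm.OLS(income, X).fit()

print(model.fvalue, model.f_pvalue)          # ANOVA table: F and Sig.
print(model.params, model.bse)               # Coefficients: B and Std. Error
print(model.tvalues, model.pvalues)          # t and Sig.

# Standardized beta: refit with both variables converted to z-scores
z = lambda v: (v - v.mean()) / v.std(ddof=1)
beta = sm.OLS(z(income), sm.add_constant(z(yrsed))).fit().params[1]
print("standardized beta:", round(beta, 3))
```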

 6. The output indicates that b = 3.54 and β = .95, with p < .05 (actually p < .01)
◦ So the test statistic does exceed the critical value
 7. If it exceeds the critical value, reject the null
 and conclude that years of education significantly predicts income

 In results
◦ Years of education significantly predicted income, b = 3.54, t = 6.11, p < .05, such that more years of education predicted greater income.
◦ Could further say that: for every additional year of education, participants made an additional $35,400 per year (income was measured in units of $10,000, so 3.54 × $10,000).

 Predict an outcome Y-value with multiple predictor X-values
 **This is the real advantage over a correlation coefficient
 Determine whether each predictor makes a unique improvement to the prediction of Y

[SPSS Coefficients table for two models — columns: Model, Unstandardized Coefficients (B, Std. Error), Standardized Coefficients (Beta), t, Sig.; Model 1 predictors: (Constant), yrsed; Model 2 predictors: (Constant), yrsed, pincome]
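
A sketch of the two-model comparison implied by the coefficient table above, on simulated data: Model 1 predicts income from yrsed alone, Model 2 adds a second predictor (here assumed to be pincome, e.g. parental income), and the change in R² together with the t-test on the new coefficient indicates whether the added predictor makes a unique contribution. All data and effect sizes below are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50
yrsed = rng.normal(14, 2, n)                      # hypothetical years of education
pincome = rng.normal(5, 1.5, n)                   # hypothetical parental income ($10,000s)
income = 0.3 * yrsed + 0.4 * pincome + rng.normal(0, 1, n)

# Model 1: yrsed only
m1 = sm.OLS(income, sm.add_constant(yrsed)).fit()

# Model 2: yrsed + pincome
X2 = sm.add_constant(np.column_stack([yrsed, pincome]))
m2 = sm.OLS(income, X2).fit()

print("R2, Model 1:", round(m1.rsquared, 3))
print("R2, Model 2:", round(m2.rsquared, 3))
print("t and p for pincome in Model 2:",
      round(m2.tvalues[2], 2), round(m2.pvalues[2], 4))
```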