Prediction/Regression

Prediction/Regression: Chapter 12, Part 2 (Nov. 18, 2014)

Simple Regression Review
Regression allows us to predict y from x (the predictor). We build a regression equation, or line, from one sample, then use it to predict y scores for a second sample.
Ŷ = .688 + .92(x)
Interpreting a (the intercept): y is predicted to equal .688 when x = 0.
Interpreting b (the slope): each 1-unit increase in x predicts a .92 increase in y.
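As a minimal sketch, the slide's equation can be written as a one-line function (using the slide's fitted coefficients a = .688 and b = .92, not re-estimated here):

```python
# Regression equation from the slide: y-hat = a + b*x
a, b = 0.688, 0.92

def predict(x):
    """Predicted y (criterion) for a given x (predictor)."""
    return a + b * x

print(predict(0))                        # the intercept: predicted y when x = 0
print(round(predict(1) - predict(0), 2)) # the slope: change in y per 1-unit change in x
```

Calling `predict(0)` recovers the intercept interpretation, and the difference `predict(1) - predict(0)` recovers the slope interpretation directly.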

Drawing the Regression Line
1. Draw and label the axes for a scatter diagram.
2. Figure the predicted value on the criterion for a low value of the predictor. You can choose any value to plug in, e.g., x = 1, so Ŷ = .688 + .92(1) = 1.61.
3. Repeat step 2 with a high value of the predictor, e.g., x = 6, so Ŷ = .688 + .92(6) = 6.21.
4. Draw a line passing through the two marks, (1, 1.61) and (6, 6.21).
Hint: the regression line always passes through the means of x and y, so you can use (Mx, My) as one of your two points to save time.
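A small sketch of steps 2-4, using the slide's equation; it also verifies the hint, since plugging the mean of x (3.6, from the slide's data) into the equation returns the mean of y (4.0):

```python
# Two points on the regression line y-hat = .688 + .92x (slide's equation)
def predict(x):
    return 0.688 + 0.92 * x

low, high = 1, 6                 # one low and one high predictor value
p1 = (low, predict(low))         # (1, 1.608), i.e. about (1, 1.61)
p2 = (high, predict(high))       # (6, 6.208), i.e. about (6, 6.21)

# Hint check: the regression line passes through (Mx, My)
mx = 3.6                         # mean of x from the slide's data
print(predict(mx))               # essentially My = 4.0
```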

Regression Error
Now that you have a regression line (equation) built from one sample, you can compute predicted y scores for a new sample of x scores (Sample 2). Suppose you later collect data on Sample 2's actual y scores. You can then compare the accuracy of the predicted ŷ scores to the actual y scores for Sample 2. Sometimes you'll overestimate, sometimes underestimate: this is ERROR. Can we get a measure of error? How much is acceptable?

Error in Regression
Error (the residual) is the actual score minus the predicted score: y - ŷ.
Squared error using the prediction (regression) model: SSerror = Σ(y - ŷ)²
Compare this to the amount of error without the prediction model. With no other model, the best guess for everyone is the mean, so the total squared error when predicting from the mean is SStotal = Σ(y - My)².

Error and Proportionate Reduction in Error
The formula for proportionate reduction in error, PRE = (SStotal - SSerror) / SStotal, compares the regression model to the mean baseline (predicting that everyone's y score will be at the mean). We want the regression model to do much better than the mean baseline; that would indicate fewer prediction errors. So you want PRE to be large.

Find PRE
The regression model was ŷ = .688 + .92(x).
1. Use the mean model to find error: compute (y - My)² for each person and sum that column → SStotal.
2. Find predictions using the regression model: plug the x values into the equation to get ŷ.
3. Compute (y - ŷ)² for each person and sum that column → SSerror.
4. Find PRE.
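The four steps above can be sketched end to end on the slide's five data points (X = 6, 1, 5, 3, 3; Y = 6, 2, 6, 4, 2). Note the slides round each squared residual before summing, giving SSerror = 3.09 and PRE = .807; without intermediate rounding the values come out as about 3.11 and .806:

```python
# PRE on the slide's data, using the slide's equation y-hat = .688 + .92x
xs = [6, 1, 5, 3, 3]
ys = [6, 2, 6, 4, 2]

my = sum(ys) / len(ys)                    # mean model prediction (4.0)
preds = [0.688 + 0.92 * x for x in xs]    # regression model predictions

ss_total = sum((y - my) ** 2 for y in ys)                 # mean model error -> 16.0
ss_error = sum((y - p) ** 2 for y, p in zip(ys, preds))   # regression model error

pre = (ss_total - ss_error) / ss_total
print(ss_total, round(ss_error, 2), round(pre, 3))
```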

Proportionate Reduction in Error = r²
If our regression model is no better than the mean, SSerror = SStotal, so PRE = 0 / SStotal = 0: using this regression model, we reduce error over the mean model by 0%, which means no useful prediction. If the regression model has 0 error (perfect prediction), PRE = (SStotal - 0) / SStotal = 1, a 100% reduction in error. The proportionate reduction in error equals r², also known as "the proportion of variance in y accounted for by x," and it ranges from 0% to 100%.
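As a check on the PRE = r² identity, this sketch computes Pearson's r directly from the slide's data via deviation scores and squares it; the result matches the PRE computed from the sums of squares (any tiny discrepancy comes from the slides' rounded regression coefficients):

```python
import math

xs = [6, 1, 5, 3, 3]
ys = [6, 2, 6, 4, 2]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)

# Pearson's r from deviation scores
sp  = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # sum of cross-products
ssx = sum((x - mx) ** 2 for x in xs)
ssy = sum((y - my) ** 2 for y in ys)
r = sp / math.sqrt(ssx * ssy)

print(round(r, 3), round(r ** 2, 3))  # r ~ .898, r^2 ~ .806: same as PRE
```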

Computing Error
Sum of the squared residuals = SSerror

   X     Y     ŷ = .688 + .92(x)
   6     6     .688 + .92(6) = 6.2
   1     2     .688 + .92(1) = 1.6
   5     6     .688 + .92(5) = 5.3
   3     4     .688 + .92(3) = 3.45
   3     2     .688 + .92(3) = 3.45
mean:  3.6   4.0

Computing SSerror
Sum of the squared residuals = SSerror

   X     Y     ŷ       y - ŷ     (y - ŷ)²
   6     6     6.2     -0.20      0.04
   1     2     1.6      0.40      0.16
   5     6     5.3      0.70      0.49
   3     4     3.45     0.55      0.30
   3     2     3.45    -1.45      2.10
mean:  3.6   4.0     Σ = 0.00   Σ = 3.09 (SSerror)

Computing SStotal = Σ(y - My)²

   X     Y     My     y - My    (y - My)²
   6     6     4       2         4
   1     2     4      -2         4
   5     6     4       2         4
   3     4     4       0         0
   3     2     4      -2         4
mean:  3.6   4.0     Σ = 0     Σ = 16 (SStotal)

PRE
PRE = (16 - 3.09) / 16 = .807
We have an 80.7% proportionate reduction in error from using our regression model as opposed to the mean baseline model. So we're doing much better using the regression than we would by simply predicting the mean for everyone.