Chapter 15 Linear Regression

Chapter 15 Linear Regression. The correlation coefficient tells us the degree to which two variables are related; we can also use it to predict the value of one variable from the value of another, including for hypothetical cases. Example: p. 247 (student GPA).

Logic of Prediction. To predict one variable from another, we need to know their correlation (rxy); look at Fig. 15.1. We then create a regression equation in order to compute a regression line, which gives us our best guess as to what score on the Y variable (college GPA) would be predicted by a score on the X variable (high school GPA). The regression line is the line that minimizes the (squared) vertical distance between the line and each of the data points. Question: on this scatterplot, what is the value of Y given an X of 3? (Fig. 15.3)
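The "best guess" idea can be checked directly in a few lines of Python. The GPA values below are hypothetical stand-ins (not the book's p. 247 data), and the function names are mine; the point is only that nudging the fitted slope or intercept can never lower the sum of squared errors, which is what "the regression line minimizes the distance" means.

```python
# A minimal sketch with hypothetical data: the least-squares line makes the
# sum of squared vertical distances smaller than any other line we try.

def fit_line(xs, ys):
    """Return slope b and intercept a of the least-squares line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return b, a

def sse(xs, ys, b, a):
    """Sum of squared errors between each actual Y and the line's prediction."""
    return sum((y - (b * x + a)) ** 2 for x, y in zip(xs, ys))

# Hypothetical high school (X) and college (Y) GPAs
hs  = [2.0, 2.8, 3.0, 3.4, 3.9]
col = [2.1, 2.7, 3.1, 3.2, 3.6]

b, a = fit_line(hs, col)
best = sse(hs, col, b, a)

# Any tweak to the fitted line only increases the squared error
assert best <= sse(hs, col, b + 0.1, a)
assert best <= sse(hs, col, b, a - 0.1)
```

A larger error in prediction for a point simply means that point sits farther from the fitted line.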

The distance between each data point and the regression line is the error in prediction. The larger the errors, the lower the correlation (see Figure 15.4). So, what would a correlation of 1 (a perfect correlation) look like?

Computing the regression coefficient (equation). Remember, Y is our dependent variable and X is our independent variable (the variable of interest, or main IV).

Y′ = bX + a
Y′ is the predicted value of Y based on the value of X
b is the slope, or direction (sign), of the line
a is the intercept, the point at which the line crosses the Y-axis

Y′ = bX + a
Y′ is the predicted value of Y based on the value of X
b is the slope, or direction (sign), of the line
a is the intercept, the point at which the line crosses the Y-axis

To compute a and b, we also need these sums: ΣX, ΣY, ΣX², ΣY², ΣXY
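A minimal sketch of how those sums yield b and a, using the standard raw-score formulas (ΣY² is not needed until the standard error of the estimate later on). The GPA lists are hypothetical, not the book's table, and the function name is mine:

```python
# Hedged sketch: slope and intercept from the raw-score sums.
# b = (nΣXY - ΣXΣY) / (nΣX² - (ΣX)²),  a = (ΣY - bΣX) / n

def regression_from_sums(n, sum_x, sum_y, sum_x2, sum_xy):
    """Return slope b and intercept a computed from the listed sums."""
    b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    a = (sum_y - b * sum_x) / n
    return b, a

xs = [2.0, 2.8, 3.0, 3.4, 3.9]   # hypothetical high school GPAs (X)
ys = [2.1, 2.7, 3.1, 3.2, 3.6]   # hypothetical college GPAs (Y)

n = len(xs)
b, a = regression_from_sums(
    n,
    sum(xs),                              # ΣX
    sum(ys),                              # ΣY
    sum(x * x for x in xs),               # ΣX²
    sum(x * y for x, y in zip(xs, ys)),   # ΣXY
)
print(f"Y' = {b:.3f}X + {a:.3f}")
```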

Do the example on page 251. What is the regression equation (line)? Using this equation, we can predict what Y will be given a value for X. For example, with our equation Y′ = .704X + .719, we can predict what the college GPA would be for a student with a high school GPA of 2.8. How? Plug 2.8 in for X. Answer = 2.69.

How good is the fit (prediction)? We calculate the error of the estimate by comparing the predicted value (Y′) with the actual value (Y) for an observed X. Given our equation, we would predict a college GPA of 2.69 for a high school GPA (X) of 2.8, but we know from the data set that the actual Y value for the person with this high school GPA is 3.5. The difference (3.5 − 2.69) = .81 is the error of the estimate. Combining these errors across all X values (the differences between each Y′ and Y) gives the standard error of the estimate.

IV. Significance testing. Just use the computer. We need the t-score, but it is tedious to compute by hand; SPSS reports it automatically along with the significance level.
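The prediction and its error can be checked directly with the fitted equation from the text (Y′ = .704X + .719). The standard-error function below follows the common n − 2 convention, and the small data set fed into it is hypothetical, not the book's page-251 table:

```python
import math

# Predict with the book's fitted equation Y' = .704X + .719,
# then sketch the standard error of the estimate.

def predict(x, b=0.704, a=0.719):
    """Predicted college GPA (Y') for a high school GPA x."""
    return b * x + a

# One prediction: a 2.8 high school GPA
y_hat = predict(2.8)
print(round(y_hat, 2))        # 2.69, as in the text
print(round(3.5 - y_hat, 2))  # 0.81 error for the student who earned 3.5

def std_error_of_estimate(xs, ys):
    """Square root of the averaged squared errors (n - 2 convention)."""
    n = len(xs)
    squared_errors = sum((y - predict(x)) ** 2 for x, y in zip(xs, ys))
    return math.sqrt(squared_errors / (n - 2))

# Hypothetical (X, Y) pairs, just to exercise the function
gpas_hs  = [2.0, 2.8, 3.0, 3.4]
gpas_col = [2.1, 3.5, 2.9, 3.2]
print(round(std_error_of_estimate(gpas_hs, gpas_col), 2))
```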

Regression and SPSS and the regression output table (p. 253). Interpretation of the coefficient: "For every one-unit (hour of training) increase in X, there is a −.125 unit change in Y (# of injuries)," i.e., a reduction in the number of injuries. Interpretation of the significance level: Is this a significant finding? Yes; the significance level is .011. That is, there is only a 1.1% chance that we would commit a Type I error (conclude there is a relationship when the null hypothesis is actually true). Interpretation of the Adjusted R² value: it tells us the percentage of the variance in Y (# of injuries) that is explained by the variance in X (hours of training). In this case, 18% of the variation in injuries is explained by the number of hours spent training. Creating a graph with the regression line: pp. 254-255.
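For readers without SPSS, the quantities behind that output table can be sketched by hand: the slope, R², and the t statistic for the slope. The training-hours and injuries pairs below are made up for illustration and will not reproduce the book's −.125 coefficient or its .011 significance level; they only show the direction of the relationship.

```python
import math

# Hedged sketch (hypothetical data): slope, R², and the slope's t statistic,
# the pieces SPSS reports in its regression output table.

hours    = [2, 4, 6, 8, 10, 12, 14, 16]   # X: hours of training
injuries = [9, 8, 8, 6, 6, 5, 4, 4]       # Y: number of injuries

n = len(hours)
mx, my = sum(hours) / n, sum(injuries) / n
sxx = sum((x - mx) ** 2 for x in hours)
sxy = sum((x - mx) * (y - my) for x, y in zip(hours, injuries))
syy = sum((y - my) ** 2 for y in injuries)

b = sxy / sxx                          # slope: unit change in Y per unit X
r = sxy / math.sqrt(sxx * syy)         # Pearson correlation
r2 = r ** 2                            # share of variance in Y explained by X
t = r * math.sqrt((n - 2) / (1 - r2))  # t statistic for testing the slope

print(f"slope = {b:.3f}, R² = {r2:.3f}, t = {t:.2f}")
```

A negative slope here means each extra hour of training is associated with fewer injuries, which is the same reading the text gives for its −.125 coefficient.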