Relationship with one independent variable

Simple Regression: Relationship with One Independent Variable

Lecture Objectives You should be able to interpret regression output. Specifically: interpret the significance of the relationship (Significance F); use the parameter estimates to write and apply the model; and compute and interpret R-square and the Standard Error from the ANOVA table.

Basic Equation The model relates the dependent variable (y) to the independent variable (x): ŷ = b0 + b1x, where b0 is the y intercept and b1 = Δy/Δx is the slope; the error term ε captures deviations from the line. The straight line represents the linear relationship between y and x.
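The slope and intercept can be estimated from data by ordinary least squares. A minimal sketch in Python, using illustrative data rather than the slides' shoe-size table:

```python
# Ordinary least squares for one independent variable:
#   b1 = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
#   b0 = y_bar - b1 * x_bar
x = [1.0, 2.0, 3.0, 4.0]   # illustrative data
y = [2.0, 4.0, 6.0, 8.0]   # here y = 2x exactly

x_bar = sum(x) / len(x)
y_bar = sum(y) / len(y)

b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar
# For this data the fit recovers b1 = 2.0 and b0 = 0.0
```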

Understanding the equation [Slide shows a plotted straight line; the audience is asked: what is the equation of this line?]

Total Variation: Sum of Squares Total (SST) What if there were no information on X (and hence no regression)? There would only be the y axis. The best forecast for Y would then simply be the mean of Y, and the total error in the forecasts would be the total variation from the mean. [Figure: y values plotted against x, with a horizontal line at the mean of Y; the vertical distance from each point to that line is its variation from the mean (total variation).]

Sum of Squares Total (SST) Computation [Table: shoe sizes for 13 children, ages 11 to 19. For each child the table lists age (X), shoe size (Y), the deviation of Y from its mean, and the squared deviation; e.g., obs 1 (age 11, size 5.0) has deviation −2.7692 and squared deviation 7.6686. Mean shoe size = 7.769; the deviations sum to 0; the squared deviations sum to 48.8077, the SST.] In computing SST, the variable X is irrelevant. This computation gives the total squared deviation of y from its mean.
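The SST computation can be sketched as follows. The shoe sizes below are illustrative stand-ins, since the transcript does not preserve every row of the slide's table:

```python
# SST = total squared deviation of y from its mean; x plays no role here.
y = [5.0, 6.0, 7.5, 8.5, 8.0, 10.0, 7.0, 11.0]  # illustrative shoe sizes

mean_y = sum(y) / len(y)                         # best forecast with no x
sst = sum((yi - mean_y) ** 2 for yi in y)        # total variation from the mean
```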

Error after Regression [Figure: y plotted against x with both the regression line and the horizontal mean-of-Y line; each point's total variation from the mean splits into a part explained by the regression and a residual (unexplained) error.] Information about x gives us the regression model, which does a better job of predicting y than simply the mean of y. Thus some of the total variation in y is explained away by x, leaving some unexplained residual error.

Computing SSE [Table: for each of the 13 children, the predicted shoe size from the regression equation, the residual (error) y − ŷ, and the squared residual; e.g., obs 1 (age 11, size 5.0) has prediction 5.5565 and residual −0.5565. The residuals sum to 0; the squared residuals sum to 17.6880, the Sum of Squares Error (SSE).] Prediction equation: ŷ = b0 + b1x with intercept b0 = −1.17593 and slope b1 = 0.612037.
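A sketch of the SSE computation using the slide's fitted coefficients. Only the five (age, shoe size) pairs that survive intact in the transcript are used, so the sum here is a partial SSE, not the slide's full 17.6880:

```python
# Residuals and SSE with the slide's fitted line: y_hat = b0 + b1 * x
b0, b1 = -1.17593, 0.612037
data = [(11, 5.0), (12, 6.0), (13, 7.5), (14, 8.0), (15, 10.0)]

sse = 0.0
for x, y in data:
    y_hat = b0 + b1 * x          # predicted shoe size for this age
    sse += (y - y_hat) ** 2      # accumulate squared residual
```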

The Regression Sum of Squares Some of the total variation in y is explained by the regression, while the residual is the error in prediction that remains even after regression. Sum of squares total = sum of squares explained by regression + sum of squares of error still left after regression: SST = SSR + SSE, or SSR = SST − SSE.

R-square The proportion of variation in y that is explained by the regression model is called R2: R2 = SSR/SST = (SST − SSE)/SST. For the shoe size example, R2 = (48.8077 − 17.6880)/48.8077 = 0.6376. R2 ranges from 0 to 1, with 1 indicating a perfect linear relationship between x and y.
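Using the slide's SST and SSE, R² follows directly:

```python
sst, sse = 48.8077, 17.6880   # totals from the slides
ssr = sst - sse               # variation explained by the regression
r2 = ssr / sst                # proportion of variation explained
# r2 is about 0.6376, matching the slide
```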

Mean Squared Error MSR = SSR/dfregression and MSE = SSE/dferror, where df is the degrees of freedom. For the regression, df = k, the number of independent variables; for the error, df = n − k − 1. The degrees of freedom for error reflect the number of observations in the sample free to contribute to the overall error.
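With k = 1 independent variable and n = 13 observations, the mean squares and the F statistic reported on the slides follow from these formulas:

```python
sst, sse = 48.8077, 17.6880   # totals from the slides
k, n = 1, 13                  # one predictor, 13 children

ssr = sst - sse
msr = ssr / k                 # regression mean square (df = k)
mse = sse / (n - k - 1)       # error mean square (df = n - k - 1)
f_stat = msr / mse            # F statistic for the overall model test
# msr is about 31.1197, mse about 1.6080, f_stat about 19.35
```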

Standard Error Standard Error (SE) = √MSE. The standard error measures how well the model can predict y, and it can be used to construct a confidence interval for a prediction.
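Taking the square root of the MSE from the ANOVA table reproduces the standard error in the summary output:

```python
import math

mse = 1.6080             # error mean square from the ANOVA table
se = math.sqrt(mse)      # standard error of the estimate
# se is about 1.268, matching the summary output
```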

Regression Statistics Summary Output & ANOVA

SUMMARY OUTPUT

Regression Statistics
  Multiple R          0.798498
  R Square            0.637599   (= SSR/SST = 31.1/48.8)
  Adjusted R Square   0.604653
  Standard Error      1.268068   (= √MSE = √1.608)
  Observations        13

ANOVA
  Source            df            SS        MS        F        Significance F
  Regression        1  (k)        31.1197   31.1197   19.3531  0.0011
  Residual (Error)  11 (n-k-1)    17.6880   1.6080
  Total             12 (n-1)      48.8077

Significance F is the p-value for the regression; F = MSR/MSE = 31.1/1.6.

The Hypothesis for Regression H0: All the βs are 0. Ha: At least one of the βs is not 0. If all βs are 0, then y is not related to any of the x variables. Thus the alternative we try to establish is that there is in fact a relationship. The Significance F is the p-value for this test.