Linear Regression and Correlation Analysis

Scatter Plots and Correlation
A scatter plot (or scatter diagram) is used to show the relationship between two variables.
Correlation analysis is used to measure the strength of the association (linear relationship) between two variables:
- It is concerned only with the strength of the relationship.
- No causal effect is implied.

Scatter Plot Examples (plots omitted):
- Linear relationships and curvilinear relationships
- Strong relationships and weak relationships
- No relationship

Correlation Coefficient
The population correlation coefficient ρ (rho) measures the strength of the association between the variables.
The sample correlation coefficient r is an estimate of ρ and is used to measure the strength of the linear relationship in the sample observations.

Features of ρ and r
- Unit free
- Range between -1 and 1
- The closer to -1, the stronger the negative linear relationship
- The closer to +1, the stronger the positive linear relationship
- The closer to 0, the weaker the linear relationship

Examples of Approximate r Values (plots omitted): r = -1, r = -0.6, r = 0, r = +0.3, r = +1

Calculating the Correlation Coefficient
Sample correlation coefficient:
r = Σ(x - x̄)(y - ȳ) / √[ Σ(x - x̄)² Σ(y - ȳ)² ]
or the algebraic equivalent:
r = [ nΣxy - ΣxΣy ] / √{ [nΣx² - (Σx)²] [nΣy² - (Σy)²] }
where:
r = sample correlation coefficient
n = sample size
x = value of the independent variable
y = value of the dependent variable
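Illustration (not part of the original slides): a minimal Python sketch of the algebraic form of the formula above; the helper name pearson_r is my own.

```python
import math

def pearson_r(x, y):
    """Sample correlation coefficient via the algebraic (computational) formula."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)
    sum_y2 = sum(yi ** 2 for yi in y)
    numerator = n * sum_xy - sum_x * sum_y
    denominator = math.sqrt(n * sum_x2 - sum_x ** 2) * math.sqrt(n * sum_y2 - sum_y ** 2)
    return numerator / denominator
```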

Calculation Example
Tree Height (y)   Trunk Diameter (x)   xy     y²      x²
35                8                    280    1225    64
49                9                    441    2401    81
27                7                    189    729     49
33                6                    198    1089    36
60                13                   780    3600    169
21                7                    147    441     49
45                11                   495    2025    121
51                12                   612    2601    144
Σy = 321          Σx = 73              Σxy = 3142    Σy² = 14111    Σx² = 713

Calculation Example (scatter plot of Tree Height, y, against Trunk Diameter, x, omitted)
r = 0.886 → relatively strong positive linear association between x and y
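As a check (my addition), the algebraic formula applied to the column totals from the table above reproduces r = 0.886:

```python
import math

# Column totals from the tree height / trunk diameter table above
n = 8
sum_y, sum_x = 321, 73
sum_xy, sum_y2, sum_x2 = 3142, 14111, 713

numerator = n * sum_xy - sum_x * sum_y   # 8*3142 - 73*321 = 1703
denominator = math.sqrt(n * sum_x2 - sum_x ** 2) * math.sqrt(n * sum_y2 - sum_y ** 2)
r = numerator / denominator
print(round(r, 3))                        # 0.886
```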

Introduction to Regression Analysis
Regression analysis is used to:
- Predict the value of a dependent variable based on the value of at least one independent variable
- Explain the impact of changes in an independent variable on the dependent variable
Dependent variable: the variable we wish to explain
Independent variable: the variable used to explain the dependent variable

Simple Linear Regression Model
- Only one independent variable, x
- The relationship between x and y is described by a linear function
- Changes in y are assumed to be caused by changes in x

Types of Regression Models (plots omitted): positive linear relationship, negative linear relationship, relationship that is not linear, no relationship

Population Linear Regression
The population regression model:
y = β0 + β1x + ε
where y is the dependent variable, x is the independent variable, β0 is the population y intercept, β1 is the population slope coefficient, and ε is the random error term (residual). β0 + β1x is the linear component; ε is the random error component.

Linear Regression Assumptions
- Error values (ε) are statistically independent
- Error values are normally distributed for any given value of x
- The probability distribution of the errors is normal
- The probability distribution of the errors has constant variance
- The underlying relationship between the x variable and the y variable is linear

Population Linear Regression (graph omitted): for a given xi, the observed value of y differs from the predicted value on the line by the random error εi for that x value; the line has intercept β0 and slope β1.

Estimated Regression Model
The sample regression line provides an estimate of the population regression line:
ŷ = b0 + b1x
where ŷ is the estimated (or predicted) y value, b0 is the estimate of the regression intercept, b1 is the estimate of the regression slope, and x is the independent variable. The individual random error terms ei have a mean of zero.

Least Squares Criterion
b0 and b1 are obtained by finding the values that minimize the sum of the squared residuals:
Σe² = Σ(y - ŷ)² = Σ(y - (b0 + b1x))²

The Least Squares Equation
The formulas for b1 and b0 are:
b1 = Σ(x - x̄)(y - ȳ) / Σ(x - x̄)²
algebraic equivalent:
b1 = [ Σxy - (ΣxΣy)/n ] / [ Σx² - (Σx)²/n ]
and
b0 = ȳ - b1x̄
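Illustration (not in the original slides): a short Python sketch of these two formulas; the helper name least_squares is my own.

```python
def least_squares(x, y):
    """Return (b0, b1) for the least squares line y-hat = b0 + b1*x."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    s_xx = sum((xi - x_bar) ** 2 for xi in x)
    b1 = s_xy / s_xx
    b0 = y_bar - b1 * x_bar
    return b0, b1
```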

Interpretation of the Slope and the Intercept
- b0 is the estimated average value of y when the value of x is zero
- b1 is the estimated change in the average value of y as a result of a one-unit change in x

Simple Linear Regression Example
A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet). A random sample of 10 houses is selected.
- Dependent variable (y) = house price in $1000s
- Independent variable (x) = square feet

Sample Data for House Price Model
House Price in $1000s (y)   Square Feet (x)
245                         1400
312                         1600
279                         1700
308                         1875
199                         1100
219                         1550
405                         2350
324                         2450
319                         1425
255                         1700

Graphical Presentation (scatter plot and regression line omitted)
House price model: slope = 0.10977, intercept = 98.248

Interpretation of the Intercept, b0
b0 is the estimated average value of y when the value of x is zero (if x = 0 is in the range of observed x values). Here, no houses had 0 square feet, so b0 = 98.24833 just indicates that, for houses within the range of sizes observed, $98,248.33 is the portion of the house price not explained by square feet.

Interpretation of the Slope Coefficient, b1
b1 measures the estimated change in the average value of y as a result of a one-unit change in x. Here, b1 = 0.10977 tells us that the average value of a house increases by 0.10977($1000) = $109.77, on average, for each additional square foot of size.

Least Squares Regression Properties
- The sum of the residuals from the least squares regression line is 0: Σ(y - ŷ) = 0
- The sum of the squared residuals, Σ(y - ŷ)², is a minimum
- The simple regression line always passes through the mean of the y variable and the mean of the x variable
- The least squares coefficients are unbiased estimates of β0 and β1

Explained and Unexplained Variation
Total variation is made up of two parts: SST = SSR + SSE
- Total sum of squares: SST = Σ(y - ȳ)²
- Sum of squares error: SSE = Σ(y - ŷ)²
- Sum of squares regression: SSR = Σ(ŷ - ȳ)²
where ȳ = average value of the dependent variable, y = observed values of the dependent variable, and ŷ = estimated value of y for the given x value.

Explained and Unexplained Variation
- SST = total sum of squares: measures the variation of the yi values around their mean ȳ
- SSE = error sum of squares: variation attributable to factors other than the relationship between x and y
- SSR = regression sum of squares: explained variation attributable to the relationship between x and y

Explained and Unexplained Variation (graph omitted): at a given xi, the total deviation (yi - ȳ) splits into the explained part (ŷi - ȳ) and the unexplained part (yi - ŷi), giving SST = Σ(yi - ȳ)², SSR = Σ(ŷi - ȳ)², and SSE = Σ(yi - ŷi)².

Coefficient of Determination, R²
The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable. It is also called R-squared and is denoted R²:
R² = SSR / SST,  where 0 ≤ R² ≤ 1

Coefficient of Determination, R²
Note: in the single independent variable case, the coefficient of determination is R² = r², where R² = coefficient of determination and r = simple correlation coefficient.
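Illustrative sketch (my addition; the slides do not compute R² for the tree example): fitting the earlier tree data and decomposing the variation gives R² = SSR/SST of about 0.785, which matches r² = 0.886² for that example.

```python
# Tree data from the earlier correlation example (y = height, x = trunk diameter)
y = [35, 49, 27, 33, 60, 21, 45, 51]
x = [8, 9, 7, 6, 13, 7, 11, 12]

n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]

sst = sum((yi - y_bar) ** 2 for yi in y)            # total variation
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # unexplained variation
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)        # explained variation

r_squared = ssr / sst
print(round(r_squared, 3))      # about 0.785
print(round(0.886 ** 2, 3))     # r^2 from the earlier example, also about 0.785
```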

Examples of Approximate R² Values (plots omitted):
R² = 1: perfect linear relationship between x and y; 100% of the variation in y is explained by variation in x.

0 < R² < 1: weaker linear relationship between x and y; some but not all of the variation in y is explained by variation in x.

R² = 0: no linear relationship between x and y; the value of y does not depend on x (none of the variation in y is explained by variation in x).

Standard Error of Estimate
The standard deviation of the variation of observations around the regression line is estimated by
sε = √( SSE / (n - k - 1) )
where SSE = sum of squares error, n = sample size, and k = number of independent variables in the model.

Example: House Prices
House Price in $1000s (y)   Square Feet (x)   y²       x²         xy
245      1400     60025     1960000     343000
312      1600     97344     2560000     499200
279      1700     77841     2890000     474300
308      1875     94864     3515625     577500
199      1100     39601     1210000     218900
219      1550     47961     2402500     339450
405      2350     164025    5522500     951750
324      2450     104976    6002500     793800
319      1425     101761    2030625     454575
255      1700     65025     2890000     433500
Sums     2865     17150     853423      30983750    5085975
Means    286.5    1715

Example: House Prices
From the column totals above (Σy = 2865, Σx = 17150, Σy² = 853423, Σx² = 30983750, Σxy = 5085975; ȳ = 286.5, x̄ = 1715):
b1 = 0.1098, b0 = 98.25
Estimated regression equation: house price = 98.25 + 0.1098 (square feet)
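A quick check (my addition) that the coefficients follow from the column totals shown above:

```python
# Column totals from the house price table above
n = 10
sum_y, sum_x = 2865, 17150
sum_y2, sum_x2, sum_xy = 853423, 30983750, 5085975

s_xy = sum_xy - sum_x * sum_y / n     # 172500.0
s_xx = sum_x2 - sum_x ** 2 / n        # 1571500.0
b1 = s_xy / s_xx                      # 0.1098 (more precisely 0.10977)
b0 = sum_y / n - b1 * sum_x / n       # 98.25 (more precisely 98.24833)
print(round(b1, 4), round(b0, 2))
```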

Example: House Prices
Estimated regression equation: house price = 98.25 + 0.1098 (square feet)
(data as in the sample table above)
Predict the price for a house with 2000 square feet.

Example: House Prices
Predict the price for a house with 2000 square feet:
ŷ = 98.25 + 0.1098(2000) = 317.85
The predicted price for a house with 2000 square feet is 317.85 ($1000s) = $317,850.
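A minimal sketch of the prediction step (my addition), using the rounded coefficients from the slide:

```python
b0, b1 = 98.25, 0.1098        # rounded coefficients from the slide
x_new = 2000                  # square feet
y_hat = b0 + b1 * x_new       # 317.85 (in $1000s), i.e. about $317,850
print(y_hat)
# With the unrounded coefficients (98.24833, 0.10977) the prediction is about 317.79.
```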

The Standard Deviation of the Regression Slope
The standard error of the regression slope coefficient (b1) is estimated by
s_b1 = sε / √Σ(x - x̄)²  =  sε / √( Σx² - (Σx)²/n )
where s_b1 = estimate of the standard error of the least squares slope and sε = sample standard error of the estimate.
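Illustration (my addition): computing sε from the earlier standard-error-of-estimate slide and s_b1 from this formula, using the house-price totals; the result is consistent with the Excel output shown later (0.03297).

```python
import math

# Summary totals from the house price example
n = 10
sum_y, sum_x = 2865, 17150
sum_y2, sum_x2, sum_xy = 853423, 30983750, 5085975

s_xy = sum_xy - sum_x * sum_y / n       # 172500.0
s_xx = sum_x2 - sum_x ** 2 / n          # 1571500.0
sst = sum_y2 - sum_y ** 2 / n           # 32600.5
b1 = s_xy / s_xx
sse = sst - b1 * s_xy                   # unexplained variation
s_eps = math.sqrt(sse / (n - 2))        # standard error of the estimate, about 41.33
s_b1 = s_eps / math.sqrt(s_xx)          # standard error of the slope, about 0.033
print(round(s_eps, 2), round(s_b1, 4))
```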

Comparing Standard Errors (plots omitted): variation of observed y values from the regression line versus variation in the slope of regression lines from different possible samples.

Inference about the Slope: t Test
t test for a population slope: is there a linear relationship between x and y?
Null and alternative hypotheses:
H0: β1 = 0 (no linear relationship)
H1: β1 ≠ 0 (linear relationship does exist)
Test statistic:
t = (b1 - β1) / s_b1,  d.f. = n - 2
where b1 = sample regression slope coefficient, β1 = hypothesized slope, and s_b1 = estimator of the standard error of the slope.

Inference about the Slope: t Test
Estimated regression equation: house price = 98.25 + 0.1098 (square feet)
(data as in the sample table above)
The slope of this model is 0.1098. Does square footage of the house affect its sales price?

Inference about the Slope: t Test Example
H0: β1 = 0,  HA: β1 ≠ 0
From Excel output:
              Coefficients   Standard Error   t Stat    P-value
Intercept     98.24833       58.03348         1.69296   0.12892
Square Feet   0.10977        0.03297          3.32938   0.01039
Test statistic: t = b1 / s_b1 = 0.10977 / 0.03297 = 3.329,  d.f. = 10 - 2 = 8
Critical values for α/2 = .025: ±tα/2 = ±2.3060
Decision: since t = 3.329 > 2.3060, reject H0.
Conclusion: there is sufficient evidence that square footage affects house price.
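A short sketch of the test (my addition), plugging in the coefficient and standard error from the Excel output above:

```python
# Values from the Excel output above
b1, s_b1 = 0.10977, 0.03297
beta1_null = 0                       # H0: slope = 0
t_stat = (b1 - beta1_null) / s_b1    # about 3.329
t_crit = 2.3060                      # t for alpha/2 = .025 with d.f. = 8 (from the slide)
reject_h0 = abs(t_stat) > t_crit
print(round(t_stat, 3), reject_h0)   # 3.329 True -> reject H0
```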

Regression Analysis for Description
Confidence interval estimate of the slope: b1 ± tα/2 s_b1,  d.f. = n - 2
Excel printout for house prices:
              Coefficients   Standard Error   t Stat    P-value   Lower 95%    Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720    232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374      0.18580
At the 95% level of confidence, the confidence interval for the slope is (0.0337, 0.1858).

Regression Analysis for Description
(Excel printout as above.) Since the units of the house price variable are $1000s, we are 95% confident that the average impact on sales price is between $33.70 and $185.80 per additional square foot of house size.
This 95% confidence interval does not include 0. Conclusion: there is a significant relationship between house price and square feet at the .05 level of significance.
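A short sketch of the interval calculation (my addition), using the same Excel output values:

```python
# 95% confidence interval for the slope, from the Excel output above
b1, s_b1 = 0.10977, 0.03297
t_crit = 2.3060                       # t for alpha/2 = .025, d.f. = 8
lower = b1 - t_crit * s_b1            # about 0.0337
upper = b1 + t_crit * s_b1            # about 0.1858
print(round(lower, 4), round(upper, 4))
```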

Case Study
The following data give the Sales and Net Profit for some of the top auto-makers during the quarter July-Sep 2006 (Rs. Crores):
Company         Sales    Net Profit
Tata Motors     6484.8   466.0
Hero Honda      2196.5   224.2
Bajaj Auto      2444.7   345.4
TVS Motor       1032.9   35.1
Bharat Forge    461.6    63.4
Ashok Leyland   1635.8   94.7
M&M             2365.5   200.6
Maruti Udyog    3426.5   315.7

Case Study
1. Fit a regression equation for Net Profit on Sales.
2. Maruti expects its Sales to go up to 3800 crores with the introduction of a new segment A2 model. Find a 90% confidence interval for the expected net profit.
3. Find the coefficient of determination and the correlation between Sales and Net Profit. Does this indicate a strong association?
4. An earlier study done in 2001 indicated that for a sale of 100 the net profit was 80. Does the current data indicate a change in this trend? Test at the 5% level of significance.
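A hedged Python sketch (my addition) for parts 1-3 and the interval in part 2; it assumes SciPy is available for the t quantile, and the interval shown is for the mean net profit at Sales = 3800. Part 4 can be handled with the same t-test pattern as the slope test above, using the hypothesized value implied by the 2001 study.

```python
import math
from scipy import stats   # used only for the t quantile; assumes SciPy is installed

sales = [6484.8, 2196.5, 2444.7, 1032.9, 461.6, 1635.8, 2365.5, 3426.5]
profit = [466.0, 224.2, 345.4, 35.1, 63.4, 94.7, 200.6, 315.7]

n = len(sales)
x_bar, y_bar = sum(sales) / n, sum(profit) / n
s_xx = sum((x - x_bar) ** 2 for x in sales)
s_yy = sum((y - y_bar) ** 2 for y in profit)
s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(sales, profit))

# 1. Regression of Net Profit on Sales
b1 = s_xy / s_xx
b0 = y_bar - b1 * x_bar

# 3. Correlation and coefficient of determination
r = s_xy / math.sqrt(s_xx * s_yy)
r_squared = r ** 2

# 2. Point estimate and 90% confidence interval for the mean net profit at Sales = 3800
x0 = 3800
y_hat = b0 + b1 * x0
sse = s_yy - b1 * s_xy
s_eps = math.sqrt(sse / (n - 2))
se_mean = s_eps * math.sqrt(1 / n + (x0 - x_bar) ** 2 / s_xx)
t_crit = stats.t.ppf(0.95, n - 2)        # two-sided 90% interval
lower, upper = y_hat - t_crit * se_mean, y_hat + t_crit * se_mean

print(f"Net Profit = {b0:.2f} + {b1:.4f} * Sales")
print(f"r = {r:.3f}, R^2 = {r_squared:.3f}")
print(f"Mean net profit at Sales = 3800: {y_hat:.1f}, 90% CI: ({lower:.1f}, {upper:.1f})")
```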

Levin Rubin - Computations for Ch 12 problems
Problem   n    ∑X       ∑Y       ∑X²       ∑Y²        Mean X    Mean Y    ∑XY
12-16     10   37.2     75.5     147.18    597.03     3.72      7.55      295.95
12-17     5    28       25       158.5     175        5.6       5         131
12-18     8    340      5500     15500     3830800    42.5      687.5     227200
12-19     8    120      228      2100      8884       15        28.5      2580
12-20     8    28       260      128       9320       3.5       32.5      1047
12-21     10   34       1371     122.62    201121     3.4       137.1     4954
12-22     7    146      48       3550      352        20.857    6.857     1101
12-23     8    104.05   1630     159.70    349814     13        203.75    17746.6
12-24     13   185.8    384      3177.12   11755.6    14.29     29.54     5080.27
12-31     13   42       117.6    158       1187.84    3.23      9.046     337
12-32     8    23       66       99        700        2.875     8.25      246
12-34     12   128.7    306.95   1633.41   8838.59    10.725    25.58     3786.64
12-37     11   16       1071     23.88     106285     1.4545    97.3636   1591.8