Presentation transcript:

Linear Regression and Correlation Analysis

Regression Analysis
Regression analysis attempts to determine the strength of the relationship between one dependent variable (usually denoted by Y) and a series of other changing variables (known as independent variables).

Regression Analysis  Regression Analysis is used to:  Predict the value of a dependent variable based on the value of at least one independent variable  Explain the impact of changes in an independent variable on the dependent variable  Dependent Variable: The variable we wish to explain  Independent Variable: The variable used to explain the dependent variable

Regression Analysis
One of the two variables (X, Y), say X, is an independent (controlled, ordinary) variable; Y is a random variable. The relationship between X and Y is described by a linear function. This is the dependence of Y on X, or the regression of Y on X. Examples: the dependence of blood pressure Y on X; the regression of weight gain Y on food ration X.

Types of Regression Models
– Positive linear relationship
– Negative linear relationship
– Relationship NOT linear
– No relationship

Regression Analysis
In regression analysis the dependence of Y on x is the dependence of the mean µ of Y on x: µ = µ(x). The curve of µ(x) is called the regression curve. A straight regression line is µ(x) = κ₀ + κ₁x.

Estimated Regression Model
The sample regression line provides an estimate of the population regression line:
ŷᵢ = b₀ + b₁xᵢ
where ŷᵢ is the estimated (or predicted) value, b₀ the estimate of the regression intercept, b₁ the estimate of the regression slope, and xᵢ the independent variable.

Least Squares Principle
The straight line should be fitted through the given points so that the sum of the squares of the distances of those points from the line is minimum, where the distance is measured in the vertical direction (the y-direction). It is assumed that the x-values x₁, x₂, …, xₙ in the sample (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ) are not all equal.

Sample Regression Line (Regression Coefficient)
The sample regression line of y on x is
ŷ − ȳ = k₁(x − x̄),  with regression coefficient (slope)  k₁ = s_xy / s_x².

Sample Regression Line
s_xy = sample covariance = [1/(n − 1)] Σ (x_j − x̄)(y_j − ȳ)
s_x² = variance of the x values = [1/(n − 1)] Σ (x_j − x̄)²
s_y² = variance of the y values = [1/(n − 1)] Σ (y_j − ȳ)²

Sample Regression Line — Example
The decrease of volume y [%] of leather for certain fixed values of high pressure x [atmospheres] was measured. Find the regression line of y on x.
x_j:   4,000    6,000    8,000    10,000
y_j:     2.3      4.1      5.7       6.9

Sample Regression Line (Regression Coefficient)
In terms of the column sums, the slope and intercept are
k₁ = [n Σx_j y_j − (Σx_j)(Σy_j)] / [n Σx_j² − (Σx_j)²],   k₀ = ȳ − k₁x̄.

Sample Regression Line
Given values              Auxiliary values
x_j        y_j            x_j²            x_j y_j
4,000      2.3            16,000,000        9,200
6,000      4.1            36,000,000       24,600
8,000      5.7            64,000,000       45,600
10,000     6.9           100,000,000       69,000
Σx_j = 28,000   Σy_j = 19   Σx_j² = 216,000,000   Σx_j y_j = 148,400

k₁ = [4(148,400) − (28,000)(19)] / [4(216,000,000) − (28,000)²] = 61,600 / 80,000,000 = 0.00077
k₀ = ȳ − k₁x̄ = 4.75 − 0.00077(7,000) = −0.64
Regression line: y = −0.64 + 0.00077x
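A quick numerical check of this example — a minimal sketch using Python and NumPy (not part of the original slides):

```python
import numpy as np

x = np.array([4000.0, 6000.0, 8000.0, 10000.0])   # pressure x [atmospheres]
y = np.array([2.3, 4.1, 5.7, 6.9])                 # decrease of volume y [%]

# slope k1 = s_xy / s_x^2, intercept k0 = ybar - k1 * xbar
k1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
k0 = y.mean() - k1 * x.mean()
print(k0, k1)   # approximately -0.64 and 0.00077, i.e. y = -0.64 + 0.00077x
```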

Formula: Regression Line
Intercept:  b₀ = [(Σy)(Σx²) − (Σx)(Σxy)] / [n(Σx²) − (Σx)²]
Slope of the line:  b₁ = [n(Σxy) − (Σx)(Σy)] / [n(Σx²) − (Σx)²]

Equation of Regression Line
Subject   Age X   Pressure Y   XY   X²   Y²
A         43      128
B         48      120
C         56      135
D         61      143
E         67      141
F         70      152
Σ         345     819

Equation of Regression Line
Subject   Age X   Pressure Y      XY       X²       Y²
A         43      128           5,504    1,849   16,384
B         48      120           5,760    2,304   14,400
C         56      135           7,560    3,136   18,225
D         61      143           8,723    3,721   20,449
E         67      141           9,447    4,489   19,881
F         70      152          10,640    4,900   23,104
Σ         345     819          47,634   20,399  112,443
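Completing this example, the slope and intercept follow from the column totals and the formulas on the "Formula: Regression Line" slide; a short Python sketch (values as given in the table, result rounded):

```python
n = 6
sum_x, sum_y = 345, 819
sum_xy, sum_x2 = 47_634, 20_399

b1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)   # slope
b0 = (sum_y - b1 * sum_x) / n                                   # intercept
print(b0, b1)   # roughly 81.0 and 0.96, i.e. pressure ≈ 81.0 + 0.96 * age
```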

Problem 1: Find and sketch or graph the sample regression line of y on x for the points (−1, 1), (0, 1.7), and (1, 3).

Problem 3: Find and sketch or graph the sample regression line of y on x for the points (2, 12), (5, 24), (9, 33), and (14, 50).

Problem 5: Find and sketch or graph the sample regression line of y on x for the given data, and also find the stopping distance at 35 mph.
Speed x [mph] of a car | Stopping distance y [ft]

Problem 7: Find and sketch or graph the sample regression line of y on x for the given data.
Revolutions per minute x | Power of a diesel engine y [hp]

Problem 9: Find and sketch or graph the sample regression line of y on x for the given data.
Voltage x [V] | Current y [A]

Confidence Interval: Assumptions
– The x-values x₁, x₂, …, xₙ in the sample (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ) are not all equal.
– For each fixed x, the random variable Y is normal with mean µ(x) = κ₀ + κ₁x and variance σ² independent of x.
– The n performances of the experiment by which we obtain the sample (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ) are independent.

Steps to Find a Confidence Interval for the Regression Coefficient κ₁
– Choose a confidence level γ (95%, 99%, etc.).
– Determine the solution c of the equation F(c) = ½(1 + γ) from the table of the t-distribution with n − 2 degrees of freedom.
– Using the sample, compute (n − 1)s_x², (n − 1)s_xy, and k₁ = s_xy / s_x².
– Compute K = c·sqrt( q₀ / [(n − 2)(n − 1)s_x²] ), where q₀ = (n − 1)(s_y² − k₁²s_x²) is the residual sum of squares.
– The confidence interval is k₁ − K ≤ κ₁ ≤ k₁ + K.
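The steps above can be scripted; the following Python sketch (SciPy assumed for the t-quantile) mirrors them and, purely for illustration, re-uses the leather/pressure data from the earlier example:

```python
import numpy as np
from scipy import stats

def slope_confidence_interval(x, y, gamma=0.95):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    sxx = np.sum((x - x.mean()) ** 2)               # (n-1) * s_x^2
    sxy = np.sum((x - x.mean()) * (y - y.mean()))   # (n-1) * s_xy
    syy = np.sum((y - y.mean()) ** 2)               # (n-1) * s_y^2
    k1 = sxy / sxx                                  # sample regression coefficient
    q0 = syy - k1 * sxy                             # residual sum of squares
    c = stats.t.ppf(0.5 * (1 + gamma), df=n - 2)    # solution of F(c) = (1+gamma)/2
    K = c * np.sqrt(q0 / ((n - 2) * sxx))
    return k1 - K, k1 + K

# illustration with the leather/pressure data from the earlier slide
print(slope_confidence_interval([4000, 6000, 8000, 10000], [2.3, 4.1, 5.7, 6.9]))
```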

Problem 11: Find a 95% confidence interval for the regression coefficient κ₁, assuming that assumptions (A2) and (A3) hold, for the given sample data.
x = deformation of a certain steel [mm] | y = Brinell hardness [kg/mm²]

Problem 12: Find a 95% confidence interval for the regression coefficient κ₁, assuming that assumptions (A2) and (A3) hold, for the given sample data.
Revolutions per minute x | Power of a diesel engine y [hp]

Problem 13: Find a 95% confidence interval for the regression coefficient κ₁, assuming that assumptions (A2) and (A3) hold, for the given sample data.
Humidity of air x [%] | Expansion of gelatin y [%]

Correlation Analysis
Both variables (X, Y) are random variables. Examples:
– Grades X and Y of students in mathematics and physics
– Hardness X of steel plates in the centre and hardness Y near the edges of the plates

Scatter Plots and Correlation
A scatter plot (or scatter diagram) is used to show the relationship between two variables. Correlation analysis is used to measure the strength of the association (linear relationship) between two variables:
– It is concerned only with the strength of the relationship
– No causal effect is implied

Scatter Plot Examples: [scatter plots contrasting linear relationships with curvilinear relationships]

Scatter Plot Examples: [scatter plots contrasting strong relationships with weak relationships]

Correlation Coefficient The population correlation coefficient ρ (rho) measures the strength of the association between the variables The sample correlation coefficient r is an estimate of ρ and is used to measure the strength of the linear relationship in the sample observations

Features of ρ and r
– Unit free
– Range between −1 and 1
– The closer to −1, the stronger the negative linear relationship
– The closer to 1, the stronger the positive linear relationship
– The closer to 0, the weaker the linear relationship

Approximate r Values: [scatter plots illustrating r = −1, r = −.6, r = 0, r = +.3, and r = +1]

Population Correlation Coefficient
Theorem: The correlation coefficient ρ satisfies −1 ≤ ρ ≤ 1. In particular, ρ = ±1 if and only if X and Y are linearly related, i.e., Y = γX + δ (equivalently X = γ*Y + δ*). X and Y are called uncorrelated if ρ = 0.

Theorem (Normal Distribution)
Independent X and Y are uncorrelated. If (X, Y) is normal, then uncorrelated X and Y are also independent.
Example (uncorrelated but dependent): let X assume −1, 0, 1, each with probability 1/3, and let Y = X². Then E(X) = 0 and E(XY) = E(X³) = 0, so ρ = 0 and X, Y are uncorrelated, although Y depends completely on X.

Test for the Correlation Coefficient ρ
– Test H₀: ρ = 0 against H₁: ρ > 0.
– Choose a significance level α (5%, 1%, etc.).
– Determine the solution c of the equation P(T ≤ c) = 1 − α from the table of the t-distribution with n − 2 degrees of freedom.
– Compute r and t = r·sqrt[(n − 2)/(1 − r²)].
– If t ≤ c, accept the hypothesis; otherwise reject it.

Example: Correlation Coefficient
Sample data for n = 8 trees: trunk diameter x and tree height y (individual observations omitted).
Column totals: Σx = 73, Σy = 321, Σxy = 3,142, Σx² = 713, Σy² = 14,111

Calculation Example (trunk diameter x, tree height y)
r = [nΣxy − (Σx)(Σy)] / sqrt{[nΣx² − (Σx)²][nΣy² − (Σy)²]}
  = [8(3,142) − (73)(321)] / sqrt{[8(713) − 73²][8(14,111) − 321²]}
  ≈ 0.886 → relatively strong positive linear association between x and y
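The same value can be reproduced from the column totals; a short Python sketch:

```python
from math import sqrt

n = 8
sum_x, sum_y = 73, 321
sum_xy, sum_x2, sum_y2 = 3_142, 713, 14_111

r = (n * sum_xy - sum_x * sum_y) / sqrt(
    (n * sum_x2 - sum_x ** 2) * (n * sum_y2 - sum_y ** 2)
)
print(round(r, 3))   # about 0.886 -> relatively strong positive association
```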

Excel Output
Excel correlation output (Tools / Data Analysis / Correlation…) gives the correlation between tree height and trunk diameter, ≈ 0.886.

Significance Test for Correlation
Hypotheses: H₀: ρ = 0 (no correlation); Hₐ: ρ ≠ 0 (correlation exists)
Test statistic: t = r / sqrt[(1 − r²)/(n − 2)]   (with n − 2 degrees of freedom)

Example: Tree Height and Trunk Diameter
Is there evidence of a linear relationship between tree height and trunk diameter at the .05 level of significance?
H₀: ρ = 0 (no correlation); H₁: ρ ≠ 0 (correlation exists)
α = .05, df = 8 − 2 = 6

Example: Test Solution
t = r / sqrt[(1 − r²)/(n − 2)] = 0.886 / sqrt[(1 − 0.886²)/6] ≈ 4.68
Rejection region: |t| > t_{α/2} = 2.4469 (α/2 = .025, d.f. = 8 − 2 = 6)
Decision: reject H₀, since 4.68 > 2.4469.
Conclusion: there is evidence of a linear relationship at the 5% level of significance.
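A sketch of the same test in Python (SciPy is assumed for the critical value; otherwise use a t-table):

```python
from math import sqrt
from scipy import stats

r, n, alpha = 0.886, 8, 0.05
t_stat = r * sqrt(n - 2) / sqrt(1 - r ** 2)          # test statistic
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 2)        # two-tailed critical value

print(round(t_stat, 2), round(t_crit, 4))            # about 4.68 and 2.4469
print("reject H0" if abs(t_stat) > t_crit else "do not reject H0")
```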

Population Linear Regression
The population regression model:
yᵢ = β₀ + β₁xᵢ + εᵢ
where yᵢ = dependent variable, xᵢ = independent variable, β₀ = population y-intercept, β₁ = population slope coefficient, and εᵢ = random error term (residual). β₀ + β₁xᵢ is the linear component and εᵢ is the random error component.

Linear Regression Assumptions
– Error values (ε) are statistically independent
– Error values are normally distributed for any given value of x
– The probability distribution of the errors has constant variance
– The underlying relationship between the x variable and the y variable is linear

Population Linear Regression (continued): [graph of the population regression line with intercept β₀ and slope β₁, showing, for a given xᵢ, the observed value of y, the predicted value of y on the line, and the random error εᵢ for that x value]

Residual Analysis
Purposes:
– Examine the linearity assumption
– Examine whether the variance is constant for all levels of x
– Evaluate the normal distribution assumption
Graphical analysis of residuals:
– Plot residuals vs. x
– Create a histogram of the residuals to check for normality
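A minimal sketch of such a residual check in Python (matplotlib assumed); any (x, y) sample can be substituted — the leather/pressure data from earlier are used only as a placeholder:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.array([4000.0, 6000.0, 8000.0, 10000.0])
y = np.array([2.3, 4.1, 5.7, 6.9])

# fit the least-squares line and compute the residuals
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
residuals = y - (b0 + b1 * x)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.scatter(x, residuals)            # residuals vs. x: look for curvature or fanning
ax1.axhline(0, linestyle="--")
ax1.set(xlabel="x", ylabel="residual")
ax2.hist(residuals)                  # rough check of the normality assumption
ax2.set(xlabel="residual")
plt.tight_layout()
plt.show()
```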

Residual Analysis for Linearity: [residual-vs-x plots contrasting a not-linear case with a linear case]

Residual Analysis for Constant Variance: [residual-vs-x plots contrasting non-constant variance with constant variance]

Excel Output: RESIDUAL OUTPUT table listing the predicted house price and residual for each observation.

Simple Linear Regression Example
A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet). A random sample of 10 houses is selected:
– Dependent variable (y) = house price in $1000s
– Independent variable (x) = square feet

Sample Data for House Price Model: house price in $1000s (y) and square feet (x) for the 10 sampled houses.

Regression Using Excel

Excel Output (summary)
Regression statistics: R Square = 0.5808; Observations = 10. ANOVA table and coefficient estimates: Intercept = 98.248, Square Feet = 0.10977.
The regression equation is:
house price = 98.248 + 0.10977 (square feet)

Graphical Presentation
House price model: scatter plot and regression line.
Slope = 0.10977
Intercept = 98.248

Interpretation of the Intercept, b₀
b₀ is the estimated average value of Y when the value of X is zero (if x = 0 is in the range of observed x values). Here, no houses had 0 square feet, so b₀ = 98.248 just indicates that, for houses within the range of sizes observed, $98,248 is the portion of the house price not explained by square feet.

Interpretation of the Slope Coefficient, b₁
b₁ measures the estimated change in the average value of Y as a result of a one-unit change in X. Here, b₁ = 0.10977 tells us that the average value of a house increases by 0.10977($1000) = $109.77, on average, for each additional square foot of size.

Least Squares Regression Properties
– The sum of the residuals from the least squares regression line is 0: Σ(y − ŷ) = 0.
– The sum of the squared residuals, Σ(y − ŷ)², is a minimum.
– The simple regression line always passes through the mean of the y variable and the mean of the x variable (x̄, ȳ).
– The least squares coefficients b₀ and b₁ are unbiased estimates of β₀ and β₁.

Explained and Unexplained Variation
Total variation is made up of two parts:
SST = SSR + SSE
(total sum of squares = sum of squares regression + sum of squares error)
SST = Σ(y − ȳ)²,  SSR = Σ(ŷ − ȳ)²,  SSE = Σ(y − ŷ)²
where ȳ = average value of the dependent variable, y = observed values of the dependent variable, and ŷ = estimated value of y for the given x value.

Explained and Unexplained Variation (continued)
– SST = total sum of squares: measures the variation of the yᵢ values around their mean ȳ
– SSE = error sum of squares: variation attributable to factors other than the relationship between x and y
– SSR = regression sum of squares: explained variation attributable to the relationship between x and y

Explained and Unexplained Variation (graphical): [diagram showing, at a given xᵢ, the decomposition SST = Σ(yᵢ − ȳ)², SSE = Σ(yᵢ − ŷᵢ)², SSR = Σ(ŷᵢ − ȳ)² around the fitted line]

Coefficient of Determination, R²
The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable. It is also called R-squared and is denoted R²:
R² = SSR / SST,  where 0 ≤ R² ≤ 1

Coefficient of Determination, R² (continued)
Note: in the single independent variable case, the coefficient of determination is
R² = r²
where R² = coefficient of determination and r = simple correlation coefficient.
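A small Python/NumPy sketch illustrating the identities R² = SSR/SST = 1 − SSE/SST = r², re-using the leather/pressure data from earlier purely as an arbitrary example:

```python
import numpy as np

x = np.array([4000.0, 6000.0, 8000.0, 10000.0])
y = np.array([2.3, 4.1, 5.7, 6.9])

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)  # regression (explained) sum of squares
sse = np.sum((y - y_hat) ** 2)         # error (unexplained) sum of squares

r = np.corrcoef(x, y)[0, 1]
print(ssr / sst, 1 - sse / sst, r ** 2)   # all three agree
```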

Examples of Approximate R² Values
R² = 1: perfect linear relationship between x and y; 100% of the variation in y is explained by variation in x.

0 < R² < 1: weaker linear relationship between x and y; some but not all of the variation in y is explained by variation in x.

R² = 0: no linear relationship between x and y; the value of y does not depend on x (none of the variation in y is explained by variation in x).

Excel Output
R Square = 0.5808: 58.08% of the variation in house prices is explained by variation in square feet.

Standard Error of Estimate
The standard deviation of the variation of observations around the regression line is estimated by
s_ε = sqrt[ SSE / (n − k − 1) ]
where SSE = sum of squares error, n = sample size, and k = number of independent variables in the model (k = 1 in simple regression, so s_ε = sqrt[SSE/(n − 2)]).

The Standard Deviation of the Regression Slope
The standard error of the regression slope coefficient (b₁) is estimated by
s_b1 = s_ε / sqrt[ Σ(x − x̄)² ] = s_ε / sqrt[ Σx² − (Σx)²/n ]
where s_b1 = estimate of the standard error of the least squares slope and s_ε = sample standard error of the estimate.

Excel Output
The coefficients table gives the standard error of the Square Feet slope, s_b1 ≈ 0.033, which is the quantity needed for the t test of the slope below.

Comparing Standard Errors: s_ε measures the variation of observed y values from the regression line; s_b1 measures the variation in the slope of regression lines estimated from different possible samples.

Inference about the Slope: t Test
t test for a population slope: is there a linear relationship between x and y?
Null and alternative hypotheses: H₀: β₁ = 0 (no linear relationship); H₁: β₁ ≠ 0 (a linear relationship does exist)
Test statistic: t = (b₁ − β₁) / s_b1, with n − 2 degrees of freedom
where b₁ = sample regression slope coefficient, β₁ = hypothesized slope, and s_b1 = estimator of the standard error of the slope.

Inference about the Slope: t Test (continued)
House price in $1000s (y) vs. square feet (x). Estimated regression equation: house price = 98.248 + 0.10977 (square feet). The slope of this model is 0.10977. Does square footage of the house affect its sales price?

Inferences about the Slope: t Test Example
H₀: β₁ = 0, Hₐ: β₁ ≠ 0
From the Excel output: b₁ = 0.10977, s_b1 ≈ 0.033, so the test statistic is t = b₁ / s_b1 ≈ 3.33.
Rejection region: |t| > t_{α/2} = 2.3060 (α/2 = .025, d.f. = 10 − 2 = 8).
Decision: reject H₀, since 3.33 > 2.3060.
Conclusion: there is sufficient evidence that square footage affects house price.
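A Python sketch of this slope test; SciPy is assumed, and the slope standard error s_b1 = 0.03297 is an assumed value consistent with the reported slope and its 95% lower confidence bound:

```python
from scipy import stats

n = 10
b1 = 0.10977                 # estimated slope (in $1000s per square foot)
s_b1 = 0.03297               # standard error of the slope (assumed value)
t_stat = (b1 - 0) / s_b1     # test of H0: beta1 = 0

t_crit = stats.t.ppf(1 - 0.025, df=n - 2)        # about 2.3060
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)

print(round(t_stat, 3), round(t_crit, 4), round(p_value, 4))
# about 3.33 > 2.306, p ≈ 0.01 -> reject H0: square footage affects house price
```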

Regression Analysis for Description
Confidence interval estimate of the slope: b₁ ± t_{α/2}·s_b1, with d.f. = n − 2.
Excel printout for house prices: at the 95% level of confidence, the confidence interval for the slope is (0.0337, 0.1858).

Regression Analysis for Description (continued)
Since the units of the house price variable are $1000s, we are 95% confident that the average impact on sales price is between $33.70 and $185.80 per square foot of house size. This 95% confidence interval does not include 0.
Conclusion: there is a significant relationship between house price and square feet at the .05 level of significance.

Confidence Interval for the Average y, Given x
Confidence interval estimate for the mean of y given a particular x_p:
ŷ ± t_{α/2}·s_ε·sqrt[ 1/n + (x_p − x̄)² / Σ(x − x̄)² ]
The size of the interval varies according to the distance of x_p from the mean x̄.

Confidence Interval for an Individual y, Given x
Confidence (prediction) interval estimate for an individual value of y given a particular x_p:
ŷ ± t_{α/2}·s_ε·sqrt[ 1 + 1/n + (x_p − x̄)² / Σ(x − x̄)² ]
The extra term (the 1 under the square root) adds to the interval width to reflect the added uncertainty for an individual case.
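Both interval formulas can be packaged into one helper; a Python/SciPy sketch (the function name and arguments are illustrative, not from the slides):

```python
import numpy as np
from scipy import stats

def intervals(x, y, x_p, alpha=0.05):
    """Return (confidence interval for mean y at x_p, prediction interval at x_p)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    y_hat = b0 + b1 * x_p                          # point prediction at x_p

    sse = np.sum((y - (b0 + b1 * x)) ** 2)
    s_eps = np.sqrt(sse / (n - 2))                 # standard error of estimate
    t = stats.t.ppf(1 - alpha / 2, df=n - 2)
    dist = (x_p - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2)

    ci = t * s_eps * np.sqrt(1 / n + dist)         # half-width: mean of y at x_p
    pi = t * s_eps * np.sqrt(1 + 1 / n + dist)     # half-width: individual y at x_p
    return (y_hat - ci, y_hat + ci), (y_hat - pi, y_hat + pi)

# illustration with the leather/pressure data from earlier
print(intervals([4000, 6000, 8000, 10000], [2.3, 4.1, 5.7, 6.9], x_p=7000))
```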

Interval Estimates for Different Values of x: [graph of ŷ = b₀ + b₁x showing that the prediction interval for an individual y, given x_p, is wider than the confidence interval for the mean of y, given x_p, and that both widen as x_p moves away from x̄]

Example: House Prices
House price in $1000s (y) vs. square feet (x). Estimated regression equation: house price = 98.25 + 0.1098 (square feet). Predict the price for a house with 2000 square feet.

Example: House Prices (continued)
Predict the price for a house with 2000 square feet:
house price = 98.25 + 0.1098(2000) = 317.85
The predicted price for a house with 2000 square feet is 317.85 ($1,000s) = $317,850.

Estimation of Mean Values: Example
Find the 95% confidence interval for the average price of 2,000-square-foot houses.
Predicted price ŷᵢ ≈ 317.85 ($1,000s).
Confidence interval estimate for E(y)|x_p: the interval runs from approximately $281,000 up to $354,900.

Estimation of Individual Values: Example
Find the 95% prediction interval for an individual house with 2,000 square feet.
Predicted price ŷᵢ ≈ 317.85 ($1,000s).
Prediction interval estimate for y|x_p: the interval runs from approximately $215,500 up to $420,070.