Correlation and Regression


1 Correlation and Regression
9-2 / 9-3 Correlation and Regression

2 Linear Correlation Coefficient r
Definition: the Linear Correlation Coefficient r measures the strength of the linear relationship between the paired x and y values in a sample.
r = [nΣxy − (Σx)(Σy)] / √{[nΣx² − (Σx)²][nΣy² − (Σy)²]}
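A minimal Python sketch of this formula; the function name is illustrative and not part of the slides:

```python
import math

def linear_correlation(x, y):
    """Linear correlation coefficient r for paired sample data (illustrative helper)."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi ** 2 for xi in x)
    syy = sum(yi ** 2 for yi in y)
    numerator = n * sxy - sx * sy
    denominator = math.sqrt(n * sxx - sx ** 2) * math.sqrt(n * syy - sy ** 2)
    return numerator / denominator
```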

3 Formulas for b0 and b1
b0 = [(Σy)(Σx²) − (Σx)(Σxy)] / [nΣx² − (Σx)²]   (y-intercept)
b1 = [nΣxy − (Σx)(Σy)] / [nΣx² − (Σx)²]   (slope)
Encourage the use of calculators for these formulas. Most inexpensive non-graphing calculators will compute these two values after the data have been entered into the calculator.

4 Review Calculations
Data from the Garbage Project:
x = plastic discarded (lb):  0.27   1.41   2.19   2.83   2.19   1.81   0.85   3.05
y = household size:          2      3      3      6      4      2      1      5
Find the correlation and the regression equation (line of best fit).

5 Review Calculations
x = plastic discarded (lb):  0.27   1.41   2.19   2.83   2.19   1.81   0.85   3.05
y = household size:          2      3      3      6      4      2      1      5
Using a calculator:  b0 = 0.549,  b1 = 1.48,  r = 0.842
ŷ = 0.549 + 1.48x
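As a cross-check of the calculator results, a sketch using Python's standard-library statistics module (Python 3.10+ is assumed; the module choice is not part of the slides):

```python
from statistics import correlation, linear_regression  # Python 3.10+

# Garbage Project sample: x = plastic discarded (lb), y = household size
x = [0.27, 1.41, 2.19, 2.83, 2.19, 1.81, 0.85, 3.05]
y = [2, 3, 3, 6, 4, 2, 1, 5]

r = correlation(x, y)
fit = linear_regression(x, y)   # slope and intercept of the least-squares line
print(round(r, 3))                                    # 0.842
print(round(fit.intercept, 3), round(fit.slope, 2))   # 0.549 1.48
```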

6 Notes on Correlation
r represents the linear correlation coefficient for a sample.
ρ (rho) represents the linear correlation coefficient for a population.
−1 ≤ r ≤ 1
r measures the strength of a linear relationship: −1 is perfect negative correlation and +1 is perfect positive correlation.

7 Interpreting the Linear Correlation Coefficient
If the absolute value of r exceeds the value in Table A-6, conclude that there is a significant linear correlation. Otherwise, there is not sufficient evidence to support the conclusion of significant linear correlation. Discussion should be held regarding what value r needs to be in order to have a significant linear correlation.

8 Formal Hypothesis Test
Two methods. Both methods let
H0: ρ = 0 (no significant linear correlation)
H1: ρ ≠ 0 (significant linear correlation)

9 Method 1: Test Statistic is t (follows format of earlier chapters)
Test statistic:  t = r / √[(1 − r²) / (n − 2)]
Critical values: use Table A-3 with degrees of freedom = n − 2.
This is the first example where the degrees of freedom for Table A-3 is different from n − 1. Special note should be made of this.
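A hedged sketch of Method 1 applied to the Garbage Project example, with SciPy standing in for Table A-3 (SciPy is an assumption, not part of the slides):

```python
import math
from scipy.stats import t as t_dist   # assumed; t.ppf replaces a Table A-3 lookup

r, n, alpha = 0.842, 8, 0.05
t_stat = r / math.sqrt((1 - r ** 2) / (n - 2))
t_crit = t_dist.ppf(1 - alpha / 2, df=n - 2)   # two-tailed critical value, df = n - 2
print(round(t_stat, 3), round(t_crit, 3))      # about 3.82 and 2.447 -> reject H0
```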

10 Method 2: Test Statistic is r
(uses fewer calculations)
Test statistic: r
Critical values: refer to Table A-6 (no degrees of freedom).
Much easier. This method is preferred by some instructors because the calculations are easier.

11 TABLE A-6 Critical Values of the Pearson Correlation Coefficient r
= .05 = .01 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 25 30 35 40 45 50 60 70 80 90 100 .950 .878 .811 .754 .707 .666 .632 .602 .576 .553 .532 .514 .497 .482 .468 .456 .444 .396 .361 .335 .312 .294 .279 .254 .236 .220 .207 .196 .999 .959 .917 .875 .834 .798 .765 .735 .708 .684 .661 .641 .623 .606 .590 .575 .561 .505 .463 .430 .402 .378 .361 .330 .305 .286 .269 .256

12 Is there a significant linear correlation?
x = plastic discarded (lb):  0.27   1.41   2.19   2.83   2.19   1.81   0.85   3.05
y = household size:          2      3      3      6      4      2      1      5
(Data from the Garbage Project)
n = 8,  α = 0.05
H0: ρ = 0    H1: ρ ≠ 0
Test statistic is r = 0.842.
Using Method 2 to solve this problem.

13 Is there a significant linear correlation?
n = 8,  α = 0.05
H0: ρ = 0    H1: ρ ≠ 0
Test statistic is r = 0.842.
Critical values are r = −0.707 and r = 0.707 (from Table A-6 with n = 8 and α = 0.05).
[Table A-6 shown with the n = 8, α = .05 entry of .707 highlighted.]

14 Is there a significant linear correlation?
0.842 > 0.707, that is, the test statistic does fall within the critical region.
[Number line: "Reject ρ = 0" below r = −0.707, "Fail to reject ρ = 0" between −0.707 and 0.707, "Reject ρ = 0" above 0.707; the sample value r = 0.842 falls in the right-hand rejection region.]

15 Is there a significant linear correlation?
0.842 > 0.707, that is, the test statistic does fall within the critical region. Therefore, we REJECT H0: ρ = 0 (no correlation) and conclude there is a significant linear correlation between the weights of discarded plastic and household size.
[Same number line as the previous slide, with r = 0.842 in the right-hand rejection region.]
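A minimal sketch of the Method 2 decision rule in Python, with a few α = .05 entries copied from Table A-6 (the dictionary is just an illustrative partial lookup):

```python
# A few alpha = .05 critical values from Table A-6, keyed by sample size n
table_a6_05 = {4: 0.950, 5: 0.878, 6: 0.811, 7: 0.754, 8: 0.707, 9: 0.666, 10: 0.632}

r, n = 0.842, 8
if abs(r) > table_a6_05[n]:
    print("Reject H0: rho = 0 -> significant linear correlation")
else:
    print("Fail to reject H0: rho = 0")
```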

16 Regression Definition
Regression Model:  y = b0 + b1x + e
Regression Equation:  ŷ = b0 + b1x
Given a collection of paired data, the regression equation algebraically describes the relationship between the two variables.

17 Notation for Regression Equation
                                        Population Parameter    Sample Statistic
y-intercept of regression equation      β0                      b0
Slope of regression equation            β1                      b1
Equation of the regression line         y = β0 + β1x + ε        ŷ = b0 + b1x

18 Regression Definitions
Regression Equation: given a collection of paired data, the regression equation ŷ = b0 + b1x algebraically describes the relationship between the two variables.
Regression Line: the regression line (line of best fit or least-squares line) is the graph of the regression equation.

19 Assumptions & Observations
1. We are investigating only linear relationships.
2. For each x value, y is a random variable having a normal distribution. (There are many methods for determining normality.)
3. The regression line passes through (x̄, ȳ).

20 Guidelines for Using the Regression Equation
1. If there is no significant linear correlation, don't use the regression equation to make predictions.
2. Stay within the scope of the available sample data when making predictions.
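A small Python sketch of guideline 2, using the fitted equation ŷ = 0.549 + 1.48x from the earlier slides; the function name and the explicit range check are illustrative, not part of the slides:

```python
# Predict household size from plastic weight with y-hat = 0.549 + 1.48x.
# The sample x values run from 0.27 lb to 3.05 lb, so stay inside that range.
def predict_household_size(plastic_lb):
    if not 0.27 <= plastic_lb <= 3.05:
        raise ValueError("outside the scope of the available sample data")
    return 0.549 + 1.48 * plastic_lb

print(round(predict_household_size(2.0), 2))   # about 3.51
```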

21 Definitions Outlier Influential Points
Outlier: a point lying far away from the other data points.
Influential Points: points which strongly affect the graph of the regression line.
The slope b1 in the regression equation represents the marginal change in y that occurs when x changes by one unit.

22 Residuals and the Least-Squares Property
Definitions
Residual (error): for a sample of paired (x, y) data, the difference (y − ŷ) between an observed sample y-value and ŷ, the value of y that is predicted by using the regression equation.
Least-Squares Property: a straight line satisfies this property if the sum of the squares of the residuals is the smallest sum possible.
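A short Python sketch illustrating the least-squares property with the Garbage Project data: the fitted line's sum of squared residuals is compared against a slightly different line (the alternative coefficients are arbitrary, chosen only for illustration):

```python
x = [0.27, 1.41, 2.19, 2.83, 2.19, 1.81, 0.85, 3.05]
y = [2, 3, 3, 6, 4, 2, 1, 5]

def sum_of_squared_residuals(b0, b1):
    # residual = observed y minus predicted y for the line y-hat = b0 + b1*x
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

print(round(sum_of_squared_residuals(0.549, 1.48), 3))   # about 5.663 (least-squares line)
print(round(sum_of_squared_residuals(0.5, 1.6), 3))      # about 5.986 (a larger sum)
```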

23 Residuals and the Least-Squares Property
[Graph: scatterplot with the line ŷ = 5 + 4x; the four plotted points have residuals of 7, 11, −13, and −5, shown as vertical distances between each point and the line.]

24 Definitions
Total Deviation (from the mean) of the particular point (x, y): the vertical distance y − ȳ, which is the distance between the point (x, y) and the horizontal line passing through the sample mean ȳ.
Explained Deviation: the vertical distance ŷ − ȳ, which is the distance between the predicted y-value and the horizontal line passing through the sample mean ȳ.
Unexplained Deviation: the vertical distance y − ŷ, which is the vertical distance between the point (x, y) and the regression line. (The distance y − ŷ is also called a residual, as defined in Section 9-3.)

25 Unexplained, Explained, and Total Deviation
[Graph: for the regression line ŷ = 5 + 4x with ȳ = 17, the point (5, 32) illustrates the three deviations: total deviation (y − ȳ) from (5, 17) up to (5, 32), explained deviation (ŷ − ȳ) from (5, 17) up to (5, 25), and unexplained deviation (y − ŷ) from (5, 25) up to (5, 32).]

26 Σ(y - y) 2 = Σ (y - y) 2 + Σ (y - y) 2
(total deviation) = (explained deviation) + (unexplained deviation) (y - y) = (y - y) (y - y) ^ ^ (total variation) = (explained variation) + (unexplained variation) Σ(y - y) 2 = Σ (y - y) Σ (y - y) 2 ^ ^
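A quick numerical check of this identity in Python, using the Garbage Project data and the rounded coefficients b0 = 0.549, b1 = 1.48 from the earlier slides (so the two sides agree only up to rounding):

```python
x = [0.27, 1.41, 2.19, 2.83, 2.19, 1.81, 0.85, 3.05]
y = [2, 3, 3, 6, 4, 2, 1, 5]

y_bar = sum(y) / len(y)
y_hat = [0.549 + 1.48 * xi for xi in x]          # predicted values from the fitted line

total       = sum((yi - y_bar) ** 2 for yi in y)
explained   = sum((yh - y_bar) ** 2 for yh in y_hat)
unexplained = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))

# total variation vs. explained + unexplained: 19.5 and 19.5 (equal up to rounding in b0, b1)
print(round(total, 2), round(explained + unexplained, 2))
```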

