Presentation on theme: "Inference for Regression"— Presentation transcript:

1 Inference for Regression
Course: AP Statistics. Chapter: 27. Book: Stats: Modeling the World, 2nd edition. Authors: BVD (Bock, Velleman, De Veaux).

2 Categorical Variables: Use Chi-Squared Procedures
Inference for: Categorical Variables: use Chi-Squared procedures. Quantitative Variables: use Linear Regression.

3 Regression reminders Regression Line: ŷ = b0 + b1·x, where ŷ is the predicted value of y.

4 Regression reminders Regression Line: x is the explanatory variable
y is the response variable

5 Regression reminders Regression Line: residual = actual value of y − predicted value of y, that is, e = y − ŷ.

6 So…what’s new?? Regression Line:
Now, in Chapter 27, the regression line we find represents a SAMPLE of some given data. It’s the best-fit line for that sample, so the slope and y-intercept we have found are the statistics for that line. BIG QUESTION: What are the slope and y-intercept of the POPULATION regression line?

7 Chapter 27 Regression Line:
We are going to use our SAMPLE statistics to estimate the corresponding POPULATION values (or, at least, we’ll get as close as we can). What are these called??

8 Population Parameters
Regression Line: b1 = sample slope, β1 = population slope; b0 = sample y-intercept, β0 = population y-intercept. The population values are called parameters.

9 Population Regression Line
Sample Regression Line: ŷ = b0 + b1·x. Population Regression Line: μy = β0 + β1·x.

10 Population Regression Line
So… we have two parameters we need to estimate. What do we do? First find the slope, then find the y-intercept. What model will we use? Student’s t-curve, with degrees of freedom df = n − 2 (because we estimate two parameters, the slope and the y-intercept, from the data).

11 Confidence Interval (for slope)
The interval is b1 ± t*(n−2) × SE(b1). How do we find the Standard Error??

12 Confidence Interval (for slope)
How do we find the Standard Error?? We don’t! We’ll let our calculator (or a computer printout) give it to us.

13 Confidence Interval (for slope)
Really? We don’t care about the Standard Error for the slope? Well… actually, we care a little. It depends on three things: 1) the spread of the residuals (more about this later!), 2) the spread of the x-values, 3) the sample size (n).
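As a reference, here is a sketch of the standard formula those three quantities plug into. The formula itself isn't shown in this transcript, so take it as the usual simple-regression result rather than something from the slides:

```latex
% Standard error of the sample slope b1 in simple linear regression:
%   s_e = standard deviation of the residuals, s_x = standard deviation of the x-values, n = sample size
SE(b_1) = \frac{s_e}{s_x\sqrt{n-1}},
\qquad
s_e = \sqrt{\frac{\sum e^2}{n-2}}
```

More spread in the residuals makes SE(b1) larger; more spread in the x-values or a bigger sample makes it smaller.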

14 Confidence Interval (for slope)
Let’s find the Standard Error. Ready to try?? Here is a sample computer printout.

15 Confidence Interval (for slope)
Help! That’s too confusing. What do I need?

16 Confidence Interval (for slope)
The Constant you see is the value of b0 (the y-intercept).

17 Confidence Interval (for slope)
Age is the name of x (the explanatory variable), and the value next to it is the slope, b1.

19 Confidence Interval (for slope)
Income is the name of y (the response variable)

20 Confidence Interval (for slope)
The degrees of freedom are given: df = 25.

21 Confidence Interval (for slope)
And so is the Standard Error for the slope: SE(b1) = 337.7.
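The printout image itself isn't part of this transcript, so here is a hypothetical sketch of the kind of output the slides are pointing at. Only the SE of 337.7, the df of 25, and the P-value of .4763 quoted later come from the slides; every other entry is a placeholder:

```
Dependent variable is: Income          df = 25

Variable     Coefficient   SE(Coeff)   t-ratio   P-value
Constant     (b0 value)       …           …         …
Age          (b1 value)     337.7         …       0.4763
```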

22 Confidence Interval (for slope)
The equation of the regression line would be: predicted Income = (Constant value) + (Age coefficient) × Age, using the two numbers from the Coefficient column of the printout.

23 Confidence Interval (for slope)
Wait a second… that’s Chapter 8. We’re in Chapter 27! We want to find the Confidence Interval for the slope!

24 Confidence Interval (for slope)
b1 ± t*(n−2) × SE(b1), with df = n − 2 = 25 here.

25 Confidence Interval (for slope)
I am 95% confident that the true slope of the regression line is between … and …. But… that’s not in context! We need to state this in context…

26 Confidence Interval (for slope)
I am 95% confident that the true change in income for each 1-year increase in age is between $… lost and $… gained.
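A minimal sketch of the arithmetic behind that interval, assuming a made-up slope value b1 (the actual coefficient isn't in this transcript) and the SE = 337.7 and df = 25 read off the printout:

```python
from scipy import stats

b1 = 250.0     # hypothetical sample slope; the real coefficient is not shown in the transcript
se_b1 = 337.7  # standard error of the slope, from the printout
df = 25        # degrees of freedom, n - 2

t_star = stats.t.ppf(0.975, df)   # critical t* for 95% confidence
lower = b1 - t_star * se_b1
upper = b1 + t_star * se_b1
print(f"95% CI for the true slope: ({lower:.1f}, {upper:.1f})")
```

On newer TI-84 operating systems the same interval is available as LinRegTInt in the STAT TESTS menu.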

27 Hypothesis Testing for Slope
That’s great, but what about Hypothesis Testing? What would the Null Hypothesis be?

28 Hypothesis Testing for Slope
Remember, Null means Nothing. Or…no change. Therefore, for each increase in x there must be no change in y. That means the slope must be zero. Or….

29 Hypothesis Testing for Slope
Hypotheses (2-tailed): H0: β1 = 0 versus HA: β1 ≠ 0

30 Hypothesis Testing for Slope
What about the t-score? The P-Value? Easy! Everything is based on the Student’s t-curve, so the mechanics are the same…

31 Hypothesis Testing for Slope
For our line, it would be: t = (b1 − 0) / SE(b1), with df = n − 2 = 25.

32 Hypothesis Testing for Slope
Hey…is that value in the computer printout??

33 Hypothesis Testing for Slope
The P-Value works the same way, and it is in the printout too:

34 Hypothesis Testing for Slope
With a P-value of .4763, which is large compared to an alpha level of .05, I fail to reject the null hypothesis and conclude that there is no convincing evidence that the true slope is different from zero.
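A sketch of where that P-value comes from, again with a made-up slope (only the SE of 337.7 and df = 25 come from the printout):

```python
from scipy import stats

b1 = 250.0     # hypothetical sample slope; not given in the transcript
se_b1 = 337.7  # standard error of the slope, from the printout
df = 25        # degrees of freedom, n - 2

t = (b1 - 0) / se_b1                  # t-statistic for H0: beta1 = 0
p_value = 2 * stats.t.sf(abs(t), df)  # two-tailed P-value
print(f"t = {t:.2f}, P-value = {p_value:.4f}")
```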

36 Conditions & Assumptions
What about the conditions and assumptions??? We skipped them…. And …… THAT’S BAD!

37 Conditions & Assumptions
There are 4 of them to satisfy. 1) Linearity Assumption: the scatterplot of the data should be “roughly linear”. We show this two ways, and we have done both before: a) graph the scatterplot and look at it (does it look straight?); b) graph the residuals against the x-variable (it should be a random scatter; see the sketch after this slide). If this condition fails, then straighten the data (see Ch. 9).
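A quick sketch of that second check (residuals against x), using made-up waist-size and body-fat numbers since the actual data aren't included in this transcript:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data; substitute the real waist sizes (x) and body fat percentages (y).
x = np.array([32, 36, 38, 33, 41, 35, 40, 34, 39, 37], dtype=float)
y = np.array([14, 20, 25, 16, 30, 18, 28, 15, 27, 22], dtype=float)

b1, b0 = np.polyfit(x, y, 1)      # least-squares slope and intercept
residuals = y - (b0 + b1 * x)     # residual = actual y - predicted y

plt.scatter(x, residuals)
plt.axhline(0, linestyle="--")
plt.xlabel("waist size (x)")
plt.ylabel("residual")
plt.title("Residuals vs. x: look for a random scatter")
plt.show()
```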

38 Conditions & Assumptions
Here is a scatterplot comparing waist size to body fat percentage.

39 Conditions & Assumptions
Here is a scatterplot of the residuals plotted against the x-value (waist size).

40 Conditions & Assumptions
2) Independence Assumption. The next three are a little tricky, but only because we need to understand what is happening with inference on regression lines. Here’s the situation: when you have a sample of data and you find the sample regression line for it, you are fitting the line that best passes through the y-values plotted at each x-value. Here is an example:

41 Conditions & Assumptions
Here is the scatterplot again (comparing waist size to body fat percentage).

42 Conditions & Assumptions
At each x-value there are multiple y-values that spread out around the line.

43 Conditions & Assumptions
For the true regression line, the y-values at each x-value should each be nearly normal. In fact, notice that the true regression line passes through the mean of each set of y-values…..

44 Conditions & Assumptions
That means the true regression line can be thought of as the model y = β0 + β1·x + ε, and the residuals would be: ε = y − (β0 + β1·x). Notice that this is the same idea as e = y − ŷ, just written for the population.

45 Conditions & Assumptions
So..the residuals are really what we care about here, not the y-values…. If the residuals for a regression model make sense then so will the y-values.

46 Conditions & Assumptions
2) Independence Assumption Okay, back to #2. We now know the residuals (errors) are what we care about here. For #2 we want these to be independent for a given sample. If the sample was collected randomly, we are fine. Just state that the data can be assumed to be independent because the sample was random. You have no reason to believe that any y-value (or residual) has any impact on another one. Easy!

47 Conditions & Assumptions
2) Independence Assumption Wait…didn’t you say this was hard? Well, it can be. If you are graphing a time plot (x represents time) the y-values might not be independent. Now you need to check the residuals. So…we graph them against the x-values (you already did this!) and see what we get. It should be a random scatter. Any pattern will show there is some sort of relationship which indicates a lack of independence. Moving on….

48 Conditions & Assumptions
3) Equal Variance Assumption Okay…this one is a little tricky. But, that’s only because you don’t know WHY we are checking for it. Let’s stop and figure that out first. The best thing to do is to go once more to that image of normal models along the line….

49 Conditions & Assumptions
3) Equal Variance Assumption What we want is for the spread of each set of y-values to be roughly the same. Remember, we care about residuals, so what this means is that we want the Standard Deviation of the residuals to be uniform. That means the spread of the residuals should be the same throughout.

50 Conditions & Assumptions
3) Equal Variance Assumption That means we want the spread of each set of y-values to be roughly the same. Remember, we care about residuals, so what this means is that we want the Standard Deviation of the residuals to be uniform. Huh? Well, it means the plot should not fan out or clump together. The spread about the line should be the same (constant) throughout. This is called the “DOES THE PLOT THICKEN?” condition. How do we check for this? Residuals again. If the plot does fan out, it will show up in the residual plot against y. Here it is:

51 Conditions & Assumptions
3) Equal Variance Assumption Residual plot against the y-values (body fat percentage): randomly scattered residuals!

52 Conditions & Assumptions
4) Normal Population Assumption We’ve already looked at the residuals and seen that we want them to be nearly normal at each x-value. This is important so we can use the Student t-curve in the mechanics section. How do we check this? Group all the residuals together (they are sitting in the Resid List, waiting for you!) and graph them to see if they are nearly normal. Graph them? To check for nearly normal? HOW????

53 Conditions & Assumptions
4) Normal Population Assumption HOW? You know how! We’ve done this before!! Graph them as a histogram and check for unimodal and symmetric. What about the normal probability plot? Should we do that as well?

54 Conditions & Assumptions
4) Normal Population Assumption HOW? You know how! We’ve done this before!! Graph them as a histogram and check for unimodal and symmetric. What about the normal probability plot? Should we do that as well? Not this time! Phew!!
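A sketch of that histogram check, using made-up residuals (in practice they would come from the calculator’s RESID list, or be computed as y − ŷ as in the earlier residual-plot sketch):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical residuals for illustration only.
residuals = np.array([-2.1, 0.8, 1.5, -0.4, 2.3, -1.2, 0.1, -0.9, 1.8, -1.6])

plt.hist(residuals, bins=6)
plt.xlabel("residual")
plt.title("Histogram of residuals: unimodal and roughly symmetric?")
plt.show()
```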

55 Practice Problem! Let’s try one that is done on the calculator instead of from a computer printout. The best part of this is that the calculations for inference on the slope are never done by hand. Everything is either given by the calculator or by the computer, or is really easy. And… one more thing: we rarely do inference for the y-intercept. It usually isn’t a value we care about!

