Inference for regression - More details about simple linear regression IPS chapter 10.2 © 2006 W.H. Freeman and Company.


Objectives (IPS chapter 10.2): Inference for regression, more details
- Analysis of variance for regression
- Calculations for regression inference
- Inference for correlation

Analysis of variance for regression

The regression model is:  Data = fit + residual

  y_i = (β0 + β1 x_i) + (ε_i)

where the ε_i are independent and normally distributed N(0, σ), and σ is the same for all values of x. This resembles an ANOVA, which also assumes equal variance, and where SST = SS(model) + SS(error) and DFT = DF(model) + DF(error).
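As a minimal sketch, the "data = fit + residual" model can be simulated directly; the parameter values below are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population parameters, for illustration only
beta0, beta1, sigma = 5.0, 2.0, 1.5

x = np.linspace(0, 10, 100)
eps = rng.normal(0.0, sigma, size=x.size)  # independent N(0, sigma) errors, same sigma for every x
y = (beta0 + beta1 * x) + eps              # data = fit + residual
```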

For a simple linear relationship, the ANOVA tests the hypotheses H0: β1 = 0 versus Ha: β1 ≠ 0 by comparing MSM (model mean square) to MSE (error mean square): F = MSM/MSE. When H0 is true, F follows the F(1, n − 2) distribution, and the p-value is the tail area P(F(1, n − 2) > F). The ANOVA F test and the two-sided t test for H0: β1 = 0 yield the same p-value. Software output for regression may provide t, F, or both, along with the p-value.
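The equivalence of the ANOVA F test and the two-sided t test can be checked numerically; the data below are hypothetical, made up only for illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical data, for illustration only
x = np.array([1., 2., 3., 4., 5., 6., 7., 8.])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.8, 8.2, 8.9])

res = stats.linregress(x, y)                 # res.pvalue is the two-sided t test p-value
n = len(x)
yhat = res.intercept + res.slope * x

msm = np.sum((yhat - y.mean()) ** 2) / 1     # model mean square, DFM = 1
mse = np.sum((y - yhat) ** 2) / (n - 2)      # error mean square, DFE = n - 2
F = msm / mse
p_F = stats.f.sf(F, 1, n - 2)                # tail area above F under F(1, n - 2)

t = res.slope / res.stderr                   # t statistic for the slope
assert np.isclose(F, t ** 2)                 # F equals t squared
assert np.isclose(p_F, res.pvalue)           # both tests give the same p-value
```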

ANOVA table

Source | Sum of squares SS | DF    | Mean square MS | F       | P-value
Model  | SSM               | 1     | MSM = SSM/DFM  | MSM/MSE | tail area above F
Error  | SSE               | n − 2 | MSE = SSE/DFE  |         |
Total  | SST               | n − 1 |                |         |

SST = SSM + SSE   DFT = DFM + DFE

The regression standard error s for n sample data points is calculated from the residuals e_i = y_i − ŷ_i:

  s = √(Σ e_i² / (n − 2)) = √(SSE/DFE) = √MSE

s estimates the regression standard deviation σ.
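A short sketch of the ANOVA decomposition and of s, again on made-up data used only for illustration:

```python
import numpy as np

# Hypothetical data, for illustration only
x = np.array([1., 2., 3., 4., 5., 6.])
y = np.array([1.2, 1.9, 3.1, 3.9, 5.2, 5.8])
n = len(x)

# Least-squares fit
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
resid = y - (b0 + b1 * x)                    # e_i = y_i - yhat_i

sse = np.sum(resid ** 2)                     # error sum of squares
ssm = np.sum((b0 + b1 * x - y.mean()) ** 2)  # model sum of squares
sst = np.sum((y - y.mean()) ** 2)            # total sum of squares
s = np.sqrt(sse / (n - 2))                   # estimates the regression standard deviation sigma

assert np.isclose(sst, ssm + sse)            # SST = SSM + SSE
```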

Coefficient of determination, r²

The coefficient of determination r², the square of the correlation coefficient, is the fraction of the variation in y (the vertical scatter from the regression line) that can be explained by changes in x:

  r² = (variation in y explained by x, i.e., by the regression line) / (total variation in observed y values around their mean) = SSM / SST
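Both descriptions of r² — the square of the correlation and the ratio SSM/SST — can be verified on hypothetical data:

```python
import numpy as np

# Hypothetical data, for illustration only
x = np.array([1., 2., 3., 4., 5., 6.])
y = np.array([1.2, 1.9, 3.1, 3.9, 5.2, 5.8])

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
yhat = b0 + b1 * x

ssm = np.sum((yhat - y.mean()) ** 2)         # model sum of squares
sst = np.sum((y - y.mean()) ** 2)            # total sum of squares
r = np.corrcoef(x, y)[0, 1]                  # Pearson correlation coefficient

assert np.isclose(ssm / sst, r ** 2)         # r^2 = SSM / SST
```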

What is the relationship between the average speed a car is driven and its fuel efficiency? We plot fuel efficiency (in miles per gallon, MPG) against average speed (in miles per hour, MPH) for a random sample of 60 cars. The relationship is curved. When speed is log-transformed (log of miles per hour, LOGMPH), the new scatterplot shows a positive, linear relationship.
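A sketch of this kind of transformation, with made-up data in which MPG actually rises with the log of speed (purely illustrative, not the textbook's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Made-up curved data: mpg depends on log(speed), for illustration only
speed = rng.uniform(10, 70, 60)                    # average speed, MPH
mpg = 5 + 8 * np.log(speed) + rng.normal(0, 0.5, 60)

raw = stats.linregress(speed, mpg)                 # fit against MPH
logged = stats.linregress(np.log(speed), mpg)      # fit against LOGMPH

# Log-transforming speed straightens the relationship, so the linear fit improves
assert logged.rvalue ** 2 > raw.rvalue ** 2
```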

Calculations for regression inference

To do inference about the regression parameters, we need the standard errors of the estimated regression coefficients. The standard error of the least-squares slope b1 is:

  SE(b1) = s / √(Σ (x_i − x̄)²)

The standard error of the intercept b0 is:

  SE(b0) = s √(1/n + x̄² / Σ (x_i − x̄)²)
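These standard errors can be computed directly and checked against scipy's regression output (hypothetical data; `intercept_stderr` requires scipy ≥ 1.7):

```python
import numpy as np
from scipy import stats

# Hypothetical data, for illustration only
x = np.array([1., 2., 3., 4., 5., 6., 7., 8.])
y = np.array([3.1, 4.2, 4.8, 6.3, 6.9, 8.2, 8.8, 10.1])
n = len(x)

res = stats.linregress(x, y)
resid = y - (res.intercept + res.slope * x)
s = np.sqrt(np.sum(resid ** 2) / (n - 2))         # regression standard error
sxx = np.sum((x - x.mean()) ** 2)

se_b1 = s / np.sqrt(sxx)                          # SE of the slope b1
se_b0 = s * np.sqrt(1 / n + x.mean() ** 2 / sxx)  # SE of the intercept b0

assert np.isclose(se_b1, res.stderr)
assert np.isclose(se_b0, res.intercept_stderr)
```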

To estimate or predict future responses, we calculate the following standard errors. The standard error of the estimated mean response μ̂_y at a given value x* is:

  SE(μ̂) = s √(1/n + (x* − x̄)² / Σ (x_i − x̄)²)

The standard error for predicting an individual response ŷ at x* is:

  SE(ŷ) = s √(1 + 1/n + (x* − x̄)² / Σ (x_i − x̄)²)
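A sketch comparing the two standard errors at a new value x* (hypothetical data; the prediction SE is always the larger of the two):

```python
import numpy as np

# Hypothetical data, for illustration only
x = np.array([1., 2., 3., 4., 5., 6., 7., 8.])
y = np.array([3.1, 4.2, 4.8, 6.3, 6.9, 8.2, 8.8, 10.1])
n = len(x)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
s = np.sqrt(np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2))
sxx = np.sum((x - x.mean()) ** 2)

xstar = 5.5                                       # hypothetical new x value
se_mean = s * np.sqrt(1 / n + (xstar - x.mean()) ** 2 / sxx)      # SE of mean response
se_pred = s * np.sqrt(1 + 1 / n + (xstar - x.mean()) ** 2 / sxx)  # SE of individual prediction

assert se_pred > se_mean                          # predicting one response is less precise
assert np.isclose(se_pred ** 2, s ** 2 + se_mean ** 2)
```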

1918 flu epidemic

The line graph suggests that about 7% to 8% of those diagnosed with the flu died within about a week of diagnosis. We look at the relationship between the number of deaths in a given week and the number of new diagnosed cases one week earlier.

MINITAB — Regression Analysis: FluDeaths1 versus FluCases0
(1918 flu epidemic: relationship between the number of deaths in a given week and the number of new diagnosed cases one week earlier)

The regression equation is
  FluDeaths1 = FluCases0

Predictor   Coef   SE Coef   T   P
Constant
FluCases

S =    R-Sq = 83.0%   R-Sq(adj) = 81.8%

Analysis of Variance
Source           DF   SS   MS   F   P
Regression
Residual Error
Total

The P columns give the p-value for H0: β1 = 0 versus Ha: β1 ≠ 0. Here r² = SSM/SST = 0.830, so r = √0.830 ≈ 0.91.

Inference for correlation

To test the null hypothesis of no linear association, we can also use the correlation parameter ρ.
- When x is clearly the explanatory variable, this test is equivalent to testing the hypothesis H0: β1 = 0.
- When there is no clear explanatory variable (e.g., arm length vs. leg length), a regression of x on y is no more legitimate than a regression of y on x. In that case, the correlation test of significance should be used. Technically, the test is then a test of independence, much like the tests we saw in an earlier chapter on contingency tables.

The test of significance for ρ uses the one-sample t test of H0: ρ = 0. For sample size n and sample correlation coefficient r, the t statistic is

  t = r √(n − 2) / √(1 − r²)

with n − 2 degrees of freedom. This calculation turns out to be identical to the t statistic based on the estimated slope, t = b1 / SE(b1).
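The identity between the correlation t statistic and the slope t statistic can be checked numerically (hypothetical data, for illustration only):

```python
import numpy as np
from scipy import stats

# Hypothetical data, for illustration only
x = np.array([1., 2., 3., 4., 5., 6., 7., 8.])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.8, 8.2, 8.9])
n = len(x)

res = stats.linregress(x, y)
r = res.rvalue                                  # sample correlation coefficient

t_r = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)  # t statistic based on r
t_slope = res.slope / res.stderr                # t statistic based on the slope

assert np.isclose(t_r, t_slope)                 # the two t statistics coincide
p = 2 * stats.t.sf(abs(t_r), n - 2)             # two-sided p-value, df = n - 2
assert np.isclose(p, res.pvalue)
```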