Copyright © 2011 by The McGraw-Hill Companies, Inc. All rights reserved. McGraw-Hill/Irwin Simple Linear Regression Analysis Chapter 13.

13-2 Simple Linear Regression Analysis

13.1 The Simple Linear Regression Model and the Least Squares Point Estimates
13.2 Model Assumptions and the Standard Error
13.3 Testing the Significance of the Slope and y-Intercept
13.4 Confidence and Prediction Intervals
13.5 Simple Coefficients of Determination and Correlation

13-3 Simple Linear Regression Analysis Continued

13.6 Testing the Significance of the Population Correlation Coefficient (Optional)
13.7 An F Test for the Model
13.8 The QHIC Case
13.9 Residual Analysis (Optional)
13.10 Some Shortcut Formulas (Optional)

The Simple Linear Regression Model and the Least Squares Point Estimates

- The dependent (or response) variable is the variable we wish to understand or predict
- The independent (or predictor) variable is the variable we will use to understand or predict the dependent variable
- Regression analysis is a statistical technique that uses observed data to relate the dependent variable to one or more independent variables
- The objective is to build a regression model that can describe, predict, and control the dependent variable based on the independent variable

LO 1: Explain the simple linear regression model.

13-5 The Least Squares Point Estimates

- Estimation/prediction equation: ŷ = b₀ + b₁x
- Least squares point estimate of the slope β₁: b₁ = SSxy / SSxx, where SSxy = Σ(xᵢ − x̄)(yᵢ − ȳ) and SSxx = Σ(xᵢ − x̄)²
- Least squares point estimate of the y-intercept β₀: b₀ = ȳ − b₁x̄

LO 2: Find the least squares point estimates of the slope and y-intercept.
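As a minimal sketch of these estimates in code (the small data set below is made up purely for illustration and is not from the textbook):

```python
# Minimal sketch of the least squares point estimates (made-up data).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]   # independent (predictor) variable
ys = [2.1, 3.9, 6.2, 7.8, 10.1]  # dependent (response) variable
n = len(xs)

x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Deviation sums of cross products and squares
ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
ss_xx = sum((x - x_bar) ** 2 for x in xs)

b1 = ss_xy / ss_xx       # point estimate of the slope beta_1
b0 = y_bar - b1 * x_bar  # point estimate of the y-intercept beta_0

def y_hat(x):
    """Prediction equation: y-hat = b0 + b1 * x."""
    return b0 + b1 * x
```

Note that the fitted line always passes through the point (x̄, ȳ).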

Model Assumptions and the Standard Error

1. Mean of Zero: At any given value of x, the population of potential error term values has a mean equal to zero
2. Constant Variance Assumption: At any given value of x, the population of potential error term values has a variance that does not depend on the value of x
3. Normality Assumption: At any given value of x, the population of potential error term values has a normal distribution
4. Independence Assumption: Any one value of the error term ε is statistically independent of any other value of ε

LO 3: Describe the assumptions behind simple linear regression and calculate the standard error.

13-7 Sum of Squares

- Sum of squared errors: SSE = Σ(yᵢ − ŷᵢ)²
- Mean square error, the point estimate of the residual variance σ²: s² = SSE / (n − 2)
- Standard error, the point estimate of the residual standard deviation σ: s = √(SSE / (n − 2))

LO 3
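These quantities can be sketched in code using a small made-up data set (illustrative only, not from the textbook):

```python
# Minimal sketch of SSE, mean square error, and standard error (made-up data).
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

x_bar, y_bar = sum(xs) / n, sum(ys) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
     / sum((x - x_bar) ** 2 for x in xs)
b0 = y_bar - b1 * x_bar

# SSE: sum of squared differences between observed and predicted y
sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))

mse = sse / (n - 2)  # mean square error: point estimate of sigma^2
s = math.sqrt(mse)   # standard error: point estimate of sigma
```

The divisor n − 2 reflects the two parameters (slope and intercept) estimated from the data.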

Testing the Significance of the Slope and y-Intercept

- A regression model is not likely to be useful unless there is a significant relationship between x and y
- To test significance, we use the null hypothesis H₀: β₁ = 0 versus the alternative hypothesis Hₐ: β₁ ≠ 0

LO 4: Test the significance of the slope and y-intercept.

13-9 Testing the Significance of the Slope #2

Alternative       Reject H₀ if       p-Value
Hₐ: β₁ > 0        t > tα             Area under t distribution to the right of t
Hₐ: β₁ < 0        t < −tα            Area under t distribution to the left of t
Hₐ: β₁ ≠ 0        |t| > tα/2 *       Twice the area under t distribution to the right of |t|

* That is, t > tα/2 or t < −tα/2

LO 4
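The test statistic t = b₁ / s_b₁, with s_b₁ = s / √SSxx, can be sketched as follows (made-up data; the stated critical value 3.182 is t₀.₀₂₅ on 3 degrees of freedom):

```python
# Minimal sketch of the t test for H0: beta1 = 0 (made-up data).
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

x_bar, y_bar = sum(xs) / n, sum(ys) / n
ss_xx = sum((x - x_bar) ** 2 for x in xs)
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / ss_xx
b0 = y_bar - b1 * x_bar

sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
s = math.sqrt(sse / (n - 2))  # standard error
s_b1 = s / math.sqrt(ss_xx)   # standard error of the slope estimate b1

t_stat = b1 / s_b1  # compare |t| with t_{alpha/2} on n - 2 df

# With alpha = 0.05 and n - 2 = 3 df, the critical value is t_{0.025} = 3.182;
# here |t| is far larger, so H0: beta1 = 0 would be rejected.
```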

Confidence and Prediction Intervals

- The point on the regression line corresponding to a particular value x₀ of the independent variable x is ŷ = b₀ + b₁x₀
- It is unlikely that this value will equal the mean value of y when x equals x₀
- Therefore, we need to place bounds on how far the predicted value might be from the actual value
- We can do this by calculating a confidence interval for the mean value of y and a prediction interval for an individual value of y

LO 5: Calculate and interpret a confidence interval for a mean value and a prediction interval for an individual value.
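Both intervals can be sketched with the usual formulas ŷ ± t·s·√(distance value) for the mean and ŷ ± t·s·√(1 + distance value) for an individual value (made-up data; 3.182 is t₀.₀₂₅ on 3 degrees of freedom):

```python
# Minimal sketch: confidence interval for the mean value of y and
# prediction interval for an individual value of y at x0 (made-up data).
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

x_bar, y_bar = sum(xs) / n, sum(ys) / n
ss_xx = sum((x - x_bar) ** 2 for x in xs)
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / ss_xx
b0 = y_bar - b1 * x_bar
s = math.sqrt(sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2))

x0 = 3.5
y0_hat = b0 + b1 * x0
t_crit = 3.182  # t_{0.025} on n - 2 = 3 df, for 95% intervals

dist = 1.0 / n + (x0 - x_bar) ** 2 / ss_xx  # the "distance value"
half_ci = t_crit * s * math.sqrt(dist)      # half-width of CI for mean y
half_pi = t_crit * s * math.sqrt(1 + dist)  # half-width of PI for individual y

ci = (y0_hat - half_ci, y0_hat + half_ci)
pi = (y0_hat - half_pi, y0_hat + half_pi)
```

The prediction interval is always wider than the confidence interval at the same x₀, because it must account for the error term of an individual observation.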

Simple Coefficient of Determination and Correlation

- How useful is a particular regression model?
- One measure of usefulness is the simple coefficient of determination, represented by the symbol r²
- This section may be read any time after reading Section 13.1

LO 6: Calculate and interpret the simple coefficients of determination and correlation.
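As a sketch (made-up data), r² is the proportion of the total variation in y explained by the model, and r is the simple correlation coefficient:

```python
# Minimal sketch of r^2 and the simple correlation coefficient r (made-up data).
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

x_bar, y_bar = sum(xs) / n, sum(ys) / n
ss_xx = sum((x - x_bar) ** 2 for x in xs)
ss_yy = sum((y - y_bar) ** 2 for y in ys)  # total variation
ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))

b1 = ss_xy / ss_xx
b0 = y_bar - b1 * x_bar
sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))  # unexplained variation

r_squared = (ss_yy - sse) / ss_yy     # proportion of variation in y explained
r = ss_xy / math.sqrt(ss_xx * ss_yy)  # simple correlation coefficient
```

In simple linear regression, r² is exactly the square of r, and r takes the sign of the slope b₁.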

Testing the Significance of the Population Correlation Coefficient (Optional)

- The simple correlation coefficient r measures the linear relationship between the observed values of x and y from the sample
- The population correlation coefficient ρ measures the linear relationship between all possible combinations of observed values of x and y
- r is a point estimate of ρ

LO 7: Test hypotheses about the population correlation coefficient (optional).
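The standard test statistic for H₀: ρ = 0 is t = r√(n − 2) / √(1 − r²) on n − 2 degrees of freedom; for simple regression it is mathematically identical to the t statistic for the slope. A sketch with made-up data:

```python
# Minimal sketch of the t test for H0: rho = 0 (made-up data).
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

x_bar, y_bar = sum(xs) / n, sum(ys) / n
ss_xx = sum((x - x_bar) ** 2 for x in xs)
ss_yy = sum((y - y_bar) ** 2 for y in ys)
ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))

r = ss_xy / math.sqrt(ss_xx * ss_yy)  # sample estimate of rho

# Test statistic on n - 2 degrees of freedom
t_stat = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
```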

An F Test for the Model

- For simple regression, this is another way to test the null hypothesis H₀: β₁ = 0
- This is the only test we will use for multiple regression
- The F test tests the significance of the overall regression relationship between x and y

LO 8: Test the significance of a simple linear regression model by using an F test.
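The F statistic is the explained variation (1 degree of freedom in simple regression) divided by SSE/(n − 2); in simple regression it equals the square of the slope's t statistic. A sketch with made-up data:

```python
# Minimal sketch of the F test for the model (made-up data); in simple
# regression F equals the square of the t statistic for the slope.
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

x_bar, y_bar = sum(xs) / n, sum(ys) / n
ss_xx = sum((x - x_bar) ** 2 for x in xs)
ss_yy = sum((y - y_bar) ** 2 for y in ys)
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / ss_xx
b0 = y_bar - b1 * x_bar

sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
explained = ss_yy - sse  # explained variation

f_stat = explained / (sse / (n - 2))  # compare with F_alpha on (1, n - 2) df

# t statistic for the slope, for comparison: F = t^2 in simple regression
t_stat = b1 / (math.sqrt(sse / (n - 2)) / math.sqrt(ss_xx))
```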

Residual Analysis (Optional)

- Checks of the regression assumptions are performed by analyzing the regression residuals
- Residuals (e) are defined as the difference between the observed value of y and the predicted value of y: e = y − ŷ
- Note that e is the point estimate of ε
- If the regression assumptions are valid, the population of potential error terms will be normally distributed with mean zero and variance σ²
- Different error terms will be statistically independent

LO 9: Use residual analysis to check the assumptions of simple linear regression (optional).
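Computing the residuals is straightforward (made-up data); with an intercept in the model, least squares residuals always sum to zero, which is a quick sanity check before plotting them against x or ŷ:

```python
# Minimal sketch of residual computation (made-up data): e = y - y_hat
# is the point estimate of the error term epsilon.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

x_bar, y_bar = sum(xs) / n, sum(ys) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
     / sum((x - x_bar) ** 2 for x in xs)
b0 = y_bar - b1 * x_bar

residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]

# Plotting residuals against x (or y-hat) helps check the constant
# variance, normality, and independence assumptions.
```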

Some Shortcut Formulas (Optional)

SSE = SSyy − b₁SSxy, where

SSxy = Σxᵢyᵢ − (Σxᵢ)(Σyᵢ)/n
SSxx = Σxᵢ² − (Σxᵢ)²/n
SSyy = Σyᵢ² − (Σyᵢ)²/n
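The shortcut forms can be checked against the deviation-form sums of squares in a few lines (made-up data):

```python
# Minimal sketch checking that the shortcut formulas agree with the
# deviation-form sums of squares (made-up data).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

x_bar, y_bar = sum(xs) / n, sum(ys) / n

# Deviation forms
ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
ss_xx = sum((x - x_bar) ** 2 for x in xs)
ss_yy = sum((y - y_bar) ** 2 for y in ys)

# Shortcut forms: the same quantities from raw sums, no means required
ss_xy_short = sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys) / n
ss_xx_short = sum(x * x for x in xs) - sum(xs) ** 2 / n
ss_yy_short = sum(y * y for y in ys) - sum(ys) ** 2 / n
```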