Linear Regression Analysis. Additional Reference: Applied Linear Regression Models – Neter, Kutner, Nachtsheim, Wasserman. The lecture notes of Dr. Thomas Wherly and Dr. Fred Dahm aided in the preparation of the chapter 12 material.

Linear Regression Analysis (ch. 12): Observe a response Y and one or more predictors x. Formulate a model that relates the mean response E(Y) to x. Y – dependent variable; x – independent variable.

Deterministic Model: Y = f(x); once we know the value of x, the value of Y is completely determined. Simplest (straight-line) model: Y = β₀ + β₁x, where β₁ is the slope of the line and β₀ is the Y-intercept of the line.

Probabilistic Model: Y = f(x) + ε; the value of Y is a random variable. Model for simple linear regression: Yᵢ = β₀ + β₁xᵢ + εᵢ, i = 1, …, n. Y₁, …, Yₙ – observed values of the response; x₁, …, xₙ – observed values of the predictor; β₀, β₁ – unknown parameters to be estimated from the data; ε₁, …, εₙ – unknown random error terms, usually iid N(0, σ²) random variables.

Interpretation of Model: For each value of x, the observed Y will fall above or below the line Y = β₀ + β₁x according to the error term ε. For each fixed x, Y ~ N(β₀ + β₁x, σ²).
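As a quick illustration of this interpretation, the sketch below simulates the model; the parameter values β₀ = 3, β₁ = 2, σ = 1 and the predictor grid are invented for illustration. For each fixed xᵢ, the simulated Yᵢ is drawn from N(β₀ + β₁xᵢ, σ²).

```python
import random

# Hypothetical parameter values, chosen only for illustration.
beta0, beta1, sigma = 3.0, 2.0, 1.0

random.seed(42)  # make the simulation reproducible
x = [i / 2 for i in range(1, 21)]  # fixed predictor values 0.5, 1.0, ..., 10.0

# For each fixed x_i, draw Y_i ~ N(beta0 + beta1*x_i, sigma^2):
y = [beta0 + beta1 * xi + random.gauss(0.0, sigma) for xi in x]
```

Each simulated point falls above or below the true line 3 + 2x by its own error term, which is exactly the behavior the model describes.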

Questions: 1. How do we estimate β₀, β₁, and σ²? 2. Does the proposed model fit the data well? 3. Are the assumptions satisfied?

Plotting the Data A scatter plot of the data is a useful first step for checking whether a linear relationship is plausible.

Example (12.4) A study to assess the capability of subsurface flow wetland systems to remove biochemical oxygen demand and other various chemical constituents resulted in the following scatter plot of the data where x = BOD mass loading and y = BOD mass removal. Does the plot suggest a linear relationship?

Example (12.5) An experiment conducted to investigate the stretchability of mozzarella cheese with temperature resulted in the following scatter plot where x = temperature and y = % elongation at failure. Does the scatter plot suggest a linear relationship?

Estimating  o and  1 Consider an arbitrary line y = b 0 + b 1 x drawn through a scatter plot. We want the line to be as close to the points in the scatter plot as possible. The vertical distance from (x,y) to the corresponding point on the line (x,b 0 + b 1 x) is y-(b 0 + b 1 x).

Possible Estimation Criteria: Eyeball method. L₁ estimation - choose b₀, b₁ to minimize Σ|yᵢ − b₀ − b₁xᵢ|. Least squares estimation - choose b₀, b₁ to minimize Σ(yᵢ − b₀ − b₁xᵢ)². *We use least squares estimation in practice since the other options are difficult to manipulate mathematically.*

Least Squares Estimation: Take derivatives of Σ(yᵢ − b₀ − b₁xᵢ)² with respect to b₀ and b₁, and set them equal to zero. This results in the "normal equations" (based on right angles – not the normal distribution): Σyᵢ = nb₀ + b₁Σxᵢ and Σxᵢyᵢ = b₀Σxᵢ + b₁Σxᵢ².

Formulas for Least Squares Estimates: Solving the normal equations for b₀ and b₁ results in the L.S. estimates b₁ = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)² and b₀ = ȳ − b₁x̄.
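A minimal plain-Python sketch of these formulas follows; the toy data set, which lies exactly on the line y = 3 + 2x, is invented for illustration.

```python
def least_squares(x, y):
    """Least squares estimates (b0, b1) for the line y = b0 + b1*x."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sxy / sxx           # slope: sum of cross-deviations over sum of squared x-deviations
    b0 = ybar - b1 * xbar    # intercept: line passes through (xbar, ybar)
    return b0, b1

# Toy data lying exactly on the line y = 3 + 2x:
b0, b1 = least_squares([1, 2, 3, 4], [5, 7, 9, 11])
print(b0, b1)  # 3.0 2.0
```

Because the least squares line always passes through the point (x̄, ȳ), the intercept falls out immediately once the slope is known.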

Example (12.12) Refer to the previous example (12.4). Obtain the expression for the Least Squares line

Estimating  2 Residual = Observed – Predicted Recall the definition of sample variance

Estimating  2 Cont’d The minimum value of the squared deviation is D =  (y i -  o x -  1 x i ) 2 =  (y i - ) 2 = SSE Divide the SSE by it’s degrees of freedom (n-2) to estimate  2

Example (12.12) Cont’d Predict the value of BOD mass removal when BOD loading is 35. Calculate the residual. Calculate the SSE and a point estimate of  2