11 Simple Linear Regression and Correlation


11-1 Empirical Models Many problems in engineering and science involve exploring the relationships between two or more variables. Regression analysis is a statistical technique that is very useful for these types of problems. For example, in a chemical process, suppose that the yield of the product is related to the process-operating temperature. Regression analysis can be used to build a model to predict yield at a given temperature level.

11-1 Empirical Models

Figure 11-1 Scatter Diagram of oxygen purity versus hydrocarbon level from Table 11-1.

11-1 Empirical Models Based on the scatter diagram, it is probably reasonable to assume that the mean of the random variable Y is related to x by the following straight-line relationship: E(Y|x) = μ_{Y|x} = β₀ + β₁x, where the slope β₁ and intercept β₀ of the line are called regression coefficients. The simple linear regression model is given by Y = β₀ + β₁x + ε, where ε is the random error term.

11-1 Empirical Models We think of the regression model as an empirical model. Suppose that the mean and variance of ε are 0 and σ², respectively. Then the mean of Y given x is E(Y|x) = β₀ + β₁x, and the variance of Y given x is V(Y|x) = σ².

11-1 Empirical Models The true regression model is a line of mean values, μ_{Y|x} = β₀ + β₁x, where the slope β₁ can be interpreted as the change in the mean of Y for a unit change in x. Also, the variability of Y at a particular value of x is determined by the error variance σ². This implies there is a distribution of Y values at each x and that the variance of this distribution is the same at each x.

11-1 Empirical Models Figure 11-2 The distribution of Y for a given value of x for the oxygen purity-hydrocarbon data.

11-2 Simple Linear Regression The case of simple linear regression considers a single regressor or predictor x and a dependent or response variable Y. The expected value of Y at each level of x is E(Y|x) = β₀ + β₁x. We assume that each observation, Y, can be described by the model Y = β₀ + β₁x + ε, where ε is a random error with mean zero and variance σ².
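
To make the model concrete, here is a minimal Python sketch (not part of the original slides) that simulates observations from Y = β₀ + β₁x + ε. The parameter values β₀ = 3, β₁ = 2, σ = 0.5 and the x levels are assumptions chosen purely for illustration; the same simulated data are reused in the sketches that follow.

import numpy as np

# Simulate data from the simple linear regression model Y = beta0 + beta1*x + eps.
# Parameter values below are assumed for illustration only.
rng = np.random.default_rng(1)
beta0_true, beta1_true, sigma = 3.0, 2.0, 0.5
n = 20
x = np.linspace(0.9, 1.6, n)               # regressor levels
eps = rng.normal(0.0, sigma, size=n)       # random errors: mean 0, variance sigma^2
y = beta0_true + beta1_true * x + eps      # observed responses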

11-2 Simple Linear Regression Suppose that we have n pairs of observations (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ). Figure 11-3 Deviations of the data from the estimated regression model.

11-2 Simple Linear Regression The method of least squares is used to estimate the parameters β₀ and β₁ by minimizing the sum of the squares of the vertical deviations shown in Figure 11-3.

11-2 Simple Linear Regression Using Equation 11-2, the n observations in the sample can be expressed as yᵢ = β₀ + β₁xᵢ + εᵢ, i = 1, 2, …, n. The sum of the squares of the deviations of the observations from the true regression line is L = Σᵢ εᵢ² = Σᵢ (yᵢ − β₀ − β₁xᵢ)².

11-2 Simple Linear Regression Minimizing L with respect to β₀ and β₁ and setting the derivatives equal to zero yields the least squares normal equations: nβ̂₀ + β̂₁Σᵢxᵢ = Σᵢyᵢ and β̂₀Σᵢxᵢ + β̂₁Σᵢxᵢ² = Σᵢxᵢyᵢ. The solution of these equations gives the least squares estimators β̂₀ and β̂₁.

Definition The least squares estimates of the intercept and slope in the simple linear regression model are β̂₀ = ȳ − β̂₁x̄ and β̂₁ = Σᵢ(xᵢ − x̄)(yᵢ − ȳ) / Σᵢ(xᵢ − x̄)², where ȳ = (1/n)Σᵢyᵢ and x̄ = (1/n)Σᵢxᵢ. The fitted or estimated regression line is therefore ŷ = β̂₀ + β̂₁x.

11-2 Simple Linear Regression

Notation S_xx = Σᵢ(xᵢ − x̄)² = Σᵢxᵢ² − (Σᵢxᵢ)²/n and S_xy = Σᵢ(xᵢ − x̄)(yᵢ − ȳ) = Σᵢxᵢyᵢ − (Σᵢxᵢ)(Σᵢyᵢ)/n, so the slope estimate can be written compactly as β̂₁ = S_xy/S_xx.
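
Continuing the simulated-data sketch above, the least squares formulas can be computed directly using the S_xx, S_xy notation just introduced.

# Least squares estimates of beta0 and beta1 (reusing x, y, n from the sketch above)
xbar, ybar = x.mean(), y.mean()
Sxx = np.sum((x - xbar) ** 2)              # S_xx = sum of (x_i - xbar)^2
Sxy = np.sum((x - xbar) * (y - ybar))      # S_xy = sum of (x_i - xbar)(y_i - ybar)
beta1_hat = Sxy / Sxx                      # slope estimate
beta0_hat = ybar - beta1_hat * xbar        # intercept estimate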

11-2 Simple Linear Regression Example 11-1

11-2 Simple Linear Regression Example 11-1

11-2 Simple Linear Regression Example 11-1 Figure 11-4 Scatter plot of oxygen purity y versus hydrocarbon level x and regression model ŷ = 74.283 + 14.947x.

11-2 Simple Linear Regression Example 11-1

11-2 Simple Linear Regression Estimating σ² The error sum of squares is SS_E = Σᵢ eᵢ² = Σᵢ (yᵢ − ŷᵢ)². It can be shown that the expected value of the error sum of squares is E(SS_E) = (n − 2)σ².

11-2 Simple Linear Regression Estimating σ² An unbiased estimator of σ² is σ̂² = SS_E/(n − 2), where SS_E can be easily computed using SS_E = SS_T − β̂₁S_xy, with SS_T = Σᵢ(yᵢ − ȳ)² = Σᵢyᵢ² − nȳ².
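
Continuing the same sketch, σ̂² can be obtained from SS_E as described above; the assert simply confirms the computational shortcut SS_E = SS_T − β̂₁S_xy.

# Estimate sigma^2 from the residual (error) sum of squares
y_hat = beta0_hat + beta1_hat * x          # fitted values
SS_E = np.sum((y - y_hat) ** 2)            # error sum of squares
SS_T = np.sum((y - ybar) ** 2)             # total corrected sum of squares
sigma2_hat = SS_E / (n - 2)                # unbiased estimator of sigma^2
assert np.isclose(SS_E, SS_T - beta1_hat * Sxy)   # shortcut form of SS_E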

11-3 Properties of the Least Squares Estimators Slope properties: E(β̂₁) = β₁ and V(β̂₁) = σ²/S_xx. Intercept properties: E(β̂₀) = β₀ and V(β̂₀) = σ²[1/n + x̄²/S_xx].

11-4 Hypothesis Tests in Simple Linear Regression Use of t-Tests Suppose we wish to test H₀: β₁ = β₁,₀ versus H₁: β₁ ≠ β₁,₀. An appropriate test statistic would be T₀ = (β̂₁ − β₁,₀)/√(σ̂²/S_xx).

11-4 Hypothesis Tests in Simple Linear Regression Use of t-Tests We would reject the null hypothesis if |t₀| > t_{α/2, n−2}. The test statistic could also be written as T₀ = (β̂₁ − β₁,₀)/se(β̂₁), where se(β̂₁) = √(σ̂²/S_xx) is the estimated standard error of the slope.

11-4 Hypothesis Tests in Simple Linear Regression Use of t-Tests Suppose we wish to test H₀: β₀ = β₀,₀ versus H₁: β₀ ≠ β₀,₀. An appropriate test statistic would be T₀ = (β̂₀ − β₀,₀)/√(σ̂²[1/n + x̄²/S_xx]).

11-4 Hypothesis Tests in Simple Linear Regression Use of t-Tests We would reject the null hypothesis if |t₀| > t_{α/2, n−2}.

11-4 Hypothesis Tests in Simple Linear Regression Use of t-Tests An important special case of these hypotheses is H₀: β₁ = 0 versus H₁: β₁ ≠ 0. These hypotheses relate to the significance of regression. Failure to reject H₀ is equivalent to concluding that there is no linear relationship between x and Y.
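
As a rough illustration of the significance-of-regression t test, continuing the simulated-data sketch (α = 0.05 is chosen arbitrarily):

from scipy import stats

# t test of H0: beta1 = 0 versus H1: beta1 != 0
se_beta1 = np.sqrt(sigma2_hat / Sxx)                 # estimated standard error of beta1_hat
t0 = beta1_hat / se_beta1
alpha = 0.05
reject_h0 = abs(t0) > stats.t.ppf(1 - alpha / 2, df=n - 2)
p_value = 2 * stats.t.sf(abs(t0), df=n - 2)          # two-sided P-value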

11-4 Hypothesis Tests in Simple Linear Regression Figure 11-5 The hypothesis H₀: β₁ = 0 is not rejected.

11-4 Hypothesis Tests in Simple Linear Regression Figure 11-6 The hypothesis H₀: β₁ = 0 is rejected.

11-4 Hypothesis Tests in Simple Linear Regression Example 11-2

11-4 Hypothesis Tests in Simple Linear Regression Analysis of Variance Approach to Test Significance of Regression The analysis of variance identity is Σᵢ(yᵢ − ȳ)² = Σᵢ(ŷᵢ − ȳ)² + Σᵢ(yᵢ − ŷᵢ)². Symbolically, SS_T = SS_R + SS_E.

11-4 Hypothesis Tests in Simple Linear Regression Analysis of Variance Approach to Test Significance of Regression If the null hypothesis H₀: β₁ = 0 is true, the statistic F₀ = (SS_R/1)/(SS_E/(n − 2)) = MS_R/MS_E follows the F_{1, n−2} distribution, and we would reject H₀ if f₀ > f_{α, 1, n−2}.

11-4 Hypothesis Tests in Simple Linear Regression Analysis of Variance Approach to Test Significance of Regression The quantities MS_R and MS_E are called mean squares. Analysis of variance table:
Source of Variation | Sum of Squares | Degrees of Freedom | Mean Square | F₀
Regression | SS_R = β̂₁S_xy | 1 | MS_R | MS_R/MS_E
Error | SS_E = SS_T − β̂₁S_xy | n − 2 | MS_E |
Total | SS_T | n − 1 | |
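
The ANOVA version of the same test, continuing the sketch; note that in simple linear regression f₀ equals t₀².

# Analysis of variance test of significance of regression
SS_R = beta1_hat * Sxy                     # regression sum of squares
MS_R = SS_R / 1
MS_E = SS_E / (n - 2)
f0 = MS_R / MS_E                           # equals t0**2 in simple linear regression
reject_h0 = f0 > stats.f.ppf(1 - alpha, dfn=1, dfd=n - 2)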

11-4 Hypothesis Tests in Simple Linear Regression Example 11-3

11-4 Hypothesis Tests in Simple Linear Regression

11-5 Confidence Intervals Confidence Intervals on the Slope and Intercept Definition Under the assumption of normally and independently distributed observations, a 100(1 − α)% confidence interval on the slope β₁ is β̂₁ − t_{α/2, n−2}√(σ̂²/S_xx) ≤ β₁ ≤ β̂₁ + t_{α/2, n−2}√(σ̂²/S_xx). Similarly, a 100(1 − α)% confidence interval on the intercept β₀ is β̂₀ − t_{α/2, n−2}√(σ̂²[1/n + x̄²/S_xx]) ≤ β₀ ≤ β̂₀ + t_{α/2, n−2}√(σ̂²[1/n + x̄²/S_xx]).
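
Continuing the sketch, the two confidence intervals above can be computed as:

# 95% confidence intervals on the slope and intercept
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 2)
se_beta0 = np.sqrt(sigma2_hat * (1.0 / n + xbar ** 2 / Sxx))
ci_beta1 = (beta1_hat - t_crit * se_beta1, beta1_hat + t_crit * se_beta1)
ci_beta0 = (beta0_hat - t_crit * se_beta0, beta0_hat + t_crit * se_beta0)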

11-5 Confidence Intervals Example 11-4

11-5 Confidence Intervals Confidence Interval on the Mean Response Definition A 100(1 − α)% confidence interval on the mean response at the value x = x₀, μ_{Y|x₀}, is μ̂_{Y|x₀} − t_{α/2, n−2}√(σ̂²[1/n + (x₀ − x̄)²/S_xx]) ≤ μ_{Y|x₀} ≤ μ̂_{Y|x₀} + t_{α/2, n−2}√(σ̂²[1/n + (x₀ − x̄)²/S_xx]), where μ̂_{Y|x₀} = β̂₀ + β̂₁x₀.

11-5 Confidence Intervals Example 11-5

11-5 Confidence Intervals Example 11-5

11-5 Confidence Intervals Example 11-5

11-5 Confidence Intervals Example 11-5 Figure 11-7 Scatter diagram of oxygen purity data from Example 11-1 with fitted regression line and 95 percent confidence limits on μ_{Y|x₀}.

11-6 Prediction of New Observations If x₀ is the value of the regressor variable of interest, then Ŷ₀ = β̂₀ + β̂₁x₀ is the point estimator of the new or future value of the response, Y₀.

11-6 Prediction of New Observations Definition A 100(1 − α)% prediction interval on a future observation Y₀ at the value x₀ is ŷ₀ − t_{α/2, n−2}√(σ̂²[1 + 1/n + (x₀ − x̄)²/S_xx]) ≤ Y₀ ≤ ŷ₀ + t_{α/2, n−2}√(σ̂²[1 + 1/n + (x₀ − x̄)²/S_xx]), where ŷ₀ = β̂₀ + β̂₁x₀. The prediction interval is always wider than the confidence interval on the mean response at x₀.
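
Continuing the sketch, the confidence interval on the mean response and the prediction interval at a chosen value x₀ (x₀ = 1.25 is arbitrary here) differ only by the extra "1 +" term in the standard error:

# Interval estimates at a chosen regressor value x0
x0 = 1.25
y0_hat = beta0_hat + beta1_hat * x0                                      # point estimate
se_mean = np.sqrt(sigma2_hat * (1.0 / n + (x0 - xbar) ** 2 / Sxx))       # for mu_{Y|x0}
se_pred = np.sqrt(sigma2_hat * (1.0 + 1.0 / n + (x0 - xbar) ** 2 / Sxx)) # for a new Y0
ci_mean_response = (y0_hat - t_crit * se_mean, y0_hat + t_crit * se_mean)
prediction_interval = (y0_hat - t_crit * se_pred, y0_hat + t_crit * se_pred)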

11-6 Prediction of New Observations Example 11-6

11-6 Prediction of New Observations Example 11-6

11-6 Prediction of New Observations Example 11-6 Figure 11-8 Scatter diagram of oxygen purity data from Example 11-1 with fitted regression line, 95% prediction limits (outer lines), and 95% confidence limits on μ_{Y|x₀}.

11-7 Adequacy of the Regression Model Fitting a regression model requires several assumptions: (1) errors are uncorrelated random variables with mean zero; (2) errors have constant variance; and (3) errors are normally distributed. The analyst should always consider the validity of these assumptions to be doubtful and conduct analyses to examine the adequacy of the model.

11-7 Adequacy of the Regression Model Residual Analysis The residuals from a regression model are eᵢ = yᵢ − ŷᵢ, where yᵢ is an actual observation and ŷᵢ is the corresponding fitted value from the regression model. Analysis of the residuals is frequently helpful in checking the assumption that the errors are approximately normally distributed with constant variance, and in determining whether additional terms in the model would be useful.
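
A residual-analysis sketch in the same spirit as the plots of Example 11-7, continuing the simulated-data example (matplotlib and scipy are assumed to be available):

import matplotlib.pyplot as plt

# Residual plots for checking the model assumptions
residuals = y - y_hat
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
stats.probplot(residuals, dist="norm", plot=ax1)     # normal probability plot of residuals
ax1.set_title("Normal probability plot of residuals")
ax2.scatter(y_hat, residuals)                        # residuals versus fitted values
ax2.axhline(0.0, color="gray", linewidth=1)
ax2.set_xlabel("Fitted value")
ax2.set_ylabel("Residual")
plt.tight_layout()
plt.show()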

11-7 Adequacy of the Regression Model Residual Analysis Figure 11-9 Patterns for residual plots. (a) satisfactory, (b) funnel, (c) double bow, (d) nonlinear. [Adapted from Montgomery, Peck, and Vining (2001).]

11-7 Adequacy of the Regression Model Example 11-7

11-7 Adequacy of the Regression Model Example 11-7

11-7 Adequacy of the Regression Model Example 11-7 Figure 11-10 Normal probability plot of residuals, Example 11-7.

11-7 Adequacy of the Regression Model Example 11-7 Figure 11-11 Plot of residuals versus predicted oxygen purity, ŷ, Example 11-7.

11-7 Adequacy of the Regression Model Coefficient of Determination (R²) The quantity R² = SS_R/SS_T = 1 − SS_E/SS_T is called the coefficient of determination and is often used to judge the adequacy of a regression model. 0 ≤ R² ≤ 1. We often refer (loosely) to R² as the amount of variability in the data explained or accounted for by the regression model.
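
Continuing the sketch, R² follows directly from the sums of squares already computed:

# Coefficient of determination
R2 = SS_R / SS_T            # equivalently, 1 - SS_E / SS_T
print(f"R^2 = {R2:.3f}")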

11-7 Adequacy of the Regression Model Coefficient of Determination (R²) For the oxygen purity regression model, R² = SS_R/SS_T = 0.877. Thus, the model accounts for 87.7% of the variability in the data.

11-8 Correlation When X and Y are jointly distributed random variables, the strength of the linear relationship between them is measured by the correlation coefficient ρ. The sample correlation coefficient R = S_xy/√(S_xx·SS_T) is the usual estimator of ρ.

We may also write: R² = β̂₁S_xy/SS_T = SS_R/SS_T; that is, the coefficient of determination is the square of the sample correlation coefficient between Y and X.

11-8 Correlation It is often useful to test the hypotheses H₀: ρ = 0 versus H₁: ρ ≠ 0. The appropriate test statistic for these hypotheses is T₀ = R√(n − 2)/√(1 − R²), which has the t distribution with n − 2 degrees of freedom when H₀ is true. Reject H₀ if |t₀| > t_{α/2, n−2}.

11-8 Correlation The test procedure for the hypothesis H₀: ρ = ρ₀ versus H₁: ρ ≠ ρ₀, where ρ₀ ≠ 0, is somewhat more complicated. In this case, the appropriate (large-sample) test statistic is Z₀ = (arctanh R − arctanh ρ₀)(n − 3)^{1/2}. Reject H₀ if |z₀| > z_{α/2}.

11-8 Correlation The approximate 100(1 − α)% confidence interval on ρ is tanh(arctanh r − z_{α/2}/√(n − 3)) ≤ ρ ≤ tanh(arctanh r + z_{α/2}/√(n − 3)).
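
Continuing the sketch, the sample correlation coefficient and the inference procedures above can be computed as follows (note that r² equals the R² computed earlier):

# Sample correlation coefficient and inference on rho
r = Sxy / np.sqrt(Sxx * SS_T)                      # sample correlation coefficient
t0_r = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)    # test statistic for H0: rho = 0
z_crit = stats.norm.ppf(1 - alpha / 2)
ci_rho = (np.tanh(np.arctanh(r) - z_crit / np.sqrt(n - 3)),   # approximate 95% CI for rho
          np.tanh(np.arctanh(r) + z_crit / np.sqrt(n - 3)))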

11-8 Correlation Example 11-8

11-8 Correlation Figure: Scatter plot of wire bond strength versus wire length, Example 11-8.

11-8 Correlation Minitab Output for Example 11-8

11-8 Correlation Example 11-8 (continued)

11-8 Correlation Example 11-8 (continued)

11-8 Correlation Example 11-8 (continued)

11-9 Transformation and Logistic Regression

Example 11-9 Table 11-5 Observed Values and Regressor Variable for Example 11-9.

11-9 Transformation and Logistic Regression Example 11-9 (Continued)

11-9 Transformation and Logistic Regression Example 11-9 (Continued)

11-9 Transformation and Logistic Regression Example 11-9 (Continued)