6-1 Introduction To Empirical Models


6-1 Introduction To Empirical Models Based on the scatter diagram, it is probably reasonable to assume that the mean of the random variable Y is related to x by the following straight-line relationship:

E(Y|x) = β0 + β1x

where the slope β1 and intercept β0 of the line are called regression coefficients. The simple linear regression model is given by

Y = β0 + β1x + ε

where ε is the random error term.

6-1 Introduction To Empirical Models We think of the regression model as an empirical model. Suppose that the mean and variance of ε are 0 and σ², respectively. Then the mean of Y given x is

E(Y|x) = β0 + β1x

The variance of Y given x is

V(Y|x) = σ²

6-1 Introduction To Empirical Models The true regression model is a line of mean values:

μY|x = β0 + β1x

where β1 can be interpreted as the change in the mean of Y for a unit change in x. Also, the variability of Y at a particular value of x is determined by the error variance, σ². This implies there is a distribution of Y-values at each x and that the variance of this distribution is the same at each x.
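These two facts about the distribution of Y at a fixed x can be checked numerically. The following minimal sketch (the coefficient values, sample size, and seed are hypothetical choices, not from the text) simulates many observations of Y at a single x and confirms that their mean is β0 + β1x and their variance is σ²:

```python
import numpy as np

# Simulate the model Y = beta0 + beta1*x + eps at a single fixed x.
# The coefficient values and sample size are hypothetical choices.
beta0, beta1, sigma = 2.0, 0.5, 1.0

rng = np.random.default_rng(0)
x = 10.0
y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=100_000)

# The distribution of Y at this x has mean beta0 + beta1*x = 7 and
# variance sigma^2 = 1; the sample moments agree closely.
print(round(float(y.mean()), 1))  # 7.0
print(round(float(y.var()), 1))   # 1.0
```

Repeating this at a different x shifts the mean along the line but leaves the variance unchanged, which is exactly the constant-variance statement above.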

6-1 Introduction To Empirical Models A Multiple Regression Model:

Y = β0 + β1x1 + β2x2 + … + βkxk + ε

where Y is related to k regressor variables.

6-2 Simple Linear Regression 6-2.1 Least Squares Estimation The case of simple linear regression considers a single regressor or predictor x and a dependent or response variable Y. The expected value of Y at each level of x is

E(Y|x) = β0 + β1x

We assume that each observation, Y, can be described by the model

Y = β0 + β1x + ε

6-2 Simple Linear Regression 6-2.1 Least Squares Estimation Suppose that we have n pairs of observations (x1, y1), (x2, y2), …, (xn, yn). The method of least squares is used to estimate the parameters β0 and β1 by minimizing the sum of the squares of the vertical deviations in Figure 6-6.

6-2 Simple Linear Regression 6-2.1 Least Squares Estimation Using Equation 6-8, the n observations in the sample can be expressed as

yi = β0 + β1xi + εi,  i = 1, 2, …, n

The sum of the squares of the deviations of the observations from the true regression line is

L = Σ εi² = Σ (yi − β0 − β1xi)²
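Minimizing L gives the familiar closed-form estimates β̂1 = Sxy/Sxx and β̂0 = ȳ − β̂1x̄. A short sketch of that computation on a small made-up data set (the numbers are illustrative, not from the text):

```python
import numpy as np

# Least squares estimates on a small, hypothetical data set.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.9, 4.1, 5.8, 8.2, 9.9])

xbar, ybar = x.mean(), y.mean()
Sxx = np.sum((x - xbar) ** 2)          # sum of (x_i - xbar)^2
Sxy = np.sum((x - xbar) * (y - ybar))  # sum of (x_i - xbar)(y_i - ybar)

b1 = Sxy / Sxx         # slope estimate
b0 = ybar - b1 * xbar  # intercept estimate
print(round(float(b1), 3), round(float(b0), 3))  # 2.01 -0.05
```

The fitted line ŷ = b0 + b1·x then passes as close as possible (in the sum-of-squared-vertical-deviations sense) to the five observed points.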

6-2 Simple Linear Regression 6-2.1 Least Squares Estimation Notation

Sxx = Σ (xi − x̄)²  and  Sxy = Σ (xi − x̄)(yi − ȳ)

The least squares estimates of the slope and intercept are

β̂1 = Sxy/Sxx  and  β̂0 = ȳ − β̂1x̄

and the fitted (estimated) regression line is ŷ = β̂0 + β̂1x.

6-2 Simple Linear Regression Regression Assumptions and Model Properties

Under the standard assumptions, the least squares estimators are unbiased: E(β̂0) = β0 and E(β̂1) = β1, with V(β̂1) = σ²/Sxx and V(β̂0) = σ²[1/n + x̄²/Sxx]. An unbiased estimator of σ² is σ̂² = SSE/(n − 2), where SSE = Σ (yi − ŷi)².
6-2 Simple Linear Regression Regression and Analysis of Variance

The analysis of variance identity partitions the total variability in y:

Σ (yi − ȳ)² = Σ (ŷi − ȳ)² + Σ (yi − ŷi)²

or SST = SSR + SSE.
6-2 Simple Linear Regression 6-2.2 Testing Hypothesis in Simple Linear Regression Use of t-Tests Suppose we wish to test

H0: β1 = β1,0
H1: β1 ≠ β1,0

An appropriate test statistic would be

T0 = (β̂1 − β1,0) / √(σ̂²/Sxx)

6-2 Simple Linear Regression 6-2.2 Testing Hypothesis in Simple Linear Regression Use of t-Tests We would reject the null hypothesis if |t0| > tα/2,n−2.

6-2 Simple Linear Regression 6-2.2 Testing Hypothesis in Simple Linear Regression Use of t-Tests Suppose we wish to test

H0: β0 = β0,0
H1: β0 ≠ β0,0

An appropriate test statistic would be

T0 = (β̂0 − β0,0) / √(σ̂² [1/n + x̄²/Sxx])

6-2 Simple Linear Regression 6-2.2 Testing Hypothesis in Simple Linear Regression Use of t-Tests We would reject the null hypothesis if |t0| > tα/2,n−2.

6-2 Simple Linear Regression 6-2.2 Testing Hypothesis in Simple Linear Regression Use of t-Tests An important special case of the hypotheses of Equation 6-23 is

H0: β1 = 0
H1: β1 ≠ 0

These hypotheses relate to the significance of regression. Failure to reject H0 is equivalent to concluding that there is no linear relationship between x and Y.
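A minimal numerical sketch of this significance test (hypothetical data; the critical value t0.025,3 = 3.182 is the standard table value for α = 0.05 with n − 2 = 3 degrees of freedom):

```python
import numpy as np

# Significance-of-regression t-test on hypothetical data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.9, 4.1, 5.8, 8.2, 9.9])
n = len(x)

Sxx = np.sum((x - x.mean()) ** 2)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
b0 = y.mean() - b1 * x.mean()

resid = y - (b0 + b1 * x)
sigma2_hat = np.sum(resid ** 2) / (n - 2)  # sigma^2-hat = SSE / (n - 2)

t0 = b1 / np.sqrt(sigma2_hat / Sxx)  # test statistic for H0: beta1 = 0
t_crit = 3.182                       # table value t_{0.025,3} for alpha = 0.05
print(bool(abs(t0) > t_crit))        # True -> reject H0; regression significant
```

Here |t0| greatly exceeds the critical value, so there is strong evidence of a linear relationship between x and Y for this data.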

6-2 Simple Linear Regression 6-2.2 Testing Hypothesis in Simple Linear Regression The Analysis of Variance Approach

The analysis of variance can also be used to test for the significance of regression:

F0 = MSR/MSE = (SSR/1) / (SSE/(n − 2))

and we reject H0: β1 = 0 if f0 > fα,1,n−2.
6-2 Simple Linear Regression 6-2.3 Confidence Intervals in Simple Linear Regression

Under the normality assumption, a 100(1 − α)% confidence interval on the slope β1 is

β̂1 − tα/2,n−2 √(σ̂²/Sxx) ≤ β1 ≤ β̂1 + tα/2,n−2 √(σ̂²/Sxx)

with an analogous interval for the intercept β0.
6-2 Simple Linear Regression 6-2.4 Prediction of New Observations

A 100(1 − α)% prediction interval on a future observation Y0 at the value x0 is

ŷ0 ± tα/2,n−2 √(σ̂² [1 + 1/n + (x0 − x̄)²/Sxx])

where ŷ0 = β̂0 + β̂1x0. The interval widens as x0 moves away from x̄.
6-2 Simple Linear Regression 6-2.5 Checking Model Adequacy Fitting a regression model requires several assumptions: errors are uncorrelated random variables with mean zero; errors have constant variance; and errors are normally distributed. The analyst should always consider the validity of these assumptions to be doubtful and conduct analyses to examine the adequacy of the model.

6-2 Simple Linear Regression 6-2.5 Checking Model Adequacy The residuals from a regression model are ei = yi - ŷi , where yi is an actual observation and ŷi is the corresponding fitted value from the regression model. Analysis of the residuals is frequently helpful in checking the assumption that the errors are approximately normally distributed with constant variance, and in determining whether additional terms in the model would be useful.
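Two quick residual checks can be sketched numerically (hypothetical data; real adequacy checking would also use a normal probability plot and a plot of residuals against fitted values):

```python
import numpy as np

# Residuals e_i = y_i - yhat_i from a least squares fit (hypothetical data).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.9, 4.1, 5.8, 8.2, 9.9])

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
e = y - (b0 + b1 * x)  # residuals

# With an intercept in the model, least squares residuals sum to zero.
# Standardized residuals larger than about 2-3 would flag potential outliers.
sigma_hat = np.sqrt(np.sum(e ** 2) / (len(x) - 2))
standardized = e / sigma_hat
print(bool(np.all(np.abs(standardized) < 2)))  # True for this data
```

The zero-sum property is an algebraic consequence of the normal equations, not evidence of adequacy; it is the pattern (or lack of pattern) in the standardized residuals that carries the diagnostic information.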

6-2 Simple Linear Regression 6-2.6 Correlation and Regression The sample correlation coefficient between X and Y is

r = Sxy / √(Sxx Syy)

where Syy = Σ (yi − ȳ)².

6-2 Simple Linear Regression 6-2.6 Correlation and Regression The sample correlation coefficient is also closely related to the slope in a linear regression model:

β̂1 = r √(Syy/Sxx)

so r and β̂1 always have the same sign.

6-2 Simple Linear Regression 6-2.6 Correlation and Regression It is often useful to test the hypotheses

H0: ρ = 0
H1: ρ ≠ 0

The appropriate test statistic for these hypotheses is

T0 = R√(n − 2) / √(1 − R²)

Reject H0 if |t0| > tα/2,n−2.
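A sketch of the correlation coefficient and its test statistic on a small hypothetical data set (same illustrative style as earlier examples):

```python
import numpy as np

# Sample correlation and the t-test for H0: rho = 0 (hypothetical data).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.9, 4.1, 5.8, 8.2, 9.9])
n = len(x)

Sxx = np.sum((x - x.mean()) ** 2)
Syy = np.sum((y - y.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))

r = Sxy / np.sqrt(Sxx * Syy)                   # sample correlation coefficient
t0 = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)  # statistic for H0: rho = 0

# |t0| far exceeds t_{0.025,3} = 3.182 here, so H0: rho = 0 is rejected.
```

For simple linear regression this t0 is numerically identical to the t-statistic for testing β1 = 0 on the same data, which reflects the close relationship between r and the slope.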

6-3 Multiple Regression 6-3.1 Estimation of Parameters in Multiple Regression The least squares function is given by

L = Σi (yi − β0 − Σj βj xij)²

The least squares estimates must satisfy

∂L/∂βj = 0,  j = 0, 1, …, k

6-3 Multiple Regression 6-3.1 Estimation of Parameters in Multiple Regression In matrix notation, the least squares normal equations are

(X′X) β̂ = X′y

The solution to the normal equations is the vector of least squares estimators of the regression coefficients, β̂ = (X′X)⁻¹X′y.
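Solving the normal equations is a single linear solve once the design matrix is built. A minimal sketch on simulated data (the true coefficients 1, 2, −3 and all other numbers are hypothetical choices for illustration):

```python
import numpy as np

# Simulated data for the model y = 1 + 2*x1 - 3*x2 + eps (all values hypothetical).
rng = np.random.default_rng(1)
n = 50
x1 = rng.uniform(0, 10, n)
x2 = rng.uniform(0, 5, n)
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(0.0, 0.1, n)

# Design matrix X: a column of ones (intercept) plus one column per regressor.
X = np.column_stack([np.ones(n), x1, x2])

# Normal equations (X'X) b = X'y, solved directly for the estimates.
b = np.linalg.solve(X.T @ X, X.T @ y)
# b[0], b[1], b[2] recover values close to 1, 2, -3 respectively.
```

In practice one would use a numerically stabler routine such as `numpy.linalg.lstsq` (QR-based) rather than forming X′X explicitly, but the direct solve mirrors the normal equations as written.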

6-3 Multiple Regression 6-3.2 Inferences in Multiple Regression Test for Significance of Regression

H0: β1 = β2 = … = βk = 0 is tested with

F0 = (SSR/k) / (SSE/(n − p)),  p = k + 1

and H0 is rejected if f0 > fα,k,n−p.

6-3 Multiple Regression 6-3.2 Inferences in Multiple Regression Inference on Individual Regression Coefficients The hypothesis H0: βj = 0 is tested with T0 = β̂j / se(β̂j). This is called a partial or marginal test because it assesses the contribution of xj given that all the other regressors are in the model.

6-3 Multiple Regression 6-3.2 Inferences in Multiple Regression Confidence Intervals on the Mean Response and Prediction Intervals

6-3 Multiple Regression 6-3.2 Inferences in Multiple Regression A Test for the Significance of a Group of Regressors

6-3 Multiple Regression 6-3.3 Checking Model Adequacy Residual Analysis

6-3 Multiple Regression 6-3.3 Checking Model Adequacy Influential Observations

6-3 Multiple Regression 6-3.3 Checking Model Adequacy Multicollinearity

6-4 Other Aspects of Regression 6-4.1 Polynomial Models

A polynomial model in one variable,

Y = β0 + β1x + β2x² + … + βkx^k + ε

is still linear in the unknown parameters β0, β1, …, βk, so it can be fitted by the same least squares methods used for multiple regression.

6-4 Other Aspects of Regression 6-4.2 Categorical Regressors Many problems involve qualitative or categorical variables. The usual method for incorporating the different levels of a qualitative variable is to use indicator variables. For example, to introduce the effect of two different operators into a regression model, we could define an indicator variable as follows:

z = 0 if the observation is from operator 1
z = 1 if the observation is from operator 2
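This coding can be sketched directly (the operator assignments and regressor values below are hypothetical):

```python
import numpy as np

# Indicator coding for a two-level qualitative variable (hypothetical data).
operator = np.array([1, 1, 2, 2, 1, 2])        # which operator ran each test
x1 = np.array([2.0, 3.0, 2.5, 4.0, 3.5, 3.0])  # a quantitative regressor

# z = 0 for operator 1, z = 1 for operator 2.
z = (operator == 2).astype(float)
print(z.tolist())  # [0.0, 0.0, 1.0, 1.0, 0.0, 1.0]

# The indicator enters the design matrix like any other regressor:
X = np.column_stack([np.ones(len(x1)), x1, z])
print(X.shape)  # (6, 3)
```

The coefficient on z then estimates the shift in mean response between the two operators; a variable with m levels would need m − 1 such indicators.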

6-4 Other Aspects of Regression 6-4.3 Variable Selection Procedures Best Subsets Regression fits all possible regressions from the candidate regressors and compares them on criteria such as R² and MSE.

6-4 Other Aspects of Regression 6-4.3 Variable Selection Procedures Backward Elimination begins with all candidate regressors in the model and removes, one at a time, the regressor with the smallest partial F- (or t-) statistic, stopping when all remaining regressors are significant.

6-4 Other Aspects of Regression 6-4.3 Variable Selection Procedures Forward Selection begins with no regressors and adds, one at a time, the candidate regressor that most improves the model, stopping when no remaining candidate is significant.

6-4 Other Aspects of Regression 6-4.3 Variable Selection Procedures Stepwise Regression combines forward selection with backward elimination: after each regressor is added, all regressors in the current model are re-examined and any that have become nonsignificant are removed.