Business Statistics for Managerial Decision Farideh Dehkordi-Vakil

Example: Retail sales and floor space It is customary in retail operations to assess the performance of stores partly in terms of their annual sales relative to their floor area (square feet). We might expect sales to increase linearly as stores get larger, with, of course, individual variation among stores of the same size. The regression model for a population of stores says that SALES = β₀ + β₁ · AREA + ε
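
As a quick illustration of fitting this kind of model by least squares, here is a minimal Python sketch; the store areas and sales below are made-up numbers for illustration, not data from the text.

    # Minimal sketch: fitting SALES = b0 + b1 * AREA by least squares.
    # The store data below are invented purely for illustration.
    import numpy as np

    area = np.array([2.0, 3.1, 4.5, 5.0, 6.2, 7.8])   # floor space, thousands of sq ft
    sales = np.array([3.5, 4.8, 6.1, 6.9, 8.0, 9.7])  # annual sales, millions of dollars

    b1, b0 = np.polyfit(area, sales, deg=1)           # slope and intercept estimates
    print(f"Estimated model: SALES = {b0:.2f} + {b1:.2f} * AREA")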

Example: Retail sales and floor space The slope β₁ is as usual a rate of change: it is the expected increase in annual sales associated with each additional square foot of floor space. The intercept β₀ is needed to describe the line but has no statistical importance because no stores have area close to zero. Floor space does not completely determine sales. The term ε in the model accounts for differences among individual stores with the same floor space. A store's location, for example, is important.

Estimation of the variance of the error terms, σ² The variance σ² of the error terms εᵢ in the regression model needs to be estimated for a variety of purposes. It gives an indication of the variability of the probability distributions of y. It is needed for making inferences concerning the regression function and the prediction of y.

Regression Standard Error To estimate σ we work with the variance and take the square root to obtain the standard deviation. For simple linear regression the estimate of σ² is the average squared residual, s² = Σ(yᵢ − ŷᵢ)² / (n − 2). To estimate σ, use s = √s². s estimates the standard deviation σ of the error term ε in the statistical model for simple linear regression.

Regression Standard Error
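
A minimal Python sketch of this calculation (the function name is mine, not from the text):

    # Regression standard error: s^2 is the sum of squared residuals
    # divided by n - 2; s estimates sigma.
    import numpy as np

    def regression_standard_error(x, y):
        b1, b0 = np.polyfit(x, y, deg=1)
        residuals = y - (b0 + b1 * x)
        n = len(x)
        s_squared = np.sum(residuals ** 2) / (n - 2)   # estimate of sigma^2
        return np.sqrt(s_squared)                      # estimate of sigma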

Inference in Regression Analysis The simple linear regression model, which is the basis for inference, imposes several conditions. We should verify these conditions before proceeding to inference. The conditions concern the population, but we can observe only our sample. In doing inference we act as if: The sample is an SRS from the population. There is a linear relationship in the population. The standard deviation of the responses about the population line is the same for all values of the explanatory variable. The response varies Normally about the population regression line.

Inference in Regression Analysis Plotting the residuals against the explanatory variable is helpful in checking these conditions because a residual plot magnifies patterns. A Normal quantile plot of the residuals can be used to check the Normality assumption.
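
A minimal sketch of these two diagnostic plots, assuming arrays x and y already hold the explanatory and response values (the variable names are illustrative):

    # Residual plot and Normal quantile plot for checking the regression conditions.
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy import stats

    b1, b0 = np.polyfit(x, y, deg=1)
    residuals = y - (b0 + b1 * x)

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.scatter(x, residuals)            # residuals vs explanatory variable
    ax1.axhline(0, color="gray")         # look for curvature or changing spread
    stats.probplot(residuals, plot=ax2)  # Normal quantile plot of the residuals
    plt.show()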

Confidence Intervals and Significance Tests In our previous lectures we presented confidence intervals and significance tests for means and differences in means. In each case, inference rested on the standard errors of the estimates and on t or z distributions. Inference for the slope and intercept in linear regression is similar in principle, although the recipes are more complicated. All confidence intervals, for example, have the form estimate ± t* SE(estimate), where t* is a critical value of a t distribution.

Confidence Intervals and Significance Tests Confidence intervals and tests for the slope and intercept are based on the sampling distributions of the estimates b₁ and b₀. Here are the facts: If the simple linear regression model is true, each of b₀ and b₁ has a Normal distribution. The mean of b₀ is β₀ and the mean of b₁ is β₁. The standard deviations of b₀ and b₁ are multiples of the model standard deviation σ.

Confidence Intervals and Significance Tests A level C confidence interval for the slope β₁ is b₁ ± t* SE(b₁), where t* is the critical value of the t(n − 2) distribution. To test H₀: β₁ = 0, compute the t statistic t = b₁ / SE(b₁) and refer it to the t(n − 2) distribution.
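
A minimal Python sketch of these slope-inference formulas, assuming x and y are NumPy arrays of the explanatory and response values (the function and variable names are mine):

    # Confidence interval and one-sided t test for the regression slope.
    import numpy as np
    from scipy import stats

    def slope_inference(x, y, conf=0.95):
        n = len(x)
        b1, b0 = np.polyfit(x, y, deg=1)
        resid = y - (b0 + b1 * x)
        s = np.sqrt(np.sum(resid ** 2) / (n - 2))          # regression standard error
        se_b1 = s / np.sqrt(np.sum((x - x.mean()) ** 2))   # standard error of the slope
        t_stat = b1 / se_b1                                # test of H0: beta1 = 0
        t_star = stats.t.ppf((1 + conf) / 2, df=n - 2)     # critical value t*
        ci = (b1 - t_star * se_b1, b1 + t_star * se_b1)    # b1 +/- t* SE(b1)
        p_one_sided = stats.t.sf(t_stat, df=n - 2)         # P(T > t) for Ha: beta1 > 0
        return b1, se_b1, t_stat, p_one_sided, ci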

Example: Do wages rise with experience? Many factors affect the wages of workers: the industry they work in, their type of job, their education and their experience, and changes in general levels of wages. We will look at a sample of 59 married women who hold customer service jobs in Indiana banks. The following table gives their weekly wages at a specific point in time along with their length of service with their employer, in months. The size of the place of work is recorded simply as "large" (100 or more workers) or "small." Because industry, job type, and the time of measurement are the same for all 59 subjects, we expect to see a clear relationship between wages and length of service.

Example: Do wages rise with experience?

Do wages rise with experience? The hypotheses are: H₀: β₁ = 0, Hₐ: β₁ > 0. The t statistic for the significance of the regression is t = b₁ / SE(b₁), which here equals 2.85. The P-value is P(t > 2.85) < .005. The t distribution for this problem has n − 2 = 57 degrees of freedom. Conclusion: Reject H₀. There is strong evidence that mean wages increase as length of service increases.
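
To check the P-value bound quoted on this slide, one could evaluate the upper-tail t probability directly; a minimal sketch:

    # Upper-tail probability P(T > 2.85) for a t distribution with 57 df.
    from scipy import stats

    p_value = stats.t.sf(2.85, df=57)
    print(p_value)   # prints a value below 0.005, consistent with the slide's bound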

Example: Do wages rise with experience? A 95% confidence interval for the slope β₁ of the regression line in the population of all married female customer service workers in Indiana banks is b₁ ± t* SE(b₁). The t distribution for this problem has n − 2 = 57 degrees of freedom.

Inference about Correlation The correlation r between wages and length of service for the 59 bank workers appears in the Excel output, where it is labeled "Multiple R." We expect a positive correlation between length of service and wages in the population of all married female bank workers. Is the sample result convincing evidence that this is true? This question concerns a new population parameter, the population correlation. This is the correlation between length of service and wages when we measure these variables for every member of the population.

Inference about Correlation We will call the population correlation ρ. To assess the evidence that ρ > 0 in the bank worker population, we must test the hypotheses H₀: ρ = 0, Hₐ: ρ > 0. It is natural to base the test on the sample correlation r. There is a link between correlation and regression slope. The population correlation ρ is zero, positive, or negative exactly when the slope β₁ of the population regression line is zero, positive, or negative.

Inference about Correlation To test H₀: ρ = 0, use the t statistic t = r√(n − 2) / √(1 − r²), which has the t(n − 2) distribution when H₀ is true.
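
A minimal sketch of this correlation t statistic in Python (the function name is mine):

    # One-sided test of H0: rho = 0 against Ha: rho > 0, given r and n.
    import numpy as np
    from scipy import stats

    def correlation_test(r, n):
        t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
        p_value = stats.t.sf(t_stat, df=n - 2)   # upper-tail P-value
        return t_stat, p_value

    # Usage for the bank-worker sample: correlation_test(r, 59)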

Example: Do wages rise with experience? The sample correlation r between wages and length of service is computed from a sample of n = 59. To test H₀: ρ = 0 against Hₐ: ρ > 0, use the t statistic given above.

Example: Do wages rise with experience? Compare t = 2.853 with critical values from the t table with n − 2 = 57 degrees of freedom. Conclusion: P(t > 2.853) < .005, therefore we reject H₀. There is strong evidence of a positive correlation between wages and length of service.