Presentation transcript: "Forecasting Using the Simple Linear Regression Model and Correlation"
1. Forecasting Using the Simple Linear Regression Model and Correlation
2. What is a forecast? Using a statistical method on past data to predict the future. Using experience, judgment, and surveys to predict the future.
3. Why forecast? To enhance planning. To force thinking about the future. To fit corporate strategy to future conditions. To coordinate departments around the same view of the future. To reduce corporate costs.
4. Kinds of Forecasts. Causal forecasts: changes in the variable you wish to predict (Y) are caused by changes in other variables (X's). Time series forecasts: changes in a variable (Y) are predicted from its own prior values. Regression can provide both kinds of forecasts.
5. Types of Relationships. [Scatter plots: positive linear relationship; negative linear relationship.]
6. Types of Relationships (continued). [Scatter plots: relationship not linear; no relationship.]
7. Relationships. If the relationship is not linear, the forecaster often has to apply a mathematical transformation to make the relationship linear.
8. Correlation Analysis. Correlation measures the strength of the linear relationship between variables. It can be used to find the best predictor variables. It does not assure that there is a causal relationship between the variables.
9. The Correlation Coefficient. Ranges between -1 and 1. The closer to -1, the stronger the negative linear relationship. The closer to 1, the stronger the positive linear relationship. The closer to 0, the weaker any linear relationship.
10. Graphs of Various Correlation (r) Values. [Five scatter plots illustrating r = -1, r = -0.6, r = 0, r = 0.6, and r = 1.]
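The sample correlation coefficient is easy to compute in practice; here is a minimal sketch using NumPy's `corrcoef` on a hypothetical pair of variables (the data values are illustrative, not from the slides):

```python
import numpy as np

# Hypothetical paired observations (illustrative values, not from the slides).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# np.corrcoef returns the 2x2 correlation matrix; r is an off-diagonal entry.
r = np.corrcoef(x, y)[0, 1]   # close to +1: strong positive linear relationship
```

Because y rises almost exactly in step with x here, r comes out very close to +1.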
12. The Scatter Diagram. Used to visualize the relationship and to assess its linearity. The scatter diagram can also be used to identify outliers.
13. Regression Analysis. Regression analysis can be used to model causality and make predictions. Terminology: the variable to be predicted is called the dependent or response variable. The variables used in the prediction model are called independent, explanatory, or predictor variables.
14. Simple Linear Regression Model. The relationship between the variables is described by a linear function. A change in one variable causes the other variable to change.
15. Population Linear Regression. The population regression line is a straight line that describes the dependence of one variable on the other: Yi = β0 + β1Xi + εi, where Yi is the dependent (response) variable, Xi is the independent (explanatory) variable, β0 is the population Y intercept, β1 is the population slope coefficient, and εi is the random error.
16. How is the best line found? Each observed value of Y deviates from the line by a random error. The least-squares method finds the line that minimizes the sum of the squared errors.
17. Sample Linear Regression. The sample regression line provides an estimate of the population regression line: Ŷi = b0 + b1Xi. The sample Y intercept b0 provides an estimate of β0, the sample slope coefficient b1 provides an estimate of β1, and the residual ei = Yi - Ŷi estimates the random error εi.
18. Simple Linear Regression: An Example. You wish to examine the relationship between the square footage of produce stores and their annual sales. Sample data for 7 stores were obtained:

Store   Square Feet   Annual Sales ($000)
1           1,726           3,681
2           1,542           3,395
3           2,816           6,653
4           5,555           9,543
5           1,292           3,318
6           2,208           5,563
7           1,313           3,760

Find the equation of the straight line that fits the data best.
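As a check on the arithmetic, the best-fitting (least-squares) line can be computed with NumPy; this is a minimal sketch, assuming the seven reconstructed store data pairs from the example, using `np.polyfit`, which returns the slope and intercept of the degree-1 least-squares fit:

```python
import numpy as np

# Seven produce stores: square footage (X) and annual sales in $000s (Y).
sqft  = np.array([1726, 1542, 2816, 5555, 1292, 2208, 1313], dtype=float)
sales = np.array([3681, 3395, 6653, 9543, 3318, 5563, 3760], dtype=float)

# np.polyfit with degree 1 returns [slope, intercept] of the least-squares line.
b1, b0 = np.polyfit(sqft, sales, 1)
print(f"Y-hat = {b0:.3f} + {b1:.3f} X")  # prints: Y-hat = 1636.415 + 1.487 X
```

The fitted slope and intercept match the regression equation interpreted on the following slides.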
22. Interpreting the Results. Ŷi = 1636.415 + 1.487Xi. The slope of 1.487 means that for each increase of one unit in X, we predict the average of Y to increase by an estimated 1.487 units. The model estimates that for each increase of 1 square foot in the size of the store, expected annual sales are predicted to increase by $1,487.
23. The Coefficient of Determination. r² = SSR / SST = regression sum of squares / total sum of squares. The coefficient of determination (r²) measures the proportion of the variation in Y that is explained by the independent variable X.
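A minimal sketch of computing r² as SSR/SST for the store example (same reconstructed seven-store data as before):

```python
import numpy as np

# Seven produce stores: square footage (X) and annual sales in $000s (Y).
sqft  = np.array([1726, 1542, 2816, 5555, 1292, 2208, 1313], dtype=float)
sales = np.array([3681, 3395, 6653, 9543, 3318, 5563, 3760], dtype=float)

b1, b0 = np.polyfit(sqft, sales, 1)
fitted = b0 + b1 * sqft

ssr = np.sum((fitted - sales.mean()) ** 2)  # regression sum of squares
sst = np.sum((sales - sales.mean()) ** 2)   # total sum of squares
r2  = ssr / sst                             # about 0.94: X explains ~94% of the variation in Y
```

Note that in simple regression r² also equals the square of the correlation coefficient between X and Y.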
24. Coefficients of Determination (r²) and Correlation (r). [Scatter plot with fitted line Ŷi = b0 + b1Xi.]
25. Coefficients of Determination (r²) and Correlation (r) (continued). r² = .81, r = +0.9. [Scatter plot with fitted line Ŷi = b0 + b1Xi.]
26. Coefficients of Determination (r²) and Correlation (r) (continued). r² = 0, r = 0. [Scatter plot with fitted line Ŷi = b0 + b1Xi.]
27. Coefficients of Determination (r²) and Correlation (r) (continued). r² = 1, r = -1. [Scatter plot with fitted line Ŷi = b0 + b1Xi.]
28. Correlation: The Symbols. The population correlation coefficient ρ ("rho") measures the strength of the linear relationship between two variables. The sample correlation coefficient r estimates ρ based on a set of sample observations.
30. Inferences About the Slope: t Test for a Population Slope. Is there a linear relationship between X and Y? Null and alternative hypotheses: H0: β1 = 0 (no linear relationship); H1: β1 ≠ 0 (linear relationship). Test statistic: t = (b1 - β1) / s_b1, where s_b1 = S_YX / √SSX is the standard error of the slope, with df = n - 2.
31. Example: Produce Stores. Data for the 7 stores:

Store   Square Feet   Annual Sales ($000)
1           1,726           3,681
2           1,542           3,395
3           2,816           6,653
4           5,555           9,543
5           1,292           3,318
6           2,208           5,563
7           1,313           3,760

Estimated regression equation: Ŷi = 1636.415 + 1.487Xi. The slope of this model is 1.487. Is the square footage of the store affecting its annual sales?
32. Inferences About the Slope: t Test Example. H0: β1 = 0; H1: β1 ≠ 0; α = .05; df = 5; critical values: t = ±2.5706. Test statistic (from the Excel printout): t = b1 / s_b1 = 1.487 / 0.165 ≈ 9.01, which falls in the rejection region. Decision: reject H0. Conclusion: there is evidence of a linear relationship.
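The t statistic for the slope can be reproduced directly from the residuals; a minimal sketch with the same reconstructed store data:

```python
import numpy as np

sqft  = np.array([1726, 1542, 2816, 5555, 1292, 2208, 1313], dtype=float)
sales = np.array([3681, 3395, 6653, 9543, 3318, 5563, 3760], dtype=float)
n = len(sqft)

b1, b0 = np.polyfit(sqft, sales, 1)
resid = sales - (b0 + b1 * sqft)

s_yx = np.sqrt(np.sum(resid ** 2) / (n - 2))              # standard error of the estimate
s_b1 = s_yx / np.sqrt(np.sum((sqft - sqft.mean()) ** 2))  # standard error of the slope
t_stat = b1 / s_b1                                        # about 9.01, far beyond +-2.5706
```

Since 9.01 greatly exceeds the critical value 2.5706 for df = 5, H0 is rejected.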
33. Inferences About the Slope Using a Confidence Interval. Confidence interval estimate of the slope: b1 ± t(n-2) · s_b1. From the Excel printout for the produce stores, at the 95% level of confidence the confidence interval for the slope is (1.062, 1.911). It does not include 0. Conclusion: there is a significant linear relationship between annual sales and the size of the store.
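The same interval can be sketched in a few lines (same reconstructed data; the t value 2.5706 is the tabled value for df = 5 at 95% confidence):

```python
import numpy as np

sqft  = np.array([1726, 1542, 2816, 5555, 1292, 2208, 1313], dtype=float)
sales = np.array([3681, 3395, 6653, 9543, 3318, 5563, 3760], dtype=float)
n = len(sqft)

b1, b0 = np.polyfit(sqft, sales, 1)
resid = sales - (b0 + b1 * sqft)
s_yx  = np.sqrt(np.sum(resid ** 2) / (n - 2))             # standard error of the estimate
s_b1  = s_yx / np.sqrt(np.sum((sqft - sqft.mean()) ** 2)) # standard error of the slope

t_crit = 2.5706                 # t-table value for df = 5, 95% confidence
lo = b1 - t_crit * s_b1         # about 1.062
hi = b1 + t_crit * s_b1         # about 1.911
```

Because the interval (1.062, 1.911) excludes 0, the slope is significantly different from zero.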
34. Residual Analysis. Residual analysis uses numerical measures and plots to check the validity of the regression assumptions.
35. Linear Regression Assumptions. 1. X is linearly related to Y. 2. The variance of the errors is constant for each value of X (homoscedasticity). 3. The residual errors are normally distributed. 4. If the data are collected over time, the errors must be independent.
36. Residual Analysis for Linearity. [Residual plots of e versus X: a curved pattern indicates the relationship is not linear; a patternless band indicates it is linear.]
37. Residual Analysis for Homoscedasticity. [Residual plots of e versus X illustrating homoscedasticity and heteroscedasticity.]
38. Residual Analysis for Independence: The Durbin-Watson Statistic. It is used when the data are collected over time. It detects autocorrelation, that is, whether the residuals in one time period are related to the residuals in another time period; autocorrelation is a violation of the independence assumption. Calculate D and compare it to the value in Table E.8.
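The statistic itself is simple to compute from the residuals; a minimal sketch (the two residual series below are made-up illustrations, not from the slides):

```python
import numpy as np

def durbin_watson(residuals):
    """D = sum of squared successive differences of the residuals
    divided by the sum of squared residuals. D near 2 suggests no
    autocorrelation; D near 0 suggests positive autocorrelation
    (compare D against the tabled bounds, e.g. Table E.8)."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

d_pos = durbin_watson([1, 1, 1, -1, -1, -1])  # long runs of one sign -> D well below 2
d_neg = durbin_watson([1, -1, 1, -1, 1, -1])  # alternating signs -> D above 2
```

Long runs of same-signed residuals (positive autocorrelation) pull D toward 0, while alternating residuals push it above 2.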
40. Interval Estimates for Different Values of X. [Plot of the fitted line Ŷi = b0 + b1Xi showing a confidence interval for the mean of Y and a wider confidence interval for an individual Yi at a given X.]
41. Estimation of Predicted Values. Confidence interval estimate for the mean of Y at a particular Xi: Ŷi ± t(n-2) · S_YX · √(1/n + (Xi - X̄)² / SSX). The size of the interval varies according to the distance of Xi from the mean X̄; t is the table value with df = n - 2, and S_YX is the standard error of the estimate.
42. Estimation of Predicted Values. Confidence interval estimate for an individual response Yi at a particular Xi: Ŷi ± t(n-2) · S_YX · √(1 + 1/n + (Xi - X̄)² / SSX). The addition of 1 under the square root widens this interval relative to that for the mean of Y.
43. Example: Produce Stores. Data for the 7 stores:

Store   Square Feet   Annual Sales ($000)
1           1,726           3,681
2           1,542           3,395
3           2,816           6,653
4           5,555           9,543
5           1,292           3,318
6           2,208           5,563
7           1,313           3,760

Regression model obtained: Ŷi = 1636.415 + 1.487Xi. Predict the annual sales for a store with 2,000 square feet.
44. Estimation of Predicted Values: Example. Confidence interval estimate for the mean of Y. Find the 95% confidence interval for the average annual sales of stores of 2,000 square feet. Predicted sales: Ŷi = 1636.415 + 1.487(2000) = 4610.42 ($000). With t(n-2) = t5 = 2.5706, X̄ = 2350.29, and S_YX = 611.75, the confidence interval for the mean of Y is 4610.42 ± 612.66, i.e., approximately (3997.8, 5223.1).
45. Estimation of Predicted Values: Example. Confidence interval estimate for an individual Y. Find the 95% confidence interval for the annual sales of one particular store of 2,000 square feet. Predicted sales: Ŷi = 1636.415 + 1.487(2000) = 4610.42 ($000). With t5 = 2.5706, X̄ = 2350.29, and S_YX = 611.75, the confidence interval for an individual Y is 4610.42 ± 1687.69, i.e., approximately (2922.7, 6298.1). The interval for an individual store is much wider than the interval for the mean.
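Both interval half-widths can be sketched from the same reconstructed store data; note how the "1 +" inside the square root makes the interval for an individual Y much wider than the one for the mean:

```python
import numpy as np

sqft  = np.array([1726, 1542, 2816, 5555, 1292, 2208, 1313], dtype=float)
sales = np.array([3681, 3395, 6653, 9543, 3318, 5563, 3760], dtype=float)
n = len(sqft)

b1, b0 = np.polyfit(sqft, sales, 1)
resid  = sales - (b0 + b1 * sqft)
s_yx   = np.sqrt(np.sum(resid ** 2) / (n - 2))    # standard error of the estimate
ssx    = np.sum((sqft - sqft.mean()) ** 2)
t_crit = 2.5706                                    # t for df = 5, 95% confidence

x_new = 2000.0
y_hat = b0 + b1 * x_new                            # about 4610 ($000)

# The prediction interval for an individual Y adds "1 +" under the square root.
h = 1.0 / n + (x_new - sqft.mean()) ** 2 / ssx
half_mean  = t_crit * s_yx * np.sqrt(h)            # about 613: half-width for the mean of Y
half_indiv = t_crit * s_yx * np.sqrt(1.0 + h)      # about 1688: half-width for an individual Y
```

The resulting intervals match the two slides above: roughly (3998, 5223) for the mean and (2923, 6298) for a single store.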