# Example 16.7b: Estimating Seasonality with Regression


## COCACOLA.XLS

- We return to this data file, which contains the quarterly sales history of Coca-Cola from 1986 through the second quarter of 1996.
- Does a regression approach provide forecasts that are as accurate as those provided by the other seasonal methods in this chapter?

## Solution

- We illustrate a multiplicative approach, although an additive approach is also possible.
- The data setup is as follows:

## Solution -- continued

- Besides the Sales and Time variables, we need dummy variables for three of the four quarters and a Log_Sales variable.
- We can then use multiple regression, with Log_Sales as the response variable and Time, Q1, Q2, and Q3 as the explanatory variables.
- The regression output appears as follows:
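The data setup and regression described above can be sketched as follows. The sales figures here are made-up illustrative values, not the actual Coca-Cola data, and the model is fit with NumPy's least squares rather than StatPro:

```python
import numpy as np

# Hypothetical quarterly sales (illustrative only; the example uses COCACOLA.XLS).
sales = np.array([1735, 2245, 2534, 2155,
                  1897, 2407, 2726, 2362], dtype=float)  # two years, Q1-Q4

n = len(sales)
time = np.arange(1, n + 1)          # Time variable: 1, 2, 3, ...
quarter = (time - 1) % 4 + 1        # quarter of each observation: 1, 2, 3, 4, 1, ...

# Dummy variables for three of the four quarters (Q4 is the reference quarter).
q1 = (quarter == 1).astype(float)
q2 = (quarter == 2).astype(float)
q3 = (quarter == 3).astype(float)

# Multiplicative approach: regress Log_Sales on Time, Q1, Q2, and Q3.
X = np.column_stack([np.ones(n), time, q1, q2, q3])
y = np.log(sales)
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, b_time, b_q1, b_q2, b_q3 = coefs
```

With real data, `b_time` estimates the per-quarter trend in deseasonalized log sales, and each dummy coefficient measures that quarter's effect relative to the reference quarter.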

## Regression Output

## Interpreting the Output

- Of particular interest are the coefficients of the explanatory variables.
- Recall that for a log response variable, these coefficients can be interpreted as approximate percent changes in the original sales variable.
- Specifically, the coefficient of Time means that deseasonalized sales increase by about 2.4% per quarter.
- This pattern is quite comparable to the pattern of seasonal indexes we saw in the last two examples.
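The percent-change interpretation can be checked directly: for a log response, a coefficient b corresponds to an exact multiplicative change of exp(b) - 1 per unit of the explanatory variable, which is close to b itself when b is small. Using the Time coefficient of about 0.024 from the example:

```python
import math

# Time coefficient from the regression output (about 0.024 in the example).
b_time = 0.024

# Exact per-quarter percent change implied by a log response model.
pct_exact = math.exp(b_time) - 1
print(round(pct_exact * 100, 2))  # roughly 2.43, i.e. about 2.4% per quarter
```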

## Forecast Accuracy

- To compare the forecast accuracy of this method with that of the earlier examples, we must go through several steps manually:
  - The multiple regression procedure in StatPro provides fitted values and residuals for the log of sales.
  - We need to take antilogs of these fitted values to obtain forecasts of the original sales data, then subtract the forecasts from the actual sales to obtain forecast errors in Column K.
  - We can then use the formulas from StatPro's forecasting procedure to obtain the summary measures MAE, RMSE, and MAPE.
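The manual steps above can be sketched as follows. The actual sales and log-scale fitted values here are hypothetical placeholders, since the example's numbers come from the StatPro output for the Coca-Cola data:

```python
import math

# Hypothetical actual sales and log-scale fitted values (illustrative only).
actual = [1735.0, 2245.0, 2534.0, 2155.0]
fitted_log = [7.46, 7.72, 7.84, 7.67]

# Step 1: take antilogs of the log-scale fits to get sales-scale forecasts.
forecast = [math.exp(f) for f in fitted_log]

# Step 2: forecast errors are actual sales minus forecasts.
errors = [a - f for a, f in zip(actual, forecast)]

# Step 3: the usual summary measures.
n = len(errors)
mae = sum(abs(e) for e in errors) / n
rmse = math.sqrt(sum(e * e for e in errors) / n)
mape = sum(abs(e) / a for e, a in zip(errors, actual)) / n * 100
```

Note that RMSE is never smaller than MAE, so large individual errors show up more strongly in RMSE than in MAE.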

## Forecast Errors and Summary Measures

## Forecast Accuracy -- continued

- From the summary measures, it appears that the forecasts are not quite as accurate as those from the earlier seasonal methods.
- However, the plot below of the forecasts superimposed on the original data shows that the method again tracks the data very well.