Presentation on theme: "Forecasting" — Presentation transcript:
1 Forecasting: Forecasting Terminology, Simple Moving Average, Weighted Moving Average, Exponential Smoothing, Simple Linear Regression Model, Holt's Trend Model, Seasonal Model (No Trend), Winters' Model for Data with Trend and Seasonal Components
2 Evaluating Forecasts: Visual Review, Errors, Error Measures, MPE and MAPE, Tracking Signal
3 Forecasting Terminology (timeline diagram): the Historical Data are divided into an Initialization region and an Ex-Post Forecast region.
4 Forecasting Terminology: "We are now looking at a future from here, and the future we were looking at in February now includes some of our past, and we can incorporate the past into our forecast. 1993, the first half, which is now the past and was the future when we issued our first forecast, is now over." Laura D'Andrea Tyson, Head of the President's Council of Economic Advisors, quoted in November 1993 in the Chicago Tribune, explaining why the Administration reduced its projections of economic growth to 2 percent from the 3.1 percent it predicted in February.
5 Forecasting Problem: Suppose your fraternity/sorority house consumed the following number of cases of beer for the last 6 weekends: 8, 5, 7, 3, 6, 9. How many cases do you think your fraternity/sorority will consume this weekend?
6 Forecasting: Simple Moving Average Method. Using a three-period moving average on the Week/Cases data, we would get the following forecast: F(7) = (3 + 6 + 9) / 3 = 6 cases.
7 Forecasting: Simple Moving Average Method. What if we used a two-period moving average? F(7) = (6 + 9) / 2 = 7.5 cases.
8 Forecasting: Simple Moving Average Method. The number of periods used in the moving average forecast affects the responsiveness of the forecasting method: fewer periods react faster to recent changes, more periods smooth more (chart compares 1-period, 2-period, and 3-period forecasts on the Week/Cases data).
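As a minimal sketch, the simple moving average forecasts above can be computed in Python (the function name is my own; the data are the beer cases from the slides):

```python
def moving_average_forecast(data, periods):
    """Forecast the next period as the mean of the last `periods` observations."""
    if len(data) < periods:
        raise ValueError("not enough history for the chosen window")
    return sum(data[-periods:]) / periods

# Beer consumption for the last 6 weekends, from the slides
cases = [8, 5, 7, 3, 6, 9]

print(moving_average_forecast(cases, 3))  # (3 + 6 + 9) / 3 = 6.0
print(moving_average_forecast(cases, 2))  # (6 + 9) / 2 = 7.5
print(moving_average_forecast(cases, 1))  # just repeats the last value: 9.0
```

Note how the shorter windows chase the most recent values, which is exactly the responsiveness trade-off the slide illustrates.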
9 Forecasting Terminology. Applying this terminology to our problem using the Moving Average forecast (chart regions: Initialization, Ex-Post Forecast, Model Evaluation).
10 Forecasting: Weighted Moving Average Method. Rather than equal weights, it might make sense to use weights which favor more recent consumption values. With the Weighted Moving Average, we have to select weights that are individually greater than zero and less than 1, and as a group sum to 1. Valid weights: (.5, .3, .2), (.6, .3, .1), (1/2, 1/3, 1/6). Invalid weights: (.5, .2, .1) (sums to .8), (.6, -.1, .5) (contains a negative weight), (.5, .4, .3, .2) (sums to 1.4).
11 Forecasting: Weighted Moving Average Method. A Weighted Moving Average forecast with weights of (1/6, 1/3, 1/2), putting the heaviest weight on the most recent week, is performed as follows: F(7) = (1/6)(3) + (1/3)(6) + (1/2)(9) = 0.5 + 2 + 4.5 = 7 cases. How do you make the Weighted Moving Average forecast more responsive?
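The same calculation can be sketched in Python (function name my own; weights are listed oldest-to-newest, matching the slide's example):

```python
def weighted_moving_average(data, weights):
    """Weighted moving average forecast; weights listed oldest-to-newest."""
    if any(w <= 0 for w in weights) or abs(sum(weights) - 1) > 1e-9:
        raise ValueError("weights must be positive and sum to 1")
    recent = data[-len(weights):]
    return sum(w * x for w, x in zip(weights, recent))

cases = [8, 5, 7, 3, 6, 9]
# Weights (1/6, 1/3, 1/2): the heaviest weight falls on the newest value
print(weighted_moving_average(cases, [1/6, 1/3, 1/2]))  # 0.5 + 2 + 4.5 = 7.0
```

To make the forecast more responsive, shift more of the weight onto the most recent periods.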
12 Forecasting: Exponential Smoothing. Exponential Smoothing is designed to give the benefits of the Weighted Moving Average forecast without the cumbersome problem of specifying weights. In Exponential Smoothing, there is only one parameter: α, the smoothing constant (between 0 and 1).
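The update formula itself is on slides not included in this transcript; the sketch below uses the standard simple exponential smoothing recursion F(t+1) = αA(t) + (1-α)F(t). The α value and the initial forecast are illustrative choices, not from the slides:

```python
def exponential_smoothing(data, alpha, initial_forecast):
    """Standard simple exponential smoothing: F(t+1) = alpha*A(t) + (1-alpha)*F(t)."""
    forecast = initial_forecast
    for actual in data:
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecast

cases = [8, 5, 7, 3, 6, 9]
# alpha = 0.4 and the initial forecast of 8.0 are illustrative assumptions
print(round(exponential_smoothing(cases, 0.4, 8.0), 2))  # 6.96
```

A larger α weights recent data more heavily, playing the same role as the shorter window in the moving average methods.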
20 Forecasting: Simple Linear Regression Model. Simple linear regression can be used to forecast data with trends: D = a + bI, where D is the regressed forecast value (the dependent variable in the model), I is the time period (the independent variable), a is the intercept value of the regression line, and b is the slope of the regression line.
21 Forecasting: Simple Linear Regression Model. In linear regression, the sum of the squared errors is minimized; the error is the vertical distance from each data point to the regression line.
22 Forecasting: Simple Linear Regression Model
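The slide's regression formulas are not legible in this transcript; the sketch below implements the standard least-squares fit for a trend line D = a + bI against periods 1..n (function name and data are my own, for illustration):

```python
def fit_trend_line(y):
    """Least-squares fit of y against periods I = 1..n.
    Returns (a, b) for the line D = a + b*I minimizing the squared errors."""
    n = len(y)
    x = list(range(1, n + 1))
    x_mean = sum(x) / n
    y_mean = sum(y) / n
    b = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
    b /= sum((xi - x_mean) ** 2 for xi in x)
    a = y_mean - b * x_mean
    return a, b

# Illustrative data with an upward trend (not from the slides)
a, b = fit_trend_line([12, 15, 13, 18, 20, 21])
print(round(a, 2), round(b, 2))  # 10.0 1.86
forecast_period_7 = a + b * 7    # D = a + b*I for the next period
```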
23 Limitations of the Linear Regression Model. As with the simple moving average model, all data points count equally with simple linear regression.
24 Forecasting: Holt's Trend Model. To forecast data with trends, we can use an exponential smoothing model with trend, frequently known as Holt's model: L(t) = αA(t) + (1 - α)F(t); T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1); F(t+1) = L(t) + T(t). We could use linear regression to initialize the model.
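One update step of Holt's model can be sketched as follows. The initialization L(4) = 60.1, T(4) = 9.9 comes from the next slide; the actual value A(5) = 72.0 and the α/β values are illustrative assumptions, not from the slides:

```python
def holt_update(level, trend, actual, alpha, beta):
    """One step of Holt's model:
    L(t) = alpha*A(t) + (1-alpha)*F(t), with F(t) = L(t-1) + T(t-1)
    T(t) = beta*(L(t) - L(t-1)) + (1-beta)*T(t-1)"""
    prior_forecast = level + trend
    new_level = alpha * actual + (1 - alpha) * prior_forecast
    new_trend = beta * (new_level - level) + (1 - beta) * trend
    return new_level, new_trend

# Initialization from the slides: L(4) = 60.1, T(4) = 9.9
# A(5) = 72.0, alpha = 0.2, beta = 0.3 are illustrative assumptions
L, T = holt_update(60.1, 9.9, actual=72.0, alpha=0.2, beta=0.3)
print(round(L, 2), round(T, 2))  # 70.4 10.02
forecast_6 = L + T               # F(6) = L(5) + T(5)
```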
25 Holt's Trend Model: Initialization. First, we'll initialize the model from the regression line: the level is the regression value at t = 4 and the trend is the regression slope, giving L(4) = 60.1 and T(4) = 9.9.
28 Holt's Model Results (chart regions: Initialization, Ex-Post Forecast).
29 Holt's Model Results (chart regions: Regression, Initialization, Ex-Post Forecast, Forecast).
30 Forecasting: Seasonal Model (No Trend)
31 Seasonal Model Formulas: L(t) = α[A(t) / S(t-p)] + (1 - α)L(t-1); S(t) = γ[A(t) / L(t)] + (1 - γ)S(t-p); F(t+1) = L(t) · S(t+1-p), where p is the number of periods in a season (quarterly data: p = 4; monthly data: p = 12).
32 Seasonal Model Initialization. Two years of quarterly sales A(t): 2003: Spring 16, Summer 27, Fall 39, Winter 22; the next year: Spring 16, Summer 26, Fall 43, Winter 23. Average sales per quarter = 26.5, so L(8) = 26.5. Averaging each quarter across the two years and dividing by 26.5 gives the seasonal factors: S(5) = 0.60 (Spring), S(6) = 1.00 (Summer), S(7) = 1.55 (Fall), S(8) = 0.85 (Winter).
33 Seasonal Model Forecasting. Table columns: A(t), L(t), Seasonal Factor S(t), F(t). New quarterly actuals arrive one at a time: Spring 14, Summer 29, Fall 41, Winter (not yet observed); the 2004 rows of L(t), S(t), and F(t) are filled in with the update formulas as each actual is recorded.
Seasonal Model Forecasting
35 Forecasting: Winters' Model for Data with Trend and Seasonal Components: L(t) = α[A(t) / S(t-p)] + (1 - α)[L(t-1) + T(t-1)]; T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1); S(t) = γ[A(t) / L(t)] + (1 - γ)S(t-p); F(t+1) = [L(t) + T(t)] · S(t+1-p).
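One Winters' update can be sketched as below. The seasonal factors, T(8) = 7.71, the smoothing constants (0.3, 0.4, 0.2), and the first new actual A(9) = 152 come from the later slides; the level L(8) = 162.9 is my own reconstruction of b + 8m, since the intercept value is not legible in this transcript:

```python
def winters_update(level, trend, S, actual, t, p, alpha, beta, gamma):
    """One step of Winters' model:
    L(t) = alpha*A(t)/S(t-p) + (1-alpha)*(L(t-1) + T(t-1))
    T(t) = beta*(L(t) - L(t-1)) + (1-beta)*T(t-1)
    S(t) = gamma*A(t)/L(t) + (1-gamma)*S(t-p)"""
    new_level = alpha * actual / S[t - p] + (1 - alpha) * (level + trend)
    new_trend = beta * (new_level - level) + (1 - beta) * trend
    S[t] = gamma * actual / new_level + (1 - gamma) * S[t - p]
    return new_level, new_trend

# Decomposition results from the slides (p = 4); L(8) is my reconstruction
S = {5: 0.80, 6: 1.35, 7: 1.05, 8: 0.79}
L, T = 162.9, 7.71
# First new actual from the example slide: A(9) = 152 (Spring)
L, T = winters_update(L, T, S, actual=152, t=9, p=4,
                      alpha=0.3, beta=0.4, gamma=0.2)
forecast_10 = (L + T) * S[6]   # F(t+1) = (L(t) + T(t)) * S(t+1-p)
```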
36 Seasonal-Trend Model Decomposition. To initialize Winters' Model, we will use Decomposition Forecasting, which itself can be used to make forecasts.
37 Decomposition Forecasting. There are two ways to decompose forecast data with trend and seasonal components: use regression to get the trend, then use the trend line to get seasonal factors; or use averaging to get seasonal factors, de-seasonalize the data, then use regression to get the trend.
38 Decomposition Forecasting The following data contains trend and seasonal components:
39 Decomposition Forecasting. The seasonal factors are obtained by the same method used for the Seasonal Model forecast. Period/Quarter/Sales: 1 Spring 90; 2 Summer 157; 3 Fall 123; 4 Winter 93; 5 Spring 128; 6 Summer 211; 7 Fall 163; 8 Winter 122. Overall average = 135.9. Averaging each quarter across the two years and dividing by 135.9 gives the seasonal factors: Spring 0.80, Summer 1.35, Fall 1.05, Winter 0.79.
40 Decomposition Forecasting. With the seasonal factors, the data can be de-seasonalized by dividing each observation by its seasonal factor. Regression on the de-seasonalized data will give the trend.
42 Decomposition Forecast. Regression on the de-seasonalized data produces the following results: Slope (m) = 7.71; Intercept (b) = … Forecasts can be performed using the following equation: [m·x + b] · (seasonal factor).
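The averaging variant of decomposition can be sketched end-to-end on the slide deck's quarterly data; it reproduces the slide's seasonal factors and the slope of 7.71 (the function name is my own, and the printed intercept is a recomputation, since the slide's value is not legible here):

```python
def decompose_and_fit(sales, p):
    """Averaging method: seasonal factors from quarter averages, then a
    least-squares trend on the de-seasonalized series. Returns (factors, m, b)."""
    n = len(sales)
    overall = sum(sales) / n
    # Seasonal factor for each position within the season
    factors = []
    for q in range(p):
        qtr = [sales[i] for i in range(q, n, p)]
        factors.append((sum(qtr) / len(qtr)) / overall)
    # De-seasonalize by dividing each observation by its factor
    deseason = [sales[i] / factors[i % p] for i in range(n)]
    # Least-squares trend on periods x = 1..n
    x_mean = (n + 1) / 2
    y_mean = sum(deseason) / n
    m = sum((i + 1 - x_mean) * (y - y_mean) for i, y in enumerate(deseason))
    m /= sum((i + 1 - x_mean) ** 2 for i in range(n))
    b = y_mean - m * x_mean
    return factors, m, b

# Quarterly sales from the slides (Spring..Winter, two years)
sales = [90, 157, 123, 93, 128, 211, 163, 122]
factors, m, b = decompose_and_fit(sales, p=4)
print([round(f, 2) for f in factors])   # [0.8, 1.35, 1.05, 0.79]
print(round(m, 2))                      # 7.71
forecast_9 = (m * 9 + b) * factors[0]   # [m*x + b] * (seasonal factor)
```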
44 Winters' Model Initialization. We can use the decomposition forecast to define the following Winters' Model parameters: L(n) = b + m(n), T(n) = m, S(j) = S(j-p). So from our previous model, we have L(8) = b + 8(7.71), T(8) = 7.71, S(5) = 0.80, S(6) = 1.35, S(7) = 1.05, S(8) = 0.79.
45 Winters' Model Example (α = 0.3, β = 0.4, γ = 0.2). Table columns: Period, Quarter, Sales, L(t), T(t), S(t), F(t). Historical data: 1 Spring 90; 2 Summer 157; 3 Fall 123; 4 Winter 93; 5 Spring 128; 6 Summer 211; 7 Fall 163; 8 Winter 122. New actuals: 9 Spring 152; 10 Summer 303; 11 Fall 232; 12 Winter (not yet observed). Periods 13 Spring through 16 Winter are to be forecast.
Winters' Model Example
47 Evaluating Forecasts. "Trust, but verify." Ronald W. Reagan. Computer software gives us the ability to mess up more data on a greater scale more efficiently. While software like SAP can automatically select models and model parameters for a set of data, and usually does so correctly, when the data is important a human should review the model results. One of the best tools is the human eye.
48 Visual Review. How would you evaluate this forecast?
49 Forecast Evaluation. The forecast is evaluated over the Ex-Post Forecast region, not the Initialization region: do not include initialization data in the evaluation.
50 Errors. All error measures compare the forecast model to the actual data over the Ex-Post Forecast region.
51 Error Measures. All error measures are based on the comparison of forecast values to actual values in the Ex-Post Forecast region; do not include data from initialization.
52 Bias and MAD
53 Bias and MAD. Bias tells us whether we have a tendency to over- or under-forecast: if our forecasts are in the middle of the data, then the errors should be equally positive and negative, and should sum to 0. MAD (Mean Absolute Deviation) is the average error, ignoring whether the error is positive or negative. Errors are bad, and the closer to zero an error is, the better the forecast is likely to be. Error measures tell how well the method worked in the Ex-Post Forecast region; how well the forecast will work in the future is uncertain.
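A sketch of the two measures, using error = actual minus forecast. Here Bias is computed as the mean error; some texts instead use the running sum of errors. The data are illustrative, not from the slides:

```python
def bias_and_mad(actuals, forecasts):
    """Bias: mean signed error (A - F). MAD: mean absolute deviation.
    Both are computed over the ex-post forecast region only."""
    errors = [a - f for a, f in zip(actuals, forecasts)]
    n = len(errors)
    bias = sum(errors) / n
    mad = sum(abs(e) for e in errors) / n
    return bias, mad

# Illustrative values, not from the slides
actuals = [100, 110, 120, 130]
forecasts = [95, 115, 118, 128]
bias, mad = bias_and_mad(actuals, forecasts)
print(bias, mad)  # 1.0 3.5
```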
54 Absolute vs. Relative Measures. Forecasts were made for two sets of data. Which forecast was better? Data Set 1: Bias = …, MAD = …; Data Set 2: Bias = 182, MAD = … (charts of the two data sets).
55 MPE and MAPE When the numbers in a data set are larger in magnitude, then the error measures are likely to be large as well, even though the fit might not be as good. Mean Percentage Error (MPE) and Mean Absolute Percentage Error (MAPE) are relative forms of the Bias and MAD, respectively. MPE and MAPE can be used to compare forecasts for different sets of data.
56 MPE and MAPE. Mean Percentage Error: MPE = (100%/n) Σ e(t)/A(t). Mean Absolute Percentage Error: MAPE = (100%/n) Σ |e(t)|/A(t), where e(t) = A(t) - F(t) and the sums run over the n periods of the Ex-Post Forecast region.
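A sketch of both percentage measures, assuming the error convention e(t) = A(t) - F(t); the data are illustrative, not from the slides:

```python
def mpe_and_mape(actuals, forecasts):
    """MPE: mean of 100*(A - F)/A (signed, the relative form of Bias).
    MAPE: mean of 100*|A - F|/A (the relative form of MAD)."""
    n = len(actuals)
    pct = [100 * (a - f) / a for a, f in zip(actuals, forecasts)]
    mpe = sum(pct) / n
    mape = sum(abs(p) for p in pct) / n
    return mpe, mape

# Illustrative values, not from the slides
mpe, mape = mpe_and_mape([100, 110, 120, 130], [95, 115, 118, 128])
print(round(mpe, 1), round(mape, 1))  # 0.9 3.2
```

Because both are scaled by the actual values, they can be compared across data sets of different magnitudes, unlike Bias and MAD.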
57 MPE and MAPE Data Set 1
58 MPE and MAPE Data Set 2
59 MPE and MAPE Data Set 2 Data Set 1
60 Tracking Signal. What's happened in this situation? How could we detect this in an automatic forecasting environment?
61 Tracking Signal. The tracking signal can be calculated after each actual sales value is recorded. The tracking signal is calculated as the running sum of forecast errors divided by the MAD. The tracking signal is a relative measure, like MPE and MAPE, so it can be compared to a set value (typically 4 or 5) to identify when forecasting parameters and/or models need to be changed.
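A sketch of the tracking signal, recomputed after each new actual, using the common running-sum-of-errors over MAD form. The data are illustrative: the forecast lags a sudden jump in demand, so the errors stay positive and the signal climbs toward the alarm threshold:

```python
def tracking_signal(actuals, forecasts):
    """Tracking signal = running sum of forecast errors (A - F) / MAD,
    recalculated after each new actual is recorded."""
    errors, signals = [], []
    for a, f in zip(actuals, forecasts):
        errors.append(a - f)
        mad = sum(abs(e) for e in errors) / len(errors)
        signals.append(sum(errors) / mad if mad else 0.0)
    return signals

# Illustrative data: demand jumps at period 3 and the forecast lags behind
actuals = [100, 102, 140, 145, 150]
forecasts = [101, 101, 103, 110, 118]
signals = tracking_signal(actuals, forecasts)
print(round(signals[-1], 1))  # 4.9, near the typical alarm threshold of 4 or 5
```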