Demand Forecasting: Four Fundamental Approaches; Time Series
- General Concepts
- Evaluating Forecasts – How 'good' is it?
- Forecasting Methods (Stationary): Cumulative Mean, Naïve Forecast, Moving Average, Exponential Smoothing
- Forecasting Methods (Trends & Seasonality): OLS Regression, Holt's Method, Exponential Method for Seasonal Data, Winter's Model
- Other Models
Demand Forecasting
- Forecasting is difficult – especially for the future
- Forecasts are always wrong
- The less aggregated, the lower the accuracy
- The longer the time horizon, the lower the accuracy
- The past is usually a pretty good place to start
- Everything exhibits seasonality of some sort
- A good forecast is not just a number – it should include a range, a description of the distribution, etc.
- Any analytical method should be supplemented by external information
- A forecast for one function in a company might not be useful to another function (Sales to Marketing to Manufacturing to Transportation)
Four Fundamental Approaches
Subjective
- Judgmental: sales force surveys, Delphi techniques, jury of experts
- Experimental: customer surveys, focus group sessions, test marketing
Objective
- Causal / Relational: econometric models, leading indicators, input-output models
- Time Series: a "black box" approach that uses the past to predict the future
Time Series Concepts
- Time Series – data collected on a regular, recurring basis and used to forecast
- Stationarity – values hover around a mean
- Trend – persistent movement in one direction
- Seasonality – periodic movement tied to the calendar
- Cycle – periodic movement not tied to the calendar
- Pattern + Noise – the predictable and random components of a time series
- Generating Process – the equation that creates the time series
- Accuracy and Bias – closeness to the actual values vs. a persistent tendency to over- or under-predict
- Fit versus Forecast – the tradeoff between fitting past data closely and forecasting future data usefully
- Forecast Optimality – a forecast is optimal when its error equals the random noise
Evaluating Forecasts
- Visual Review
- Errors
- Error Measures
- MPE and MAPE
- Tracking Signal
Demand Forecasting
Goal: generate the large number of short-term, SKU-level, locally disaggregated demand forecasts required for production, logistics, and sales to operate successfully.
Focus on:
- Forecasting product demand
- Mature products (not new product releases)
- Short time horizons (weeks, months, quarters, a year)
- Use of models to assist in the forecast
- Cases where demand for items is independent
Forecasting Terminology
[Chart: historical data over time, divided into the Initialization, ExPost Forecast, and Forecast regions]
Forecasting Terminology
"We are now looking at a future from here, and the future we were looking at in February now includes some of our past, and we can incorporate the past into our forecast, the first half, which is now the past and was the future when we issued our first forecast, is now over."
— Laura D'Andrea Tyson, Head of the President's Council of Economic Advisors, quoted in November 1993 in the Chicago Tribune, explaining why the Administration reduced its projection of economic growth to 2 percent from the 3.1 percent it predicted in February.
Forecasting Problem Suppose your fraternity/sorority house consumed the following number of cases of beer for the last 6 weekends: 8, 5, 7, 3, 6, 9 How many cases do you think your fraternity / sorority will consume this weekend?
Forecasting: Simple Moving Average Method
Using a three-period moving average, we would get the following forecast for week 7: (3 + 6 + 9) / 3 = 6 cases.
[Chart: weekly cases and the three-period moving average forecast, weeks 1–10]
Forecasting: Simple Moving Average Method
What if we used a two-period moving average? The forecast for week 7 becomes (6 + 9) / 2 = 7.5 cases.
[Chart: weekly cases and the two-period moving average forecast, weeks 1–10]
Forecasting: Simple Moving Average Method
The number of periods used in the moving average forecast affects the "responsiveness" of the forecasting method: the fewer the periods, the more responsive the forecast is to recent changes.
[Chart: weekly cases with 1-period, 2-period, and 3-period moving average forecasts, weeks 1–10]
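As a quick sketch (not from the original slides), the simple moving average forecast for the beer data can be computed as follows; the function name and structure are illustrative.

```python
def moving_average_forecast(history, n):
    """Forecast the next period as the average of the last n observations."""
    if len(history) < n:
        raise ValueError("need at least n observations")
    return sum(history[-n:]) / n

cases = [8, 5, 7, 3, 6, 9]                 # weekend beer consumption, weeks 1-6
print(moving_average_forecast(cases, 3))   # 3-period MA: (3 + 6 + 9) / 3 = 6.0
print(moving_average_forecast(cases, 2))   # 2-period MA: (6 + 9) / 2 = 7.5
```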
Forecasting Terminology
Applying this terminology to our problem using the Moving Average forecast:
[Chart: the moving average forecast for the beer data, with the Initialization, Model Evaluation (ExPost Forecast), and Forecast regions marked]
Forecasting: Weighted Moving Average Method
Rather than equal weights, it might make sense to use weights which favor more recent consumption values. With the Weighted Moving Average, we have to select weights that are individually greater than zero and less than 1, and as a group sum to 1.
Valid weights: (.5, .3, .2), (.6, .3, .1), (1/2, 1/3, 1/6)
Invalid weights: (.5, .2, .1), (.6, -.1, .5), (.5, .4, .3, .2)
Forecasting: Weighted Moving Average Method
A Weighted Moving Average forecast with weights of (1/6, 1/3, 1/2) is performed by multiplying each of the three most recent observations by its weight (the heaviest weight on the most recent value) and summing the results. How do you make the Weighted Moving Average forecast more responsive?
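A minimal sketch of the weighted moving average, assuming the heaviest weight applies to the most recent observation (consistent with the idea of favoring recent values); names are illustrative.

```python
def weighted_moving_average_forecast(history, weights):
    """Weighted sum of the most recent observations; weights are listed oldest-to-newest,
    must each lie strictly between 0 and 1, and must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9 and all(0 < w < 1 for w in weights)
    recent = history[-len(weights):]
    return sum(w * x for w, x in zip(weights, recent))

cases = [8, 5, 7, 3, 6, 9]
# Weights (1/6, 1/3, 1/2): the oldest of the last three weeks gets 1/6, the most recent gets 1/2.
print(round(weighted_moving_average_forecast(cases, [1/6, 1/3, 1/2]), 2))
# (1/6)*3 + (1/3)*6 + (1/2)*9 = 0.5 + 2.0 + 4.5 = 7.0
```

Making the forecast more responsive amounts to shifting more of the weight onto the most recent periods.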
Forecasting: Exponential Smoothing
Exponential Smoothing is designed to give the benefits of the Weighted Moving Average forecast without the cumbersome problem of specifying weights. In Exponential Smoothing there is only one parameter, α, the smoothing constant (between 0 and 1), and the forecast is updated each period as F(t+1) = αA(t) + (1 - α)F(t).
Forecasting: Exponential Smoothing
Initialization: the smoothing recursion must be seeded with a starting forecast before updating can begin (in the example that follows, F(2) = 6.5).
Forecasting: Exponential Smoothing
Using α = 0.4:

t   A(t)   F(t)
1   8      –
2   5      6.50
3   7      5.90
4   3      6.34
5   6      5.00
6   9      5.40
7   –      6.84

F(2) = 6.5 is the initialization; F(3) through F(6) fall in the ExPost Forecast region; F(7) is the forecast.
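A short sketch (not from the slides) of the simple exponential smoothing recursion that reproduces the table above; the starting value F(2) = 6.5 is taken from the table's initialization.

```python
def exponential_smoothing(actuals, alpha, initial_forecast):
    """Return the forecasts F(2), F(3), ..., F(n+1) for a list of n actuals."""
    forecasts = [initial_forecast]        # F(2), seeded by the initialization
    for a in actuals[1:]:                 # A(2), A(3), ... each produce the next forecast
        forecasts.append(alpha * a + (1 - alpha) * forecasts[-1])
    return forecasts

cases = [8, 5, 7, 3, 6, 9]
print([round(f, 2) for f in exponential_smoothing(cases, alpha=0.4, initial_forecast=6.5)])
# [6.5, 5.9, 6.34, 5.0, 5.4, 6.84]
```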
Forecasting: Exponential Smoothing
[Chart: the weight that exponential smoothing implicitly places on each past period, for α = 0.1, 0.3, 0.5, 0.7, and 0.9 — larger values of α concentrate the weight on the most recent periods]
Outliers: a single anomalous data point can pull the smoothed forecast away from the underlying pattern, so outliers should be identified and reviewed.
Data with Trends
[Chart: exponential smoothing applied to trending data for α = 0.3, 0.5, 0.7, and 0.9 — the smoothed forecasts lag behind the actual values A(t) for every choice of α]
Forecasting: Simple Linear Regression Model
Simple linear regression can be used to forecast data with trends:

D = a + bI

where D is the regressed forecast value (the dependent variable), I is the time period (the independent variable), a is the intercept of the regression line, and b is the slope of the regression line.
Forecasting: Simple Linear Regression Model
In linear regression, the intercept a and slope b are chosen to minimize the sum of the squared errors between the fitted line and the actual data.
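A minimal sketch of the least-squares fit behind the regression forecast (the slides show only results); the data and names here are illustrative.

```python
def linear_regression(y):
    """Fit y = a + b*t by ordinary least squares, with t = 1, 2, ..., n."""
    n = len(y)
    t = range(1, n + 1)
    t_bar = sum(t) / n
    y_bar = sum(y) / n
    b = sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y)) / \
        sum((ti - t_bar) ** 2 for ti in t)
    a = y_bar - b * t_bar
    return a, b

# Hypothetical trending series; the forecast for period n+1 is a + b*(n+1).
a, b = linear_regression([12, 15, 19, 24, 26, 31])
print(round(a, 2), round(b, 2), round(a + b * 7, 2))   # ≈ 7.87, 3.8, 34.47
```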
Limitations in Linear Regression Model
As with the simple moving average model, all data points count equally with simple linear regression — old observations influence the fitted line just as much as recent ones.
[Chart: trending data over periods 1–16]
Forecasting: Holt’s Trend Model
To forecast data with trends, we can use an exponential smoothing model with trend, frequently known as Holt's model:

L(t) = αA(t) + (1 - α)F(t)
T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1)
F(t+1) = L(t) + T(t)

We could use linear regression to initialize the model.
Holt’s Trend Model: Initialization
First, we'll initialize the model from a linear regression on the initialization data: L(4) = 60.1 (the regression line evaluated at t = 4) and T(4) = 9.9 (the regression slope).
Holt’s Trend Model: Updating
Using α = 0.3 and β = 0.4, with A(5) = 52 and F(5) = L(4) + T(4) = 60.1 + 9.9 = 70:

L(t) = αA(t) + (1 - α)F(t):  L(5) = 0.3(52) + 0.7(70) = 64.6
T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1):  T(5) = 0.4[64.6 - 60.1] + 0.6(9.9) = 7.74
F(t+1) = L(t) + T(t):  F(6) = 64.6 + 7.74 = 72.34
Holt’s Trend Model: Updating
With A(6) = 63 and F(6) = 72.34:

L(6) = 0.3(63) + 0.7(72.34) = 69.54
T(6) = 0.4[69.54 - 64.60] + 0.6(7.74) = 6.62
F(7) = 69.54 + 6.62 = 76.16
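A compact sketch (not from the slides) of one Holt update step, reproducing the numbers above; function and variable names are illustrative.

```python
def holt_update(level_prev, trend_prev, actual, alpha, beta):
    """One step of Holt's trend-corrected exponential smoothing."""
    forecast = level_prev + trend_prev                              # F(t) made last period
    level = alpha * actual + (1 - alpha) * forecast                 # L(t)
    trend = beta * (level - level_prev) + (1 - beta) * trend_prev   # T(t)
    return level, trend, level + trend                              # L(t), T(t), F(t+1)

L, T = 60.1, 9.9                                     # initialization from the regression
L, T, F = holt_update(L, T, actual=52, alpha=0.3, beta=0.4)
print(round(L, 2), round(T, 2), round(F, 2))         # 64.6  7.74  72.34
L, T, F = holt_update(L, T, actual=63, alpha=0.3, beta=0.4)
print(round(L, 2), round(T, 2), round(F, 2))         # 69.54 6.62  76.16
```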
Holt's Model Results
[Chart: actual data with the Holt's model forecast — Initialization, ExPost Forecast, and Forecast regions marked]
[Chart: the same data with a regression forecast shown for comparison]
Forecasting: Seasonal Model (No Trend)
Seasonal Model Formulas
L(t) = αA(t) / S(t-p) + (1 - α)L(t-1)
S(t) = γ[A(t) / L(t)] + (1 - γ)S(t-p)
F(t+1) = L(t) * S(t+1-p)

p is the number of periods in a season:
- Quarterly data: p = 4
- Monthly data: p = 12
Seasonal Model Initialization
A(t):
          2003   2004
Spring     16     16
Summer     27     26
Fall       39     43
Winter     22     23

Quarter averages: Spring 16.0, Summer 26.5, Fall 41.0, Winter 22.5
Average sales per quarter = 26.5

Seasonal factors S(t) = quarter average / overall average:
S(5) = 0.60 (Spring), S(6) = 1.00 (Summer), S(7) = 1.55 (Fall), S(8) = 0.85 (Winter)
L(8) = 26.5
Seasonal Model Forecasting
γ = 0.3, α = 0.4

              A(t)   L(t)    S(t)    F(t)
2004 Spring    16            0.60
     Summer    26            1.00
     Fall      43            1.55
     Winter    23    26.50   0.85
2005 Spring    14    25.18   0.59    16.00
     Summer    29    26.71   1.03    25.18
     Fall      41    26.62   1.55    41.32
     Winter    22    26.34   0.84    22.60
2006 Spring                          15.53
     Summer                          27.02
     Fall                            40.69
     Winter                          22.25
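A sketch (illustrative, not from the slides) of the seasonal no-trend updates that generate the 2005 rows; it uses unrounded seasonal factors (16/26.5, 26.5/26.5, 41/26.5, 22.5/26.5), so the printed values agree with the table up to small rounding differences.

```python
def seasonal_update(level, factors, actual, quarter, alpha, gamma):
    """One seasonal (no-trend) smoothing update; `factors` (one per quarter) is updated in place."""
    new_level = alpha * actual / factors[quarter] + (1 - alpha) * level
    factors[quarter] = gamma * actual / new_level + (1 - gamma) * factors[quarter]
    return new_level

level = 26.5                                                 # L(8): average sales per quarter
factors = [16 / 26.5, 26.5 / 26.5, 41 / 26.5, 22.5 / 26.5]   # Spring..Winter seasonal factors
for quarter, actual in enumerate([14, 29, 41, 22]):          # 2005 actuals
    forecast = level * factors[quarter]                      # F(t) = L(t-1) * last year's factor
    level = seasonal_update(level, factors, actual, quarter, alpha=0.4, gamma=0.3)
    print(quarter, round(forecast, 2), round(level, 2), round(factors[quarter], 2))
```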
Seasonal Model Forecasting
[Chart: quarterly sales and the seasonal model forecast, periods 1–16]
Forecasting: Winter’s Model for Data with Trend and Seasonal Components
L(t) = αA(t) / S(t-p) + (1 - α)[L(t-1) + T(t-1)]
T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1)
S(t) = γ[A(t) / L(t)] + (1 - γ)S(t-p)
F(t+1) = [L(t) + T(t)] S(t+1-p)
Seasonal-Trend Model Decomposition
To initialize Winter’s Model, we will use Decomposition Forecasting, which itself can be used to make forecasts.
Decomposition Forecasting
There are two ways to decompose forecast data with trend and seasonal components:
- Use regression to get the trend, then use the trend line to get the seasonal factors
- Use averaging to get the seasonal factors, "de-seasonalize" the data, then use regression to get the trend
Decomposition Forecasting
The following data contains trend and seasonal components:
Decomposition Forecasting
The seasonal factors are obtained by the same method used for the Seasonal Model forecast:

Period  Quarter   Sales
1       Spring     90
2       Summer    157
3       Fall      123
4       Winter     93
5       Spring    128
6       Summer    211
7       Fall      163
8       Winter    122

Quarter averages: Spring 109, Summer 184, Fall 143, Winter 107.5
Overall average = 135.9
Seasonal factors (quarter average / overall average): 0.80, 1.35, 1.05, 0.79 — the factors average to 1.00
Decomposition Forecasting
With the seasonal factors, the data can be de-seasonalized by dividing each observation by its seasonal factor. Regression on the de-seasonalized data will then give the trend.
Decomposition Forecasting Regression Results
Decomposition Forecast
Regression on the de-seasonalized data produces the following results: slope (m) = 7.71, intercept (b) = 101.2. Forecasts can then be performed using the equation [m·x + b] · (seasonal factor).
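A small sketch (illustrative, not from the slides) that applies the decomposition results: project the de-seasonalized trend, then multiply by the seasonal factor. It assumes period 9 is a Spring quarter, continuing the pattern of the data above.

```python
# Decomposition results: trend fitted to the de-seasonalized data.
slope, intercept = 7.71, 101.2
seasonal = {"Spring": 0.80, "Summer": 1.35, "Fall": 1.05, "Winter": 0.79}

def decomposition_forecast(period, quarter):
    """Forecast = (trend value at `period`) * (seasonal factor for `quarter`)."""
    return (slope * period + intercept) * seasonal[quarter]

# Forecasts for the next four quarters (periods 9-12).
for period, quarter in zip(range(9, 13), ["Spring", "Summer", "Fall", "Winter"]):
    print(period, quarter, round(decomposition_forecast(period, quarter), 1))
```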
Decomposition Forecasting
[Chart: quarterly sales and the decomposition forecast, periods 1–12]
Winter’s Model Initialization
We can use the decomposition forecast to define the following Winter's Model parameters:

L(n) = b + m·n
T(n) = m
S(j) = S(j-p)

So from our previous model we have: L(8) = 101.2 + 8(7.71) = 162.88, T(8) = 7.71, S(5) = 0.80, S(6) = 1.35, S(7) = 1.05, S(8) = 0.79.
Winter’s Model Example
α = 0.3, β = 0.4, γ = 0.2

Period  Quarter   Sales   L(t)     T(t)    S(t)    F(t)
1       Spring     90
2       Summer    157
3       Fall      123
4       Winter     93
5       Spring    128                      0.80
6       Summer    211                      1.35
7       Fall      163                      1.05
8       Winter    122    162.88    7.71    0.79
9       Spring    152    176.41   10.04    0.81   136.47
10      Summer    303    197.85   14.60    1.39   251.71
11      Fall      232    215.00   15.62    1.06   223.07
12      Winter    171    226.37   13.92    0.78   182.19
13      Spring                                    195.19
14      Summer                                    352.41
15      Fall                                      283.09
16      Winter                                    220.87
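A sketch (not from the slides) of one year of Winter's Model updates, starting from the decomposition-based initialization above; it reproduces the table values up to rounding.

```python
def winters_update(level, trend, factors, actual, quarter, alpha, beta, gamma):
    """One Winter's Model update; the seasonal `factors` list is updated in place."""
    forecast = (level + trend) * factors[quarter]                    # F(t), made before seeing A(t)
    new_level = alpha * actual / factors[quarter] + (1 - alpha) * (level + trend)
    new_trend = beta * (new_level - level) + (1 - beta) * trend
    factors[quarter] = gamma * actual / new_level + (1 - gamma) * factors[quarter]
    return new_level, new_trend, forecast

level, trend = 162.88, 7.71                  # L(8), T(8) from the decomposition fit
factors = [0.80, 1.35, 1.05, 0.79]           # S(5)..S(8): Spring..Winter
for quarter, actual in enumerate([152, 303, 232, 171]):    # periods 9-12
    level, trend, forecast = winters_update(level, trend, factors, actual, quarter,
                                            alpha=0.3, beta=0.4, gamma=0.2)
    print(quarter + 9, round(level, 2), round(trend, 2),
          round(factors[quarter], 2), round(forecast, 2))

# Multi-step forecasts for periods 13-16: F(12+k) = [L(12) + k*T(12)] * S(that quarter)
for k, quarter in enumerate(range(4), start=1):
    print(12 + k, round((level + k * trend) * factors[quarter], 2))
```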
Winter’s Model Example
[Chart: quarterly sales and the Winter's Model forecast, periods 1–16]
Evaluating Forecasts: "Trust, but Verify" — Ronald W. Reagan
- Computer software gives us the ability to mess up more data on a greater scale more efficiently.
- While software like SAP can automatically select models and model parameters for a set of data, and usually does so correctly, when the data is important a human should review the model results.
- One of the best tools is the human eye.
Visual Review
How would you evaluate this forecast?
[Chart: actual data and forecast, periods 1–15]
Forecast Evaluation
The forecast is evaluated over the ExPost Forecast region. Do not include initialization data in the evaluation.
[Chart: Initialization, ExPost Forecast, and Forecast regions marked on the data]
Errors
All error measures compare the forecast model to the actual data over the ExPost Forecast region.
[Chart: forecast errors in the ExPost Forecast region]
Error Measures
All error measures are based on the comparison of forecast values to actual values in the ExPost Forecast region — do not include data from the initialization.
Bias and MAD
Bias = Σ e(t) / n and MAD = Σ |e(t)| / n, where e(t) = F(t) - A(t) and the sums run over the n periods of the ExPost Forecast region (the same error convention used in the tracking signal table below).
Bias and MAD
- Bias tells us whether we have a tendency to over- or under-forecast. If our forecasts are "in the middle" of the data, the errors should be equally positive and negative and should sum to roughly zero.
- MAD (Mean Absolute Deviation) is the average error magnitude, ignoring whether each error is positive or negative.
- Errors are bad, and the closer an error measure is to zero, the better the forecast is likely to be.
- Error measures tell how well the method worked in the ExPost Forecast region; how well the forecast will work in the future is uncertain.
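A short sketch (illustrative names) of the Bias and MAD calculations over the ExPost Forecast region, using errors defined as forecast minus actual; the sample data are the ExPost values from the exponential smoothing example earlier.

```python
def bias(forecasts, actuals):
    """Mean error (forecast - actual): positive means a tendency to over-forecast."""
    errors = [f - a for f, a in zip(forecasts, actuals)]
    return sum(errors) / len(errors)

def mad(forecasts, actuals):
    """Mean Absolute Deviation: the average size of the errors, ignoring sign."""
    return sum(abs(f - a) for f, a in zip(forecasts, actuals)) / len(forecasts)

actuals   = [7, 3, 6, 9]            # weeks 3-6 of the beer example
forecasts = [5.9, 6.34, 5.0, 5.4]   # exponential smoothing ExPost forecasts
print(round(bias(forecasts, actuals), 2), round(mad(forecasts, actuals), 2))   # ≈ -0.59 2.26
```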
Absolute vs. Relative Measures
Forecasts were made for two sets of data. Which forecast was better?
Data Set 1: Bias = 18.72, MAD = 43.99
Data Set 2: Bias = 182, MAD = 912.5
MPE and MAPE
- When the numbers in a data set are larger in magnitude, the error measures are likely to be larger as well, even though the fit may be no worse.
- Mean Percentage Error (MPE) and Mean Absolute Percentage Error (MAPE) are relative forms of the Bias and MAD, respectively.
- MPE and MAPE can be used to compare forecasts for different sets of data.
MPE and MAPE
Mean Percentage Error: MPE = (1/n) Σ [F(t) - A(t)] / A(t) × 100%
Mean Absolute Percentage Error: MAPE = (1/n) Σ |F(t) - A(t)| / A(t) × 100%
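A companion sketch to the Bias and MAD functions above (illustrative, same forecast-minus-actual error convention), expressing the errors as percentages of the actuals.

```python
def mpe(forecasts, actuals):
    """Mean Percentage Error (in percent): the relative form of the Bias."""
    return 100 * sum((f - a) / a for f, a in zip(forecasts, actuals)) / len(forecasts)

def mape(forecasts, actuals):
    """Mean Absolute Percentage Error (in percent): the relative form of the MAD."""
    return 100 * sum(abs(f - a) / a for f, a in zip(forecasts, actuals)) / len(forecasts)

actuals   = [7, 3, 6, 9]
forecasts = [5.9, 6.34, 5.0, 5.4]
print(round(mpe(forecasts, actuals), 1), round(mape(forecasts, actuals), 1))   # ≈ 9.7 45.9
```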
MPE and MAPE: Data Set 1 vs. Data Set 2
[Tables/charts showing the MPE and MAPE calculations for Data Set 1, Data Set 2, and the side-by-side comparison]
Tracking Signal
What's happened in this situation? How could we detect it in an automatic forecasting environment?
[Chart: actuals and forecast, periods 1–15]
Tracking Signal The tracking signal can be calculated after each actual sales value is recorded. The tracking signal is calculated as: The tracking signal is a relative measure, like MPE and MAPE, so it can be compared to a set value (typically 4 or 5) to identify when forecasting parameters and/or models need to be changed.
66
Tracking Signal

t    A(t)   F(t)   F(t)-A(t)   RSFE    |F(t)-A(t)|   Σ|error|   MAD     TS
1    15.1
2    16.8   15.9
3    11.4   14.6     3.2        3.2        3.2          3.2     3.20    1.00
4    18.7   15.8    -2.9        0.3        2.9          6.1     3.05    0.10
5    11.8   14.6     2.8        3.1        2.8          8.9     2.97    1.04
6    17.2   15.4    -1.8        1.3        1.8         10.7     2.68    0.49
7    12.9   14.6     1.7        3.0        1.7         12.4     2.48    1.21
8    22.9   17.1    -5.8       -2.8        5.8         18.2     3.03   -0.92
9    24.0   19.2    -4.8       -7.6        4.8         23.0     3.29   -2.31
10   32.6   23.2    -9.4      -17.0        9.4         32.4     4.05   -4.20
11   38.5   27.8   -10.7      -27.7       10.7         43.1     4.79   -5.78
12   36.6   30.4    -6.2      -33.9        6.2         49.3     4.93   -6.88
13   40.6   33.5    -7.1      -41.0        7.1         56.4     5.13   -8.00
14   51.0   38.7   -12.3      -53.3       12.3         68.7     5.73   -9.31
15   51.9   42.7    -9.2      -62.5        9.2         77.9     5.99  -10.43
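A sketch (illustrative) that computes the running tracking signal with the same sign convention as the table: errors are forecast minus actual, accumulated from period 3 on. The forecast values for periods 5 and 7 are inferred from the table's error columns.

```python
def tracking_signals(forecasts, actuals):
    """Return the tracking signal after each period: running RSFE / running MAD."""
    rsfe, abs_sum, signals = 0.0, 0.0, []
    for n, (f, a) in enumerate(zip(forecasts, actuals), start=1):
        error = f - a
        rsfe += error
        abs_sum += abs(error)
        signals.append(rsfe / (abs_sum / n))     # MAD = running mean of |error|
    return signals

# Periods 3-15 from the table above.
actuals   = [11.4, 18.7, 11.8, 17.2, 12.9, 22.9, 24.0, 32.6, 38.5, 36.6, 40.6, 51.0, 51.9]
forecasts = [14.6, 15.8, 14.6, 15.4, 14.6, 17.1, 19.2, 23.2, 27.8, 30.4, 33.5, 38.7, 42.7]
print([round(ts, 2) for ts in tracking_signals(forecasts, actuals)])
# ≈ [1.0, 0.1, 1.04, 0.49, 1.21, -0.92, -2.31, -4.2, -5.78, -6.88, -8.0, -9.31, -10.43]
```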
Tracking Signal
[Chart: actuals and forecast, periods 1–15, with TS = -5.78 marked]