Forecasting Operations Analysis and Improvement 2017 Spring Dr. Tai-Yue Wang Industrial and Information Management Department National Cheng Kung University
Outline Introduction to Forecasting Subjective Forecasting Methods Objective Forecasting Methods Evaluating Forecasts Methods for Forecasting Stationary Series Trend-Based Methods Methods for Seasonal Series Box-Jenkins Models
Overview of Operations Planning Activities
Introduction to Forecasting What is forecasting? Primary Function is to Predict the Future Why are we interested? Affects the decisions we make today Examples: who uses forecasting in their jobs? forecast demand for products and services forecast availability of manpower forecast inventory and materiel needs daily
Introduction to Forecasting -- What Makes a Good Forecast It should be timely It should be as accurate as possible It should be reliable It should be in meaningful units It should be presented in writing The method should be easy to use and understand.
Introduction to Forecasting – The Time Horizon
Introduction to Forecasting – Characteristics of Forecasts They are usually wrong! A good forecast is more than a single number mean and standard deviation range (high and low) Aggregate forecasts are usually more accurate Accuracy erodes as we go further into the future. Forecasts should not be used to the exclusion of known information
Subjective Forecasting Methods Sales Force Composites Aggregation of sales personnel estimates Customer Surveys Jury of Executive Opinion The Delphi Method Individual opinions are compiled and reconsidered. Repeat until an overall group consensus is (hopefully) reached.
Objective Forecasting Methods Two primary methods: causal models and time series methods Causal Models Let Y be the quantity to be forecasted and (X1, X2, . . . , Xn) be n variables that have predictive power for Y. A causal model is Y = f (X1, X2, . . . , Xn). A typical relationship is a linear one. That is, Y = a0 + a1X1 + . . . + an Xn.
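The linear causal model above can be sketched with a short ordinary-least-squares fit for the single-predictor case Y = a0 + a1X1. The predictor and demand values below are purely hypothetical, chosen so the fit is exact.

```python
# A minimal causal-model sketch: fit Y = a0 + a1*X1 by least squares.
def fit_simple_ols(x, y):
    """Return intercept a0 and slope a1 minimizing squared error."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    a1 = sxy / sxx
    a0 = y_bar - a1 * x_bar
    return a0, a1

# Hypothetical predictor (e.g., advertising spend) and demand:
x = [1.0, 2.0, 3.0, 4.0]
y = [12.0, 14.0, 16.0, 18.0]
a0, a1 = fit_simple_ols(x, y)  # exact linear data: a0 = 10.0, a1 = 2.0
```

With more predictors the same idea extends to Y = a0 + a1X1 + . . . + anXn via multiple regression.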
Time Series Methods A time series is just a collection of past values of the variable being predicted. Also known as naïve methods. Goal is to isolate patterns in past data: Trend Seasonality Cycles Randomness (See Figures on following pages)
Time Series Methods -- Notation Conventions Let D1, D2, . . ., Dt, . . . be the past values of the series to be predicted (demand). If we are making a forecast in period t, assume we have observed Dt, Dt-1, etc. Let Ft, t+τ = forecast made in period t for the demand in period t+τ, where τ = 1, 2, 3, … Then Ft-1, t is the forecast made in t-1 for t, and Ft, t+1 is the forecast made in t for t+1 (one step ahead). Use the shorthand notation Ft = Ft-1, t.
Evaluation of Forecasts The forecast error in period t, et, is the difference between the forecast for demand in period t and the actual value of demand in t. For a multiple-step-ahead forecast: et = Ft-τ, t - Dt. For a one-step-ahead forecast: et = Ft - Dt.
Evaluation of Forecasts MAD = (1/n) Σ |ei| MSE = (1/n) Σ ei²
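A short sketch of the two error measures, using hypothetical one-step-ahead forecasts and demands (ei = Fi - Di as defined above):

```python
# MAD and MSE over a set of forecast errors e_i = F_i - D_i.
def mad(forecasts, demands):
    errors = [f - d for f, d in zip(forecasts, demands)]
    return sum(abs(e) for e in errors) / len(errors)

def mse(forecasts, demands):
    errors = [f - d for f, d in zip(forecasts, demands)]
    return sum(e ** 2 for e in errors) / len(errors)

F = [100, 110, 120, 130]   # hypothetical forecasts
D = [ 98, 115, 118, 135]   # hypothetical realized demands
# errors: 2, -5, 2, -5  ->  MAD = 3.5, MSE = 14.5
```

Note that MSE penalizes large errors more heavily than MAD, while MAD is in the same units as demand.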
Evaluation of Forecasts -- Biases in Forecasts A bias occurs when the average value of a forecast error tends to be positive or negative. Mathematically, an unbiased forecast is one in which E(ei) = 0. See Figure 2.3 on page 64 in text (next slide).
Forecast Errors Over Time Figure 2.3
Forecasting for Stationary Series A stationary time series has the form: Dt = μ + εt where μ is a constant and εt is a random variable with mean 0 and variance σ². Two common methods for forecasting stationary series are moving averages and exponential smoothing.
Forecasting for Stationary Series -- Moving Averages In words: the arithmetic average of the N most recent observations. For a one-step-ahead forecast: Ft = (1/N)(Dt-1 + Dt-2 + . . . + Dt-N)
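The one-step-ahead moving-average forecast can be sketched as follows; the demand values are hypothetical.

```python
# MA(N) forecast: the mean of the N most recent observations.
def moving_average_forecast(demand, N):
    """One-step-ahead forecast from the last N observations."""
    if len(demand) < N:
        raise ValueError("need at least N observations")
    return sum(demand[-N:]) / N

demand = [10, 20, 26, 17]                      # hypothetical demands
f5 = moving_average_forecast(demand, N=4)      # (10+20+26+17)/4 = 18.25
```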
Forecasting for Stationary Series -- Moving Averages Example: Quarterly data:
Forecasting for Stationary Series -- Moving Averages --Summary Advantages of Moving Average Method Easily understood Easily computed Provides stable forecasts Disadvantages of Moving Average Method Requires saving all past N data points Lags behind a trend Ignores complex relationships in data
Forecasting for Stationary Series -- Moving Averages --Summary
Forecasting for Stationary Series -- Exponential Smoothing Method A type of weighted moving average that applies declining weights to past data. 1. New Forecast = α(most recent observation) + (1 - α)(last forecast) or 2. New Forecast = last forecast - α(last forecast error) where 0 < α < 1; α is generally kept small for stability of forecasts (around 0.1 to 0.2)
Forecasting for Stationary Series -- Exponential Smoothing Method In symbols:
Ft+1 = αDt + (1 - α)Ft
     = αDt + (1 - α)(αDt-1 + (1 - α)Ft-1)
     = αDt + (1 - α)αDt-1 + (1 - α)²αDt-2 + . . .
Hence the method applies a set of exponentially declining weights to past data. It is easy to show that the sum of the weights is exactly one. (Equivalently, Ft+1 = Ft - α(Ft - Dt).)
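The recursion Ft+1 = αDt + (1 - α)Ft is a one-liner per period. The initial forecast and demand series below are hypothetical.

```python
# Simple exponential smoothing: F_{t+1} = alpha*D_t + (1-alpha)*F_t.
def exponential_smoothing(demand, alpha, f_initial):
    """Return the one-step-ahead forecasts, starting from F_1 = f_initial."""
    forecasts = [f_initial]
    for d in demand:
        forecasts.append(alpha * d + (1 - alpha) * forecasts[-1])
    return forecasts

demand = [200, 250, 175, 186]                       # hypothetical demands
F = exponential_smoothing(demand, alpha=0.1, f_initial=200)
# F[1] = 0.1*200 + 0.9*200.0 = 200.0
# F[2] = 0.1*250 + 0.9*200.0 = 205.0
```

Only the last forecast and the last observation need to be stored between periods, which is the storage advantage over the moving average noted later.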
Forecasting for Stationary Series -- Exponential Smoothing Method --Weights in Exponential Smoothing
Forecasting for Stationary Series -- Exponential Smoothing Method Example Quarterly data What is F3 if α = 0.1?
Forecasting for Stationary Series -- Exponential Smoothing Method
Forecasting for Stationary Series -- Comparison of ES and MA Similarities Both methods are appropriate for stationary series Both methods depend on a single parameter Both methods lag behind a trend One can achieve the same distribution of forecast error by setting α = 2/(N + 1).
Forecasting for Stationary Series -- Comparison of ES and MA Differences ES carries all past history; MA eliminates "bad" data after N periods. MA requires storing all N past data points, while ES only requires the last forecast and the last observation.
Trend-Based Methods Regression Analysis Regression Methods Can be Used When Trend is Present. Model: Dt = a + bt.
Trend-Based Methods If t is scaled to 1, 2, 3, . . ., n, then the least squares estimates for a and b can be computed as follows:
Sxx = n²(n+1)(2n+1)/6 - [n(n+1)/2]²
Sxy = n Σ iDi - [n(n+1)/2] Σ Di
Let b = Sxy/Sxx and a = D̄ - b(n+1)/2. These values of a and b provide the "best" fit of the data in a least squares sense.
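The closed-form sums above can be checked with a short sketch, with t scaled to 1, . . ., n as stated; the demand data are hypothetical and perfectly linear so the answer is known.

```python
# Least-squares trend fit D_t = a + b*t using the closed-form Sxx, Sxy.
def fit_trend(D):
    n = len(D)
    Sxx = n * n * (n + 1) * (2 * n + 1) / 6 - (n * (n + 1) / 2) ** 2
    Sxy = n * sum(i * d for i, d in enumerate(D, start=1)) \
          - (n * (n + 1) / 2) * sum(D)
    b = Sxy / Sxx
    a = sum(D) / n - b * (n + 1) / 2
    return a, b

D = [12, 14, 16, 18]      # hypothetical demands: exactly D_t = 10 + 2t
a, b = fit_trend(D)       # a = 10.0, b = 2.0
```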
Trend-Based Methods -- Double exponential smoothing with Holt’s method Double exponential smoothing, of which Holt’s method is only one example, can also be used to forecast when there is a linear trend present in the data. The method requires separate smoothing constants for intercept and slope:
St = αDt + (1 - α)(St-1 + Gt-1)
Gt = β(St - St-1) + (1 - β)Gt-1
where St = the intercept, Gt = the slope, Dt = demand, and α, β are smoothing constants.
Trend-Based Methods -- Double exponential smoothing with Holt’s method So the τ-step-ahead forecast made in period t, denoted Ft, t+τ, is Ft, t+τ = St + τGt
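A minimal sketch of the two smoothing recursions and the τ-step-ahead forecast. The demand value, S0, G0, and the smoothing constants here are hypothetical, chosen so the updates are easy to verify by hand.

```python
# Holt's double exponential smoothing, as defined above.
def holt(demand, alpha, beta, S0, G0):
    """Run the intercept/slope updates over the demand series."""
    S, G = S0, G0
    for d in demand:
        S_prev = S
        S = alpha * d + (1 - alpha) * (S + G)
        G = beta * (S - S_prev) + (1 - beta) * G
    return S, G

def holt_forecast(S, G, tau):
    """tau-step-ahead forecast F_{t,t+tau} = S_t + tau*G_t."""
    return S + tau * G

S, G = holt(demand=[210], alpha=0.1, beta=0.1, S0=200, G0=10)
# S_1 = 0.1*210 + 0.9*(200 + 10) = 210.0
# G_1 = 0.1*(210 - 200) + 0.9*10 = 10.0
f = holt_forecast(S, G, tau=1)   # 220.0
```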
Trend-Based Methods -- Double exponential smoothing with Holt’s method Example Quarterly data What is F3, 3+1 if α = β = 0.1, S0 = 200, and G0 = 10?
Methods For Seasonal Series --A Seasonal Demand Series
Methods For Seasonal Series – Seasonal Factors for Stationary Series Seasonality corresponds to a pattern in the data that repeats at regular intervals. Multiplicative seasonal factors: c1, c2, . . ., cN, where i = 1 is the first period of the season, i = 2 is the second period of the season, etc. Σ ci = N. ci = 1.25 implies 25% higher than the baseline on avg. ci = 0.75 implies 25% lower than the baseline on avg.
Methods For Seasonal Series – Seasonal Factors for Stationary Series Procedure: Compute the sample mean of all the data. Divide each observation by the sample mean, giving a factor for each period of observed data. Average the factors for like periods within each season (average all factors corresponding to the same period of a season). Forecast your time series.
Methods For Seasonal Series – Seasonal Factors for Stationary Series Example:
            Week 1  Week 2  Week 3  Week 4
Monday       16.2    17.3    14.6    16.1
Tuesday      12.2    11.5    13.1    11.8
Wednesday    14.2    15.0    13.0    12.9
Thursday     17.6    16.9    16.6
Friday       22.5    23.5    21.9    24.3
Methods For Seasonal Series – Seasonal Factors for Stationary Series Solution: Sample mean = 16.425. Divide each observation by the sample mean:
            Week 1  Week 2  Week 3  Week 4
Monday      0.986   1.053   0.889   0.980
Tuesday     0.743   0.700   0.798   0.718
Wednesday   0.865   0.913   0.791   0.785
Thursday    1.072   1.029   1.011
Friday      1.370   1.431   1.333   1.479
Methods For Seasonal Series – Seasonal Factors for Stationary Series Solution: Average all factors for Monday through Friday:
            Average Factor
Monday      0.98
Tuesday     0.74
Wednesday   0.84
Thursday    1.04
Friday      1.40
Methods For Seasonal Series – Seasonal Factors for Stationary Series Solution: Forecast by multiplying the sample mean by each average factor:
            Forecast
Monday      16.1
Tuesday     12.1
Wednesday   13.8
Thursday    17.1
Friday      23.0
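A sketch of the whole procedure on the table above. Note that the Thursday row has one observation missing, so the computed sample mean comes out near 16.38 rather than the slides' 16.425, which presumably reflects a complete Thursday column; the resulting factors still land close to the slide values.

```python
# Seasonal factors for a stationary series: mean -> factors -> forecast.
data = {
    "Monday":    [16.2, 17.3, 14.6, 16.1],
    "Tuesday":   [12.2, 11.5, 13.1, 11.8],
    "Wednesday": [14.2, 15.0, 13.0, 12.9],
    "Thursday":  [17.6, 16.9, 16.6],          # one value missing, as above
    "Friday":    [22.5, 23.5, 21.9, 24.3],
}
all_obs = [x for obs in data.values() for x in obs]
mean = sum(all_obs) / len(all_obs)            # sample mean, ~16.38 here

# Divide each observation by the mean, then average like periods:
factors = {day: sum(x / mean for x in obs) / len(obs)
           for day, obs in data.items()}      # e.g. Monday ~ 0.98

# Forecast each day of the coming week:
forecasts = {day: mean * f for day, f in factors.items()}
```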
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Steps of seasonal decomposition using moving averages:
1. Calculate the moving averages.
2. Center the moving averages.
3. Center the centered moving averages on integer-valued periods.
4. Determine the seasonal and irregular factors.
5. Determine the average seasonal factors.
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Steps of seasonal decomposition using moving averages (continued):
6. Scale the seasonal factors.
7. Determine the deseasonalized data.
8. Determine a trend line of the deseasonalized data.
9. Determine the deseasonalized predictions.
10. Take the seasonality into account.
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Example: Demand for each period:
Period  Demand  MA(4)
1       10
2       20
3       26
4       17      18.25
5       12      18.75
6       23      19.50
7       30      20.50
8       22      21.75
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Example: Demand for each period
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Solution: Moving average with N = 4:
Period  Demand  MA(4)
1       10
2       20
3       26
4       17      18.25
5       12      18.75
6       23      19.50
7       30      20.50
8       22      21.75
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Solution: Center the moving average (each MA(4) value lies halfway between two integer periods):
Period  Demand  Centered MA
1       10
2       20      18.25  (at 2.5)
3       26      18.75  (at 3.5)
4       17      19.50  (at 4.5)
5       12      20.50  (at 5.5)
6       23      21.75  (at 6.5)
7       30
8       22
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Center the centered moving average on integer-valued periods (average adjacent centered values):
Period  Demand  MA on integer periods
1       10
2       20
3       26      18.50
4       17      19.125
5       12      20.00
6       23      21.125
7       30
8       22
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Determine the seasonal and irregular factors by dividing the demand (B) by the MA on integer periods (C):
Period  Demand (B)  MA on integer periods (C)  Seasonal and irregular factors (B/C)
1       10
2       20
3       26          18.50                      1.405
4       17          19.125                     0.888
5       12          20.00                      0.600
6       23          21.125                     1.089
7       30
8       22
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Determine the average seasonal factors. (Not needed in this example: with only two seasons of data there is a single factor for each position in the season.)
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Scale the seasonal factors so they sum to N = 4 (multiply each by 4/3.982 = 1.0045):
Period  Seasonal and irregular factors  Scaled seasonal factors (D)
3       1.405                           1.405 × 1.0045 = 1.4113
4       0.888                           0.892
5       0.600                           0.6027
6       1.089                           1.094
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Determine the deseasonalized data (divide each demand by the scaled factor for its position in the season):
Period  Demand (B)  Scaled seasonal factor (D)  Deseasonalized data (B/D)
1       10          0.6027                      10/0.6027 = 16.592
2       20          1.094                       20/1.094 = 18.282
3       26          1.4113                      26/1.4113 = 18.423
4       17          0.892                       17/0.892 = 19.058
5       12          0.6027                      19.91
6       23          1.094                       21.024
7       30          1.4113                      21.257
8       22          0.892                       24.664
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Determine a trend line of the deseasonalized data. By regression, D̂t = 16.8 + 0.7092t. Determine the deseasonalized predictions:
Period  Deseasonalized prediction
9       23.18
10      23.89
11      24.6
12      25.31
Methods For Seasonal Series – Seasonal decomposition Using Moving Average Take the seasonality into account (multiply each prediction by its seasonal factor):
Period  Deseasonalized prediction  Seasonal factor  Forecast
9       23.18                      0.6027           13.97
10      23.89                      1.094            26.136
11      24.6                       1.4113           34.72
12      25.31                      0.892            22.577
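The moving-average decomposition steps above can be sketched end to end for the 8-period example with N = 4; small differences from the slide values come only from rounding.

```python
# Seasonal decomposition via MA(4) for the 8-period example above.
demand = [10, 20, 26, 17, 12, 23, 30, 22]
N = 4

# Steps 1-3: MA(4), centered between periods, then onto integer periods.
ma = [sum(demand[i:i + N]) / N for i in range(len(demand) - N + 1)]
centered = [(ma[i] + ma[i + 1]) / 2 for i in range(len(ma) - 1)]
# centered[k] is the MA centered on integer period k + 3

# Steps 4-6: seasonal-and-irregular factors for periods 3..6,
# then scale so the factors sum to N.
raw = [demand[k + 2] / centered[k] for k in range(N)]
scaled = [f * N / sum(raw) for f in raw]

# Step 7: periods 3, 4, 5, 6 occupy season positions 3, 4, 1, 2,
# so map each scaled factor to its (1-based) position in the season.
factor_by_pos = {3: scaled[0], 4: scaled[1], 1: scaled[2], 2: scaled[3]}
deseason = [d / factor_by_pos[t % N + 1] for t, d in enumerate(demand)]
```

Steps 8-10 then fit a trend line to `deseason` by regression and multiply the trend predictions back by the seasonal factors, as in the tables above.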
Methods For Seasonal Series – Winter’s Method Moving averages with or without trend can be used to predict seasonal data; however, one needs to recalculate all factors whenever new data come in. Winters' method is a triple exponential smoothing method. Model: Dt = (μ + Gt)ct + εt, where Dt = demand at time t, μ = intercept, G = slope, ct = seasonal factor, εt = error term.
Methods For Seasonal Series – Winter’s Method
Methods For Seasonal Series – Winter’s Method Assuming: Length of the season is N. Seasonal factors are the same each season and Σ ct = N. Three exponential smoothing equations are used: one each for the series, the trend, and the seasonal factors.
Methods For Seasonal Series – Winter’s Method The series (the current level of the deseasonalized series, St):
St = α(Dt/ct-N) + (1 - α)(St-1 + Gt-1)
The trend:
Gt = β(St - St-1) + (1 - β)Gt-1
The seasonal factors:
ct = γ(Dt/St) + (1 - γ)ct-N
Methods For Seasonal Series – Winter’s Method The forecast made in period t for any future period t+τ is Ft, t+τ = (St + τGt)ct+τ-N if τ ≤ N. Note, the appropriate seasonal factor would be: ct+τ-2N if N < τ ≤ 2N, ct+τ-3N if 2N < τ ≤ 3N
Methods For Seasonal Series – Winter’s Method –Initialization procedure In order to get the method started, one needs to obtain initial estimates of the series. Winters suggests having a minimum of 2 seasons of data available. Assuming exactly two seasons are considered and the time now is t = 0, the past demands are labeled: D-2N+1, D-2N+2, . . ., D0
Methods For Seasonal Series – Winter’s Method –Initialization procedure Calculate the sample means for the two separate seasons of data:
V1 = (1/N) Σ(j = -2N+1 to -N) Dj
V2 = (1/N) Σ(j = -N+1 to 0) Dj
Methods For Seasonal Series – Winter’s Method –Initialization procedure Define G0 = (V2 - V1)/N as the initial slope estimate. If m > 2 (m = number of seasons), compute V1, . . ., Vm as step 1 shows and define G0 = (Vm - V1)/[(m - 1)N]. Set S0 = V2 + G0(N - 1)/2
Methods For Seasonal Series – Winter’s Method –Initialization procedure a. Compute the initial seasonal factors:
ct = Dt / (Vi - [(N+1)/2 - j]G0)  for -2N+1 ≤ t ≤ 0
where i = 1 for the first season and i = 2 for the second season, and j is the period within the season (j = 1 for t = -2N+1 and t = -N+1, j = 2 for t = -2N+2 and t = -N+2, etc.).
b. Average the seasonal factors for like periods: c-N+1 = (c-2N+1 + c-N+1)/2, . . ., c0 = (c-N + c0)/2, for exactly two seasons.
Methods For Seasonal Series – Winter’s Method –Initialization procedure c. Normalize the seasonal factors so they sum to N:
cj = N·cj / Σ(i = -N+1 to 0) ci
*Note: The above initialization method is only one of several; moving averages or linear regression can also be used.
Methods For Seasonal Series – Winter’s Method --example Initial data set (N = 4):
V1 = (1/N) Σ(j = -2N+1 to -N) Dj = 18.25
V2 = (1/N) Σ(j = -N+1 to 0) Dj = 21.75
G0 = (V2 - V1)/N = 0.875
S0 = V2 + G0(N - 1)/2 = 23.06
Initialization: ct = Dt / (Vi - [(N+1)/2 - j]G0) for -7 ≤ t ≤ 0
Methods For Seasonal Series – Winter’s Method --example Initial data set: Initialization a. For c-7, c-6, c-5, c-4: i = 1; for c-3, c-2, c-1, c0: i = 2. For c-7 and c-3: j = 1; c-7 = D-7/(V1 - [5/2 - 1](0.875)) = 10/(18.25 - [1.5](0.875)) = 0.5904 For c-6 and c-2: j = 2; c-6 = D-6/(V1 - [5/2 - 2](0.875)) = 20/(18.25 - [0.5](0.875)) = 1.1228 And so on
Methods For Seasonal Series – Winter’s Method --example Initial data set: Initialization b. Average the seasonal factors: average c-7 and c-3: c-3 = 0.588; average c-6 and c-2: c-2 = 1.101; … c. Normalize the seasonal factors: c-3 = 0.5900, c-2 = 1.1100, …
Methods For Seasonal Series – Winter’s Method --example Forecasting equations: Ft, t+τ = (St + τGt)ct+τ-N if τ ≤ N
F0,1 = (S0 + G0)c-3 (t = 0 and τ = 1)
F0,2 = (S0 + 2G0)c-2 (t = 0 and τ = 2)
…
For t = 0: F0,1 = 14.12, F0,2 = 27.54, F0,3 = 35.44, F0,4 = 24.38
Methods For Seasonal Series – Winter’s Method --example For t = 1, D1 is available; assume α = 0.2, β = 0.1, γ = 0.1.
St = α(Dt/ct-N) + (1 - α)(St-1 + Gt-1):  S1 = 0.2(D1/c-3) + (1 - 0.2)(S0 + G0)
Gt = β(St - St-1) + (1 - β)Gt-1:  G1 = 0.1(S1 - S0) + (1 - 0.1)G0
ct = γ(Dt/St) + (1 - γ)ct-N:  c1 = 0.1(D1/S1) + (1 - 0.1)c-3
So S1 = 24.57, G1 = 0.9385, c1 = 0.5961
Methods For Seasonal Series – Winter’s Method --example For t = 1, renormalize c-2, c-1, c0, and c1. Forecasting for t = 1:
F1,2 = (S1 + G1)c-2 (t = 1 and τ = 1)
F1,3 = (S1 + 2G1)c-1 (t = 1 and τ = 2)
…
And so on; repeat for the rest of the data.
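The initialization procedure can be sketched as follows, assuming the same eight demands as in the decomposition example (10, 20, 26, 17, 12, 23, 30, 22, labeled D-7 through D0), which reproduce the V1 and V2 values above. Normalized factors here differ from the slides in the third decimal place because the slides round intermediate values.

```python
# Winters' initialization for N = 4 with two seasons of history.
N = 4
history = [10, 20, 26, 17, 12, 23, 30, 22]   # D_{-7} .. D_0 (assumed)

V1 = sum(history[:N]) / N                    # 18.25
V2 = sum(history[N:]) / N                    # 21.75
G0 = (V2 - V1) / N                           # 0.875
S0 = V2 + G0 * (N - 1) / 2                   # 23.0625

# a. Initial seasonal factors c_{-7}..c_0:
raw = []
for k, d in enumerate(history):
    i = k // N              # season index: 0 (first) or 1 (second)
    j = k % N + 1           # period within the season: 1..N
    V = V1 if i == 0 else V2
    raw.append(d / (V - ((N + 1) / 2 - j) * G0))

# b. Average like periods across the two seasons, then
# c. normalize so the factors sum to N:
avg = [(raw[j] + raw[j + N]) / 2 for j in range(N)]
c = [N * f / sum(avg) for f in avg]          # c_{-3} .. c_0

# One-step-ahead forecast made at t = 0: F_{0,1} = (S0 + G0) * c_{-3}
F01 = (S0 + G0) * c[0]
```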
Practical Considerations Overly sophisticated forecasting methods can be problematic, especially for long-term forecasting. (Refer to the figure on the next slide.) Tracking signals may be useful for indicating forecast bias: TS = Σ(i = 1 to n) ei / MAD. Box-Jenkins methods require substantial data history, use the correlation structure of the data, and can provide significantly improved forecasts under some circumstances.
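A small sketch of the tracking signal: the cumulative error divided by MAD, which grows in magnitude when errors are persistently one-sided. The error values and the ±4 trigger mentioned in the comment are illustrative assumptions, not a prescription from the slides.

```python
# Tracking signal TS = sum(e_i) / MAD; large |TS| suggests forecast bias.
def tracking_signal(errors):
    mad = sum(abs(e) for e in errors) / len(errors)
    return sum(errors) / mad

errors = [2.0, 3.0, 1.0, 2.0]     # hypothetical, persistently positive
ts = tracking_signal(errors)      # 8.0 / 2.0 = 4.0, hinting at bias
```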
The Difficulty with Long-Term Forecasts