1 King Abdulaziz University Faculty of Engineering Industrial Engineering Dept. IE 436 Dynamic Forecasting.


2 CHAPTER 3 Exploring Data Patterns and an Introduction to Forecasting Techniques Cross-sectional data: collected at a single point in time. A time series: data collected and recorded over successive increments of time. (Page 62)

3 Exploring Time Series Data Patterns Horizontal (stationary). Trend. Cyclical. Seasonal. A stationary series: its mean and variance remain constant over time.

4 The Trend: the long-term component that represents the growth or decline in the time series. The Cyclical component: the wavelike fluctuation (cyclical peaks and valleys) around the trend line. FIGURE 3-2 Trend and Cyclical Components of an Annual Time Series Such as Housing Costs (Page 63)

5 The Seasonal Component: a pattern of change that repeats itself year after year. FIGURE 3-3 Electrical Usage for Washington Water Power Company, 1980-1991 (Page 64)

6 Exploring Data Patterns with Autocorrelation Analysis Autocorrelation: the correlation between a variable lagged one or more periods and itself.

r_k = \sum_{t=k+1}^{n} (Y_t - \bar{Y})(Y_{t-k} - \bar{Y}) / \sum_{t=1}^{n} (Y_t - \bar{Y})^2    (3.1)

where:
r_k = autocorrelation coefficient for a lag of k periods
\bar{Y} = mean of the values of the series
Y_t = observation in time period t
Y_{t-k} = observation in time period t - k
(Pages 64-65)

7 Autocorrelation Function (Correlogram) A graph of the autocorrelations for various lags. Computation of the lag 1 autocorrelation coefficient: Table 3-1 (Page 65)

8 Example 3.1 Data are presented in Table 3-1 (Page 65). Table 3-2 shows the computations that lead to the calculation of the lag 1 autocorrelation coefficient. Figure 3-4 contains a scatter diagram of the pairs of observations (Y_t, Y_{t-1}). The coefficient is computed using the totals from Table 3-2 and Equation 3.1.
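Equation 3.1 can be sketched in plain Python. The `autocorr` helper and the sample series below are illustrative assumptions, not the actual Table 3-1 data:

```python
# Minimal sketch of Equation 3.1; the data series is hypothetical,
# not Table 3-1 from the text.

def autocorr(y, k):
    """Lag-k autocorrelation coefficient r_k (Equation 3.1)."""
    n = len(y)
    ybar = sum(y) / n
    num = sum((y[t] - ybar) * (y[t - k] - ybar) for t in range(k, n))
    den = sum((v - ybar) ** 2 for v in y)
    return num / den

sales = [123, 130, 125, 138, 145, 142, 141, 146, 147, 157, 150, 160]
r1 = autocorr(sales, 1)  # lag-1 autocorrelation coefficient
```

Note that the numerator sums over n - k cross-products while the denominator sums over all n squared deviations, exactly as in Equation 3.1.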

9 9 Autocorrelation Function (Correlogram) (Cont.) FIGURE 3-5 Correlogram or Autocorrelation Function for the Data Used in Example 3.1 Minitab instructions: Stat > Time Series > Autocorrelation

10 Questions to be Answered Using Autocorrelation Analysis Are the data random? Do the data have a trend? Are the data stationary? Are the data seasonal? (Page 68)

11 Are the data random? If a series is random: the successive values are not related to each other, and almost all the autocorrelation coefficients are not significantly different from zero.

12 Is an autocorrelation coefficient significantly different from zero? The autocorrelation coefficients of random data have an approximately normal sampling distribution. At a specified confidence level, a series can be considered random if the computed autocorrelation coefficients are all within the interval [0 ± t SE(r_k)] (z instead of t for large samples). The t statistic t = r_k / SE(r_k) can be used.

13 Standard error of the autocorrelation at lag k:

SE(r_k) = \sqrt{ (1 + 2 \sum_{i=1}^{k-1} r_i^2) / n }    (3.2)

where:
r_i = the autocorrelation at time lag i
k = the time lag
n = the number of observations in the time series

14 Example 3.2 (Page 69) A hypothesis test: is a particular autocorrelation coefficient significantly different from zero? At significance level 0.05, the critical values ±2.2 are the upper and lower t points for n - 1 = 11 degrees of freedom. Decision rule: if |t| > 2.2, reject H0: ρ1 = 0. Note: t is given directly in the Minitab output under the heading T.
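The standard error of Equation 3.2 and the resulting t statistic can be sketched as follows (hypothetical helper names, not from the text):

```python
import math

def se_rk(r, k, n):
    """Standard error of the lag-k autocorrelation (Equation 3.2).
    r holds the previously computed autocorrelations r_1, r_2, ..."""
    return math.sqrt((1 + 2 * sum(r[i] ** 2 for i in range(k - 1))) / n)

def t_stat(r, k, n):
    """t statistic for testing H0: rho_k = 0."""
    return r[k - 1] / se_rk(r, k, n)

# With a single autocorrelation r_1 = 0.5 from n = 25 observations:
# SE(r_1) = sqrt(1/25) = 0.2, so t = 0.5 / 0.2 = 2.5.
```

For lag 1 the inner sum is empty, so SE(r_1) reduces to 1/sqrt(n), which is where the familiar ±2/sqrt(n) limits on a correlogram come from.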

15 Is an autocorrelation coefficient different from zero? (Cont.) The modified Box-Pierce Q statistic (developed by Ljung and Box), "LBQ". A portmanteau test: it tests whether a whole set of autocorrelation coefficients is zero, all at once.

16
Q = n(n + 2) \sum_{k=1}^{m} r_k^2 / (n - k)    (3.3)

where:
n = number of observations
k = the time lag
m = number of time lags to be considered
r_k = the k-th autocorrelation coefficient, lagged k time periods

The value of Q can be compared with the chi-square with m degrees of freedom.
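Equation 3.3 translates directly into code. This is a minimal sketch with an assumed function name:

```python
def ljung_box_q(r, n, m):
    """Ljung-Box (modified Box-Pierce) Q statistic, Equation 3.3.
    r holds the autocorrelations r_1 ... r_m; n is the series length."""
    return n * (n + 2) * sum(r[k - 1] ** 2 / (n - k) for k in range(1, m + 1))
```

The resulting Q is then compared with the chi-square critical value for m degrees of freedom; Q below the critical value is consistent with a random series.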

17 Example 3.3 (Page 70)

t    Y_t   t    Y_t   t    Y_t   t    Y_t
1    343   11   946   21   704   31   555
2    574   12   142   22   291   32   476
3    879   13   477   23   433   33   612
4    728   14   452   24   118   34   574
5    371   15   727   25   682   35   518
6    227   16   147   26   577   36   296
7    613   17   199   27   834   37   970
8    157   18   744   28   981   38   204
9    571   19   627   29   263   39   616
10   722   20   122   30   424   40   97

18 18 FIGURE 3-7 Autocorrelation Function for the Data Used in Example 3.3

19 The Q statistic for m = 10 time lags is calculated as 7.75 (using Minitab). The chi-square value is 18.307 (tested at the 0.05 significance level, degrees of freedom df = m = 10), Table B-4 (Page 527). Since Q = 7.75 < 18.307, conclusion: the series is random.

20 Do the Data have a Trend? A significant relationship exists between successive time series values. The autocorrelation coefficients are large for the first several time lags, and then gradually drop toward zero as the number of periods increases. The autocorrelation for time lag 1 is close to 1; for time lag 2 it is large but smaller than for time lag 1.

21 Example 3.4 (Page 72) Data in Table 3-4 (Page 74)

Year  Y_t    Year  Y_t     Year  Y_t     Year  Y_t
1955  3307   1966  6769    1977  17224   1988  50251
1956  3556   1967  7296    1978  17946   1989  53794
1957  3601   1968  8178    1979  17514   1990  55972
1958  3721   1969  8844    1980  25195   1991  57242
1959  4036   1970  9251    1981  27357   1992  52345
1960  4134   1971  10006   1982  30020   1993  50838
1961  4268   1972  10991   1983  35883   1994  54559
1962  4578   1973  12306   1984  38828   1995  34925
1963  5093   1974  13101   1985  40715   1996  38236
1964  5716   1975  13639   1986  44282   1997  41296
1965  6357   1976  14950   1987  48440   1998  …….

22 22 Data Differencing A time series can be differenced to remove the trend and to create a stationary series. See FIGURE 3-8 (Page 73) for differencing the Data of Example 3.1 See FIGURES 3-12, 3-13 (Page 75)
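First differencing, as used in Figures 3-8 and 3-12/3-13, can be sketched in a few lines (the helper name is an assumption):

```python
def difference(y):
    """First differences: Delta Y_t = Y_t - Y_{t-1}.
    Removes a linear trend, leaving a (roughly) stationary series."""
    return [y[t] - y[t - 1] for t in range(1, len(y))]

# A linearly trending series becomes level after differencing:
# difference([3, 5, 7, 9, 11]) -> [2, 2, 2, 2]
```

The differenced series has one fewer observation than the original; a second difference of the result would remove a quadratic trend.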

23 23 Are The Data Seasonal? For quarterly data: a significant autocorrelation coefficient will appear at time lag 4. For monthly data: a significant autocorrelation coefficient will appear at time lag 12.
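The quarterly case can be illustrated with a small sketch. The series below is hypothetical, built so that the same within-year pattern repeats, which drives the lag-4 autocorrelation up:

```python
# Hypothetical quarterly series with a repeating within-year pattern;
# the lag-4 autocorrelation comes out large, signalling seasonality.

def autocorr(y, k):
    """Lag-k autocorrelation coefficient (Equation 3.1)."""
    n = len(y)
    ybar = sum(y) / n
    num = sum((y[t] - ybar) * (y[t - k] - ybar) for t in range(k, n))
    return num / sum((v - ybar) ** 2 for v in y)

quarterly = [10, 30, 50, 20] * 5  # five years of the same pattern
r4 = autocorr(quarterly, 4)       # large and positive here (0.8)
```

A significant r_4 against the ±2/sqrt(n) limits would be read as quarterly seasonality; for monthly data the same check is applied at lag 12.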

24 Example 3.5 (Page 76)

Year   December 31   March 31   June 30   September 30
1994   147.6         251.8      273.1     249.1
1995   139.3         221.2      260.2     259.5
1996   140.5         245.5      298.8     287.0
1997   168.8         322.6      393.5     404.3
1998   259.7         401.1      464.6     497.7
1999   264.4         402.6      411.3     385.9
2000   232.7         309.2      310.7     293.0
2001   205.1         234.4      285.4     258.7
2002   193.2         263.7      292.5     315.2
2003   178.3         274.5      295.4     286.4
2004   190.8         263.5      318.8     305.5
2005   242.6         318.8      329.6     338.2
2006   232.1         285.6      291.0     281.4

Table 3-5 (Page 77). See Figures 3-14, 3-15 (Page 77).

25 25 Time Series Graph FIGURE 3-14 Time Series Plot of Quarterly Sales for Coastal Marine for Example 3.5

26 Autocorrelation coefficients at time lags 1 and 4 are significantly different from zero; sales are seasonal on a quarterly basis. FIGURE 3-15 Autocorrelation Function for Quarterly Sales for Coastal Marine for Example 3.5

27 27 Questions to be Considered: Choosing a Forecasting Technique Why is a forecast needed? Who will use the forecast? What are the characteristics of the data? What time period is to be forecast? What are the minimum data requirements? How much accuracy is required? What will the forecast cost?

28 Choosing a Forecasting Technique (Cont.) The Forecaster Should Accomplish the Following: Define the nature of the forecasting problem. Explain the nature of the data. Describe the properties of the techniques. Develop criteria for selection.

29 Choosing a Forecasting Technique (Cont.) Factors Considered: Level of detail. Time horizon. Based on judgment or data manipulation. Management acceptance. Cost.

30 General considerations for choosing the appropriate method

Method       Uses                                                 Considerations
Judgment     Can be used in the absence of historical data        Subjective estimates are subject to the
             (e.g. new product). Most helpful in medium-          biases and motives of estimators.
             and long-term forecasts.
Causal       Sophisticated method. Very good for medium-          Must have historical data. Relationships
             and long-term forecasts.                             can be difficult to specify.
Time series  Easy to implement. Works well when the series        Rely exclusively on past data. Most
             is relatively stable.                                useful for short-term estimates.

31 Pattern of data: ST, stationary; T, trended; S, seasonal; C, cyclical. Time horizon: S, short term (less than three months); I, intermediate; L, long term. Type of model: TS, time series; C, causal. Seasonal: s, length of seasonality. Variable: V, number of variables.

Method                                          Pattern of Data  Time Horizon  Type of Model  Minimal Data Requirements
                                                                                              (Nonseasonal / Seasonal)
Naïve                                           ST, T, S         S             TS             1
Simple averages                                 ST               S             TS             30
Moving averages                                 ST               S             TS             4-20
Single exponential smoothing                    ST               S             TS             2
Linear (double) exponential smoothing (Holt's)  T                S             TS             3
Quadratic exponential smoothing                 T                S             TS             4
Seasonal exponential smoothing (Winter's)       S                S             TS             2 x s
Adaptive filtering                              S                S             TS             5 x s
Simple regression                               T                I             C              10
Multiple regression                             C, S             I             C              10 x V
Classical decomposition                         S                S             TS             5 x s
Exponential trend models                        T                I, L          TS             10
S-curve fitting                                 T                I, L          TS             10
Gompertz models                                 T                I, L          TS             10
Growth curves                                   T                I, L          TS             10
Census X-12                                     S                S             TS             6 x s
ARIMA (Box-Jenkins)                             ST, T, C, S      S             TS             24 / 3 x s
Leading indicators                              C                S             C              24
Econometric models                              C                S             C              30
Time series multiple regression                 T, S             I, L          C              6 x s

32 Basic Forecasting Notation; Measuring Forecast Error
Y_t = actual value of a time series in time t
\hat{Y}_t = forecast value for time period t
e_t = Y_t - \hat{Y}_t = forecast error in time t (residual)

33 Measuring Forecasting Error (Cont.) The Mean Absolute Deviation (MAD). The Mean Squared Error (MSE). The Root Mean Square Error (RMSE). The Mean Absolute Percentage Error (MAPE). The Mean Percentage Error (MPE). Equations (3.7 - 3.11).
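The standard definitions of these measures, with e_t = Y_t - Yhat_t, can be sketched in one helper (the function name is an assumption, not from the text):

```python
def error_measures(actual, forecast):
    """MAD, MSE, RMSE, MAPE, MPE for paired actual/forecast values,
    using the standard definitions (Equations 3.7-3.11)."""
    n = len(actual)
    e = [a - f for a, f in zip(actual, forecast)]  # e_t = Y_t - Yhat_t
    mse = sum(x ** 2 for x in e) / n
    return {
        "MAD": sum(abs(x) for x in e) / n,
        "MSE": mse,
        "RMSE": mse ** 0.5,
        "MAPE": sum(abs(x) / abs(a) for x, a in zip(e, actual)) / n,
        "MPE": sum(x / a for x, a in zip(e, actual)) / n,
    }
```

MPE keeps the sign of each error, so it measures bias (systematic over- or under-forecasting), while MAD, MSE, RMSE, and MAPE measure magnitude only.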

34 Example 3.6 (Page 83) Evaluate the model using MAD, MSE, RMSE, MAPE, and MPE. These measures are used for: the measurement of a technique's usefulness or reliability; comparison of the accuracy of two different techniques; the search for an optimal technique.

35 Empirical Evaluation of Forecasting Methods Results of the forecast accuracy for a sample of 3003 time series (1997): Complex methods do not necessarily produce more accurate forecasts than simpler ones. Various accuracy measures (MAD, MSE, MAPE) produce consistent results. The performance of methods depends on the forecasting horizon and the kind of data analyzed (yearly, quarterly, monthly).

36 Determining the Adequacy of a Forecasting Technique Do the residuals indicate a random series? (Examine the autocorrelation coefficients of the residuals; there should be no significant ones.) Are they approximately normally distributed? Is the technique simple and understood by decision makers?
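The residual-randomness check can be sketched as a rough screening function, using the approximate ±2/sqrt(n) limits from the autocorrelation discussion above (the function name and lag cap are assumptions):

```python
def residuals_look_random(residuals, max_lag=10):
    """Rough adequacy check: every residual autocorrelation up to
    max_lag stays inside the approximate +/- 2/sqrt(n) limits."""
    n = len(residuals)
    ybar = sum(residuals) / n
    den = sum((v - ybar) ** 2 for v in residuals)
    limit = 2 / n ** 0.5
    for k in range(1, max_lag + 1):
        num = sum((residuals[t] - ybar) * (residuals[t - k] - ybar)
                  for t in range(k, n))
        if abs(num / den) > limit:
            return False  # a significant autocorrelation remains
    return True
```

Trended residuals fail this check immediately (their lag-1 autocorrelation is large), which signals that the forecasting technique has left systematic structure unexplained.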

