1 FORECASTING METHODS OF NON-STATIONARY STOCHASTIC PROCESSES THAT USE EXTERNAL CRITERIA Igor V. Kononenko, Anton N. Repin National Technical University “Kharkiv Polytechnic Institute”, Ukraine The 28th Annual International Symposium on Forecasting, June 22-25, 2008, Nice, France

2 INTRODUCTION When forecasting the development of socio-economic systems, one often encounters the problem of forecasting non-stationary stochastic processes with a scarce number of observations (5-30), where repeated realizations of the process are impossible. To solve such problems, a number of methods have been suggested in which the unknown parameters of the model are estimated not at all points of the time series but at a certain subset of points called the learning sequence. The suitability of the model for describing the time series is determined at the remaining points, called the check sequence.
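As a minimal sketch of this idea (the index choice here is illustrative, not the authors' exact partition scheme), a short series can be split into a learning sequence used for parameter estimation and a check sequence used to judge model suitability:

```python
# Split a short time series into a learning sequence (for parameter
# estimation) and a check sequence (for judging model suitability).
# The choice of check indices is illustrative only.

def split_series(y, check_idx):
    learning = [(t, v) for t, v in enumerate(y) if t not in check_idx]
    check = [(t, v) for t, v in enumerate(y) if t in check_idx]
    return learning, check

y = [3.1, 3.4, 3.3, 3.9, 4.2, 4.0, 4.6]        # 7 scarce observations
learning, check = split_series(y, check_idx={2, 5})
print(len(learning), len(check))               # 5 learning points, 2 check points
```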

3 PURPOSE OF WORK The purpose of this work is to create and study an effective method for forecasting non-stationary stochastic processes when observations in the base period are scarce.

4 H-CRITERION METHOD (I. Kononenko, 1982) Retrospective information: the vector of values of the predicted variable; q – the number of significant variables, including the predicted variable; n – the number of points in the time base of the forecast.

5 H-CRITERION METHOD (2)

6 H-CRITERION METHOD (3) The parameters of all formed models are estimated using the learning submatrix. For each j-th model we obtain the vector of estimated parameters and a vector of weighting coefficients that account for the error variance or the importance of points for building the model.

7 H-CRITERION METHOD (4) For each j-th model, the deviations from the observations are calculated at all points of past history.

8 H-CRITERION METHOD (5) New learning and check submatrices are chosen: the number of rows in Γ_L is decreased by one, and the process of estimating the model parameters and calculating the deviation ε₃ is repeated. Then the learning submatrix is used as the check one and the check submatrix as the learning one, ε₄ is calculated, and we continue similarly using bipartitioning. The process stops after a set number of iterations g. The H-criterion is then computed from the deviations obtained over all partitions.
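The partition loop can be roughly sketched as follows. A straight-line model and squared deviations stand in for the method's actual model set and the H-criterion formula, which were on the slides but are not in the transcript; the aggregation at the end is one plausible choice, not the authors' formula.

```python
# Sketch of the H-criterion bipartitioning loop. A linear-trend model
# and squared deviations are stand-ins for the actual model set and
# criterion formula, which are not given in the transcript.

def fit_line(points):
    # Ordinary least squares for y = a + b*t on (t, y) pairs.
    n = len(points)
    st = sum(t for t, _ in points)
    sy = sum(y for _, y in points)
    stt = sum(t * t for t, _ in points)
    sty = sum(t * y for t, y in points)
    b = (n * sty - st * sy) / (n * stt - st * st)
    a = (sy - b * st) / n
    return a, b

def deviation(points, a, b):
    # Squared deviation of the model from the given points.
    return sum((y - (a + b * t)) ** 2 for t, y in points)

y = [2.0, 2.3, 2.1, 2.8, 3.0, 3.4, 3.3, 3.9]
points = list(enumerate(y))
learning, check = points[:5], points[5:]

deviations = []
g = 4                                    # number of partitions to try
for step in range(g):
    a, b = fit_line(learning)
    deviations.append(deviation(check, a, b))
    if step % 2 == 0:
        # swap the roles of the learning and check submatrices
        learning, check = check, learning
    else:
        # shrink the learning part by one point, enlarge the check part
        learning, check = learning[:-1], check + [learning[-1]]

H = sum(deviations) / len(deviations)    # plausible aggregate of the deviations
print(len(deviations))
```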

9 METHOD THAT USES THE BOOTSTRAP EVALUATION (I. Kononenko, 1990) Retrospective information: the vector of values of the predicted variable; q – the number of significant variables, including the predicted variable; n – the volume of past history.

10 L – the number of a model in the set of test models. Testing the model: N_i – the vector of independent variables; B – the vector of estimated parameters; ε_i – independent errors having an identical, symmetric distribution density. METHOD THAT USES THE BOOTSTRAP EVALUATION (2)

11 1. We estimate the parameters of the model using the matrix Γ, based on the condition of minimizing the loss function. Determining the deviations from the points of Γ, the numbers bias_i form the BIAS vector. METHOD THAT USES THE BOOTSTRAP EVALUATION (3)

12 2. We divide the matrix Γ into two submatrices: the learning submatrix Γ_L and the check submatrix Γ_C. The first n-1 columns of Γ are included in Γ_L, and the n-th column is included in Γ_C. We estimate the parameters B of the test model and calculate the deviation of the model from the statistics. Let k=1, where k is the number of the bootstrap-evaluation iteration. METHOD THAT USES THE BOOTSTRAP EVALUATION (4)

13 3. We perform the bootstrap evaluation, which consists in the following. We randomly (with equal probability) select numbers from the BIAS vector and add them to the values of the model; as a result we obtain "new" statistics. We divide the new matrix Γ_k into Γ_{L,k} and Γ_{C,k}, estimate the unknown parameters based on Γ_{L,k} as before, and calculate the model deviation from Γ_{C,k}. METHOD THAT USES THE BOOTSTRAP EVALUATION (5)

14 4. If k<K-1, we set k:=k+1 and return to step 3 (where K is the number of bootstrap iterations); otherwise we proceed to step 5. 5. We evaluate the criterion D_L from the deviations obtained over the bootstrap iterations. 6. If L<z, we set L:=L+1 and move to step 1 (where z is the number of models in the list); otherwise we stop. The model with minimal D_L is considered the best one. METHOD THAT USES THE BOOTSTRAP EVALUATION (6)
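For a single test model, steps 1-5 can be sketched as below. A constant-mean model and a mean-squared check-point deviation are assumptions standing in for the actual test-model list and the formula for D_L, which are not in the transcript.

```python
import random

# Sketch of the bootstrap evaluation (steps 1-5) for one test model.
# A constant-mean model and a mean-squared check deviation stand in
# for the actual test models and the formula for D_L.

random.seed(0)                          # reproducible illustration

y = [5.0, 5.2, 4.9, 5.4, 5.1, 5.6]
n = len(y)

# Step 1: estimate the model on all points and form the BIAS vector.
c = sum(y) / n
bias = [v - c for v in y]

# Steps 2-4: learning part = first n-1 points, check part = n-th point;
# resample BIAS numbers K times to produce "new" statistics.
K = 200                                 # number of bootstrap iterations
devs = []
for _ in range(K):
    # Step 3: add randomly chosen BIAS numbers to the model values.
    y_new = [c + random.choice(bias) for _ in range(n)]
    c_k = sum(y_new[:n - 1]) / (n - 1)    # re-estimate on the learning part
    devs.append((y_new[-1] - c_k) ** 2)   # deviation at the check point

# Step 5: average deviation over the bootstrap runs.
D = sum(devs) / K
print(D >= 0)
```

In the full method this loop would run once per model in the list (step 6), and the model with minimal D would be selected.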

15 ANALYSIS OF METHODS Mathematical models; additive noise; the loss function

16 ANALYSIS OF METHODS (2) Relative PMAD, evaluated at the estimation period; PMAD, evaluated at the estimation period; PMAD, evaluated at the forecasting period; relative MSE, evaluated at the forecasting period; MSE, evaluated at the forecasting period
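The two basic error measures can be computed as follows. PMAD is taken here as the total absolute deviation relative to the total of the actual values, in percent, which is a common definition; the slides' exact formulas are not in the transcript.

```python
def pmad(actual, forecast):
    # Percent mean absolute deviation: total absolute error relative
    # to the total of the actual values, in percent (common definition).
    return 100.0 * sum(abs(a - f) for a, f in zip(actual, forecast)) \
        / sum(abs(a) for a in actual)

def mse(actual, forecast):
    # Mean squared error over the evaluated period.
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

actual = [10.0, 12.0, 11.0, 13.0]     # illustrative values
forecast = [9.5, 12.5, 11.5, 12.0]
print(round(pmad(actual, forecast), 3))   # → 5.435
print(round(mse(actual, forecast), 4))    # → 0.4375
```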

17 ANALYSIS OF METHODS (3) Average (across noise realizations):

18 ANALYSIS OF METHODS (4) We compared the characteristics with a two-sample t-test, assuming the samples were drawn from normally distributed populations:
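A pooled-variance two-sample t statistic of the kind used for such comparisons can be computed with the standard library as below; the error samples are illustrative, not the experiment's data, and the statistic would still have to be compared against a t-distribution critical value.

```python
import math
from statistics import mean, variance

def two_sample_t(x, y):
    # Pooled-variance two-sample t statistic (equal variances assumed),
    # for comparing error characteristics averaged over noise realizations.
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    return (mean(x) - mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))

errs_a = [4.1, 3.8, 4.4, 4.0, 3.9]   # e.g. PMAD over noise realizations, method A
errs_b = [4.9, 5.2, 4.7, 5.0, 5.1]   # method B (illustrative numbers)
t = two_sample_t(errs_a, errs_b)
print(t < 0)   # negative t: method A's mean error is lower
```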

19 ANALYSIS OF METHODS (5) PMAD, evaluated at the forecasting period

20 RESULTS OF ANALYSIS When the number of partitions increases, in the case of matrices R_1 and R_2 we observe a downward trend of PMAD, with some fluctuations in this trend that depend on the way the data are partitioned. The partition according to the cross-validation procedure, in which the check points fall inside the observation interval, produces significantly less accurate forecasts. The comparison of the efficiency of different partitions with the randomly generated matrix R_4 has shown that a reasonable choice of partition sequences permits obtaining a more accurate longer-term forecast.

21 RESULTS OF ANALYSIS (2) The method that uses the bootstrap evaluation produces a more accurate forecast than the cross-validation procedure. The comparison of the two suggested methods shows that the bootstrap-based method yields more accurate longer-term forecasts than the H-criterion method only in the case of a small number of partitions; otherwise, the use of the selected matrices R_1 or R_2 permits obtaining more accurate forecasts. Nevertheless, the bootstrap-based method turned out to be more accurate than the H-criterion method when using matrix R_4.

22 PRODUCTION VOLUME OF BREAD AND BAKERY PRODUCTS IN KHARKIV REGION The mean relative error for 2003-2006 is 5.91 %

23 COMBINED USE OF TWO METHODS In real-life problems the method that uses the bootstrap evaluation might turn out to be more accurate in some cases, so it is recommended to use the given methods together. In that case, every result obtained by these methods is assigned a weight on the basis of a priori estimates of the methods' accuracy, and the final forecast is obtained as the weighted average of the individual forecasts.
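The combination step reduces to a weighted mean. In this sketch the two forecast values and the weights are hypothetical; in practice the weights would come from the a priori accuracy estimates of the two methods.

```python
def combined_forecast(forecasts, weights):
    # Weighted average of individual forecasts; the weights reflect
    # a priori accuracy estimates and are normalized to sum to 1.
    total = sum(weights)
    return sum(f * w for f, w in zip(forecasts, weights)) / total

# Hypothetical forecasts from the H-criterion method and the
# bootstrap-based method for one period, with illustrative weights.
f_h, f_boot = 102.0, 98.0
print(combined_forecast([f_h, f_boot], weights=[0.4, 0.6]))  # weighted mean
```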

24 Igor V. Kononenko – Professor, Doctor of Technical Sciences, Head of the Strategic Management Department; Anton N. Repin – Post-graduate student of the Strategic Management Department STRATEGIC MANAGEMENT DEPARTMENT, NATIONAL TECHNICAL UNIVERSITY “KHARKIV POLYTECHNIC INSTITUTE” 21, Frunze St., Kharkiv, 61002, Ukraine E-mail: kiv@kpi.kharkov.ua, anton.repin@mail.ru Phone: +38(057)707-67-35 Fax: +38(057)707-67-35
