
Validation and Monitoring Measures of Accuracy Combining Forecasts Managing the Forecasting Process Monitoring & Control.


1 Validation and Monitoring Measures of Accuracy Combining Forecasts Managing the Forecasting Process Monitoring & Control

2 Technique Adequacy
Are the autocorrelation coefficients of the residuals random?
Calculate MAD, MSE, MAPE, & MPE.
Are the residuals normally distributed?
Is the technique simple to use?
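The four accuracy measures named on this slide can be sketched in Python. This is a minimal illustration, not from the deck; the function name is mine, and the example call reuses the actual/forecast values from the deck's own accuracy table.

```python
# Minimal sketch of the four accuracy measures named on the slide.
# Function name and layout are illustrative, not from the deck.
def accuracy_measures(actual, forecast):
    errors = [a - f for a, f in zip(actual, forecast)]
    n = len(errors)
    mad = sum(abs(e) for e in errors) / n                 # Mean Absolute Deviation
    mse = sum(e * e for e in errors) / n                  # Mean Squared Error
    mape = 100 * sum(abs(e) / a for e, a in zip(errors, actual)) / n  # Mean Absolute Percent Error
    mpe = 100 * sum(e / a for e, a in zip(errors, actual)) / n        # Mean Percent Error (signed, so it reveals bias)
    return mad, mse, mape, mpe

# Using the deck's own accuracy example (items 1-4):
mad, mse, mape, mpe = accuracy_measures([1000, 900, 900, 1200],
                                        [900, 1000, 1000, 1000])
```

Note that MPE keeps the sign of each error, so offsetting over- and under-forecasts pull it toward zero; MAPE does not cancel, which is why the two are reported together.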

3 Accuracy
Forecasts of groups (for example, by region) are more accurate than forecasts of individual items.

Item    Actual   Forecast   Error   Percent Error
1        1000       900       100      10%
2         900      1000      -100     -11%
3         900      1000      -100     -11%
4        1200      1000       200      16.7%
Total    4000      3900       100       2.5%
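The table's point can be verified directly: individual percent errors swing widely in both directions, but they partly cancel in the aggregate. A quick sketch using the table's own numbers (variable names are mine):

```python
# The deck's accuracy table, restated: item-level percent errors range
# from about -11% to +16.7%, while the aggregate error is only 2.5%,
# because positive and negative errors offset when items are grouped.
actual = [1000, 900, 900, 1200]
forecast = [900, 1000, 1000, 1000]

item_pct_errors = [round(100 * (a - f) / a, 1) for a, f in zip(actual, forecast)]
group_pct_error = round(100 * (sum(actual) - sum(forecast)) / sum(actual), 1)
```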

4 Tracking
Beware of GIGO (garbage in, garbage out).
Large outliers need to be managed - and usually removed.
Tracking signals monitor accuracy and identify when intervention is required.
When a forecast goes out of control, bias occurs: the errors are primarily positive or primarily negative.
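The slide names tracking signals without giving a formula. One common definition is the running sum of forecast errors divided by the MAD, with values beyond roughly ±4 treated as out of control; the sketch below assumes that definition.

```python
# Tracking signal = cumulative forecast error / MAD.
# A persistently positive or negative signal indicates bias.
# The +/-4 control limit is a common rule of thumb, not from the deck.
def tracking_signal(actual, forecast):
    errors = [a - f for a, f in zip(actual, forecast)]
    mad = sum(abs(e) for e in errors) / len(errors)
    return sum(errors) / mad

def out_of_control(signal, limit=4.0):
    return abs(signal) > limit
```

Because the errors are summed with their signs, an unbiased forecast keeps the signal near zero even when individual errors are large.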

5 Combining Forecasts
There is no single best method of forecasting.
Research has shown that forecasts that are averages of other good forecasts are, on average, better than the individual forecasts.
Use simple or weighted averages.

6 Combining Methods
If the individual methods are unbiased, the combined forecast should also be unbiased.
Do not use simple averages when there is a large difference between the variances of the errors.
With weighted averages, assign higher weights to the forecasts that have the lowest error variance.
The sum of the weights needs to equal 1.
Weights can be determined as the inverse of the squared errors (accuracy and errors are inversely related).
Weights can also be determined through regression analysis, with the models as the independent variables - problematic in that the constant term may not equal 0 and the regression coefficients may not sum to 1.
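The inverse-error weighting described above can be sketched as follows. The function names and the sample error variances are illustrative, not from the deck:

```python
# Weight each model by the inverse of its error variance, then normalize
# so the weights sum to 1; lower-variance (more accurate) models get
# higher weight, as the slide prescribes.
def inverse_variance_weights(error_variances):
    inv = [1.0 / v for v in error_variances]
    total = sum(inv)
    return [w / total for w in inv]

def combine(forecasts, weights):
    return sum(f * w for f, w in zip(forecasts, weights))

# Hypothetical example: model B's errors have 1/4 the variance of
# model A's, so B receives 4x the weight.
weights = inverse_variance_weights([4.0, 1.0])
combined = combine([100.0, 110.0], weights)
```

Setting all variances equal recovers the simple average, which is why the simple average is a safe default only when the models' error variances are similar.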

7 Past Research on Combining
The process of combining improves accuracy even when compared to the single best model.
Simple non-weighted averages work well.
When one model is considerably better, do not combine - drop the inferior model.
In some cases, there are no gains from combining.

8 Recent Research on Combining
Arinze et al. (1997) - used a knowledge-based system (a computer program that encodes the knowledge of a human expert) to select a model or combination of models.
Taylor & Bunn (1999) - a hybrid of empirical and theoretical methods that applies quantile regression to empirical fit errors to produce forecast error estimates.
Fisher & Harvey (1999) - results show that providing information about mean absolute percentage errors, updated each period, enables judges to combine their forecasts in a way that outperforms the simple average.
See the Excel output for examples of combining forecasts with averages.

9 Combining Forecasts

10 Managing the Forecasting Process
Problem definition
Information search
Model formulation
Experimental design
Forecast
Results analysis
Implementation

11 Managing the Forecasting Process
Need to use common sense - some examples:
Time series - need to determine how many time periods to include.
Regression - what additional variables can be included to increase R-squared while maintaining parsimony?
Complex forecasting techniques that reduce forecast error, such as Box-Jenkins, need to be user friendly - or they won't be used.
Chi-square tests should be used to determine goodness of fit.
Advances in PCs and software have increased how often these methods are used - but at what cost?
Excel's regression output is very professional and easy to incorporate into charts - but it does not provide autocorrelation analysis.
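The chi-square goodness-of-fit check mentioned above reduces to a simple statistic: the sum of (observed − expected)² / expected over the categories, compared against a chi-square critical value for the appropriate degrees of freedom. A minimal sketch with made-up counts:

```python
# Chi-square goodness-of-fit statistic: sum of (observed - expected)^2 / expected.
# Larger values mean the model's expected frequencies fit the observed
# data poorly. The counts below are hypothetical, not from the deck.
def chi_square_stat(observed, expected):
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

stat = chi_square_stat([10, 20, 30], [20, 20, 20])
```

In practice the resulting statistic would be compared to a tabulated critical value (or a p-value computed from the chi-square distribution) before accepting or rejecting the fit.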

12 Managing the Forecasting Process
Why is a forecast needed?
Who will use the forecast?
What level of detail or aggregation is required?
What data are available?
What are the costs of the technique and of collecting the data?
How accurate is the forecast expected to be?
How will the forecast be used in the organization?
How will the forecast be evaluated?

13 Other Factors to Consider
Selection depends on many factors - the content and context of the forecast, the availability of historical data, the degree of accuracy desired, the time periods to be forecast, the cost/benefit to the company, and the time & resources available.
How far in the future are you forecasting?
Ease of understanding - how does it compare to other models?
Forecasts are usually incorrect (so adjust them).
Forecasts should be stated as intervals (an estimate of accuracy).
Forecasts are more accurate for aggregated items.
Forecasts are less accurate further into the future.

