
1 Chapter 3 Demand Forecasting

2 Overview
- Introduction
- Qualitative Forecasting Methods
- Quantitative Forecasting Models
- How to Have a Successful Forecasting System
- Computer Software for Forecasting

3 Introduction
- Demand estimates for products and services are the starting point for all the other planning in operations management.
- Forecasting is integral to production planning.
- Long-range survival, growth, and profitability, as well as short-range efficiency and effectiveness, depend on accurate forecasting.

4 Forecasting is an Integral Part of Business Planning
[Flow diagram: inputs (market, economic, other) feed the forecast method(s), which produce demand estimates; together with the sales forecast, the management team sets business strategy, which drives production resource forecasts.]

5 Some Reasons Why Forecasting is Essential in OM
- New facility planning requires long-range forecasts. It can take 5 years to design and build a new factory or design and implement a new production process.
- Production planning requires medium-range forecasts. Demand for products varies from month to month, and it can take several months to change the capacities of production processes.
- Workforce scheduling requires short-range forecasts. Demand for services (and the necessary staffing) can vary from hour to hour, and employees' weekly work schedules must be developed in advance.

6 Examples of Production Resource Forecasts

  Forecast Horizon | Time Span | Item Being Forecasted | Unit of Measure
  Long range | Years | Product lines, factory capacities | Dollars, tons
  Medium range | Months | Product groups, department capacities | Units, pounds
  Short range | Days, weeks | Specific products, machine capacities | Units, hours

7 Forecasting Methods
- Qualitative Approaches
- Quantitative Approaches

8 Qualitative Approaches
- Usually based on judgments about causal factors that underlie the demand for particular products or services
- Do not require a demand history for the product or service, and are therefore useful for new products/services
- Vary in sophistication from scientifically conducted surveys to intuitive hunches about future events
- The appropriate approach/method depends on a product's life cycle stage

9 Qualitative Methods
- Educated guess (intuitive hunches)
- Executive committee consensus
- Delphi method
- Survey of sales force
- Survey of customers
- Historical analogy
- Market research (scientifically conducted surveys)

10 Quantitative Forecasting Approaches
- Based on the assumption that the "forces" that generated the past demand will generate the future demand, i.e., history will tend to repeat itself
- Analysis of the past demand pattern provides a good basis for forecasting future demand
- The majority of quantitative approaches fall in the category of time series analysis

11 Time Series Analysis
- A time series is a set of numbers where the order or sequence of the numbers is important, e.g., historical demand
- Analysis of the time series identifies patterns
- Once the patterns are identified, an appropriate method can be used to develop a forecast

12 Components of a Time Series
- Trend is noted by an upward or downward sloping line.
- Cycle is a data pattern that may cover several years before it repeats itself.
- Seasonality is a data pattern that repeats itself over the period of one year or less.
- Random fluctuation (noise) results from random variation or unexplained causes.


14 Product Demand Charted over 4 Years with Trend and Seasonality
[Chart: the actual demand line over Years 1-4, showing seasonal peaks, an upward trend component, random variation, and the average demand over the four years.]

15 Seasonal Patterns

  Length of Time Before Pattern Is Repeated | Length of Season | Number of Seasons in Pattern
  Year | Quarter | 4
  Year | Month | 12
  Year | Week | 52
  Month | Day | 28-31
  Week | Day | 7

16 Quantitative Forecasting Approaches
- Linear Regression
- Simple Moving Average
- Weighted Moving Average
- Exponential Smoothing (exponentially weighted moving average)
- Exponential Smoothing with Trend (double exponential smoothing)

17 Long-Range Forecasts
- Time spans usually greater than one year
- Necessary to support strategic decisions about planning products, processes, and facilities
- For long-range forecasting, it is important to plot historical time series data in order to identify data patterns (trends, cycles, seasonality) so that a proper forecasting method can be used.

18 Simple Linear Regression
- Linear regression analysis establishes a relationship between a dependent variable and one or more independent variables.
- In simple linear regression analysis there is only one independent variable.
- If the data is a time series, the independent variable is the time period.
- The dependent variable is whatever we wish to forecast.

19 Simple Linear Regression
- Regression equation. The model is of the form:

  Y = a + bX

  where:
  Y = dependent variable
  X = independent variable
  a = y-axis intercept (the height of the line when X = 0)
  b = slope of the regression line (the amount by which Y increases when X increases by 1 unit)

20 Simple Linear Regression
- Constants a and b. The constants a and b are computed using the following equations:

  b = (nΣxy - ΣxΣy) / (nΣx² - (Σx)²)
  a = (Σy - bΣx) / n

21 Simple Linear Regression
- Once the a and b values are computed, a future value of X can be entered into the regression equation and a corresponding value of Y (the forecast) can be calculated.

22 Example: College Enrollment
- Simple Linear Regression

  At a small regional college, enrollments have grown steadily over the past six years, as evidenced below. Use time series regression to forecast the student enrollments for the next three years.

  Year | Students Enrolled (1000s)
  1 | 2.5
  2 | 2.8
  3 | 2.9
  4 | 3.2
  5 | 3.3
  6 | 3.4

23 Example: College Enrollment
- Simple Linear Regression

  x | y | x² | xy
  1 | 2.5 | 1 | 2.5
  2 | 2.8 | 4 | 5.6
  3 | 2.9 | 9 | 8.7
  4 | 3.2 | 16 | 12.8
  5 | 3.3 | 25 | 16.5
  6 | 3.4 | 36 | 20.4
  Σx = 21 | Σy = 18.1 | Σx² = 91 | Σxy = 66.5

24 Example: College Enrollment
- Simple Linear Regression

  b = (6(66.5) - 21(18.1)) / (6(91) - (21)²) = 18.9/105 = 0.180
  a = (18.1 - 0.180(21)) / 6 = 2.387

  Y = 2.387 + 0.180X

25 Example: College Enrollment
- Simple Linear Regression

  Y7 = 2.387 + 0.180(7) = 3.65, or 3,650 students
  Y8 = 2.387 + 0.180(8) = 3.83, or 3,830 students
  Y9 = 2.387 + 0.180(9) = 4.01, or 4,010 students

  Note: Enrollment is expected to increase by 180 students per year.
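The least-squares computation on slides 23-25 can be sketched in Python. This is a minimal illustration using the enrollment data from the example above:

```python
# Least-squares fit of Y = a + bX for the college enrollment example.
# x = year (1..6), y = students enrolled (thousands).
xs = [1, 2, 3, 4, 5, 6]
ys = [2.5, 2.8, 2.9, 3.2, 3.3, 3.4]

n = len(xs)
sum_x = sum(xs)
sum_y = sum(ys)
sum_x2 = sum(x * x for x in xs)
sum_xy = sum(x * y for x, y in zip(xs, ys))

# b = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2),  a = (Sy - b*Sx) / n
b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
a = (sum_y - b * sum_x) / n

print(round(a, 3), round(b, 3))   # a = 2.387, b = 0.180, as on the slide
for year in (7, 8, 9):            # forecasts for the next three years
    print(year, round(a + b * year, 2))   # 3.65, 3.83, 4.01 (thousands)
```

The same fit is available in one call via `statistics.linear_regression` in Python 3.10+, but the explicit sums mirror the slide's hand calculation.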

26 Simple Linear Regression
- Simple linear regression can also be used when the independent variable X represents a variable other than time.
- In this case, linear regression is representative of a class of forecasting models called causal forecasting models.

27 Example: Railroad Products Co.
- Simple Linear Regression (Causal Model)

  The manager of RPC wants to project sales for the next 3 years. He knows that RPC's long-range sales are tied very closely to national freight car loadings. On the next slide are 7 years of relevant historical data. Develop a simple linear regression model between RPC sales and national freight car loadings. Forecast RPC sales for the next 3 years, given that the rail industry estimates car loadings of 250, 270, and 300 million.

28 Example: Railroad Products Co.
- Simple Linear Regression (Causal Model)

  Year | RPC Sales ($millions) | Car Loadings (millions)
  1 | 9.5 | 120
  2 | 11.0 | 135
  3 | 12.0 | 130
  4 | 12.5 | 150
  5 | 14.0 | 170
  6 | 16.0 | 190
  7 | 18.0 | 220

29 Example: Railroad Products Co.
- Simple Linear Regression (Causal Model)

  x | y | x² | xy
  120 | 9.5 | 14,400 | 1,140
  135 | 11.0 | 18,225 | 1,485
  130 | 12.0 | 16,900 | 1,560
  150 | 12.5 | 22,500 | 1,875
  170 | 14.0 | 28,900 | 2,380
  190 | 16.0 | 36,100 | 3,040
  220 | 18.0 | 48,400 | 3,960
  Σx = 1,115 | Σy = 93.0 | Σx² = 185,425 | Σxy = 15,440

30 Example: Railroad Products Co.
- Simple Linear Regression (Causal Model)

  b = (7(15,440) - 1,115(93.0)) / (7(185,425) - (1,115)²) = 4,385/54,750 = 0.0801
  a = (93.0 - 0.0801(1,115)) / 7 = 0.528

  Y = 0.528 + 0.0801X

31 Example: Railroad Products Co.
- Simple Linear Regression (Causal Model)

  Y8 = 0.528 + 0.0801(250) = $20.55 million
  Y9 = 0.528 + 0.0801(270) = $22.16 million
  Y10 = 0.528 + 0.0801(300) = $24.56 million

  Note: RPC sales are expected to increase by $80,100 for each additional million national freight car loadings.

32 Multiple Regression Analysis
- Multiple regression analysis is used when there are two or more independent variables.
- An example of a multiple regression equation is:

  Y = 50.0 + 0.05X1 + 0.10X2 - 0.03X3

  where:
  Y = firm's annual sales ($millions)
  X1 = industry sales ($millions)
  X2 = regional per capita income ($thousands)
  X3 = regional per capita debt ($thousands)
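As a quick illustration, the example equation above can be evaluated in Python. The input values passed in below are hypothetical, chosen only to show the mechanics of the model:

```python
# Evaluating the example multiple regression equation from the slide.
def annual_sales(industry_sales, per_capita_income, per_capita_debt):
    """Y = 50.0 + 0.05*X1 + 0.10*X2 - 0.03*X3 (units as on the slide)."""
    return (50.0 + 0.05 * industry_sales
            + 0.10 * per_capita_income
            - 0.03 * per_capita_debt)

# Hypothetical inputs: industry sales 250 ($M), income 20 ($K), debt 5 ($K).
print(round(annual_sales(250, 20, 5), 2))   # 50 + 12.5 + 2.0 - 0.15 = 64.35
```

Fitting the three coefficients themselves requires solving the least-squares normal equations over all the predictors, which is usually delegated to statistical software (see slide 84).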

33 Coefficient of Correlation (r)
- The coefficient of correlation, r, explains the relative importance of the relationship between x and y.
- The sign of r shows the direction of the relationship.
- The absolute value of r shows the strength of the relationship.
- The sign of r is always the same as the sign of b.
- r can take on any value between -1 and +1.

34 Coefficient of Correlation (r)
- Meanings of several values of r:
  -1: a perfect negative relationship (as x goes up, y goes down by one unit, and vice versa)
  +1: a perfect positive relationship (as x goes up, y goes up by one unit, and vice versa)
  0: no relationship exists between x and y
  +0.3: a weak positive relationship
  -0.8: a strong negative relationship

35 Coefficient of Correlation (r)
- r is computed by:

  r = (nΣxy - ΣxΣy) / √[(nΣx² - (Σx)²)(nΣy² - (Σy)²)]

36 Coefficient of Determination (r²)
- The coefficient of determination, r², is the square of the coefficient of correlation.
- The modification of r to r² allows us to shift from subjective measures of relationship to a more specific measure.
- r² is determined by the ratio of explained variation to total variation:

  r² = explained variation / total variation

37 Coefficient of Determination (r²)
- The coefficient of determination, r², is useful because it gives the proportion of the variance (fluctuation) of one variable that is predictable from the other variable.
- It is a measure that allows us to determine how certain one can be in making predictions from a certain model/graph.
- The coefficient of determination is the ratio of the explained variation to the total variation.
- The coefficient of determination satisfies 0 ≤ r² ≤ 1 and denotes the strength of the linear association between x and y.

38 Coefficient of Determination (r²)
- The coefficient of determination represents the proportion of the total variation in y that is explained by the line of best fit. For example, if r = 0.922, then r² = 0.850, which means that 85% of the total variation in y can be explained by the linear relationship between x and y (as described by the regression equation). The other 15% of the total variation in y remains unexplained.
- The coefficient of determination is a measure of how well the regression line represents the data. If the regression line passed exactly through every point on the scatter plot, it would explain all of the variation. The further the line is from the points, the less it is able to explain.

39 Example: Railroad Products Co.
- Coefficient of Correlation

  x | y | x² | xy | y²
  120 | 9.5 | 14,400 | 1,140 | 90.25
  135 | 11.0 | 18,225 | 1,485 | 121.00
  130 | 12.0 | 16,900 | 1,560 | 144.00
  150 | 12.5 | 22,500 | 1,875 | 156.25
  170 | 14.0 | 28,900 | 2,380 | 196.00
  190 | 16.0 | 36,100 | 3,040 | 256.00
  220 | 18.0 | 48,400 | 3,960 | 324.00
  Σx = 1,115 | Σy = 93.0 | Σx² = 185,425 | Σxy = 15,440 | Σy² = 1,287.50

40 Example: Railroad Products Co.
- Coefficient of Correlation

  r = (7(15,440) - 1,115(93.0)) / √[(7(185,425) - (1,115)²)(7(1,287.50) - (93.0)²)]
    = 4,385 / √[(54,750)(363.5)]
    = .9829

41 Example: Railroad Products Co.
- Coefficient of Determination

  r² = (.9829)² = .966

  96.6% of the variation in RPC sales is explained by national freight car loadings.
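The r and r² computations for the RPC data can be reproduced with a short Python sketch of the formula on slide 35:

```python
from math import sqrt

# Coefficient of correlation and determination for the RPC example.
x = [120, 135, 130, 150, 170, 190, 220]         # car loadings (millions)
y = [9.5, 11.0, 12.0, 12.5, 14.0, 16.0, 18.0]   # RPC sales ($millions)

n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
syy = sum(v * v for v in y)
sxy = sum(xi * yi for xi, yi in zip(x, y))

# r = (n*Sxy - Sx*Sy) / sqrt((n*Sxx - Sx^2)(n*Syy - Sy^2))
r = (n * sxy - sx * sy) / sqrt((n * sxx - sx ** 2) * (n * syy - sy ** 2))

print(round(r, 4), round(r ** 2, 3))   # 0.9829 0.966
```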

42 Ranging Forecasts
- Forecasts for future periods are only estimates and are subject to error.
- One way to deal with uncertainty is to develop best-estimate forecasts and the ranges within which the actual data are likely to fall.
- The ranges of a forecast are defined by the upper and lower limits of a confidence interval.

43 Ranging Forecasts
- The ranges or limits of a forecast are estimated by:

  Upper limit = Y + t(s_yx)
  Lower limit = Y - t(s_yx)

  where:
  Y = best-estimate forecast
  t = number of standard deviations from the mean of the distribution required to provide a given probability of exceeding the limits through chance
  s_yx = standard error of the forecast

44 Ranging Forecasts
- The standard error (deviation) of the forecast is computed as:

  s_yx = √[(Σy² - aΣy - bΣxy) / (n - 2)]

45 Example: Railroad Products Co.
- Ranging Forecasts

  Recall that linear regression analysis provided a forecast of annual sales for RPC in year 8 equal to $20.55 million. Set the limits (ranges) of the forecast so that there is only a 5 percent probability of exceeding the limits by chance.

46 Example: Railroad Products Co.
- Ranging Forecasts
- Step 1: Compute the standard error of the forecast, s_yx.

  s_yx = √[(1,287.5 - 0.528(93.0) - 0.0801(15,440)) / (7 - 2)] = .5748

- Step 2: Determine the appropriate value for t.
  n = 7, so degrees of freedom = n - 2 = 5.
  Level of significance α = .05.
  Appendix B, Table 2 shows t = 2.571.

47 Example: Railroad Products Co.
- Ranging Forecasts
- Step 3: Compute the upper and lower limits.

  Upper limit = 20.55 + 2.571(.5748) = 20.55 + 1.478 = 22.028
  Lower limit = 20.55 - 2.571(.5748) = 20.55 - 1.478 = 19.072

  We are 95% confident the actual sales for year 8 will be between $19.072 million and $22.028 million.
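The three ranging steps can be sketched in Python. The sums, coefficients, and t-value below are taken from the worked example above; because this version does not round s_yx or the year-8 forecast to two decimals before combining them, its limits differ from the slide's in the third decimal:

```python
from math import sqrt

# 95% forecast interval for RPC year-8 sales.
n, sum_y, sum_y2, sum_xy = 7, 93.0, 1287.50, 15440.0
a, b = 0.528, 0.0801          # fitted line Y = 0.528 + 0.0801X
t = 2.571                     # t for 5% significance, n - 2 = 5 d.f.

# Step 1: standard error of the forecast.
s_yx = sqrt((sum_y2 - a * sum_y - b * sum_xy) / (n - 2))
print(round(s_yx, 4))         # 0.5748

# Steps 2-3: forecast at 250 million car loadings, then the limits.
forecast = a + b * 250
lower = forecast - t * s_yx
upper = forecast + t * s_yx
print(round(lower, 3), round(upper, 3))   # close to the slide's 19.072, 22.028
```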

48 Seasonalized Time Series Regression Analysis
- Select a representative historical data set.
- Develop a seasonal index for each season.
- Use the seasonal indexes to deseasonalize the data.
- Perform linear regression analysis on the deseasonalized data.
- Use the regression equation to compute the forecasts.
- Use the seasonal indexes to reapply the seasonal patterns to the forecasts.

49 Example: Computer Products Corp.
- Seasonalized Time Series Regression Analysis

  An analyst at CPC wants to develop next year's quarterly forecasts of sales revenue for CPC's line of Epsilon Computers. She believes that the most recent 8 quarters of sales (shown on the next slide) are representative of next year's sales.

50 Example: Computer Products Corp.
- Seasonalized Time Series Regression Analysis
- Representative Historical Data Set

  Year | Qtr. | Sales ($mil.)
  1 | 1 | 7.4
  1 | 2 | 6.5
  1 | 3 | 4.9
  1 | 4 | 16.1
  2 | 1 | 8.3
  2 | 2 | 7.4
  2 | 3 | 5.4
  2 | 4 | 18.0

51 Example: Computer Products Corp.
- Seasonalized Time Series Regression Analysis
- Compute the Seasonal Indexes

  Quarterly Sales
  Year | Q1 | Q2 | Q3 | Q4 | Total
  1 | 7.4 | 6.5 | 4.9 | 16.1 | 34.9
  2 | 8.3 | 7.4 | 5.4 | 18.0 | 39.1
  Totals | 15.7 | 13.9 | 10.3 | 34.1 | 74.0
  Qtr. Avg. | 7.85 | 6.95 | 5.15 | 17.05 | 9.25
  Seas. Index | .849 | .751 | .557 | 1.843 | 4.000

  (Each seasonal index is the quarter's average divided by the overall quarterly average of 9.25.)

52 Example: Computer Products Corp.
- Seasonalized Time Series Regression Analysis
- Deseasonalize the Data (divide each quarter's sales by its seasonal index)

  Deseasonalized Quarterly Sales
  Year | Q1 | Q2 | Q3 | Q4
  1 | 8.72 | 8.66 | 8.80 | 8.74
  2 | 9.78 | 9.85 | 9.69 | 9.77

53 Example: Computer Products Corp.
- Seasonalized Time Series Regression Analysis
- Perform Regression on Deseasonalized Data

  Yr. | Qtr. | x | y | x² | xy
  1 | 1 | 1 | 8.72 | 1 | 8.72
  1 | 2 | 2 | 8.66 | 4 | 17.32
  1 | 3 | 3 | 8.80 | 9 | 26.40
  1 | 4 | 4 | 8.74 | 16 | 34.96
  2 | 1 | 5 | 9.78 | 25 | 48.90
  2 | 2 | 6 | 9.85 | 36 | 59.10
  2 | 3 | 7 | 9.69 | 49 | 67.83
  2 | 4 | 8 | 9.77 | 64 | 78.16
  Totals | | 36 | 74.01 | 204 | 341.39

54 Example: Computer Products Corp.
- Seasonalized Time Series Regression Analysis
- Perform Regression on Deseasonalized Data

  Y = 8.357 + 0.199X

55 Example: Computer Products Corp.
- Seasonalized Time Series Regression Analysis
- Compute the Deseasonalized Forecasts

  Y9 = 8.357 + 0.199(9) = 10.148
  Y10 = 8.357 + 0.199(10) = 10.347
  Y11 = 8.357 + 0.199(11) = 10.546
  Y12 = 8.357 + 0.199(12) = 10.745

  Note: Average sales are expected to increase by .199 million (about $200,000) per quarter.

56 Example: Computer Products Corp.
- Seasonalized Time Series Regression Analysis
- Seasonalize the Forecasts (multiply each deseasonalized forecast by its seasonal index)

  Yr. | Qtr. | Seas. Index | Deseas. Forecast | Seas. Forecast
  3 | 1 | .849 | 10.148 | 8.62
  3 | 2 | .751 | 10.347 | 7.77
  3 | 3 | .557 | 10.546 | 5.87
  3 | 4 | 1.843 | 10.745 | 19.80
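The whole six-step procedure from slide 48, applied to the CPC data, can be sketched in Python. Because this version carries unrounded seasonal indexes through the calculation, its results differ from the slides' in the last decimal place:

```python
# Seasonalized time-series regression for the CPC example.
sales = {  # (year, quarter) -> sales ($millions)
    (1, 1): 7.4, (1, 2): 6.5, (1, 3): 4.9, (1, 4): 16.1,
    (2, 1): 8.3, (2, 2): 7.4, (2, 3): 5.4, (2, 4): 18.0,
}

# 1-2. Seasonal index per quarter = quarter average / overall average.
overall_avg = sum(sales.values()) / 8
index = {q: (sales[(1, q)] + sales[(2, q)]) / 2 / overall_avg
         for q in range(1, 5)}

# 3-4. Deseasonalize, then fit Y = a + bX on periods x = 1..8.
xs = list(range(1, 9))
ys = [sales[(yr, q)] / index[q] for yr in (1, 2) for q in range(1, 5)]
n = len(xs)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / \
    (n * sum(x * x for x in xs) - sum(xs) ** 2)
a = (sum(ys) - b * sum(xs)) / n

# 5-6. Forecast periods 9-12, then reapply the seasonal pattern.
forecasts = {q: (a + b * (8 + q)) * index[q] for q in range(1, 5)}
print(round(a, 3), round(b, 3))                     # near 8.357 and 0.199
print({q: round(f, 2) for q, f in forecasts.items()})
```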

57 Evaluating Forecast-Model Performance
- Accuracy
  - Accuracy is the typical criterion for judging the performance of a forecasting approach.
  - Accuracy is how well the forecasted values match the actual values.

58 Monitoring Accuracy
- The accuracy of a forecasting approach needs to be monitored to assess the confidence you can have in its forecasts, and changes in the market may require reevaluation of the approach.
- Accuracy can be measured in several ways:
  - Standard error of the forecast (s_yx, covered earlier)
  - Mean absolute deviation (MAD)
  - Mean squared error (MSE)

59 Monitoring Accuracy
- Mean Absolute Deviation (MAD)

  MAD = Σ|Actual demand - Forecast demand| / (number of periods)

60 Monitoring Accuracy
- Mean Squared Error (MSE)

  MSE = (s_yx)²

- A small value for s_yx means data points are tightly grouped around the regression line and the error range is small.
- When forecast errors are normally distributed, the values of MAD and s_yx are related:

  s_yx ≈ 1.25(MAD)

61 Short-Range Forecasting Methods
- Naive Method
- (Simple) Moving Average
- Weighted Moving Average
- Exponential Smoothing
- Exponential Smoothing with Trend

62 Naive Method for Forecasting
- Forecasts based only on the most recent observation are called "naive forecasts."
- Assumes that the next period will be identical to the present:

  F_t = A_t-1

  where F_t is the forecast value for time period t and A_t-1 is the observed value one period earlier.

63 Simple Moving Average
- An averaging period (AP) is given or selected.
- The forecast for the next period is the arithmetic average of the AP most recent actual demands.
- It is called a "simple" average because each period used to compute the average is equally weighted.

64 Simple Moving Average
- It is called "moving" because as new demand data becomes available, the oldest data is no longer used.
- By increasing the AP, the forecast becomes less responsive to fluctuations in demand.
- By decreasing the AP, the forecast becomes more responsive to fluctuations in demand.

65 Example (Moving Average)
- See Example 3.5 (page 83).

66 Summary of Moving Averages
- Advantages of the moving average method:
  - Easily understood
  - Easily computed
  - Provides stable forecasts
- Disadvantages of the moving average method:
  - Requires saving all past N data points
  - Lags behind a trend
  - Ignores complex relationships in data

67 Simple Moving Average (Figure 3-4)

  MA_n = (Σ A_i) / n, where the sum runs over the n most recent actual demands A_i

[Chart: actual demand plotted against 3-period (MA3) and 5-period (MA5) moving averages.]

68 Weighted Moving Average
- This is a variation on the simple moving average where the weights used to compute the average are not equal.
- This allows more recent demand data to have a greater effect on the moving average, and therefore on the forecast.
- The weights must add to 1.0 and generally decrease in value with the age of the data.

69 Exponential Smoothing
- The weights used to compute the forecast (moving average) are exponentially distributed.
- The forecast is the sum of the old forecast and a portion (α) of the forecast error (A_t-1 - F_t-1):

  F_t = F_t-1 + α(A_t-1 - F_t-1)

  or, equivalently,

  F_t = (1 - α)F_t-1 + αA_t-1

70 Exponential Smoothing: Concept
- Include all past observations.
- Weigh recent observations much more heavily than very old observations.

[Chart: the weight placed on each observation decreases exponentially from today's observation back through older observations.]

71 Exponential Smoothing
- It is a special case of the weighted moving averages method in which we select only the weight for the most recent observation. The weight placed on the most recent observation is the value of the smoothing constant, α.
- The weights for the other data values are computed automatically and become smaller at an exponential rate as the observations become older.

72 Example: Central Call Center
- Moving Average

  CCC wishes to forecast the number of incoming calls it receives in a day from the customers of one of its clients, BMI. CCC schedules the appropriate number of telephone operators based on projected call volumes. CCC believes that the most recent 12 days of call volumes (shown on the next slide) are representative of the near-future call volumes.

73 Example: Central Call Center
- Moving Average
- Representative Historical Data

  Day | Calls
  1 | 159
  2 | 217
  3 | 186
  4 | 161
  5 | 173
  6 | 157
  7 | 203
  8 | 195
  9 | 188
  10 | 168
  11 | 198
  12 | 159

74 Example: Central Call Center
- Moving Average

  Use the moving average method with an AP = 3 days to develop a forecast of the call volume in Day 13.

  F13 = (168 + 198 + 159)/3 = 175.0 calls

75 Example: Central Call Center
- Weighted Moving Average

  Use the weighted moving average method with an AP = 3 days and weights of .1 (for the oldest datum), .3, and .6 to develop a forecast of the call volume in Day 13.

  F13 = .1(168) + .3(198) + .6(159) = 171.6 calls

  Note: The WMA forecast is lower than the MA forecast because Day 12's relatively low call volume carries almost twice as much weight in the WMA (.60) as it does in the MA (.33).

76 Example: Central Call Center
- Exponential Smoothing

  If a smoothing constant value of .25 is used and the exponential smoothing forecast for Day 11 was 180.76 calls, what is the exponential smoothing forecast for Day 13?

  F12 = 180.76 + .25(198 - 180.76) = 185.07
  F13 = 185.07 + .25(159 - 185.07) = 178.55
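The three Day-13 forecasts from slides 74-76 can be reproduced with a short Python sketch:

```python
# Day-13 forecasts for the call-center example, three ways.
calls = [159, 217, 186, 161, 173, 157, 203, 195, 188, 168, 198, 159]  # days 1-12

# Simple moving average, AP = 3: average of the last three days.
ma = sum(calls[-3:]) / 3

# Weighted moving average, weights .1/.3/.6 from oldest to newest.
wma = 0.1 * calls[-3] + 0.3 * calls[-2] + 0.6 * calls[-1]

# Exponential smoothing, alpha = .25, starting from the Day-11 forecast 180.76.
f = 180.76
for actual in calls[10:]:          # roll forward through days 11 and 12
    f = f + 0.25 * (actual - f)

print(ma, round(wma, 1), round(f, 2))   # 175.0 171.6 178.55
```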

77 Example: Central Call Center
- Forecast Accuracy (MAD)

  Which forecasting method (the AP = 3 moving average or the α = .25 exponential smoothing) is preferred, based on the MAD over the most recent 9 days? (Assume that the exponential smoothing forecast for Day 3 is the same as the actual call volume.)

78 Example: Central Call Center

  AP = 3 moving average vs. α = .25 exponential smoothing

  Day | Calls | MA Forec. | |Error| | ES Forec. | |Error|
  4 | 161 | 187.3 | 26.3 | 186.0 | 25.0
  5 | 173 | 188.0 | 15.0 | 179.8 | 6.8
  6 | 157 | 173.3 | 16.3 | 178.1 | 21.1
  7 | 203 | 163.7 | 39.3 | 172.8 | 30.2
  8 | 195 | 177.7 | 17.3 | 180.4 | 14.6
  9 | 188 | 185.0 | 3.0 | 184.0 | 4.0
  10 | 168 | 195.3 | 27.3 | 185.0 | 17.0
  11 | 198 | 183.7 | 14.3 | 180.8 | 17.2
  12 | 159 | 184.7 | 25.7 | 185.1 | 26.1
  MAD | | | 20.5 | | 18.0

  The exponential smoothing method is preferred here, since its MAD is lower.

79 Exponential Smoothing with Trend
- As we move toward medium-range forecasts, trend becomes more important.
- Incorporating a trend component into exponentially smoothed forecasts is called double exponential smoothing.
- The estimate for the average and the estimate for the trend are both smoothed.

80 Exponential Smoothing with Trend
- Model Form

  FT_t = S_t-1 + T_t-1

  where:
  FT_t = forecast with trend in period t
  S_t-1 = smoothed forecast (average) in period t-1
  T_t-1 = smoothed trend estimate in period t-1

81 Exponential Smoothing with Trend
- Smoothing the Average

  S_t = FT_t + α(A_t - FT_t)

- Smoothing the Trend

  T_t = T_t-1 + β(FT_t - FT_t-1 - T_t-1)

  where:
  α = smoothing constant for the average
  β = smoothing constant for the trend
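A minimal Python sketch of these equations follows. The demand series and the initial values of S, T, and the prior forecast are hypothetical, chosen only to show the mechanics:

```python
# Double exponential smoothing (exponential smoothing with trend),
# following the slides' equations. All starting values are hypothetical.
alpha, beta = 0.2, 0.3
demand = [115, 125, 140]

s, t = 100.0, 10.0      # initial smoothed average and trend (assumed)
prev_ft = 100.0         # assumed forecast for the initialization period

for a_t in demand:
    ft = s + t                          # FT_t = S_{t-1} + T_{t-1}
    s = ft + alpha * (a_t - ft)         # smooth the average
    t = t + beta * (ft - prev_ft - t)   # smooth the trend
    prev_ft = ft
    print(round(ft, 2))                 # 110.0, 121.0, 132.1

print(round(s + t, 2))                  # next-period forecast: 144.22
```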

82 Example (Exponential Smoothing with Trend)
- See Example 3.7 (page 89).

83 Monitoring and Controlling a Forecasting Model
- Tracking Signal (TS)
  - The TS measures the cumulative forecast error over n periods in terms of MAD:

    TS = Σ(Actual demand - Forecast demand) / MAD

  - If the forecasting model is performing well, the TS should be around zero.
  - The TS indicates the direction of the forecasting error: if the TS is positive, increase the forecasts; if the TS is negative, decrease the forecasts.
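A small Python sketch of the tracking signal, using hypothetical actuals and forecasts chosen only for illustration:

```python
# Tracking signal = cumulative forecast error expressed in MADs.
actuals = [100, 110, 105, 120]     # hypothetical demand
forecasts = [105, 105, 105, 105]   # hypothetical forecasts

errors = [a - f for a, f in zip(actuals, forecasts)]
mad = sum(abs(e) for e in errors) / len(errors)
ts = sum(errors) / mad

print(ts)   # 15 / 6.25 = 2.4: forecasts are running low; adjust them upward
```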

84 Computer Software for Forecasting
- Examples of computer software with forecasting capabilities:
  - Primarily for forecasting: Forecast Pro, Autobox, SmartForecasts for Windows
  - Have forecasting modules: SAS, SPSS, SAP, POM Software Library

