Slide 1: Chapter 27 Time Series
Copyright © 2014, 2011 Pearson Education, Inc.

Slide 2: 27.1 Decomposing a Time Series
Based on monthly shipments of computers and electronics in the US from 1992 through 2011, what would you forecast for the future?
- Use methods for modeling time series, including regression.
- Remember that forecasts are always extrapolations in time.

Slide 3: 27.1 Decomposing a Time Series
The analysis of a time series begins with a timeplot, such as that of monthly shipments of computers and electronics shown on the slide.

Slide 4: 27.1 Decomposing a Time Series
- Forecast: a prediction of a future value of a time series that extrapolates historical patterns.
- Components of a time series:
  - Trend: a smooth, slow meandering pattern.
  - Seasonal: cyclical oscillations related to the seasons.
  - Irregular: random variation.

Slide 5: 27.1 Decomposing a Time Series: Smoothing
- Smoothing: removing the irregular and seasonal components of a time series to enhance the visibility of the trend.
- Moving average: a weighted average of adjacent values of a time series; the more terms that are averaged, the smoother the estimate of the trend (see the sketch below).
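A minimal sketch of the moving-average idea in Python (pandas), assuming monthly data in a pandas Series; the series name, dates, and values are illustrative stand-ins, not the chapter's shipment data.

```python
import pandas as pd

def moving_average(series: pd.Series, terms: int = 13) -> pd.Series:
    """Centered moving average; averaging more terms gives a smoother trend estimate."""
    return series.rolling(window=terms, center=True).mean()

# Illustrative monthly series (not the shipments data from the text)
shipments = pd.Series(
    [30, 32, 29, 35, 33, 31, 36, 38, 34, 39, 41, 37, 42, 44, 40, 45, 47, 43],
    index=pd.date_range("1992-01-01", periods=18, freq="MS"),
)
trend = moving_average(shipments, terms=13)   # 13-month moving average, as on the next slide
print(trend.dropna())
```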

Slide 6: 27.1 Decomposing a Time Series: Smoothing, Monthly Shipments Example
Timeplot of monthly shipments; the red line is a 13-month moving average.

Slide 7: 27.1 Decomposing a Time Series: Smoothing
- Seasonally adjusted: a time series from which the seasonal component has been removed.
- Many government-reported series are seasonally adjusted, for example, unemployment rates.

Slide 8: 27.1 Decomposing a Time Series: Smoothing, Monthly Shipments Example
The shipments series has a strong seasonal component (a three-month cycle).

Slide 9: 27.1 Decomposing a Time Series: Exponential Smoothing
- Exponentially weighted moving average (EWMA): a weighted average of past observations with geometrically declining weights.
- The EWMA can be written as s_t = (1 - w) y_t + w s_{t-1}, with 0 ≤ w < 1. Hence, the current smoothed value is a weighted average of the current observation and the prior smoothed value.

Slide 10: 27.1 Decomposing a Time Series: Exponential Smoothing
- The choice of w affects the amount of smoothing: the larger w is, the smoother s_t becomes.
- The larger w is, the more the smoothed values trail behind the observations. (A short sketch of the EWMA recursion follows.)
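A minimal sketch of the EWMA recursion s_t = (1 - w) y_t + w s_{t-1} in plain Python; the data are made up, and starting the recursion at the first observation is one common convention, not necessarily the text's.

```python
def ewma(y, w=0.5):
    """Exponentially weighted moving average with weight w on the prior smoothed value."""
    smoothed = []
    s = y[0]                            # initialize the recursion at the first observation
    for value in y:
        s = (1 - w) * value + w * s     # current smoothed value blends the observation and prior s
        smoothed.append(s)
    return smoothed

y = [10, 12, 9, 14, 13, 11, 15, 16]
print(ewma(y, w=0.5))   # moderately smooth
print(ewma(y, w=0.8))   # smoother, but trails the observations more
```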

Slide 11: 27.1 Decomposing a Time Series: Exponential Smoothing
Monthly shipments example with w = 0.5.

Slide 12: 27.1 Decomposing a Time Series: Exponential Smoothing
Monthly shipments example with w = 0.8.

Slide 13: 27.2 Regression Models
- Leading indicator: an explanatory variable that anticipates coming changes in a time series. Leading indicators are hard to find.
- Predictor: an ad hoc explanatory variable in a regression model used to forecast a time series (e.g., the time index t).

Slide 14: 27.2 Regression Models: Polynomial Trends
- Polynomial trend: a regression model for a time series that uses powers of the time index t as explanatory variables.
- Example: the third-degree (cubic) polynomial y_t = β0 + β1 t + β2 t² + β3 t³ + ε_t (see the sketch below).
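As a sketch of how a polynomial trend might be fit in practice, the following regresses an illustrative series on t, t², and t³ with statsmodels; the data and variable names are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative series with a gentle upward trend (not the shipments data)
y = np.array([3.1, 3.4, 3.2, 3.9, 4.1, 4.0, 4.6, 4.8, 5.1, 5.0, 5.5, 5.9])
t = np.arange(1, len(y) + 1)

# Design matrix for y_t = b0 + b1*t + b2*t^2 + b3*t^3 + error
X = sm.add_constant(np.column_stack([t, t**2, t**3]))
cubic_trend = sm.OLS(y, X).fit()
print(cubic_trend.params)   # estimated polynomial coefficients
```

Higher powers of t can be added the same way, but, as the next slides show, high-degree polynomials that fit the history well often forecast poorly.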

Slide 15: 27.2 Regression Models: Polynomial Trends
Monthly shipments: a sixth-degree polynomial. The high R² indicates a great fit to the historical data.

Slide 16: 27.2 Regression Models: Polynomial Trends
Monthly shipments: a sixth-degree polynomial. The model has serious problems forecasting beyond the data.

Slide 17: 27.2 Regression Models: Polynomial Trends
Avoid forecasting with polynomials that have high powers of the time index.

Slide 18: 4M Example 27.1: PREDICTING SALES OF NEW CARS (Motivation)
The U.S. auto industry neared collapse in 2008-2009. How badly did the recession hit this industry? What would we have expected had the recession not happened?

Slide 19: 4M Example 27.1: PREDICTING SALES OF NEW CARS (Motivation)
Timeplot of quarterly sales (in thousands); cars in blue, light trucks in orange.

Slide 20: 4M Example 27.1: PREDICTING SALES OF NEW CARS (Method)
- Use regression to model the trend and seasonal components apparent in the timeplot.
- Use a polynomial for the trend and three dummy variables for the four quarters: Q1 = 1 if quarter 1, 0 otherwise; Q2 = 1 if quarter 2, 0 otherwise; Q3 = 1 if quarter 3, 0 otherwise. The fourth quarter is the baseline category. (A sketch of this setup follows.)
- Consider the possibility of lurking variables (e.g., gasoline prices).
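A minimal sketch of this trend-plus-seasonal-dummies regression using a statsmodels formula; the DataFrame, column names, and simulated sales figures are hypothetical placeholders for the car-sales data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated quarterly sales (thousands); stand-in for the actual data
rng = np.random.default_rng(0)
quarters = pd.period_range("1990Q1", periods=40, freq="Q")
cars = pd.DataFrame({
    "sales": 1800 + 5 * np.arange(40) + rng.normal(0, 50, size=40),
    "t": np.arange(1, 41),
    "quarter": quarters.quarter,
})

# Dummy variables for quarters 1-3; the fourth quarter is the baseline category
cars["Q1"] = (cars["quarter"] == 1).astype(int)
cars["Q2"] = (cars["quarter"] == 2).astype(int)
cars["Q3"] = (cars["quarter"] == 3).astype(int)

fit = smf.ols("sales ~ t + Q1 + Q2 + Q3", data=cars).fit()
print(fit.summary())
```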

Slide 21: 4M Example 27.1: PREDICTING SALES OF NEW CARS (Mechanics)
Linear and quadratic trends fit to the data; the linear trend appears more appropriate.

Slide 22: 4M Example 27.1: PREDICTING SALES OF NEW CARS (Mechanics)
Estimate the model. Check conditions before proceeding with inference.

Slide 23: 4M Example 27.1: PREDICTING SALES OF NEW CARS (Mechanics)
Examine the residual plot. This plot, along with the Durbin-Watson statistic D = 0.86, indicates dependence in the residuals, so we cannot form confidence or prediction intervals.

Slide 24: 4M Example 27.1: PREDICTING SALES OF NEW CARS (Message)
A regression model with a linear time trend and seasonal factors closely predicts sales of new cars in the first two quarters of 2008, but substantially overpredicts sales in the last two quarters and into 2009. The forecasts for 2008 are 1,807 thousand for the first quarter, 2,129 thousand for the second, 1,969 thousand for the third, and 1,741 thousand for the fourth.

Slide 25: 4M Example 27.1: PREDICTING SALES OF NEW CARS (Message)
Actual values in blue, historical forecasts in orange. Even though the recession ended in June 2009, car sales remain below the historical trend.

Slide 26: 27.2 Regression Models: Autoregression
- Autoregression: a regression that uses prior values of the response as predictors.
- Lagged variable: a prior value of the response in a time series.

Slide 27: 27.2 Regression Models: Autoregression
- The simplest autoregression is a simple regression with one lag of the response as the predictor: y_t = β0 + β1 y_{t-1} + ε_t.
- This model is called a first-order autoregression, denoted AR(1). (A fitting sketch follows.)
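A minimal sketch of fitting an AR(1) by ordinary least squares, regressing the series on its own first lag; the series here is illustrative, not the shipments data.

```python
import pandas as pd
import statsmodels.api as sm

shipments = pd.Series([30.1, 31.4, 29.8, 32.2, 33.0, 31.9, 34.1, 35.0, 34.4, 36.2])
data = pd.DataFrame({"y": shipments, "y_lag1": shipments.shift(1)}).dropna()

# y_t = b0 + b1 * y_{t-1} + e_t
ar1 = sm.OLS(data["y"], sm.add_constant(data["y_lag1"])).fit()
print(ar1.params)
```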

Slide 28: 27.2 Regression Models: Autoregression
Example: AR(1) for monthly shipments.

Slide 29: 27.2 Regression Models: Autoregression
Scatterplot of shipments on the lag: indicates a strong positive linear association.

Slide 30: 27.2 Regression Models: Autoregression
Summary of the AR(1) model for shipments.

Slide 31: 27.2 Regression Models: Forecasting an Autoregression
Example: use the AR(1) model to forecast shipments. For January 2010, plug the observed shipment for December 2009 into the fitted equation.

Slide 32: 27.2 Regression Models: Forecasting an Autoregression
For February 2010 there is no observed shipment for January 2010, so use the forecast for January 2010 in its place. Once forecasts are used in place of observations, the uncertainty compounds and is hard to quantify. (A sketch of this iterative forecasting follows.)
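A minimal sketch of this iterative, multi-step forecasting with a fitted AR(1); the intercept and slope below are placeholders, not the estimates reported for the shipments model.

```python
def ar1_forecast(last_observed, steps, b0=5.0, b1=0.9):
    """Forecast `steps` periods ahead from an AR(1) fit y_t = b0 + b1*y_{t-1}."""
    forecasts = []
    y = last_observed
    for _ in range(steps):
        y = b0 + b1 * y          # after the first step, forecasts feed later forecasts
        forecasts.append(y)
    return forecasts

# e.g., January-March forecasts built from the last observed December value
print(ar1_forecast(last_observed=36.2, steps=3))
```

Because the second and later steps reuse forecasts rather than observations, the usual prediction intervals understate the true uncertainty.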

Slide 33: 27.2 Regression Models: AR(5) Model for Shipments
Fit, forecasts, and prediction intervals.

Slide 34: 27.3 Checking the Model: Autoregression and the Durbin-Watson Statistic
Example 27.1: New Car Sales.

Slide 35: 27.3 Checking the Model: Autoregression and the Durbin-Watson Statistic
Example 27.1: New Car Sales.

Slide 36: 27.3 Checking the Model: Autoregression and the Durbin-Watson Statistic
- The residual plots show that the regression model leaves substantial dependence in the residuals.
- In the scatterplot, the correlation between residuals e_t and e_{t-1} is apparent (r = 0.56).

Slide 37: 27.3 Checking the Model: Autoregression and the Durbin-Watson Statistic
The Durbin-Watson statistic is related to the autocorrelation of the residuals in a regression:
D = Σ_{t=2}^{n} (e_t - e_{t-1})² / Σ_{t=1}^{n} e_t² ≈ 2(1 - r),
where r is the lag-1 autocorrelation of the residuals. (A computational sketch follows.)
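A minimal sketch of computing D directly from a vector of residuals; the residuals below are made up. (statsmodels also offers statsmodels.stats.stattools.durbin_watson, which computes the same quantity.)

```python
import numpy as np

def durbin_watson(e: np.ndarray) -> float:
    """Durbin-Watson statistic: sum of squared successive differences over sum of squares."""
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

e = np.array([0.5, 0.7, 0.4, -0.1, -0.6, -0.4, 0.2, 0.6])   # illustrative residuals
D = durbin_watson(e)
r = np.corrcoef(e[:-1], e[1:])[0, 1]   # lag-1 autocorrelation of the residuals
print(D, 2 * (1 - r))                  # D is roughly 2(1 - r); D near 2 means little autocorrelation
```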

Slide 38: 27.3 Checking the Model: Summary
Examine these plots of the residuals when fitting a time series regression:
- timeplot of the residuals;
- scatterplot of the residuals versus the fitted values; and
- scatterplot of the residuals versus lags of the residuals.

Slide 39: 4M Example 27.2: FORECASTING UNEMPLOYMENT (Motivation)
Using seasonally adjusted unemployment data from 1980 through the end of 2011, can a time series regression predict what happens to unemployment in 2012?

Slide 40: 4M Example 27.2: FORECASTING UNEMPLOYMENT (Motivation)

Slide 41: 4M Example 27.2: FORECASTING UNEMPLOYMENT (Method)
- Use a multiple regression of the percentage unemployed on lags of unemployment and time trends; in other words, combine an autoregression with a polynomial trend (see the sketch below).
- The scatterplot matrix shows linear association and possible collinearity; the hope is that the lags capture the effects of important omitted variables.
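A minimal sketch of the combined model described above: a regression on lagged values plus polynomial trend terms. The simulated series, the choice of two lags, and the quadratic trend are illustrative assumptions, not the exact specification in the text.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated monthly unemployment-like series (percent); stand-in data only
rng = np.random.default_rng(1)
unemp = pd.Series(7.0 + 0.03 * np.arange(60) + rng.normal(0, 0.1, 60))

df = pd.DataFrame({
    "y": unemp,
    "lag1": unemp.shift(1),   # unemployment last month
    "lag2": unemp.shift(2),   # unemployment two months ago
    "t": np.arange(1, 61),
})
df["t2"] = df["t"] ** 2       # quadratic trend term
df = df.dropna()

fit = sm.OLS(df["y"], sm.add_constant(df[["lag1", "lag2", "t", "t2"]])).fit()
print(fit.summary())
```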

Slide 42: 4M Example 27.2: FORECASTING UNEMPLOYMENT (Mechanics)
Estimate the model.

Slide 43: 4M Example 27.2: FORECASTING UNEMPLOYMENT (Mechanics)
All conditions for the model are satisfied; proceed with inference. Based on the F-statistic, reject H0: the model explains statistically significant variation. The fitted equation is shown in the regression output.

Slide 44: 4M Example 27.2: FORECASTING UNEMPLOYMENT (Message)
A multiple regression fit to monthly unemployment data from 1980 through 2011 predicts that unemployment in January 2012 will be between 8.1% and 8.7% with 95% probability. Forecasts for February and March call for unemployment to fall further, to 8.3% and 8.2%, respectively.

Slide 45: 4M Example 27.3: FORECASTING PROFITS (Motivation)
Forecast Best Buy's gross profits for 2012 using its quarterly gross profits from 1995 to 2011.

Slide 46: 4M Example 27.3: FORECASTING PROFITS (Method)
- Best Buy's profits have not only grown nonlinearly (faster and faster), but the growth is seasonal. In addition, the variation in profits appears to increase with the level.
- Consequently, transform the data by calculating the percentage change from year to year (see the sketch below). Let y_t denote these year-over-year percentage changes.
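A minimal sketch of the year-over-year transformation for quarterly data: the percentage change relative to the same quarter one year (four quarters) earlier. The profit figures are invented placeholders, not Best Buy's reported numbers.

```python
import pandas as pd

profits = pd.Series(
    [1.0, 1.2, 1.1, 1.8, 1.3, 1.5, 1.4, 2.2, 1.6, 1.9, 1.8, 2.8],
    index=pd.period_range("1995Q1", periods=12, freq="Q"),
)
# Percentage change from the same quarter one year earlier
yoy_pct_change = 100 * profits.pct_change(periods=4)
print(yoy_pct_change.dropna())
```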

Slide 47: 4M Example 27.3: FORECASTING PROFITS (Method)
Timeplot of the year-over-year percentage change.

Slide 48: 4M Example 27.3: FORECASTING PROFITS (Method)
Scatterplot of the year-over-year percentage change on its lag: indicates a positive linear association.

Slide 49: 4M Example 27.3: FORECASTING PROFITS (Mechanics)
Estimate the model.

Slide 50: 4M Example 27.3: FORECASTING PROFITS (Mechanics)
All conditions for the model are satisfied; proceed with inference. The fitted equation has R² = 74.5% with s_e = 6.99. The F-statistic shows that the model is statistically significant, and the individual t-statistics show that each slope is statistically significant.

Slide 51: 4M Example 27.3: FORECASTING PROFITS (Mechanics)
The forecast for the first quarter of 2012 is a decline of about 2.4%. However, with s_e = 6.99, the 95% prediction interval, [-17.5%, 12.1%], includes zero.

Slide 52: 4M Example 27.3: FORECASTING PROFITS (Message)
The time series regression that describes year-over-year percentage changes in gross profits at Best Buy is statistically significant and explains 75% of the historical variation. It predicts profits in the first quarter of 2012 to fall about 2.4% below profits in the first quarter of 2011; however, the model can't rule out an increase (of up to about 12%) or a substantial contraction (a drop of about 17%).

Slide 53: Best Practices
- Provide a prediction interval for your forecast.
- Find a leading indicator.
- Use lags in plots so that you can see the autocorrelation.

Slide 54: Best Practices (Continued)
- Provide a reasonable planning horizon.
- Enjoy finding dependence in the residuals of a model.
- Check plots of the residuals.

Slide 55: Pitfalls
- Don't summarize a time series with a histogram unless you're confident that the data don't have a pattern.
- Avoid polynomials with high powers.
- Don't let the high R² of a time series regression convince you that predictions from the regression will be accurate.

Slide 56: Pitfalls (Continued)
- Don't include explanatory variables that also have to be forecast.
- Don't assume that more data is better.

