Time Series Decomposition Farideh Dehkordi-Vakil

Introduction One approach to the analysis of time series data is based on smoothing past data in order to separate the underlying pattern in the data series from randomness. The underlying pattern can then be projected into the future and used as the forecast.

Introduction The underlying pattern can also be broken down into sub-patterns to identify the component factors that influence each of the values in a series. This procedure is called decomposition. Decomposition methods usually try to identify two separate components of the basic underlying pattern that tend to characterize economic and business series: the trend-cycle and the seasonal factors.

Introduction The trend-cycle represents long-term changes in the level of the series. The seasonal factor is the periodic fluctuation of constant length that is usually caused by known factors such as rainfall, month of the year, temperature, timing of holidays, etc. The decomposition model assumes that the data have the following form: Data = Pattern + Error = f(trend-cycle, seasonality, error)

Decomposition Model The mathematical representation of the decomposition approach is Y_t = f(S_t, T_t, E_t), where Y_t is the time series value (actual data) at period t, S_t is the seasonal component (index) at period t, T_t is the trend-cycle component at period t, and E_t is the irregular (remainder) component at period t.

Decomposition Model The exact functional form depends on the decomposition model actually used. Two common approaches are the additive model, Y_t = T_t + S_t + E_t, and the multiplicative model, Y_t = T_t × S_t × E_t.
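As a brief illustration of the two forms, here is a minimal sketch using the seasonal_decompose function from statsmodels; the synthetic monthly series and the period of 12 are assumptions made for this example, not part of the original slides.

```python
# A minimal sketch of running both decomposition models with statsmodels.
# The monthly series below is synthetic; in practice you would load your own data.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly series: linear trend, a seasonal swing that grows with the level,
# and a little noise (so the multiplicative model is the more natural fit).
idx = pd.date_range("2000-01", periods=72, freq="MS")
trend = np.linspace(100, 200, 72)
seasonal = 1 + 0.2 * np.sin(2 * np.pi * idx.month.to_numpy() / 12)
y = pd.Series(trend * seasonal * np.random.uniform(0.98, 1.02, 72), index=idx)

additive = seasonal_decompose(y, model="additive", period=12)
multiplicative = seasonal_decompose(y, model="multiplicative", period=12)

# Each result exposes .trend, .seasonal and .resid components.
print(multiplicative.seasonal.head(12))
```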

Decomposition Model An additive model is appropriate if the magnitude of the seasonal fluctuation does not vary with the level of the series. Example: a time plot of U.S. retail sales of general merchandise stores for each month from Jan to May 2002.

Decomposition Model A multiplicative model is more prevalent with economic series, since most seasonal economic series have seasonal variation that increases with the level of the series. Example: a time plot of the number of DVD players sold each month from April 1997 to June 2002.

Decomposition Model Transformations can be used to model additively when the original data are not additive. We can fit a multiplicative relationship by fitting an additive relationship to the logarithm of the data, since if Y_t = S_t × T_t × E_t, then log Y_t = log S_t + log T_t + log E_t.
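The effect of the log transformation can be checked numerically. The short sketch below uses made-up trend, seasonal, and error components; none of the values come from the slides.

```python
# A small sketch (hypothetical data) showing that taking logs turns a
# multiplicative structure into an additive one.
import numpy as np

trend = np.linspace(10.0, 20.0, 24)                 # T_t
seasonal = np.tile([0.8, 1.0, 1.3, 0.9], 6)         # S_t, repeating every 4 periods
error = np.random.uniform(0.95, 1.05, 24)           # E_t
y = seasonal * trend * error                        # multiplicative series

log_y = np.log(y)
# log_y now equals log(S_t) + log(T_t) + log(E_t), an additive series,
# so an additive decomposition (or additive smoother) can be applied to it.
```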

Seasonal Adjustment A useful by-product of decomposition is that it provides an easy way to calculate seasonally adjusted data. For an additive decomposition, the seasonally adjusted data are computed by subtracting the seasonal component: Y_t − S_t.

Seasonal Adjustment For a multiplicative decomposition, the seasonally adjusted data are computed by dividing the original observation by the seasonal component: Y_t / S_t. Most published economic series are seasonally adjusted, because seasonal variation is usually not of primary interest.
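Both adjustments are one-line operations once the seasonal component is in hand. The sketch below uses a hypothetical quarterly series and hypothetical seasonal components purely for illustration.

```python
# A minimal, self-contained sketch of seasonal adjustment (hypothetical data).
import numpy as np

y = np.array([108.0, 120.0, 155.0, 112.0, 116.0, 129.0, 167.0, 121.0])  # quarterly data
s_add = np.array([-10.0, 2.0, 35.0, -27.0] * 2)    # additive seasonal component
s_mult = np.array([0.92, 1.02, 1.28, 0.78] * 2)    # multiplicative seasonal indexes

adjusted_additive = y - s_add         # additive model: subtract the seasonal component
adjusted_multiplicative = y / s_mult  # multiplicative model: divide by the seasonal index
```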

Deseasonalizing the data The process of deseasonalizing the data has useful results: The deseasonalized data allow us to see the underlying pattern in the data more clearly. It provides us with measures of the extent of seasonality in the form of seasonal indexes. It provides us with a tool for projecting what one quarter's (or month's) observation may portend for the entire year.

Deseasonalizing the data For example, assume you are working for a manufacturer of major household appliances and have just heard the housing-starts figure for the first quarter. Since your sales depend heavily on new construction, you want to project this forward for the year. We know that housing starts show strong seasonal components, so to make a more accurate projection you need to take this into consideration. Suppose that the seasonal index for the first quarter of housing starts is 0.797.

Deseasonalizing the data and Finding Seasonal Indexes Once the seasonal indexes are known, you can deseasonalize data by dividing by the appropriate index, that is: Deseasonalized data = Raw data / Seasonal index. Multiplying this deseasonalized value by 4 then gives a projection for the entire year.
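A worked version of this calculation is sketched below; the first-quarter housing-starts figure of 330 (thousand units) is a made-up placeholder, while the 0.797 index is the one quoted above.

```python
# Worked sketch of the housing-starts projection. The raw first-quarter figure
# is a hypothetical placeholder; the 0.797 index comes from the text.
raw_q1 = 330.0                     # hypothetical first-quarter housing starts
seasonal_index_q1 = 0.797          # seasonal index for the first quarter

deseasonalized_q1 = raw_q1 / seasonal_index_q1   # remove the seasonal effect
annual_projection = deseasonalized_q1 * 4        # project the deseasonalized rate to a full year

print(f"Deseasonalized Q1: {deseasonalized_q1:.1f}")
print(f"Projected annual total: {annual_projection:.1f}")
```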

Deseasonalizing the data and Finding Seasonal Indexes In general: Seasonal adjustment allows reliable comparison of values at different points in time. It is easier to understand the relationships among economic or business variables once the complicating factor of seasonality has been removed from the data. Seasonal adjustment may be a useful element in the production of short-term forecasts of future values of a time series.

Trend-Cycle Estimation The trend-cycle can be estimated by smoothing the series to reduce the random variation. There is a range of smoothers available. We will look at moving averages (simple moving average, centered moving average, double moving average) and local regression smoothing.

Simple Moving Average The idea behind moving averages is that observations which are nearby in time are also likely to be close in value. The average of the points near an observation therefore provides a reasonable estimate of the trend-cycle at that observation. The average eliminates some of the randomness in the data and leaves a smooth trend-cycle component.

Simple Moving Average The first question is how many data points to include in each average. A moving average of order 3, or MA(3), uses averages of three points; a moving average of order 5, or MA(5), uses averages of five points. The term moving average is used because each average is computed by dropping the oldest observation and including the next observation.

Simple Moving Average Simple moving averages can be defined for any odd order. A moving average of order k, or MA(k), where k is an odd integer, is defined as the average consisting of an observation and the m = (k-1)/2 points on either side. For example, for MA(3) the trend-cycle estimate at time t is T_t = (Y_{t-1} + Y_t + Y_{t+1}) / 3.

Simple Moving Average What is the formula for the MA(5) smoother?

Simple Moving Average The number of points included in a moving average affects the smoothness of the resulting estimate. As a rule, the larger the value of k, the smoother the resulting trend-cycle estimate will be. Determining the appropriate length of a moving average is an important task in decomposition methods.
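A minimal sketch of an MA(k) smoother for odd k follows; the function name and the short weekly-sales list are illustrative choices, not taken from the slides.

```python
# A simple moving average of odd order k. Endpoints where fewer than
# m = (k-1)//2 neighbours exist on one side are left as NaN.
import numpy as np

def simple_moving_average(y, k):
    y = np.asarray(y, dtype=float)
    m = (k - 1) // 2
    smoothed = np.full(len(y), np.nan)
    for t in range(m, len(y) - m):
        smoothed[t] = y[t - m:t + m + 1].mean()
    return smoothed

sales = [5.3, 4.4, 5.4, 5.8, 5.6, 4.8, 5.6, 5.6, 5.4, 6.5]  # hypothetical weekly sales
print(simple_moving_average(sales, 3))   # MA(3)
print(simple_moving_average(sales, 5))   # MA(5)
```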

Example: Weekly Department Store Sales The weekly sales figures (in millions of dollars) presented in the following table are used by a major department store to determine the need for temporary sales personnel.

Example: Weekly Department Store Sales

Calculation of MA(3) and MA(5) smoothers for the weekly department store sales: in applying a k-term moving average, m = (k-1)/2 neighboring points are needed on either side of the observation. Therefore it is not possible to estimate the trend-cycle close to the beginning and end of the series. To overcome this problem, a shorter-length moving average can be used.

Example: Weekly Department Store Sales

Centered Moving Average The simple moving average required an odd number of observations to be included in each average; this was to ensure that the average was centered at the middle of the data values being averaged. What about a moving average with an even number of observations, for example MA(4)?

Centered Moving Average To calculate an MA(4) for the weekly sales data, the trend-cycle at time 3 can be calculated as either (Y_1 + Y_2 + Y_3 + Y_4)/4 or (Y_2 + Y_3 + Y_4 + Y_5)/4. The center of the first moving average is at 2.5 (half a period early) and the center of the second is at 3.5 (half a period late); however, the average of the two moving averages is centered at 3.

Centered Moving Average A centered moving average can therefore be expressed as a single but weighted moving average, where the weights for each period are unequal: T_3 = (1/8)Y_1 + (1/4)Y_2 + (1/4)Y_3 + (1/4)Y_4 + (1/8)Y_5.

Centered Moving Average The first and the last terms in this average have weights of 1/8 and all the other terms have weights of 1/4. Therefore a 2×MA(4) smoother is equivalent to a weighted moving average of order 5. In general, a 2×MA(k) smoother is equivalent to a weighted moving average of order k+1, with weights 1/k for all observations except the first and the last observation in the average, which have weights 1/(2k).
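The weighted form is straightforward to implement directly; the sketch below hard-codes the 2×MA(4) weights and uses a made-up quarterly series for illustration.

```python
# A 2xMA(4) centered moving average, implemented as the equivalent
# weighted moving average of order 5 with weights 1/8, 1/4, 1/4, 1/4, 1/8.
import numpy as np

def centered_ma_2x4(y):
    y = np.asarray(y, dtype=float)
    weights = np.array([1/8, 1/4, 1/4, 1/4, 1/8])
    smoothed = np.full(len(y), np.nan)
    for t in range(2, len(y) - 2):
        smoothed[t] = np.dot(weights, y[t - 2:t + 3])
    return smoothed

quarterly = [23, 40, 25, 27, 32, 48, 33, 37, 37, 50]   # hypothetical quarterly data
print(centered_ma_2x4(quarterly))
```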

Least squares estimates The general procedure for estimating the pattern of a relationship is to fit some functional form in such a way as to minimize the error component of the equation data = pattern + error. The name least squares comes from the fact that this estimation procedure seeks to minimize the sum of the squared errors in the above equation.

Least squares estimates A major consideration in forecasting is to identify and fit the most appropriate pattern (functional form) so as to minimize the MSE. A possible functional form is a straight line. Recall that a straight line is represented by the equation Y = a + bX, where the two parameters a and b represent the intercept and the slope respectively.

Least squares estimates The values of a and b can be chosen by minimizing the MSE. This procedure is known as simple linear regression and will be examined in detail in Chapter 5. One way to estimate the trend-cycle is to extend the idea of moving averages to moving lines: instead of taking the average of the points, we may fit a straight line to these points and estimate the trend-cycle that way.

Least squares estimates A straight trend line can be represented by the equation T_t = a + b t. The values of a and b can be found by minimizing the sum of squared errors, where the errors are the differences between the data values of the time series and the corresponding trend line values; that is, we minimize the sum over t of (Y_t − a − b t)^2. A straight trend line is sometimes appropriate, but there are many time series where some curved trend is better.
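A minimal sketch of such a fit follows; np.polyfit solves this least-squares problem directly, and the twelve data values are illustrative only.

```python
# Fitting the straight trend line T_t = a + b*t by least squares (illustrative data).
import numpy as np

y = np.array([112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 150, 160], dtype=float)
t = np.arange(1, len(y) + 1)

b, a = np.polyfit(t, y, deg=1)       # slope b and intercept a minimizing the squared errors
trend = a + b * t                    # fitted trend line values
sse = np.sum((y - trend) ** 2)       # the quantity that least squares minimizes

print(f"intercept a = {a:.2f}, slope b = {b:.2f}, SSE = {sse:.1f}")
```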

Least squares estimates Local regression is a way of fitting a much more flexible trend-cycle curve to the data. Instead of fitting a straight line to the entire dataset, a series of straight lines is fitted to overlapping sections of the data.
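One widely used form of local regression is LOWESS. The sketch below applies the lowess smoother from statsmodels to a synthetic series; the frac value of 0.3 is an arbitrary choice made here for illustration.

```python
# A sketch of local regression using statsmodels' LOWESS smoother (synthetic data).
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

t = np.arange(60, dtype=float)
y = 0.05 * (t - 30) ** 2 + np.random.normal(scale=5.0, size=60)   # curved trend plus noise

# frac controls how much of the data each local straight-line fit uses;
# a smaller frac follows the data more closely, a larger frac gives a smoother curve.
fitted = lowess(y, t, frac=0.3, return_sorted=False)
print(fitted[:5])
```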

Classical Decomposition Additive Decomposition We assume that the time series is additive. A classical decomposition can be carried out using the following steps. Step 1: The trend-cycle is computed using a centered moving average of order k. Step 2: The detrended series is computed by subtracting the trend-cycle component from the data: Y_t − T_t.

Classical Decomposition Additive Decomposition Step 3: In classical decomposition we assume the seasonal component is constant from year to year, so the average of the detrended values for a given month (for monthly data) or a given quarter (for quarterly data) becomes the seasonal index S_t for the corresponding month or quarter. Step 4: The irregular series E_t is computed by simply subtracting the estimated seasonality and trend-cycle from the original data: E_t = Y_t − T_t − S_t.
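The four steps can be put together in a few lines of code. The following is a compact sketch for quarterly data under the additive assumption; the series values, the function name, and the choice of a 2×MA(4) for the trend-cycle are illustrative assumptions rather than part of the slides.

```python
# A compact sketch of the four classical additive decomposition steps for quarterly data.
import numpy as np

def classical_additive_decomposition(y, period=4):
    y = np.asarray(y, dtype=float)
    n = len(y)

    # Step 1: trend-cycle via a 2xMA(period) centered moving average.
    weights = np.ones(period + 1) / period
    weights[0] = weights[-1] = 1 / (2 * period)
    half = period // 2
    trend = np.full(n, np.nan)
    for t in range(half, n - half):
        trend[t] = np.dot(weights, y[t - half:t + half + 1])

    # Step 2: detrend by subtracting the trend-cycle (additive model).
    detrended = y - trend

    # Step 3: seasonal index = average detrended value for each season,
    # centered so the indexes sum to zero over one full period.
    seasonal_means = np.array([np.nanmean(detrended[s::period]) for s in range(period)])
    seasonal_means -= seasonal_means.mean()
    seasonal = np.tile(seasonal_means, n // period + 1)[:n]

    # Step 4: irregular component = data minus trend-cycle minus seasonality.
    irregular = y - trend - seasonal
    return trend, seasonal, irregular

y = [362, 385, 432, 341, 382, 409, 498, 387, 473, 513, 582, 474]  # hypothetical quarterly sales
trend, seasonal, irregular = classical_additive_decomposition(y, period=4)
```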