CH2 Time series

Agenda
- Overview
- Autocorrelation
- Time Series Model- White Noise
- Time Series Model- MA
- Time Series Model- AR
- Time Series Model- ARMA

Overview
Definition: A time series is a sequence of values measured over time.
Deterministic time series: derived from a fixed deterministic formula.
Probabilistic or stochastic time series: obtained by drawing samples from a probability distribution.
We focus on probabilistic (stochastic) time series in this chapter.

Autocorrelation
Is there a relationship between the value now and the value observed one (or more) time steps in the past? The strength of the (linear) relationship is reflected in the correlation value. The answers to these questions, across the entire range of time lags, are summarized by the autocorrelation function.

Autocorrelation
The autocorrelation function is the plot of the correlation between values in the time series against the time interval (lag) between them:
X-axis: the length of the time lag between the current value and the value in the past.
Y-axis: for a time lag $\tau$ (the point $x = \tau$), the correlation between values in the series $\tau$ time units apart.
We estimate the correlation at lag $\tau$ with the sample autocorrelation $\hat{\rho}_\tau$:
$$\hat{\rho}_\tau = \frac{\frac{1}{T}\sum_{t=\tau+1}^{T} (y_t - \bar{y})(y_{t-\tau} - \bar{y})}{\frac{1}{T}\sum_{t=1}^{T} (y_t - \bar{y})^2}$$
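As a concrete illustration (our own sketch, not part of the slides), the estimator above can be computed directly with NumPy; the function name sample_acf and the seed are arbitrary choices:

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelation rho_hat(tau) for tau = 0, ..., max_lag."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    yc = y - y.mean()                      # y_t - y_bar
    denom = np.sum(yc ** 2) / T            # (1/T) * sum (y_t - y_bar)^2
    acf = np.empty(max_lag + 1)
    for tau in range(max_lag + 1):
        # (1/T) * sum_{t = tau+1}^{T} (y_t - y_bar)(y_{t-tau} - y_bar)
        num = np.sum(yc[tau:] * yc[:T - tau]) / T
        acf[tau] = num / denom
    return acf

# Quick check on pure noise: rho_hat(0) = 1, other lags near zero.
rng = np.random.default_rng(0)
print(np.round(sample_acf(rng.standard_normal(5000), 5), 3))
```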

Autocorrelation
Correlogram: The plot of the estimated correlations against the time lags forms an estimate of the autocorrelation function, called the correlogram. It serves as a proxy for the true autocorrelation function of the time series.

Time Series Model- White Noise
$$y_t = \varepsilon_t$$
The series is constructed by drawing a value from a normal distribution at each time instant; the parameters of the normal distribution are fixed and do not change with time. This is the most widely used version of white noise in practice and is referred to as Gaussian white noise.

Time Series Model- White Noise
White noise series: [figure: plot of a simulated white noise series]
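A minimal sketch (ours, not from the slides) that reproduces a figure like this one, assuming a standard normal distribution and an arbitrary seed:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)              # seed chosen arbitrarily
T = 500
y = rng.normal(loc=0.0, scale=1.0, size=T)  # y_t = eps_t, fixed mean and variance

plt.plot(y)
plt.xlabel("t")
plt.ylabel("y_t")
plt.title("Gaussian white noise")
plt.show()
```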

Time Series Model- White Noise
Analysis: [figure: correlogram of the white noise series]

Time Series Model- White Noise
Analysis: At lag zero the correlation is unity; that is, every sample is perfectly correlated with itself:
$$\tau = 0 \Rightarrow \hat{\rho}_0 = \frac{\frac{1}{T}\sum_{t=1}^{T} (y_t - \bar{y})(y_t - \bar{y})}{\frac{1}{T}\sum_{t=1}^{T} (y_t - \bar{y})^2} = 1$$
Since $y_t = \varepsilon_t$ and the $\varepsilon_t$ are i.i.d. normal draws, $\mathrm{corr}(\varepsilon_i, \varepsilon_j) = 0$ for $i \neq j$, so the correlation between values at all nonzero lags is zero.

Time Series Model- White Noise
Question: Does knowledge of the past realizations help in predicting the time series value at the next time instant?
Answer: Since the values are uncorrelated, the past does not improve the point prediction itself; it only helps us estimate the variance of the normal distribution. With that estimate we can arrive at some intelligent conclusions about the odds of the next realization being greater than or less than some value (e.g., via Chebyshev's inequality).
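A hedged illustration of this point: assuming we use the past only to estimate the standard deviation, Chebyshev's inequality bounds the tail odds with no distributional assumption, while trusting the normal model sharpens them considerably. The parameter values below are arbitrary:

```python
import math
import numpy as np

rng = np.random.default_rng(1)
past = rng.normal(0.0, 2.0, size=1000)  # observed history of the white noise
sigma_hat = past.std(ddof=1)            # std estimated from past realizations

k = 2.0
# Chebyshev: P(|y| >= k*sigma) <= 1/k^2, for any distribution with that sigma.
print("Chebyshev bound:", 1 / k**2)                         # 0.25
# Under the normal model the true tail probability is much smaller:
print("Normal tail probability:", math.erfc(k / math.sqrt(2)))  # ~0.0455
print("Estimated sigma:", round(sigma_hat, 2))
```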

Time Series Model- White Noise
Summary: The variance of the value at each point in the series is the variance of the normal distribution used to draw the white noise values. This distribution, with its specific mean and variance, is time invariant. Thus a white noise series is a sequence of uncorrelated random variables with constant mean and variance.

Time Series Model- MA
$$y_t = \varepsilon_t + \beta \varepsilon_{t-1}, \text{ called MA(1)}$$
MA stands for moving average. If $\beta = 0$, $y_t$ is white noise.
In the correlogram there is a steep drop in the value after $\tau = 1$. Why?

Time Series Model- MA
Why? If $\tau = 1$, the two values share one common white noise realization in their terms: between $y_t$ and $y_{t+1}$ the common realization is $\varepsilon_t$, between $y_{t+1}$ and $y_{t+2}$ it is $\varepsilon_{t+1}$, and so on, so we expect some correlation between them.
If $\tau \geq 2$, the values share no common white noise realization: $y_t$ and $y_{t+2}$ are built from independent normal draws and are therefore uncorrelated (correlation = 0).
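A simulation sketch (ours) of this cutoff; the closed-form value $\rho_1 = \beta/(1+\beta^2)$ is the standard MA(1) result, and the series length and $\beta$ below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
T, beta = 100_000, 0.7
eps = rng.standard_normal(T + 1)
y = eps[1:] + beta * eps[:-1]            # y_t = eps_t + beta * eps_{t-1}

yc = y - y.mean()
acf = [np.sum(yc[tau:] * yc[:T - tau]) / np.sum(yc ** 2) for tau in range(4)]
print("sample ACF :", np.round(acf, 3))                # ~ [1, 0.47, 0, 0]
print("theory rho1:", round(beta / (1 + beta**2), 3))  # 0.47 for MA(1)
```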

Time Series Model- MA
Does knowledge of the past realizations help in predicting the next time series value? At time step t we know the white noise realization at time step t − 1. Conditional on that, $y_t = \varepsilon_t + \beta \varepsilon_{t-1}$ is normally distributed with mean $y_t^{pred} = \beta \varepsilon_{t-1}$ and variance $\mathrm{Var}(\varepsilon_t)$. These values hold only under the condition that we know the past realizations of the series, so they are the conditional mean and conditional variance of the time series.

Time Series Model- MA
Summary: The series is called a moving average (MA) series because it is constructed using a linear combination (moving average) of white noise realizations. It generalizes easily to a series whose value is constructed from q lagged white noise realizations:
$$y_t = \varepsilon_t + \beta_1 \varepsilon_{t-1} + \dots + \beta_q \varepsilon_{t-q}, \text{ called MA(q)}$$
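One way (among several) to generate an MA(q) series is to convolve the white noise with the coefficient vector $[1, \beta_1, \dots, \beta_q]$; this is our own sketch, and the coefficients below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
betas = np.array([1.0, 0.5, 0.3, 0.2])     # [1, beta_1, beta_2, beta_3]: MA(3)
q = len(betas) - 1
eps = rng.standard_normal(50_000 + q)
y = np.convolve(eps, betas, mode="valid")  # y_t = eps_t + sum_i beta_i * eps_{t-i}

yc = y - y.mean()
T = len(yc)
acf = [np.sum(yc[k:] * yc[:T - k]) / np.sum(yc ** 2) for k in range(q + 3)]
print(np.round(acf, 3))  # nonzero through lag q = 3, ~0 beyond: cutoff after lag q
```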

Time Series Model- AR
AR stands for autoregressive process. The series is constructed using a linear combination of infinitely many past values of the white noise realization (with $|\alpha| < 1$ so that the series converges):
$$y_t = \varepsilon_t + \alpha \varepsilon_{t-1} + \alpha^2 \varepsilon_{t-2} + \dots, \qquad y_{t-1} = \varepsilon_{t-1} + \alpha \varepsilon_{t-2} + \alpha^2 \varepsilon_{t-3} + \dots$$
$$\therefore\ y_t = \alpha y_{t-1} + \varepsilon_t$$
Since the next value in the series is obtained by multiplying the past value by the slope of the regression, it is called an autoregressive (AR) series.

Time Series Model- AR
The correlation values fall off gradually with increasing lag.

Time Series Model- AR
Why? Every time step contains additive terms comprising all the previous white noise realizations. There will always be white noise realizations common to two values of the time series, however far apart they may be. Knowledge of the past values of the time series is therefore helpful in predicting the next value.
$$y_t = \varepsilon_t + \alpha_1 y_{t-1} + \alpha_2 y_{t-2} + \dots + \alpha_p y_{t-p} \text{ is AR(p).}$$
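A sketch (ours) of this gradual fall-off: for AR(1), the theoretical autocorrelation at lag $\tau$ is $\alpha^\tau$, and the sample ACF of a simulated series tracks it closely. The parameters below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
T, alpha = 100_000, 0.8
eps = rng.standard_normal(T)
y = np.empty(T)
y[0] = eps[0]
for t in range(1, T):
    y[t] = alpha * y[t - 1] + eps[t]     # y_t = alpha * y_{t-1} + eps_t

yc = y - y.mean()
for tau in range(5):
    rho = np.sum(yc[tau:] * yc[:T - tau]) / np.sum(yc ** 2)
    print(tau, round(rho, 3), round(alpha ** tau, 3))  # sample ACF vs alpha^tau
```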

Time Series Model- ARMA (Just the General ARMA Process)
The AR(p) and MA(q) models can be mixed to form an ARMA(p, q) model:
$$y_t = [\alpha_1 y_{t-1} + \alpha_2 y_{t-2} + \dots + \alpha_p y_{t-p}] + [\varepsilon_t + \beta_1 \varepsilon_{t-1} + \dots + \beta_q \varepsilon_{t-q}]$$
The preceding models are all constructed using linear combinations of past values of the white noise series. The sum of two independent ARMA series is also ARMA.
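A minimal recursion sketch (ours) for simulating an ARMA(p, q) series; the orders and coefficients below are arbitrary, and statsmodels' arma_generate_sample offers a ready-made alternative:

```python
import numpy as np

rng = np.random.default_rng(5)
T, alphas, betas = 10_000, [0.5, 0.2], [0.4]  # ARMA(2, 1), values arbitrary
p, q = len(alphas), len(betas)
eps = rng.standard_normal(T)
y = np.zeros(T)
for t in range(max(p, q), T):
    # AR part: alpha_1*y_{t-1} + ... + alpha_p*y_{t-p}
    ar_part = sum(alphas[i] * y[t - 1 - i] for i in range(p))
    # MA part: eps_t + beta_1*eps_{t-1} + ... + beta_q*eps_{t-q}
    ma_part = eps[t] + sum(betas[j] * eps[t - 1 - j] for j in range(q))
    y[t] = ar_part + ma_part
print(np.round(y[:5], 3))
```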

The Random Walk Process
A random walk is an AR(1) series with $\alpha = 1$. From the definition of an AR series given above, with the series started at time 1, the value of the time series at time t is:
$$y_t = \varepsilon_t + \varepsilon_{t-1} + \varepsilon_{t-2} + \dots = \varepsilon_t + y_{t-1}$$

The Random Walk Process
Properties of the random walk:
$$\mathrm{Var}(y_t) = \mathrm{Var}(\varepsilon_t) + \mathrm{Var}(\varepsilon_{t-1}) + \dots + \mathrm{Var}(\varepsilon_1) = t \cdot \mathrm{Var}(\varepsilon)$$
The variance depends on the time instant: it increases linearly with time t, and the standard deviation increases with $\sqrt{t}$. Statistical parameters such as the unconditional mean and variance are thus not time invariant (not stationary), and the series is therefore called a non-stationary time series.
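A quick numerical check (ours) of this linear variance growth: simulate many independent random-walk paths and look at the cross-sectional variance at several times. With $\mathrm{Var}(\varepsilon) = 1$ it should be close to t; the path count and horizon below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(6)
n_paths, T = 20_000, 100
# Each row is one random walk: y_t = y_{t-1} + eps_t, with Var(eps) = 1.
paths = rng.standard_normal((n_paths, T)).cumsum(axis=1)

for t in (10, 50, 100):
    print(t, round(paths[:, t - 1].var(), 2))  # cross-sectional Var(y_t) ~ t
```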

The Random Walk Process
The correlation between a value and its immediate lagging value is 1. The prediction for the next time step is then a value with mean equal to the current value, $y_t^{pred} = y_{t-1}$, and variance equal to that of the white noise realizations. Series in which the expected value at the next time step equals the value at the current time step are known as martingales.
Note: $E(y_{t+k} \mid F_t) = y_t \Rightarrow y_t$ is a martingale process.