STAT 497 LECTURE NOTES 2

THE AUTOCOVARIANCE AND THE AUTOCORRELATION FUNCTIONS For a stationary process {Y_t}, the autocovariance between Y_t and Y_{t-k} is
γ_k = Cov(Y_t, Y_{t-k}) = E[(Y_t - μ)(Y_{t-k} - μ)]
and the autocorrelation function is
ρ_k = γ_k / γ_0 = Corr(Y_t, Y_{t-k}).

THE AUTOCOVARIANCE AND THE AUTOCORRELATION FUNCTIONS PROPERTIES:
1. γ_0 = Var(Y_t) and ρ_0 = 1.
2. |γ_k| ≤ γ_0 and |ρ_k| ≤ 1.
3. γ_k = γ_{-k} and ρ_k = ρ_{-k}, i.e., both functions are symmetric about k = 0.
4. (necessary condition) γ_k and ρ_k are positive semi-definite, i.e.,
Σ_{i=1}^n Σ_{j=1}^n α_i α_j γ_{|t_i - t_j|} ≥ 0
for any set of time points t_1, t_2, …, t_n and any real numbers α_1, α_2, …, α_n (and similarly with ρ in place of γ).

THE PARTIAL AUTOCORRELATION FUNCTION (PACF) The PACF is the correlation between Y_t and Y_{t-k} after their mutual linear dependency on the intervening variables Y_{t-1}, Y_{t-2}, …, Y_{t-k+1} has been removed. This conditional correlation,
φ_kk = Corr(Y_t, Y_{t-k} | Y_{t-1}, Y_{t-2}, …, Y_{t-k+1}),
is usually referred to as the partial autocorrelation in time series.

CALCULATION OF PACF 1. REGRESSION APPROACH: Consider the regression of Y_{t+k} on its k predecessors for a zero-mean stationary process,
Y_{t+k} = φ_{k1} Y_{t+k-1} + φ_{k2} Y_{t+k-2} + … + φ_{kk} Y_t + e_{t+k},
where φ_{ki} denotes the i-th regression coefficient and e_{t+k} is a zero-mean error term uncorrelated with Y_{t+k-j}, j = 1, 2, …, k. Multiply both sides by Y_{t+k-j}

CALCULATION OF PACF and taking expectations, we obtain
γ_j = φ_{k1} γ_{j-1} + φ_{k2} γ_{j-2} + … + φ_{kk} γ_{j-k}.
Dividing both sides by γ_0,
ρ_j = φ_{k1} ρ_{j-1} + φ_{k2} ρ_{j-2} + … + φ_{kk} ρ_{j-k}.
The last coefficient, φ_kk, is the PACF at lag k.

CALCULATION OF PACF For j = 1, 2, …, k, we have the following system of equations:
ρ_1 = φ_{k1} ρ_0 + φ_{k2} ρ_1 + … + φ_{kk} ρ_{k-1}
ρ_2 = φ_{k1} ρ_1 + φ_{k2} ρ_0 + … + φ_{kk} ρ_{k-2}
⋮
ρ_k = φ_{k1} ρ_{k-1} + φ_{k2} ρ_{k-2} + … + φ_{kk} ρ_0

CALCULATION OF PACF Using Cramer's rule successively for k = 1, 2, …:
φ_11 = ρ_1,
φ_22 = (ρ_2 - ρ_1²) / (1 - ρ_1²),
and so on, solving each k×k system only for its last coefficient, φ_kk.

CALCULATION OF PACF In general, φ_kk = |R_k*| / |R_k|, where R_k is the k×k autocorrelation matrix with (i,j)-th entry ρ_{|i-j|} and R_k* is R_k with its last column replaced by (ρ_1, ρ_2, …, ρ_k)′.

CALCULATION OF PACF 2. Levinson and Durbin's Recursive Formula:
φ_kk = (ρ_k - Σ_{j=1}^{k-1} φ_{k-1,j} ρ_{k-j}) / (1 - Σ_{j=1}^{k-1} φ_{k-1,j} ρ_j),
where φ_{kj} = φ_{k-1,j} - φ_kk φ_{k-1,k-j} for j = 1, 2, …, k-1.
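
The recursion is easy to implement. Below is a minimal sketch in Python (NumPy only; the function name pacf_durbin_levinson is my own, not from any library) that computes φ_11, …, φ_mm from a given autocorrelation sequence:

    import numpy as np

    def pacf_durbin_levinson(rho, m):
        """Compute phi_11, ..., phi_mm from autocorrelations rho[0..m]
        via the Levinson-Durbin recursion (rho[0] must equal 1)."""
        phi = np.zeros((m + 1, m + 1))   # phi[k, j] holds phi_{kj}
        pacf = np.zeros(m + 1)
        phi[1, 1] = pacf[1] = rho[1]     # phi_11 = rho_1
        for k in range(2, m + 1):
            num = rho[k] - np.sum(phi[k - 1, 1:k] * rho[k - 1:0:-1])
            den = 1.0 - np.sum(phi[k - 1, 1:k] * rho[1:k])
            phi[k, k] = pacf[k] = num / den
            # update the remaining coefficients phi_{kj}, j = 1, ..., k-1
            phi[k, 1:k] = phi[k - 1, 1:k] - phi[k, k] * phi[k - 1, k - 1:0:-1]
        return pacf[1:]

    # Example: an AR(1) with phi = 0.6 has rho_k = 0.6**k, so the PACF
    # should be 0.6 at lag 1 and (numerically) zero at every later lag.
    rho = 0.6 ** np.arange(6)
    print(pacf_durbin_levinson(rho, 5))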

WHITE NOISE (WN) PROCESS A process {a_t} is called a white noise (WN) process if it is a sequence of uncorrelated random variables from a fixed distribution with constant mean E(a_t) = μ (usually assumed to be 0), constant variance Var(a_t) = σ_a², and Cov(a_t, a_{t-k}) = 0 for all k ≠ 0.

WHITE NOISE (WN) PROCESS It is a stationary process with autocovariance function
γ_k = σ_a² for k = 0 and γ_k = 0 for k ≠ 0,
so that ρ_k and φ_kk equal 1 at k = 0 and 0 elsewhere. Basic phenomenon: ACF = PACF = 0 for all k ≠ 0.

WHITE NOISE (WN) PROCESS The name comes from spectral analysis: as in white light, all frequencies (i.e., colors) are present in equal amounts. It is a memoryless process and the basic building block from which we can construct more complicated models; it plays the role that an orthogonal basis plays in general vector and function analysis.

ESTIMATION OF THE MEAN, AUTOCOVARIANCE AND AUTOCORRELATION THE SAMPLE MEAN:
Ȳ = (1/n) Σ_{t=1}^n Y_t, which satisfies E(Ȳ) = μ and
Var(Ȳ) = (γ_0/n) Σ_{k=-(n-1)}^{n-1} (1 - |k|/n) ρ_k.

ERGODICITY Kolmogorov's law of large numbers (LLN) tells us that if X_i ~ i.i.d.(μ, σ²) for i = 1, …, n, then the ensemble average satisfies
X̄_n = (1/n) Σ_{i=1}^n X_i → μ as n → ∞.
In time series we have a time-series average, not an ensemble average. Hence, the mean is computed by averaging over time. Does the time-series average converge to the same limit as the ensemble average? The answer is yes, if Y_t is stationary and ergodic.

ERGODICITY A covariance stationary process is said to be ergodic for the mean if the time-series average converges to the population mean. Similarly, if the sample average provides a consistent estimate of the second moment, then the process is said to be ergodic for the second moment.

ERGODICITY A sufficient condition for a covariance stationary process to be ergodic for the mean is that Σ_{k=0}^∞ |γ_k| < ∞. Further, if the process is Gaussian, then absolutely summable autocovariances also ensure that the process is ergodic for all moments.
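
As a quick numerical illustration (a minimal sketch; the AR(1) example and all names are my own choices), the time average of a single long realization of a stationary, ergodic AR(1) process converges to the population mean:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, phi, sigma = 5.0, 0.7, 1.0   # population mean, AR coefficient, WN std. dev.

    # Simulate one long realization of Y_t = mu + phi*(Y_{t-1} - mu) + a_t.
    n = 100_000
    y = np.empty(n)
    y[0] = mu
    for t in range(1, n):
        y[t] = mu + phi * (y[t - 1] - mu) + sigma * rng.standard_normal()

    # The time average over this single path approaches mu (ergodicity for the mean):
    for m in (100, 1_000, 10_000, 100_000):
        print(m, y[:m].mean())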

THE SAMPLE AUTOCOVARIANCE FUNCTION
γ̂_k = (1/n) Σ_{t=1}^{n-k} (Y_t - Ȳ)(Y_{t+k} - Ȳ)
or
γ̂_k = (1/(n-k)) Σ_{t=1}^{n-k} (Y_t - Ȳ)(Y_{t+k} - Ȳ).

THE SAMPLE AUTOCORRELATION FUNCTION
ρ̂_k = γ̂_k / γ̂_0 = Σ_{t=1}^{n-k} (Y_t - Ȳ)(Y_{t+k} - Ȳ) / Σ_{t=1}^{n} (Y_t - Ȳ)².
A plot of ρ̂_k versus k is called a sample correlogram. For large sample sizes, ρ̂_k is normally distributed with mean ρ_k, and its variance is approximated by Bartlett's approximation
Var(ρ̂_k) ≈ (1/n)(1 + 2ρ_1² + 2ρ_2² + … + 2ρ_m²)
for processes in which ρ_k = 0 for k > m.

THE SAMPLE AUTOCORRELATION FUNCTION In practice, the ρ_i's are unknown and are replaced by their sample estimates ρ̂_i. Hence, we have the following large-lag standard error of ρ̂_k:
s_{ρ̂_k} = sqrt( (1/n)(1 + 2ρ̂_1² + 2ρ̂_2² + … + 2ρ̂_m²) ).

THE SAMPLE AUTOCORRELATION FUNCTION For a WN process, we have Var(ρ̂_k) ≈ 1/n, so the approximate 95% confidence interval for ρ_k is 0 ± 2/n^{1/2}. Hence, to test whether the process is WN or not, draw the ±2/n^{1/2} lines on the sample correlogram. If all ρ̂_k lie inside the limits, the process could be WN (we need to check the sample PACF, too); for a WN process, the sample ACF must be close to zero at every non-zero lag.
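
A minimal sketch of this check in Python (NumPy only; the helper name sample_acf is my own, not a library function):

    import numpy as np

    def sample_acf(y, max_lag):
        """Sample autocorrelations rho_hat_1, ..., rho_hat_max_lag,
        using the 1/n form of the sample autocovariance."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        d = y - y.mean()
        gamma0 = np.sum(d * d) / n
        return np.array([np.sum(d[:n - k] * d[k:]) / n / gamma0
                         for k in range(1, max_lag + 1)])

    rng = np.random.default_rng(1)
    y = rng.standard_normal(500)          # a white noise sample
    acf = sample_acf(y, 20)
    bound = 2 / np.sqrt(len(y))           # approximate 95% limits under WN

    # Flag the lags whose sample ACF falls outside +/- 2/sqrt(n); for true WN,
    # roughly 1 lag in 20 will be flagged by chance alone.
    print(np.flatnonzero(np.abs(acf) > bound) + 1)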

THE SAMPLE PARTIAL AUTOCORRELATION FUNCTION The sample PACF φ̂_kk is obtained by replacing the ρ_i's with their sample estimates ρ̂_i in the Levinson-Durbin recursion. For a WN process, Var(φ̂_kk) ≈ 1/n, so ±2/n^{1/2} can be used as critical limits on φ̂_kk to test the hypothesis of a WN process.

BACKSHIFT (OR LAG) OPERATORS The backshift operator B is defined as
B Y_t = Y_{t-1}, and more generally B^j Y_t = Y_{t-j},
e.g., (1 - B) Y_t = Y_t - Y_{t-1}. Random Shock Process: Y_t = μ + ψ(B) a_t, where ψ(B) = 1 + ψ_1 B + ψ_2 B² + ….
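
In code, applying B is just shifting the series; a tiny NumPy illustration (my own, not from the notes):

    import numpy as np

    y = np.array([3.0, 5.0, 4.0, 6.0, 8.0])

    # B y_t = y_{t-1}: shift the series one step; the first value is undefined.
    By = np.concatenate(([np.nan], y[:-1]))

    # (1 - B) y_t = y_t - y_{t-1} is ordinary first differencing.
    print(y - By)            # [nan  2. -1.  2.  2.]
    print(np.diff(y))        # same values via np.diff: [ 2. -1.  2.  2.]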

MOVING AVERAGE REPRESENTATION OF A TIME SERIES Also known as the Random Shock Form or Wold (1938) Representation. For a stationary process {Y_t}, we can write Y_t as a linear combination of a sequence of uncorrelated (WN) r.v.s. A GENERAL LINEAR PROCESS:
Y_t = μ + a_t + ψ_1 a_{t-1} + ψ_2 a_{t-2} + … = μ + Σ_{j=0}^∞ ψ_j a_{t-j} = μ + ψ(B) a_t,
where ψ_0 = 1, {a_t} is a zero-mean WN process, and Σ_{j=0}^∞ ψ_j² < ∞.

MOVING AVERAGE REPRESENTATION OF A TIME SERIES Taking expectations, E(Y_t) = μ and Var(Y_t) = σ_a² Σ_{j=0}^∞ ψ_j².

MOVING AVERAGE REPRESENTATION OF A TIME SERIES The autocovariance and autocorrelation functions of the general linear process are
γ_k = σ_a² Σ_{j=0}^∞ ψ_j ψ_{j+k} and ρ_k = Σ_{j=0}^∞ ψ_j ψ_{j+k} / Σ_{j=0}^∞ ψ_j².

MOVING AVERAGE REPRESENTATION OF A TIME SERIES Because they involve infinite sums, these quantities exist, and the process is stationary, only if the sums converge. Hence,
Σ_{j=0}^∞ ψ_j² < ∞
is the required condition for the process to be stationary. It is a non-deterministic process: it contains no deterministic component (a component with no randomness in its future states, which could be forecast exactly from its own past).

AUTOCOVARIANCE GENERATING FUNCTION For a given sequence of autocovariances γ_k, k = 0, ±1, ±2, …, the autocovariance generating function is defined as
γ(B) = Σ_{k=-∞}^{∞} γ_k B^k,
where the variance of the process, γ_0, is the coefficient of B^0 and the autocovariance of lag k, γ_k, is the coefficient of both B^k and B^{-k}.

AUTOCOVARIANCE GENERATING FUNCTION Using Y_t = μ + Σ_{j=0}^∞ ψ_j a_{t-j} and stationarity,
γ(B) = σ_a² Σ_{k=-∞}^{∞} Σ_{j=0}^{∞} ψ_j ψ_{j+k} B^k = σ_a² ψ(B) ψ(B^{-1}),
where ψ_j = 0 for j < 0.

AUTOCORRELATION GENERATING FUNCTION Dividing by γ_0 gives the autocorrelation generating function
ρ(B) = γ(B) / γ_0 = Σ_{k=-∞}^{∞} ρ_k B^k.

EXAMPLE Write a given model in random shock form and find its autocovariance generating function.
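
As a worked illustration (assuming, for definiteness, an AR(1) model; any simple ARMA model works the same way), let Y_t = φ Y_{t-1} + a_t with |φ| < 1, i.e., (1 - φB) Y_t = a_t.
Random shock form: inverting (1 - φB),
Y_t = (1 - φB)^{-1} a_t = Σ_{j=0}^∞ φ^j a_{t-j}, so ψ_j = φ^j.
Autocovariance generating function: using γ(B) = σ_a² ψ(B) ψ(B^{-1}),
γ(B) = σ_a² / [(1 - φB)(1 - φB^{-1})],
and reading off the coefficient of B^k gives γ_k = σ_a² φ^{|k|} / (1 - φ²).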

AUTOREGRESSIVE REPRESENTATION OF A TIME SERIES This representation is also known as the INVERTED FORM. Regress the value of Y_t at time t on its own past plus a random shock:
Y_t = π_1 Y_{t-1} + π_2 Y_{t-2} + … + a_t,
i.e., π(B) Y_t = a_t, where π(B) = 1 - Σ_{j=1}^∞ π_j B^j.

AUTOREGRESSIVE REPRESENTATION OF A TIME SERIES A process that can be written in this form is called an invertible process (invertibility is important for forecasting). Not every stationary process is invertible (Box and Jenkins, 1978). Invertibility provides uniqueness of the autocorrelation function: without it, different time series models could share the same autocorrelation function.

INVERTIBILITY RULE USING THE RANDOM SHOCK FORM For a linear process Y_t = ψ(B) a_t to be invertible, the roots of ψ(B) = 0 as a function of B must lie outside the unit circle. If b is a root of ψ(B), then |b| > 1, where |b| is the absolute value of b if b is a real number, and |b| = sqrt(c² + d²) if b = c + di is a complex number.

INVERTIBILITY RULE USING THE RANDOM SHOCK FORM Such a process is stationary if it can be re-written in a RSF, i.e.,
Y_t = π(B)^{-1} a_t = ψ(B) a_t with Σ_{j=0}^∞ ψ_j² < ∞.

STATIONARITY RULE USING THE INVERTED FORM For a linear process π(B) Y_t = a_t to be stationary, the roots of π(B) = 0 as a function of B must lie outside the unit circle. If b is a root of π(B), then |b| > 1.

RANDOM SHOCK FORM AND INVERTED FORM The AR and MA representations are not the model forms, because they contain an infinite number of parameters, which are impossible to estimate from a finite number of observations.

TIME SERIES MODELS In the Inverted Form of a process, if only a finite number of π weights are non-zero, i.e.,
π_1 = φ_1, π_2 = φ_2, …, π_p = φ_p and π_k = 0 for k > p,
the process is called an AR(p) process.

TIME SERIES MODELS In the Random Shock Form of a process, if only a finite number of ψ weights are non-zero, i.e.,
ψ_1 = -θ_1, ψ_2 = -θ_2, …, ψ_q = -θ_q and ψ_k = 0 for k > q,
the process is called an MA(q) process.

TIME SERIES MODELS AR(p) Process:
Y_t = φ_1 Y_{t-1} + … + φ_p Y_{t-p} + a_t, i.e., (1 - φ_1 B - … - φ_p B^p) Y_t = a_t.
MA(q) Process:
Y_t = a_t - θ_1 a_{t-1} - … - θ_q a_{t-q}, i.e., Y_t = (1 - θ_1 B - … - θ_q B^q) a_t.

TIME SERIES MODELS The number of parameters in a model can be large. A natural alternative is the mixed AR and MA process, the ARMA(p,q) process:
(1 - φ_1 B - … - φ_p B^p) Y_t = (1 - θ_1 B - … - θ_q B^q) a_t.
For a fixed number of observations, the more parameters in a model, the less efficient the estimation of the parameters. Choose a simpler model to describe the phenomenon.
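
To see these definitions in action, here is a minimal self-contained sketch (NumPy only; the model, parameter values, and variable names are my own choices, not from the notes) that simulates an ARMA(1,1) process and prints its sample ACF and PACF, computed as in the earlier sketches:

    import numpy as np

    rng = np.random.default_rng(2)
    phi, theta, n = 0.7, 0.4, 2000

    # Simulate ARMA(1,1): Y_t = phi*Y_{t-1} + a_t - theta*a_{t-1}.
    a = rng.standard_normal(n + 1)
    y = np.zeros(n + 1)
    for t in range(1, n + 1):
        y[t] = phi * y[t - 1] + a[t] - theta * a[t - 1]
    y = y[1:]

    # Sample ACF (1/n form), then PACF via the Levinson-Durbin recursion.
    d = y - y.mean()
    acf = np.array([np.sum(d[:n - k] * d[k:]) / np.sum(d * d)
                    for k in range(0, 11)])
    pacf = [acf[1]]
    prev = np.array([acf[1]])
    for k in range(2, 11):
        num = acf[k] - prev @ acf[k - 1:0:-1]
        den = 1.0 - prev @ acf[1:k]
        pkk = num / den
        prev = np.concatenate((prev - pkk * prev[::-1], [pkk]))
        pacf.append(pkk)

    print("ACF :", np.round(acf[1:], 3))   # tails off gradually
    print("PACF:", np.round(pacf, 3))      # also tails off for ARMA(1,1)

Both the sample ACF and the sample PACF tail off here; by contrast, an AR(p) has a PACF that cuts off after lag p and an MA(q) has an ACF that cuts off after lag q, which is the basis of model identification.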