ARMA models
Gloria González-Rivera, University of California, Riverside
Jesús Gonzalo, U. Carlos III de Madrid

White Noise
A sequence {a_t} of uncorrelated random variables with mean zero and constant variance is called a white noise process:
E[a_t] = 0, Var(a_t) = σ², Cov(a_t, a_s) = 0 for s ≠ t.
[Figure: the autocorrelation function of white noise is zero at every lag k = 1, 2, 3, 4, ...]
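As a quick numerical illustration (a minimal sketch using numpy; sample_acf is an ad hoc helper, not a library routine), simulated white noise has sample autocorrelations near zero at every lag:

```python
import numpy as np

def sample_acf(z, nlags):
    """Sample autocorrelations rho_hat(1), ..., rho_hat(nlags) of a series z."""
    z = z - z.mean()
    gamma0 = np.dot(z, z) / len(z)
    return np.array([np.dot(z[:-k], z[k:]) / len(z) / gamma0
                     for k in range(1, nlags + 1)])

rng = np.random.default_rng(0)
a = rng.standard_normal(10_000)   # Gaussian white noise a_t
print(sample_acf(a, 5))           # all close to 0, of order 1/sqrt(T)
```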

The Wold Decomposition
If {Z_t} is a nondeterministic stationary time series, then it can be written as
Z_t = Σ_{j=0}^∞ ψ_j a_{t-j} + V_t,
where ψ_0 = 1 and Σ_{j=0}^∞ ψ_j² < ∞, {a_t} is white noise, a_t is the linear projection error of Z_t on its own past, and {V_t} is deterministic and uncorrelated with {a_t}.

Some Remarks on the Wold Decomposition

What the Wold theorem does not say:
- The a_t need not be normally distributed, and hence need not be iid.
- Though the linear projection P[a_t | Z_{t-j}, j ≥ 1] = 0, it need not be true that E[a_t | Z_{t-j}, j ≥ 1] = 0 (think about the possible consequences).
- The shocks a_t need not be the "true" shocks to the system. When will this happen?
- The uniqueness result only states that the Wold representation is the unique linear representation where the shocks are linear forecast errors. Non-linear representations, or representations in terms of non-forecast-error shocks, are perfectly possible.

Birth of the ARMA models
Under general conditions the infinite lag polynomial of the Wold decomposition can be approximated by the ratio of two finite lag polynomials:
ψ(L) ≈ θ_q(L) / φ_p(L).
Therefore φ_p(L) Z_t = θ_q(L) a_t, i.e.
Z_t = φ_1 Z_{t-1} + ... + φ_p Z_{t-p} + a_t + θ_1 a_{t-1} + ... + θ_q a_{t-q},
with AR(p) polynomial φ_p(L) = 1 - φ_1 L - ... - φ_p L^p and MA(q) polynomial θ_q(L) = 1 + θ_1 L + ... + θ_q L^q.

MA(1) processes
Let {a_t} be a zero-mean white noise process with variance σ², and let
Z_t = μ + a_t + θ a_{t-1}.
Expectation: E[Z_t] = μ.
Variance: γ_0 = Var(Z_t) = (1 + θ²) σ².
Autocovariance: γ_1 = Cov(Z_t, Z_{t-1}) = θ σ².

MA(1) processes (cont)
Autocovariance of higher order: γ_j = 0 for j ≥ 2.
Autocorrelation: ρ_1 = θ / (1 + θ²); ρ_j = 0 for j ≥ 2.
The MA(1) process is covariance-stationary because its mean and autocovariances do not depend on t.
The MA(1) process is ergodic because Σ_j |γ_j| < ∞.
If {a_t} were Gaussian, then {Z_t} would be ergodic for all moments.
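These moments are easy to verify by simulation (a minimal sketch, assuming only numpy; the parameter values are arbitrary):

```python
import numpy as np

theta, sigma, T = 0.6, 1.0, 200_000
rng = np.random.default_rng(1)
a = sigma * rng.standard_normal(T + 1)
z = a[1:] + theta * a[:-1]        # Z_t = a_t + theta * a_{t-1} (mu = 0)

zc = z - z.mean()
gamma = lambda k: np.dot(zc[: len(zc) - k], zc[k:]) / len(zc)
print(gamma(0), (1 + theta**2) * sigma**2)            # gamma_0 vs theory
print(gamma(1) / gamma(0), theta / (1 + theta**2))    # rho_1 vs theory
print(gamma(2) / gamma(0))                            # ~ 0 for j >= 2
```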

Identification of the MA(1): the processes with parameters θ and 1/θ share the same autocorrelation function, since ρ_1(1/θ) = (1/θ)/(1 + 1/θ²) = θ/(1 + θ²) = ρ_1(θ).
[Figure: plot of ρ_1 = θ/(1 + θ²) against θ; it lies between -0.5 and 0.5, reaching ±0.5 at θ = ±1.]
Because both processes share the same autocorrelation function, the MA(1) is not uniquely identifiable, except for θ = ±1.
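A two-line numerical check of this identification problem (a sketch; the θ values are arbitrary):

```python
rho1 = lambda th: th / (1 + th**2)

for th in (0.5, 2.0, 0.25, 4.0):
    print(th, rho1(th))   # rho1(0.5) == rho1(2.0); rho1(0.25) == rho1(4.0)

print(rho1(1.0), rho1(-1.0))  # the extremes 0.5 and -0.5, reached at theta = +/-1
```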

Invertibility
Definition: An MA(q) process defined by the equation Z_t = θ_q(L) a_t is said to be invertible if there exists a sequence of constants {π_j} with Σ_{j=0}^∞ |π_j| < ∞ and a_t = Σ_{j=0}^∞ π_j Z_{t-j}.
Theorem: Let {Z_t} be an MA(q). Then {Z_t} is invertible if and only if all roots of θ_q(z) = 0 lie outside the unit circle. The coefficients {π_j} are determined by the relation π(L) = θ_q(L)^{-1}, i.e. π(L) θ_q(L) = 1.

Identification of the MA(1)
If we identify the MA(1) through the autocorrelation structure, we need to decide which value of θ to choose, the one greater than one or the one less than one. Requiring the condition of invertibility (think about why) we will choose the value |θ| < 1. Another reason to choose the value less than one can be found by paying attention to the error variances of the two "equivalent" representations: the representation with parameter θ has innovation variance σ², while the one with parameter 1/θ has innovation variance θ²σ².

MA(q)
Z_t = μ + a_t + θ_1 a_{t-1} + ... + θ_q a_{t-q}
Moments: E[Z_t] = μ; γ_j = σ² (θ_j + θ_1 θ_{j+1} + ... + θ_{q-j} θ_q) for j = 1, ..., q (with θ_0 = 1), and γ_j = 0 for j > q.
The MA(q) is covariance-stationary and ergodic for the same reasons as the MA(1).

MA(∞)
Z_t = μ + Σ_{j=0}^∞ ψ_j a_{t-j} (note the change of notation from θ to ψ)
Is it covariance-stationary? The process is covariance-stationary provided that Σ_{j=0}^∞ ψ_j² < ∞ (a square-summable sequence).

Some interesting results
Proposition 1. Σ_{j=0}^∞ |ψ_j| < ∞ (absolutely summable) implies Σ_{j=0}^∞ ψ_j² < ∞ (square summable).
Proposition 2. If {ψ_j} is absolutely summable, then Σ_{j=0}^∞ |γ_j| < ∞ and the process is ergodic for the mean.

Proof 1.
Absolute summability implies there is an N < ∞ such that |ψ_j| < 1 for all j ≥ N. Then
Σ_{j=0}^∞ ψ_j² = Σ_{j=0}^{N-1} ψ_j² + Σ_{j=N}^∞ ψ_j².
(1) The first sum is finite because N is finite.
(2) The second sum satisfies Σ_{j=N}^∞ ψ_j² ≤ Σ_{j=N}^∞ |ψ_j|; it is finite because {ψ_j} is absolutely summable. Then Σ_{j=0}^∞ ψ_j² < ∞.

Proof 2.
|γ_j| = σ² |Σ_{k=0}^∞ ψ_k ψ_{k+j}| ≤ σ² Σ_{k=0}^∞ |ψ_k| |ψ_{k+j}|, so
Σ_{j=0}^∞ |γ_j| ≤ σ² (Σ_{k=0}^∞ |ψ_k|)² < ∞,
and absolute summability of the autocovariances implies ergodicity for the mean.

AR(1)
Z_t = c + φ Z_{t-1} + a_t. Using backward substitution:
Z_t = c(1 + φ + φ² + ...) + a_t + φ a_{t-1} + φ² a_{t-2} + ... (a geometric progression)
Remember: |φ| < 1 is the condition for stationarity and ergodicity.

AR(1) (cont)
Hence, this AR(1) process has a stationary solution if |φ| < 1. Alternatively, consider the solution of the characteristic equation 1 - φz = 0: the root is z = 1/φ, and |φ| < 1 means the root of the characteristic equation lies outside the unit circle.
Mean of a stationary AR(1): μ = E[Z_t] = c / (1 - φ).
Variance of a stationary AR(1): γ_0 = σ² / (1 - φ²).

Autocovariance of a stationary AR(1): rewrite the process in deviations from the mean, Z_t - μ = φ(Z_{t-1} - μ) + a_t; multiplying by (Z_{t-j} - μ) and taking expectations gives γ_j = φ γ_{j-1}, so γ_j = φ^j γ_0.
Autocorrelation of a stationary AR(1): ACF ρ_j = φ^j, a geometric decay.
PACF (from the Yule-Walker equations): φ_11 = φ and φ_jj = 0 for j ≥ 2.
[Slide note: make a graph of the autocorrelations of an AR(1).]
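The geometric ACF decay and the single PACF spike can be inspected without any simulation (a sketch assuming statsmodels is available; note that ArmaProcess takes the polynomial coefficients, so ar = [1, -φ]):

```python
from statsmodels.tsa.arima_process import ArmaProcess

phi = 0.8
ar1 = ArmaProcess(ar=[1, -phi], ma=[1])   # (1 - phi L) Z_t = a_t
print(ar1.acf(lags=6))    # 1, phi, phi^2, ...: geometric decay
print(ar1.pacf(lags=6))   # spike at lag 1 (= phi), then zeros
```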

Causality and Stationarity
Definition: An AR(p) process defined by the equation φ_p(L) Z_t = a_t is said to be causal, or a causal function of {a_t}, if there exists a sequence of constants {ψ_j} with Σ_{j=0}^∞ |ψ_j| < ∞ and Z_t = Σ_{j=0}^∞ ψ_j a_{t-j}.
Causality is equivalent to the condition φ_p(z) ≠ 0 for all |z| ≤ 1.
Definition: A stationary solution {Z_t} of the equation φ_p(L) Z_t = a_t exists (and is also the unique stationary solution) if and only if φ_p(z) ≠ 0 for all |z| = 1.
From now on we will be dealing only with causal AR models.

AR(2)
Z_t = φ_1 Z_{t-1} + φ_2 Z_{t-2} + a_t
Stationarity: study the roots of the characteristic equation 1 - φ_1 z - φ_2 z² = 0. (a) Multiply by -1: φ_2 z² + φ_1 z - 1 = 0. (b) Divide by z² and set λ = 1/z: the inverse roots solve λ² - φ_1 λ - φ_2 = 0.

For a stationary causal solution it is required that the roots of the characteristic equation lie outside the unit circle, i.e. the inverse roots satisfy |λ_1| < 1 and |λ_2| < 1.
Necessary conditions for a stationary causal solution: φ_1 + φ_2 < 1, φ_2 - φ_1 < 1, |φ_2| < 1.
Roots can be real or complex: (1) real roots if φ_1² + 4φ_2 ≥ 0; (2) complex roots if φ_1² + 4φ_2 < 0.

[Figure: the stationarity region in the (φ_1, φ_2) plane is a triangle with φ_1 ∈ (-2, 2) and φ_2 ∈ (-1, 1); pairs above the parabola φ_2 = -φ_1²/4 give real roots, pairs below it give complex roots.]
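Numerically, stationarity of an AR(2) can be checked by computing the roots of the characteristic equation with numpy (a sketch; the coefficient pairs are arbitrary examples):

```python
import numpy as np

def ar2_roots(phi1, phi2):
    """Roots of 1 - phi1*z - phi2*z^2 = 0 (np.roots wants the highest power first)."""
    z = np.roots([-phi2, -phi1, 1])
    return z, bool(np.all(np.abs(z) > 1))   # stationary iff both roots outside unit circle

print(ar2_roots(0.5, 0.3))    # real roots, stationary
print(ar2_roots(1.0, -0.5))   # complex roots, stationary (damped sine ACF)
print(ar2_roots(0.7, 0.5))    # phi1 + phi2 > 1: not stationary
```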

Mean of AR(2): with an intercept c, μ = c / (1 - φ_1 - φ_2).
Variance and autocorrelations of AR(2): from the Yule-Walker equations,
ρ_1 = φ_1 / (1 - φ_2), ρ_2 = φ_1 ρ_1 + φ_2, and
γ_0 = (1 - φ_2) σ² / [(1 + φ_2)((1 - φ_2)² - φ_1²)].

The autocorrelations satisfy the difference equation ρ_j = φ_1 ρ_{j-1} + φ_2 ρ_{j-2} for j ≥ 2, so the ACF takes different shapes according to the roots, real or complex: damped exponentials for real roots, damped sine waves for complex roots.
Partial autocorrelations (from the Yule-Walker equations): φ_11 = ρ_1, φ_22 = (ρ_2 - ρ_1²)/(1 - ρ_1²), and φ_jj = 0 for j ≥ 3.
[Slide notes: show correlograms of AR(2); ask the students to prove the PACF.]

AR(p)
Z_t = c + φ_1 Z_{t-1} + ... + φ_p Z_{t-p} + a_t
Stationarity: all p roots of the characteristic equation lie outside the unit circle.
ACF: the Yule-Walker equations ρ_j = φ_1 ρ_{j-1} + ... + φ_p ρ_{j-p}, for j = 1, ..., p, form a system to solve for the first p autocorrelations: p unknowns and p equations. The ACF decays as a mixture of exponentials and/or damped sine waves, depending on whether the roots are real or complex.
PACF: φ_jj = 0 for j > p; the PACF cuts off after lag p.
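The Yule-Walker system can be set up and solved mechanically for any p (a minimal sketch with numpy; first_p_autocorrs is an ad hoc helper):

```python
import numpy as np

def first_p_autocorrs(phi):
    """Solve rho_j = sum_k phi_k * rho_{|j-k|}, j = 1..p, for rho_1..rho_p
    (rho_0 = 1, so those terms move to the right-hand side)."""
    p = len(phi)
    A, b = np.eye(p), np.zeros(p)
    for j in range(1, p + 1):
        for k in range(1, p + 1):
            lag = abs(j - k)
            if lag == 0:
                b[j - 1] += phi[k - 1]
            else:
                A[j - 1, lag - 1] -= phi[k - 1]
    return np.linalg.solve(A, b)

phi1, phi2 = 0.5, 0.3
rho = first_p_autocorrs([phi1, phi2])
print(rho)                                       # rho_1, rho_2
print(phi1 / (1 - phi2), phi1 * rho[0] + phi2)   # AR(2) closed forms agree
```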

Relationship between AR(p) and MA(q)
A stationary AR(p) can be written as an MA(∞): φ_p(L) Z_t = a_t implies Z_t = φ_p(L)^{-1} a_t = ψ(L) a_t.
Example: for the AR(1), Z_t = (1 - φL)^{-1} a_t = Σ_{j=0}^∞ φ^j a_{t-j}.

An invertible MA(q) can be written as an AR(∞): Z_t = θ_q(L) a_t implies θ_q(L)^{-1} Z_t = π(L) Z_t = a_t.
[Slide notes: write an example, e.g. an MA(2), and proceed as in the previous example; ask the students to calculate the π_j from an MA(2).]

ARMA(p,q)
φ_p(L) Z_t = θ_q(L) a_t, i.e.
Z_t = φ_1 Z_{t-1} + ... + φ_p Z_{t-p} + a_t + θ_1 a_{t-1} + ... + θ_q a_{t-q}.
The process is stationary if all roots of φ_p(z) = 0 lie outside the unit circle, and invertible if all roots of θ_q(z) = 0 lie outside the unit circle.
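A short simulation sketch (assuming statsmodels; note its sign convention, ar = [1, -φ_1, ..., -φ_p] and ma = [1, θ_1, ..., θ_q], i.e. the two lag polynomials):

```python
from statsmodels.tsa.arima_process import ArmaProcess

proc = ArmaProcess(ar=[1, -0.7], ma=[1, 0.4])   # ARMA(1,1) with phi = 0.7, theta = 0.4
print(proc.isstationary, proc.isinvertible)      # True True: roots outside unit circle

z = proc.generate_sample(nsample=500)            # one simulated path
print(z[:5])
print(proc.acf(lags=5))                          # theoretical ACF of the process
```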

Autocorrelations of ARMA(p,q)
Multiplying by Z_{t-j} and taking expectations: for j > q the MA terms drop out and γ_j = φ_1 γ_{j-1} + ... + φ_p γ_{j-p}, so beyond lag q the ACF decays like that of an AR(p), as a mixture of exponentials and/or damped sine waves.
PACF: tails off as well; neither the ACF nor the PACF of an ARMA cuts off at a finite lag.
[Slide note: picture the autocorrelograms of an ARMA.]

ARMA(1,1)
(1 - φL) Z_t = (1 + θL) a_t, i.e. Z_t = φ Z_{t-1} + a_t + θ a_{t-1};
stationary if |φ| < 1, invertible if |θ| < 1.

ACF of ARMA(1,1)
Multiplying Z_t = φ Z_{t-1} + a_t + θ a_{t-1} by Z_{t-j} and taking expectations:
γ_0 = σ² (1 + θ² + 2φθ) / (1 - φ²),
γ_1 = σ² (1 + φθ)(φ + θ) / (1 - φ²),
γ_j = φ γ_{j-1} for j ≥ 2.
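These closed forms can be checked against statsmodels' theoretical autocovariance routine (a sketch assuming arma_acovf from statsmodels.tsa.arima_process; the parameter values are arbitrary):

```python
from statsmodels.tsa.arima_process import arma_acovf

phi, theta, sigma2 = 0.7, 0.4, 1.0
gamma = arma_acovf([1, -phi], [1, theta], nobs=4, sigma2=sigma2)

g0 = sigma2 * (1 + theta**2 + 2 * phi * theta) / (1 - phi**2)
g1 = sigma2 * (1 + phi * theta) * (phi + theta) / (1 - phi**2)
print(gamma[0], g0)           # gamma_0: match
print(gamma[1], g1)           # gamma_1: match
print(gamma[2], phi * g1)     # gamma_2 = phi * gamma_1
```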

ACF: ρ_1 = (1 + φθ)(φ + θ) / (1 + θ² + 2φθ) and ρ_j = φ ρ_{j-1} for j ≥ 2, a geometric decay from lag 1 onwards.
PACF: tails off, dominated by the damped decay coming from the MA component.

ACF and PACF of an ARMA(1,1)

ACF and PACF of an MA(2)

ACF and PACF of an AR(2)

Problems
P1: Determine which of the following ARMA processes are causal and which of them are invertible (in each case {a_t} denotes white noise):
P2: Show that the two MA(1) processes, Z_t = a_t + θ a_{t-1} with Var(a_t) = σ² and Z_t = ã_t + (1/θ) ã_{t-1} with Var(ã_t) = θ²σ², have the same autocovariance functions.

Problems (cont)
P3: Let {Z_t} denote the unique stationary solution of the autoregressive equations Z_t = φ Z_{t-1} + a_t, where |φ| > 1 and {a_t} is white noise with variance σ². Then Z_t is given by the expression Z_t = -Σ_{j=1}^∞ φ^{-j} a_{t+j}. Define the new sequence W_t = Z_t - (1/φ) Z_{t-1} and show that it is white noise with variance σ²/φ². These calculations show that {Z_t} is the (unique stationary) solution of the causal AR equations Z_t = (1/φ) Z_{t-1} + W_t.

Problems (cont)
P4: Let {Y_t} be the AR(1)-plus-noise time series defined by Y_t = Z_t + W_t, where {Z_t} is a stationary AR(1) process, Z_t = φ Z_{t-1} + a_t with |φ| < 1, {W_t} is white noise with variance σ_W², and E[W_s a_t] = 0 for all s and t. Show that {Y_t} is stationary and find its autocovariance function. Show that the time series U_t = Y_t - φ Y_{t-1} is an MA(1). Conclude from the previous point that {Y_t} is an ARMA(1,1) and express the three parameters of this model in terms of φ, σ_a², and σ_W².

Appendix: Lag Operator L
Definition: L Z_t = Z_{t-1}, and more generally L^j Z_t = Z_{t-j}.
Properties: the operator is linear, L(aZ_t + bY_t) = a Z_{t-1} + b Y_{t-1}; powers compose, L^i L^j Z_t = Z_{t-i-j}; applied to a constant, Lc = c.
Examples: (1 - φL) Z_t = Z_t - φ Z_{t-1}, so the AR(1) can be written (1 - φL) Z_t = a_t; the MA(1) is Z_t = (1 + θL) a_t.

Appendix: Inverse Operator
Definition: (1 - φL)^{-1} = lim_{j→∞} (1 + φL + φ²L² + ... + φ^j L^j), provided |φ| < 1.
Note that if |φ| ≥ 1 this definition does not hold, because the limit does not exist.
Example: applying (1 - φL)^{-1} to the AR(1), (1 - φL) Z_t = a_t gives Z_t = Σ_{j=0}^∞ φ^j a_{t-j}.

Appendix: Inverse Operator (cont)
Suppose you have the ARMA model φ(L) Z_t = θ(L) a_t and want to find the MA representation Z_t = ψ(L) a_t. You could try to crank out φ(L)^{-1} θ(L) directly, but that's not much fun. Instead you could find ψ(L) from θ(L) = φ(L) ψ(L), matching terms in L^j to make sure this works.
Example: Suppose φ(L) = 1 - φL and θ(L) = 1 + θL. Multiplying both polynomials and matching powers of L:
ψ_0 = 1, ψ_1 = φ + θ, ψ_j = φ ψ_{j-1} for j ≥ 2,
which you can easily solve recursively for the ψ_j. TRY IT!!!
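The recursion is two lines of code, and statsmodels' arma2ma does the same matching (a sketch; same sign convention as before, ar = [1, -φ], ma = [1, θ]):

```python
import numpy as np
from statsmodels.tsa.arima_process import arma2ma

phi, theta, n = 0.7, 0.4, 8

# psi_0 = 1, psi_1 = phi + theta, psi_j = phi * psi_{j-1} for j >= 2
psi = np.empty(n)
psi[0], psi[1] = 1.0, phi + theta
for j in range(2, n):
    psi[j] = phi * psi[j - 1]

print(psi)
print(arma2ma([1, -phi], [1, theta], lags=n))   # identical weights
```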

Appendix: Factoring Lag Polynomials
Suppose we need to invert the polynomial 1 - φ_1 L - φ_2 L². We can do that by factoring it:
1 - φ_1 L - φ_2 L² = (1 - λ_1 L)(1 - λ_2 L), with λ_1 + λ_2 = φ_1 and λ_1 λ_2 = -φ_2.
Now we need to invert each factor and multiply:
(1 - λ_1 L)^{-1} (1 - λ_2 L)^{-1} = (Σ_{j=0}^∞ λ_1^j L^j)(Σ_{j=0}^∞ λ_2^j L^j) = Σ_{j=0}^∞ (Σ_{k=0}^j λ_1^k λ_2^{j-k}) L^j.
Check the last expression!!!!

Appendix: Partial Fraction Tricks
There is a prettier way to express the last inversion by using partial fractions. Find the constants a and b such that
1 / [(1 - λ_1 L)(1 - λ_2 L)] = a / (1 - λ_1 L) + b / (1 - λ_2 L).
The numerator on the right-hand side must be 1, so a(1 - λ_2 L) + b(1 - λ_1 L) = 1, which gives a + b = 1 and a λ_2 + b λ_1 = 0, hence a = λ_1/(λ_1 - λ_2) and b = λ_2/(λ_2 - λ_1). Then the coefficient of L^j in the inverse is a λ_1^j + b λ_2^j.
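Both expressions for the inverse can be verified numerically (a sketch with numpy; the λ values are arbitrary with |λ_i| < 1):

```python
import numpy as np

lam1, lam2 = 0.8, 0.5

# Factoring check: (1 - lam1*L)(1 - lam2*L) = 1 - phi1*L - phi2*L^2
print(np.polymul([1, -lam1], [1, -lam2]))   # [1, -(lam1+lam2), lam1*lam2]

# Coefficient of L^j in the inverse, computed two ways
a = lam1 / (lam1 - lam2)
b = lam2 / (lam2 - lam1)
for j in range(4):
    direct = sum(lam1**k * lam2**(j - k) for k in range(j + 1))
    partial = a * lam1**j + b * lam2**j
    print(j, direct, partial)               # identical
```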

Appendix: More on Invertibility
Consider an MA(1): Z_t = (1 + θL) a_t.
Definition: An MA process is said to be invertible if it can be written as an AR(∞).
For an MA(1) to be invertible we require |θ| < 1, so that a_t = (1 + θL)^{-1} Z_t = Σ_{j=0}^∞ (-θ)^j Z_{t-j}.
For an MA(q) to be invertible, all roots of the characteristic equation should lie outside the unit circle.
MA processes have an invertible and a non-invertible representation:
- Invertible representation: the optimal forecast depends on past information.
- Non-invertible representation: the forecast depends on the future!!!
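The invertible representation can be checked by recovering the shock from past observations only (a minimal numpy sketch; the series length and truncation lag are arbitrary):

```python
import numpy as np

theta, T, J = 0.5, 2_000, 60
rng = np.random.default_rng(2)
a = rng.standard_normal(T + 1)
z = a[1:] + theta * a[:-1]        # invertible MA(1): |theta| < 1

pi = (-theta) ** np.arange(J)     # AR(inf) weights: a_t = sum_j (-theta)^j Z_{t-j}
t = T - 1                         # reconstruct the shock at the last date
a_hat = np.dot(pi, z[t - np.arange(J)])
print(a_hat, a[t + 1])            # agree up to truncation error ~ theta^J
```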