Linear Stationary Processes. ARMA models

This lecture introduces the basic linear models for stationary processes. Considering only stationary processes is very restrictive since most economic variables are non-stationary. However, stationary linear models are used as building blocks in more complicated nonlinear and/or non-stationary models.

Roadmap
1. The Wold decomposition
2. From the Wold decomposition to the ARMA representation
3. MA processes and invertibility
4. AR processes, stationarity and causality
5. ARMA, invertibility and causality

The Wold Decomposition
Wold's theorem in words: any stationary process {Z_t} can be expressed as the sum of two components:
- a stochastic component: a linear combination of lags of a white noise process;
- a deterministic component, uncorrelated with the stochastic component.

The Wold Theorem
If {Z_t} is a nondeterministic stationary time series, then

Z_t = Σ_{j=0}^∞ ψ_j a_{t−j} + V_t,

where
- ψ_0 = 1 and Σ_{j=0}^∞ ψ_j² < ∞,
- {a_t} is white noise, a_t ~ WN(0, σ²),
- {V_t} is deterministic, and
- E(a_t V_s) = 0 for all t and s.

Importance of the Wold decomposition
Any stationary process can be written as a linear combination of lagged values of a white noise process (an MA(∞) representation). This implies that if a process is stationary, we immediately know how to write a model for it. Problem: we might need to estimate a lot of parameters (in most cases, an infinite number of them!). ARMA models are an approximation to the Wold representation; the approximation is more parsimonious (fewer parameters).

Birth of the ARMA(p,q) models
Under general conditions, the infinite lag polynomial of the Wold decomposition can be approximated by the ratio of two finite-lag polynomials:

ψ(L) ≈ θ_q(L) / φ_p(L),

where φ_p(L) = 1 − φ_1 L − ... − φ_p L^p and θ_q(L) = 1 + θ_1 L + ... + θ_q L^q. Therefore

φ_p(L) Z_t = θ_q(L) a_t,

which is the ARMA(p,q) model: an AR(p) part on the left and an MA(q) part on the right.
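
To make the ratio-of-polynomials idea concrete, here is a minimal sketch in Python (assuming numpy and statsmodels are available; the coefficients 0.7 and 0.4 are illustrative, not from the slides):

import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# statsmodels expects the full lag polynomials:
# AR side phi(L) = 1 - 0.7L, MA side theta(L) = 1 + 0.4L
ar_poly = np.array([1.0, -0.7])
ma_poly = np.array([1.0, 0.4])

process = ArmaProcess(ar_poly, ma_poly)
print(process.isstationary)   # True: the root of phi(x) lies outside the unit circle
print(process.isinvertible)   # True: the root of theta(x) lies outside the unit circle

# Expanding theta(L)/phi(L) recovers the (truncated) Wold weights psi_j
print(process.arma2ma(lags=10))

z = process.generate_sample(nsample=500)   # one simulated ARMA(1,1) path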

MA processes

MA(1) process (or ARMA(0,1))
Let {a_t} be a zero-mean white noise process, a_t ~ WN(0, σ²), and define

Z_t = μ + a_t + θ a_{t−1}.

- Expectation: E(Z_t) = μ
- Variance: γ_0 = Var(Z_t) = (1 + θ²) σ²
- Autocovariance (first order): γ_1 = θ σ²

MA(1) processes (cont.)
- Autocovariance of higher order: γ_j = 0 for j > 1
- Autocorrelation: ρ_1 = θ / (1 + θ²), and ρ_j = 0 for j > 1
- Partial autocorrelation: unlike the ACF, the PACF of an MA(1) does not cut off; it tails off, decaying exponentially in magnitude.

MA(1) processes (cont.)
Stationarity: an MA(1) process is always covariance-stationary, because for any value of θ its mean and autocovariances are finite and do not depend on time:

E(Z_t) = μ and γ_0 = (1 + θ²) σ² < ∞.

MA(q)

Z_t = μ + a_t + θ_1 a_{t−1} + ... + θ_q a_{t−q}

Moments:
- E(Z_t) = μ
- γ_0 = (1 + θ_1² + ... + θ_q²) σ²
- γ_j = (θ_j + θ_1 θ_{j+1} + ... + θ_{q−j} θ_q) σ² for j = 1, ..., q, and γ_j = 0 for j > q.

An MA(q) is covariance-stationary for the same reasons as an MA(1).
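
These moments can be computed directly from the formula for γ_j. A small numpy-only sketch (the coefficients 0.4 and −0.3 are illustrative):

import numpy as np

def ma_autocovariances(theta, sigma2=1.0):
    """Autocovariances gamma_0..gamma_q of an MA(q) with coefficients theta_1..theta_q."""
    psi = np.r_[1.0, theta]              # (1, theta_1, ..., theta_q)
    q = len(theta)
    return [sigma2 * np.dot(psi[: len(psi) - j], psi[j:]) for j in range(q + 1)]

gamma = ma_autocovariances([0.4, -0.3])
rho = np.array(gamma) / gamma[0]
print(rho)   # ACF at lags 0..q; beyond lag q it is exactly zero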

MA(∞)

Z_t = μ + Σ_{j=0}^∞ ψ_j a_{t−j}

Is it covariance-stationary? The process is covariance-stationary provided that

Σ_{j=0}^∞ ψ_j² < ∞

(the MA coefficients are square-summable).

Invertibility
Definition: an MA(q) process is said to be invertible if it admits an autoregressive representation,

π(L) Z_t = a_t, with Σ_{j=0}^∞ |π_j| < ∞.

Theorem (necessary and sufficient conditions for invertibility): let {Z_t} be an MA(q), Z_t = θ(L) a_t. Then {Z_t} is invertible if and only if all roots of θ(x) = 0 lie outside the unit circle. The coefficients of the AR representation, {π_j}, are determined by the relation π(L) = θ(L)^{−1}, i.e., π(L) θ(L) = 1.
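
To illustrate the relation π(L) = θ(L)^{−1}, statsmodels can expand the AR representation of an invertible MA process (a sketch assuming statsmodels; θ = 0.5 is illustrative):

import numpy as np
from statsmodels.tsa.arima_process import arma2ar

# Invertible MA(1): Z_t = a_t + 0.5 a_{t-1};
# theta(x) = 1 + 0.5x has its root at x = -2, outside the unit circle
pi = arma2ar([1.0], [1.0, 0.5], lags=8)
print(np.round(pi, 4))   # 1, -0.5, 0.25, -0.125, ...: pi(L) = 1/(1 + 0.5L)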

Identification of the MA(1)
Consider the autocorrelation functions of these two MA(1) processes:

Z_t = a_t + θ a_{t−1} and Z_t = a_t + (1/θ) a_{t−1}.

The autocorrelation functions are

ρ_1 = θ / (1 + θ²) and ρ_1 = (1/θ) / (1 + 1/θ²) = θ / (1 + θ²).

These two processes therefore show an identical correlation pattern: the MA coefficient is not uniquely identified. In other words, any MA(1) process has two representations (one with MA parameter larger than 1 in absolute value, the other with MA parameter smaller than 1).

If we identify the MA(1) through its autocorrelation structure, we need to decide which value of θ to choose: the one greater than one in absolute value, or the one smaller. We prefer representations that are invertible, so we choose the value with |θ| < 1.
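
A quick numerical check of this identification problem (a sketch assuming statsmodels; θ = 0.5 is illustrative): the MA(1) with parameter θ and the one with parameter 1/θ produce the same theoretical ACF, but only the first is invertible.

import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

theta = 0.5
p1 = ArmaProcess(ma=[1.0, theta])        # MA parameter 0.5
p2 = ArmaProcess(ma=[1.0, 1.0 / theta])  # MA parameter 2.0

print(np.allclose(p1.acf(lags=6), p2.acf(lags=6)))  # True: identical ACFs
print(p1.isinvertible, p2.isinvertible)             # True False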

AR processes

AR(1) process

Z_t = c + φ Z_{t−1} + a_t

Stationarity: substituting recursively k times,

Z_t = c (1 + φ + φ² + ... + φ^{k−1}) + φ^k Z_{t−k} + Σ_{j=0}^{k−1} φ^j a_{t−j}.

Remember: the geometric progression 1 + φ + φ² + ... converges (to 1/(1 − φ)) if and only if |φ| < 1.

AR(1) (cont.)
Hence, an AR(1) process is stationary if |φ| < 1; letting k → ∞,

Z_t = c/(1 − φ) + Σ_{j=0}^∞ φ^j a_{t−j}.

- Mean of a stationary AR(1): μ = E(Z_t) = c/(1 − φ)
- Variance of a stationary AR(1): γ_0 = σ²/(1 − φ²)

Autocovariance of a stationary AR(1)
You need to solve a system of equations: multiplying the AR(1) equation by Z_{t−j} and taking expectations gives

γ_j = φ γ_{j−1} for j ≥ 1, hence γ_j = φ^j γ_0.

Autocorrelation of a stationary AR(1) (ACF):

ρ_j = φ^j, j = 0, 1, 2, ...

so the ACF decays geometrically.
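
A simulation check of ρ_j = φ^j (a sketch using numpy and statsmodels' sample ACF; φ = 0.8 and the sample size are illustrative):

import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(0)
phi, n = 0.8, 100_000
a = rng.standard_normal(n)

z = np.zeros(n)
for t in range(1, n):              # simulate Z_t = phi * Z_{t-1} + a_t
    z[t] = phi * z[t - 1] + a[t]

print(np.round(acf(z, nlags=5), 3))        # sample ACF
print(np.round(phi ** np.arange(6), 3))    # theoretical rho_j = phi^j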

EXERCISE
Compute the partial autocorrelation function of an AR(1) process. Compare its pattern to that of the MA(1) process.

AR(p)

φ(L) Z_t = c + a_t, with φ(L) = 1 − φ_1 L − ... − φ_p L^p.

Stationarity: all p roots of the characteristic equation φ(x) = 0 lie outside the unit circle.

ACF: a system to solve for the first p autocorrelations (p unknowns and p equations),

ρ_j = φ_1 ρ_{j−1} + ... + φ_p ρ_{j−p}, j = 1, ..., p.

The ACF decays as a mixture of exponentials and/or damped sine waves, depending on whether the roots are real or complex.

PACF: cuts off after lag p.
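
The root condition is straightforward to verify numerically. A numpy-only sketch for an illustrative AR(2):

import numpy as np

# AR(2): Z_t = 0.5 Z_{t-1} + 0.3 Z_{t-2} + a_t
# Characteristic equation: phi(x) = 1 - 0.5x - 0.3x^2 = 0
phi = [0.5, 0.3]
roots = np.roots([-phi[1], -phi[0], 1.0])   # np.roots wants highest degree first
print(roots)
print(np.all(np.abs(roots) > 1))   # True => all roots outside the unit circle => stationary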

Exercise
Compute the mean, the variance and the autocorrelation function of an AR(2) process. Describe the pattern of the PACF of an AR(2) process.

Causality and Stationarity
Consider the AR(1) process with |φ| > 1,

Z_t = φ Z_{t−1} + a_t.

Iterating forward (rather than backward) yields a stationary representation in terms of future innovations:

Z_t = −Σ_{j=1}^∞ φ^{−j} a_{t+j}.

Causality and Stationarity (II)
However, this stationary representation depends on future values of a_t. It is customary to restrict attention to AR(1) processes with |φ| < 1. Such processes are called stationary but also CAUSAL, or future-independent, AR representations.

Remark: any AR(1) process with |φ| > 1 can be rewritten as an AR(1) process with |φ| < 1 and a new white noise sequence. Thus, we can restrict our analysis (without loss of generality) to processes with |φ| < 1.

Causality (III)
Definition: an AR(p) process defined by the equation φ(L) Z_t = a_t is said to be causal, or a causal function of {a_t}, if there exists a sequence of constants {ψ_j} with Σ_{j=0}^∞ |ψ_j| < ∞ and

Z_t = Σ_{j=0}^∞ ψ_j a_{t−j}.

A necessary and sufficient condition for causality is

φ(x) ≠ 0 for all |x| ≤ 1,

i.e., all roots of φ(x) = 0 lie outside the unit circle.
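
A sketch of the causality check (assuming statsmodels; φ = 0.6 is illustrative): expand the ψ-weights of a causal AR(1) and confirm the root condition.

import numpy as np
from statsmodels.tsa.arima_process import arma2ma

# Causal AR(1) with phi = 0.6: psi_j = 0.6^j, which is absolutely summable
psi = arma2ma([1.0, -0.6], [1.0], lags=8)
print(np.round(psi, 4))        # 1, 0.6, 0.36, 0.216, ...

# Root check: phi(x) = 1 - 0.6x vanishes at x = 1/0.6 > 1, so the process is causal
print(np.roots([-0.6, 1.0]))   # [1.6667]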

Relationship between AR(p) and MA(q)
- A stationary (causal) AR(p) admits an MA(∞) representation: Z_t = φ(L)^{−1} a_t.
- An invertible MA(q) admits an AR(∞) representation: θ(L)^{−1} Z_t = a_t.

ARMA(p,q) Processes

ARMA(p,q)

φ(L) Z_t = c + θ(L) a_t

- Stationary and causal if all roots of φ(x) = 0 lie outside the unit circle.
- Invertible if all roots of θ(x) = 0 lie outside the unit circle.

ARMA(1,1)

Z_t = φ Z_{t−1} + a_t + θ a_{t−1}

Stationary if |φ| < 1; invertible if |θ| < 1.

ACF of ARMA(1,1)
Multiplying the equation by Z_{t−j} and taking expectations, you get this system of equations:

γ_0 = φ γ_1 + [1 + θ(φ + θ)] σ²
γ_1 = φ γ_0 + θ σ²
γ_j = φ γ_{j−1} for j ≥ 2.

Hence ρ_j = φ ρ_{j−1} for j ≥ 2: beyond the first lag, the ACF decays geometrically at rate φ.
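
The first two equations can be solved numerically and the result compared against statsmodels' theoretical ACF (a sketch; φ = 0.6, θ = 0.3 are illustrative):

import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

phi, theta, sigma2 = 0.6, 0.3, 1.0

# Solve for gamma_0 and gamma_1:
#   gamma_0 - phi*gamma_1 = sigma2 * (1 + theta*(phi + theta))
#  -phi*gamma_0 + gamma_1 = theta * sigma2
A = np.array([[1.0, -phi],
              [-phi, 1.0]])
b = np.array([sigma2 * (1 + theta * (phi + theta)), theta * sigma2])
gamma0, gamma1 = np.linalg.solve(A, b)

rho = [1.0, gamma1 / gamma0]
for _ in range(4):                 # rho_j = phi * rho_{j-1} for j >= 2
    rho.append(phi * rho[-1])

print(np.round(rho, 4))
print(np.round(ArmaProcess([1.0, -phi], [1.0, theta]).acf(lags=6), 4))  # should match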

ACF and PACF of an ARMA(1,1): both tail off, neither cutting off at a finite lag.

Summary
Key concepts:
- The Wold decomposition
- ARMA as an approximation to the Wold decomposition
- MA processes: moments; invertibility
- AR processes: moments; stationarity and causality
- ARMA processes: moments, invertibility, causality and stationarity