
3. Analysis of asset price dynamics
3.1 Introduction
Price – a continuous function, yet sampled discretely (usually with equal spacing).
Stochastic process – has a random component, hence cannot be exactly predicted.
Sequence of random variables => time series
Basic notions from Statistics & Probability Theory:
- distribution function
- mean (expectation)
- variance / standard deviation
- (auto)correlation
- stationary process (mean & variance do not change)

Statistical concepts 1
Consider a random variable (or variate) X. The probability density function f(x) defines the probability to find X between a and b:
Pr(a ≤ X ≤ b) = ∫_a^b f(x) dx
The probability density must satisfy the normalization condition
∫_{-∞}^{∞} f(x) dx = 1
Cumulative distribution function: Pr(X ≤ b) = ∫_{-∞}^{b} f(x) dx
Obviously, Pr(X > b) = 1 – Pr(X ≤ b)

Statistical concepts 2
Two characteristics are used to describe the most probable values of a random variable: (1) the mean (or expectation) and (2) the median.
The mean of X is the average of all possible values of X, weighted with the probability density f(x):
m = E[X] = ∫ x f(x) dx
The median of X is the value M for which Pr(X > M) = Pr(X < M) = 0.5
The variance Var and the standard deviation σ are the conventional measures of deviation from the mean value of X:
Var[X] ≡ σ² = ∫ (x – m)² f(x) dx

Statistical concepts 3
Higher-order moments of a probability distribution are defined as
m_n = E[X^n] = ∫ x^n f(x) dx
According to this definition, the mean is the first moment (m ≡ m_1), and the variance can be expressed via the first two moments: σ² = m_2 – m².
Two other important parameters, skewness S and kurtosis K, are related to the third and fourth moments, respectively:
S = E[(x – m)³]/σ³, K = E[(x – m)⁴]/σ⁴
Both S and K are dimensionless. Zero skewness implies that f(x) is symmetric around its mean. Positive and negative skewness indicate a long positive tail and a long negative tail, respectively. Kurtosis characterizes the peakedness of the distribution. The kurtosis of the normal distribution equals three, so the excess kurtosis K_e = K – 3 is often used as a measure of deviation from the normal distribution.
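
These moment definitions can be sketched numerically. The following is a minimal illustration (the function name sample_moments is mine, not from the slides) that computes the sample analogues of mean, variance, skewness, and excess kurtosis; for a large normal sample, S and K_e should both be near zero, as the slide states.

```python
import numpy as np

def sample_moments(x):
    """Sample analogues of m = E[X], sigma^2 = m2 - m^2,
    S = E[(x - m)^3]/sigma^3 and excess kurtosis K - 3."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    d = x - m
    var = (d ** 2).mean()
    sigma = np.sqrt(var)
    S = (d ** 3).mean() / sigma ** 3
    K = (d ** 4).mean() / sigma ** 4
    return m, var, S, K - 3.0  # K - 3 is the excess kurtosis

rng = np.random.default_rng(0)
m, var, S, Ke = sample_moments(rng.normal(size=100_000))
# For a standard normal sample: m ~ 0, var ~ 1, S ~ 0, Ke ~ 0
```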

Statistical concepts 4
Joint distribution of two random variables X and Y:
Pr(X ≤ b, Y ≤ c) = ∫_{-∞}^{b} ∫_{-∞}^{c} h(x, y) dy dx
where h(x, y) is the joint density, which satisfies the normalization condition
∫∫ h(x, y) dx dy = 1
Two random variables are independent if their joint density function is the product of the univariate density functions: h(x, y) = f(x) g(y).
Covariance between two variates provides a measure of their simultaneous change. Consider two variates X and Y with means m_X and m_Y, respectively. Their covariance equals
Cov(x, y) = σ_XY = E[(x – m_X)(y – m_Y)] = E[xy] – m_X m_Y

Statistical concepts 5
Positive (negative) covariance between two variates implies that they tend to change simultaneously in the same (opposite) direction. Another popular measure of simultaneous change is the correlation coefficient:
Corr(x, y) = Cov(x, y)/(σ_X σ_Y); –1 ≤ Corr(x, y) ≤ 1
Autocovariance: γ(k, t) = E[(y(t) – m)(y(t – k) – m)]
Autocorrelation function (ACF): ρ(k) = γ(k)/γ(0); ρ(0) = 1; |ρ(k)| ≤ 1
Ljung-Box test of the H0 hypothesis ρ(1) = ρ(2) = … = ρ(k) = 0; reported as a p-value.
In the general case of N variates X_1, ..., X_N (N > 2), correlations among variates are described with the covariance matrix, whose elements are
Cov(x_i, x_j) = σ_ij = E[(x_i – m_i)(x_j – m_j)]
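
A minimal sketch of the sample ACF and the Ljung-Box Q statistic (function names are mine; the Q formula is the standard n(n+2) Σ ρ̂(k)²/(n – k), which under H0 is approximately chi-squared with k degrees of freedom). For white noise, all ρ̂(k), k ≥ 1, should be close to zero and Q should be unremarkable.

```python
import numpy as np

def acf(y, nlags):
    """Sample autocorrelation rho(k) = gamma(k)/gamma(0) for k = 0..nlags."""
    y = np.asarray(y, dtype=float)
    d = y - y.mean()
    n = len(y)
    gamma = np.array([np.sum(d[k:] * d[:n - k]) / n for k in range(nlags + 1)])
    return gamma / gamma[0]

def ljung_box_q(y, nlags):
    """Ljung-Box Q for H0: rho(1) = ... = rho(nlags) = 0.
    Approximately chi-squared with nlags degrees of freedom under H0."""
    n = len(y)
    rho = acf(y, nlags)[1:]
    return n * (n + 2) * np.sum(rho ** 2 / (n - np.arange(1, nlags + 1)))

rng = np.random.default_rng(1)
white = rng.normal(size=1000)
rho = acf(white, 10)          # rho[0] is exactly 1; the rest near 0
Q = ljung_box_q(white, 10)    # compare with chi2(10) quantiles
```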

Statistical concepts 6
The uniform distribution has a constant density within a given interval [a, b] and equals zero outside it:
f_U(x) = 1/(b – a) for a ≤ x ≤ b; f_U(x) = 0 for x < a or x > b
m_U = 0.5(a + b), σ²_U = (b – a)²/12, S_U = 0, K_eU = –6/5
The normal (Gaussian) distribution has the form
f_N(x) = (1/(σ√(2π))) exp[–(x – m)²/(2σ²)]
It is often denoted N(m, σ). Skewness and excess kurtosis of the normal distribution equal zero. The transform z = (x – m)/σ converts the normal distribution into the standard normal distribution
f_SN(z) = (1/√(2π)) exp[–z²/2]

Statistical concepts 7
(figure slide)

3. Analysis of asset price dynamics
3.1 Introduction (continued)
Time series analysis:
- ARMA model
- linear regression
- trends (deterministic vs stochastic)
- vector autoregressions / simultaneous equations
- cointegration

3. Analysis of asset price dynamics
3.2 Autoregressive model AR(p)
Univariate time series y(t) observed at moments t = 0, 1, …, n; y(t_k) ≡ y(k) ≡ y_k
y(t) = a1 y(t-1) + a2 y(t-2) + … + ap y(t-p) + ε(t), t > p (p is the lag order)
The random process ε(t) is called noise (shock, innovation).
White noise: E[ε(t)] = 0; E[ε²(t)] = σ²; E[ε(t) ε(s)] = 0 if t ≠ s.
Lag operator: L^p y(t) = y(t-p); A_p(L) = 1 – a1 L – a2 L² – … – ap L^p
AR(p): A_p(L) y(t) = ε(t)

3. Analysis of asset price dynamics
3.2 Autoregressive model AR(p) (continued 1)
AR(1): y(t) = a1 y(t-1) + ε(t) => y(t) = Σ_{i≥0} a1^i ε(t-i)
Mean-reverting process: shocks decay and the process returns to its mean. "Old" noise converges to zero with time when |a1| < 1.
If a1 = 1, AR(1) is the random walk (RW): y(t) = y(t-1) + ε(t) => y(t) = Σ_{i≥0} ε(t-i)
The RW is not mean-reverting.
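
The contrast can be sketched by simulating both cases from the same recursion (a minimal illustration with assumed parameter values; the function name is mine). With |a1| < 1 the variance settles at σ²/(1 – a1²); with a1 = 1 it grows without bound.

```python
import numpy as np

def simulate_ar1(a1, n, rng, y0=0.0):
    """Simulate y(t) = a1*y(t-1) + eps(t) with standard normal shocks."""
    eps = rng.normal(size=n)
    y = np.empty(n)
    prev = y0
    for t in range(n):
        prev = a1 * prev + eps[t]
        y[t] = prev
    return y

rng = np.random.default_rng(2)
mean_rev = simulate_ar1(0.5, 5000, rng)   # |a1| < 1: mean-reverting
rw = simulate_ar1(1.0, 5000, rng)         # a1 = 1: random walk

# After a burn-in, the AR(1) variance is near 1/(1 - 0.5^2) = 4/3,
# while the random-walk path keeps wandering away from its start.
var_stat = mean_rev[1000:].var()
```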

3. Analysis of asset price dynamics
3.2 Autoregressive model AR(p) (continued 2)
The 1st difference of the RW, x(t) = y(t) – y(t-1) = ε(t), is mean-reverting.
Processes that must be differenced d times in order to remove non-transitory noise shocks are called integrated of order d: I(d).
AR(p) has a unit root when shocks are not transitory. For covariance stationarity, the moduli of all solutions of the characteristic equation
1 – a1 z – a2 z² – … – ap z^p = 0
must be greater than 1 (outside the unit circle). Example:
y(t) = 0.5 y(t-1) – 0.2 y(t-2) + ε(t) => 1 – 0.5z + 0.2z² = 0
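
The root condition is easy to check numerically. A minimal sketch (helper names are mine) builds the characteristic polynomial from the AR coefficients and inspects the moduli of its roots; for the slide's example 1 – 0.5z + 0.2z² = 0 both roots have modulus √5 > 1, so the process is stationary, while a1 = 1 gives the unit root z = 1.

```python
import numpy as np

def ar_char_roots(a):
    """Roots of 1 - a1*z - ... - ap*z^p for AR coefficients a = [a1, ..., ap]."""
    # np.roots expects coefficients ordered from the highest power down.
    coeffs = [-c for c in a[::-1]] + [1.0]
    return np.roots(coeffs)

def is_stationary(a):
    """AR(p) is covariance-stationary iff every root lies outside the unit circle."""
    return bool(np.all(np.abs(ar_char_roots(a)) > 1.0))

# Slide example: y(t) = 0.5*y(t-1) - 0.2*y(t-2)  =>  1 - 0.5z + 0.2z^2 = 0
roots = ar_char_roots([0.5, -0.2])
# Random walk (a1 = 1) has the unit root z = 1 and is not stationary.
```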

3. Analysis of asset price dynamics
3.2 Autoregressive model AR(p) (continued 3)
AR(p) with non-zero mean E[y(t)] = m:
AR(1): y(t) = c + a1 y(t-1) + ε(t), c = m(1 – a1)
AR(p): A_p(L) y(t) = c + ε(t), c = m(1 – a1 – … – ap)
Autocorrelation coefficients: y(t) is covariance-stationary (or weakly stationary) if γ(k, t) = γ(k).
AR(1): ρ(1) = a1, ρ(k) = a1 ρ(k-1)
AR(2): ρ(1) = a1/(1 – a2), ρ(k) = a1 ρ(k-1) + a2 ρ(k-2), k ≥ 2

3. Analysis of asset price dynamics
3.3 Moving average model MA(q)
y(t) = ε(t) + b1 ε(t-1) + b2 ε(t-2) + … + bq ε(t-q) = B_q(L) ε(t)
B_q(L) = 1 + b1 L + b2 L² + … + bq L^q
MA(1): y(t) = ε(t) + b1 ε(t-1), ε(0) = 0
MA(1) incorporates the past like an AR(∞): (1 – b1 L + b1² L² – b1³ L³ + …) y(t) = ε(t)
MA(1): ρ(1) = b1/(1 + b1²), ρ(k > 1) = 0
MA(q) is invertible if it can be transformed into an AR(∞). For this, all solutions of 1 + b1 z + b2 z² + … + bq z^q = 0 must lie outside the unit circle. Hence MA(1) is invertible when |b1| < 1.
MA(q) with non-zero mean m: y(t) = c + B_q(L) ε(t), c = m
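
The MA(1) ACF cutoff can be verified by simulation. A minimal sketch (parameter value and names are mine): for b1 = 0.6 the theory gives ρ(1) = 0.6/1.36 ≈ 0.44 and ρ(k) = 0 for k > 1, and the sample estimates should match on a long path.

```python
import numpy as np

def simulate_ma1(b1, n, rng):
    """Simulate y(t) = eps(t) + b1*eps(t-1) with standard normal shocks."""
    eps = rng.normal(size=n + 1)
    return eps[1:] + b1 * eps[:-1]

def sample_rho(y, k):
    """Sample autocorrelation at lag k."""
    d = y - y.mean()
    return np.sum(d[k:] * d[:len(d) - k]) / np.sum(d * d)

rng = np.random.default_rng(3)
b1 = 0.6
y = simulate_ma1(b1, 200_000, rng)

rho1 = sample_rho(y, 1)   # theory: b1/(1 + b1^2) = 0.6/1.36
rho2 = sample_rho(y, 2)   # theory: 0 for any lag k > 1
```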

3. Analysis of asset price dynamics
3.4 The ARMA(p, q) model
y(t) = a1 y(t-1) + a2 y(t-2) + … + ap y(t-p) + ε(t) + b1 ε(t-1) + b2 ε(t-2) + … + bq ε(t-q)
Strict stationarity: higher moments do not depend on time.
Any MA(q) is covariance-stationary. AR(p) is covariance-stationary only if the roots of its polynomial lie outside the unit circle.

3. Analysis of asset price dynamics
3.5 Linear regression
Empirical time series: y_i = a + b x_i + ε_i, i = 1, 2, …, N
a – intercept; b – slope.
Estimator: y = A + Bx; residual: e_i = y_i – A – B x_i; RSS = Σ e_i²
OLS: minimize RSS => A = y_m – B x_m, B = Σ X_i Y_i / Σ X_i²
where x_m = (1/N) Σ x_i, y_m = (1/N) Σ y_i, X_i = x_i – x_m, Y_i = y_i – y_m

3. Analysis of asset price dynamics
3.5 Linear regression (continued)
Assumptions:
1) E[ε_i] = 0; otherwise the intercept is biased.
2) Var(ε_i) = σ² = const (homoskedasticity)
3) E[ε(t) ε(s)] = 0 if t ≠ s (no serial correlation)
4) The independent variable is deterministic.
Goodness of fit (coefficient of determination):
R² = 1 – Σ e_i² / Σ (y_i – y_m)²
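
The OLS formulas above can be sketched directly (a minimal illustration on synthetic data; the function name and the true coefficients 2.0 and 0.5 are assumptions of the example, not from the slides).

```python
import numpy as np

def ols_fit(x, y):
    """OLS for y = A + B*x: B = sum(Xi*Yi)/sum(Xi^2), A = ym - B*xm."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x.mean(), y.mean()
    X, Y = x - xm, y - ym                    # deviations from the means
    B = np.sum(X * Y) / np.sum(X * X)        # slope
    A = ym - B * xm                          # intercept
    e = y - A - B * x                        # residuals
    r2 = 1.0 - np.sum(e * e) / np.sum(Y * Y) # R^2 = 1 - RSS/TSS
    return A, B, r2

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 500)
y = 2.0 + 0.5 * x + rng.normal(scale=0.1, size=500)
A, B, r2 = ols_fit(x, y)   # estimates should recover A ~ 2, B ~ 0.5
```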

3. Analysis of asset price dynamics
3.5 Linear regression (continued)
(figure slide)

3. Analysis of asset price dynamics
3.6 Multiple regression
y_i = a + b1 x1,i + b2 x2,i + … + bK xK,i + ε_i
Additional assumption: no perfect collinearity, i.e. no regressor is a linear combination of the other regressors.
Overspecification => no bias in the estimates of b_i, but σ² is overstated.
Underspecification => biased b_i and understated σ².
Adjusted R² = 1 – (1 – R²)(N – 1)/(N – K – 1)
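
A small numeric sketch of the adjusted R² formula (the function name and sample numbers are mine): adding regressors always raises the raw R², but the (N – K – 1) penalty can still pull the adjusted value down.

```python
import numpy as np

def adjusted_r2(r2, n, k):
    """Adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - k - 1) for k regressors."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

# A slightly higher raw R^2 bought with many extra regressors
# yields a lower adjusted R^2.
r2_small = adjusted_r2(0.90, n=100, k=2)    # 1 - 0.10*99/97
r2_big = adjusted_r2(0.91, n=100, k=20)     # 1 - 0.09*99/79
```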

3. Analysis of asset price dynamics
3.7 Trends
Trends => non-stationary time series
Deterministic trend vs stochastic trend.
AR(1) around a linear trend: y(t) – m – ct = a1 [y(t-1) – m – c(t-1)] + ε(t)
z(t) = y(t) – m – ct => z(t) = a1^t z(0) + Σ_{i=0}^{t-1} a1^i ε(t-i)
If |a1| < 1, shocks are transitory. If a1 = 1, random walk with drift: y(t) = c + y(t-1) + ε(t)
For m = 0:
deterministic trend: y(t) = at + ε(t)
stochastic trend: y(t) = a + y(t-1) + ε(t)
The two may look similar for some time.
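
The distinction can be sketched by simulation (drift value and variable names are assumptions of the example): removing the fitted line makes the deterministic-trend series stationary, while the detrended random walk still wanders; differencing, by contrast, makes the stochastic-trend series stationary.

```python
import numpy as np

rng = np.random.default_rng(5)
n, a = 2000, 0.01
eps = rng.normal(size=n)
t = np.arange(1, n + 1)

det = a * t + eps               # deterministic trend: y(t) = a*t + eps(t)
stoch = a * t + np.cumsum(eps)  # random walk with drift: y(t) = a + y(t-1) + eps(t)

detrended_det = det - a * t     # = eps(t), stationary
detrended_stoch = stoch - a * t # = cumsum(eps), still a random walk
diffed_stoch = np.diff(stoch)   # = a + eps(t), stationary with mean a
```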


3. Analysis of asset price dynamics
3.8 Multivariate time series
A multivariate time series y(t) = (y1(t), y2(t), ..., yn(t))′ is a vector of n processes.
Multivariate moving average models are rarely used; therefore we focus on the vector autoregressive model (VAR).
Bivariate VAR(1) process:
y1(t) = a10 + a11 y1(t-1) + a12 y2(t-1) + ε1(t)
y2(t) = a20 + a21 y1(t-1) + a22 y2(t-1) + ε2(t)
Matrix form: y(t) = a0 + A y(t-1) + ε(t)
y(t) = (y1(t), y2(t))′, a0 = (a10, a20)′, ε(t) = (ε1(t), ε2(t))′, A = [a11 a12; a21 a22]
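
A minimal VAR(1) simulation in the matrix form above (the coefficient values are illustrative assumptions): the process is stationary when the eigenvalues of A lie inside the unit circle, and its unconditional mean solves m = a0 + A m, i.e. m = (I – A)⁻¹ a0.

```python
import numpy as np

A = np.array([[0.5, 0.1],
              [0.2, 0.3]])       # illustrative coefficients (assumed)
a0 = np.array([0.1, 0.2])

rng = np.random.default_rng(6)
n = 10_000
y = np.zeros((n, 2))
for t in range(1, n):
    y[t] = a0 + A @ y[t - 1] + rng.normal(size=2)

# Stationarity check and the implied unconditional mean.
eigmod = np.abs(np.linalg.eigvals(A))
m = np.linalg.solve(np.eye(2) - A, a0)   # m = (I - A)^{-1} a0
```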

3. Analysis of asset price dynamics
3.9 Multivariate time series (continued)
Simultaneous dynamic models
y1(t) = a11 y1(t-1) + a12 y2(t) + ε1(t)
y2(t) = a21 y1(t) + a22 y2(t-1) + ε2(t)
can be written as B y(t) = C y(t-1) + ε(t), with B = [1 –a12; –a21 1] and C = [a11 0; 0 a22], and transformed to a VAR:
y(t) = (1 – a12 a21)⁻¹ [1 a12; a21 1] C y(t-1) + (1 – a12 a21)⁻¹ [1 a12; a21 1] ε(t)
Two covariance-stationary processes x(t) and y(t) are jointly covariance-stationary if Cov(x(t), y(t – s)) depends on the lag s only.
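
The reduction to VAR form can be verified numerically (coefficient values are illustrative assumptions): since det(B) = 1 – a12·a21, the closed-form inverse (1 – a12 a21)⁻¹ [1 a12; a21 1] must match the numerical B⁻¹, and B⁻¹C gives the reduced-form VAR coefficient matrix.

```python
import numpy as np

# Structural form B y(t) = C y(t-1) + e(t) for the simultaneous model
a11, a12, a21, a22 = 0.4, 0.3, 0.2, 0.5   # illustrative values (assumed)
B = np.array([[1.0, -a12],
              [-a21, 1.0]])
C = np.diag([a11, a22])

# Reduced VAR form: y(t) = B^{-1} C y(t-1) + B^{-1} e(t)
Binv = np.linalg.inv(B)
A_var = Binv @ C

# Closed form: det(B) = 1 - a12*a21, so
# B^{-1} = (1 - a12*a21)^{-1} [[1, a12], [a21, 1]]
Binv_closed = np.array([[1.0, a12],
                        [a21, 1.0]]) / (1.0 - a12 * a21)
```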