10 Further Time Series OLS Issues Chapter 10 covered OLS properties for finite (small) sample time series data -If our Chapter 10 assumptions fail, we need to derive large-sample OLS properties for time series data -for example, if strict exogeneity (TS.3) fails -Unfortunately, large-sample analysis is more complicated for time series, since observations may be correlated over time -but there are still important cases where OLS remains valid

11. Further Issues in Using OLS with Time Series Data 11.1 Stationary and Weakly Dependent Time Series 11.2 Asymptotic Properties of OLS 11.3 Using Highly Persistent Time Series in Regression Analysis 11.4 Dynamically Complete Models and the Absence of Serial Correlation 11.5 The Homoskedasticity Assumption for Time Series Models

11.1 Key Time Series Concepts To derive OLS properties in large time series samples, we need to understand two concepts: 1) Stationary Process -the distribution of the x's is constant over time -a weaker form is the Covariance Stationary Process -covariances between x's can differ with the distance between them, but not with time 2) Weakly Dependent Time Series -variables lose their connection as the time separating them grows

11.1 Stationary Stochastic Process The stochastic process $\{x_t : t = 1, 2, \dots\}$ is stationary if for every collection of time indices $1 \le t_1 < t_2 < \dots < t_m$, the joint distribution of $(x_{t_1}, x_{t_2}, \dots, x_{t_m})$ is the same as the joint distribution of $(x_{t_1+h}, x_{t_2+h}, \dots, x_{t_m+h})$ for all integers $h \ge 1$

11.1 Stationary Stochastic Process The above definition has two implications: 1) The sequence $\{x_t : t = 1, 2, \dots\}$ is identically distributed -$x_1$ has the same distribution as any $x_t$ 2) Any joint distribution (e.g., the joint distribution of $[x_1, x_2]$) remains the same over time (i.e., it equals the joint distribution of $[x_t, x_{t+1}]$) Basically, any collection of the random variables has the same joint distribution now as in the future

11.1 Stationary Stochastic Process It is hard to prove whether a given series was generated by a stationary process -although certain sequences are obviously not stationary -Often a weaker form of stationarity suffices -Some texts even call this weaker form stationarity:

11.1 Covariance Stationary Process The stochastic process $\{x_t : t = 1, 2, \dots\}$ with a finite second moment $[E(x_t^2) < \infty]$ is covariance stationary if: i) $E(x_t)$ is constant ii) $\text{Var}(x_t)$ is constant iii) For any $t, h \ge 1$, $\text{Cov}(x_t, x_{t+h})$ depends only on $h$ and not on $t$
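
To make the definition concrete, the sketch below (a minimal illustration, not from the slides; the MA(1) process $x_t = e_t + 0.5 e_{t-1}$ and all parameter values are assumptions chosen for the demo) simulates many replications of a covariance stationary process and checks that the mean, variance, and lag-1 autocovariance come out roughly the same at different dates t:

```python
import numpy as np

rng = np.random.default_rng(0)
reps, T, alpha = 100_000, 50, 0.5

# Simulate `reps` independent paths of the MA(1) process x_t = e_t + alpha*e_{t-1}
e = rng.normal(0.0, 1.0, size=(reps, T + 1))
x = e[:, 1:] + alpha * e[:, :-1]                 # shape (reps, T)

for t in (5, 20, 40):                            # three different dates
    mean_t = x[:, t].mean()                      # estimate of E(x_t)
    var_t = x[:, t].var()                        # estimate of Var(x_t)
    cov_t = np.cov(x[:, t], x[:, t + 1])[0, 1]   # estimate of Cov(x_t, x_{t+1})
    print(f"t={t:2d}  mean={mean_t:+.3f}  var={var_t:.3f}  lag-1 cov={cov_t:.3f}")
# Theory: mean 0, variance 1 + alpha^2 = 1.25, lag-1 covariance alpha = 0.5 at every t
```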

11.1 Covariance Stationary Process Essentially, the covariance between two variables can depend only on the distance between them -Likewise, the correlation between two variables can depend only on the distance between them -This does, however, allow different correlations between variables at different distances -Note that stationarity, often called “strict stationarity”, is stronger than and implies covariance stationarity

11.1 Stationarity Importance Stationarity is important for two reasons: 1) It simplifies the law of large numbers and the central limit theorem -it makes it easier for statisticians to prove the theorems that economists rely on 2) If the relationship between variables (y and x) is allowed to vary each period, we cannot accurately estimate that relationship -Note that we already assume some form of stationarity by assuming that $\beta_j$ does not change over time

11.1 Weakly Dependent Time Series The stochastic process $\{x_t : t = 1, 2, \dots\}$ is said to be WEAKLY DEPENDENT if $x_t$ and $x_{t+h}$ are “almost independent” as $h$ increases without bound If, in a covariance stationary sequence, $\text{Corr}(x_t, x_{t+h}) \to 0$ as $h \to \infty$, the sequence is ASYMPTOTICALLY UNCORRELATED

11.1 Weakly Dependent Time Series -weak dependence cannot be given one formal definition since the appropriate notion varies across applications -essentially, the correlation between $x_t$ and $x_{t+h}$ must go to zero “sufficiently quickly” as the distance between them grows -in large-sample arguments, weak dependence replaces the random sampling assumption in the law of large numbers and the central limit theorem
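
To see this replacement at work, here is a minimal sketch (the values $\rho = 0.7$ and $\sigma_e = 1$ are assumptions, not from the slides) showing that the sample average of a stable AR(1) series, despite its serial dependence, still converges to the population mean of zero as T grows:

```python
import numpy as np

rng = np.random.default_rng(5)
rho = 0.7

def ar1_mean(T):
    """Sample average of one simulated stable AR(1) path of length T."""
    e = rng.normal(size=T)
    y = np.empty(T)
    y[0] = e[0]
    for t in range(1, T):
        y[t] = rho * y[t - 1] + e[t]
    return y.mean()

for T in (100, 10_000, 1_000_000):
    print(f"T={T:9d}  sample mean={ar1_mean(T):+.4f}")   # approaches E(y_t) = 0
```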

11.1 MA(1) -an independent sequence is trivially weakly dependent -a more interesting example is: $x_t = e_t + \alpha_1 e_{t-1}, \quad t = 1, 2, \dots$ -where $e_t$ is an iid (independent and identically distributed) sequence with mean zero and variance $\sigma_e^2$ -the process $\{x_t\}$ is called a MOVING AVERAGE PROCESS OF ORDER ONE [MA(1)] -$x_t$ is a weighted average of $e_t$ and $e_{t-1}$

11.1 MA(1) -an MA(1) process is weakly dependent because: 1) Adjacent terms are correlated -e.g., both $x_t$ and $x_{t+1}$ depend on $e_t$ -note that: $\text{Corr}(x_t, x_{t+1}) = \alpha_1/(1 + \alpha_1^2)$ 2) Variables two or more periods apart are uncorrelated -$x_t$ and $x_{t+2}$ share no common $e$ terms
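
As a quick check (a minimal sketch, not from the slides; the value $\alpha_1 = 0.5$ is an arbitrary assumption), simulating one long MA(1) path shows the lag-1 sample correlation near $\alpha_1/(1+\alpha_1^2) = 0.4$ and the lag-2 correlation near zero:

```python
import numpy as np

rng = np.random.default_rng(1)
T, alpha1 = 1_000_000, 0.5

# Simulate one long path of x_t = e_t + alpha1 * e_{t-1}
e = rng.normal(size=T + 1)
x = e[1:] + alpha1 * e[:-1]

lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]   # should approach alpha1/(1+alpha1^2) = 0.4
lag2 = np.corrcoef(x[:-2], x[2:])[0, 1]   # should approach 0
print(f"lag-1 corr: {lag1:.4f}, lag-2 corr: {lag2:.4f}")
```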

11.1 AR(1) -another key example is: $y_t = \rho_1 y_{t-1} + e_t, \quad t = 1, 2, \dots$ -where $e_t$ is an iid (independent and identically distributed) sequence with mean zero and variance $\sigma_e^2$ -we also assume $e_t$ is independent of $y_0$ and $E(y_0) = 0$ -the process $\{y_t\}$ is called an AUTOREGRESSIVE PROCESS OF ORDER ONE [AR(1)] -this period's $y_t$ is a fraction of last period's value plus a fresh shock $e_t$

11.1 AR(1) -The critical assumption for weak dependence of an AR(1) process is the stability condition: $|\rho_1| < 1$ -if this condition holds, we have a STABLE AR(1) PROCESS -we can show that this process is stationary and weakly dependent -in particular, $\text{Corr}(y_t, y_{t+h}) = \rho_1^h \to 0$ as $h \to \infty$
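
The geometric decay of the correlations can be seen directly by simulation (a sketch under the assumed values $\rho_1 = 0.8$, $\sigma_e = 1$):

```python
import numpy as np

rng = np.random.default_rng(2)
T, rho = 100_000, 0.8

# Simulate a stable AR(1): y_t = rho * y_{t-1} + e_t, starting from y_0 = 0
e = rng.normal(size=T)
y = np.empty(T)
y[0] = e[0]
for t in range(1, T):
    y[t] = rho * y[t - 1] + e[t]

y = y[1_000:]                                  # drop a burn-in so the series is ~stationary
for h in (1, 2, 5, 10):
    corr = np.corrcoef(y[:-h], y[h:])[0, 1]
    print(f"h={h:2d}  sample corr={corr:.3f}  theory rho^h={rho**h:.3f}")
```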

11.1 Stationarity Notes -Although a trending series is nonstationary, it CAN be weakly dependent -If a series is stationary ABOUT ITS TIME TREND, and is also weakly dependent, it is often called a TREND-STATIONARY PROCESS -if time trends are included in these models, OLS can be performed as in Chapter 10
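
As an illustration (a sketch; the trend slope 0.05 and AR coefficient 0.7 are assumed values), generating a trend-stationary series and including a time trend in the regression recovers the trend coefficient, just as the Chapter 10 approach suggests:

```python
import numpy as np

rng = np.random.default_rng(6)
T, rho, trend_slope = 2_000, 0.7, 0.05

# Trend-stationary series: linear time trend plus a stable (weakly dependent) AR(1)
e = rng.normal(size=T)
u = np.empty(T)
u[0] = e[0]
for t in range(1, T):
    u[t] = rho * u[t - 1] + e[t]
tt = np.arange(T)
y = trend_slope * tt + u

# OLS of y on an intercept and the time trend
X = np.column_stack([np.ones(T), tt])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"estimated trend slope: {beta_hat[1]:.4f} (true {trend_slope})")
```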

11.2 Asymptotic OLS Properties -We’ve already seen cases where our CLM assumptions are not satisfied for time series -In these cases, stationarity and weak dependence lead to modified assumptions that still allow us to use OLS -in general, the modified assumptions restrict single time periods instead of all periods jointly

Assumption TS.1’ (Linearity and Weak Dependence) Same as TS.1, but add the assumption that $\{(\mathbf{x}_t, y_t): t = 1, 2, \dots\}$ is stationary and weakly dependent. In particular, the law of large numbers and the central limit theorem can be applied to sample averages.

Assumption TS.2’ (No Perfect Collinearity) Same as TS.2 Don’t mess with a good thing!

Assumption TS.3’ (Zero Conditional Mean) For each t, the expected value of the error $u_t$, given the explanatory variables for ITS OWN time period, is zero: $E(u_t \mid \mathbf{x}_t) = 0$ That is, $\mathbf{x}_t$ is CONTEMPORANEOUSLY EXOGENOUS (Note: Due to stationarity, if contemporaneous exogeneity holds for one time period, it holds for them all.)

Assumption TS.3’ Notes Note that technically, the only requirement for Theorem 11.1 is a zero UNconditional mean for $u_t$ and zero covariance between $u_t$ and each regressor: $E(u_t) = 0, \quad \text{Cov}(x_{tj}, u_t) = 0, \quad j = 1, \dots, k$ However, assumption TS.3’ leads to a more straightforward analysis

Theorem 11.1 (Consistency of OLS) Under assumptions TS.1’ through TS.3’, the OLS estimators are consistent: $\text{plim}\, \hat{\beta}_j = \beta_j, \quad j = 0, 1, \dots, k$
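
A simulation can illustrate the theorem (a sketch with assumed values, not part of the slides): in the AR(1) model $y_t = 0.5 y_{t-1} + u_t$, strict exogeneity fails ($u_t$ is correlated with the future regressor $y_t$), yet OLS on the lag is consistent, and the slope estimate tightens around 0.5 as T grows:

```python
import numpy as np

rng = np.random.default_rng(3)
beta = 0.5

def ols_slope(T, reps=2_000):
    """Distribution of the OLS slope from regressing y_t on y_{t-1} across simulated paths."""
    est = np.empty(reps)
    for r in range(reps):
        u = rng.normal(size=T)
        y = np.empty(T)
        y[0] = u[0]
        for t in range(1, T):
            y[t] = beta * y[t - 1] + u[t]
        x, yy = y[:-1], y[1:]
        est[r] = (x * yy).sum() / (x * x).sum()   # OLS slope (no intercept)
    return est.mean(), est.std()

for T in (25, 100, 1_000):
    m, s = ols_slope(T)
    print(f"T={T:5d}  mean estimate={m:.3f}  sd={s:.3f}")
# The mean approaches 0.5 and the spread shrinks: biased in small samples, yet consistent.
```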

11.2 Asymptotic OLS Properties -We’ve effectively: 1) weakened exogeneity to contemporaneous exogeneity 2) strengthened the variable assumption by requiring stationarity and weak dependence This lets us conclude that OLS is consistent even when unbiasedness is impossible. -To allow for tests, we now impose less strict homoskedasticity and no serial correlation assumptions:

Assumption TS.4’ (Homoskedasticity) The errors are CONTEMPORANEOUSLY HOMOSKEDASTIC, that is, $\text{Var}(u_t \mid \mathbf{x}_t) = \sigma^2$

Assumption TS.5’ (No Serial Correlation) For all $t \ne s$, $E(u_t u_s \mid \mathbf{x}_t, \mathbf{x}_s) = 0$

11.2 Asymptotic OLS Properties Note the modifications in these assumptions: -TS.4’ now conditions only on the explanatory variables in time period t -TS.5’ now conditions only on the explanatory variables in the two time periods involved, t and s -Note that TS.5’ does hold in AR(1) models -if only one lag matters, the errors are serially uncorrelated -these assumptions allow us to test OLS estimates without the normal distribution assumption:

Theorem 11.2 (Asymptotic Normality of OLS) Under assumptions TS.1’ through TS.5’, the OLS estimators are asymptotically normally distributed. Further, the usual OLS standard errors, t statistics and F statistics are asymptotically valid.
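
A Monte Carlo sketch (assumed values: $\rho = 0.5$, T = 500; not part of the slides) can illustrate the theorem: the usual t statistic for the lag coefficient, centered at the true value, rejects at roughly the nominal 5% rate even without assuming normal errors:

```python
import numpy as np

rng = np.random.default_rng(7)
beta, T, reps = 0.5, 500, 5_000
tstats = np.empty(reps)

for r in range(reps):
    u = rng.normal(size=T)
    y = np.empty(T)
    y[0] = u[0]
    for t in range(1, T):
        y[t] = beta * y[t - 1] + u[t]
    x, yy = y[:-1], y[1:]
    b = (x * yy).sum() / (x * x).sum()                         # OLS slope (no intercept)
    resid = yy - b * x
    se = np.sqrt(resid @ resid / (len(x) - 1) / (x * x).sum())  # usual OLS standard error
    tstats[r] = (b - beta) / se                                 # centered at the true value

print(f"rejection rate at |t|>1.96: {np.mean(np.abs(tstats) > 1.96):.3f} (nominal 0.05)")
```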

11.3 Random Walks -We’ve already seen consistent AR(1) processes with $|\rho_1| < 1$; however, some series are better described by $\rho_1 = 1$, producing a RANDOM WALK process: $y_t = y_{t-1} + e_t, \quad t = 1, 2, \dots$ -where $e_t$ is iid with mean zero and constant variance $\sigma_e^2$, and the initial value $y_0$ is independent of all $e_t$ -essentially, this period's y is the sum of last period's y and a zero-mean random variable e

11.3 Random Walks -We can also calculate the expected value and variance of a random walk by substituting repeatedly: $y_t = e_t + e_{t-1} + \dots + e_1 + y_0$ so that $E(y_t) = E(y_0)$ and $\text{Var}(y_t) = \sigma_e^2 t$ (taking $y_0 = 0$) -therefore the expected value of a random walk does not depend on t, although the variance does
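
A short simulation (a sketch with the assumed value $\sigma_e = 1$) confirms that the variance grows linearly in t while the mean stays near zero:

```python
import numpy as np

rng = np.random.default_rng(4)
reps, T = 50_000, 100

# Each row is one random walk path with y_0 = 0: the cumulative sum of iid shocks
y = rng.normal(size=(reps, T)).cumsum(axis=1)

for t in (10, 50, 100):
    col = y[:, t - 1]
    print(f"t={t:3d}  mean={col.mean():+.3f}  var={col.var():.1f}  (theory: 0, {t})")
```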

11.3 Random Walks -A random walk is an example of a HIGHLY PERSISTENT or STRONGLY DEPENDENT time series that does not trend -an example of a highly persistent time series WITH a trend is a RANDOM WALK WITH DRIFT: $y_t = \alpha_0 + y_{t-1} + e_t, \quad t = 1, 2, \dots$ -Note that if $y_0 = 0$, then $E(y_t) = \alpha_0 t$

11.5 Time Series Homoskedasticity -Due to lagged terms, time series homoskedasticity from TS.4’ can differ from cross-sectional homoskedasticity -a simple static model and its homoskedasticity assumption take the form: $y_t = \beta_0 + \beta_1 z_t + u_t, \quad \text{Var}(u_t \mid z_t) = \sigma^2$ -The addition of lagged terms changes this homoskedasticity statement, since the conditioning set must include the lags:

11.5 Time Series Homoskedasticity -For the AR(1) model: $\text{Var}(u_t \mid y_{t-1}) = \text{Var}(y_t \mid y_{t-1}) = \sigma^2$ -Likewise, for a more complicated model such as $y_t = \beta_0 + \beta_1 z_t + \beta_2 y_{t-1} + \beta_3 z_{t-1} + u_t$: $\text{Var}(u_t \mid z_t, y_{t-1}, z_{t-1}) = \sigma^2$ -Essentially, y's variance must be constant given ALL explanatory variables, lagged or not -This may lead to dynamic homoskedasticity statements
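
As a rough illustration (a sketch, not a formal test from the slides; it mimics a Breusch-Pagan-style regression under the assumed value $\rho = 0.5$), one can fit the AR(1) by OLS and regress the squared residuals on $y_{t-1}$: under dynamic homoskedasticity the slope should be near zero:

```python
import numpy as np

rng = np.random.default_rng(8)
T, rho = 10_000, 0.5

# Simulate an AR(1) whose errors satisfy Var(u_t | y_{t-1}) = 1 (dynamically homoskedastic)
u = rng.normal(size=T)
y = np.empty(T)
y[0] = u[0]
for t in range(1, T):
    y[t] = rho * y[t - 1] + u[t]

x, yy = y[:-1], y[1:]
X = np.column_stack([np.ones_like(x), x])
b = np.linalg.lstsq(X, yy, rcond=None)[0]
resid2 = (yy - X @ b) ** 2

# Breusch-Pagan-style check: regress squared residuals on the regressor
g = np.linalg.lstsq(X, resid2, rcond=None)[0]
print(f"slope of resid^2 on y_(t-1): {g[1]:+.4f} (near 0 under dynamic homoskedasticity)")
```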