Slide 1: Time Series Data (Prof. Dr. Rainer Stachuletz)
y_t = β0 + β1 x_t1 + … + βk x_tk + u_t
1. Basic Analysis

Slide 2: Time Series vs. Cross-Sectional
- Time series data has a temporal ordering, unlike cross-sectional data.
- We will need to alter some of our assumptions to account for the fact that we no longer have a random sample of individuals.
- Instead, we have one realization of a stochastic (i.e. random) process.

Slide 3: Examples of Time Series Models
- A static model relates contemporaneous variables: y_t = β0 + β1 z_t + u_t
- A finite distributed lag (FDL) model allows one or more variables to affect y with a lag: y_t = α0 + δ0 z_t + δ1 z_{t-1} + δ2 z_{t-2} + u_t
- More generally, a finite distributed lag model of order q includes q lags of z.
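As a concrete illustration, an FDL model of order 2 can be estimated by OLS once the lagged regressors are constructed. The sketch below uses simulated data; the true coefficients (α0 = 1.0, δ = 0.5, 0.3, 0.1) and the noise level are illustrative assumptions, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated data: y follows an FDL model of order 2,
# y_t = alpha0 + delta0*z_t + delta1*z_{t-1} + delta2*z_{t-2} + u_t,
# with (assumed) true coefficients alpha0 = 1.0 and deltas = 0.5, 0.3, 0.1.
z = rng.normal(size=n)
u = rng.normal(scale=0.2, size=n)
y = 1.0 + 0.5 * z + u
y[1:] += 0.3 * z[:-1]
y[2:] += 0.1 * z[:-2]

# Drop the first two periods, which lack the required lags, then run OLS.
Y = y[2:]
X = np.column_stack([np.ones(n - 2), z[2:], z[1:-1], z[:-2]])
beta_fdl, *_ = np.linalg.lstsq(X, Y, rcond=None)

print(np.round(beta_fdl, 2))  # estimates of [alpha0, delta0, delta1, delta2]
```

With a sample this size the estimates land close to the assumed true values, which is the finite-sample unbiasedness discussed on the later slides at work.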

Slide 4: Finite Distributed Lag Models
- We call δ0 the impact propensity: it reflects the immediate change in y.
- For a temporary, one-period change in z, y returns to its original level in period q+1.
- We call δ0 + δ1 + … + δq the long-run propensity (LRP): it reflects the long-run change in y after a permanent change in z.
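In code, the two propensities are simple functions of the lag coefficients. The numbers below are hypothetical estimates used purely for illustration:

```python
# Hypothetical estimated lag coefficients delta_0, delta_1, delta_2 of an FDL(2) model.
deltas = [0.50, 0.30, 0.10]

impact_propensity = deltas[0]      # immediate effect of a one-unit change in z
long_run_propensity = sum(deltas)  # total effect of a permanent one-unit change in z

print(impact_propensity)              # 0.5
print(round(long_run_propensity, 2))  # 0.9
```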

Slide 5: Assumptions for Unbiasedness
- Still assume a model that is linear in parameters: y_t = β0 + β1 x_t1 + … + βk x_tk + u_t
- Still need a zero conditional mean assumption: E(u_t|X) = 0, t = 1, 2, …, n
- Note that this implies the error term in any given period is uncorrelated with the explanatory variables in all time periods.

Slide 6: Assumptions (continued)
- This zero conditional mean assumption implies the x's are strictly exogenous.
- An alternative assumption, more parallel to the cross-sectional case, is E(u_t|x_t) = 0.
- This weaker assumption implies the x's are only contemporaneously exogenous.
- Contemporaneous exogeneity is sufficient only in large samples.

Slide 7: Assumptions (continued)
- Still need to assume that no x is constant and that there is no perfect collinearity.
- Note we have skipped the random sampling assumption.
- The key implication of random sampling is that each u_i is independent.
- In the time series case, our strict exogeneity assumption takes care of this.

Slide 8: Unbiasedness of OLS
- Under these three assumptions, the OLS estimators are unbiased when using time series data.
- Thus, just as with cross-sectional data, OLS is unbiased under the appropriate conditions.
- Omitted variable bias can be analyzed in the same manner as in the cross-sectional case.

Slide 9: Variances of the OLS Estimators
- Just as in the cross-sectional case, we need to add a homoskedasticity assumption in order to derive variances.
- We now assume Var(u_t|X) = Var(u_t) = σ².
- Thus, the error variance is independent of all the x's, and it is constant over time.
- We also need the assumption of no serial correlation: Corr(u_t, u_s|X) = 0 for t ≠ s.

Slide 10: OLS Variances (continued)
- Under these five assumptions, the OLS variances in the time series case are the same as in the cross-sectional case.
- The estimator of σ² is also the same, and OLS remains BLUE.
- With the additional assumption of normally distributed errors, inference is the same.

Slide 11: Trending Time Series
- Economic time series often have a trend.
- Just because two series are trending together, we cannot assume that the relationship is causal.
- Often, both will be trending because of other, unobserved factors.
- Even if those factors are unobserved, we can control for them by directly controlling for the trend.

Slide 12: Trends (continued)
- One possibility is a linear trend, modeled as y_t = α0 + α1 t + e_t, t = 1, 2, …
- Another possibility is an exponential trend, modeled as log(y_t) = α0 + α1 t + e_t, t = 1, 2, …
- Another possibility is a quadratic trend, modeled as y_t = α0 + α1 t + α2 t² + e_t, t = 1, 2, …
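All three trend specifications are ordinary OLS fits on functions of t. A minimal sketch on simulated data (the series, seed, and parameters are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 120
t = np.arange(1, n + 1)

# Illustrative series with a linear trend: y_t = 2.0 + 0.05*t + noise.
y = 2.0 + 0.05 * t + rng.normal(scale=0.5, size=n)

# Linear trend:      y_t = alpha0 + alpha1*t + e_t
X_lin = np.column_stack([np.ones(n), t])
a_lin, *_ = np.linalg.lstsq(X_lin, y, rcond=None)

# Quadratic trend:   y_t = alpha0 + alpha1*t + alpha2*t^2 + e_t
X_quad = np.column_stack([np.ones(n), t, t ** 2])
a_quad, *_ = np.linalg.lstsq(X_quad, y, rcond=None)

# Exponential trend: log(y_t) = alpha0 + alpha1*t + e_t (requires y > 0)
a_exp, *_ = np.linalg.lstsq(X_lin, np.log(y), rcond=None)

print(np.round(a_lin, 3))  # slope estimate should be close to the true 0.05
```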

Slide 13: Detrending
- Adding a linear trend term to a regression is equivalent to using "detrended" series in the regression.
- Detrending a series means regressing each variable in the model on t.
- The residuals form the detrended series.
- In effect, the trend has been partialled out.

Slide 14: Detrending (continued)
- An advantage of actually detrending the data (versus adding a trend term) concerns the calculation of goodness of fit.
- Time series regressions tend to have very high R², because the trend is well explained.
- The R² from a regression on detrended data better reflects how well the x_t's explain y_t.
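The equivalence claimed on the previous slide (a Frisch-Waugh-type result) can be checked numerically. A sketch with simulated trending series, where all names and parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150
t = np.arange(1, n + 1)

# Both series trend upward; x also has a genuine effect (0.5) on y.
x = 0.03 * t + rng.normal(size=n)
y = 1.0 + 0.5 * x + 0.02 * t + rng.normal(scale=0.3, size=n)

def ols(X, target):
    b, *_ = np.linalg.lstsq(X, target, rcond=None)
    return b

# (1) Include a linear trend term directly in the regression.
b_full = ols(np.column_stack([np.ones(n), x, t]), y)

# (2) Detrend y and x separately, then regress residuals on residuals.
T = np.column_stack([np.ones(n), t])
y_dt = y - T @ ols(T, y)
x_dt = x - T @ ols(T, x)
b_dt = ols(x_dt.reshape(-1, 1), y_dt)

# The slope on x is identical in both approaches (up to rounding error).
print(np.round(b_full[1], 6), np.round(b_dt[0], 6))
```

The two printed slopes agree to many decimal places, which is exactly the "trend has been partialled out" claim from the slide.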

Slide 15: Seasonality
- Time series data often exhibit periodicity, referred to as seasonality.
- Example: quarterly data on retail sales will tend to jump up in the 4th quarter.
- Seasonality can be dealt with by adding a set of seasonal dummy variables.
- As with trends, the series can instead be seasonally adjusted before running the regression.
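The seasonal-dummy approach can be sketched as follows on a simulated quarterly series; the retail-sales-style data, the size of the fourth-quarter jump (2.0), and the base level (5.0) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
quarters = np.tile([1, 2, 3, 4], 10)  # 10 years of quarterly data
n = quarters.size

# Illustrative series that jumps up in the 4th quarter, as on the slide.
y = 5.0 + np.where(quarters == 4, 2.0, 0.0) + rng.normal(scale=0.3, size=n)

# Seasonal dummies for quarters 2-4; quarter 1 is the omitted base category.
d2 = (quarters == 2).astype(float)
d3 = (quarters == 3).astype(float)
d4 = (quarters == 4).astype(float)
X = np.column_stack([np.ones(n), d2, d3, d4])

b_seas, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(b_seas, 2))  # coefficient on d4 should be near the true jump of 2.0
```

Each dummy coefficient is interpreted relative to the base quarter, so a large coefficient on d4 captures the fourth-quarter jump directly.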