
An Introduction to Time Series. Ginger Davis, VIGRE Computational Finance Seminar, Rice University, November 26, 2003.

What is a Time Series? Time Series – A collection of observations indexed by the date of each observation, written {Y_t}. Lag Operator – Represented by the symbol L and defined by L Y_t = Y_{t-1}. Mean of Y_t: E(Y_t) = μ_t.

White Noise Process Basic building block for time series processes: a sequence {ε_t} with E(ε_t) = 0, E(ε_t²) = σ², and E(ε_t ε_τ) = 0 for t ≠ τ.

White Noise Processes, cont. Independent White Noise Process – slightly stronger condition that ε_t and ε_τ are independent for t ≠ τ. Gaussian White Noise Process – ε_t ~ N(0, σ²).
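
As a quick illustration (not part of the original slides), the sketch below simulates Gaussian white noise with numpy and checks that the sample mean and variance are close to 0 and σ²; the sample size and σ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0          # assumed noise standard deviation
T = 10_000           # arbitrary sample size

# Gaussian white noise: independent N(0, sigma^2) draws
eps = rng.normal(loc=0.0, scale=sigma, size=T)

print("sample mean:", eps.mean())        # close to 0
print("sample variance:", eps.var())     # close to sigma^2 = 1
```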

Autocovariance Covariance of Y_t with its own lagged value: γ_{jt} = E[(Y_t - μ_t)(Y_{t-j} - μ_{t-j})]. Example: for the white noise process above, γ_0 = σ² and γ_j = 0 for all j ≠ 0.
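
A short numpy sketch (an addition, not from the slides) of the sample autocovariance, (1/T) Σ (Y_t - Ȳ)(Y_{t-j} - Ȳ); applied to simulated white noise it returns a value near σ² at lag 0 and values near 0 at other lags.

```python
import numpy as np

def sample_autocovariance(y, j):
    """Sample autocovariance at lag j: (1/T) * sum over t of (y_t - ybar)(y_{t-j} - ybar)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    ybar = y.mean()
    return np.sum((y[j:] - ybar) * (y[:T - j] - ybar)) / T

rng = np.random.default_rng(1)
eps = rng.normal(0.0, 1.0, size=5_000)   # white noise with sigma^2 = 1

print([round(sample_autocovariance(eps, j), 3) for j in range(4)])
# lag 0 is close to 1; lags 1-3 are close to 0
```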

Stationarity Covariance-stationary or weakly stationary process – neither the mean nor the autocovariances depend on the date t: E(Y_t) = μ and E[(Y_t - μ)(Y_{t-j} - μ)] = γ_j for all t and j.

Stationarity, cont. Two example processes – one covariance stationary, one not covariance stationary.
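
The slides' two specific processes are not reproduced here, so as an assumed illustration the sketch below contrasts a covariance-stationary AR(1) (|φ| < 1) with a random walk, whose variance grows with t and which is therefore not covariance stationary.

```python
import numpy as np

rng = np.random.default_rng(2)
T, phi = 1_000, 0.5                          # assumed sample length and AR coefficient
eps = rng.normal(0.0, 1.0, size=(2_000, T))  # many independent sample paths

# Covariance-stationary AR(1): Y_t = phi * Y_{t-1} + eps_t
ar1 = np.zeros_like(eps)
for t in range(1, T):
    ar1[:, t] = phi * ar1[:, t - 1] + eps[:, t]

# Random walk: Y_t = Y_{t-1} + eps_t  (variance grows linearly in t)
rw = np.cumsum(eps, axis=1)

print("AR(1) variance at t=100 vs t=900:", ar1[:, 100].var(), ar1[:, 900].var())
print("Random walk variance at t=100 vs t=900:", rw[:, 100].var(), rw[:, 900].var())
```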

Stationarity, cont. Covariance stationary processes – the covariance between Y_t and Y_{t-j} depends only on j (the length of time separating the observations) and not on t (the date of the observation).

Stationarity, cont. Strict stationarity – for any values of j_1, j_2, ..., j_n, the joint distribution of (Y_t, Y_{t+j_1}, Y_{t+j_2}, ..., Y_{t+j_n}) depends only on the intervals separating the dates and not on the date itself.

Gaussian Processes Gaussian process {Y_t} – the joint density of (Y_t, Y_{t+j_1}, ..., Y_{t+j_n}) is Gaussian for any choice of lags j_1, ..., j_n. What can be said about a covariance stationary Gaussian process? It is also strictly stationary, since a multivariate normal distribution is completely determined by its means and covariances.

Ergodicity A covariance-stationary process is said to be ergodic for the mean if the time average (1/T) Σ_{t=1}^{T} Y_t converges in probability to E(Y_t) as T → ∞.
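
As an assumed illustration, the sketch below simulates an ergodic AR(1) with mean μ = c/(1 - φ) and shows the time average approaching that mean as T grows; the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
c, phi, sigma = 1.0, 0.8, 1.0       # assumed AR(1) parameters; mean = c / (1 - phi) = 5
T = 200_000

y = np.empty(T)
y[0] = c / (1 - phi)                # start at the unconditional mean
for t in range(1, T):
    y[t] = c + phi * y[t - 1] + rng.normal(0.0, sigma)

for n in (100, 1_000, 10_000, T):
    print(f"time average over first {n:>6} obs: {y[:n].mean():.3f}")  # approaches 5.0
```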

Describing the dynamics of a Time Series Moving Average (MA) processes Autoregressive (AR) processes Autoregressive / Moving Average (ARMA) processes Autoregressive conditional heteroscedastic (ARCH) processes

Moving Average Processes MA(1): first-order moving average process, Y_t = μ + ε_t + θ ε_{t-1} – called a "moving average" because Y_t is constructed from a weighted sum of the two most recent values of ε.

Properties of MA(1) E(Y_t) = μ; γ_0 = E[(Y_t - μ)²] = (1 + θ²)σ²; γ_1 = E[(Y_t - μ)(Y_{t-1} - μ)] = θσ²; γ_j = 0 for j > 1.

MA(1) Covariance stationary – the mean and autocovariances are not functions of time. Autocorrelation of a covariance-stationary process: ρ_j = γ_j / γ_0. For the MA(1): ρ_1 = θ / (1 + θ²) and ρ_j = 0 for j > 1.

Autocorrelation Function for White Noise: ρ_0 = 1 and ρ_j = 0 for all j ≠ 0.

Autocorrelation Function for MA(1): a single spike of ρ_1 = θ / (1 + θ²) at lag 1, with ρ_j = 0 for j > 1.
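
The following numpy sketch (an addition, with an arbitrary θ) simulates an MA(1) and compares its sample autocorrelations with the theoretical values ρ_1 = θ/(1 + θ²) and ρ_j = 0 for j > 1.

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelations rho_1 .. rho_max_lag."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    gamma0 = np.mean(y * y)
    return np.array([np.mean(y[j:] * y[:-j]) / gamma0 for j in range(1, max_lag + 1)])

rng = np.random.default_rng(4)
theta, T = 0.6, 50_000                     # assumed MA(1) coefficient and sample size
eps = rng.normal(0.0, 1.0, size=T + 1)
y = eps[1:] + theta * eps[:-1]             # Y_t = eps_t + theta * eps_{t-1}  (mu = 0)

print("theoretical rho_1:", theta / (1 + theta**2))   # about 0.441
print("sample ACF lags 1-3:", np.round(sample_acf(y, 3), 3))
```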

Moving Average Processes of higher order MA(q): q-th order moving average process, Y_t = μ + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}. Properties of MA(q): E(Y_t) = μ; γ_0 = (1 + θ_1² + ... + θ_q²)σ²; γ_j = (θ_j + θ_{j+1}θ_1 + ... + θ_q θ_{q-j})σ² for j = 1, ..., q; γ_j = 0 for j > q.
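
A small helper (an addition, not from the slides) that evaluates the MA(q) autocovariance formula above for an arbitrary coefficient vector.

```python
import numpy as np

def ma_autocovariances(theta, sigma2=1.0, max_lag=None):
    """Theoretical autocovariances of Y_t = mu + eps_t + theta_1 eps_{t-1} + ... + theta_q eps_{t-q}."""
    coeffs = np.concatenate(([1.0], np.asarray(theta, dtype=float)))  # theta_0 = 1
    q = len(coeffs) - 1
    max_lag = q if max_lag is None else max_lag
    gammas = []
    for j in range(max_lag + 1):
        # gamma_j = sigma^2 * sum_i theta_i * theta_{i+j} for j <= q, and 0 beyond lag q
        gammas.append(sigma2 * np.sum(coeffs[j:] * coeffs[:q + 1 - j]) if j <= q else 0.0)
    return np.array(gammas)

print(ma_autocovariances([0.6]))                   # MA(1): [1.36, 0.6]
print(ma_autocovariances([0.4, 0.3], max_lag=4))   # MA(2): zero beyond lag 2
```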

Autoregressive Processes AR(1): first-order autoregression, Y_t = c + φ Y_{t-1} + ε_t. Stationarity: we will assume |φ| < 1. Can represent as an MA(∞): Y_t = c/(1 - φ) + ε_t + φ ε_{t-1} + φ² ε_{t-2} + ...

Properties of AR(1) E(Y_t) = μ = c / (1 - φ); γ_0 = σ² / (1 - φ²).

Properties of AR(1), cont. γ_j = φ^j σ² / (1 - φ²), so the autocorrelations are ρ_j = φ^j.

Autocorrelation Function for AR(1): ρ_j = φ^j, which decays geometrically toward zero as the lag j increases (oscillating in sign when φ < 0).
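
Another assumed illustration: simulate an AR(1) with an arbitrary φ and compare the sample autocorrelations with the theoretical values φ^j.

```python
import numpy as np

rng = np.random.default_rng(5)
phi, T = 0.7, 50_000                       # assumed AR(1) coefficient and sample size

y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.normal()   # Y_t = phi * Y_{t-1} + eps_t  (c = 0)

yc = y - y.mean()
gamma0 = np.mean(yc * yc)
sample_rho = [np.mean(yc[j:] * yc[:-j]) / gamma0 for j in range(1, 5)]

print("theoretical rho_j:", [round(phi**j, 3) for j in range(1, 5)])
print("sample rho_j:     ", [round(r, 3) for r in sample_rho])
```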

Gaussian White Noise

AR(1)

Autoregressive Processes of higher order p-th order autoregression, AR(p): Y_t = c + φ_1 Y_{t-1} + φ_2 Y_{t-2} + ... + φ_p Y_{t-p} + ε_t. Stationarity: we will assume that the roots of 1 - φ_1 z - φ_2 z² - ... - φ_p z^p = 0 all lie outside the unit circle.
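
A quick way to check the stationarity condition numerically (an added sketch, with made-up coefficients): find the roots of 1 - φ_1 z - ... - φ_p z^p with numpy and verify that each root has modulus greater than one.

```python
import numpy as np

phi = [0.5, 0.3]                           # assumed AR(2) coefficients
# Polynomial 1 - phi_1 z - phi_2 z^2, written highest power first for np.roots
poly = np.concatenate(([1.0], -np.asarray(phi)))[::-1]
roots = np.roots(poly)

print("roots:", roots)
print("all outside unit circle (stationary):", bool(np.all(np.abs(roots) > 1)))
```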

Properties of AR(p) Can solve for autocovariances / autocorrelations using the Yule-Walker equations: γ_j = φ_1 γ_{j-1} + φ_2 γ_{j-2} + ... + φ_p γ_{j-p} for j ≥ 1, together with γ_0 = φ_1 γ_1 + ... + φ_p γ_p + σ².
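
As a sketch of how the Yule-Walker equations are used (an added example with assumed AR(2) coefficients): divide through by γ_0 to get ρ_j = φ_1 ρ_{j-1} + φ_2 ρ_{j-2}, solve the first equation for ρ_1, then recurse for higher lags.

```python
import numpy as np

phi1, phi2 = 0.5, 0.3                      # assumed AR(2) coefficients

# Yule-Walker in autocorrelation form: rho_j = phi1 * rho_{j-1} + phi2 * rho_{j-2}
rho = np.empty(6)
rho[0] = 1.0
rho[1] = phi1 / (1.0 - phi2)               # from rho_1 = phi1 + phi2 * rho_1
for j in range(2, 6):
    rho[j] = phi1 * rho[j - 1] + phi2 * rho[j - 2]

print(np.round(rho, 4))                    # theoretical ACF of the AR(2) at lags 0-5
```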

Mixed Autoregressive Moving Average Processes ARMA(p, q) includes both autoregressive and moving average terms: Y_t = c + φ_1 Y_{t-1} + ... + φ_p Y_{t-p} + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}.
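
A minimal simulation sketch for an ARMA(1,1) with arbitrary parameters; libraries such as statsmodels can also simulate and estimate ARMA models, but a direct recursion keeps the definition visible.

```python
import numpy as np

rng = np.random.default_rng(6)
c, phi, theta, sigma = 0.2, 0.7, 0.4, 1.0  # assumed ARMA(1,1) parameters
T = 10_000

eps = rng.normal(0.0, sigma, size=T)
y = np.empty(T)
y[0] = c / (1 - phi)                       # start near the unconditional mean
for t in range(1, T):
    # Y_t = c + phi * Y_{t-1} + eps_t + theta * eps_{t-1}
    y[t] = c + phi * y[t - 1] + eps[t] + theta * eps[t - 1]

print("sample mean:", round(y.mean(), 3),
      "vs theoretical mean c/(1-phi) =", round(c / (1 - phi), 3))
```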

Time Series Models for Financial Data A Motivating Example – the Federal Funds rate – We are interested in forecasting not only the level of the series, but also its variance – The variance is not constant over time.

U.S. Federal Funds Rate

Modeling the Variance AR(p) for the level: Y_t = c + φ_1 Y_{t-1} + ... + φ_p Y_{t-p} + u_t. ARCH(m) – autoregressive conditional heteroscedastic process of order m – the square of u_t follows an AR(m) process: u_t² = ζ + α_1 u_{t-1}² + ... + α_m u_{t-m}² + w_t, where w_t is a new white noise process.
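
An assumed illustration of an ARCH(1): simulate u_t with conditional variance h_t = ζ + α u_{t-1}² (the standard generative form consistent with u_t² following an AR(1)), using arbitrary parameters, and check the unconditional variance and volatility clustering.

```python
import numpy as np

rng = np.random.default_rng(7)
zeta, alpha = 0.2, 0.5                     # assumed ARCH(1) parameters (alpha < 1)
T = 10_000

u = np.zeros(T)
h = np.zeros(T)                            # conditional variance of u_t
h[0] = zeta / (1 - alpha)                  # unconditional variance
for t in range(1, T):
    h[t] = zeta + alpha * u[t - 1] ** 2    # conditional variance depends on the last shock
    u[t] = np.sqrt(h[t]) * rng.normal()    # u_t = sqrt(h_t) * v_t, v_t ~ N(0, 1)

print("unconditional variance (theory):", round(zeta / (1 - alpha), 3))
print("sample variance of u:", round(u.var(), 3))
print("corr(u_t^2, u_{t-1}^2), positive = volatility clustering:",
      round(np.corrcoef(u[1:] ** 2, u[:-1] ** 2)[0, 1], 3))
```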

References Investopia.com Economagic.com Hamilton, J. D. (1994), Time Series Analysis, Princeton, New Jersey: Princeton University Press.