Time Series
Presented by Vikas Kumar Vidyarthi, Ph.D. Scholar (10203069), CE
Instructor: Dr. L. D. Behera, Department of Electrical Engineering, Indian Institute of Technology Kanpur

Contents:
- Correlation and Regression
- What is a Time Series?
- Fields of Application
- Methods: Autoregressive (AR) process, Moving Average (MA) process, ARMA process
- Example of input variable selection by ACF, CCF and PACF
- Understanding

Correlation and Regression
Correlation: measures the degree and extent of association between two variables or two series. It is measured by the correlation coefficient r.
Regression: discovering how a dependent variable (y) is related to one or more independent variables (x). We obtain y = f(x), and in this way we can forecast the dependent variable for the future.
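As a minimal sketch of both ideas (the data and variable names x and y are illustrative, not from the slides), the correlation coefficient r and a simple least-squares fit y = f(x) can be computed as follows:

```python
import numpy as np

# Illustrative data: x is the independent variable, y the dependent one.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

# Correlation coefficient r: degree (and direction) of linear association.
r = np.corrcoef(x, y)[0, 1]

# Simple linear regression y = a*x + b by least squares.
a, b = np.polyfit(x, y, deg=1)

# Forecast the dependent variable for a future value of x.
x_future = 7.0
y_forecast = a * x_future + b

print(f"r = {r:.3f}, y = {a:.3f}*x + {b:.3f}, forecast at x = 7: {y_forecast:.2f}")
```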

What is a Time Series? An ordered sequence of values of a variable at equally spaced time intervals, i.e., a collection of observations indexed by the date of each observation. In any time series plot we generally find these four components:
Trend: the long-term increase or decrease in the series.
Season: a pattern that repeats over a fixed period.

What is a Time Series? (cont.)
Cycle: longer-term oscillations, generally sinusoidal-type curves.
Random: the irregular component that remains after trend, season and cycle are removed.
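A hedged sketch of how these components can be separated in practice, using a classical additive decomposition (statsmodels' seasonal_decompose; the synthetic monthly data and the period of 12 are assumptions for illustration):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly series: trend + seasonal cycle + random noise.
rng = np.random.default_rng(0)
t = np.arange(120)
series = pd.Series(
    0.05 * t                                # trend
    + 2.0 * np.sin(2 * np.pi * t / 12)      # seasonal component (period 12)
    + rng.normal(scale=0.5, size=t.size),   # random component
    index=pd.date_range("2000-01", periods=120, freq="MS"),
)

# Classical additive decomposition into trend, seasonal and residual parts.
result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())
print(result.seasonal.head())
print(result.resid.dropna().head())
```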

Fields of Application
The usage of time series models is twofold:
– Obtain an understanding of the underlying forces and structure that produced the observed data.
– Fit a model and proceed to forecasting, monitoring, or even feedback and feedforward control.
Time series analysis is used for many applications, such as: economic forecasting, sales forecasting, budgetary analysis, stock market analysis, yield projections, process and quality control, inventory studies, workload projections, utility studies, census analysis, weather data analysis, climate data analysis, tide level analysis, and seismic wave analysis.

Methods: Autoregressive (AR) Processes
AR(1): first-order autoregression, Y_t = c + φ Y_{t-1} + ε_t, where ε_t is noise.
Stationarity: we will assume |φ| < 1.
The process can then be written as Y_t = c/(1 − φ) + ε_t + φ ε_{t-1} + φ² ε_{t-2} + …

Properties of AR(1)
Mean: E[Y_t] = μ = c/(1 − φ).
Variance: γ_0 = E[(Y_t − μ)²] = σ²/(1 − φ²).

Properties of AR(1), cont.
Autocovariances: γ_j = E[(Y_t − μ)(Y_{t−j} − μ)] = φ^j σ²/(1 − φ²) for j ≥ 0.

Autocorrelation Function for AR(1): ρ_j = γ_j/γ_0 = φ^j, which decays geometrically with the lag j.
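A small sketch (φ = 0.7 and the sample size are arbitrary illustrative choices) that simulates an AR(1) process and compares its sample ACF with the theoretical ρ_j = φ^j:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

phi, c, sigma = 0.7, 0.0, 1.0   # |phi| < 1, so the process is stationary
n = 5000
rng = np.random.default_rng(1)

# Simulate Y_t = c + phi*Y_{t-1} + eps_t.
y = np.zeros(n)
eps = rng.normal(scale=sigma, size=n)
for t in range(1, n):
    y[t] = c + phi * y[t - 1] + eps[t]

sample_acf = acf(y, nlags=5)
theoretical_acf = phi ** np.arange(6)   # rho_j = phi**j for AR(1)

for j in range(6):
    print(f"lag {j}: sample {sample_acf[j]:+.3f}  theoretical {theoretical_acf[j]:+.3f}")
```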

Autoregressive Processes of Higher Order
pth-order autoregression AR(p): Y_t = c + φ_1 Y_{t−1} + φ_2 Y_{t−2} + … + φ_p Y_{t−p} + ε_t.
Stationarity: we will assume that the roots of 1 − φ_1 z − φ_2 z² − … − φ_p z^p = 0 all lie outside the unit circle.

Properties of AR(p)
The autocovariances and autocorrelations can be solved for using the Yule-Walker equations, ρ_j = φ_1 ρ_{j−1} + φ_2 ρ_{j−2} + … + φ_p ρ_{j−p} for j ≥ 1.
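As a sketch of the Yule-Walker idea (the AR(2) coefficients below are illustrative, not from the slides), the equations can be solved for the φ's from the sample autocorrelations:

```python
import numpy as np
from scipy.linalg import toeplitz
from statsmodels.tsa.stattools import acf

# Simulate an AR(2) process with illustrative coefficients phi1, phi2.
phi = np.array([0.5, 0.3])
rng = np.random.default_rng(2)
n = 20000
y = np.zeros(n)
eps = rng.normal(size=n)
for t in range(2, n):
    y[t] = phi[0] * y[t - 1] + phi[1] * y[t - 2] + eps[t]

# Yule-Walker system: R * phi = r, where R is the Toeplitz matrix built from
# rho_0 .. rho_{p-1} and r is the vector (rho_1, ..., rho_p).
p = 2
rho = acf(y, nlags=p)
R = toeplitz(rho[:p])          # [[rho_0, rho_1], [rho_1, rho_0]]
r = rho[1:p + 1]
phi_hat = np.linalg.solve(R, r)
print("estimated AR coefficients:", phi_hat)   # should be close to [0.5, 0.3]
```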

Moving Average Processes
MA(1): first-order moving average process, Y_t = μ + ε_t + θ ε_{t−1}.
Y_t is constructed from a weighted sum of the two most recent values of ε.

Properties of MA(1)
Mean: E[Y_t] = μ.
Variance: γ_0 = (1 + θ²) σ².
Autocovariances: γ_1 = θ σ², and γ_j = 0 for j > 1.

MA(1) is covariance stationary: its mean and autocovariances are not functions of time.
The autocorrelation of a covariance-stationary process is ρ_j = γ_j/γ_0; the MA(1) case is given below.

Autocorrelation Function for White Noise: ρ_0 = 1 and ρ_j = 0 for all j ≠ 0.

Autocorrelation Function for MA(1): ρ_1 = θ/(1 + θ²) and ρ_j = 0 for j > 1, so the ACF cuts off after lag 1.
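A sketch (θ = 0.6 is an arbitrary illustrative value) that simulates an MA(1) process and checks that the sample ACF is close to θ/(1 + θ²) at lag 1 and near zero beyond:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

theta, mu = 0.6, 0.0
rng = np.random.default_rng(3)
n = 20000

# Simulate Y_t = mu + eps_t + theta*eps_{t-1}.
eps = rng.normal(size=n + 1)
y = mu + eps[1:] + theta * eps[:-1]

sample_acf = acf(y, nlags=4)
rho1_theory = theta / (1 + theta**2)   # ACF cuts off after lag 1
print("lag 1: sample", round(sample_acf[1], 3), "theory", round(rho1_theory, 3))
print("lags 2-4 (should be near 0):", np.round(sample_acf[2:], 3))
```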

Mixed Autoregressive Moving Average (ARMA) Processes
ARMA(p, q) includes both autoregressive and moving average terms: Y_t = c + φ_1 Y_{t−1} + … + φ_p Y_{t−p} + ε_t + θ_1 ε_{t−1} + … + θ_q ε_{t−q}.
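A hedged sketch of fitting such a model in Python with statsmodels (an ARMA(1,1) is fitted to simulated data; the orders and coefficients are illustrative choices, not from the slides):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

# Simulate an ARMA(1,1) process; ArmaProcess expects lag-polynomial form,
# i.e. [1, -phi1] for the AR part and [1, theta1] for the MA part.
phi1, theta1 = 0.6, 0.4
process = ArmaProcess(ar=[1, -phi1], ma=[1, theta1])
y = process.generate_sample(nsample=5000, scale=1.0)

# An ARMA(p, q) model is ARIMA(p, 0, q): both AR and MA terms, no differencing.
model = ARIMA(y, order=(1, 0, 1))
result = model.fit()
print(result.params)    # estimated constant, AR and MA coefficients
print(result.aic)       # information criterion, useful for comparing orders
```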

Thank you!

White Noise Process
The basic building block for time series processes: E[ε_t] = 0, E[ε_t²] = σ², and E[ε_t ε_τ] = 0 for t ≠ τ.
Independent white noise process – a slightly stronger condition: ε_t and ε_τ are independent, not merely uncorrelated.
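A minimal sketch generating independent Gaussian white noise and checking its defining moments (the sample size and σ are arbitrary):

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(4)
sigma = 1.0

# Independent Gaussian white noise: zero mean, constant variance,
# eps_t and eps_tau independent (hence uncorrelated) for t != tau.
eps = rng.normal(loc=0.0, scale=sigma, size=10000)

print("mean ~ 0:", round(eps.mean(), 3))
print("variance ~ sigma^2:", round(eps.var(), 3))
print("ACF at lags 1-3 ~ 0:", np.round(acf(eps, nlags=3)[1:], 3))
```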

Autocovariance
The covariance of Y_t with its own lagged value: γ_j = E[(Y_t − μ)(Y_{t−j} − μ)].
Example: calculate the autocovariances for:
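The target of the slide's example is not reproduced in the transcript; as a stand-in sketch, here are sample autocovariances of an MA(1) process (θ = 0.6 and σ = 1 are illustrative), which can be compared with the theoretical γ_0 = (1 + θ²)σ², γ_1 = θσ², γ_j = 0 for j > 1:

```python
import numpy as np

def autocovariance(y, lag):
    """Sample autocovariance at the given lag:
    gamma_j = average of (y_t - ybar)(y_{t-j} - ybar)."""
    y = np.asarray(y, dtype=float)
    ybar = y.mean()
    return np.mean((y[lag:] - ybar) * (y[:len(y) - lag] - ybar))

# Simulated MA(1) with theta = 0.6, sigma = 1.
rng = np.random.default_rng(5)
eps = rng.normal(size=20001)
y = eps[1:] + 0.6 * eps[:-1]

for j in range(4):
    print(f"gamma_{j} =", round(autocovariance(y, j), 3))
```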

Stationarity Covariance-stationary or weakly stationary process – Neither the mean nor the autocovariances depend on the date t

Stationarity, cont.
Covariance stationary processes – the covariance between Y_t and Y_{t−j} depends only on j (the length of time separating the observations) and not on t (the date of the observation).

Stationarity, cont.
Strict stationarity – for any values of j_1, j_2, …, j_n, the joint distribution of (Y_t, Y_{t+j_1}, Y_{t+j_2}, …, Y_{t+j_n}) depends only on the intervals separating the dates and not on the date itself.
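A rough sketch (not a formal test, and the AR(1) series is illustrative) of what weak stationarity means in practice: the sample mean and lag-1 autocovariance computed on different halves of a stationary series should agree up to sampling error, because they do not depend on the date t:

```python
import numpy as np

def mean_and_gamma1(y):
    """Sample mean and lag-1 autocovariance of a segment."""
    ybar = y.mean()
    return ybar, np.mean((y[1:] - ybar) * (y[:-1] - ybar))

# Stationary AR(1) series (phi = 0.7): moments should not depend on the date t.
rng = np.random.default_rng(6)
n = 20000
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + rng.normal()

first_half, second_half = y[: n // 2], y[n // 2:]
print("first half :", mean_and_gamma1(first_half))
print("second half:", mean_and_gamma1(second_half))
```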

Table 1: Correlation coefficients of Q(t) for Bird Creek

Autocorrelation coefficients     Cross-correlation coefficients
Flow       Value                 Rainfall     Value
Q(t)                             P(t)
Q(t-1)                           P(t-1)
Q(t-2)                           P(t-2)
Q(t-3)                           P(t-3)
Q(t-4)                           P(t-4)
Q(t-5)                           P(t-5)
Q(t-6)                           P(t-6)
Q(t-7)                           P(t-7)
Q(t-8)                           P(t-8)
Q(t-9)                           P(t-9)
Q(t-10)                          P(t-10)

Autocorrelation plot of Q(t); cross-correlation plot of Q(t).

Partial Autocorrelation Coefficients

Rainfall     Value
Q(t)
Q(t-1)
Q(t-2)
Q(t-3)
Q(t-4)
Q(t-5)
Q(t-6)
Q(t-7)
Q(t-8)
Q(t-9)
Q(t-10)
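The numerical values of the tables above are not reproduced in the transcript; as a hedged sketch of how such coefficients are obtained for input variable selection, the ACF, CCF and PACF can be computed with statsmodels (the series Q and P below are synthetic placeholders standing in for the Bird Creek flow and rainfall data, which are not included here):

```python
import numpy as np
from statsmodels.tsa.stattools import acf, ccf, pacf

# Placeholder series standing in for Bird Creek flow Q(t) and rainfall P(t).
rng = np.random.default_rng(7)
n = 1000
P = rng.gamma(shape=2.0, scale=1.0, size=n)              # "rainfall"
Q = (np.convolve(P, [0.5, 0.3, 0.2], mode="same")        # "flow" driven by rainfall
     + rng.normal(scale=0.1, size=n))

max_lag = 10
acf_Q = acf(Q, nlags=max_lag)        # autocorrelation of Q at lags 0..10
pacf_Q = pacf(Q, nlags=max_lag)      # partial autocorrelation of Q at lags 0..10
ccf_QP = ccf(Q, P)[: max_lag + 1]    # cross-correlation between Q and P at lags 0..10

for lag in range(max_lag + 1):
    print(f"lag {lag:2d}: ACF {acf_Q[lag]:+.3f}  "
          f"PACF {pacf_Q[lag]:+.3f}  CCF {ccf_QP[lag]:+.3f}")
```

Lags at which these coefficients are large are the candidate inputs Q(t-i) and P(t-j) for a forecasting model of Q(t).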