Week 2: Stochastic Process - Introduction


Slide 1: Stochastic Process - Introduction
Stochastic processes are processes that evolve randomly in time. Rather than considering fixed random variables X, Y, etc., or even sequences of i.i.d. random variables, we consider sequences X_0, X_1, X_2, …, where X_t represents some random quantity at time t. In general, the value X_t might depend on the quantity X_{t-1} at time t-1, or even on the value X_s at other times s < t. Example: the simple random walk.
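The simple random walk mentioned above is easy to simulate. A minimal sketch in Python; the function name and parameters are illustrative, not from the slides:

```python
import random

def simple_random_walk(n, p=0.5, seed=0):
    """Return X_0, X_1, ..., X_n with X_0 = 0, where each step is +1
    with probability p and -1 otherwise, so X_t depends on X_{t-1}."""
    rng = random.Random(seed)
    x = [0]
    for _ in range(n):
        step = 1 if rng.random() < p else -1
        x.append(x[-1] + step)
    return x

walk = simple_random_walk(10)
print(walk)  # one realization (a sample path) of the walk
```

Each call with a different seed produces a different realization of the same process, which is exactly the distinction between the process {X_t} and a single time series {x_t}.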

Slide 2: Stochastic Process - Definition
A stochastic process is a family of time-indexed random variables X_t, where t belongs to an index set. Formal notation: {X_t : t ∈ I}, where I is an index set that is a subset of R. Examples of index sets: (1) I = (-∞, ∞) or I = [0, ∞). In this case X_t is a continuous-time stochastic process. (2) I = {0, ±1, ±2, …} or I = {0, 1, 2, …}. In this case X_t is a discrete-time stochastic process. We use the uppercase notation {X_t} to describe the process. A time series {x_t} is a realization, or sample function, from a certain process. We use information from a time series to estimate parameters and properties of the process {X_t}.

Slide 3: Probability Distribution of a Process
For any stochastic process with index set I, its probability distribution is uniquely determined by its finite-dimensional distributions. The k-dimensional distribution function of a process is defined by
F_{t_1, …, t_k}(x_1, …, x_k) = P(X_{t_1} ≤ x_1, …, X_{t_k} ≤ x_k)
for any t_1, …, t_k in I and any real numbers x_1, …, x_k. The collection of these distribution functions tells us everything we need to know about the process {X_t}.

Slide 4: Moments of a Stochastic Process
We can describe a stochastic process via its moments, i.e., E(X_t), E(X_t^2), E(X_t X_s), etc. We often use the first two moments.
The mean function of the process is μ_t = E(X_t).
The variance function of the process is σ_t^2 = Var(X_t) = E[(X_t - μ_t)^2].
The covariance function between X_t and X_s is γ(t, s) = Cov(X_t, X_s) = E[(X_t - μ_t)(X_s - μ_s)].
The correlation function between X_t and X_s is ρ(t, s) = γ(t, s) / (σ_t σ_s).
These moments are, in general, functions of time.

Slide 5: Stationary Processes
A process is said to be strictly stationary if (X_{t_1}, …, X_{t_k}) has the same joint distribution as (X_{t_1+h}, …, X_{t_k+h}) for every shift h and every choice of indices t_1, …, t_k. That is,
F_{t_1, …, t_k}(x_1, …, x_k) = F_{t_1+h, …, t_k+h}(x_1, …, x_k).
If {X_t} is a strictly stationary process with finite first two moments, then the mean function is a constant and the variance function is also a constant. Moreover, for such a process the covariance function and the correlation function depend only on the time difference s. A trivial example of a strictly stationary process is a sequence of i.i.d. random variables.

Slide 6: Weak Stationarity
Strict stationarity is too strong a condition in practice. It is often a difficult assumption to assess based on an observed time series x_1, …, x_k. In time series analysis we therefore often use a weaker notion of stationarity, stated in terms of the moments of the process. A process is said to be nth-order weakly stationary if all its joint moments up to order n exist and are time invariant, i.e., independent of the time origin. For example, a second-order weakly stationary process will have constant mean and variance, with the covariance and the correlation being functions of the time difference alone. A strictly stationary process with finite first two moments is also second-order weakly stationary. But a strictly stationary process may not have finite moments, and therefore may not be weakly stationary.

Slide 7: The Autocovariance and Autocorrelation Functions
For a stationary process {X_t} with constant mean μ and constant variance σ^2, the covariance between X_t and X_{t+s} is
γ(s) = Cov(X_t, X_{t+s}) = E[(X_t - μ)(X_{t+s} - μ)],
and the correlation between X_t and X_{t+s} is
ρ(s) = Corr(X_t, X_{t+s}) = γ(s) / γ(0),
where γ(0) = Var(X_t) = σ^2. As functions of s, γ(s) is called the autocovariance function and ρ(s) is called the autocorrelation function (ACF). They represent the covariance and correlation between X_t and X_{t+s} from the same process, separated only by s time lags.
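Stationarity means γ(s) = Cov(X_t, X_{t+s}) does not depend on t, only on the lag s. A quick Monte Carlo sketch can illustrate this. The moving-average construction X_t = Z_t + θZ_{t-1} used below is a hypothetical example chosen only because its autocovariance is known in closed form (γ(1) = θ when Var(Z_t) = 1); it is not defined in these slides:

```python
import random

def ma1_realization(n, theta, rng):
    # Hypothetical example process: X_t = Z_t + theta * Z_{t-1},
    # with Z_t i.i.d. N(0, 1). It is stationary with gamma(1) = theta.
    z = [rng.gauss(0.0, 1.0) for _ in range(n + 1)]
    return [z[t + 1] + theta * z[t] for t in range(n)]

def cov_across_realizations(reals, t, s):
    # Monte Carlo estimate of Cov(X_t, X_{t+s}) computed ACROSS many
    # independent realizations, so it can be compared at different t.
    a = [r[t] for r in reals]
    b = [r[t + s] for r in reals]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / (len(a) - 1)

rng = random.Random(42)
reals = [ma1_realization(20, 0.5, rng) for _ in range(20000)]
# gamma(1) = 0.5 regardless of the starting time t:
print(cov_across_realizations(reals, 2, 1))
print(cov_across_realizations(reals, 10, 1))
```

Both printed estimates hover around 0.5, demonstrating that the lag-1 autocovariance is the same whether measured starting at t = 2 or t = 10.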

Slide 8: Properties of γ(s) and ρ(s)
For a stationary process, the autocovariance function γ(s) and the autocorrelation function ρ(s) have the following properties:
- γ(0) = Var(X_t) = σ^2 and ρ(0) = 1.
- |γ(s)| ≤ γ(0) and |ρ(s)| ≤ 1.
- γ(s) = γ(-s) and ρ(s) = ρ(-s), i.e., both are symmetric in the lag.
- The autocovariance function γ(s) and the autocorrelation function ρ(s) are positive semidefinite, in the sense that for any time points t_1, …, t_k and any real numbers a_1, …, a_k,
Σ_i Σ_j a_i a_j γ(t_i - t_j) ≥ 0, and similarly for ρ.

Slide 9: Correlogram
A correlogram is a plot of the autocorrelation function ρ(s) versus the lag s, where s = 0, 1, …. Example…
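In practice a correlogram is drawn with a plotting library, but the idea can be sketched with text output. This minimal Python sketch estimates the ACF from one realization (the helper names are illustrative) and prints one bar per lag:

```python
def sample_acf(x, max_lag):
    # Sample autocorrelation rho_hat(s) = gamma_hat(s) / gamma_hat(0),
    # with gamma_hat(s) = (1/n) * sum_t (x_t - xbar)(x_{t+s} - xbar).
    n = len(x)
    xbar = sum(x) / n
    def gamma(s):
        return sum((x[t] - xbar) * (x[t + s] - xbar) for t in range(n - s)) / n
    g0 = gamma(0)
    return [gamma(s) / g0 for s in range(max_lag + 1)]

def text_correlogram(rho):
    # Crude text version of a correlogram: one bar per lag.
    for s, r in enumerate(rho):
        bar = "#" * round(abs(r) * 40)
        sign = "+" if r >= 0 else "-"
        print(f"lag {s:2d}  rho = {sign}{abs(r):.3f}  {bar}")

x = [float((-1) ** t) for t in range(50)]  # an alternating toy series
text_correlogram(sample_acf(x, 5))
```

For this alternating series the bars alternate between strongly negative correlations at odd lags and strongly positive ones at even lags, which is the signature pattern a correlogram is meant to reveal.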

Slide 10: Partial Autocorrelation Function
Often we want to investigate the dependency/association between X_t and X_{t+k} adjusting for their dependence on X_{t+1}, X_{t+2}, …, X_{t+k-1}. The conditional correlation Corr(X_t, X_{t+k} | X_{t+1}, X_{t+2}, …, X_{t+k-1}) is usually referred to as the partial autocorrelation in time series analysis. The partial autocorrelation is especially useful for identifying autoregressive models.
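The partial autocorrelations can be computed recursively from the ACF. A sketch using the Durbin-Levinson recursion, a standard algorithm that is not itself covered in these slides:

```python
def pacf_from_acf(rho):
    # Durbin-Levinson recursion: computes the partial autocorrelations
    # phi_{kk} = Corr(X_t, X_{t+k} | X_{t+1}, ..., X_{t+k-1}) from the
    # autocorrelations rho[0..K], where rho[0] = 1.
    K = len(rho) - 1
    phi = {}   # phi[(k, j)] = j-th coefficient at recursion stage k
    pac = []
    for k in range(1, K + 1):
        if k == 1:
            phi[(1, 1)] = rho[1]
        else:
            num = rho[k] - sum(phi[(k - 1, j)] * rho[k - j] for j in range(1, k))
            den = 1.0 - sum(phi[(k - 1, j)] * rho[j] for j in range(1, k))
            phi[(k, k)] = num / den
            for j in range(1, k):
                phi[(k, j)] = phi[(k - 1, j)] - phi[(k, k)] * phi[(k - 1, k - j)]
        pac.append(phi[(k, k)])
    return pac

# For an AR(1)-type ACF, rho(s) = 0.6**s, the PACF cuts off after lag 1:
print(pacf_from_acf([1.0, 0.6, 0.36, 0.216, 0.1296]))
```

The printed values are 0.6 at lag 1 and essentially zero afterward, which is exactly the cut-off behavior that makes the PACF useful for identifying autoregressive models.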

Slide 11: Gaussian Processes
A stochastic process is said to be a normal, or Gaussian, process if its joint probability distributions are normal. For a Gaussian process, weak stationarity implies strict stationarity, because the normal distribution is uniquely characterized by its first two moments. Unless mentioned otherwise, the processes we discuss are assumed to be Gaussian. As in other areas of statistics, most time series results are established for Gaussian processes.

Slide 12: White Noise Processes
A process {X_t} is called a white noise process if it is a sequence of uncorrelated random variables from a fixed distribution with constant mean μ (usually assumed to be 0) and constant variance σ^2. A white noise process is stationary, with autocovariance and autocorrelation functions given by
γ(0) = σ^2 and γ(s) = 0 for s ≠ 0;
ρ(0) = 1 and ρ(s) = 0 for s ≠ 0.
A white noise process is Gaussian if its joint distribution is normal.
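The flat ACF of white noise is easy to verify empirically. A short Python sketch (self-contained, with its own ACF helper; the names are illustrative):

```python
import random

def sample_acf(x, max_lag):
    # rho_hat(s) = gamma_hat(s) / gamma_hat(0)
    n = len(x)
    xbar = sum(x) / n
    def gamma(s):
        return sum((x[t] - xbar) * (x[t + s] - xbar) for t in range(n - s)) / n
    g0 = gamma(0)
    return [gamma(s) / g0 for s in range(max_lag + 1)]

rng = random.Random(7)
noise = [rng.gauss(0.0, 1.0) for _ in range(5000)]  # Gaussian white noise
rho = sample_acf(noise, 5)
print(rho)  # rho_hat(0) = 1; the rest hover near 0, roughly within 2/sqrt(n)
```

The sample autocorrelations at nonzero lags are not exactly zero, only close to it; their typical size, about 2/sqrt(n), is the basis for the significance bands usually drawn on a correlogram.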

Slide 13: Estimation of the Mean
Given a single realization {x_t} of a stationary process {X_t}, a natural estimator of the mean μ is the sample mean
x̄ = (1/n) Σ_{t=1}^{n} x_t,
which is the time average of the n observations. It can be shown that the sample mean is an unbiased and consistent estimator of μ.

Slide 14: Sample Autocovariance Function
Given a single realization {x_t} of a stationary process {X_t}, the sample autocovariance function, given by
γ̂(s) = (1/n) Σ_{t=1}^{n-s} (x_t - x̄)(x_{t+s} - x̄),
is an estimate of the autocovariance function.

Slide 15: Sample Autocorrelation Function
For a given time series {x_t}, the sample autocorrelation function is given by
ρ̂(s) = γ̂(s) / γ̂(0).
The sample autocorrelation function is non-negative definite. The sample autocovariance and autocorrelation functions have the same properties as the autocovariance and autocorrelation functions of the entire process.
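The two estimators from slides 14 and 15 translate directly into code. A minimal sketch (function names are illustrative):

```python
def sample_autocovariance(x, s):
    # gamma_hat(s) = (1/n) * sum_{t=1}^{n-s} (x_t - xbar)(x_{t+s} - xbar)
    n = len(x)
    xbar = sum(x) / n
    return sum((x[t] - xbar) * (x[t + s] - xbar) for t in range(n - s)) / n

def sample_autocorrelation(x, s):
    # rho_hat(s) = gamma_hat(s) / gamma_hat(0)
    return sample_autocovariance(x, s) / sample_autocovariance(x, 0)

x = [2.0, 4.0, 6.0, 4.0, 2.0, 4.0, 6.0, 4.0]
print(sample_autocovariance(x, 0))   # 2.0 -- the sample variance (divisor n)
print(sample_autocorrelation(x, 1))  # 0.0 for this series
```

Note the divisor n (not n - s) in γ̂(s): this is the convention that makes the sample autocorrelation function non-negative definite, as stated on slide 15.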

Slide 16: Example