# Stochastic processes Lecture 8: Ergodicity


Random process

Agenda (Lec. 8)

- Ergodicity
- Essential equations
- Biomedical engineering example: analysis of heart sound murmurs

Ergodicity

A random process X(t) is ergodic if all of its statistics can be determined from a sample function of the process; that is, the ensemble averages equal the corresponding time averages with probability one.

Ergodicity illustrated

Statistics can be determined by time averaging of a single realization.

Ergodicity and stationarity
Wide-sense stationary: the mean is constant and the autocorrelation depends only on the time lag. Strictly stationary: all statistics are invariant to time shifts.

Weak forms of ergodicity
The complete statistics are often difficult to estimate, so we are often only interested in two weaker forms: ergodicity in the mean and ergodicity in the autocorrelation.

Ergodicity in the Mean

A random process is ergodic in the mean if $E[X(t)]$ equals the time average of a sample function (realization):

$$E[X(t)] = \langle x(t) \rangle$$

where $\langle \cdot \rangle$ denotes time averaging,

$$\langle x(t) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, dt$$

Necessary and sufficient condition: $X(t+\tau)$ and $X(t)$ must become independent as $\tau$ approaches $\infty$.

Example

Ergodic in the mean: $X(t) = a \sin(2\pi \omega_r t + \theta)$, where $\omega_r$ is a random variable and $a$ and $\theta$ are constants. The time-averaged mean does not depend on the random variable $\omega_r$.

Not ergodic in the mean: $X(t) = a \sin(2\pi \omega_r t + \theta) + dc_r$, where $\omega_r$ and $dc_r$ are random variables. The time-averaged mean depends on the random variable $dc_r$, so it differs between realizations.
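The two cases above can be simulated directly (a minimal NumPy sketch; the window length, frequency range, and offset distribution are arbitrary choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 100, 100_000)   # long observation window
a, theta = 1.0, 0.3                # constants, chosen arbitrarily

# Ergodic in the mean: X(t) = a*sin(2*pi*omega_r*t + theta).
# The time average of every realization is ~0, matching E[X(t)].
means_ergodic = []
for _ in range(5):
    omega_r = rng.uniform(0.5, 2.0)          # random frequency
    x = a * np.sin(2 * np.pi * omega_r * t + theta)
    means_ergodic.append(x.mean())           # time average of one realization

# Not ergodic in the mean: adding a random DC offset dc_r makes the
# time average differ from realization to realization.
means_not_ergodic = []
for _ in range(5):
    omega_r = rng.uniform(0.5, 2.0)
    dc_r = rng.normal(0.0, 1.0)              # random offset
    x = a * np.sin(2 * np.pi * omega_r * t + theta) + dc_r
    means_not_ergodic.append(x.mean())

print(np.max(np.abs(means_ergodic)))   # close to 0 for every realization
print(np.ptp(means_not_ergodic))       # clear spread between realizations
```

The spread of `means_not_ergodic` shows that a single realization cannot recover the ensemble mean once a random offset is present.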

Ergodicity in the Autocorrelation
Ergodic in the autocorrelation means that the autocorrelation can be found by time averaging a single realization:

$$R_{xx}(\tau) = \langle x(t+\tau)\, x(t) \rangle$$

where

$$\langle x(t+\tau)\, x(t) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t+\tau)\, x(t)\, dt$$

Necessary and sufficient condition: $X(t+\tau)X(t)$ and $X(t+\tau+a)X(t+a)$ must become independent as $a$ approaches $\infty$.

The time-averaged autocorrelation (discrete version)

$$\hat{R}_{xx}[m] = \frac{1}{N - |m|} \sum_{n=0}^{N-|m|-1} x[n]\, x[n+m]$$
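A minimal sketch of this estimator in NumPy (normalizing by the number of products actually summed, $N-m$, which makes the estimate unbiased; the white-noise sanity check is my own addition):

```python
import numpy as np

def time_avg_autocorr(x, max_lag):
    """Time-average autocorrelation estimate R_xx[m] for m = 0..max_lag,
    normalized by the number of products actually summed (N - m)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    return np.array([np.dot(x[:N - m], x[m:]) / (N - m)
                     for m in range(max_lag + 1)])

# Sanity check on white noise: R_xx[0] ~ variance, R_xx[m != 0] ~ 0.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 50_000)
R = time_avg_autocorr(x, 5)
print(R)
```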

Example (1/2) Autocorrelation
A random process $X(t) = A \cos(2\pi f_c t + \Theta)$, where $A$ and $f_c$ are constants and $\Theta$ is a random variable uniformly distributed over the interval $[0, 2\pi]$. The autocorrelation of $X(t)$ is

$$R_{xx}(\tau) = \frac{A^2}{2} \cos(2\pi f_c \tau)$$

What is the autocorrelation of a sample function?

Example (2/2) The time-averaged autocorrelation of the sample function

$$\langle x(t+\tau)\, x(t) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} A^2 \cos\big(2\pi f_c (t+\tau) + \theta\big) \cos\big(2\pi f_c t + \theta\big)\, dt$$

$$= \lim_{T \to \infty} \frac{A^2}{4T} \int_{-T}^{T} \Big[\cos(2\pi f_c \tau) + \cos\big(4\pi f_c t + 2\pi f_c \tau + 2\theta\big)\Big]\, dt = \frac{A^2}{2} \cos(2\pi f_c \tau)$$

Thereby the time average equals the ensemble autocorrelation, so the process is ergodic in the autocorrelation.
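The same conclusion can be checked numerically on one sample function (a sketch; the amplitude, frequency, sampling rate, and record length are arbitrary choices):

```python
import numpy as np

A, fc, theta = 2.0, 5.0, 0.7       # one realization: theta is fixed
fs = 1000.0                        # sampling rate (Hz)
t = np.arange(0, 200, 1 / fs)      # long record approximates T -> infinity
x = A * np.cos(2 * np.pi * fc * t + theta)

# Time-averaged autocorrelation at integer-sample lags.
N = len(x)
lags = np.arange(100)
R_hat = np.array([np.dot(x[:N - m], x[m:]) / (N - m) for m in lags])

# Ensemble result from the example: R_xx(tau) = (A^2 / 2) cos(2 pi fc tau)
tau = lags / fs
R_theory = (A ** 2 / 2) * np.cos(2 * np.pi * fc * tau)
print(np.max(np.abs(R_hat - R_theory)))   # small: time avg matches ensemble avg
```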

Ergodicity of the First-Order Distribution
If a process is ergodic, the first-order distribution can be determined by passing $x(t)$ through the system

$$Y(t) = \begin{cases} 1, & X(t) \le x \\ 0, & X(t) > x \end{cases}$$

and then time averaging (integrating) the output, so that $F_X(x) = \langle Y(t) \rangle$. Necessary and sufficient condition: $X(t+\tau)$ and $X(t)$ must become independent as $\tau$ approaches $\infty$.
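A sketch of this construction in NumPy (the uniform-phase sinusoid and its arcsine amplitude distribution are my choice of test process, not from the slides):

```python
import numpy as np

# One realization of X(t) = sin(2*pi*f*t + Theta), Theta uniform on [0, 2*pi).
rng = np.random.default_rng(2)
t = np.linspace(0, 1000, 1_000_000)
x_t = np.sin(2 * np.pi * 1.3 * t + rng.uniform(0, 2 * np.pi))

def F_hat(level):
    """Estimate F(x) = P[X(t) <= x] as the time average of the
    indicator system Y(t) = 1 if x(t) <= x else 0."""
    y = (x_t <= level).astype(float)
    return y.mean()

# Theory: a uniform-phase sinusoid has F(x) = 1/2 + arcsin(x)/pi for |x| <= 1.
for level in (-0.5, 0.0, 0.5):
    print(F_hat(level), 0.5 + np.arcsin(level) / np.pi)
```

The time average of the indicator output over a single realization reproduces the ensemble distribution function.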

Ergodicity of Power Spectral Density
A wide-sense stationary process X(t) is ergodic in power spectral density if, for any sample function $x(t)$,

$$S_{xx}(f) = \lim_{T \to \infty} \frac{1}{2T} \left| \int_{-T}^{T} x(t)\, e^{-j 2\pi f t}\, dt \right|^2$$

Example

Ergodic in PSD: $X(t) = a \sin(2\pi \omega t + \theta_r)$, where $\theta_r$ is a random variable and $a$ and $\omega$ are constants. The PSD does not depend on the random phase $\theta_r$.

Not ergodic in PSD: $X(t) = a \sin(2\pi \omega_r t + \theta)$, where $\omega_r$ is a random variable and $a$ and $\theta$ are constants. The PSD depends on the random variable $\omega_r$, so it differs between realizations.
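The contrast is visible in the periodograms of individual realizations (a sketch; the sampling rate, record length, and frequency ranges are arbitrary choices):

```python
import numpy as np

def periodogram(x, fs):
    """|X_T(f)|^2-style spectral estimate of one finite record."""
    X = np.fft.rfft(x)
    return np.abs(X) ** 2 / (len(x) * fs)

rng = np.random.default_rng(3)
fs, T = 100.0, 50.0
t = np.arange(0, T, 1 / fs)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Random phase, fixed frequency: the spectral peak is identical in
# every realization, so one sample function reveals the PSD.
peaks_phase = []
for _ in range(5):
    theta_r = rng.uniform(0, 2 * np.pi)
    x = np.sin(2 * np.pi * 2.0 * t + theta_r)
    peaks_phase.append(freqs[np.argmax(periodogram(x, fs))])

# Random frequency: the peak moves from realization to realization,
# so no single sample function represents the ensemble PSD.
peaks_freq = []
for _ in range(5):
    omega_r = rng.uniform(1.0, 10.0)
    x = np.sin(2 * np.pi * omega_r * t + 0.3)
    peaks_freq.append(freqs[np.argmax(periodogram(x, fs))])

print(peaks_phase)   # same peak location every time
print(peaks_freq)    # peak location varies with omega_r
```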

Essential equations

Typical signals

Dirac delta:

$$\delta(t) = \begin{cases} \infty, & t = 0 \\ 0, & \text{else} \end{cases} \qquad \int_{-\infty}^{\infty} \delta(t)\, dt = 1$$

Complex exponential functions:

$$e^{jt} = \cos(t) + j \sin(t)$$
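Euler's identity is easy to verify numerically (a trivial NumPy check):

```python
import numpy as np

# e^{jt} = cos(t) + j*sin(t), checked on a grid of t values.
t = np.linspace(-np.pi, np.pi, 1001)
err = np.max(np.abs(np.exp(1j * t) - (np.cos(t) + 1j * np.sin(t))))
print(err)   # floating-point noise only
```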

Essential equations Distribution and density functions
First-order distribution: $F_X(x; t) = P[X(t) \le x]$

First-order density function: $f_X(x; t) = \partial F_X(x; t) / \partial x$

Second-order distribution: $F_X(x_1, x_2; t_1, t_2) = P[X(t_1) \le x_1,\, X(t_2) \le x_2]$

Second-order density function: $f_X(x_1, x_2; t_1, t_2) = \partial^2 F_X(x_1, x_2; t_1, t_2) / (\partial x_1\, \partial x_2)$

Essential equations Expected value 1st order (Mean)
Expected value (mean): $m_x(t) = E[X(t)]$. In the case of WSS, $m_x = E[X(t)]$ is constant. In the case of ergodicity, $m_x = \langle x(t) \rangle$, where $\langle \cdot \rangle$ denotes time averaging such as

$$\langle x(t) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, dt$$

Essential equations Auto-correlations
In the general case: $R_{xx}(t_1, t_2) = E[X(t_1) X(t_2)]$, and thereby $R_{xx}(t_1, t_1) = E[X(t_1)^2]$.

If X(t) is WSS: $R_{xx}(\tau) = R_{xx}(t+\tau, t) = E[X(t+\tau) X(t)]$

If X(t) is ergodic: $R_{xx}(\tau) = \langle x(t+\tau)\, x(t) \rangle$, where

$$\langle x(t+\tau)\, x(t) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t+\tau)\, x(t)\, dt$$

Essential equations Cross-correlations
In the general case: $R_{xy}(t_1, t_2) = E[X(t_1) Y(t_2)] = R_{yx}^{*}(t_2, t_1)$

In the case of WSS: $R_{xy}(\tau) = R_{xy}(t+\tau, t) = E[X(t+\tau) Y(t)]$

Properties of autocorrelation and cross-correlation

Autocorrelation: $R_{xx}(t_1, t_1) = E[|X(t_1)|^2]$. When WSS: $R_{xx}(0) = E[|X(t)|^2] = \sigma_x^2 + m_x^2$.

Cross-correlation: if $Y(t)$ and $X(t)$ are independent, then $R_{xy}(t_1, t_2) = E[X(t_1) Y(t_2)] = E[X(t_1)]\, E[Y(t_2)]$. If $Y(t)$ and $X(t)$ are orthogonal, then $R_{xy}(t_1, t_2) = E[X(t_1) Y(t_2)] = 0$.
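The WSS identity $R_{xx}(0) = \sigma_x^2 + m_x^2$ can be checked on simulated data (a sketch; the Gaussian process and its parameters are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
m_x, sigma_x = 2.0, 0.5
x = rng.normal(m_x, sigma_x, 1_000_000)   # samples of a (trivially WSS) process

R0 = np.mean(x * x)                       # R_xx(0) estimated as E[X^2]
print(R0, sigma_x ** 2 + m_x ** 2)        # both near 4.25
```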

Essential equations PSD
Truncated Fourier transform of $X(t)$: $X_T(f) = \int_{-T}^{T} X(t)\, e^{-j 2\pi f t}\, dt$

Power spectrum: $S_{xx}(f) = \lim_{T \to \infty} \frac{1}{2T}\, E\big[|X_T(f)|^2\big]$

Or from the autocorrelation, as the Fourier transform of the autocorrelation:

$$S_{xx}(f) = \int_{-\infty}^{\infty} R_{xx}(\tau)\, e^{-j 2\pi f \tau}\, d\tau$$
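The autocorrelation route has an exact discrete analogue worth seeing once: the squared-magnitude DFT of a finite sequence equals the DFT of its circular autocorrelation (a NumPy sketch):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, 256)
N = len(x)

psd = np.abs(np.fft.fft(x)) ** 2          # |X[k]|^2 directly

# Circular autocorrelation r[m] = sum_n x[n] * x[(n + m) mod N]
r = np.array([np.dot(x, np.roll(x, -m)) for m in range(N)])
psd_from_r = np.fft.fft(r).real           # DFT of the autocorrelation

print(np.max(np.abs(psd - psd_from_r)))   # ~0 up to floating point
```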

Essential equations LTI systems (1/4)
Convolution in the time domain: $y(t) = x(t) * h(t) = \int_{-\infty}^{\infty} x(t - \alpha)\, h(\alpha)\, d\alpha$, where $h(t)$ is the impulse response.

Frequency domain: $Y(f) = X(f)\, H(f)$, where $X(f)$ and $H(f)$ are the Fourier transforms of the signal and the impulse response.
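The convolution theorem in code (a sketch; the input and the 3-tap impulse response are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(0.0, 1.0, 64)        # input signal
h = np.array([0.5, 0.3, 0.2])       # impulse response h(t)

y_time = np.convolve(x, h)          # y = x * h in the time domain

# Frequency domain: multiply transforms, zero-padded to the output length.
L = len(x) + len(h) - 1
y_freq = np.fft.irfft(np.fft.rfft(x, L) * np.fft.rfft(h, L), L)

print(np.max(np.abs(y_time - y_freq)))   # ~0: Y(f) = X(f) H(f)
```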

Essential equations LTI systems (2/4)
Expected value (mean) of the output:

$$E[Y(t)] = \int_{-\infty}^{\infty} E[X(t-\alpha)]\, h(\alpha)\, d\alpha = \int_{-\infty}^{\infty} m_x(t-\alpha)\, h(\alpha)\, d\alpha$$

where $m_x(t)$ is the mean of $X(t)$, i.e. $E[X(t)]$. If WSS:

$$m_y = E[Y(t)] = m_x \int_{-\infty}^{\infty} h(\alpha)\, d\alpha$$

Expected mean square value of the output:

$$E[Y(t)^2] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} R_{xx}(t-\alpha,\, t-\beta)\, h(\alpha)\, h(\beta)\, d\alpha\, d\beta$$

If WSS:

$$E[Y(t)^2] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} R_{xx}(\alpha - \beta)\, h(\alpha)\, h(\beta)\, d\alpha\, d\beta$$

Essential equations LTI systems (3/4)
Cross-correlation between input and output when WSS:

$$R_{yx}(\tau) = \int_{-\infty}^{\infty} R_{xx}(\tau - \alpha)\, h(\alpha)\, d\alpha = R_{xx}(\tau) * h(\tau)$$

Autocorrelation of the output when WSS:

$$R_{yy}(\tau) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} E[X(t+\tau-\alpha)\, X(t-\beta)]\, h(\alpha)\, h(\beta)\, d\alpha\, d\beta$$

$$R_{yy}(\tau) = R_{yx}(\tau) * h(-\tau) = R_{xx}(\tau) * h(\tau) * h(-\tau)$$

Essential equations LTI systems (4/4)
PSD of the output:

$$S_{yy}(f) = S_{xx}(f)\, H(f)\, H^{*}(f) = S_{xx}(f)\, |H(f)|^2$$

where $H(f)$ is the transfer function, calculated as the Fourier transform of the impulse response.
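This relation can be verified empirically by filtering white noise, whose input PSD is flat (a sketch; the FIR taps, record length, and number of realizations are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(7)
N, K, sigma = 256, 2000, 1.0
h = np.array([0.4, 0.25, 0.15, 0.1])   # example impulse response
H = np.fft.rfft(h, N)                  # transfer function on the DFT grid

# Average output periodograms over K realizations of white noise
# (flat input PSD, S_xx = sigma^2 in this normalization).
S_yy = np.zeros(N // 2 + 1)
for _ in range(K):
    x = rng.normal(0.0, sigma, N)
    y = np.fft.irfft(np.fft.rfft(x) * H, N)   # circular filtering
    S_yy += np.abs(np.fft.rfft(y)) ** 2 / N
S_yy /= K

S_theory = sigma ** 2 * np.abs(H) ** 2        # S_xx(f) |H(f)|^2
print(np.max(np.abs(S_yy - S_theory)))        # shrinks as K grows
```

Circular filtering is used so that the relation holds exactly on the DFT grid; with linear filtering the edge transients would add a small bias.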

A biomedical example of a stochastic process

Analysis of heart murmurs from aortic valve stenosis using methods from stochastic processes.

Introduction to heart sounds
The main sounds are S1 and S2. S1, the first heart sound, is caused by closure of the AV (atrioventricular) valves. S2, the second heart sound, is caused by closure of the semilunar valves.

Aortic valve stenosis: narrowing of the aortic valve

Reflections of Aortic valve stenosis in the heart sound
A clear systolic murmur, due to post-stenotic turbulence

Abnormal heart sounds

Signal analysis for algorithm specification

Is the heart sound stationary, quasi-stationary or non-stationary? What are the frequency characteristics of systolic murmurs versus a normal systolic period?

Exercise: Chi meditation and the autonomic nervous system
