Stochastic processes Lecture 8: Ergodicity


Random process

Agenda (Lec. 8)
- Ergodicity
- Central equations
- Biomedical engineering example: analysis of heart sound murmurs

Ergodicity
A random process X(t) is ergodic if all of its statistics can be determined from a single sample function of the process; that is, the ensemble averages equal the corresponding time averages with probability one.

Ergodicity illustrated: the statistics can be determined by time averaging of a single realization.

Ergodicity and stationarity
Wide-sense stationary: the mean is constant and the autocorrelation depends only on the lag, not on absolute time.
Strictly stationary: all statistics are invariant to time shifts.

Weak forms of ergodicity
The complete statistics are often difficult to estimate, so we are often interested only in:
- Ergodicity in the mean
- Ergodicity in the autocorrelation

Ergodicity in the Mean
A random process is ergodic in the mean if E[X(t)] equals the time average of a sample function (realization):
$E[X(t)] = \langle x(t) \rangle = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,dt$
where $\langle \cdot \rangle$ denotes time averaging.
Necessary and sufficient condition: X(t+τ) and X(t) must become independent as τ approaches ∞.

Example
Ergodic in the mean: $X(t) = a \sin(2\pi \omega_r t + \theta)$, where $\omega_r$ is a random variable and $a$ and $\theta$ are constants. The time-averaged mean is independent of the random variable $\omega_r$.
Not ergodic in the mean: $X(t) = a \sin(2\pi \omega_r t + \theta) + dc_r$, where $\omega_r$ and $dc_r$ are random variables. The time-averaged mean is not independent of the random variable $dc_r$.
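A numerical sketch of the two cases above, with illustrative parameter values (the frequency range, phase, and sample count are my own choices, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(1)
n = np.arange(200_000)                    # discrete time index for one realization

# Ergodic in the mean: random frequency w_r, fixed amplitude and phase.
# The time average of any realization is ~0, matching the ensemble mean.
w_r = rng.uniform(0.01, 0.1)              # one draw of the random frequency
x_ergodic = np.sin(2 * np.pi * w_r * n + 0.3)

# Not ergodic in the mean: a random DC offset dc_r survives time averaging,
# so the time average depends on which dc_r this realization happened to draw.
dc_r = rng.normal(0.0, 1.0)
x_not_ergodic = x_ergodic + dc_r

print(x_ergodic.mean())                   # near 0 for any draw of w_r
print(x_not_ergodic.mean())               # near dc_r, differs between realizations
```

The second process fails ergodicity in the mean because no amount of time averaging can remove the realization-specific offset dc_r.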

Ergodicity in the Autocorrelation
Ergodic in the autocorrelation means that the autocorrelation can be found by time averaging a single realization:
$R_{xx}(\tau) = \langle x(t+\tau)\, x(t) \rangle$
where
$\langle x(t+\tau)\, x(t) \rangle = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t+\tau)\, x(t)\,dt$
Necessary and sufficient condition: X(t+τ)X(t) and X(t+τ+a)X(t+a) must become independent as a approaches ∞.

The time-average autocorrelation (discrete version):
$\hat{R}_{xx}[m] = \frac{1}{N} \sum_{n=0}^{N-|m|-1} x[n]\, x[n+m]$
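The discrete estimator can be sketched directly in code (the white-noise test signal and lag range are my own choices for illustration):

```python
import numpy as np

def time_avg_autocorr(x, max_lag):
    """Time-average autocorrelation estimate:
    R_xx[m] = (1/N) * sum_{n=0}^{N-|m|-1} x[n] * x[n+m]."""
    N = len(x)
    return np.array([np.dot(x[:N - m], x[m:]) / N for m in range(max_lag + 1)])

# White noise has R_xx[0] = sigma^2 and R_xx[m] ~ 0 for m > 0.
rng = np.random.default_rng(4)
x = rng.normal(0.0, 1.0, size=200_000)
R = time_avg_autocorr(x, max_lag=5)
print(R)   # first value near 1, the rest near 0
```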

Example (1/2) Autocorrelation
A random process $X(t) = A \cos(2\pi f_c t + \Theta)$, where $A$ and $f_c$ are constants, and $\Theta$ is a random variable uniformly distributed over the interval [0, 2π]. The autocorrelation of X(t) is:
$R_{xx}(\tau) = \frac{A^2}{2} \cos(2\pi f_c \tau)$
What is the autocorrelation of a sample function?

Example (2/2)
The time-averaged autocorrelation of the sample function:
$\langle x(t+\tau)\, x(t) \rangle = \lim_{T\to\infty} \frac{A^2}{4T} \int_{-T}^{T} \left[ \cos(2\pi f_c \tau) + \cos(4\pi f_c t + 2\pi f_c \tau + 2\theta) \right] dt = \frac{A^2}{2} \cos(2\pi f_c \tau)$
Thereby the time average equals the ensemble autocorrelation, so the process is ergodic in the autocorrelation.
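A numeric check of this result, with assumed example values for A, f_c, θ and the lag (a long finite window stands in for T → ∞):

```python
import numpy as np

A, f_c, theta = 2.0, 5.0, 1.234           # assumed example values
dt = 1e-3
t = np.arange(0.0, 1000.0, dt)            # long window approximating T -> infinity
x = A * np.cos(2 * np.pi * f_c * t + theta)

tau = 0.02                                # evaluate the autocorrelation at one lag
x_shifted = A * np.cos(2 * np.pi * f_c * (t + tau) + theta)
R_time = np.mean(x_shifted * x)           # <x(t+tau) x(t)>
R_theory = (A**2 / 2) * np.cos(2 * np.pi * f_c * tau)
print(R_time, R_theory)                   # both close to 1.618
```

The oscillatory second term of the integrand averages out over the long window, leaving only the lag-dependent term.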

Ergodicity of the First-Order Distribution
If a process is ergodic, the first-order distribution can be determined by feeding x(t) into the indicator system
$Y(t) = \begin{cases} 1 & x(t) \le x \\ 0 & \text{otherwise} \end{cases}$
and then time averaging the output: the fraction of time that $x(t) \le x$ equals $F_X(x)$.
Necessary and sufficient condition: X(t+τ) and X(t) must become independent as τ approaches ∞.

Ergodicity of Power Spectral Density
A wide-sense stationary process X(t) is ergodic in power spectral density if, for any sample function x(t), the periodogram
$\frac{1}{2T} \left| \int_{-T}^{T} x(t)\, e^{-j2\pi f t}\,dt \right|^2$
converges to $S_{xx}(f)$ as $T \to \infty$.

Example
Ergodic in PSD: $X(t) = a \sin(2\pi \omega t + \theta_r)$, where $\theta_r$ is a random variable and $a$ and $\omega$ are constants. The PSD is independent of the random phase $\theta_r$.
Not ergodic in PSD: $X(t) = a \sin(2\pi \omega_r t + \theta)$, where $\omega_r$ is a random variable and $a$ and $\theta$ are constants. The PSD is not independent of the random variable $\omega_r$.
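A sketch of the ergodic case: each realization draws a different random phase, but the periodogram peak sits at the same frequency (sampling rate, frequency, and duration are assumed example values):

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 100.0                                # assumed sampling rate (Hz)
omega = 10.0                              # fixed frequency (Hz)
t = np.arange(0, 10, 1 / fs)

peaks = []
for _ in range(3):                        # several realizations, different phases
    theta_r = rng.uniform(0, 2 * np.pi)
    x = np.sin(2 * np.pi * omega * t + theta_r)
    psd = np.abs(np.fft.rfft(x)) ** 2     # periodogram (up to scaling)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    peaks.append(freqs[np.argmax(psd)])

print(peaks)                              # peak at ~10 Hz for every realization
```

With a random frequency instead of a random phase, the peak location itself would change between realizations, which is why that process is not ergodic in PSD.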

Essential equations

Typical signals
Dirac delta:
$\delta(t) = \begin{cases} \infty & t = 0 \\ 0 & \text{else} \end{cases}, \qquad \int_{-\infty}^{\infty} \delta(t)\,dt = 1$
Complex exponential functions:
$e^{jt} = \cos(t) + j \sin(t)$
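Euler's formula can be checked numerically (the test value of t is arbitrary):

```python
import numpy as np

t = 0.7                                   # arbitrary test point
lhs = np.exp(1j * t)                      # e^{jt}
rhs = np.cos(t) + 1j * np.sin(t)          # cos(t) + j sin(t)
print(np.isclose(lhs, rhs))               # True
```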

Essential equations: Distribution and density functions
First-order distribution: $F_X(x; t) = P(X(t) \le x)$
First-order density function: $f_X(x; t) = \frac{\partial F_X(x; t)}{\partial x}$
Second-order distribution: $F_X(x_1, x_2; t_1, t_2) = P(X(t_1) \le x_1,\, X(t_2) \le x_2)$
Second-order density function: $f_X(x_1, x_2; t_1, t_2) = \frac{\partial^2 F_X(x_1, x_2; t_1, t_2)}{\partial x_1\, \partial x_2}$

Essential equations: Expected value (mean)
Expected value (mean): $m_x(t) = E[X(t)] = \int_{-\infty}^{\infty} x\, f_X(x; t)\,dx$
In the case of WSS: $m_x = E[X(t)]$ is constant over time.
In the case of ergodicity: $m_x = \langle x(t) \rangle$, where $\langle \cdot \rangle$ denotes time averaging such as $\langle x(t) \rangle = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,dt$

Essential equations: Autocorrelation
In the general case: $R_{xx}(t_1, t_2) = E[X(t_1)\, X^*(t_2)]$, thereby $R_{xx}(t_1, t_1) = E[|X(t_1)|^2]$
If X(t) is WSS: $R_{xx}(\tau) = R_{xx}(t+\tau, t) = E[X(t+\tau)\, X(t)]$
If X(t) is ergodic: $R_{xx}(\tau) = \langle x(t+\tau)\, x(t) \rangle$, where $\langle x(t+\tau)\, x(t) \rangle = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t+\tau)\, x(t)\,dt$

Essential equations: Cross-correlation
In the general case: $R_{xy}(t_1, t_2) = E[X(t_1)\, Y(t_2)] = R_{yx}^*(t_2, t_1)$
In the case of WSS: $R_{xy}(\tau) = R_{xy}(t+\tau, t) = E[X(t+\tau)\, Y(t)]$

Properties of autocorrelation and cross-correlation
Autocorrelation: $R_{xx}(t_1, t_1) = E[|X(t_1)|^2]$; when WSS: $R_{xx}(0) = E[|X(t)|^2] = \sigma_x^2 + m_x^2$
Cross-correlation:
If Y(t) and X(t) are independent: $R_{xy}(t_1, t_2) = E[X(t_1)\, Y(t_2)] = E[X(t_1)]\, E[Y(t_2)]$
If Y(t) and X(t) are orthogonal: $R_{xy}(t_1, t_2) = E[X(t_1)\, Y(t_2)] = 0$
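The WSS identity $R_{xx}(0) = \sigma_x^2 + m_x^2$ can be verified on simulated data (the mean and standard deviation are assumed example values):

```python
import numpy as np

rng = np.random.default_rng(6)
m_x, sigma_x = 1.5, 2.0                       # assumed example mean and std
x = rng.normal(m_x, sigma_x, size=1_000_000)  # i.i.d. samples of a WSS process

R0_estimate = np.mean(x * x)                  # time-average estimate of R_xx(0)
print(R0_estimate, sigma_x**2 + m_x**2)       # both near 6.25
```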

Essential equations: PSD
Truncated Fourier transform of X(t): $X_T(f) = \int_{-T}^{T} X(t)\, e^{-j2\pi f t}\,dt$
Power spectrum: $S_{xx}(f) = \lim_{T\to\infty} \frac{1}{2T} E\left[ |X_T(f)|^2 \right]$
Or from the autocorrelation, as the Fourier transform of the autocorrelation: $S_{xx}(f) = \int_{-\infty}^{\infty} R_{xx}(\tau)\, e^{-j2\pi f \tau}\,d\tau$

Essential equations: LTI systems (1/4)
Convolution in the time domain: $Y(t) = X(t) * h(t) = \int_{-\infty}^{\infty} X(t-\alpha)\, h(\alpha)\,d\alpha$, where h(t) is the impulse response.
Frequency domain: $Y(f) = X(f)\, H(f)$, where X(f) and H(f) are the Fourier transforms of the signal and the impulse response.

Essential equations: LTI systems (2/4)
Expected value (mean) of the output:
$E[Y(t)] = \int_{-\infty}^{\infty} E[X(t-\alpha)]\, h(\alpha)\,d\alpha = \int_{-\infty}^{\infty} m_x(t-\alpha)\, h(\alpha)\,d\alpha$, where $m_x(t)$ is the mean of X(t), i.e. E[X(t)].
If WSS: $m_y = E[Y(t)] = m_x \int_{-\infty}^{\infty} h(\alpha)\,d\alpha$
Expected mean-square value of the output:
$E[Y(t)^2] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} R_{xx}(t-\alpha,\, t-\beta)\, h(\alpha)\, h(\beta)\,d\alpha\,d\beta$
If WSS: $E[Y(t)^2] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} R_{xx}(\alpha-\beta)\, h(\alpha)\, h(\beta)\,d\alpha\,d\beta$
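The WSS output-mean formula has a discrete analogue, $m_y = m_x \sum_k h[k]$, sketched here with an assumed FIR impulse response:

```python
import numpy as np

rng = np.random.default_rng(2)
m_x = 2.0                                      # input mean (example value)
x = m_x + rng.normal(0.0, 1.0, size=500_000)   # WSS input: constant mean + noise
h = np.array([0.25, 0.5, 0.25])                # assumed FIR impulse response

y = np.convolve(x, h, mode="valid")            # LTI filtering of the input
print(y.mean(), m_x * h.sum())                 # output mean ~ m_x * sum(h) = 2.0
```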

Essential equations: LTI systems (3/4)
Cross-correlation between input and output when WSS:
$R_{yx}(\tau) = \int_{-\infty}^{\infty} R_{xx}(\tau-\alpha)\, h(\alpha)\,d\alpha = R_{xx}(\tau) * h(\tau)$
Autocorrelation of the output when WSS:
$R_{yy}(\tau) = R_{yx}(\tau) * h(-\tau) = R_{xx}(\tau) * h(\tau) * h(-\tau)$

Essential equations: LTI systems (4/4)
PSD of the output:
$S_{yy}(f) = S_{xx}(f)\, H(f)\, H^*(f) = S_{xx}(f)\, |H(f)|^2$
where H(f) is the transfer function, calculated as the Fourier transform of the impulse response.
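$S_{yy}(f) = S_{xx}(f)\,|H(f)|^2$ implies that white noise (flat $S_{xx} = \sigma^2$) filtered by an FIR system has output power $\sigma^2 \sum_k h[k]^2$ (by Parseval). A sketch with an assumed impulse response:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma2 = 1.0                               # white-noise power (flat S_xx)
x = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)
h = np.array([0.5, 0.3, 0.2])              # assumed FIR impulse response

y = np.convolve(x, h, mode="valid")
predicted_var = sigma2 * np.sum(h**2)      # integral of S_yy = S_xx |H|^2
print(y.var(), predicted_var)              # both near 0.38
```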

A biomedical example of a stochastic process: analysis of heart murmurs from aortic valve stenosis using methods from stochastic processes.

Introduction to heart sounds
The main sounds are S1 and S2.
S1, the first heart sound: closure of the AV valves.
S2, the second heart sound: closure of the semilunar valves.

Aortic valve stenosis Narrowing of the Aortic valve

Reflections of aortic valve stenosis in the heart sound: a clear systolic murmur due to post-stenotic turbulence.

Abnormal heart sounds

Signal analysis for algorithm specification
Are heart sounds stationary, quasi-stationary or non-stationary?
What is the frequency characteristic of systolic murmurs versus a normal systolic period?

Exercise: Chi meditation and the autonomic nervous system