Lecture 6 Power spectral density (PSD)


Stochastic processes, Lecture 6: Power spectral density (PSD)

Random process

First-order distribution and density function
First-order distribution: FX(x; t) = P(X(t) ≤ x)
First-order density function: fX(x; t) = ∂FX(x; t) / ∂x

Second-order distribution and density function
Second-order distribution: FX(x1, x2; t1, t2) = P(X(t1) ≤ x1, X(t2) ≤ x2)
Second-order density function: fX(x1, x2; t1, t2) = ∂²FX(x1, x2; t1, t2) / ∂x1 ∂x2

Expectations
Expected value: μX(t) = E[X(t)]
The autocorrelation: Rxx(t1, t2) = E[X(t1) X(t2)]

Some random processes
Single pulse
Multiple pulses
Periodic random processes
The Gaussian process
The Poisson process
Bernoulli and binomial processes
The random walk
Wiener processes
The Markov process

Single pulse
A single pulse with random amplitude and arrival time: X(t) = A S(t − Θ)
S(t): a deterministic pulse shape.
A: the gain, a random variable.
Θ: the arrival time, a random variable.
A and Θ are statistically independent.

Multiple pulses
A sum of pulses with random amplitudes and arrival times: x(t) = Σ k=1..n Ak S(t − Θk)
S(t): a deterministic pulse shape.
Ak: the gain of pulse k, a random variable.
Θk: the arrival time of pulse k.
n: the number of pulses.
Ak and Θk are statistically independent.

Periodic random processes
A process which is periodic with period T: x(t) = x(t + nT), where n is an integer.
Example: x(t) = sin(2πt/50 + Θ) + sin(2πt/100 + Θ)

The Gaussian process
X(t1), X(t2), X(t3), …, X(tn) are jointly Gaussian for all t and n values.
Example: randn() in Matlab

The Poisson process
Typically used for modeling the cumulative number of events over time.
Example: counting the number of phone calls from a phone.
P[X(t) = k] = (λt)^k / k! · e^(−λt)
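The Poisson counting distribution above can be checked numerically. This is a small Python sketch (the helper name `poisson_pmf` is my own); it evaluates P[X(t) = k] and verifies that the probabilities sum to one and that the mean count equals λt.

```python
import math

def poisson_pmf(k, rate, t):
    """P[X(t) = k] for a Poisson process with intensity `rate` (helper name assumed)."""
    mu = rate * t
    return mu**k / math.factorial(k) * math.exp(-mu)

# The pmf over k sums to 1, and the expected count is lambda*t.
total = sum(poisson_pmf(k, rate=2.0, t=3.0) for k in range(100))
mean_count = sum(k * poisson_pmf(k, rate=2.0, t=3.0) for k in range(100))
```

With rate 2.0 and t = 3.0, `mean_count` comes out at λt = 6, as the definition predicts.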

Alternative definition: Poisson points
The number of events in an interval, N(t1, t2):
P[N(t1, t2) = k] = P[X(t2) − X(t1) = k] = (λ(t2 − t1))^k / k! · e^(−λ(t2 − t1))
P[N(0, t) = k] = P[X(t) = k] = (λt)^k / k! · e^(−λt)

Bernoulli processes
A process of zeros and ones: X = [0 0 1 1 0 1 0 0 1 1 1 0]
The samples must be independent and identically distributed Bernoulli variables.
The probability of 1 is p; the probability of 0 is q = 1 − p.

Binomial process
A summed Bernoulli process: Y[n] = Σ k=1..n X[k], where X[k] is a Bernoulli process.

Random walk
Every T seconds, take a step of size Δ to the left or right after tossing a fair coin.
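The coin-tossing walk is straightforward to simulate. A minimal Python sketch (the function name `random_walk` and its parameters are my own choices), with unit step size:

```python
import random

def random_walk(n_steps, step=1.0, seed=0):
    """Simulate a fair-coin random walk; returns the position after each step."""
    rng = random.Random(seed)
    pos = 0.0
    path = []
    for _ in range(n_steps):
        # Fair coin: step right or left with equal probability.
        pos += step if rng.random() < 0.5 else -step
        path.append(pos)
    return path

path = random_walk(1000)
```

After n unit steps the position always has the same parity as n, and its variance grows linearly with the number of steps, which is the defining behavior of the walk.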

The Markov process
A 1st-order Markov process: the current sample depends only on the previous sample.
Density function: f(xn | xn−1, xn−2, …) = f(xn | xn−1)
Expected value: E[X(tn) | X(tn−1), X(tn−2), …] = E[X(tn) | X(tn−1)]

The frequency of earthquakes
Claim: the number of large earthquakes has increased dramatically in the last 10 years!

The frequency of earthquakes
Is the frequency of large earthquakes unusually high?
Which random processes can we use for modeling the earthquake frequency?

The frequency of earthquakes
Data: http://earthquake.usgs.gov/earthquakes/eqarchives/year/graphs.php

Agenda (Lec 16)
Power spectral density
Definition and background
Wiener-Khinchin
Cross spectral densities
Practical implementations
Examples

Fourier transform recap 1
Transform between the time and frequency domains.
Fourier transform: X(f) = ∫ x(t) e^(−j2πft) dt
Inverse Fourier transform: x(t) = ∫ X(f) e^(j2πft) df

Fourier transform recap 2
Assumption: the signal can be reconstructed from sine and cosine functions.
Requirement: the signal must be absolutely integrable.
e^(−j2πft) = cos(2πft) − j sin(2πft)

Fourier transform of a stochastic process
A stationary stochastic process is typically not absolutely integrable.
Therefore the signal is truncated before the Fourier transform: XT(f) = ∫ −T..T x(t) e^(−j2πft) dt

What is power?
In the power spectral density, "power" is defined by analogy with electrical power: P = V² / R

Power of a signal
The power of a signal is calculated by squaring the signal: |x(t)|²
The average power in a period T is: P = (1/T) ∫ 0..T |x(t)|² dt

Parseval's theorem
The integral of the squared absolute Fourier transform equals the power of the signal:
∫ |x(t)|² dt = ∫ |X(f)|² df
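Parseval's theorem also holds in the discrete case, where the frequency-domain sum must be divided by N. A quick NumPy check (names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(256)

time_power = np.sum(x**2)                    # power computed in the time domain
X = np.fft.fft(x)
freq_power = np.sum(np.abs(X)**2) / len(x)   # discrete Parseval: divide by N
```

Both quantities agree to floating-point precision, so the signal's power can be read off from either domain.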

Power of a stochastic process
Thereby the expected power can be calculated from the Fourier spectrum.

Power spectral density
Since the integral of the squared absolute Fourier transform contains the full power of the signal, it is a density function.
The power spectral density of a random process is: Sxx(f) = lim T→∞ E[|XT(f)|²] / (2T)
Due to the absolute value, the PSD is always real.

PSD example: a signal, its Fourier transform, and |X(f)|² (figure).

Power spectral density
The PSD is a density function. For a random process, the PSD is the density function of the process and not necessarily the frequency spectrum of a single realization.
Example: a random process is defined as X(t) = sin(ωr t), where ωr is a uniformly distributed random variable with a range from 0 to π.
What is the PSD of the process, and what is the power spectrum of a single realization?

PSD of random processes versus spectra of deterministic signals
For a random process, the PSD is usually the expected value E[Sxx(f)].
For a deterministic signal, the PSD is exact (although an estimate of it still has estimation error).

Properties of the PSD
Sxx(f) is real and nonnegative.
The average power in X(t) is given by: E[X²(t)] = Rxx(0) = ∫ −∞..∞ Sxx(f) df
If X(t) is real, Rxx(τ) and Sxx(f) are also even: Sxx(−f) = Sxx(f)
If X(t) has periodic components, Sxx(f) has impulses.
The PSD is independent of phase.

Wiener-Khinchin 1
If X(t) is stationary in the wide sense, the PSD is the Fourier transform of the autocorrelation: Sxx(f) = ∫ Rxx(τ) e^(−j2πfτ) dτ
Proof: page 175

Wiener-Khinchin
Two methods for estimating the PSD:
1. X(t) → Fourier transform → |X(f)|² → Sxx(f)
2. X(t) → autocorrelation → Fourier transform → Sxx(f)
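The equivalence of the two routes can be demonstrated numerically in the discrete, circular form of the relation. In this NumPy sketch (variable names are my own), route 1 computes the periodogram directly, and route 2 takes the FFT of the circular autocorrelation; the two coincide.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(64)
N = len(x)

# Route 1: periodogram directly from the signal.
S = np.abs(np.fft.fft(x))**2 / N

# Route 2: FFT of the circular autocorrelation r[m] = (1/N) sum_n x[n] x[(n+m) mod N].
r = np.array([np.sum(x * np.roll(x, -m)) for m in range(N)]) / N
S2 = np.fft.fft(r).real
```

For finite records the circular autocorrelation stands in for the true one; with proper zero padding the same identity holds for the linear (biased) autocorrelation used later in the lecture.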

The inverse Fourier transform of the PSD
Since the PSD is the Fourier transform of the autocorrelation, the inverse Fourier transform of the PSD is the autocorrelation: Rxx(τ) = ∫ Sxx(f) e^(j2πfτ) df

Cross spectral densities
If X(t) and Y(t) are two jointly wide-sense stationary processes, the cross spectral density is: Sxy(f) = ∫ Rxy(τ) e^(−j2πfτ) dτ
or: Syx(f) = ∫ Ryx(τ) e^(−j2πfτ) dτ

Properties of cross spectral densities
Since Sxy(f) = Syx*(f), Sxy(f) is not necessarily real.
If X(t) and Y(t) are orthogonal, Sxy(f) = 0.
If X(t) and Y(t) are independent, Sxy(f) = E[X(t)] E[Y(t)] δ(f).

Cross spectral density example
1 Hz sine curves in white noise: X(t) = sin(2πt) + 3w(t), Y(t) = sin(2πt + π/2) + 3w(t), where w(t) is Gaussian noise.
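The cross spectrum of these two signals peaks at 1 Hz, and its phase at the peak reflects the π/2 phase difference. A deterministic NumPy sketch (noise term omitted so the result is exact; sampling rate and names are my own choices):

```python
import numpy as np

fs = 64                      # sampling rate in Hz (assumed)
t = np.arange(0, 8, 1/fs)    # 8 s -> 512 samples, so 1 Hz falls exactly on a bin
N = len(t)
x = np.sin(2*np.pi*t)                 # X(t) with the noise term omitted
y = np.sin(2*np.pi*t + np.pi/2)       # Y(t), a quarter-period ahead of x

X, Y = np.fft.fft(x), np.fft.fft(y)
Sxy = X * np.conj(Y) / N              # raw cross-spectrum estimate
freqs = np.fft.fftfreq(N, 1/fs)

peak = np.argmax(np.abs(Sxy[:N//2]))  # strongest positive-frequency bin
peak_freq = freqs[peak]
phase = np.angle(Sxy[peak])           # phase of x relative to y: -pi/2 here
```

The magnitude localizes the shared 1 Hz component, while the phase of Sxy at that bin recovers the lag of x behind y.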

Implementation issues
The challenges include: finite signals, discrete time.

The periodogram
The periodogram is an estimate of the PSD. The PSD can be estimated from the autocorrelation: Sxx(ω) = Σ m=−N+1..N−1 Rxx[m] e^(−jωm)
or directly from the signal: Sxx(ω) = (1/N) |Σ n=0..N−1 x[n] e^(−jωn)|²
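The direct form of the periodogram is one line with an FFT. A NumPy sketch on white noise (names are my own); by Parseval, the periodogram's average over frequency equals the signal's mean power, which for unit-variance noise sits near 1:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 4096
x = rng.standard_normal(N)           # white noise with unit variance

# Direct periodogram: Sxx(w_k) = |sum_n x[n] e^{-j w_k n}|^2 / N at the DFT bins.
Sxx = np.abs(np.fft.fft(x))**2 / N

mean_level = Sxx.mean()              # equals mean(x^2) by Parseval; approx. 1 here
```

Individual periodogram ordinates scatter widely around that level, which is exactly the variance problem the averaging methods below address.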

The discrete version of the autocorrelation
Rxx(τ) = E[X(t) X(t + τ)] ≈ Rxx[m], where m = τ is an integer lag and N is the number of samples.
Rxx[m] = Σ n=0..N−|m|−1 x[n] x[n + m]
Normalized version: Rxx[m] = (1/N) Σ n=0..N−|m|−1 x[n] x[n + m]

Bias in the estimates of the autocorrelation
Rxx[m] = (1/N) Σ n=0..N−|m|−1 x[n] x[n + m]

Bias in the estimates of the autocorrelation
The edge effect corresponds to multiplying the true autocorrelation with a Bartlett window:
E[Rxx[m]] = wb[m] Rxx_unbiased[m]
wb[m] = (N − |m|) / N for |m| < N, and 0 otherwise

Alternative estimation of the autocorrelation
The unbiased estimate: Rxx[m] = (1/(N − |m|)) Σ n=0..N−|m|−1 x[n] x[n + m]
Disadvantage: high variance when |m| → N.
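The two estimators differ only in the normalization, and the Bartlett-window bias is easy to see on a constant signal whose true autocorrelation is flat. A NumPy sketch (the helper name `acorr` is my own):

```python
import numpy as np

def acorr(x, biased=True):
    """Sample autocorrelation R[m] for lags m = 0..N-1 (helper name assumed)."""
    N = len(x)
    # Lag-m products summed over the overlapping part of the record.
    r = np.array([np.sum(x[:N-m] * x[m:]) for m in range(N)])
    if biased:
        return r / N                   # biased: divide by N everywhere
    return r / (N - np.arange(N))      # unbiased: divide by the overlap N - |m|

x = np.ones(8)                         # true autocorrelation is constant (= 1)
rb = acorr(x, biased=True)
ru = acorr(x, biased=False)
```

The unbiased estimate returns 1 at every lag, while the biased estimate decays as (N − m)/N: precisely the Bartlett window applied to the true autocorrelation.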

Influence on the power spectrum
Unbiased version: a rectangular window is applied: Sxx(ω) = Σ m=−∞..∞ wr[m] Rxx_unbiased[m] e^(−jωm)
Biased version: a Bartlett window is applied: Sxx(ω) = Σ m=−∞..∞ wb[m] Rxx_unbiased[m] e^(−jωm)
wr[m] = 1 for |m| < N, and 0 otherwise

Example: biased and unbiased autocorrelations and the resulting estimated PSDs (figure).

Variance of the PSD
The variance of the periodogram is approximately the square of the PSD: Var[Sxx(ω)] ≈ Sxx(ω)²

Averaging
Divide the signal into K segments of length M: xi = x[(i − 1)M + 1 : iM], 1 ≤ i ≤ K
Calculate the periodogram of each segment: Sixx(ω) = (1/M) |Σ n=0..M−1 xi[n] e^(−jωn)|²
Calculate the average periodogram: S̄xx(ω) = (1/K) Σ i=1..K Sixx(ω)
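This segment-averaging scheme (Bartlett's method) is a few lines of NumPy; the sketch below uses names of my own choosing and white noise, where the variance reduction is easy to measure across frequency bins.

```python
import numpy as np

def bartlett_psd(x, M):
    """Average the periodograms of K = len(x)//M non-overlapping length-M segments."""
    K = len(x) // M
    segs = x[:K*M].reshape(K, M)                         # one segment per row
    periodograms = np.abs(np.fft.fft(segs, axis=1))**2 / M
    return periodograms.mean(axis=0), periodograms

rng = np.random.default_rng(3)
x = rng.standard_normal(8192)
avg, raw = bartlett_psd(x, M=256)                        # K = 32 segments

var_single = raw[0].var()   # scatter of one raw periodogram across bins
var_avg = avg.var()         # scatter after averaging: roughly K times smaller
```

The averaged estimate fluctuates about K times less than a single periodogram, at the cost of frequency resolution (bins of width fs/M instead of fs/N).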

Illustrations of Averaging

Effect of averaging
The variance is decreased: Var[S̄xx(ω)] ≈ (1/K) Sxx(ω)²
But the spectral resolution is also decreased.

Additional options: the Welch method
Introduce overlap between segments: xi = x[(i − 1)Q + 1 : (i − 1)Q + M], 1 ≤ i ≤ K, where Q is the spacing between segment starts.
Multiply the segments with windows: Sixx(ω) = (1/M) |Σ n=0..M−1 w[n] xi[n] e^(−jωn)|²
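A hand-rolled Welch estimator is a small extension of the averaging above. This NumPy sketch (names and the Σw² normalization are my own choices; SciPy's scipy.signal.welch provides a production version) uses a Hann window and 50% overlap:

```python
import numpy as np

def welch_psd(x, M, Q, window=np.hanning):
    """Welch estimate: length-M segments spaced Q samples apart, each windowed."""
    w = window(M)
    U = np.sum(w**2)                      # window power, so white noise -> flat level sigma^2
    starts = range(0, len(x) - M + 1, Q)
    periodograms = [np.abs(np.fft.fft(w * x[s:s+M]))**2 / U for s in starts]
    return np.mean(periodograms, axis=0)

rng = np.random.default_rng(4)
x = rng.standard_normal(4096)
S = welch_psd(x, M=256, Q=128)            # Q = M/2, i.e. 50% overlap
```

Overlapping recovers some of the data "wasted" at the tapered segment edges, so for a fixed record length Welch's method gives more averaged segments, and hence lower variance, than non-overlapping Bartlett averaging.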

Example: heart rate variability
http://circ.ahajournals.org/cgi/content/full/93/5/1043#F3
The high-frequency component is related to the parasympathetic nervous system ("rest and digest").
The low-frequency component is related to the sympathetic nervous system (fight-or-flight).

Agenda (Lec 16)
Power spectral density
Definition and background
Wiener-Khinchin
Cross spectral densities
Practical implementations
Examples