Random Processes and Spectral Analysis


Chapter 6 Random Processes and Spectral Analysis

Introduction (chapter objectives): power spectral density; matched filters. Recall from an earlier chapter that random signals are used to convey information. Noise is also described in terms of statistics. Thus, knowledge of random signals and noise is fundamental to an understanding of communication systems.

Introduction Signals with random parameters are random signals. All noise that cannot be predicted is called random noise, or simply noise. Random signals and noise are collectively called random processes. A random process (stochastic process) is an indexed set of functions of some parameter (usually time) that has certain statistical properties. A random process may be described by an indexed set of random variables. A random variable maps events into constants, whereas a random process maps events into functions of the parameter t.

Introduction Random processes can be classified as strictly stationary or wide-sense stationary. Definition: A random process x(t) is said to be stationary to the order N if, for any t1, t2, ..., tN, fx(x1, ..., xN; t1, ..., tN) = fx(x1, ..., xN; t1+t0, ..., tN+t0), where t0 is any arbitrary real constant. Furthermore, the process is said to be strictly stationary if it is stationary to the order N → infinity. Definition: A random process is said to be wide-sense stationary if (1) its mean E[x(t)] = mx is a constant and (2) its autocorrelation depends only on the time difference, Rx(t1, t2) = Rx(τ), where τ = t2 − t1.

Introduction Definition: A random process is said to be ergodic if all time averages of any sample function are equal to the corresponding ensemble averages (expectations). Note: if a process is ergodic, all time and ensemble averages are interchangeable. Because a time average cannot be a function of time, an ergodic process must be stationary; otherwise the ensemble averages would be functions of time. However, not all stationary processes are ergodic.
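
A quick numerical illustration (a sketch, not from the text): the random-phase sinusoid x(t) = A cos(w0 t + θ), with θ uniform on (0, 2π), is ergodic in the mean and in the second moment, so time averages computed from one long sample function should agree with ensemble averages taken across many realizations. The amplitude, frequency, and observation length below are arbitrary assumed values.

```python
# Sketch: time averages of one sample function of a random-phase sinusoid
# versus ensemble averages over many realizations (ergodicity illustration).
import numpy as np

rng = np.random.default_rng(0)
A, w0 = 2.0, 2 * np.pi * 5.0            # assumed amplitude and angular frequency
t = np.arange(0.0, 200.0, 1e-3)         # long observation interval for time averaging

# Time averages from a single sample function
theta = rng.uniform(0, 2 * np.pi)
x_one = A * np.cos(w0 * t + theta)
print("time-average mean :", x_one.mean())            # ~ 0
print("time-average power:", (x_one ** 2).mean())     # ~ A^2/2 = 2

# Ensemble averages at the fixed time t = 0 over many realizations
thetas = rng.uniform(0, 2 * np.pi, size=100_000)
x_ens = A * np.cos(w0 * 0.0 + thetas)
print("ensemble mean     :", x_ens.mean())            # ~ 0
print("ensemble power    :", (x_ens ** 2).mean())     # ~ A^2/2 = 2
```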

Introduction Definition: the autocorrelation function of a real process x(t) is Rx(t1, t2) = E[x1 x2], where x1 = x(t1) and x2 = x(t2). If the process is second-order stationary, the autocorrelation function is a function only of the time difference τ = t2 − t1. Properties of the autocorrelation function of a real wide-sense stationary process are as follows: (1) Rx(0) = E[x²(t)] is the average power; (2) Rx(−τ) = Rx(τ); (3) |Rx(τ)| ≤ Rx(0).
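
As a sketch of these properties (the one-pole filtered noise below is an assumed example, not taken from the text), one can estimate Rx(k) of a discrete-time WSS process from a single sample function and check that Rx(0) equals the average power and that |Rx(k)| never exceeds Rx(0):

```python
# Sketch: estimate the autocorrelation of white Gaussian noise passed through
# a one-pole filter y[n] = 0.9*y[n-1] + w[n] and check the listed properties.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(1)
x = lfilter([1.0], [1.0, -0.9], rng.standard_normal(200_000))[1000:]  # approx. WSS

max_lag = 40
R = np.array([np.mean(x[: len(x) - k] * x[k:]) for k in range(max_lag + 1)])
theory = (0.9 ** np.arange(max_lag + 1)) / (1 - 0.9 ** 2)   # exact Rx(k) for this model

print("Rx(0) vs average power:", R[0], np.mean(x ** 2))
print("|Rx(k)| <= Rx(0)      :", bool(np.all(np.abs(R) <= R[0] + 1e-12)))
print("max error vs theory   :", np.max(np.abs(R - theory)))
```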

Introduction Definition: the cross-correlation function for two real processes x(t) and y(t) is Rxy(t1, t2) = E[x(t1) y(t2)]. If x = x(t1) and y = y(t2) are jointly stationary, the cross-correlation function is a function only of the time difference τ = t2 − t1. Properties of the cross-correlation function of two real jointly stationary processes are as follows: (1) Rxy(−τ) = Ryx(τ); (2) |Rxy(τ)| ≤ [Rx(0) Ry(0)]^(1/2); (3) |Rxy(τ)| ≤ (1/2)[Rx(0) + Ry(0)].

Introduction Two random processes x(t) and y(t) are said to be uncorrelated if Rxy(τ) = mx·my for all values of τ; similarly, two random processes x(t) and y(t) are said to be orthogonal if Rxy(τ) = 0 for all values of τ. If the random processes x(t) and y(t) are jointly ergodic, the time average may be used to replace the ensemble average. For correlation functions, this becomes Rxy(τ) = <x(t) y(t + τ)>.

Introduction Definition: a complex random process is v(t) = x(t) + j y(t), where x(t) and y(t) are real random processes. Definition: the autocorrelation for a complex random process is Rv(t1, t2) = E[v*(t1) v(t2)], where the asterisk denotes the complex conjugate. The autocorrelation of a wide-sense stationary complex random process has the Hermitian symmetry property Rv(−τ) = Rv*(τ).
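
A small numerical check of the Hermitian symmetry property (an assumed example, using one common convention Rv(τ) = E[v*(t) v(t+τ)]): for v(t) = A e^(j(w0 t + θ)) with independent random amplitude A and uniform random phase θ, the autocorrelation is E[A²] e^(j w0 τ), and the ensemble estimates below show Rv(−τ) equal to the conjugate of Rv(τ):

```python
# Sketch: verify Rv(-tau) = Rv*(tau) for a complex random-phase exponential
# with Rayleigh-distributed amplitude (assumed example process).
import numpy as np

rng = np.random.default_rng(2)
w0, tau, t0 = 2 * np.pi * 3.0, 0.13, 0.4            # assumed parameters
A = rng.rayleigh(1.0, size=200_000)                  # E[A^2] = 2
theta = rng.uniform(0, 2 * np.pi, size=200_000)

v = lambda t: A * np.exp(1j * (w0 * t + theta))
R_plus = np.mean(np.conj(v(t0)) * v(t0 + tau))       # estimate of Rv(+tau)
R_minus = np.mean(np.conj(v(t0)) * v(t0 - tau))      # estimate of Rv(-tau)

print("Rv(+tau) ~", R_plus, " theory:", 2 * np.exp(1j * w0 * tau))
print("Rv(-tau) ~", R_minus, " conj(Rv(+tau)):", np.conj(R_plus))
```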

Introduction For a Gaussian process, the one-dimensional PDF can be represented by f(x) = (1/[(2π)^(1/2) σ]) exp[−(x − mx)²/(2σ²)]. Some properties of f(x) are: (1) f(x) is symmetric about x = mx; (2) f(x) is monotonically increasing on (−infinity, mx) and monotonically decreasing on (mx, +infinity); the maximum value, attained at mx, is 1/[(2π)^(1/2) σ].

Introduction The cumulative distribution function (CDF) for the Gaussian distribution is F(x) = 1 − Q[(x − mx)/σ], where the Q function is defined by Q(z) = (1/(2π)^(1/2)) ∫ from z to ∞ of exp(−λ²/2) dλ, the error function (erf) is defined as erf(z) = (2/π^(1/2)) ∫ from 0 to z of exp(−λ²) dλ, and the complementary error function (erfc) is defined as erfc(z) = 1 − erf(z). These are related by Q(z) = (1/2) erfc(z/2^(1/2)).
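
These relations are easy to check numerically; the sketch below implements Q(z) from erfc via the identity Q(z) = (1/2) erfc(z/2^(1/2)) and uses it to evaluate the Gaussian CDF (the function names here are illustrative, not from the text):

```python
# Sketch: the Q function and the Gaussian CDF expressed through erf/erfc.
import numpy as np
from scipy.special import erf, erfc

def Q(z):
    """Gaussian tail probability Q(z) = P(standard normal > z)."""
    return 0.5 * erfc(z / np.sqrt(2.0))

def gaussian_cdf(x, m=0.0, sigma=1.0):
    """CDF of a Gaussian with mean m and standard deviation sigma."""
    return 1.0 - Q((x - m) / sigma)

z = 1.5
print("Q(z)              :", Q(z))
print("via 1 - CDF       :", 1.0 - gaussian_cdf(z))
print("via (1 - erf)/2   :", 0.5 * (1.0 - erf(z / np.sqrt(2.0))))
print("Q(0) should be 0.5:", Q(0.0))
```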

6.2 Power Spectral Density (definition) The definition of the PSD for the case of a deterministic waveform is Eq. (2-66): Px(f) = lim (T→∞) |XT(f)|²/T. Definition: The power spectral density (PSD) for a random process x(t) is given by Px(f) = lim (T→∞) E[|XT(f)|²]/T, where XT(f) = F[xT(t)] and xT(t) is the truncated version of x(t), equal to x(t) for |t| ≤ T/2 and zero elsewhere.

6.2 Power Spectral Density (Wiener-Khintchine Theorem) When x(t) is a wide-sense stationary process, the PSD can be obtained from the Fourier transform of the autocorrelation function: Px(f) = F[R(τ)] = ∫ R(τ) e^(−j2πfτ) dτ. Conversely, R(τ) = F⁻¹[Px(f)] = ∫ Px(f) e^(j2πfτ) df, provided that R(τ) becomes sufficiently small for large values of τ, so that ∫ |R(τ)| dτ < ∞. This theorem is also valid for a nonstationary process, provided that we replace R(τ) by <R(t, t+τ)>. Proof: see notebook.
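
The two sides of the theorem can be compared numerically. In the sketch below (an assumed discrete-time example, not from the text), white Gaussian noise drives the one-pole filter y[n] = 0.9 y[n−1] + w[n]; the PSD estimated directly from averaged periodograms E[|XT(f)|²]/T is compared with the transform of the estimated autocorrelation and with the exact PSD 1/|1 − 0.9 e^(−j2πf)|²:

```python
# Sketch: Wiener-Khintchine check -- direct (periodogram) and indirect
# (transform of the autocorrelation) PSD estimates of a filtered-noise process.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)
x = lfilter([1.0], [1.0, -0.9], rng.standard_normal(400_000))[1000:]

# Direct method: average |X_T(f)|^2 / T over many segments
T = 1024
segs = x[: (len(x) // T) * T].reshape(-1, T)
P_direct = np.mean(np.abs(np.fft.fft(segs, axis=1)) ** 2, axis=0) / T
f = np.fft.fftfreq(T)

# Indirect method: Fourier transform of the estimated autocorrelation R(k)
K = 200
R = np.array([np.mean(x[: len(x) - k] * x[k:]) for k in range(K + 1)])
k = np.arange(-K, K + 1)
R_full = np.concatenate([R[:0:-1], R])                # uses R(-k) = R(k)
P_indirect = (R_full[None, :] * np.exp(-2j * np.pi * np.outer(f, k))).sum(axis=1).real

i = np.argmin(np.abs(f - 0.05))                       # inspect one frequency bin
P_exact = 1.0 / np.abs(1.0 - 0.9 * np.exp(-2j * np.pi * f[i])) ** 2
print("direct  :", P_direct[i])
print("indirect:", P_indirect[i])
print("exact   :", P_exact)
```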

6.2 Power Spectral Density (Wiener-Khintchine Theorem) There are two different methods that may be used to evaluate the PSD of a random process: (1) the direct method, using the definition; (2) the indirect method, evaluating the Fourier transform of Rx(τ), where Rx(τ) has to be obtained first. Properties of the PSD: (1) Px(f) is always real; (2) Px(f) ≥ 0; (3) when x(t) is real, Px(−f) = Px(f); (4) when x(t) is wide-sense stationary, ∫ Px(f) df = Rx(0) = <x²(t)>, the total average power.

6.2 Power Spectral Density Example 6-3: see notebook.

6.2 Power Spectral Density In summary, the general expression for the PSD of a digital signal can be obtained by starting from x(t) = Σn an f(t − nTs), where f(t) is the signaling pulse shape, Ts is the duration of one symbol, and {an} is a set of random variables that represent the data. The autocorrelation of the data is R(k) = E[an an+k]. By truncating x(t) we get xT(t) = Σ (n = −N to N) an f(t − nTs), where T/2 = (N + 1/2)Ts; its Fourier transform is XT(f) = F(f) Σ (n = −N to N) an e^(−j2πnfTs).

6.2 Power Spectral Density According to the definition of the PSD, we get Px(f) = lim (T→∞) E[|XT(f)|²]/T. Thus Px(f) = (|F(f)|²/Ts) Σk R(k) e^(−j2πkfTs).

6.2 Power Spectral Density Furthermore, since R(−k) = R(k), an equivalent expression for the PSD is Px(f) = (|F(f)|²/Ts)[R(0) + 2 Σ (k ≥ 1) R(k) cos(2πkfTs)], where the autocorrelation of the data is R(k) = Σ (i = 1 to I) (an an+k)i Pi, in which Pi is the probability of getting the i-th of the I possible values of the product an an+k.

6.2 Power Spectral Density Note that the quantity in brackets in Eq. (6-70b) is similar to the discrete Fourier transform of the data autocorrelation function R(k), except that the frequency variable ω is continuous; that the PSD of the baseband digital signal is influenced by both the "spectrum" of the data and the spectrum of the pulse shape used for the line code; and that the spectrum may contain delta functions if the mean value of the data, ma = E[an], is nonzero. For uncorrelated data symbols, R(0) = σa² + ma² and R(k) = ma² for k ≠ 0.

6.2 Power Spectral Density Thus, for uncorrelated data, Px(f) = (σa²/Ts)|F(f)|² + (ma D)² Σn |F(nD)|² δ(f − nD), where D = 1/Ts and the Poisson sum formula has been used. For the general case where there is correlation between the data, let the data autocorrelation function R(k) be expressed in terms of the normalized data autocorrelation function ρ(k); the PSD of the (zero-mean) digital signal is then Px(f) = (σa² |F(f)|²/Ts) w(f),

6.2 Power Spectral Density where w(f) = Σk ρ(k) e^(−j2πkfTs) is a spectral weight function obtained from the Fourier transform of the normalized autocorrelation impulse train.
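
As a rough simulation of this result (the line code and parameters below are assumed, not a worked example from the text): for equally likely, independent bipolar (±1) data with a rectangular pulse of duration Ts, the data are zero-mean and uncorrelated, so the general formula reduces to Px(f) = |F(f)|²/Ts = Ts sinc²(fTs), and the estimated PSD of such a waveform should follow this shape:

```python
# Sketch: PSD of a random bipolar NRZ waveform versus Ts*sinc^2(f*Ts).
import numpy as np

rng = np.random.default_rng(4)
fs, Ts = 100.0, 0.1                        # assumed sample rate and symbol duration
sps = int(round(fs * Ts))                  # samples per symbol
a = rng.choice([-1.0, 1.0], size=20_000)   # uncorrelated, zero-mean data
x = np.repeat(a, sps)                      # rectangular NRZ waveform

# Averaged-periodogram estimate of the two-sided PSD (power per Hz)
T = 2000
segs = x[: (len(x) // T) * T].reshape(-1, T)
P_est = np.mean(np.abs(np.fft.fft(segs, axis=1)) ** 2, axis=0) / (T * fs)
f = np.fft.fftfreq(T, d=1.0 / fs)

P_theory = Ts * np.sinc(f * Ts) ** 2       # |F(f)|^2 / Ts for a rectangular pulse
i = np.argmin(np.abs(f - 3.0))             # inspect the estimate at f = 3 Hz
print("estimated PSD at 3 Hz:", P_est[i])
print("theoretical Ts*sinc^2:", P_theory[i])
```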

6.2 Power Spectral Density White noise processes: Definition: A random process x(t) is said to be a white-noise process if the PSD is constant over all frequencies; that is, Px(f) = N0/2, where N0 is a positive constant. The autocorrelation function for the white-noise process is obtained by taking the inverse Fourier transform of the equation above. The result is Rx(τ) = (N0/2) δ(τ).

6.2 Power Spectral Density White Gaussian Noise: n(t) is a random process (random signal). Gaussian: a Gaussian PDF (probability density function). White: a flat PSD (power spectral density), or equivalently an impulse-like autocorrelation.

6.2 Power Spectral Density Bandpass White Gaussian Noise: n(t) is a (narrow) bandpass random process (random signal) occupying 2B Hz, while the corresponding baseband signal occupies B Hz. Gaussian: a Gaussian PDF (probability density function). White: a flat PSD (power spectral density) within the band, or equivalently a sinc-like autocorrelation.

6.2 Power Spectral Density Measurement of PSD: (1) analog techniques; (2) numerical computation of the PSD. Note: in either case the measurement can only approximate the true PSD, because the measurement is carried out over a finite time interval instead of the infinite interval.

Input-Output Relationships for Linear Systems Theorem: if a wide-sense stationary random process x(t) is applied to the input of a time-invariant linear network with impulse response h(t), the output autocorrelation is Ry(τ) = h(−τ) * h(τ) * Rx(τ), where * denotes convolution. The output PSD is Py(f) = |H(f)|² Px(f), where H(f) = F{h(t)}. (Fig. 6-6 Linear system: input x(t), X(f), Rx(τ), Px(f); linear network h(t), H(f); output y(t), Y(f), Ry(τ), Py(f).)
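
A quick simulation of the theorem (assumed example, not from the text): unit-variance white noise, whose two-sided PSD is Px(f) = 1 per unit normalized frequency, is passed through a short FIR filter, and the estimated output PSD is compared with |H(f)|² Px(f):

```python
# Sketch: output PSD of a linear system driven by white noise, Py = |H|^2 * Px.
import numpy as np
from scipy.signal import lfilter, freqz

rng = np.random.default_rng(5)
h = np.array([0.25, 0.5, 0.25])                     # assumed lowpass impulse response
x = rng.standard_normal(400_000)                    # white input, Px(f) = 1
y = lfilter(h, [1.0], x)

# Averaged-periodogram estimate of the output PSD
T = 1024
segs = y[: (len(y) // T) * T].reshape(-1, T)
Py_est = np.mean(np.abs(np.fft.fft(segs, axis=1)) ** 2, axis=0) / T
f = np.fft.fftfreq(T)

w, H = freqz(h, worN=2 * np.pi * f[: T // 2])       # H(f) on the positive-frequency bins
Py_theory = np.abs(H) ** 2                          # |H(f)|^2 * Px(f) with Px(f) = 1

i = np.argmin(np.abs(f[: T // 2] - 0.1))
print("estimated Py at f = 0.1 :", Py_est[i])
print("|H(f)|^2 * Px at f = 0.1:", Py_theory[i])
```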

6.8 Matched Filters Matched filtering is a technique for designing a linear filter to minimize the effect of noise while maximizing the signal. A general representation of a matched filter is shown in Fig. 6-15: the input r(t) = s(t) + n(t) is passed through the matched filter h(t), H(f), producing the output r0(t) = s0(t) + n0(t). The input signal is denoted by s(t) and the output signal by s0(t); similar notation is used for the noise. The signal is assumed to be (absolutely) time limited to the interval (0, T) and zero otherwise. The PSD, Pn(f), of the additive input noise n(t) is known, and if a signal is present, its waveform is also known.

6.8 Matched Filters The matched-filter design criterion: find an h(t), or equivalently an H(f), so that the instantaneous output signal power divided by the average output noise power, (S/N)out = s0²(t0)/<n0²(t)>, is a maximum at the sampling time t = t0. Note: the matched filter does not preserve the input signal waveshape. Its objective is to distort the input signal waveshape and filter the noise so that at the sampling time t0, the output signal level will be as large as possible with respect to the rms output noise level.

6.8 Matched Filters Theorem: the matched filter is the linear filter that maximizes (S/N)out = s0²(t0)/<n0²(t)> and that has a transfer function given by H(f) = K [S*(f)/Pn(f)] e^(−j2πft0), where S(f) = F[s(t)] is the Fourier transform of the known input signal s(t) of duration T sec, Pn(f) is the PSD of the input noise, t0 is the sampling time when (S/N)out is evaluated, and K is an arbitrary real nonzero constant (Fig. 6-15).

6.8 Matched Filters Proof: the output signal at time t0 is s0(t0) = ∫ H(f) S(f) e^(j2πft0) df. The average power of the output noise is <n0²(t)> = ∫ |H(f)|² Pn(f) df. Then (S/N)out = |∫ H(f) S(f) e^(j2πft0) df|² / ∫ |H(f)|² Pn(f) df. With the aid of the Schwarz inequality, |∫ A(f) B(f) df|² ≤ [∫ |A(f)|² df][∫ |B(f)|² df], where A(f) and B(f) may be complex functions of the real variable f, equality is obtained only when A(f) = K B*(f).

6.8 Matched Filters Letting A(f) = H(f) [Pn(f)]^(1/2) and B(f) = S(f) e^(j2πft0) / [Pn(f)]^(1/2), we then have (S/N)out ≤ ∫ [|S(f)|²/Pn(f)] df. The maximum (S/N)out is obtained when H(f) is chosen such that equality is attained. This occurs when A(f) = K B*(f), or H(f) = K [S*(f)/Pn(f)] e^(−j2πft0).

6.8 Matched Filters Results for White Noise For white noise, Pn(f) = N0/2; thus we get H(f) = (2K/N0) S*(f) e^(−j2πft0). Theorem: when the input noise is white, the impulse response of the matched filter becomes h(t) = Cs(t0−t) (6-160), where C is an arbitrary real positive constant, t0 is the time of the peak signal output, and s(t) is the known input signal waveshape. The impulse response of the matched filter (white-noise case) is simply the known signal waveshape "played backward" and translated by an amount t0. Thus, the filter is said to be "matched" to the signal.
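
A discrete-time sketch of this result (the pulse shape, noise variance, and number of trials are assumed): with h[n] = s[n0 − n], the output sampled at n0 should reach the peak SNR Es/σ², the discrete-time counterpart of the 2E/N0 quoted in the next slide:

```python
# Sketch: matched filter for a known pulse in white Gaussian noise; the output
# SNR at the sampling instant approaches Es / sigma2.
import numpy as np

rng = np.random.default_rng(6)
s = np.concatenate([np.ones(20), -np.ones(10)])     # assumed known pulse shape
n0 = len(s) - 1                                     # sample the output at the pulse end
h = s[::-1]                                         # h[n] = s[n0 - n], with C = 1
sigma2 = 0.5                                        # assumed noise variance
Es = np.sum(s ** 2)                                 # pulse energy

peaks = []
for _ in range(20_000):
    r = s + np.sqrt(sigma2) * rng.standard_normal(len(s))
    y = np.convolve(r, h)                           # matched-filter output
    peaks.append(y[n0])                             # sample at t0

peaks = np.array(peaks)
snr_out = np.mean(peaks) ** 2 / np.var(peaks)       # peak signal power over noise power
print("simulated (S/N)out :", snr_out)
print("theory Es / sigma2 :", Es / sigma2)
```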

6.8 Matched Filters An important property: the actual value of (S/N)out obtained from the matched filter is (S/N)out = 2Es/N0, where Es = ∫ (0 to T) s²(t) dt is the energy of the input signal. This result states that (S/N)out depends on the signal energy and the PSD level of the noise, and not on the particular signal waveshape that is used. It can also be written in other terms. Assume that the input noise power is measured in a band that is W hertz wide and that the signal has a duration of T seconds. Then (S/N)out = 2TW (S/N)in, where (S/N)in = (Es/T)/(N0W).

6.8 Matched Filters Example 6-11 Integrate-and-Dump (Matched) Filter. [Figure: (a) input signal; (b) "backwards" signal; (c) matched-filter impulse response h(t) = s(t0 − t); (d) signal output of the matched filter.]

6.8 Matched Filters (Fig. 6-17)

6.8 Matched Filters Correlation processing Theorem: for the case of white noise, the matched filter may be realized by correlating the input with s(t); that is, r0(t0) = ∫ (0 to T) r(t) s(t) dt, where s(t) is the known signal waveshape and r(t) is the processor input, as illustrated in Fig. 6-18 (matched-filter realization by correlation processing).

6.8 Matched Filters Proof: the output of the matched filter at time t0 is r0(t0) = ∫ r(λ) h(t0 − λ) dλ. Because h(t) = Cs(t0−t) (6-160), we have h(t0 − λ) = Cs(λ), so r0(t0) = C ∫ (0 to T) r(λ) s(λ) dλ, which completes the proof.
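
In discrete time the equivalence is exact and easy to see numerically (the waveform below is an assumed example): sampling the matched-filter output at t0 gives the same number as summing r[n] s[n] over the symbol interval.

```python
# Sketch: matched filter sampled at t0 versus correlator output (white-noise case).
import numpy as np

rng = np.random.default_rng(7)
s = np.concatenate([np.ones(16), -np.ones(16)])      # assumed known signal waveshape
r = s + 0.7 * rng.standard_normal(len(s))            # received signal plus noise

matched = np.convolve(r, s[::-1])[len(s) - 1]        # matched-filter output at t0
correlator = np.sum(r * s)                           # correlation of r with s
print("matched filter at t0:", matched)
print("correlator output   :", correlator)           # identical up to round-off
```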

6.8 Matched Filters Example 6-12 Matched Filter for Detection of a BPSK signal

6.8 Matched Filters (Transversal Matched Filter) We wish to find the set of transversal filter coefficients {ai; i = 1, 2, ..., N} such that the output peak-signal to average-noise-power ratio is maximized (Fig. 6-20 Transversal matched filter).

6.8 Matched Filters (Transversal Matched Filter) The output signal at time t = t0 is s0(t0) = Σ (i = 1 to N) ai si, where si is the signal sample at the i-th tap. Similarly, the output noise at time t = t0 is n0(t0) = Σ (i = 1 to N) ai ni. The average noise power is <n0²(t0)> = Σi Σj ai aj E[ni nj].

6.8 Matched Filters (Transversal Matched Filter) Thus the output-peak-signal to average-noise-power ratio is (S/N)out = [Σi ai si]² / [Σi Σj ai aj E[ni nj]]. Using Lagrange's method of maximizing the numerator while constraining the denominator to be a constant, we get Rn a = K s, which is the matrix notation of Eq. (6-174).

6.8 Matched Filters (Transversal Matched Filter) Here s is the known signal vector, Rn is the known autocorrelation matrix for the input noise, and a = [a1, a2, ..., aN]^T is the unknown transversal matched-filter coefficient vector.
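
A short numerical sketch of this solution (the signal samples and the noise correlation model are assumed): solving Rn a = s (taking K = 1) gives the transversal matched-filter coefficients, and the resulting output SNR (aᵀs)²/(aᵀ Rn a) exceeds that of the naive choice a = s when the noise is colored.

```python
# Sketch: transversal matched filter a = Rn^{-1} s for colored noise.
import numpy as np

N = 8
s = np.cos(2 * np.pi * 0.15 * np.arange(N))          # assumed known signal samples
rho = 0.8                                            # assumed noise correlation coefficient
Rn = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))  # noise autocorrelation matrix

def out_snr(a):
    """Output peak-signal to average-noise-power ratio for coefficients a."""
    return (a @ s) ** 2 / (a @ Rn @ a)

a_opt = np.linalg.solve(Rn, s)                       # a = Rn^{-1} s  (K = 1)
print("matched coefficients:", np.round(a_opt, 3))
print("SNR, matched a      :", out_snr(a_opt))       # equals s' Rn^{-1} s
print("SNR, naive a = s    :", out_snr(s))
```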


Homework