Chapter 1 Random Process


1.1 Introduction (Physical Phenomenon)

Deterministic model: there is no uncertainty about its time-dependent behavior at any instant of time.
Random model: the future value is subject to "chance" (probability). Examples: thermal noise, a random data stream.

1.2 Mathematical Definition of a Random Process (RP)

The properties of an RP:
a. It is a function of time.
b. It is random in the sense that, before conducting an experiment, it is not possible to define the waveform.

Each sample point s of the sample space S is mapped to a function of time, X(t, s).

Consider a random process observed over the interval
X(t, s), -T <= t <= T  (1.1)
where 2T is the total observation interval. For a fixed sample point s_j,
x_j(t) = X(t, s_j)  (1.2)
is called a sample function. At t = t_k, x_j(t_k) is a random variable (RV). To simplify the notation, let X(t, s) = X(t). X(t) is the random process: an ensemble of time functions together with a probability rule.

Difference between an RV and an RP:
RV: the outcome is mapped into a number.
RP: the outcome is mapped into a function of time.

Figure 1.1 An ensemble of sample functions:
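The ensemble idea of Figure 1.1 can be generated numerically. The sketch below is a minimal illustration with hypothetical parameters (amplitude A, frequency fc, and a random phase uniform over [-π, π], all chosen for this example only): each row of the array is one sample function x_j(t), and reading down a column at a fixed time t_k gives realizations of the random variable X(t_k).

```python
import numpy as np

rng = np.random.default_rng(0)
A, fc = 1.0, 5.0                    # hypothetical amplitude and frequency
t = np.linspace(0.0, 1.0, 1000)

# Each row is one sample function x_j(t) = A*cos(2*pi*fc*t + theta_j),
# where theta_j is one outcome of the random phase Theta ~ U[-pi, pi].
thetas = rng.uniform(-np.pi, np.pi, size=200)
ensemble = A * np.cos(2 * np.pi * fc * t[None, :] + thetas[:, None])

# Fixing t = t_k and reading down a column ("across the process")
# gives realizations of the random variable X(t_k):
xk = ensemble[:, 100]
```

Plotting a few rows of `ensemble` against `t` reproduces the picture in Figure 1.1.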

1.3 Stationary Process

Stationary process: the statistical characterization of the process is independent of the time at which observation of the process is initiated.
Nonstationary process: not a stationary process (an unstable phenomenon).

Consider X(t), and let X(t1), X(t2), ..., X(tk) denote the RVs obtained at times t1, t2, ..., tk. The RP is stationary in the strict sense (strictly stationary) if the joint distribution function satisfies
F_{X(t1+τ),...,X(tk+τ)}(x1, ..., xk) = F_{X(t1),...,X(tk)}(x1, ..., xk)  (1.3)
for all time shifts τ, all k, and all possible choices of t1, t2, ..., tk.

X(t) and Y(t) are jointly strictly stationary if their joint finite-dimensional distributions are invariant with respect to the origin t = 0.

Special cases of Eq. (1.3):
1. For k = 1,
F_{X(t)}(x) = F_{X(t+τ)}(x) = F_X(x) for all t and τ  (1.4)
2. For k = 2 and τ = -t1,
F_{X(t1),X(t2)}(x1, x2) = F_{X(0),X(t2-t1)}(x1, x2)  (1.5)
which depends only on t2 - t1 (the time difference).

Figure 1.2 Illustrating the probability of a joint event.

Figure 1.3 Illustrating the concept of stationarity in Example 1.1.

1.4 Mean, Correlation, and Covariance Functions

Let X(t) be a strictly stationary RP. The mean of X(t) is
μ_X(t) = E[X(t)] = ∫ x f_{X(t)}(x) dx  (1.6)
μ_X(t) = μ_X for all t  (1.7)
where f_{X(t)}(x) is the first-order pdf. The autocorrelation function of X(t) is
R_X(t1, t2) = E[X(t1) X(t2)] = R_X(t2 - t1) for all t1 and t2  (1.8)

The autocovariance function is
C_X(t1, t2) = E[(X(t1) - μ_X)(X(t2) - μ_X)] = R_X(t2 - t1) - μ_X²  (1.10)
which is a function of the time difference (t2 - t1). We can determine C_X(t1, t2) if μ_X and R_X(t2 - t1) are known. Note that:
1. μ_X and R_X(t2 - t1) provide only a partial description of the process.
2. If μ_X(t) = μ_X and R_X(t1, t2) = R_X(t2 - t1), then X(t) is wide-sense stationary (WSS).
3. The class of strictly stationary processes with finite second-order moments is a subclass of the class of wide-sense stationary processes.
4. The first- and second-order moments may not exist.
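These definitions can be checked numerically on a process whose moments are known in closed form: a sinusoid with uniformly distributed random phase. The sketch below (hypothetical parameters chosen for illustration) estimates the ensemble mean and the autocorrelation at two pairs of times sharing the same time difference, confirming that the mean is constant and R_X depends only on t2 - t1.

```python
import numpy as np

rng = np.random.default_rng(1)
A, fc, dt = 1.0, 2.0, 0.01          # hypothetical parameters
t = np.arange(0.0, 2.0, dt)
thetas = rng.uniform(-np.pi, np.pi, size=20000)
X = A * np.cos(2 * np.pi * fc * t[None, :] + thetas[:, None])

# Ensemble mean m_X(t): by (1.7) it should be the same constant for every t.
m_hat = X.mean(axis=0)

# Ensemble autocorrelation R_X(t1, t2) = E[X(t1) X(t2)] at two different
# pairs (t1, t2) sharing the same time difference t2 - t1 = 0.25 s:
R_a = np.mean(X[:, 10] * X[:, 35])      # t1 = 0.10 s, t2 = 0.35 s
R_b = np.mean(X[:, 60] * X[:, 85])      # t1 = 0.60 s, t2 = 0.85 s
R_theory = (A**2 / 2) * np.cos(2 * np.pi * fc * 0.25)
```

Within Monte Carlo error, `m_hat` is flat at 0 and `R_a`, `R_b` agree with each other and with the closed-form value.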

Properties of the autocorrelation function

For convenience of notation, we redefine
R_X(τ) = E[X(t + τ) X(t)] for all t
1. The mean-square value: R_X(0) = E[X²(t)]
2. R_X(τ) is an even function: R_X(τ) = R_X(-τ)
3. R_X(τ) is bounded by its value at the origin: |R_X(τ)| <= R_X(0)

Proof of property 3:
E[(X(t + τ) ± X(t))²] >= 0
E[X²(t + τ)] ± 2 E[X(t + τ) X(t)] + E[X²(t)] >= 0
2 R_X(0) ± 2 R_X(τ) >= 0
-R_X(0) <= R_X(τ) <= R_X(0), i.e., |R_X(τ)| <= R_X(0)

The RX() provides the interdependence information of two random variables obtained from X(t) at times  seconds apart

Example 1.2 Sinusoidal Wave with Random Phase
Consider the sinusoidal signal
X(t) = A cos(2π f_c t + Θ)  (1.15)
where Θ is uniformly distributed over [-π, π]:
f_Θ(θ) = 1/(2π), -π <= θ <= π  (1.16)
The autocorrelation function is
R_X(τ) = E[X(t + τ) X(t)] = (A²/2) cos(2π f_c τ)  (1.17)

Appendix 2.1 Fourier Transform
G(f) = ∫ g(t) exp(-j2π f t) dt
g(t) = ∫ G(f) exp(j2π f t) df

We refer to |G(f)| as the magnitude spectrum of the signal g(t), and refer to arg {G(f)} as its phase spectrum.

DIRAC DELTA FUNCTION
Strictly speaking, the theory of the Fourier transform is applicable only to time functions that satisfy the Dirichlet conditions. Such functions include energy signals. However, it would be highly desirable to extend this theory in two ways:
1. To combine the Fourier series and Fourier transform into a unified theory, so that the Fourier series may be treated as a special case of the Fourier transform.
2. To include power signals (i.e., signals for which the average power is finite) in the list of signals to which we may apply the Fourier transform.

The Dirac delta function, or just delta function, denoted by δ(t), is defined as having zero amplitude everywhere except at t = 0, where it is infinitely large in such a way that it contains unit area under its curve; that is,
δ(t) = 0, t ≠ 0  (A2.3)
∫ δ(t) dt = 1  (A2.4)
The sifting property: ∫ g(t) δ(t - t0) dt = g(t0)  (A2.5)
The replication property: g(t) * δ(t) = g(t)  (A2.6)

Example 1.3 Random Binary Wave / Pulse
1. The pulses are represented by ±A volts (mean = 0).
2. The first complete pulse starts at t_d, where t_d is uniformly distributed over [0, T] and T is the pulse duration.
3. During any pulse interval, the presence of +A or -A is random (equally likely).
4. When |t_k - t_i| > T, t_k and t_i are not in the same pulse interval; hence, X(t_k) and X(t_i) are independent.

Figure 1.6 Sample function of random binary wave.


5. When |t_k - t_i| < T, X(t_k) and X(t_i) occur in the same pulse interval with probability 1 - |t_k - t_i|/T, in which case X(t_k) X(t_i) = A²; hence
R_X(t_k, t_i) = A² (1 - |t_k - t_i|/T), |t_k - t_i| < T

6. A similar argument holds for any other value of t_k, so
R_X(τ) = A² (1 - |τ|/T) for |τ| < T, and 0 otherwise
What is the Fourier transform of R_X(τ)?
Reference: A. Papoulis, Probability, Random Variables and Stochastic Processes, McGraw-Hill.
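The triangular autocorrelation can be verified by Monte Carlo simulation. The sketch below uses illustrative values for A, T, the observation time, and the number of trials (all hypothetical): it samples the random binary wave at two instants across many independent realizations and compares E[X(t1) X(t1 + τ)] with A²(1 - |τ|/T).

```python
import numpy as np

rng = np.random.default_rng(2)
A, T = 1.0, 1.0                     # hypothetical pulse amplitude and duration
n = 200000                          # number of independent realizations

def R_est(tau, t1=2.0):
    """Monte Carlo estimate of E[X(t1) X(t1 + tau)] for the random binary wave."""
    td = rng.uniform(0.0, T, n)                     # random start delay in [0, T)
    k1 = np.floor((t1 - td) / T).astype(int)        # pulse index containing t1
    k2 = np.floor((t1 + tau - td) / T).astype(int)  # pulse index containing t1+tau
    signs = rng.choice([-A, A], size=(n, 6))        # i.i.d. +/-A amplitude per pulse
    rows = np.arange(n)
    return np.mean(signs[rows, k1] * signs[rows, k2])

# Theory: R_X(tau) = A**2 * (1 - |tau|/T) for |tau| < T, and 0 otherwise.
R0, Rhalf, Rfar = R_est(0.0), R_est(0.5), R_est(1.5)
```

With τ = 0 the two samples always fall in the same pulse, giving A²; with τ = 1.5T they never do, so the estimate is near zero.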

Cross-correlation Function
R_XY(t, u) = E[X(t) Y(u)] and R_YX(t, u) = E[Y(t) X(u)]
Note that R_XY and R_YX are not in general even functions. The correlation matrix is
R(t, u) = [ R_X(t, u)   R_XY(t, u)
            R_YX(t, u)  R_Y(t, u) ]
If X(t) and Y(t) are jointly stationary, with τ = t - u,
R(τ) = [ R_X(τ)   R_XY(τ)
         R_YX(τ)  R_Y(τ) ]

Proof of R_XY(τ) = R_YX(-τ):
R_XY(τ) = E[X(t + τ) Y(t)]. Letting u = t + τ,
R_XY(τ) = E[Y(u - τ) X(u)] = R_YX(-τ)

Example 1.4 Quadrature-Modulated Processes
X1(t) = X(t) cos(2π f_c t + Θ)
X2(t) = X(t) sin(2π f_c t + Θ)
where X(t) is a stationary process and Θ is uniformly distributed over [0, 2π], independent of X(t). The cross-correlation is
R_12(τ) = E[X1(t + τ) X2(t)] = -(1/2) R_X(τ) sin(2π f_c τ)
At τ = 0, R_12(0) = 0: X1(t) and X2(t) are orthogonal at the same time instant.

1.5 Ergodic Processes
Ensemble averages of X(t) are averages "across the process". Long-term averages (time averages) are averages "along the process". The DC value of X(t) is the time average
μ_x(T) = (1/2T) ∫_{-T}^{T} x(t) dt
which is a random variable (its value depends on the sample function). If X(t) is stationary,
E[μ_x(T)] = μ_X

so μ_x(T) represents an unbiased estimate of μ_X. The process X(t) is ergodic in the mean if
lim_{T→∞} μ_x(T) = μ_X and lim_{T→∞} var[μ_x(T)] = 0
The time-averaged autocorrelation function is
R_x(τ, T) = (1/2T) ∫_{-T}^{T} x(t + τ) x(t) dt
If the following conditions hold, X(t) is ergodic in the autocorrelation function:
lim_{T→∞} R_x(τ, T) = R_X(τ) and lim_{T→∞} var[R_x(τ, T)] = 0
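For an ergodic process, time averages along one long sample function agree with ensemble averages. A minimal sketch (hypothetical parameters; the random-phase sinusoid is used because its ensemble moments are known in closed form, μ_X = 0 and R_X(τ) = (A²/2) cos(2π f_c τ)):

```python
import numpy as np

rng = np.random.default_rng(3)
A, fc, dt = 1.0, 3.0, 0.001         # hypothetical parameters
t = np.arange(0.0, 50.0, dt)        # one long observation interval

# A single sample function x(t) = A*cos(2*pi*fc*t + theta):
theta = rng.uniform(-np.pi, np.pi)
x = A * np.cos(2 * np.pi * fc * t + theta)

# Time averages "along the process":
mean_time = np.mean(x)
lag = 100                            # lag tau = 0.1 s, in samples
R_time = np.mean(x[lag:] * x[:-lag])

# Ensemble values ("across the process"), known in closed form here:
mean_theory = 0.0
R_theory = (A**2 / 2) * np.cos(2 * np.pi * fc * lag * dt)
```

The time averages computed from the single realization match the ensemble values, independent of the particular phase outcome drawn.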

1.6 Transmission of a Random Process Through a Linear Time-Invariant Filter (System)
Y(t) = ∫ h(τ) X(t - τ) dτ
where h(t) is the impulse response of the system. If E[X(t)] is finite and the system is stable,
E[Y(t)] = ∫ h(τ) E[X(t - τ)] dτ
If X(t) is stationary,
m_Y = m_X ∫ h(τ) dτ = m_X H(0)
where H(0) is the system's DC response.

Consider the autocorrelation function of Y(t):
R_Y(t, u) = E[Y(t) Y(u)] = ∫∫ h(τ1) h(τ2) R_X(t - τ1, u - τ2) dτ1 dτ2
If E[X²(t)] is finite and the system is stable, R_Y(t, u) is finite. If X(t) is stationary (with τ = t - u),
R_Y(τ) = ∫∫ h(τ1) h(τ2) R_X(τ - τ1 + τ2) dτ1 dτ2
Stationary input, stationary output.

1.7 Power Spectral Density (PSD)
Consider the Fourier transform of g(t):
G(f) = ∫ g(t) exp(-j2π f t) dt
Let H(f) denote the frequency response of the system:
H(f) = ∫ h(τ1) exp(-j2π f τ1) dτ1

|H(f)|: the magnitude response. Define the power spectral density as the Fourier transform of R_X(τ):
S_X(f) = ∫ R_X(τ) exp(-j2π f τ) dτ
Recall that E[Y²(t)] = ∫ |H(f)|² S_X(f) df. Let |H(f)| be the magnitude response of an ideal narrowband filter of bandwidth Δf centered at f_c; then
E[Y²(t)] ≈ 2 Δf S_X(f_c)  (1.40)

Properties of the PSD
The Einstein-Wiener-Khinchine relations:
S_X(f) = ∫ R_X(τ) exp(-j2π f τ) dτ
R_X(τ) = ∫ S_X(f) exp(j2π f τ) df
S_X(f) is often more useful than R_X(τ)!
1. S_X(0) = ∫ R_X(τ) dτ
2. E[X²(t)] = ∫ S_X(f) df
3. S_X(f) >= 0 for all f
4. S_X(-f) = S_X(f)

Example 1.5 Sinusoidal Wave with Random Phase
From Example 1.2, R_X(τ) = (A²/2) cos(2π f_c τ). Taking the Fourier transform,
S_X(f) = (A²/4) [δ(f - f_c) + δ(f + f_c)]

Example 1.6 Random Binary Wave (Example 1.3)
From Example 1.3, R_X(τ) = A² (1 - |τ|/T) for |τ| < T, so
S_X(f) = A² T sinc²(f T)
Define the energy spectral density of a pulse g(t) as
ε_g(f) = |G(f)|²
For the rectangular pulse, ε_g(f) = A² T² sinc²(f T), so S_X(f) = ε_g(f)/T.

Example 1.7 Mixing of a Random Process with a Sinusoidal Process
Y(t) = X(t) cos(2π f_c t + Θ), with Θ uniformly distributed over [0, 2π] and independent of X(t). Then
R_Y(τ) = (1/2) R_X(τ) cos(2π f_c τ)
S_Y(f) = (1/4) [S_X(f - f_c) + S_X(f + f_c)]
That is, we shift S_X(f) to the right by f_c, shift it to the left by f_c, add the two, and divide by 4.

Relation Among the PSDs of the Input and Output Random Processes
Recall (1.32):
R_Y(τ) = ∫∫ h(τ1) h(τ2) R_X(τ - τ1 + τ2) dτ1 dτ2
Taking the Fourier transform of both sides,
S_Y(f) = |H(f)|² S_X(f)
X(t) → h(t) → Y(t); correspondingly, S_X(f) → |H(f)|² → S_Y(f).
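This input-output relation is easy to illustrate in discrete time, where white noise of variance σ² has the flat PSD S_X = σ². The sketch below (a hypothetical 2-tap moving-average filter and arbitrary segment sizes, chosen only for illustration) filters white noise, averages periodograms of the output, and compares the estimate with |H(f)|² σ²:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma2 = 1.0                         # white-noise variance (hypothetical)
h = np.array([0.5, 0.5])             # hypothetical 2-tap moving-average filter
nseg, L = 400, 256                   # number of segments and segment length

# Theory: discrete-time white noise has flat PSD S_X = sigma2, and the
# filter output has S_Y(f) = |H(f)|**2 * S_X.
H = np.fft.rfft(h, L)
S_Y_theory = np.abs(H)**2 * sigma2

# Averaged-periodogram estimate of S_Y from filtered noise segments:
acc = np.zeros(L // 2 + 1)
for _ in range(nseg):
    x = rng.standard_normal(L + len(h) - 1) * np.sqrt(sigma2)
    y = np.convolve(x, h, mode="valid")      # steady-state output, length L
    acc += np.abs(np.fft.rfft(y))**2 / L     # one periodogram
S_Y_est = acc / nseg
```

The averaged periodogram tracks |H(f)|² σ² to within Monte Carlo error, including the null at the Nyquist frequency where H vanishes for this filter.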

Relation Among the PSD and the Magnitude Spectrum of a Sample Function
Let x(t) be a sample function of a stationary and ergodic process X(t). In general, the condition for x(t) to be Fourier transformable is
∫ |x(t)| dt < ∞
This condition can never be satisfied by any stationary x(t) of infinite duration. We may instead use the truncated transform
X(f, T) = ∫_{-T}^{T} x(t) exp(-j2π f t) dt
If x(t) is a power signal (finite average power), the time-averaged autocorrelation function and the periodogram (1/2T) |X(f, T)|² form a Fourier-transform pair.

Taking the inverse Fourier transform of the right-hand side of (1.62) and combining with (1.61) and (1.63), we have
R_X(τ) = lim_{T→∞} E[ (1/2T) ∫_{-T}^{T} x(t + τ) x(t) dt ]
Note that for any given x(t) the periodogram does not converge as T → ∞. Since x(t) is ergodic,
S_X(f) = lim_{T→∞} (1/2T) E[ |X(f, T)|² ]  (1.67)
is used to estimate the PSD of x(t).

Cross-Spectral Densities
S_XY(f) = ∫ R_XY(τ) exp(-j2π f τ) dτ
S_YX(f) = ∫ R_YX(τ) exp(-j2π f τ) dτ
S_XY(f) and S_YX(f) are not necessarily real. Since R_XY(τ) = R_YX(-τ),
S_XY(f) = S_YX(-f) = S*_YX(f)

Example 1.8 X(t) and Y(t) are zero-mean stationary processes. Consider
Z(t) = X(t) + Y(t)
Then
S_Z(f) = S_X(f) + S_XY(f) + S_YX(f) + S_Y(f)
If X(t) and Y(t) are uncorrelated, S_Z(f) = S_X(f) + S_Y(f).

Example 1.9 X(t) and Y(t) are jointly stationary. If X(t) and Y(t) are passed through two separate stable LTI filters with impulse responses h1(t) and h2(t), the outputs V(t) and Z(t) have cross-spectral density
S_VZ(f) = H1(f) H2*(f) S_XY(f)

1.8 Gaussian Process
Define Y as a linear functional of X(t):
Y = ∫_0^T g(t) X(t) dt
(g(t): some function, e.g., a weighting function). The process X(t) is a Gaussian process if every linear functional of X(t) is a Gaussian random variable.
Fig. 1.13 Normalized Gaussian distribution:
f_Y(y) = (1/√(2π)) exp(-y²/2)

Central Limit Theorem
Let X_i, i = 1, 2, ..., N, (a) be statistically independent RVs that (b) have the same mean μ_X and variance σ_X², i.e., they are independently and identically distributed (i.i.d.). Define the normalized RVs
Y_i = (X_i - μ_X)/σ_X
and
V_N = (1/√N) Σ_{i=1}^{N} Y_i
The central limit theorem: the probability distribution of V_N approaches N(0, 1) as N approaches infinity.
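A quick numerical illustration of the theorem (hypothetical choices of the X_i distribution, N, and trial count): with uniform X_i, whose mean and variance are 1/2 and 1/12, the normalized sum V_N is already very close to N(0, 1) for moderate N.

```python
import numpy as np

rng = np.random.default_rng(5)
N, trials = 200, 20000               # hypothetical N and number of V_N draws

# X_i ~ Uniform(0, 1): i.i.d. with mean 1/2 and variance 1/12.
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)
X = rng.uniform(0.0, 1.0, size=(trials, N))

# V_N = (1/sqrt(N)) * sum_i (X_i - mu)/sigma should be close to N(0, 1):
V = np.sum((X - mu) / sigma, axis=1) / np.sqrt(N)
frac_95 = np.mean(np.abs(V) < 1.96)  # should be near 0.95 for N(0, 1)
```

The empirical mean, standard deviation, and 95% interval coverage of V all match the standard normal values.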

Properties of a Gaussian Process
1. If a Gaussian process X(t) is applied to a stable linear filter with impulse response h(t), the output Y(t) is also Gaussian.

2. If X(t) is Gaussian, then X(t1), X(t2), ..., X(tn) are jointly Gaussian, with mean values
μ_{X(ti)} = E[X(ti)], i = 1, 2, ..., n
and covariance functions
C_X(tk, ti) = E[(X(tk) - μ_{X(tk)})(X(ti) - μ_{X(ti)})]

3. If a Gaussian process is wide-sense stationary, then it is also strictly stationary. (This follows from property 2.)
4. If X(t1), X(t2), ..., X(tn) are uncorrelated, i.e.,
C_X(tk, ti) = 0 for k ≠ i
then they are independent.
Proof: uncorrelated means the covariance matrix is diagonal, so the joint Gaussian pdf factors into the product of the individual pdfs of X(t1), ..., X(tn); hence they are independent.

1.9 Noise
· Shot noise
· Thermal noise: the mean-square noise voltage across a resistor of R ohms is
E[V_TN²] = 4kTRΔf volts²
where k is Boltzmann's constant = 1.38 × 10⁻²³ joules/K, and T is the absolute temperature in kelvin.

· White noise: its PSD is flat,
S_W(f) = N_0/2 for all f
so its autocorrelation function is
R_W(τ) = (N_0/2) δ(τ)
Any two samples of white noise taken at different times are uncorrelated.

Example 1.10 Ideal Low-Pass Filtered White Noise
Passing white noise through an ideal low-pass filter of bandwidth B gives
S_N(f) = N_0/2 for |f| < B, and 0 otherwise
R_N(τ) = N_0 B sinc(2Bτ)
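The sinc-shaped autocorrelation can be checked by numerically inverting the rectangular PSD. The sketch below (illustrative N_0 and B, and an arbitrary dense frequency grid) approximates the inverse Fourier transform integral at a few lags and compares it with N_0 B sinc(2Bτ):

```python
import numpy as np

# Ideal low-pass filtered white noise: S_N(f) = N0/2 for |f| <= B, else 0.
N0, B = 2.0, 4.0                         # hypothetical values
f = np.linspace(-64.0, 64.0, 16384, endpoint=False)   # dense frequency grid
df = f[1] - f[0]
S = np.where(np.abs(f) <= B, N0 / 2.0, 0.0)

# Numerical inverse Fourier transform R(tau) = integral S(f) e^{j2 pi f tau} df:
tau = np.array([0.0, 0.05, 1.0 / (2.0 * B)])
R_num = np.array(
    [np.sum(S * np.exp(2j * np.pi * f * tk)).real * df for tk in tau]
)
# Closed form, with np.sinc(x) = sin(pi x)/(pi x):
R_theory = N0 * B * np.sinc(2.0 * B * tau)
```

At τ = 0 this recovers the total power N_0 B, and at τ = 1/(2B) it recovers the first zero crossing of the sinc.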

Example 1.11 Correlation of White Noise with a Sinusoidal Wave
Correlating white noise w(t) with the sinusoid √(2/T) cos(2π f_c t) over [0, T] gives an output random variable with variance N_0/2.

1.10 Narrowband Noise (NBN)
Two representations:
a. in-phase and quadrature components (cos(2π f_c t), sin(2π f_c t))
b. envelope and phase

1.11 In-Phase and Quadrature Representation
n(t) = n_I(t) cos(2π f_c t) - n_Q(t) sin(2π f_c t)

Important Properties
1. n_I(t) and n_Q(t) have zero mean.
2. If n(t) is Gaussian, then n_I(t) and n_Q(t) are jointly Gaussian.
3. If n(t) is stationary, then n_I(t) and n_Q(t) are jointly stationary.
4. S_{N_I}(f) = S_{N_Q}(f) = S_N(f - f_c) + S_N(f + f_c) for |f| <= B
5. n_I(t) and n_Q(t) have the same variance as n(t).
6. The cross-spectral density of n_I(t) and n_Q(t) is purely imaginary.
7. If n(t) is Gaussian and its PSD is symmetric about f_c, then n_I(t) and n_Q(t) are statistically independent.

Example 1.12 Ideal Band-Pass Filtered White Noise
Passing white noise through an ideal band-pass filter of bandwidth 2B centered at f_c gives
S_N(f) = N_0/2 for f_c - B < |f| < f_c + B, and 0 otherwise
R_N(τ) = 2 N_0 B sinc(2Bτ) cos(2π f_c τ)

1.12 Representation in Terms of Envelope and Phase Components
n(t) = r(t) cos(2π f_c t + ψ(t))
r(t) = [n_I²(t) + n_Q²(t)]^{1/2} : the envelope
ψ(t) = tan⁻¹[n_Q(t)/n_I(t)] : the phase
Let N_I and N_Q be RVs obtained (at some fixed time) from n_I(t) and n_Q(t). N_I and N_Q are independent Gaussian RVs with zero mean and variance σ².

Substituting (1.110)-(1.112) into (1.109) gives the Rayleigh distribution of the envelope:
f_R(r) = (r/σ²) exp(-r²/(2σ²)), r >= 0
With the normalized variable v = r/σ, the normalized Rayleigh distribution is
f_V(v) = v exp(-v²/2), v >= 0

Figure 1.22 Normalized Rayleigh distribution.
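The Rayleigh result is easy to confirm by simulation. The sketch below (an illustrative σ and sample count) draws independent zero-mean Gaussians for N_I and N_Q and checks the envelope against the Rayleigh moments, mean σ√(π/2) and mean-square value 2σ²:

```python
import numpy as np

rng = np.random.default_rng(6)
sigma = 1.5                          # hypothetical common standard deviation
n = 200000

# N_I, N_Q: independent zero-mean Gaussians with the same variance sigma**2.
nI = rng.normal(0.0, sigma, n)
nQ = rng.normal(0.0, sigma, n)

# The envelope r = sqrt(nI**2 + nQ**2) is Rayleigh distributed with
# mean sigma*sqrt(pi/2) and mean-square value 2*sigma**2.
r = np.hypot(nI, nQ)
```

A histogram of `r/sigma` reproduces the normalized Rayleigh curve of Figure 1.22.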

1.13 Sine Wave Plus Narrowband Noise
x(t) = A cos(2π f_c t) + n(t)
Expanding n(t) into its in-phase and quadrature components,
x(t) = n_I'(t) cos(2π f_c t) - n_Q(t) sin(2π f_c t), where n_I'(t) = A + n_I(t)
If n(t) is Gaussian with zero mean and variance σ²:
1. n_I'(t) and n_Q(t) are Gaussian and statistically independent.
2. The mean of n_I'(t) is A and that of n_Q(t) is zero.
3. The variance of n_I'(t) and n_Q(t) is σ².

The envelope r(t) = [n_I'²(t) + n_Q²(t)]^{1/2} then has the pdf
f_R(r) = (r/σ²) exp(-(r² + A²)/(2σ²)) I_0(Ar/σ²), r >= 0
where the modified Bessel function of the first kind of zero order is defined as (Appendix 3)
I_0(x) = (1/2π) ∫_0^{2π} exp(x cos θ) dθ
This is called the Rician distribution.

Figure 1.23 Normalized Rician distribution.