Data Communications (40-883) — Random Processes. Second semester of the 1392–93 academic year. Afshin Hemmatyar, Department of Computer Engineering.



Random Process — Outline
- Introduction
- Mathematical Definition
- Stationary Process
- Mean, Correlation, and Covariance
- Ergodic Process
- Random Process through an LTI Filter
- Power Spectral Density
- Gaussian Process
- White Noise
- Narrowband Noise

Introduction
Deterministic model: there is no uncertainty about the time-dependent behavior at any instant of time.
Stochastic (random) model: only the probability that a future value will lie between two specified limits can be given.
Example: received signal = information-bearing signal + interference + channel noise.

Mathematical Definition (1)
- Each outcome of the experiment is associated with a sample point.
- The set of all possible outcomes of the experiment is called the sample space.
- A function of time is assigned to each sample point: X(t, s), -T ≤ t ≤ T, where 2T is the total observation interval.
- Sample function of the random process: x_j(t) = X(t, s_j).
- Observing all sample functions at a fixed time t_k yields the random variables {x_1(t_k), x_2(t_k), ..., x_n(t_k)} = {X(t_k, s_1), X(t_k, s_2), ..., X(t_k, s_n)}.

Mathematical Definition (2)
[Figure: an ensemble of sample functions]

Mathematical Definition (3)
Random process X(t): "an ensemble of time functions, together with a probability rule that assigns a probability to any meaningful event associated with an observation of one of the sample functions of the random process."
For a random variable, the outcome of a random experiment is mapped into a number. For a random process, the outcome of a random experiment is mapped into a waveform that is a function of time.
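The ensemble view above can be illustrated numerically. A minimal sketch (the random-phase sinusoid is an illustrative process chosen here, not one from the slides): each row of the array is a sample function x_j(t), and fixing an observation time t_k turns the process into a random variable across the ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative random process: a sinusoid with a random phase per outcome s_j.
n, T = 500, 1.0
t = np.linspace(-T, T, 201)                      # observation interval [-T, T]
phases = rng.uniform(0, 2 * np.pi, size=n)       # one outcome s_j per row
ensemble = np.cos(2 * np.pi * 5 * t + phases[:, None])  # row j is x_j(t) = X(t, s_j)

# Observing the process at a fixed time t_k gives a random variable:
k = 100                                          # index of t_k = 0.0
X_tk = ensemble[:, k]                            # {X(t_k, s_1), ..., X(t_k, s_n)}
```

With a uniform random phase, the sample mean of X(t_k) across the ensemble is close to the theoretical mean of zero.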

Stationary Process (1)
Strictly stationary: F_{X(t_1+τ),...,X(t_k+τ)}(x_1, ..., x_k) = F_{X(t_1),...,X(t_k)}(x_1, ..., x_k), where F is the joint distribution function.
"A random process X(t), initiated at time t = -∞, is strictly stationary if the joint distribution of any set of random variables obtained by observing the random process X(t) is invariant with respect to the location of the origin t = 0."

Stationary Process (2)
Strictly stationary: F_{X(t_1+τ),...,X(t_k+τ)}(x_1, ..., x_k) = F_{X(t_1),...,X(t_k)}(x_1, ..., x_k), where F is the joint distribution function.
1) k = 1: F_{X(t+τ)}(x) = F_{X(t)}(x) = F_X(x) for all t and τ. The first-order distribution function of a stationary process is independent of time.
2) k = 2 and τ = -t_1: F_{X(t_1),X(t_2)}(x_1, x_2) = F_{X(0),X(t_2-t_1)}(x_1, x_2) for all t_1 and t_2. The second-order distribution function of a stationary process depends only on the time difference between the observation times.

Stationary Process (3)
[Example shown graphically on the slide]

Mean
The expectation of the random variable obtained by observing the process at some time t:
μ_X(t) = E[X(t)] = ∫ x f_{X(t)}(x) dx
where f_{X(t)}(x) is the first-order probability density function of the process.
The mean of a strictly stationary process is a constant: μ_X(t) = μ_X for all t.

Correlation
The expectation of the product of the two random variables X(t_1) and X(t_2), obtained by observing the process X(t) at times t_1 and t_2:
R_X(t_1, t_2) = E[X(t_1) X(t_2)] = ∫∫ x_1 x_2 f_{X(t_1),X(t_2)}(x_1, x_2) dx_1 dx_2
where f_{X(t_1),X(t_2)}(x_1, x_2) is the second-order probability density function of the process.
Autocorrelation of a strictly stationary process: R_X(t_1, t_2) = R_X(t_2 - t_1) for all t_1 and t_2.

Covariance
Autocovariance: C_X(t_1, t_2) = E[(X(t_1) - μ_X)(X(t_2) - μ_X)] = R_X(t_2 - t_1) - μ_X²
Points:
1) The mean and autocorrelation functions provide only a partial description of the distribution of a random process.
2) A constant mean together with an autocorrelation that depends only on the time difference is not sufficient to guarantee that the random process X(t) is strictly stationary.

Autocorrelation Properties
R_X(τ) = E[X(t+τ) X(t)] for all t
1) R_X(0) = E[X²(t)] (mean-square value of the process)
2) R_X(τ) = R_X(-τ) (even function of τ)
3) |R_X(τ)| ≤ R_X(0) (maximum magnitude at τ = 0)
Proof of property 3:
E[(X(t+τ) ± X(t))²] ≥ 0
E[X²(t+τ)] ± 2E[X(t+τ)X(t)] + E[X²(t)] ≥ 0
2R_X(0) ± 2R_X(τ) ≥ 0  ⇒  -R_X(0) ≤ R_X(τ) ≤ R_X(0)
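The three properties can be verified by Monte Carlo estimation. A minimal sketch, assuming the illustrative process X(t) = cos(2πft + Θ) with Θ uniform on [0, 2π), whose theoretical autocorrelation is R_X(τ) = ½ cos(2πfτ):

```python
import numpy as np

rng = np.random.default_rng(1)

f, n_runs = 1.0, 2000
taus = np.linspace(-1.0, 1.0, 81)
theta = rng.uniform(0, 2 * np.pi, n_runs)   # one phase per realization
t0 = 0.3                                    # any fixed t; R_X should not depend on it
x_t = np.cos(2 * np.pi * f * t0 + theta)
# Ensemble-average estimate of R_X(tau) = E[X(t0 + tau) X(t0)]:
R = np.array([np.mean(np.cos(2 * np.pi * f * (t0 + tau) + theta) * x_t)
              for tau in taus])

R0 = R[len(taus) // 2]                      # estimate of R_X(0)
assert abs(R0 - 0.5) < 0.05                 # 1) R_X(0) = E[X^2(t)] = 1/2
assert np.allclose(R, R[::-1], atol=0.1)    # 2) even function of tau
assert np.all(np.abs(R) <= R0 + 0.05)       # 3) |R_X(tau)| <= R_X(0)
```

The tolerances allow for finite-sample estimation error.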

Autocorrelation Example 1
[Worked example shown on the slide]

Autocorrelation Example 2 (1)
[Worked example shown on the slide]

Autocorrelation Example 2 (2)
[Worked example shown on the slide]

Cross-Correlation (1)
Correlation matrix of two processes X(t) and Y(t), each stationary and jointly stationary:
R(τ) = [ R_X(τ)  R_XY(τ) ; R_YX(τ)  R_Y(τ) ], τ = t - u
The cross-correlation function is neither an even function of τ nor maximal at the origin, but it obeys the symmetry R_XY(τ) = R_YX(-τ).

Cross-Correlation (2)
[Example shown on the slide]

Ergodic Process (1)
DC value of a sample function x(t): the time average μ_x(T) = (1/2T) ∫_{-T}^{T} x(t) dt.
The process X(t) is ergodic in the mean if the time average converges to the ensemble mean: lim_{T→∞} μ_x(T) = μ_X.

Ergodic Process (2)
Time-averaged autocorrelation: R_x(τ, T) = (1/2T) ∫_{-T}^{T} x(t+τ) x(t) dt.
The process is ergodic in the autocorrelation if lim_{T→∞} R_x(τ, T) = R_X(τ).
Note: computing the time-averaged mean and autocorrelation requires that the process be stationary.
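Ergodicity in the mean can be illustrated by comparing a time average over one long sample function with the ensemble mean. A minimal sketch, again using a random-phase sinusoid (ergodic in the mean, with μ_X = 0) as the illustrative process:

```python
import numpy as np

rng = np.random.default_rng(2)

# One long sample function of X(t) = cos(2*pi*0.5*t + Theta), Theta uniform.
t = np.linspace(-500.0, 500.0, 100_001)   # observation interval [-T, T], T = 500
theta = rng.uniform(0, 2 * np.pi)
x = np.cos(2 * np.pi * 0.5 * t + theta)

time_avg = x.mean()                       # discrete stand-in for (1/2T) * integral
mu_X = 0.0                                # ensemble mean E[X(t)] for uniform phase
assert abs(time_avg - mu_X) < 0.01        # time average -> ensemble mean

# Counterexample: X(t) = A (a random DC level) is stationary but NOT ergodic:
# the time average of one sample function equals that realization of A, not E[A].
```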

Random Process through an LTI Filter
When a stationary process X(t) is applied to a linear time-invariant filter with impulse response h(t), the output is Y(t) = ∫ h(τ) X(t-τ) dτ. The output mean is μ_Y = μ_X H(0), and the output autocorrelation depends only on the time difference, so Y(t) is also stationary.
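The standard input–output mean relation for an LTI filter, μ_Y = μ_X H(0), can be checked numerically. A minimal sketch with a hypothetical FIR impulse response h (the names and coefficient values are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(3)

mu_X = 2.0
x = mu_X + rng.normal(0.0, 1.0, 200_000)   # stationary input with mean mu_X
h = np.array([0.5, 0.3, 0.2])              # hypothetical FIR filter; H(0) = sum(h)
y = np.convolve(x, h, mode='valid')        # output process samples

H0 = h.sum()                               # DC gain H(0)
assert abs(y.mean() - mu_X * H0) < 0.01    # mu_Y = mu_X * H(0)
```

The companion frequency-domain result, S_Y(f) = |H(f)|² S_X(f), relates the output PSD to the input PSD in the same way.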

Power Spectral Density (1)
The power spectral density S_X(f) of a stationary process is the Fourier transform of its autocorrelation function (the Einstein–Wiener–Khinchin relations):
S_X(f) = ∫ R_X(τ) e^{-j2πfτ} dτ,  R_X(τ) = ∫ S_X(f) e^{j2πfτ} df

Power Spectral Density (2)
[Example shown on the slide]

Power Spectral Density (3) — Properties
1) S_X(0) = ∫ R_X(τ) dτ
2) E[X²(t)] = ∫ S_X(f) df (total average power)
3) S_X(f) ≥ 0 for all f
4) S_X(-f) = S_X(f) (even function)
5) The normalized PSD, S_X(f) / ∫ S_X(f') df', behaves like a probability density function.

Power Spectral Density (4)
[Example shown on the slide]

Power Spectral Density (5)
[Example shown on the slide]

Power Spectral Density (6)
[Example shown on the slide]

Power Spectral Density (7)
If each sample function is Fourier transformable over the finite interval [-T, T], with transform X_T(f), then |X_T(f)|²/(2T) is the periodogram, and S_X(f) = lim_{T→∞} (1/2T) E[|X_T(f)|²].
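The periodogram idea can be sketched numerically: averaging |X_T(f)|² over independent records recovers the PSD. Assumptions for this sketch (not from the slides): discrete-time white Gaussian noise of variance σ² sampled at rate fs, whose two-sided PSD is flat at σ²/fs.

```python
import numpy as np

rng = np.random.default_rng(4)

fs, N, n_rec, sigma2 = 100.0, 1024, 400, 1.0
x = rng.normal(0.0, np.sqrt(sigma2), (n_rec, N))   # independent finite records

X = np.fft.rfft(x, axis=1)                 # finite-record spectrum X_T(f)
pgram = np.abs(X) ** 2 / (N * fs)          # per-record periodogram
S_hat = pgram.mean(axis=0)                 # average over records

# White noise sampled at fs has a flat PSD of sigma^2 / fs.
assert abs(np.median(S_hat) - sigma2 / fs) < 0.002
```

A single periodogram does not converge as the record grows; averaging over records (as above) is what reduces its variance.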

Power Spectral Density (8)
Cross-spectral densities are the Fourier transforms of the corresponding cross-correlations:
S_XY(f) = ∫ R_XY(τ) e^{-j2πfτ} dτ,  S_YX(f) = ∫ R_YX(τ) e^{-j2πfτ} dτ
with the symmetry S_XY(f) = S_YX(-f) = S*_YX(f).

Gaussian Process (1)
Consider a linear functional of X(t): Y = ∫ g(t) X(t) dt.
The process X(t) is a Gaussian process if every such linear functional Y is a Gaussian random variable; after normalization to zero mean and unit variance, its probability density function is the Gaussian distribution f_Y(y) = (1/√(2π)) e^{-y²/2}.

Gaussian Process (2)
Let X_i, i = 1, 2, ..., N, be a set of random variables that satisfy:
1) The X_i are statistically independent.
2) The X_i have the same probability distribution, with mean μ_X and variance σ_X².
(An independently and identically distributed (i.i.d.) set of random variables.)
Normalized variable: Y_i = (X_i - μ_X)/σ_X. Defined variable: V_N = (1/√N) Σ_{i=1}^{N} Y_i.
The central limit theorem states that the probability distribution of V_N approaches the normalized Gaussian distribution N(0, 1) in the limit as the number of random variables N approaches infinity.
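The central limit theorem statement above can be demonstrated numerically. A minimal sketch, assuming uniform X_i on [0, 1] (mean ½, variance 1/12) as the i.i.d. source:

```python
import numpy as np

rng = np.random.default_rng(5)

N, trials = 1000, 50_000
X = rng.uniform(0.0, 1.0, (trials, N))    # i.i.d. X_i: mean 0.5, variance 1/12
Y = (X - 0.5) / np.sqrt(1.0 / 12.0)       # normalized: zero mean, unit variance
V = Y.sum(axis=1) / np.sqrt(N)            # V_N = (1/sqrt(N)) * sum of Y_i

assert abs(V.mean()) < 0.02               # approaches N(0, 1): zero mean,
assert abs(V.std() - 1.0) < 0.02          # unit variance,
assert abs(np.mean(np.abs(V) < 1.0) - 0.683) < 0.01   # and Gaussian shape
```

The last assertion checks that the fraction of V_N within one standard deviation matches the Gaussian value of about 0.683.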

Gaussian Process (3)
Property 1: If a Gaussian process X(t) is applied to a stable linear filter, the output process Y(t) is also Gaussian.

Gaussian Process (4)
Property 2: The set of random variables obtained by observing a Gaussian process at times t_1, ..., t_n is jointly Gaussian; the process is therefore completely characterized by its mean and autocovariance functions.

Gaussian Process (5)
Property 3: If a Gaussian process is wide-sense stationary, then it is also strictly stationary.
Property 4: If the random variables obtained by observing a Gaussian process at times t_1, ..., t_n are uncorrelated, then they are statistically independent.

Noise (1)
Shot noise arises in electronic devices such as diodes and transistors because of the discrete nature of current flow in these devices: X(t) = Σ_k h(t - τ_k), where h(t) is the waveform of a single current pulse and the τ_k are the random emission times. The number ν of electrons emitted between t and t + t_0 follows a Poisson distribution.

Noise (2)
Thermal noise is the electrical noise arising from the random motion of the electrons in a conductor. The mean-square thermal noise voltage across a resistor of R ohms, measured in a bandwidth of Δf hertz, is E[V²] = 4kTRΔf, where k is Boltzmann's constant and T is the absolute temperature.

Noise (3)
White noise is an idealized form of noise adopted for ease of analysis; its power spectral density is flat: S_W(f) = N_0/2.
The equivalent noise temperature T_e of a system is defined as the temperature at which a noisy resistor has to be maintained such that, by connecting the resistor to the input of a noiseless version of the system, it produces the same available noise power at the output of the system as that produced by all the sources of noise in the actual system.

Noise (4)
According to its autocorrelation function, any two different samples of white noise, no matter how close together in time they are taken, are uncorrelated. If the white noise is also Gaussian, then the two samples are statistically independent; white Gaussian noise thus represents the ultimate in randomness. White noise has infinite average power and, as such, is not physically realizable. The utility of the white-noise process parallels that of the impulse (delta) function in the analysis of linear systems.
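The "uncorrelated samples" statement can be checked for discrete-time white Gaussian noise; a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(6)

n = 100_000
w = rng.normal(0.0, 1.0, n)               # unit-variance white Gaussian noise

r0 = np.mean(w * w)                       # lag 0: the mean-square value
r1 = np.mean(w[:-1] * w[1:])              # lag 1
r5 = np.mean(w[:-5] * w[5:])              # lag 5

assert abs(r0 - 1.0) < 0.02               # R_W(0) = variance
assert abs(r1) < 0.02 and abs(r5) < 0.02  # different-time samples: uncorrelated
```

Because the samples are also Gaussian, "uncorrelated" here implies statistically independent, matching the text above.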

Noise (5)
[Figure shown on the slide]

Noise (6)
[Figure shown on the slide]

Narrowband Noise (1)
In-phase and quadrature components: n(t) = n_I(t) cos(2πf_c t) - n_Q(t) sin(2πf_c t).
Properties:
1) Both components have zero mean.
2) If the narrowband noise is Gaussian, then both components are jointly Gaussian.
3) If the narrowband noise is stationary, then both components are jointly stationary.
4) Both components have the same power spectral density: S_{N_I}(f) = S_{N_Q}(f) = S_N(f - f_c) + S_N(f + f_c) for |f| ≤ B, and zero otherwise.
5) Both components have the same variance as the narrowband noise.
6) The cross-spectral density of the components is purely imaginary: S_{N_I N_Q}(f) = j[S_N(f + f_c) - S_N(f - f_c)] for |f| ≤ B, and zero otherwise.
7) If the narrowband noise is Gaussian and its power spectral density is symmetric about the mid-band frequency f_c, then the components are statistically independent.

Narrowband Noise (2)
[Figure: extraction and generation of the in-phase and quadrature components]

Narrowband Noise (3)
Envelope and phase components: n(t) = r(t) cos(2πf_c t + ψ(t)), with envelope r(t) = [n_I²(t) + n_Q²(t)]^{1/2} and phase ψ(t) = tan^{-1}(n_Q(t)/n_I(t)).
For Gaussian narrowband noise, the phase ψ has a uniform distribution over [0, 2π], and the envelope r has a Rayleigh distribution.
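The Rayleigh-envelope / uniform-phase result can be checked by simulating the in-phase and quadrature components as independent zero-mean Gaussians (σ = 1 is an assumption for this sketch):

```python
import numpy as np

rng = np.random.default_rng(7)

n = 200_000
nI = rng.normal(0.0, 1.0, n)              # in-phase component
nQ = rng.normal(0.0, 1.0, n)              # quadrature component
r = np.hypot(nI, nQ)                      # envelope sqrt(nI^2 + nQ^2)
psi = np.arctan2(nQ, nI)                  # phase

# Rayleigh (sigma = 1): E[r] = sqrt(pi/2), E[r^2] = 2; phase: uniform, zero mean.
assert abs(r.mean() - np.sqrt(np.pi / 2)) < 0.01
assert abs(np.mean(r ** 2) - 2.0) < 0.02
assert abs(psi.mean()) < 0.02
```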

Narrowband Noise (4)
Rayleigh distribution: f_R(r) = (r/σ²) exp(-r²/2σ²), r ≥ 0.
Normalized form (v = r/σ): f_V(v) = v exp(-v²/2), v ≥ 0.

Sine-Wave plus Narrowband Noise (1)
x(t) = A cos(2πf_c t) + n(t) = [A + n_I(t)] cos(2πf_c t) - n_Q(t) sin(2πf_c t)
The envelope r(t) = {[A + n_I(t)]² + n_Q²(t)}^{1/2} follows the Rician distribution.

Sine-Wave plus Narrowband Noise (2)
Rician distribution: f_R(r) = (r/σ²) exp(-(r² + A²)/2σ²) I_0(Ar/σ²), r ≥ 0, where I_0 is the modified Bessel function of the first kind of order zero.
Normalized form (v = r/σ, a = A/σ): f_V(v) = v exp(-(v² + a²)/2) I_0(av).
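The Rician result can be checked the same way as the Rayleigh case, adding a sine-wave amplitude A to the in-phase component (σ = 1 and A = 3 are illustrative values for this sketch); the empirical envelope histogram is compared against f_V(v) = v exp(-(v² + a²)/2) I_0(av):

```python
import numpy as np

rng = np.random.default_rng(8)

A, n = 3.0, 300_000
nI = rng.normal(0.0, 1.0, n)
nQ = rng.normal(0.0, 1.0, n)
r = np.hypot(A + nI, nQ)                  # envelope of sine wave plus noise

# Second moment of a Rician envelope (sigma = 1): E[r^2] = A^2 + 2.
assert abs(np.mean(r ** 2) - (A ** 2 + 2.0)) < 0.1

# Histogram vs. the normalized Rician pdf f(v) = v exp(-(v^2 + A^2)/2) I0(A v):
edges = np.linspace(0.0, 8.0, 41)
hist, _ = np.histogram(r, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
pdf = centers * np.exp(-(centers ** 2 + A ** 2) / 2) * np.i0(A * centers)
assert np.max(np.abs(hist - pdf)) < 0.02
```

Setting A = 0 reduces the same check to the Rayleigh case, since I_0(0) = 1.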