
1 Data Communications (40-883): Random Processes. Second semester, 1392-93. Afshin Hemmatyar, Department of Computer Engineering

2 Random Process: Introduction; Mathematical Definition; Stationary Process; Mean, Correlation, and Covariance; Ergodic Process; Random Process through LTI Filter; Power Spectral Density; Gaussian Process; White Noise; Narrowband Noise

3 Introduction Deterministic model – no uncertainty about the time-dependent behavior at any instant of time. Stochastic (random) model – only the probability that a future value lies between two specified limits can be given. Example: Received signal = Information-bearing signal + Interference + Channel noise

4 Mathematical Definition (1) – Each outcome of the experiment is associated with a "sample point". – The set of all possible outcomes of the experiment is called the "sample space". – A function of time is assigned to each sample point: X(t, s), -T ≤ t ≤ T, where 2T is the total observation interval. – "Sample function" of the random process: x_j(t) = X(t, s_j). – Random variables: {x_1(t_k), x_2(t_k), ..., x_n(t_k)} = {X(t_k, s_1), X(t_k, s_2), ..., X(t_k, s_n)}
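The definitions above can be sketched numerically. In this minimal illustration (the process X(t) = A·cos(2πt) and all names are assumptions for the example, not from the slides), each row of the array is one sample function x_j(t), and fixing a column gives the random variable X(t_k):

```python
import numpy as np

# Illustrative process: X(t, s) = A(s) * cos(2*pi*t), where the amplitude A
# is drawn once per sample point s_j. Each row is one sample function x_j(t).
rng = np.random.default_rng(0)
n_outcomes = 4                       # sample points s_1, ..., s_n
T = 1.0
t = np.linspace(-T, T, 201)          # observation interval [-T, T] of length 2T
A = rng.normal(size=(n_outcomes, 1))
ensemble = A * np.cos(2 * np.pi * t) # row j: sample function x_j(t)

# The random variable X(t_k): one column of the ensemble (here t_k = 0).
X_tk = ensemble[:, 100]
print(ensemble.shape, X_tk)
```

Reading across a row traverses time for one outcome; reading down a column collects the values {X(t_k, s_1), ..., X(t_k, s_n)}.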

5 Mathematical Definition (2) An ensemble of sample functions

6 Mathematical Definition (3) – Random process X(t): "An ensemble of time functions together with a probability rule that assigns a probability to any meaningful event associated with an observation of one of the sample functions of the random process." For a random variable, the outcome of a random experiment is mapped into a number. For a random process, the outcome of a random experiment is mapped into a waveform that is a function of time.

7 Stationary Process (1) Strictly stationary: F_{X(t_1+τ),...,X(t_k+τ)}(x_1, ..., x_k) = F_{X(t_1),...,X(t_k)}(x_1, ..., x_k), where F is the joint distribution function. "A random process X(t), initiated at time t = -∞, is strictly stationary if the joint distribution of any set of random variables obtained by observing the random process X(t) is invariant with respect to the location of the origin t = 0."

8 Stationary Process (2) Strictly stationary: F_{X(t_1+τ),...,X(t_k+τ)}(x_1, ..., x_k) = F_{X(t_1),...,X(t_k)}(x_1, ..., x_k), where F is the joint distribution function. 1) k = 1: F_{X(t+τ)}(x) = F_{X(t)}(x) = F_X(x) for all t and τ. The first-order distribution function of a stationary process is independent of time. 2) k = 2 and τ = -t_1: F_{X(t_1),X(t_2)}(x_1, x_2) = F_{X(0),X(t_2-t_1)}(x_1, x_2) for all t_1 and t_2. The second-order distribution function of a stationary process depends only on the time difference between the observation times.

9 Stationary Process (3) Example:

10 Mean "Expectation of the random variable obtained by observing the process at some time t": μ_X(t) = E[X(t)] = ∫ x f_{X(t)}(x) dx, where f_{X(t)}(x) is the first-order probability density function of the process. The mean of a strictly stationary process is a constant: μ_X(t) = μ_X for all t.

11 Correlation "Expectation of the product of two random variables X(t_1) and X(t_2), obtained by observing the process X(t) at times t_1 and t_2": R_X(t_1, t_2) = E[X(t_1)X(t_2)] = ∫∫ x_1 x_2 f_{X(t_1),X(t_2)}(x_1, x_2) dx_1 dx_2, where f_{X(t_1),X(t_2)}(x_1, x_2) is the second-order probability density function of the process. Autocorrelation of a strictly stationary process: R_X(t_1, t_2) = R_X(t_2 - t_1) for all t_1 and t_2.

12 Covariance Autocovariance: C_X(t_1, t_2) = E[(X(t_1) - μ_X)(X(t_2) - μ_X)] = R_X(t_2 - t_1) - μ_X². Points: 1) The mean and autocorrelation functions provide only a partial description of the distribution of a random process. 2) The conditions of the equations for the mean and autocorrelation are not sufficient to guarantee that the random process X(t) is strictly stationary.
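The mean, autocorrelation, and autocovariance can all be estimated by ensemble averaging. A minimal sketch, using an assumed stationary process X(t) = μ + cos(2πt + Θ) with Θ uniform on [0, 2π) (chosen for the example; its theoretical autocovariance is 0.5·cos(2π(t_2 - t_1))), and checking that C_X = R_X - μ_X²:

```python
import numpy as np

rng = np.random.default_rng(1)
n, mu = 200_000, 2.0
theta = rng.uniform(0.0, 2 * np.pi, size=n)   # one draw per sample point

t1, t2 = 0.3, 0.8
x1 = mu + np.cos(2 * np.pi * t1 + theta)      # X(t1) across the ensemble
x2 = mu + np.cos(2 * np.pi * t2 + theta)      # X(t2) across the ensemble

mean_est = x1.mean()                          # estimate of mu_X
R12 = np.mean(x1 * x2)                        # estimate of R_X(t1, t2)
C12 = np.mean((x1 - mean_est) * (x2 - mean_est))  # estimate of C_X(t1, t2)

# Theory for this process: C_X = 0.5*cos(2*pi*(t2 - t1)) = -0.5 here,
# and R_X = C_X + mu_X^2.
print(mean_est, R12, C12)
```

Because the process depends on t_1 and t_2 only through t_2 - t_1, repeating the estimate with shifted times gives the same values, consistent with stationarity.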

13 Autocorrelation Properties R_X(τ) = E[X(t+τ)X(t)] for all t. 1) R_X(0) = E[X²(t)] (mean-square value of the process) 2) R_X(τ) = R_X(-τ) (even function of τ) 3) |R_X(τ)| ≤ R_X(0) (maximum magnitude at τ = 0). Proof of 3): E[(X(t+τ) ± X(t))²] ≥ 0 ⇒ E[X²(t+τ)] ± 2E[X(t+τ)X(t)] + E[X²(t)] ≥ 0 ⇒ 2R_X(0) ± 2R_X(τ) ≥ 0 ⇒ -R_X(0) ≤ R_X(τ) ≤ R_X(0)
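The three properties can be checked numerically. A small sketch (the random-phase sinusoid is an assumed example process; its theoretical autocorrelation is R_X(τ) = 0.5·cos(2πτ)):

```python
import numpy as np

# Example process: X(t) = cos(2*pi*t + Theta), Theta ~ Uniform[0, 2*pi).
rng = np.random.default_rng(2)
theta = rng.uniform(0.0, 2 * np.pi, size=100_000)

def R(tau, t=0.1):
    """Ensemble estimate of R_X(tau) = E[X(t + tau) X(t)]."""
    return np.mean(np.cos(2 * np.pi * (t + tau) + theta)
                   * np.cos(2 * np.pi * t + theta))

taus = np.linspace(-1.0, 1.0, 41)
Rvals = np.array([R(tau) for tau in taus])
print(R(0.0))   # mean-square value, close to 0.5 for this process
```

Property 1 gives R(0) ≈ 0.5, property 2 gives R(τ) ≈ R(-τ), and property 3 holds since every |R(τ)| stays below R(0).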

14 Autocorrelation Example 1

15 Autocorrelation Example 2 (1)

16 Autocorrelation Example 2 (2)

17 Cross-Correlation (1) Correlation matrix of X(t) and Y(t), stationary and jointly stationary. The cross-correlation function is neither an even function of τ nor does it necessarily have its maximum at the origin, but it does have the symmetry R_XY(τ) = R_YX(-τ).
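The symmetry can be verified on an assumed example pair X(t) = cos(2πt + Θ), Y(t) = sin(2πt + Θ), whose theoretical cross-correlation R_XY(τ) = -0.5·sin(2πτ) is odd and vanishes at the origin, illustrating both claims:

```python
import numpy as np

rng = np.random.default_rng(10)
theta = rng.uniform(0.0, 2 * np.pi, size=200_000)

def R_XY(tau):
    """Ensemble estimate of E[X(t + tau) Y(t)], evaluated at t = 0."""
    return np.mean(np.cos(2 * np.pi * tau + theta) * np.sin(theta))

def R_YX(tau):
    """Ensemble estimate of E[Y(t + tau) X(t)], evaluated at t = 0."""
    return np.mean(np.sin(2 * np.pi * tau + theta) * np.cos(theta))

print(R_XY(0.1), R_YX(-0.1))   # the two sides of R_XY(tau) = R_YX(-tau)
```

Note R_XY(0) ≈ 0 here: unlike the autocorrelation, the cross-correlation need not peak at τ = 0.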

18 Cross-Correlation (2)

19 Ergodic Process (1) The DC value (time average) of a single sample function x(t) converging to the mean of the process X(t) ⇒ ergodic in the mean.

20 Ergodic Process (2) The time-averaged autocorrelation of a sample function converging to R_X(τ) ⇒ ergodic in the autocorrelation. Note: computing the time-averaged mean and autocorrelation requires that the process be stationary.
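A minimal sketch of ergodicity, assuming the random-phase sinusoid as the example process: a single sample function's time averages reproduce the ensemble values μ_X = 0 and R_X(τ) = 0.5·cos(2πτ):

```python
import numpy as np

rng = np.random.default_rng(3)
theta = rng.uniform(0.0, 2 * np.pi)     # ONE outcome -> one sample function
T, dt = 500.0, 0.01
t = np.arange(-T, T, dt)
x = np.cos(2 * np.pi * t + theta)       # the single sample function x(t)

time_mean = x.mean()                    # time average -> mu_X = 0
lag = 25                                # tau = 0.25 (25 samples of dt = 0.01)
time_R = np.mean(x[lag:] * x[:-lag])    # -> R_X(0.25) = 0.5*cos(pi/2) = 0
time_R0 = np.mean(x * x)                # -> R_X(0) = 0.5

print(time_mean, time_R, time_R0)
```

The same code with a nonstationary process (e.g. a drifting mean) would give time averages that depend on the observation window, which is why the note above requires stationarity.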

21 Random Process through LTI Filter
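A discrete-time sketch of a random process passing through an LTI filter (the FIR taps are illustrative assumptions): for white input of variance σ², the output variance is σ²·Σh²[k], a special case of the general relation between input and output autocorrelations:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma2 = 1.0
x = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)  # white input process
h = np.array([0.5, 0.3, 0.2])                         # assumed FIR impulse response
y = np.convolve(x, h, mode="valid")                   # LTI filter output

# For white input: Var[y] = sigma^2 * sum(h^2) = 1.0 * 0.38 here.
print(y.var(), sigma2 * np.sum(h**2))
```

The output is no longer white: adjacent output samples share input samples, so R_Y(τ) is nonzero for |τ| up to the filter length.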

22 Power Spectral Density (1)

23 Power Spectral Density (2) An example:

24 Power Spectral Density (3) Properties (1)-(5); in particular, the normalized power spectral density has the properties of a probability density function.

25 Power Spectral Density (4)

26 Power Spectral Density (5)

27 Power Spectral Density (6)

28 Power Spectral Density (7) If a sample function is Fourier transformable, the squared magnitude of its transform leads to the periodogram estimate of the power spectral density.

29 Power Spectral Density (8) Cross-spectral densities and cross-correlations:

30 Gaussian Process (1) Linear functional of X(t); normalization ⇒ Gaussian probability density function:

31 Gaussian Process (2) X_i, i = 1, 2, ..., N, is a set of random variables that satisfy: 1) The X_i are statistically independent. 2) The X_i have the same probability distribution, with mean μ_X and variance σ_X². (An independent and identically distributed (i.i.d.) set of random variables.) The central limit theorem states that the probability distribution of the normalized sum V_N approaches the normalized Gaussian distribution N(0, 1) in the limit as the number of random variables N approaches infinity.
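The central limit theorem is easy to see numerically. A sketch, assuming Uniform[0, 1) for the i.i.d. X_i (any distribution with finite variance would do): the normalized sum V_N has mean ≈ 0, variance ≈ 1, and close to 68.3% of its mass within one standard deviation, as N(0, 1) predicts:

```python
import numpy as np

rng = np.random.default_rng(6)
N, trials = 500, 10_000
mu_X, var_X = 0.5, 1.0 / 12.0            # mean and variance of Uniform[0, 1)

X = rng.uniform(size=(trials, N))        # trials independent i.i.d. sequences
# Normalized sum: V_N = (sum X_i - N*mu_X) / sqrt(N*var_X)
V_N = (X.sum(axis=1) - N * mu_X) / np.sqrt(N * var_X)

frac_within_1sigma = np.mean(np.abs(V_N) < 1.0)   # ~0.683 for N(0, 1)
print(V_N.mean(), V_N.var(), frac_within_1sigma)
```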

32 Gaussian Process (3) Property 1:

33 Gaussian Process (4) Property 2:

34 Gaussian Process (5) Property 3: Property 4:

35 Noise (1) Shot noise arises in electronic devices such as diodes and transistors because of the discrete nature of current flow in these devices. h(t) is the waveform of a current pulse; ν is the number of electrons emitted between t and t + t_0 ⇒ Poisson distribution.

36 Noise (2) Thermal noise is the electrical noise arising from the random motion of the electrons in a conductor.

37 Noise (3) White noise is an idealized form of noise adopted for ease of analysis. T_e, the equivalent noise temperature of a system, is defined as the temperature at which a noisy resistor has to be maintained such that, when the resistor is connected to the input of a noiseless version of the system, it produces the same available noise power at the output as that produced by all the sources of noise in the actual system.

38 Noise (4) According to the autocorrelation function, any two different samples of white noise, no matter how close together in time they are taken, are uncorrelated. If the white noise is also Gaussian, then the two samples are statistically independent; white Gaussian noise represents the ultimate in randomness. White noise has infinite average power and, as such, is not physically realizable. The utility of the white noise process parallels that of the impulse (delta) function in the analysis of linear systems.
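The uncorrelated-samples property can be sketched in discrete time (a discrete-time stand-in for the ideal continuous-time process): the sample autocorrelation of white Gaussian noise is approximately a delta, nonzero only at lag 0:

```python
import numpy as np

rng = np.random.default_rng(7)
w = rng.normal(0.0, 1.0, size=200_000)   # unit-variance white Gaussian noise

def sample_R(lag):
    """Time-averaged autocorrelation estimate at an integer lag."""
    return np.mean(w[lag:] * w[:len(w) - lag])

# Approximately [1, 0, 0, 0]: power at lag 0, no correlation elsewhere.
print([sample_R(k) for k in range(4)])
```

For Gaussian samples, zero correlation implies statistical independence, which is the sense in which white Gaussian noise is "the ultimate in randomness".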

39 Noise (5)

40 Noise (6)

41 Narrowband Noise (1) In-phase and quadrature components. Properties: 1) Both components have zero mean. 2) If the narrowband noise is Gaussian, then both components are jointly Gaussian. 3) If the narrowband noise is stationary, then both components are jointly stationary. 4) Both components have the same power spectral density. 5) Both components have the same variance as the narrowband noise. 6) The cross-spectral density of the components is purely imaginary. 7) If the narrowband noise is Gaussian and its power spectral density is symmetric about the mid-band frequency, then the components are statistically independent.
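Property 5 can be sketched by synthesizing narrowband noise from its components, n(t) = n_I(t)·cos(2πf_c·t) - n_Q(t)·sin(2πf_c·t). All numbers below (sample rate, carrier, hold length) are assumptions for the illustration; the I/Q components are made slowly varying by holding each Gaussian draw for many carrier cycles:

```python
import numpy as np

rng = np.random.default_rng(11)
sigma = 1.5
fs, fc = 1000.0, 100.0                  # sample rate and carrier (assumed)
# Lowpass-like I/Q components: hold each Gaussian sample for 50 ticks
nI = np.repeat(rng.normal(0.0, sigma, 4000), 50)
nQ = np.repeat(rng.normal(0.0, sigma, 4000), 50)

t = np.arange(nI.size) / fs
n = nI * np.cos(2 * np.pi * fc * t) - nQ * np.sin(2 * np.pi * fc * t)

# Property 5: the narrowband noise has the same variance as each component.
print(n.var(), sigma**2)
```

Averaging cos² and sin² over whole carrier cycles each contribute half the power, so the two component variances add up with weight ½ each to σ².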

42 Narrowband Noise (2) In-phase and quadrature components

43 Narrowband Noise (3) Envelope and phase components: the phase is uniformly distributed and the envelope is Rayleigh distributed.

44 Narrowband Noise (4) Rayleigh distribution, normalized form:
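A short numerical sketch of the Rayleigh result: the envelope r = sqrt(n_I² + n_Q²) of zero-mean Gaussian I/Q components (each with variance σ²; σ = 2 is an arbitrary choice) has the Rayleigh moments E[r] = σ·sqrt(π/2) and E[r²] = 2σ²:

```python
import numpy as np

rng = np.random.default_rng(8)
sigma = 2.0
nI = rng.normal(0.0, sigma, size=500_000)   # in-phase component samples
nQ = rng.normal(0.0, sigma, size=500_000)   # quadrature component samples

envelope = np.hypot(nI, nQ)                 # Rayleigh-distributed envelope
phase = np.arctan2(nQ, nI)                  # uniform over (-pi, pi]

print(envelope.mean(), sigma * np.sqrt(np.pi / 2))   # should agree
print((envelope**2).mean(), 2 * sigma**2)            # should agree
```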

45 Sine-Wave plus Narrowband Noise (1) ⇒ Rician distribution

46 Sine-Wave plus Narrowband Noise (2) Rician distribution, normalized form:
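The Rician case can be sketched the same way: adding a constant A (the sine-wave amplitude, placed on the in-phase axis; A = 3, σ = 1 are assumed values) shifts the envelope distribution from Rayleigh to Rician, with mean-square value A² + 2σ²:

```python
import numpy as np

rng = np.random.default_rng(9)
A, sigma = 3.0, 1.0
nI = rng.normal(0.0, sigma, size=500_000)   # in-phase noise component
nQ = rng.normal(0.0, sigma, size=500_000)   # quadrature noise component

r = np.hypot(A + nI, nQ)                    # Rician-distributed envelope

# E[r^2] = A^2 + 2*sigma^2: signal power plus both noise-component powers.
print((r**2).mean(), A**2 + 2 * sigma**2)
```

As A/σ grows, the Rician density concentrates near A and approaches a Gaussian; as A → 0 it reduces to the Rayleigh case of the previous slide.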

