
Chapter 7: Generating and Processing Random Signals. Group 1: 蔡馭理 (EE, B93902016), 林宜鴻 (CS, B93902076).



Presentation transcript:

1 Chapter 7: Generating and Processing Random Signals (Group 1: 蔡馭理, EE, B93902016; 林宜鴻, CS, B93902076)

2 Outline
- Stationary and ergodic processes
- Uniform random number generators
- Mapping uniform RVs to an arbitrary pdf
- Generating uncorrelated Gaussian RVs
- Generating correlated Gaussian RVs
- PN sequence generators
- Signal processing

3 Random Number Generators
Simulations need sources of noise and interference. A random number generator is a computational or physical device designed to generate a sequence of numbers or symbols that lacks any pattern, i.e., appears random (a pseudo-random sequence). MATLAB: rand(m,n), randn(m,n).

4 Stationary and Ergodic Processes
- Strict-sense stationary (SSS), wide-sense stationary (WSS): SSS implies WSS; for Gaussian processes, WSS also implies SSS.
- Time average vs. ensemble average: the ergodicity requirement is that ensemble averages coincide with time averages.
- Sample functions generated to represent signals, noise, and interference should be ergodic.

5 Time Average vs. Ensemble Average
Time average: <x> = lim_{T->inf} (1/2T) * integral_{-T}^{T} x(t) dt. Ensemble average: E[X] = integral x f_X(x) dx.

6 Example 7.1 (N = 100)

7 Uniform Random Number Generators
Goal: generate a random variable uniformly distributed on the interval (0,1). Approach: generate a sequence of integers between 0 and M and then divide each element of the sequence by M. The most common technique is the linear congruential generator (LCG).

8 Linear Congruential Generators
An LCG is defined by the recursion x_{i+1} = [a x_i + c] mod m, where x_0 is the seed of the generator and a, c, m, x_0 are integers. The desirable property is a full period.

9 Technique A: The Mixed Congruential Algorithm
The mixed linear algorithm takes the form x_{i+1} = [a x_i + c] mod m, with:
- c not equal to 0 and relatively prime to m
- a - 1 a multiple of every prime factor p of m
- a - 1 a multiple of 4 if m is a multiple of 4

10 Example 7.4
m = 5000 = (2^3)(5^4); c = (3^3)(7^2) = 1323, which is relatively prime to m. a - 1 must be a multiple of 2, a multiple of 5, and a multiple of 4, so take a - 1 = 40k. With k = 6 we have a = 241, giving x_{i+1} = [241 x_i + 1323] mod 5000. One can verify that the period is 5000, so the generator is full period.
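The recursion above is easy to sketch in code. This is an illustrative Python version (not from the text), used here to confirm that the Example 7.4 parameters give a full period:

```python
def lcg_next(x, a, c, m):
    # One step of the mixed LCG: x_{i+1} = (a*x_i + c) mod m
    return (a * x + c) % m

def lcg_period(seed, a, c, m):
    # Count steps until the state returns to the seed; full period means m steps
    x = lcg_next(seed, a, c, m)
    count = 1
    while x != seed:
        x = lcg_next(x, a, c, m)
        count += 1
    return count
```

With the Example 7.4 parameters, `lcg_period(1, 241, 1323, 5000)` returns 5000, and because a full-period generator cycles through every residue, the same holds for any seed.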

11 Technique B: The Multiplicative Algorithm with Prime Modulus
The multiplicative generator is defined as x_{i+1} = [a x_i] mod m, where:
- m is prime (usually large)
- a is a primitive element mod m, i.e., (a^{m-1} - 1)/m is an integer but (a^i - 1)/m is not an integer for i = 1, 2, 3, ..., m-2

12 Technique C: The Multiplicative Algorithm with Nonprime Modulus
The most important case of this generator has m equal to a power of two: x_{i+1} = [a x_i] mod 2^n. The maximum period is 2^n/4 = 2^{n-2}, and the period is achieved if:
- the multiplier a satisfies a mod 8 = 3 or 5
- the seed x_0 is odd

13 Example: Multiplicative Algorithm with Nonprime Modulus
a = 3, c = 0, m = 16, x_0 = 1
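An illustrative Python sketch (not from the text) of this example shows the maximum period 2^{n-2} = 4 directly:

```python
def mult_sequence(a, m, seed, n):
    # Multiplicative generator x_{i+1} = a*x_i mod m (c = 0); returns n states
    xs, x = [], seed
    for _ in range(n):
        x = (a * x) % m
        xs.append(x)
    return xs
```

`mult_sequence(3, 16, 1, 8)` gives `[3, 9, 11, 1, 3, 9, 11, 1]`: the state cycles with period 4 = 2^(4-2), as the slide's parameters predict. The multiplier a = 5 also achieves the maximum period.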

14 Testing Random Number Generators
The chi-square test, spectral test, etc., test the randomness of a given sequence. Scatterplots: a plot of x_{i+1} as a function of x_i. Durbin-Watson test: next slides.

15 Scatterplots: Example 7.5
(i) rand(1,2048)
(ii) x_{i+1} = [65 x_i + 1] mod 2048
(iii) x_{i+1} = [1229 x_i + 1] mod 2048

16 Durbin-Watson Test (1)
Let X = X[n] and Y = X[n-1], and define D = E{(X - Y)^2} / E{X^2}. Assume X[n] and X[n-1] are correlated and that X[n] is an ergodic process, so the expectations can be estimated by time averages.

17 Durbin-Watson Test (2)
Writing Y = pX + Z, where X and Z are uncorrelated and zero mean and p is the correlation coefficient, gives D = 2(1 - p).
- D > 2: negative correlation
- D = 2: no correlation (most desired)
- D < 2: positive correlation

18 Example 7.6
- rand(1,2048): D = 2.0081, p = 0.0041
- x_{i+1} = [65 x_i + 1] mod 2048: D = 1.9925, p = 0.0037273
- x_{i+1} = [1229 x_i + 1] mod 2048: D = 1.6037, p = 0.19814
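The D statistic and the estimate p = 1 - D/2 can be computed as follows. This is an illustrative Python sketch; the sequence is centered (made zero mean) before computing D, an assumption consistent with Example 7.6 reporting D close to 2 for rand(1,2048):

```python
import random

def durbin_watson(seq):
    # D = sum (x[n] - x[n-1])^2 / sum x[n]^2, on the zero-mean version of seq
    mean = sum(seq) / len(seq)
    x = [v - mean for v in seq]
    num = sum((x[n] - x[n - 1]) ** 2 for n in range(1, len(x)))
    den = sum(v * v for v in x)
    return num / den

random.seed(0)
d = durbin_watson([random.random() for _ in range(2048)])
rho = 1 - d / 2        # positive rho indicates positive correlation
```

For a good uniform generator, d lands near 2 and rho near 0, matching the first row of Example 7.6.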

19 Minimum Standards
- Full period
- Passes all applicable statistical tests for randomness
- Easily transportable from one computer to another
The Lewis, Goodman, and Miller minimum standard (used by MATLAB prior to version 5): x_{i+1} = [16807 x_i] mod (2^31 - 1)
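The minimum-standard generator is a multiplicative generator with prime modulus, so it is a one-liner per step. This illustrative Python sketch keeps the integer states (divide by m for uniforms on (0,1)); Park and Miller's published check value, 1043618065 after 10,000 steps from seed 1, is a convenient correctness check:

```python
def minstd(seed, n):
    # Lewis-Goodman-Miller generator: x_{i+1} = 16807 * x_i mod (2^31 - 1)
    m = 2**31 - 1
    x, states = seed, []
    for _ in range(n):
        x = (16807 * x) % m
        states.append(x)
    return states   # states[i] / m gives a uniform draw on (0, 1)
```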

20 Mapping Uniform RVs to an Arbitrary pdf
- The CDF of the target random variable is known in closed form: inverse transform method
- The pdf of the target random variable is known in closed form but the CDF is not: rejection method
- Neither the pdf nor the CDF is known in closed form: histogram method

21 Inverse Transform Method
The CDF F_X(x) is known in closed form. Set U = F_X(X), so X = F_X^{-1}(U). Then Pr{X <= x} = Pr{F_X^{-1}(U) <= x} = Pr{U <= F_X(x)} = F_X(x), so X has the desired CDF.

22 Example 7.8 (1)
Rayleigh random variable with pdf f_R(r) = (r/sigma^2) exp(-r^2/(2 sigma^2)), r >= 0, and CDF F_R(r) = 1 - exp(-r^2/(2 sigma^2)). Set F_R(R) = U.

23 Example 7.8 (2)
Since the RV 1 - U is equivalent to U (same pdf), solving for R gives R = sigma * sqrt(-2 ln U). MATLAB: [n,xout] = hist(Y,nbins) followed by bar(xout,n) plots the histogram.
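The closed-form inversion can be exercised directly. An illustrative Python sketch (sigma is the Rayleigh parameter; 1 - U is used so the log argument stays in (0,1]):

```python
import math, random

def rayleigh_sample(sigma, rng=random):
    # Inverse transform: R = sigma * sqrt(-2 ln U); 1 - random() lies in (0, 1]
    u = 1.0 - rng.random()
    return sigma * math.sqrt(-2.0 * math.log(u))

random.seed(1)
samples = [rayleigh_sample(1.0) for _ in range(100000)]
mean = sum(samples) / len(samples)   # should be near sigma * sqrt(pi/2)
```

The sample mean approaches sigma * sqrt(pi/2), the known Rayleigh mean, confirming the mapping.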

24 Example 7.8 (3)

25 The Histogram Method
The CDF and pdf are unknown. From a histogram, P_i = Pr{x_{i-1} < X <= x_i} = c_i (x_i - x_{i-1}), and within bin i the CDF is F_X(x) = F_{i-1} + c_i (x - x_{i-1}). Setting F_X(X) = U and solving gives X = x_{i-1} + (U - F_{i-1})/c_i. More samples, more accuracy!
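The bin-wise inversion above can be sketched as follows (illustrative Python, not from the text; `edges` and `counts` are a hypothetical histogram of the observed data):

```python
import bisect, random

def histogram_sampler(edges, counts, rng=random):
    # Invert the piecewise-linear CDF built from histogram bins.
    total = float(sum(counts))
    cdf, acc = [], 0.0
    for c in counts:
        acc += c / total
        cdf.append(acc)                      # F_i at the right edge of bin i
    u = rng.random()
    i = min(bisect.bisect_left(cdf, u), len(counts) - 1)
    f_prev = cdf[i - 1] if i > 0 else 0.0    # F_{i-1}
    c_i = (counts[i] / total) / (edges[i + 1] - edges[i])   # pdf height in bin i
    return edges[i] + (u - f_prev) / c_i     # X = x_{i-1} + (U - F_{i-1}) / c_i
```

With two equal bins on (0,2) the result is uniform on (0,2), a quick sanity check of the inversion.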

26 Rejection Method (1)
Given a target pdf f_X(x), choose a bounding function M g_X(x) >= f_X(x) for all x.

27 Rejection Method (2)
1. Generate U_1 and U_2, uniform on (0,1).
2. Form V_1 = a U_1, uniform on (0,a), where a is the maximum value of X, and V_2 = b U_2, uniform on (0,b), where b is at least the maximum value of f_X(x).
3. If V_2 <= f_X(V_1), set X = V_1. If the inequality is not satisfied, discard V_1 and V_2 and repeat from step 1.
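The three steps above can be sketched in Python. This is illustrative only; the triangular pdf f(x) = 2x on (0,1) is a hypothetical target (not Example 7.9), chosen because its mean 2/3 is easy to check:

```python
import random

def rejection_sample(pdf, a, b, rng=random):
    # a: maximum value of X; b: at least the maximum of pdf over (0, a)
    while True:
        v1 = a * rng.random()       # candidate, uniform on (0, a)
        v2 = b * rng.random()       # test level, uniform on (0, b)
        if v2 <= pdf(v1):
            return v1               # accept; otherwise discard and retry

random.seed(3)
tri = lambda x: 2.0 * x             # hypothetical target pdf on (0, 1), E[X] = 2/3
xs = [rejection_sample(tri, 1.0, 2.0) for _ in range(50000)]
```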

28 Example 7.9 (1)

29 Example 7.9 (2)

30 Generating Uncorrelated Gaussian RVs
The Gaussian CDF cannot be written in closed form, so the inverse transform method cannot be used, and rejection methods are not efficient. Other techniques:
1. The sum-of-uniforms method
2. Mapping a Rayleigh RV to Gaussian RVs
3. The polar method

31 The Sum-of-Uniforms Method (1)
By the central limit theorem, Y = sigma * sqrt(12/N) * sum_{i=1}^{N} (U_i - 1/2), where the U_i are independent uniform RVs on (0,1) and sigma is a constant that sets the variance of Y, converges to a Gaussian RV as N grows.

32 The Sum-of-Uniforms Method (2)
Expectation and variance: E[Y] = 0 and Var(Y) = sigma^2, which we can set to any desired value. The pdf of Y is nonzero only for |y| <= sigma * sqrt(3N).

33 The Sum-of-Uniforms Method (3)
The result is only approximately Gaussian: the tails are truncated at +/- sigma * sqrt(3N), which may not be a realistic situation.
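The method on the last three slides can be sketched as follows (illustrative Python; N = 12 is a common choice because sqrt(12/N) = 1, an assumption rather than something the text fixes):

```python
import math, random

def sum_of_uniforms(sigma, n=12, rng=random):
    # Y = sigma * sqrt(12/N) * sum_{i=1}^{N} (U_i - 1/2); approximately N(0, sigma^2)
    s = sum(rng.random() - 0.5 for _ in range(n))
    return sigma * math.sqrt(12.0 / n) * s

random.seed(4)
ys = [sum_of_uniforms(1.0) for _ in range(50000)]
```

The sample mean is near 0 and the sample variance near sigma^2, while every sample obeys |Y| <= sigma * sqrt(3N) = 6, making the truncated tails visible.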

34 Mapping a Rayleigh RV to Gaussian RVs (1)
A Rayleigh RV can be generated by R = sigma * sqrt(-2 ln U), where U is uniform on (0,1). Assume X and Y are independent zero-mean Gaussian RVs; their joint pdf is f_{XY}(x,y) = (1/(2 pi sigma^2)) exp(-(x^2 + y^2)/(2 sigma^2)).

35 Mapping a Rayleigh RV to Gaussian RVs (2)
Transform to polar coordinates: let x = r cos(theta) and y = r sin(theta). The joint pdf becomes f_{R,Theta}(r,theta) = (r/(2 pi sigma^2)) exp(-r^2/(2 sigma^2)).

36 Mapping a Rayleigh RV to Gaussian RVs (3)
Examine the marginal pdfs: f_R(r) = (r/sigma^2) exp(-r^2/(2 sigma^2)), so R is a Rayleigh RV, and f_Theta(theta) = 1/(2 pi) on (0, 2 pi), so Theta is a uniform RV.

37 The Polar Method
From the previous result, we may transform R = sigma * sqrt(-2 ln U_1) and Theta = 2 pi U_2 into X = R cos(Theta) and Y = R sin(Theta), two independent zero-mean Gaussian RVs with variance sigma^2.

38 The Polar Method: Algorithm
1. Generate two uniform RVs U_1 and U_2, both on the interval (0,1).
2. Let V_1 = 2 U_1 - 1 and V_2 = 2 U_2 - 1, so they are independent and uniform on (-1,1).
3. Let S = V_1^2 + V_2^2; if S < 1 continue, else go back to step 1.
4. Form the factor sqrt(-2 sigma^2 ln(S) / S).
5. Set X = V_1 * sqrt(-2 sigma^2 ln(S)/S) and Y = V_2 * sqrt(-2 sigma^2 ln(S)/S).
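The five steps above translate directly into code. An illustrative Python sketch (the rejection of points outside the unit circle is what makes the method trigonometry-free):

```python
import math, random

def polar_pair(sigma, rng=random):
    # Polar method: returns two independent N(0, sigma^2) samples per accepted point
    while True:
        v1 = 2.0 * rng.random() - 1.0
        v2 = 2.0 * rng.random() - 1.0
        s = v1 * v1 + v2 * v2
        if 0.0 < s < 1.0:                       # accept points inside the unit circle
            f = math.sqrt(-2.0 * math.log(s) / s)
            return sigma * v1 * f, sigma * v2 * f

random.seed(5)
pairs = [polar_pair(1.0) for _ in range(25000)]
samples = [p[0] for p in pairs] + [p[1] for p in pairs]
```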

39 Establishing a Given Correlation Coefficient (1)
Assume two Gaussian RVs X and Y that are zero mean and uncorrelated. Define a new RV Z = rho * X + sqrt(1 - rho^2) * Y. Z is also a Gaussian RV; we show that rho is the correlation coefficient relating X and Z.

40 Establishing a Given Correlation Coefficient (2)
Mean: E[Z] = rho * E[X] + sqrt(1 - rho^2) * E[Y] = 0. Variance: Var(Z) = rho^2 sigma^2 + (1 - rho^2) sigma^2 = sigma^2, where sigma^2 is the common variance of X and Y.

41 Establishing a Given Correlation Coefficient (3)
Covariance between X and Z: E[XZ] = rho * E[X^2] = rho * sigma^2, so the correlation coefficient is rho * sigma^2 / (sigma * sigma) = rho, as desired.
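The construction on the last three slides can be checked by simulation (illustrative Python; the target rho = 0.6 is an arbitrary choice for the check):

```python
import math, random

def correlated_gaussians(rho, sigma=1.0, rng=random):
    # Z = rho*X + sqrt(1 - rho^2)*Y has correlation rho with X
    x = rng.gauss(0.0, sigma)
    y = rng.gauss(0.0, sigma)
    return x, rho * x + math.sqrt(1.0 - rho * rho) * y

random.seed(6)
data = [correlated_gaussians(0.6) for _ in range(50000)]
```

The sample correlation of the (X, Z) pairs converges to the chosen rho.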

42 Pseudonoise (PN) Sequence Generators
- A PN generator produces a periodic sequence that appears to be random.
- It is generated by an algorithm using an initial seed.
- Although not truly random, the sequence can pass many tests of randomness.
- Unless the algorithm and seed are known, the sequence is impractical to predict.

43 PN Generator Implementation

44 Properties of the Linear Feedback Shift Register (LFSR)
- Nearly random with a long period.
- An N-stage LFSR may have the maximum period 2^N - 1; an output with this period is called a maximal-length sequence, or m-sequence.
- We define the generator polynomial as g(D) = g_0 + g_1 D + ... + g_N D^N; coefficients that generate an m-sequence can always be found.
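A Fibonacci-style LFSR is a few lines of code. This is an illustrative Python sketch (not from the text); the tap set [1, 4] corresponds to the primitive polynomial 1 + D + D^4, an assumed example that yields a 4-stage m-sequence of period 2^4 - 1 = 15:

```python
def lfsr(taps, seed, nbits):
    # Fibonacci LFSR: feedback is the XOR of the tapped stages (1-based indices);
    # the last stage is the output, and the feedback bit shifts in at the front.
    state = list(seed)
    out = []
    for _ in range(nbits):
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        out.append(state[-1])
        state = [fb] + state[:-1]
    return out

seq = lfsr([1, 4], [1, 0, 0, 0], 30)   # two periods of the length-15 m-sequence
```

One period contains 2^3 = 8 ones and 7 zeros, matching the m-sequence property on slide 48.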

45 Example of a PN Generator

46 Different Seeds for the PN Generator

47 Family of m-Sequences

48 Properties of m-Sequences
- One period of length N = 2^n - 1 has 2^{n-1} ones and 2^{n-1} - 1 zeros.
- The periodic autocorrelation of an m-sequence is R[m] = 1 for m = 0 (mod N) and R[m] = -1/N otherwise.
- If the PN sequence has a large period, the autocorrelation function approaches an impulse, and the PSD is approximately white, as desired.
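The two-valued autocorrelation can be checked numerically on a short m-sequence. Illustrative Python; the hard-coded bits below are one period of a 4-stage m-sequence (an assumed example), mapped to +/-1 chips:

```python
def periodic_autocorr(chips, lag):
    # Normalized periodic autocorrelation of a +/-1 sequence
    n = len(chips)
    return sum(chips[i] * chips[(i + lag) % n] for i in range(n)) / n

mseq_bits = [0, 0, 0, 1, 1, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1]   # one period, N = 15
chips = [1 - 2 * b for b in mseq_bits]                       # 0 -> +1, 1 -> -1
```

`periodic_autocorr(chips, 0)` is exactly 1, and every nonzero lag gives exactly -1/15, the two-valued shape described above.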

49 PN Autocorrelation Function

50 Signal Processing Relationships
1. Mean of input and output
2. Variance of input and output
3. Input-output cross-correlation
4. Autocorrelation and PSD

51 Input/Output Means
Assume the system is linear, so the output is the convolution y[n] = sum_k h[k] x[n-k]. Under the stationarity assumption, the means satisfy E[y] = E[x] * sum_k h[k].

52 Input/Output Cross-Correlation
The cross-correlation is defined by R_XY[m] = E{X[n] Y[n+m]}. This is used in the development of a number of performance estimators, which will be developed in Chapter 8.

53 Output Autocorrelation Function (1)
The autocorrelation of the output is R_YY[m] = sum_k sum_j h[k] h[j] R_XX[m + k - j]. This cannot be simplified without knowledge of the statistics of the input X[n].

54 Output Autocorrelation Function (2)
If the input is delta-correlated (i.e., white noise), R_XX[m] = sigma_X^2 * delta[m]. Substituting into the previous equation gives R_YY[m] = sigma_X^2 * sum_k h[k] h[k+m].

55 Input/Output Variances
By definition, sigma_Y^2 = R_YY[0] - E[Y]^2. Setting m = 0 in the output autocorrelation and substituting gives the output variance in terms of R_XX. If X[n] is a white-noise sequence, sigma_Y^2 = sigma_X^2 * sum_k h[k]^2.
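The white-noise result sigma_Y^2 = sigma_X^2 * sum h^2[n] can be checked by direct simulation. Illustrative Python with an arbitrary FIR impulse response (the coefficients are assumptions for the check, not from the text):

```python
import random

def fir_filter(h, x):
    # y[n] = sum_k h[k] * x[n-k], computed only where the filter is fully loaded
    return [sum(h[k] * x[n - k] for k in range(len(h)))
            for n in range(len(h) - 1, len(x))]

random.seed(7)
h = [0.5, 0.3, 0.2]                                  # arbitrary impulse response
x = [random.gauss(0.0, 1.0) for _ in range(50000)]   # white input, variance 1
y = fir_filter(h, x)
my = sum(y) / len(y)
var_y = sum((v - my) ** 2 for v in y) / len(y)       # near sum(h^2) = 0.38
```

The estimated output variance lands near 0.25 + 0.09 + 0.04 = 0.38, matching the formula.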

56 The End
Thanks for listening.


