DCSP-5: Noise Jianfeng Feng Department of Computer Science Warwick Univ., UK


1 DCSP-5: Noise Jianfeng Feng Department of Computer Science Warwick Univ., UK Jianfeng.feng@warwick.ac.uk http://www.dcs.warwick.ac.uk/~feng/dsp.html

2

3

4 Assignment

5 Question 8
load handel                      % provides the sample signal y (73113 x 1) and its sampling rate Fs
noise = zeros(1, 73113);         % preallocate
for i = 1:73113
    noise(i) = cos(100*i*0.1);   % sinusoidal interference, sample by sample
end
x = y + noise';                  % add the noise to the signal
sound(x)                         % listen to the result

6 Noise in communication systems: probability and random signals
I = imread('peppers.png');             % built-in test image
imshow(I);
noise = 1*randn(size(I));              % Gaussian noise, one value per pixel and colour channel
Noisy = imadd(I, im2uint8(noise));     % scale/clip the noise to uint8 and add it to the image
imshow(Noisy);

7 Noise in communication systems: probability and random signals Noise is a random signal. By this we mean that we cannot predict its value. We can only make statements about the probability of it taking a particular value, or range of values.

8 The probability density function (pdf) p(x) of a random signal, or random variable, x is defined through the probability that x takes a value between x_0 and x_0 + δx. We write this as follows: p(x_0) δx = P(x_0 < x < x_0 + δx)
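To make this definition concrete, p(x) can be estimated from samples with a normalized histogram, where the bin width plays the role of δx. A minimal MATLAB sketch (not from the original slides; the sample size is an arbitrary choice and only base MATLAB is used):

% Estimate the pdf of zero-mean, unit-variance Gaussian noise from samples.
% Each normalized bar height approximates p(x0) for the bin around x0.
n = randn(1e5, 1);                        % 100,000 noise samples
histogram(n, 'Normalization', 'pdf');     % empirical pdf estimate
hold on
x = linspace(-4, 4, 200);
plot(x, exp(-x.^2/2)/sqrt(2*pi), 'r');    % theoretical Gaussian pdf for comparison
hold off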

9 The probability that the random variable will take a value lying between x_1 and x_2 is then the integral of the pdf over the interval [x_1, x_2]: P(x_1 < x < x_2) = ∫_{x_1}^{x_2} p(x) dx

10 From the rules of integration, writing P(x) = ∫_{-∞}^{x} p(u) du for the (cumulative) distribution function: P(x_1 < x < x_2) = P(x_2) - P(x_1)

11 Continuous distribution. An example of a continuous distribution is the Normal, or Gaussian, distribution: p(x) = (1/(σ√(2π))) exp(-(x - m)²/(2σ²)), where m is the mean and σ is the standard deviation of p(x). The constant term 1/(σ√(2π)) ensures that the distribution is normalized.
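A quick numerical sanity check (a sketch, not part of the original slides; m = 0, σ = 1 and the interval [0.5, 2] are arbitrary choices): the pdf should integrate to 1, and the probability of landing in an interval should equal the difference of the distribution function at its endpoints, as on the previous slide.

% Gaussian pdf with m = 0, sigma = 1
p = @(x) exp(-x.^2/2) / sqrt(2*pi);
total = integral(p, -Inf, Inf)            % approximately 1 (normalization)
% P(x1 < x < x2) computed two ways: directly, and as P(x2) - P(x1)
x1 = 0.5;  x2 = 2;
direct = integral(p, x1, x2)
P = @(x) integral(p, -Inf, x);            % cumulative distribution function
viaCDF = P(x2) - P(x1)                    % agrees with 'direct'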

12 Continuous distribution. This expression is important, as many naturally occurring noise sources can be described by it, e.g. white noise or coloured noise.

13

14

15 Generating f(x) in MATLAB
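The slide itself showed code and figures; below is a minimal sketch of what generating and plotting the Gaussian density f(x) might look like (the particular means and standard deviations are arbitrary illustrative choices):

% Evaluate and plot the Gaussian pdf for a few (m, sigma) pairs
x = linspace(-6, 6, 500);
gauss = @(x, m, s) exp(-(x - m).^2 ./ (2*s^2)) ./ (s*sqrt(2*pi));
plot(x, gauss(x, 0, 1), 'b', ...          % m = 0, sigma = 1
     x, gauss(x, 0, 2), 'r', ...          % wider: sigma = 2
     x, gauss(x, 2, 1), 'g');             % shifted: m = 2
legend('m=0, \sigma=1', 'm=0, \sigma=2', 'm=2, \sigma=1')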

16 How would this be used? If we want to know the probability of, say, the noise signal n(t) taking a value in the interval [-v_1, v_1], we would evaluate: P(v_1) - P(-v_1)

17 The distribution function P(x) is usually written in terms of the error function erf(x); for a Gaussian with mean m and standard deviation σ, P(x) = (1/2)[1 + erf((x - m)/(σ√2))]. The complementary error function erfc is defined by erfc(x) = 1 - erf(x)
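Putting slides 16 and 17 together, a short sketch (my own illustration; zero-mean noise with σ = 1 and v_1 = 2 are arbitrary) of evaluating P(v_1) - P(-v_1) with erf and checking it against a Monte Carlo estimate:

% Probability that zero-mean Gaussian noise with std sigma lies in [-v1, v1]
sigma = 1;  v1 = 2;
P = @(x) 0.5*(1 + erf(x/(sigma*sqrt(2))));    % Gaussian distribution function via erf
pTheory = P(v1) - P(-v1)                      % about 0.9545 for v1 = 2*sigma
n = sigma*randn(1e6, 1);                      % simulated noise samples
pEmpirical = mean(abs(n) <= v1)               % should be close to pTheory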

18 Discrete distribution. Probability density functions need not be continuous. If a random variable can only take discrete values, its PDF takes the form of a set of lines. An example of a discrete distribution is the Poisson distribution: P(x = k) = λ^k e^{-λ} / k!, for k = 0, 1, 2, ...
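A short sketch of what the "lines" of a discrete distribution look like (not from the slides; λ = 4 is an arbitrary choice, and only base MATLAB is used, so no toolbox function such as poisspdf is needed):

% Poisson probability mass function for lambda = 4
lambda = 4;
k = 0:15;
pk = lambda.^k .* exp(-lambda) ./ factorial(k);
stem(k, pk)                        % the pdf of a discrete variable is a set of lines
xlabel('k'), ylabel('P(x = k)')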

19

20

21 We cannot predict the value a random variable will take on a particular occasion, but we can introduce measures that summarise what we expect to happen on average. The two most important measures are the mean (or expectation) and the standard deviation. The mean of a random variable x is defined as follows.

22 Expectation: EX = ∫_{-∞}^{∞} x p(x) dx for a continuous random variable (and EX = Σ_k x_k P(x = x_k) for a discrete one).

23 In the examples above we have assumed the mean of the Gaussian distribution to be 0, and the mean of the Poisson distribution is found to be λ. The mean of a distribution is, in the everyday sense, the average value.

24 The variance is defined to be σ² = E[(x - EX)²] = ∫ (x - m)² p(x) dx. The square root of the variance is called the standard deviation.
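A small sketch (my own, with arbitrary m = 2 and σ = 3) comparing the sample estimates MATLAB provides with these definitions:

% Sample mean, variance and standard deviation of simulated Gaussian noise
m = 2;  sigma = 3;
x = m + sigma*randn(1e6, 1);       % samples with mean m and std sigma
mean(x)                            % approximately m (the expectation EX)
var(x)                             % approximately sigma^2 (the variance)
std(x)                             % approximately sigma (the standard deviation)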

25 The standard deviation is a measure of the spread of the probability distribution around the mean. A small standard deviation means the outcomes are concentrated close to the mean; a large value indicates a wide range of possible outcomes. The Gaussian distribution contains the standard deviation within its definition.

26 In many cases the noise present in communication signals can be modelled as a zero-mean Gaussian random variable. This means that its amplitude at a particular time has a PDF given by the equation above. The statement that the noise is zero mean says that, on average, the noise signal takes the value zero.

27 We have already seen that the signal-to-noise ratio is an important quantity in determining the performance of a communication channel. The noise power referred to in the definition is the mean noise power, σ². The SNR can therefore be rewritten as SNR = 10 log_10 (S / σ²)
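A sketch of measuring this empirically for the noisy handel signal (my own illustration; the 0.1 noise scaling follows slide 33):

% Empirical SNR for the handel example: signal y plus scaled white noise
load handel                        % provides y
noise = 0.1*randn(size(y));
S      = mean(y.^2);               % mean signal power
sigma2 = mean(noise.^2);           % mean noise power (variance, since zero mean)
SNRdB  = 10*log10(S/sigma2)        % signal-to-noise ratio in dB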

28 If only thermal noise is considered, we have σ² = k T_m B, where k is Boltzmann's constant (k = 1.38 x 10^-23 J/K), T_m is the temperature and B is the receiver bandwidth.
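For example, a short sketch of the thermal noise power at room temperature over a 1 MHz bandwidth (the temperature and bandwidth are arbitrary illustrative values):

% Thermal noise power sigma^2 = k*Tm*B
k  = 1.38e-23;                     % Boltzmann's constant, J/K
Tm = 290;                          % temperature in kelvin (room temperature)
B  = 1e6;                          % receiver bandwidth in Hz
sigma2 = k*Tm*B                    % noise power in watts (about 4e-15 W here)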

29 Correlation or covariance. Cov(X, Y) = E[(X - EX)(Y - EY)]; the correlation is the normalized covariance, corr(X, Y) = Cov(X, Y) / (σ_X σ_Y). (Figures: positive correlation, negative correlation, no correlation.)
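A minimal sketch (my own; the mixing weights are arbitrary) producing the three cases illustrated on the slide:

% Covariance and correlation of paired samples
x = randn(1e5, 1);
yPos  =  x + 0.5*randn(1e5, 1);     % positively correlated with x
yNeg  = -x + 0.5*randn(1e5, 1);     % negatively correlated with x
yNone =      randn(1e5, 1);         % independent of x
corrcoef(x, yPos)                   % off-diagonal entry close to +0.9
corrcoef(x, yNeg)                   % off-diagonal entry close to -0.9
corrcoef(x, yNone)                  % off-diagonal entry close to 0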

30 Stochastic process. A stochastic process is a collection of random variables x(t): for each fixed t, x(t) is a random variable. For example, when each x(t) is a Gaussian variable, it is called a Gaussian process. A special Gaussian process is the white noise described on the next slide.

31 white noise n(t) White noise is a random process with a flat power spectrum. In other words, the signal contains equal power within a fixed bandwidth at any center frequency. White noise draws its name from white light in which the power spectral density of the light is distributed over the visible band in such a way that the eye's three color receptors are approximately equally stimulated.

32 White noise vs. coloured noise. The "noisiest" noise is white noise, since its autocorrelation is zero: corr(n(t), n(s)) = 0 whenever t ≠ s. Otherwise we call it coloured noise, since we can then predict something about n(t) given n(s).
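A base-MATLAB sketch of this distinction (my own illustration; the simple moving-average filter used to "colour" the noise is an arbitrary choice):

% White noise: samples at different times are uncorrelated.
% Coloured noise: filtering introduces correlation between nearby samples.
w = randn(1e5, 1);                         % white Gaussian noise
c = filter(ones(10,1)/10, 1, w);           % coloured noise: 10-point moving average of w
lag = 1;                                   % compare each sample with its neighbour
corrcoef(w(1:end-lag), w(1+lag:end))       % off-diagonal close to 0 (white)
corrcoef(c(1:end-lag), c(1+lag:end))       % off-diagonal clearly nonzero (coloured)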

33
load handel                      % provides y (73113 x 1) and Fs
plot(y)
size(y)
x = randn(73113, 1);             % white Gaussian noise
z = y + 0.1*x;                   % noisy signal (defined before its spectrum is plotted)
plot(abs(fft(z)))
hold on
plot(abs(fft(y)), 'r')
plot(abs(fft(0.1*x)), 'g')

34 Why do we love Gaussian?

35 (figures: two Gaussian densities added together)

36 X + Y (two independent Gaussians) = another Gaussian, with mean the sum of the means and variance the sum of the variances.
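A quick sketch checking this numerically (my own illustration; the two means and standard deviations are arbitrary):

% The sum of two independent Gaussian variables is again Gaussian
X = 1 + 2*randn(1e6, 1);                  % mean 1, std 2
Y = 3 + 1*randn(1e6, 1);                  % mean 3, std 1
Z = X + Y;
histogram(Z, 'Normalization', 'pdf')      % bell-shaped, centred at 1 + 3 = 4
mean(Z)                                   % approximately 4
var(Z)                                    % approximately 2^2 + 1^2 = 5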

