 # EE322 Digital Communications


## Lecture #2: Random Processes
Communication Theory, 1/15/03. EE322 Digital Communications, A. Al-Sanie, Spring 2003.

## Random Processes
Random processes have the following properties:

- Random processes are functions of time.
- Random processes are random in the sense that it is not possible to predict exactly which waveform will be observed in the future.

Suppose that we assign to each sample point s a function of time labeled X(t, s). For a fixed sample point s, the resulting function of time x(t) = X(t, s) is called a sample function.


## Random Variables
A random variable maps the outcome of a random experiment to a number. (Figure: sample space S with outcomes heads and tails, mapped by X to points on the real line.)

## Random Processes as Mappings
A random process maps the outcome of a random experiment to a signal, i.e. a function of time, called the sample function associated with that outcome. The collection of all sample functions is called the ensemble. A random process evaluated at a particular time is a random variable. (Figure: sample space S with outcomes heads and tails, each mapped to one sample function of the ensemble.)

(Figure: typical ensemble members for four random processes commonly encountered in communications: (a) thermal noise, (b) uniform phase, (c) Rayleigh fading process, and (d) binary random data process.)

## Random Process Terminology
The expected value, ensemble average, or mean of a random process is

$$\mu_X(t) = E[X(t)]$$

The autocorrelation function (ACF) is

$$R_X(t_1, t_2) = E[X(t_1)\,X(t_2)]$$

Autocorrelation is a measure of how alike the random process is from one time instant to another.

## Mean and Autocorrelation
Finding the mean and autocorrelation is not as hard as it might appear, because a random process can often be expressed as a function of a random variable, and we already know how to work with functions of random variables.

Example: the random-phase sinusoid X(t) = cos(2πft + Θ), where Θ is a random variable. For each fixed t this is just a function g(Θ) of Θ, and we know how to find the expected value of a function of a random variable:

$$E[g(\Theta)] = \int g(\theta)\, f_\Theta(\theta)\, d\theta$$

To evaluate this you need to know the pdf of Θ.

## An Example
If Θ is uniform between 0 and 2π, then

$$\mu_X(t) = \frac{1}{2\pi}\int_0^{2\pi} \cos(2\pi f t + \theta)\, d\theta = 0$$

$$R_X(\tau) = E[X(t)\,X(t+\tau)] = \frac{1}{2}\cos(2\pi f \tau)$$
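
The closed-form results for the random-phase sinusoid can be checked by Monte Carlo simulation over the ensemble. A minimal sketch, assuming X(t) = cos(2πft + Θ) with Θ uniform on (0, 2π); the values of f, t, and τ are illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

f = 5.0            # sinusoid frequency in Hz (illustrative choice)
t, tau = 0.3, 0.1  # arbitrary time instant and lag, in seconds
N = 200_000        # number of ensemble members (realizations of Theta)

# Draw the random phase Theta ~ Uniform(0, 2*pi) for each realization.
theta = rng.uniform(0.0, 2.0 * np.pi, N)

x_t = np.cos(2 * np.pi * f * t + theta)              # X(t) across the ensemble
x_t_tau = np.cos(2 * np.pi * f * (t + tau) + theta)  # X(t + tau)

mean_est = x_t.mean()             # ensemble mean: should be close to 0
acf_est = np.mean(x_t * x_t_tau)  # should be close to 0.5*cos(2*pi*f*tau)

print(mean_est)
print(acf_est, 0.5 * np.cos(2 * np.pi * f * tau))
```

Note that neither estimate depends on the particular t chosen, which is exactly the wide-sense stationarity discussed on the next slide.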

## Stationarity
A process is strict-sense stationary (SSS) if all of its joint densities are invariant to a time shift. In general, it is difficult to prove that a random process is strict-sense stationary. A process is wide-sense stationary (WSS) if:

- the mean is a constant: E[X(t)] = μ_X for all t, and
- the autocorrelation is a function of the time difference only: R_X(t, t + τ) = R_X(τ).

If a process is strict-sense stationary, then it is also wide-sense stationary.

(Figure: the autocorrelation functions of slowly and rapidly fluctuating random processes.)

## Properties of the Autocorrelation Function
If X(t) is wide-sense stationary, then its autocorrelation function has the following properties:

- R_X(0) = E[X²(t)]: the value at the origin is the second moment (the average power);
- R_X(τ) = R_X(−τ): even symmetry;
- |R_X(τ)| ≤ R_X(0): the maximum occurs at the origin.

Examples: which of the candidate functions shown in the figure are valid ACFs?

## Power Spectral Density
Power spectral density (PSD) is a measure of a random process's power content per unit frequency. It is denoted S_X(f) and has units of W/Hz. S_X(f) is a nonnegative function, and for real-valued processes S_X(f) is an even function. The total power of the process is found by

$$P = \int_{-\infty}^{\infty} S_X(f)\, df$$

and the power within bandwidth B is found by

$$P_B = \int_{-B}^{B} S_X(f)\, df$$

## Power Spectral Density (PSD) of a Random Process
We can easily find the PSD of a WSS random process using the Wiener-Khinchine theorem: if X(t) is a wide-sense stationary random process, then

$$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j 2\pi f \tau}\, d\tau$$

i.e. the PSD is the Fourier transform of the ACF.

Example: find the PSD of a WSS random process given its autocorrelation function.
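
A numerical sketch of the Wiener-Khinchine theorem. The particular ACF here, R_X(τ) = e^{−a|τ|}, is an illustrative choice (the original slide's expression was not preserved); its Fourier transform is S_X(f) = 2a / (a² + (2πf)²), and the discretized transform of the sampled ACF should match it:

```python
import numpy as np

a = 2.0  # decay rate of the assumed ACF R(tau) = exp(-a*|tau|) (illustrative)

# Sample the ACF densely over a window long enough that the tails are
# negligible, and approximate S(f) = integral R(tau) e^{-j 2 pi f tau} d tau
# by a Riemann sum.
dt = 1e-3
tau = np.arange(-20.0, 20.0, dt)
R = np.exp(-a * np.abs(tau))

def psd_at(f):
    return np.real(np.sum(R * np.exp(-2j * np.pi * f * tau)) * dt)

for f in (0.0, 0.5, 1.0):
    numeric = psd_at(f)
    analytic = 2 * a / (a**2 + (2 * np.pi * f) ** 2)
    print(f, numeric, analytic)
```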

## Example: Random Binary Wave
(Figure: a sample function of the random binary wave.)

## Autocorrelation Function of the Random Binary Wave
(Figures: the autocorrelation function and the power spectral density of the random binary wave.)
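
For a random binary wave of amplitude ±A, bit duration T, and a starting delay uniform in [0, T), the standard results sketched in the figures are a triangular ACF, R(τ) = A²(1 − |τ|/T) for |τ| < T and 0 otherwise, and the PSD A²T sinc²(fT). A minimal simulation sketch (all parameter values are illustrative) that estimates the ACF by time averaging a long realization:

```python
import numpy as np

rng = np.random.default_rng(1)

A, T = 1.0, 1.0   # amplitude and bit duration (illustrative)
fs = 100          # samples per bit
n_bits = 20_000

# Random +/-A bit sequence, each bit held for T seconds, with a random
# starting delay uniform in [0, T) so that the process is WSS.
bits = rng.choice([-A, A], size=n_bits)
x = np.repeat(bits, fs)
delay = rng.integers(0, fs)
x = x[delay:]

# Time-average autocorrelation estimate at a given integer lag.
def acf(x, lag):
    return np.mean(x[:len(x) - lag] * x[lag:]) if lag else np.mean(x * x)

for lag_s in (0.0, 0.25, 0.5, 1.0):
    lag = int(lag_s * fs)
    theory = A**2 * max(0.0, 1 - abs(lag_s) / T)  # triangular ACF
    print(lag_s, acf(x, lag), theory)
```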

## Noise in Communication Systems
The term noise refers to unwanted electrical signals. Noise limits the receiver's ability to make correct symbol decisions. Noise arises from a variety of sources:

- Man-made noise, such as spark-plug ignition noise, switching transients, and other radiating electromagnetic signals.
- Natural noise, such as the sun and other galactic sources, and thermal noise.

## Thermal Noise
Thermal noise is caused by the thermal motion of electrons in all dissipative components (resistors, wires, etc.). The same electrons that are responsible for electrical conduction are also responsible for thermal noise. Thermal noise cannot be eliminated. We can describe thermal noise as a zero-mean Gaussian random process.

A Gaussian process n(t) is a random function whose value n at any arbitrary time t is statistically characterized by the Gaussian probability density function

$$p(n) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{n^2}{2\sigma^2}\right)$$

where σ² is the variance of n. The Gaussian distribution is often used as the system noise model because of the central limit theorem.
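
The central limit theorem can be illustrated numerically: the normalized sum of many i.i.d. non-Gaussian variables behaves like a Gaussian, regardless of the summands' pdf. A small sketch (the summand distribution and sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)

# Sum n i.i.d. uniform variables per trial; by the central limit theorem
# the sum approaches a Gaussian as n grows.
n, trials = 50, 200_000
u = rng.uniform(-0.5, 0.5, size=(trials, n))
s = u.sum(axis=1)

s = (s - s.mean()) / s.std()  # normalize to zero mean, unit variance

# For a standard Gaussian, about 68.27% of samples fall within one sigma.
frac_1sigma = np.mean(np.abs(s) < 1)
print(frac_1sigma)
```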


## White Gaussian Noise
The power spectral density Sn(f) of thermal noise is flat for all frequencies, as shown in the figure, and is denoted

$$S_n(f) = \frac{N_0}{2}$$

where the factor of 2 is included to indicate that Sn(f) is a two-sided power spectral density. Because the noise has such a flat Sn(f), we refer to it as white noise.

## Autocorrelation of White Noise
The autocorrelation function of white noise is

$$R_n(\tau) = \frac{N_0}{2}\,\delta(\tau)$$

Any two different samples of white noise, no matter how close together in time they are taken, are uncorrelated. Since thermal noise is a Gaussian process and its samples are uncorrelated, the noise samples are also independent. Therefore, the noise affects each transmitted symbol independently.
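
Sampled white Gaussian noise makes the uncorrelatedness concrete: the estimated ACF is approximately the sample variance at lag 0 and approximately zero at every nonzero lag. A sketch with illustrative values of N0 and the sampling rate:

```python
import numpy as np

rng = np.random.default_rng(3)

N0 = 2e-20   # assumed noise density, W/Hz (illustrative)
fs = 1e6     # assumed sampling rate, Hz
N = 500_000  # number of samples

# Sampled white Gaussian noise: zero mean, variance (N0/2) * fs.
sigma2 = N0 / 2 * fs
n = rng.normal(0.0, np.sqrt(sigma2), N)

def acf(x, lag):
    return np.mean(x[:len(x) - lag] * x[lag:]) if lag else np.mean(x * x)

print(acf(n, 0), sigma2)    # lag 0: close to the variance
print(acf(n, 1) / sigma2)   # close to 0: adjacent samples uncorrelated
print(acf(n, 10) / sigma2)  # close to 0
```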

## Equivalent Noise Temperature
The parameter N0 may be expressed as

$$N_0 = k\,T_e$$

where k is Boltzmann's constant, 1.38 × 10⁻²³ J/K, and Te is the equivalent noise temperature of the receiver. The equivalent noise temperature of a system is defined as the temperature at which a noisy resistor has to be maintained such that, when the resistor is connected to the input of a noiseless version of the system, it produces the same available output noise power as is produced by all the sources of noise in the actual system.
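
As a quick numerical check of N0 = kTe, assuming a room-temperature equivalent noise temperature Te = 290 K and a 1 MHz bandwidth (both illustrative values, not from the slides):

```python
k = 1.38e-23  # Boltzmann's constant, J/K
Te = 290.0    # assumed equivalent noise temperature, K (illustrative)

N0 = k * Te   # noise power spectral density, W/Hz
P = N0 * 1e6  # noise power in an assumed 1 MHz bandwidth, W
print(N0)     # about 4.0e-21 W/Hz
print(P)      # about 4.0e-15 W
```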

## Additive White Gaussian Noise (AWGN)
Thermal noise is present in all communication systems. Its characteristics are white, Gaussian, and additive, so we call it additive white Gaussian noise (AWGN). In this course we shall assume that the signals are corrupted by AWGN.

(Figure: characteristics of white noise: (a) power spectral density, (b) autocorrelation function.)

## Linear Systems
The output of a linear time-invariant (LTI) system is found by convolution: y(t) = h(t) * x(t). However, if the input to the system is a random process, we cannot find a Fourier transform X(f) of the input. The solution is to use power spectral densities:

$$S_Y(f) = |H(f)|^2\, S_X(f)$$

This also implies that the output of an LTI system is WSS if the input is WSS. (Figure: block diagram with input x(t), impulse response h(t), and output y(t).)

## Ideal Low-Pass Filtered White Noise
White noise n(t) applied to an ideal low-pass filter H(f), with |H(f)| = 1 for −B ≤ f ≤ B and 0 otherwise, gives an output with PSD S(f) = N0/2 for |f| ≤ B and autocorrelation R(τ) = N0 B sinc(2Bτ). (Figures: (a) power spectral density, (b) autocorrelation function.)
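
The relation S_Y(f) = |H(f)|² S_X(f) can be verified numerically by passing sampled white noise through a brick-wall low-pass filter and checking that the output power equals the area under the output PSD, N0 B. A sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(4)

fs = 1000.0  # sampling rate, Hz (illustrative)
N = 2**18    # number of samples
N0 = 0.01    # assumed two-sided PSD level is N0/2 (illustrative)
B = 100.0    # filter bandwidth, Hz

# Discrete-time white Gaussian noise whose two-sided PSD is N0/2:
# sample variance = (N0/2) * fs.
n = rng.normal(0.0, np.sqrt(N0 / 2 * fs), N)

# Ideal brick-wall low-pass filter applied in the frequency domain.
f = np.fft.fftfreq(N, d=1 / fs)
H = (np.abs(f) <= B).astype(float)
y = np.fft.ifft(np.fft.fft(n) * H).real

# Output power = integral of |H(f)|^2 * N0/2 over f, which is N0 * B.
print(np.mean(y**2), N0 * B)
```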