Chapter 6 Random Processes


Chapter 6 Random Processes Description of Random Processes Stationarity and Ergodicity Autocorrelation of Random Processes Properties of Autocorrelation Huseyin Bilgekul EEE 461 Communication Systems II Department of Electrical and Electronic Engineering Eastern Mediterranean University

Homework Assignments Return date: November 8, 2005. Assignments: Problem 6-2 Problem 6-3 Problem 6-6 Problem 6-10 Problem 6-11

Random Processes A RANDOM VARIABLE X is a rule for assigning to every outcome ω of an experiment a number X(ω). Note: X denotes a random variable and X(ω) denotes a particular value. A RANDOM PROCESS X(t) is a rule for assigning to every ω a function X(t, ω). Note: for notational simplicity we often omit the dependence on ω. Another way to look at it: RVs map events into constants, while RPs map events into functions of the parameter t. RPs can be described as an indexed set of RVs. The set of all possible waveforms or outputs is called an ensemble. We will be interested in the behavior of the system across all waveforms and over a wide range of time.
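The distinction between a random variable and a random process can be sketched numerically. A minimal NumPy example (the cosine waveform, random amplitude, and seed are illustrative assumptions, not from the slides): each row of the array is one sample function X(t, ωi), and slicing at a fixed t0 yields a random variable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample space: each experiment outcome w picks one waveform X(t, w).
# Here (an assumed model) each outcome is a random amplitude and the
# waveform is A(w) * cos(t).
t = np.linspace(0.0, 2.0 * np.pi, 200)          # parameter t (time axis)
num_outcomes = 5                                 # five outcomes w1..w5
A = rng.uniform(-1.0, 1.0, size=num_outcomes)    # one RV per outcome

# The ensemble: one row per outcome, each row a sample function X(t, w_i).
ensemble = A[:, None] * np.cos(t)[None, :]

# Fixing t = t0 slices across the ensemble and yields a random variable.
t0_index = 50
X_at_t0 = ensemble[:, t0_index]
```

Reading along a row gives one waveform of the ensemble; reading down a column gives the indexed set of RVs the slide describes.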

Ensemble of Sample Functions The set of all possible functions is called the ENSEMBLE.

Random Processes A general random (stochastic) process can be described as: a collection of time functions (signals) corresponding to various outcomes of random experiments; or a collection of random variables observed at different times. Examples of random processes in communications: channel noise, information generated by a source, interference.

Collection of Time Functions Consider the time-varying function x(t, ωi) representing a random process, where ωi represents an outcome of a random event. Example: a box has infinitely many resistors (i = 1, 2, . . .) of the same resistance R. Let ωi be the event that the ith resistor has been picked from the box, and let v(t, ωi) represent the voltage of the thermal noise measured on this resistor.

Collection of Random Variables For a particular time t = t0, the value x(t0, ωi) is a random variable. To describe a random process we can use the collection of random variables {x(t0, ω1), x(t0, ω2), x(t0, ω3), . . . }. Type: a random process can be either discrete-time or continuous-time. We can also ask for the probability that a sample function of a RP passes through a given set of windows, i.e., the probability of a joint event.

Description of Random Processes Analytical description: X(t) = f(t, ω), where ω is an outcome of a random event. Statistical description: for any integer N and any choice of (t1, t2, . . ., tN), the joint pdf of {X(t1), X(t2), . . ., X(tN)} is known. To describe the random process completely, this joint PDF is required for every N and every choice of sampling times.

Example: Analytical Description Let X(t) = A cos(2π f0 t + Θ), where Θ is a random variable uniformly distributed on [0, 2π]. To obtain the complete statistical description of X(t0), introduce Y = 2π f0 t0 + Θ, so that X(t0) = A cos(Y). Then we need to transform from y to x. We need both solutions y1 and y2 because, for a given x, the equation x = A cos(y) has two solutions in an interval of length 2π.

Analytical (continued) Note that x and y are actual values of the random variables X and Y. Since |dx/dy| = A |sin y| = √(A^2 − x^2) and pY is uniform on [2π f0 t0, 2π f0 t0 + 2π], we get fX(x) = 1/(π √(A^2 − x^2)) for |x| < A. Using the analytical description of X(t), we obtained its statistical description at any time t.

Example: Statistical Description Suppose a random process x(t) has the property that for any N and (t0, t1, . . ., tN), the joint density function of {x(ti)} is a jointly Gaussian vector with zero mean and covariance matrix C with entries Cij = C(ti, tj). This gives a complete statistical description of the random process x(t).
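Such a Gaussian description can be sampled directly. A sketch assuming a specific covariance kernel C(ti, tj) = exp(−|ti − tj|), chosen here only for illustration (the slide leaves the covariance unspecified):

```python
import numpy as np

rng = np.random.default_rng(8)

# Assumed covariance kernel for this sketch: C(ti, tj) = exp(-|ti - tj|).
times = np.array([0.0, 0.5, 1.0, 2.0])
C = np.exp(-np.abs(times[:, None] - times[None, :]))

# Draw many realizations of the vector {x(t0), ..., x(tN)} as zero-mean
# jointly Gaussian samples with the assumed covariance.
samples = rng.multivariate_normal(np.zeros(len(times)), C, size=100_000)

# The sample covariance should recover the kernel we put in.
C_hat = np.cov(samples, rowvar=False)
max_err = np.max(np.abs(C_hat - C))
```

Knowing the mean and covariance for every finite set of times fully pins down a Gaussian process, which is why this counts as a complete statistical description.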

Activity: Ensembles Consider the random process x(t) = At + B. Draw ensembles of the waveforms for two cases: (1) B is constant and A is uniformly distributed on [−1, 1], giving lines of random slope through the common intercept B; (2) A is constant and B is uniformly distributed on [0, 2], giving parallel lines with random intercept. Does having an ensemble of waveforms give you a better picture of how the system performs?
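The two ensembles of the activity can be generated explicitly. A minimal sketch (seed and ensemble size are arbitrary): with B fixed, every line passes through the same intercept, so the process is deterministic at t = 0; with A fixed, the intercept itself is random.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 50)
n = 8  # sample functions per ensemble

# Case 1: B fixed, slope A ~ Uniform[-1, 1] -> fan of lines through (0, B).
B1 = 2.0
A1 = rng.uniform(-1.0, 1.0, size=n)
ensemble1 = A1[:, None] * t[None, :] + B1

# Case 2: A fixed, intercept B ~ Uniform[0, 2] -> parallel lines, slope A.
A2 = 1.0
B2 = rng.uniform(0.0, 2.0, size=n)
ensemble2 = A2 * t[None, :] + B2[:, None]

# At t = 0, case 1 is deterministic (all lines hit B1), case 2 is random.
spread_case1_at_0 = np.ptp(ensemble1[:, 0])
spread_case2_at_0 = np.ptp(ensemble2[:, 0])
```

Plotting the rows of each ensemble reproduces the two sketches asked for in the activity, and the spread at t = 0 already shows the two processes behave very differently.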

Stationarity Definition: A random process is STATIONARY to the order N if for any t1, t2, . . ., tN and any time shift t0, fx{x(t1), x(t2), . . ., x(tN)} = fx{x(t1 + t0), x(t2 + t0), . . ., x(tN + t0)}. This means that the process behaves statistically the same (follows the same PDF) regardless of when you measure it. A random process is said to be STRICTLY STATIONARY if it is stationary to the order N → ∞. Is the random process from the coin-tossing experiment stationary?

Illustration of Stationarity Time functions pass through the corresponding windows at different times with the same probability.

Example of First-Order Stationarity Assume that A and ω0 are constants; θ0 is a uniformly distributed RV on [−π, π); t is time; and x(t) = A cos(ω0 t + θ0). From the last lecture, recall that the PDF of x(t) is f(x) = 1/(π √(A^2 − x^2)) for |x| < A. Note: there is NO dependence on time; the PDF is not a function of t, so the RP is STATIONARY (to first order). This result applies to problems in which θ0 is the random start-up phase of an unsynchronized oscillator.

Non-Stationary Example Now assume that A, θ0 and ω0 are all constants; t is time. The value of x(t) = A cos(ω0 t + θ0) is known for any time with probability 1. Thus the first-order PDF of x(t) is the impulse f(x) = δ(x − A cos(ω0 t + θ0)). Note: the PDF depends on time, so the process is NONSTATIONARY.
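The stationary and non-stationary cases can be contrasted numerically. A hedged sketch (constants, sample size, and seed are illustrative): with a random phase, the ensemble statistics at two different times agree; with a fixed phase, the value of x(t) itself moves with t.

```python
import numpy as np

rng = np.random.default_rng(3)
A, w0 = 1.0, 2.0 * np.pi
theta = rng.uniform(-np.pi, np.pi, size=100_000)

def samples(t):
    # Ensemble of x(t) = A cos(w0*t + theta0) across random phases theta0.
    return A * np.cos(w0 * t + theta)

# Stationary case: distribution at t = 0.1 matches the one at t = 0.7.
m1, m2 = samples(0.1).mean(), samples(0.7).mean()
s1, s2 = samples(0.1).std(), samples(0.7).std()

# Non-stationary case (theta0 fixed at 0): x(t) is deterministic and its
# "distribution" is an impulse whose location moves with t.
x_det_1 = A * np.cos(w0 * 0.1)
x_det_2 = A * np.cos(w0 * 0.7)
```

The matching means and standard deviations illustrate first-order stationarity; the moving deterministic value illustrates why the fixed-phase process is not stationary.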

Ergodic Processes Definition: A random process is ERGODIC if all time averages of any sample function are equal to the corresponding ensemble averages (expectations). For example, for ergodic processes, ensemble statistics can be used to compute DC and RMS values. Ergodic processes are always stationary; stationary processes are not necessarily ergodic.

Example: Ergodic Process A and ω0 are constants; θ0 is a uniformly distributed RV on [−π, π); t is time; and x(t) = A cos(ω0 t + θ0). Mean (ensemble statistics): E[x(t)] = 0. Variance (ensemble statistics): Var[x(t)] = A^2/2.

Example: Ergodic Process (continued) Mean (time average), for large T: ⟨x(t)⟩ = (1/T) ∫ x(t) dt → 0. Variance (time average): ⟨x^2(t)⟩ → A^2/2. The ensemble and time averages are the same, so the process is ERGODIC.
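The ergodicity claim, time averages matching ensemble averages for the random-phase cosine, can be verified by simulation. A minimal sketch (amplitude, frequency, record length, and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
A, w0 = 2.0, 2.0 * np.pi * 3.0
t = np.linspace(0.0, 50.0, 200_001)   # one long record, T = 50 s

# One sample function with a single random start-up phase theta0.
theta0 = rng.uniform(-np.pi, np.pi)
x = A * np.cos(w0 * t + theta0)

# Time averages over this single waveform.
time_mean = np.mean(x)
time_power = np.mean(x**2)            # should approach A**2 / 2

# Ensemble averages over many independent phases at one fixed time.
thetas = rng.uniform(-np.pi, np.pi, size=100_000)
ens = A * np.cos(w0 * 1.234 + thetas)
ens_mean = np.mean(ens)
ens_power = np.mean(ens**2)
```

Both routes give mean ≈ 0 and mean-square ≈ A^2/2 = 2, which is exactly the agreement between time and ensemble statistics that ergodicity asserts.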

EXERCISE Write down the definitions of: wide-sense stationary; ergodic processes. How do these concepts relate to each other? Consider x(t) = K, where K is uniformly distributed on [−1, 1]. Is it WSS? Is it ergodic?
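For the exercise process x(t) = K, a simulation makes the answer visible. A sketch under the stated assumption K ~ Uniform[−1, 1]: the ensemble mean is 0 at every t (consistent with WSS), but the time average of any single sample function is its own K, so time and ensemble averages generally disagree and the process is not ergodic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Each outcome picks one K; the sample function x(t) = K is flat in time,
# so its time average over any interval is exactly K itself.
K = rng.uniform(-1.0, 1.0, size=100_000)

# Ensemble mean at any fixed time t is E[K] = 0, independent of t.
ensemble_mean = K.mean()

# Fraction of sample functions whose time average (their own K) differs
# from the ensemble mean by more than 0.5 -- about half of them do.
frac_far_off = np.mean(np.abs(K - ensemble_mean) > 0.5)
```

A single waveform tells you nothing about the ensemble statistics here, which is precisely the failure of ergodicity despite the process being WSS.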

Autocorrelation of Random Processes The autocorrelation function of a real random process x(t) at two times t1 and t2 is Rx(t1, t2) = E[x(t1) x(t2)].

Wide-Sense Stationary A random process that is stationary to order 2 or greater is wide-sense stationary. A random process is wide-sense stationary if its mean is constant, E[x(t)] = m, and its autocorrelation depends only on the time difference: Rx(t1, t2) = Rx(τ). Usually t1 = t and t2 = t + τ, so that t2 − t1 = τ. A wide-sense stationary process does not DRIFT with time: the autocorrelation depends only on the time gap τ, not on where the interval is located. The autocorrelation gives an idea about the frequency content of the RP.
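Wide-sense stationarity of the random-phase cosine can be checked by estimating R(t1, t2) at the same lag but at different absolute times. A sketch (constants and seed are illustrative); the closed form R(τ) = (A^2/2) cos(ω0 τ) follows from E[cos(a + θ) cos(b + θ)] = (1/2) cos(a − b) for θ uniform over a full period.

```python
import numpy as np

rng = np.random.default_rng(6)
A, w0 = 1.0, 2.0 * np.pi
thetas = rng.uniform(-np.pi, np.pi, size=200_000)

def autocorr(t1, t2):
    # Ensemble estimate of R(t1, t2) = E[x(t1) x(t2)] over random phases.
    x1 = A * np.cos(w0 * t1 + thetas)
    x2 = A * np.cos(w0 * t2 + thetas)
    return np.mean(x1 * x2)

tau = 0.2
# Same lag tau at two different absolute times: the estimates should
# agree with each other and with (A**2 / 2) * cos(w0 * tau).
R_a = autocorr(0.0, tau)
R_b = autocorr(5.0, 5.0 + tau)
R_theory = (A**2 / 2.0) * np.cos(w0 * tau)
```

The two estimates matching each other (and the closed form) is the "depends only on τ" property that defines wide-sense stationarity.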

Autocorrelation Function of RP Properties of the autocorrelation function of a wide-sense stationary process: Rx(0) = E[x^2(t)] is the average power; Rx(−τ) = Rx(τ) (even symmetry); |Rx(τ)| ≤ Rx(0). A slowly fluctuating random process has an autocorrelation that decays slowly with τ, while a rapidly fluctuating process has an autocorrelation that decays quickly.

Cross-Correlation of RPs The cross-correlation of two RPs x(t) and y(t) is defined similarly: Rxy(t1, t2) = E[x(t1) y(t2)]. If x(t) and y(t) are jointly stationary processes, Rxy(t1, t2) = Rxy(τ) with τ = t2 − t1. If the RPs are jointly ERGODIC, the time cross-correlation of a pair of sample functions equals the ensemble cross-correlation.

Cross-Correlation Properties of Jointly Stationary RPs Some properties of cross-correlation functions: Uncorrelated: Rxy(τ) = E[x(t)] E[y(t + τ)], i.e., the cross-covariance is zero. Orthogonal: Rxy(τ) = 0. Independent: x(t1) and y(t2) are independent if their joint distribution is the product of the individual distributions; independence implies they are uncorrelated.
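The difference between uncorrelated and orthogonal can be demonstrated with two independent sequences. A sketch (the Gaussian samples and the shift constants 2 and 3 are arbitrary choices): independent zero-mean processes are both uncorrelated and orthogonal, but adding nonzero means keeps them uncorrelated while making E[x y] = E[x] E[y] ≠ 0.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Two independent zero-mean processes sampled at fixed times t1, t2.
x = rng.normal(0.0, 1.0, size=n)          # samples of x(t1)
y = rng.normal(0.0, 1.0, size=n)          # samples of y(t2)

# Independent and zero-mean -> uncorrelated: Rxy = E[x]E[y] = 0, which
# here also makes the processes orthogonal (E[x y] = 0).
Rxy = np.mean(x * y)

# Adding means keeps them uncorrelated (cross-covariance still 0) but no
# longer orthogonal: E[(x+2)(y+3)] = E[x+2] * E[y+3] = 2 * 3 = 6.
Rxy_shifted = np.mean((x + 2.0) * (y + 3.0))
```

So "orthogonal" is the stronger statement Rxy = 0, while "uncorrelated" only says the cross-correlation factors into the product of the means.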