2. Stationary Processes and Models

Presentation transcript:

2. Stationary Processes and Models A random variable x is a rule for assigning to every outcome s of an experiment a number x(s). For example: the outcome of tossing a die. For example: the outcome of a survey (very satisfied / satisfied / fair / dissatisfied / very dissatisfied, scored 5, 4, 3, 2, 1). The outcomes of an experiment are unknown before it is conducted; this is why the variable is called random. In many cases, we can collect many outcomes and characterize the variable using its distribution function, as in the survey example above.

The distribution function: Although the variable is random, we can understand its characteristics and predict its outcomes if we know the distribution function. [Figure: histogram of the survey outcomes 1-5 with their observed counts.]

A random (stochastic) process x(t) is a rule for assigning to every outcome s of an experiment a time function x(t,s). If t is fixed, x(t,s) is a random variable. If s is fixed, x(t,s) is an outcome (time function) of a particular experiment. If both s and t are fixed, x(t,s) is a number. [Figure: an ensemble of sample functions x(t,s) plotted against time t.]

Examples: a speech recording, a temperature record, the received signal in mobile communication. How do we characterize a random process mathematically? Use statistics. First-order statistics: the density function of x(t) at each time instant. [Figure: the densities f[x(t0)], f[x(t0+1)], f[x(t0+2)] along the time axis t.]

Second-order statistics: f[x(t1),x(t2)]. N-th order statistics: f[x(t1),x(t2),…,x(tN)]. In any case, we need infinitely many distribution functions, which are almost impossible to specify. A random process x(t) is called strict-sense stationary if its statistical properties are invariant to a shift of the time origin. Letting time be discrete, we have f[x(n_1),…,x(n_k)] = f[x(n_1+m),…,x(n_k+m)] for every order k and every shift m. Except for the first-order statistics, we still have infinitely many distributions to specify.

A random process is called wide-sense stationary (WSS) if its mean is constant, E{x(n)} = μ, and its correlation E{x(n) x*(n-τ)} = R(τ) depends only on the lag τ. The function R(τ) is called the autocorrelation function. Note that R(0) = E{|x(n)|²} is the average power of x(n). The autocovariance is defined as C(τ) = E{[x(n) - μ][x(n-τ) - μ]*} = R(τ) - |μ|². For real signals, we have R(-τ) = R(τ). For complex signals, we have R(-τ) = R*(τ).

Thus, μ and r(τ) completely characterize a WSS process. Note that the characterization of a random process by its mean and autocorrelation function is not unique. Conclusion: in general, we can only partially characterize a random process; only for a small portion of processes is this characterization complete. In real applications, how can we know μ and r(τ) of a random process? It is usually impractical to obtain μ and r(τ) from the ensemble average, so we may want to use the time average μ̂(N) = (1/N) Σ_{n=0}^{N-1} x(n). Note that the estimate is itself a random variable, and E{μ̂(N)} = μ.

The ensemble and time average: We say that the process x(n) is mean ergodic in the mean-square sense if lim_{N→∞} E{|μ̂(N) - μ|²} = 0, i.e., the time average converges to the ensemble average. [Figure: the time average is taken along a single realization (over t); the ensemble average is taken across realizations (over s).]

For N , | l |/N0. Thus, the condition implies It can be shown that For N , | l |/N0. Thus, the condition implies Thus if the process is asymptotic uncorrelated, it is mean ergodic in the MSE sense. * check this

In the same way, a process is said to be correlation ergodic in the MSE sense if the autocorrelation can be estimated using the time average. Let u(n) = [u(n), u(n-1), …, u(n-M+1)]^T. The correlation matrix of u(n) is defined as R = E{u(n) u^H(n)}.

The correlation matrix plays an important role in adaptive signal processing. Property 1: the correlation matrix (of a stationary process) is Hermitian, i.e., R^H = R. Property 2: the correlation matrix is Toeplitz. Property 3: the correlation matrix is always nonnegative definite: let y = x^H u(n); then E{|y|²} = E{y y*} = E{x^H u(n) u^H(n) x} = x^H E{u(n) u^H(n)} x = x^H R x ≥ 0.
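These properties can be verified numerically; a minimal sketch, assuming an example autocorrelation sequence r(k) = 0.8^k:

    import numpy as np
    from scipy.linalg import toeplitz

    M = 4
    r = 0.8 ** np.arange(M)                  # assumed autocorrelation r(0..M-1)
    R = toeplitz(r)                          # Property 2: Toeplitz structure

    print(np.allclose(R, R.conj().T))        # Property 1: R^H = R -> True
    print(np.linalg.eigvalsh(R).min() >= 0)  # Property 3: nonnegative definite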

Property 4: let u^B(n) = [u(n-M+1), u(n-M+2), …, u(n)]^T be the input vector with its elements arranged backward. Then its correlation matrix is the transpose of R; in other words, E{u^B(n) u^BH(n)} = R^T. Property 5: the correlation matrices R_M and R_{M+1} are related by

    R_{M+1} = [ r(0)   r^H
                r      R_M ],   where r^H = [r(1), r(2), …, r(M)],

or equivalently

    R_{M+1} = [ R_M     r^B*
                r^BT    r(0) ],  where r^BT = [r(-M), r(-M+1), …, r(-1)].

This can be easily shown as follows. In this case, the input vector is decomposed as u_{M+1}(n) = [u(n), u_M^T(n-1)]^T; taking E{u_{M+1}(n) u_{M+1}^H(n)} block by block gives the first partition, since E{|u(n)|²} = r(0), E{u(n) u_M^H(n-1)} = r^H, and E{u_M(n-1) u_M^H(n-1)} = R_M.

Similarly, for the second partition, the input vector is decomposed as u_{M+1}(n) = [u_M^T(n), u(n-M)]^T; the corresponding blocks are R_M, r^B* = E{u_M(n) u*(n-M)}, and r(0) = E{|u(n-M)|²}.

Stochastic model: any hypothesis that may be applied to explain or describe the hidden laws that are supposed to govern or constrain the generation of some physical data of interest. This is equivalent to asking: how do we synthesize a stochastic process? Or, how do we characterize a stochastic process? Typically, using a model, we only need a small set of parameters to do the job. [Figure: white Gaussian noise v(n) → linear filter → stochastic process u(n).]

Autoregressive (AR) model: u(n) + a1 u(n-1) + … + aM u(n-M) = v(n), where v(n) is white Gaussian noise. z-transform representation: A(z) U(z) = V(z) with A(z) = 1 + a1 z^{-1} + … + aM z^{-M}, i.e., u(n) is the output of the all-pole filter H(z) = 1/A(z). The AR model is used most often since its parameter identification is easier. Note that the poles must lie inside the unit circle. Strictly speaking, the AR process is not stationary; if n is large, we say that it is asymptotically stationary.

For example: u(n) = a u(n-1) + v(n). Iterating from u(0) = 0 gives u(n) = Σ_{k=0}^{n-1} a^k v(n-k). Thus, we have the variance of u(n) as Var{u(n)} = σ_v² (1 - a^{2n})/(1 - a²), which converges to σ_v²/(1 - a²) as n → ∞ when |a| < 1. One can show that the autocorrelation has the same characteristic. Thus, AR processes are asymptotically stationary.

The generation of an AR process:
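A minimal sketch of this generation step in Python (the coefficient and noise level are arbitrary illustrative values; for the first-order case the sample variance can be checked against the asymptotic value σ_v²/(1 - a²) derived above):

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(1)
    a, sigma_v, N = 0.8, 1.0, 100_000

    v = sigma_v * rng.standard_normal(N)   # white Gaussian noise v(n)
    u = lfilter([1.0], [1.0, -a], v)       # u(n) = a*u(n-1) + v(n)

    print(u[1000:].var())                  # sample variance after transients
    print(sigma_v**2 / (1 - a**2))         # asymptotic variance ~ 2.78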

Moving average (MA) model: u(n) = v(n) + b1 v(n-1) + … + bK v(n-K). z-transform representation: U(z) = B(z) V(z) with B(z) = 1 + b1 z^{-1} + … + bK z^{-K}, an all-zero transfer function. ARMA model: u(n) + a1 u(n-1) + … + aM u(n-M) = v(n) + b1 v(n-1) + … + bK v(n-K), i.e., U(z) = [B(z)/A(z)] V(z). Computation of the MA and ARMA model coefficients requires solving systems of nonlinear equations.

Generation of an ARMA signal: [Figure: direct-form filter structure with delay elements D and adders, driven by the white Gaussian noise v(n) and producing the ARMA process u(n).]
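A corresponding sketch for generating an ARMA signal (illustrative coefficients; lfilter implements exactly the difference equation u(n) + a1 u(n-1) + a2 u(n-2) = v(n) + b1 v(n-1) in the document's sign convention):

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(2)
    v = rng.standard_normal(100_000)   # white Gaussian noise

    b = [1.0, 0.5]                     # MA part: 1 + b1 z^-1
    a = [1.0, -0.1, -0.8]              # AR part: 1 + a1 z^-1 + a2 z^-2
    u = lfilter(b, a, v)               # ARMA(2,1) process u(n)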

Multiplying the AR difference equation by u*(n-l) and taking the expectation, we have r(l) + Σ_{k=1}^{M} a_k r(l-k) = E{v(n) u*(n-l)} = 0 for l > 0, since u(n-l) depends only on past noise samples. For l = 0, we have r(0) + Σ_{k=1}^{M} a_k r(-k) = σ_v².

Let l = 1, 2, …, M. We then obtain a set of linear equations (the Yule-Walker equations) that can be solved for the a_i's. Once the a_i's are found, we can then find σ_v².
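A minimal sketch of this procedure, assuming data generated from an AR(2) process with a1 = -0.1, a2 = -0.8 (example (1) below); the biased sample autocorrelation stands in for the exact r(l):

    import numpy as np
    from scipy.signal import lfilter
    from scipy.linalg import solve_toeplitz

    rng = np.random.default_rng(3)
    v = rng.standard_normal(200_000)
    u = lfilter([1.0], [1.0, -0.1, -0.8], v)   # AR(2), a1 = -0.1, a2 = -0.8

    M = 2
    # biased sample autocorrelation r(l), l = 0..M
    r = np.array([np.dot(u[l:], u[:len(u)-l]) / len(u) for l in range(M + 1)])

    # Yule-Walker: solve  R a = -[r(1),...,r(M)]^T  for a = [a1,...,aM]
    a_hat = solve_toeplitz(r[:M], -r[1:M+1])
    sigma_v2 = r[0] + np.dot(a_hat, r[1:M+1])  # l = 0 equation (real case)
    print(a_hat, sigma_v2)                     # ~ [-0.1, -0.8], ~1.0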

Example (AR(2)): u(n) + a1 u(n-1) + a2 u(n-2) = v(n). The second-order AR process has the characteristic equation 1 + a1 z^{-1} + a2 z^{-2} = 0, i.e., z² + a1 z + a2 = 0. Thus, we have two poles, and they must be located inside the unit circle. It turns out that the following conditions must be satisfied: -1 < a2 < 1, a1 + a2 > -1, and a2 - a1 > -1. Three examples are considered: (1) a1 = -0.1, a2 = -0.8 (poles: 0.9458, -0.8458); (2) a1 = 0.1, a2 = -0.8 (poles: -0.9458, 0.8458); (3) a1 = -0.975, a2 = 0.95 (poles: 0.4875 ± j0.8440).

Permissible region: [Figure: the triangular stationarity region in the (a1, a2) plane defined by the three conditions above.]
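The pole positions and stability conditions for the three examples can be checked directly; a short sketch:

    import numpy as np

    for a1, a2 in [(-0.1, -0.8), (0.1, -0.8), (-0.975, 0.95)]:
        poles = np.roots([1.0, a1, a2])   # roots of z^2 + a1 z + a2 = 0
        in_region = (abs(a2) < 1) and (a1 + a2 > -1) and (a2 - a1 > -1)
        print(poles, np.all(np.abs(poles) < 1), in_region)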

Samples: [Figures: the input white noise and sample realizations of the three AR(2) processes.]

[Figures: the autocorrelation sequences for examples (1), (2), and (3).]

Is an AR process stationary? No, it is non-stationary; it is asymptotically stationary. This can be clearly seen from its recursive equation. For example, for a first-order AR process started at rest, we have u(n) = a u(n-1) + v(n) = Σ_{k=0}^{n-1} a^k v(n-k). We then have Var{u(n)} = σ_v² (1 - a^{2n})/(1 - a²). As we can see, even the variance is not constant in the AR process. However, as n approaches infinity, it becomes stationary.

Order selection: given a random process, how can we select an order for the modeling AR process? An information-theoretic criterion (AIC): AIC(m) = -2 ln f_U(u | θ̂_m) + 2m, where m is the order, u_i = u(i), i = 1, 2, …, N, are the observations, and θ̂_m is the vector of estimated parameters. Minimum description length (MDL) criterion: MDL(m) = -ln f_U(u | θ̂_m) + (m/2) ln N, where N is the number of data points. In both cases, the order that minimizes the criterion is selected.
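A minimal sketch of AR order selection, assuming the common Gaussian simplification in which -2 ln f_U(u | θ̂_m) reduces to N ln σ̂_m² plus order-independent constants (the yule_walker helper follows the equations above):

    import numpy as np
    from scipy.linalg import solve_toeplitz

    def yule_walker(u, M):
        # Fit an AR(M) model; return coefficients and noise-variance estimate.
        N = len(u)
        r = np.array([np.dot(u[l:], u[:N-l]) / N for l in range(M + 1)])
        a = solve_toeplitz(r[:M], -r[1:M+1])
        return a, r[0] + np.dot(a, r[1:M+1])

    def select_order(u, max_order=10):
        N = len(u)
        for m in range(1, max_order + 1):
            _, s2 = yule_walker(u, m)
            aic = N * np.log(s2) + 2 * m                      # Gaussian form of AIC(m)
            mdl = 0.5 * N * np.log(s2) + 0.5 * m * np.log(N)  # Gaussian form of MDL(m)
            print(m, aic, mdl)                                # pick the minimizing m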

Let u(n) denote a complex Gaussian process consisting of N samples, with a mean of zero and an autocorrelation function denoted by r(k) = E{u(n) u*(n-k)}, k = 0, 1, …, N-1; this set of autocorrelation values defines the correlation matrix R of the Gaussian process u(n). The density function is f_U(u) = (1/(π^N det R)) exp(-u^H R^{-1} u). Note that f_U(u) is 2N-dimensional (N real parts and N imaginary parts). We use N(0, R) to denote a Gaussian process with zero mean and correlation matrix R.

Properties: (1) The process u(n) is stationary in the strict sense. (2) The process is circularly complex; it is also referred to as a circularly complex Gaussian process. (3) Let u_n = u(n), n = 1, 2, …, N, denote samples of the Gaussian process. Then E{u*_{s_1} … u*_{s_k} u_{t_1} … u_{t_l}} = 0 if k ≠ l, and if k = l, E{u*_{s_1} … u*_{s_l} u_{t_1} … u_{t_l}} = Σ_π Π_{i=1}^{l} E{u_{t_i} u*_{s_π(i)}}, where π denotes a permutation of {1, 2, …, l}.
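Property (3) can be checked by simulation; a minimal sketch for the fourth-order case (k = l = 2), using an arbitrary matrix A to impose correlation on a circularly complex Gaussian vector:

    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

    # u = A w with w circular CN(0, I) is circularly complex Gaussian, R = A A^H
    w = (rng.standard_normal((4, 500_000)) +
         1j * rng.standard_normal((4, 500_000))) / np.sqrt(2)
    u1, u2, u3, u4 = A @ w

    lhs = np.mean(u1.conj() * u2 * u3.conj() * u4)
    rhs = (np.mean(u1.conj() * u2) * np.mean(u3.conj() * u4) +
           np.mean(u1.conj() * u4) * np.mean(u3.conj() * u2))
    print(lhs, rhs)   # agree up to Monte Carlo error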

The last property is called the Gaussian (moment) factoring theorem. A special case is E{u*_1 u_2 u*_3 u_4} = E{u*_1 u_2} E{u*_3 u_4} + E{u*_1 u_4} E{u*_3 u_2}. Linear transformation of a random process: [Figure: u(n) → discrete-time linear filter → x(n).]

For an AR process, u(n) is the output of an all-pole filter driven by white Gaussian noise v(n), so S_u(ω) = σ_v² / |A(e^{jω})|² = σ_v² / |1 + Σ_{k=1}^{M} a_k e^{-jωk}|². Thus, as long as we know the pole positions, we can figure out the PSD of the AR process. [Figure: white Gaussian v(n) → discrete-time all-pole filter → u(n).]
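A minimal sketch evaluating this PSD for example (3) above with scipy.signal.freqz (numerator σ_v, denominator A(z)):

    import numpy as np
    from scipy.signal import freqz

    a = [1.0, -0.975, 0.95]               # A(z) for example (3)
    sigma_v = 1.0
    w, H = freqz([sigma_v], a, worN=512)  # H(e^jw) = sigma_v / A(e^jw)
    psd = np.abs(H) ** 2                  # S(w) = sigma_v^2 / |A(e^jw)|^2
    print(w[np.argmax(psd)])              # peak near the pole angle ~1.047 rad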

Power spectrum analyzer: [Figure: the process is passed through an ideal narrow bandpass filter centered at the frequency of interest, and the average output power is measured.]

Spectrum analysis of stochastic processes: Let u(n) be a random process and let u_N(n) = u(n) for n = 0, 1, …, N-1 and u_N(n) = 0 elsewhere [windowing of u(n)]. The Fourier transform (FT) of u_N(n) and its conjugate are U_N(ω) = Σ_{n=0}^{N-1} u(n) e^{-jωn} and U_N*(ω) = Σ_{k=0}^{N-1} u*(k) e^{jωk}. Then, E{|U_N(ω)|²} = Σ_{n=0}^{N-1} Σ_{k=0}^{N-1} E{u(n) u*(k)} e^{-jω(n-k)} = Σ_n Σ_k r(n-k) e^{-jω(n-k)}.

Let l = n - k. We may rewrite the above formula as (1/N) E{|U_N(ω)|²} = Σ_{l=-(N-1)}^{N-1} (1 - |l|/N) r(l) e^{-jωl}. Thus, lim_{N→∞} (1/N) E{|U_N(ω)|²} = Σ_{l=-∞}^{∞} r(l) e^{-jωl} = S(ω). Thus, the FT of the autocorrelation function is called the power spectral density (PSD) of the process. Let S(ω) be a PSD. Then S(ω)dω corresponds to the average power of the contribution to the total power from components of the process with frequencies located between ω and ω + dω.
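A minimal sketch of this limit: averaging the normalized squared FT magnitude (the periodogram) over many windowed segments approximates S(ω). Segment length and count are arbitrary choices, and the AR(2) of example (3) is reused:

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(5)
    v = rng.standard_normal(512 * 200)
    u = lfilter([1.0], [1.0, -0.975, 0.95], v)     # AR(2) of example (3)

    segs = u.reshape(200, 512)                     # 200 windowed segments, N = 512
    U = np.fft.fft(segs, axis=1)                   # U_N(w) at w_k = 2*pi*k/N
    psd_hat = (np.abs(U) ** 2 / 512).mean(axis=0)  # (1/N) E{|U_N(w)|^2}

    k = np.argmax(psd_hat[:256])
    print(2 * np.pi * k / 512)                     # peak near the pole angle ~1.047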

Property: the PSD of a stationary process is real and nonnegative. Property: the frequency support of the PSD is the Nyquist interval (-π, π]. Property: the PSD of a real stationary process is even (if the process is complex, this is not necessarily true). Property: the total average power of the process is r(0) = (1/2π) ∫_{-π}^{π} S(ω) dω.