2. Stationary Processes and Models


1 2. Stationary Processes and Models
A random variable x is a rule for assigning to every outcome s of an experiment a number x(s). For example: the outcome of tossing a die. For example: the outcome of a survey with responses "very satisfied / satisfied / fair / dissatisfied / very dissatisfied." The outcomes of an experiment are unknown before it is conducted; this is why the variable is called random. In many cases, we can collect many outcomes and characterize the variable using its distribution function, as in the survey example.

2 The distribution function:
Although the variable is random, we can somehow understand its characteristics and predict its outcomes if we know the distribution function. (Figure: bar chart of the survey responses with counts 80, 50, 55, 45, 30.)

3 A random (stochastic) process x(t) is a rule for assigning to every outcome s of an experiment a time function x(t,s).
If t is fixed, x(t,s) is a random variable. If s is fixed, x(t,s) is an outcome (time function) of a particular experiment. If both s and t are fixed, x(t,s) is a number. The collection of time functions over all outcomes s is called the ensemble.

4 Examples: record of a speech signal; record of temperature; received signal in mobile communication.
How to characterize a random process mathematically? -- use statistics.
First-order statistics: the density function of x(t) at each time instant. (Figure: densities f[x(t0)], f[x(t0+1)], f[x(t0+2)] plotted along the time axis t.)

5 Second-order statistics: f[x(t1), x(t2)].
N-th order statistics: f[x(t1), x(t2), ..., x(tN)]. In general, we need infinitely many distribution functions, which are almost impossible to specify. A random process x(t) is called strict-sense stationary if its statistical properties are invariant to a shift of the time origin. Letting time be discrete, we have f[x(n1), x(n2), ..., x(nN)] = f[x(n1+k), x(n2+k), ..., x(nN+k)] for every integer shift k and every order N. Even restricting attention beyond the first-order statistics, we still have infinitely many distributions to specify.

6 A random process is called wide-sense stationary (WSS) if
E{x(n)} = μ (a constant independent of n) and E{x(n) x*(n-l)} = r(l) (a function of the lag l only).
The function r(l) is called the autocorrelation function. Note that r(0) = E{|x(n)|^2} is the average power of x(n). The autocovariance is defined as c(l) = E{[x(n) - μ][x(n-l) - μ]*} = r(l) - |μ|^2. For real signals, we have r(-l) = r(l). For complex signals, we have r(-l) = r*(l).

7 Thus, μ and r(l) completely characterize a WSS process.
Note that the characterization of a random process by its mean and autocorrelation function is not unique: different processes can share the same μ and r(l). Conclusion: in general, we can only partially characterize a small portion of random processes. In real applications, how can we know μ and r(l) of a random process? It is usually impractical to obtain μ and r(l) from the ensemble average, so we may want to use the time average instead. Note that the resulting estimate is a random variable itself, and we must ask whether it converges to the true value.

8 The ensemble and time average:
The ensemble average is μ = E{x(n)}, taken across realizations at a fixed time; the time average over N samples of a single realization is μ̂(N) = (1/N) Σ_{n=0}^{N-1} x(n). We say that the process x(n) is mean ergodic in the mean-square sense if lim_{N→∞} E{|μ̂(N) - μ|^2} = 0.

9 It can be shown that E{|μ̂(N) - μ|^2} = (1/N) Σ_{l=-N+1}^{N-1} (1 - |l|/N) c(l). For N → ∞, |l|/N → 0 for each fixed l. Thus, the condition (1/N) Σ_{l=-N+1}^{N-1} c(l) → 0 implies mean ergodicity: if the process is asymptotically uncorrelated (c(l) → 0 as |l| → ∞), it is mean ergodic in the mean-square sense.
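A minimal numerical sketch of mean ergodicity, assuming a white-noise-plus-constant example process (not from the slides), whose autocovariance c(l) vanishes for l ≠ 0:

```python
import numpy as np

# Assumed example process: x(n) = mu + w(n), w(n) white Gaussian noise,
# so c(l) = 0 for l != 0 and the process is mean ergodic.
rng = np.random.default_rng(1)
mu = 2.0
x = mu + rng.standard_normal(100_000)   # one realization

mu_hat = x.mean()                        # time average over one realization
```

The time average computed from a single realization lands close to the ensemble mean μ = 2, with error shrinking like 1/sqrt(N).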

10 In the same way, a process is said to be correlation ergodic in the mean-square sense if the autocorrelation can be estimated using the time average. Let u(n) = [u(n), u(n-1), ..., u(n-M+1)]^T. The correlation matrix of u(n) is defined as R = E{u(n) u^H(n)}.

11 The correlation matrix plays an important role in adaptive signal processing.
Property 1: the correlation matrix (of a stationary process) is Hermitian, i.e., R^H = R. Property 2: the correlation matrix is Toeplitz. Property 3: the correlation matrix is always nonnegative definite: let y = x^H u(n); then E{|y|^2} = E{y y*} = E{x^H u(n) u^H(n) x} = x^H E{u(n) u^H(n)} x = x^H R x ≥ 0.
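Properties 1-3 can be checked numerically; the sketch below assumes the autocorrelation r(l) = a^|l| of a first-order AR-type process as an example (not from the slides):

```python
import numpy as np

# Assumed example autocorrelation: r(l) = a**|l| (real), so R[i,j] = r(|i-j|).
a, M = 0.8, 4
R = np.array([[a ** abs(i - j) for j in range(M)] for i in range(M)])

hermitian = np.allclose(R, R.conj().T)                    # Property 1: R^H = R
toeplitz = all(                                           # Property 2: constant diagonals
    np.allclose(np.diagonal(R, k), np.diagonal(R, k)[0])
    for k in range(-(M - 1), M)
)
eigvals = np.linalg.eigvalsh(R)                           # Property 3: eigenvalues >= 0
```

Nonnegative definiteness is equivalent to all eigenvalues of the Hermitian matrix R being nonnegative, which is what the last line checks.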

12 Property 4: Let u_B(n) = [u(n-M+1), u(n-M+2), ..., u(n)]^T (the elements of u(n) arranged backward). Then
E{u_B(n) u_B^H(n)} = R^T. Property 5: the correlation matrices R_M and R_{M+1} are related by
R_{M+1} = [ r(0)  r^H ; r  R_M ],  where r^H = [r(1), r(2), ..., r(M)],
and equivalently
R_{M+1} = [ R_M  r^{B*} ; r^{BT}  r(0) ],  where r^{BT} = [r(-M), r(-M+1), ..., r(-1)].

13 This can be easily shown as follows:
In this case, the input vector is decomposed as u_{M+1}(n) = [u(n), u(n-1), ..., u(n-M)]^T = [u(n), u_M^T(n-1)]^T, i.e., the scalar u(n) stacked on top of the M-by-1 vector u_M(n-1). Taking E{u_{M+1}(n) u_{M+1}^H(n)} with this partition gives the first form of R_{M+1}.

14 Similarly, the input vector is decomposed as u_{M+1}(n) = [u_M^T(n), u(n-M)]^T, i.e., the M-by-1 vector u_M(n) stacked on top of the scalar u(n-M); this partition gives the second form of R_{M+1}.

15 Stochastic model: any hypothesis that may be applied to explain or describe the hidden laws that are supposed to govern or constrain the generation of some physical data of interest. This is equivalent to asking: how do we synthesize a stochastic process? Or, how do we characterize a stochastic process? Typically, using a model, we only need a small set of parameters to do the job. (Figure: white Gaussian noise v(n) → linear filter → stochastic process u(n).)

16 Autoregressive (AR) model: u(n) + a_1 u(n-1) + ... + a_M u(n-M) = v(n), where v(n) is white noise.
z-transform representation: U(z) = V(z)/A(z) with A(z) = 1 + a_1 z^{-1} + ... + a_M z^{-M}, an all-pole filter. The AR model is used most often since its parameter identification is easier (it leads to linear equations). Note that the poles (the roots of A(z) = 0) must lie inside the unit circle for stability. Strictly speaking, the AR process is not stationary; if n is large, we say that it is asymptotically stationary.

17 For example: u(n) = a u(n-1) + v(n)
Iterating from rest (u(n) = 0 for n < 0) gives u(n) = Σ_{k=0}^{n} a^k v(n-k). Thus, we have the variance of u(n) as Var{u(n)} = σ_v^2 Σ_{k=0}^{n} a^{2k} = σ_v^2 (1 - a^{2(n+1)})/(1 - a^2), which tends to σ_v^2/(1 - a^2) as n → ∞ when |a| < 1. One can show that the autocorrelation has the same characteristic. Thus, AR processes are asymptotically stationary.

18 The generation of an AR process:
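A minimal sketch of the generation block, assuming the sign convention u(n) + a_1 u(n-1) + ... + a_M u(n-M) = v(n) used above (so the recursion negates the a_k):

```python
import numpy as np

rng = np.random.default_rng(2)

def generate_ar(a, n_samples, sigma_v=1.0):
    """Generate an AR process driven by white Gaussian noise, started at rest."""
    v = sigma_v * rng.standard_normal(n_samples)
    u = np.zeros(n_samples)
    for n in range(n_samples):
        # u(n) = v(n) - a1*u(n-1) - ... - aM*u(n-M)
        u[n] = v[n] - sum(ak * u[n - k]
                          for k, ak in enumerate(a, start=1) if n - k >= 0)
    return u

u = generate_ar([-0.1, -0.8], 5000)   # a1 = -0.1, a2 = -0.8 (a stable choice)
```

For these coefficients the steady-state variance works out to σ_v^2/0.27 ≈ 3.7, which the sample variance approaches once the start-up transient dies out.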

19 Moving Average (MA) model: u(n) = v(n) + b_1 v(n-1) + ... + b_K v(n-K).
z-transform representation: U(z) = B(z) V(z) with B(z) = 1 + b_1 z^{-1} + ... + b_K z^{-K}, an all-zero filter. ARMA model: A(z) U(z) = B(z) V(z), combining both. Computing MA and ARMA model coefficients requires solving systems of nonlinear equations.

20 Generation of ARMA signal:
(Figure: white Gaussian noise v(n) fed through a direct-form filter with delay elements D and adders, producing the ARMA process u(n).)
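The block diagram amounts to filtering white noise with the rational filter B(z)/A(z); a sketch using scipy.signal.lfilter, with illustrative (assumed) coefficients, not values from the slides:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)
a = [1.0, -0.5, 0.25]   # A(z) = 1 - 0.5 z^-1 + 0.25 z^-2 (poles inside unit circle)
b = [1.0, 0.4]          # B(z) = 1 + 0.4 z^-1

v = rng.standard_normal(10_000)   # white Gaussian noise
u = lfilter(b, a, v)              # ARMA process: A(z) U(z) = B(z) V(z)
```

lfilter implements exactly the difference equation a_0 u(n) + a_1 u(n-1) + ... = b_0 v(n) + b_1 v(n-1) + ..., matching the delay-and-add structure in the figure.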

21 Multiplying the AR difference equation by u*(n-l) and taking expectation, we have r(l) + Σ_{k=1}^{M} a_k r(l-k) = 0 for l > 0, since v(n) is uncorrelated with u(n-l) for l > 0. For l = 0, we have r(0) + Σ_{k=1}^{M} a_k r(-k) = σ_v^2.

22 Let l = 1, 2, ..., M. We then obtain a set of linear equations (the Yule-Walker equations) to solve for the a_k's: Σ_{k=1}^{M} a_k r(l-k) = -r(l), l = 1, ..., M.
Once the a_k's are found, we can then find σ_v^2 = r(0) + Σ_{k=1}^{M} a_k r(-k).
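A worked sketch of the Yule-Walker equations for a real AR(2) process with assumed coefficients a1 = -0.1, a2 = -0.8: the true autocorrelation is derived from the recursion, then the coefficients and noise variance are recovered by solving the linear system.

```python
import numpy as np

a1, a2, sigma_v2 = -0.1, -0.8, 1.0

# True autocorrelation from the Yule-Walker recursion (real process, r(-l)=r(l)):
# l=1: r(1) + a1 r(0) + a2 r(1) = 0        =>  r(1)/r(0) = -a1/(1+a2)
# l=2: r(2) = -a1 r(1) - a2 r(0)
# l=0: r(0) + a1 r(1) + a2 r(2) = sigma_v^2
rho1 = -a1 / (1 + a2)
rho2 = -a1 * rho1 - a2
r0 = sigma_v2 / (1 + a1 * rho1 + a2 * rho2)
r = np.array([r0, rho1 * r0, rho2 * r0])        # r(0), r(1), r(2)

# Yule-Walker system: [r(0) r(1); r(1) r(0)] [a1; a2] = -[r(1); r(2)]
R = np.array([[r[0], r[1]], [r[1], r[0]]])
a_hat = np.linalg.solve(R, -r[1:])
sigma_hat = r[0] + a_hat[0] * r[1] + a_hat[1] * r[2]
```

Solving the system returns the coefficients we started from, and the l = 0 equation then yields σ_v^2, confirming the two-step procedure on the slide.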

23 Example AR(2): u(n) + a_1 u(n-1) + a_2 u(n-2) = v(n)
The second-order AR process has the characteristic equation z^2 + a_1 z + a_2 = 0. Thus, we have two poles, and they must be located inside the unit circle. It turns out the following conditions must be satisfied: a_2 < 1, a_1 + a_2 > -1, and a_2 - a_1 > -1. Three examples are considered: (1) a_1 = -0.1, a_2 = -0.8 (poles ≈ 0.946, -0.846); (2) a_1 = 0.1, a_2 = -0.8 (poles ≈ 0.846, -0.946); (3) a_1 = -0.975, a_2 = 0.95 (poles ≈ 0.4875 ± j0.844).
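The pole locations for the three cases can be verified numerically by finding the roots of the characteristic polynomial z^2 + a_1 z + a_2:

```python
import numpy as np

# The three AR(2) examples from the slide: {case: (a1, a2)}.
cases = {1: (-0.1, -0.8), 2: (0.1, -0.8), 3: (-0.975, 0.95)}

# Roots of z^2 + a1*z + a2 for each case.
poles = {k: np.roots([1.0, c[0], c[1]]) for k, c in cases.items()}

all_stable = all(np.all(np.abs(p) < 1) for p in poles.values())
```

Case (3) gives a complex-conjugate pair with magnitude sqrt(a_2) = sqrt(0.95) ≈ 0.975, i.e., poles very close to the unit circle, which explains the strongly oscillatory autocorrelation shown for that case.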

24 Permissible region:
(Figure: the permissible (stability) region in the (a_1, a_2)-plane, the triangle bounded by a_2 = 1, a_1 + a_2 = -1, and a_2 - a_1 = -1.)

25 Samples:
(Figure: sample realizations; one panel shows the input white noise.)

26 The autocorrelation for (1)

27 The autocorrelation for (2)

28 The autocorrelation for (3)

29 Is an AR process stationary?
No, it is non-stationary, but it is asymptotically stationary. This can be clearly seen from its recursive equation. For example, for a first-order AR process u(n) = a u(n-1) + v(n) started from rest, we have u(n) = Σ_{k=0}^{n} a^k v(n-k). We then have Var{u(n)} = σ_v^2 (1 - a^{2(n+1)})/(1 - a^2). As we can see, even the variance is not a constant in the AR process. However, as n approaches infinity (with |a| < 1), Var{u(n)} → σ_v^2/(1 - a^2) and the process becomes stationary.
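The variance formula can be evaluated directly to watch the convergence; a sketch assuming a = 0.9 and unit noise variance:

```python
import numpy as np

# Time-varying variance of an AR(1) process started at rest:
# Var{u(n)} = sigma_v^2 * (1 - a^(2(n+1))) / (1 - a^2).
a, sigma_v2 = 0.9, 1.0
n = np.arange(200)
var_n = sigma_v2 * (1 - a ** (2 * (n + 1))) / (1 - a ** 2)

var_inf = sigma_v2 / (1 - a ** 2)   # asymptotic (stationary) variance
```

Var{u(0)} = σ_v^2, the sequence increases monotonically with n, and it saturates at σ_v^2/(1 - a^2): the process is non-stationary at start-up but asymptotically stationary.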

30 Order selection: given a random process, how can we select an order for a modeling AR process?
An information-theoretic (AIC) criterion: AIC(m) = -2 ln f_U(u | θ̂_m) + 2m, where m is the order, u_i = u(i), i = 1, 2, ..., N, are the observations, and θ̂_m are the estimated parameters. Minimum description length (MDL) criterion: MDL(m) = -ln f_U(u | θ̂_m) + (1/2) m ln N, where N is the number of data points. In both criteria, the order m minimizing the criterion is selected.
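A sketch of order selection, using the common Gaussian simplification in which -2 ln(likelihood) reduces to N ln σ̂^2(m) up to a constant; the least-squares AR fit below is an assumed stand-in for maximum likelihood, and the data come from a known AR(2) process:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 4000
u = np.zeros(N)
v = rng.standard_normal(N)
for n in range(2, N):                       # true model: AR(2)
    u[n] = 0.1 * u[n - 1] + 0.8 * u[n - 2] + v[n]

def fit_ar(u, m):
    """Least-squares AR(m) fit; returns the residual (noise) variance estimate."""
    X = np.column_stack([u[m - k:len(u) - k] for k in range(1, m + 1)])
    y = u[m:]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.mean((y - X @ w) ** 2)

orders = range(1, 7)
sig2 = {m: fit_ar(u, m) for m in orders}
aic = {m: N * np.log(sig2[m]) + 2 * m for m in orders}
mdl = {m: 0.5 * N * np.log(sig2[m]) + 0.5 * m * np.log(N) for m in orders}
best_aic = min(aic, key=aic.get)
best_mdl = min(mdl, key=mdl.get)
```

MDL's heavier penalty (m ln N / 2 versus AIC's 2m) makes it a consistent order estimator, so it recovers the true order m = 2 here, while AIC may occasionally prefer a slightly larger order.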

31 Let u = [u(1), u(2), ..., u(N)]^T denote a complex Gaussian process consisting of N samples, with:
a mean of zero, and an autocorrelation function denoted by r(l); the set of autocorrelation values defines the correlation matrix R of the Gaussian process u(n). The density function is f_U(u) = (1/(π^N det R)) exp(-u^H R^{-1} u). Note that f_U(u) is 2N-dimensional (N real parts and N imaginary parts). We use N(0, R) to denote a Gaussian process with zero mean and correlation matrix R.

32 Properties: The process u(n) is stationary in the strict sense.
The process is circularly complex; it is also referred to as a circularly complex Gaussian process. Let u_n = u(n), n = 1, 2, ..., N, denote samples of the Gaussian process. If k ≠ l, E{u*_{n_1} ... u*_{n_k} u_{m_1} ... u_{m_l}} = 0, and if k = l, E{u*_{n_1} ... u*_{n_l} u_{m_1} ... u_{m_l}} = Σ_π Π_{j=1}^{l} E{u_{m_j} u*_{n_{π(j)}}}, where π denotes a permutation of {1, 2, ..., l}.

33 The last property is called the Gaussian (moment) factorization theorem. A special case is E{u*_1 u*_2 u_3 u_4} = E{u_3 u*_1} E{u_4 u*_2} + E{u_4 u*_1} E{u_3 u*_2}.
Linear transformation of a random process: (Figure: u(n) → discrete-time linear filter → x(n).)

34 For an AR process: S_u(ω) = σ_v^2 / |A(e^{jω})|^2, where A(e^{jω}) = 1 + Σ_{k=1}^{M} a_k e^{-jωk}. Thus, as long as we know the poles' positions, we can figure out the PSD of the AR process. (Figure: white Gaussian noise v(n) → discrete-time all-pole filter → u(n).)
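A sketch evaluating S_u(ω) = σ_v^2/|A(e^{jω})|^2 on a frequency grid, using the coefficients of example (1) (a_1 = -0.1, a_2 = -0.8):

```python
import numpy as np

a = np.array([1.0, -0.1, -0.8])   # A(z) = 1 - 0.1 z^-1 - 0.8 z^-2
sigma_v2 = 1.0

w = np.linspace(-np.pi, np.pi, 1024)
A = sum(ak * np.exp(-1j * w * k) for k, ak in enumerate(a))
psd = sigma_v2 / np.abs(A) ** 2
```

The resulting PSD is real, positive, and even (the process is real), and its average over the Nyquist interval equals the process power r(0) = σ_v^2/0.27 ≈ 3.7, consistent with the Yule-Walker result for these coefficients. The sharp peak near ω = 0 reflects the pole at z ≈ 0.946, close to the unit circle.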

35 Power spectrum analyzer:
(Figure: measuring the power at each frequency by passing the process through an ideal bandpass filter.)

36 Spectrum analysis of stochastic processes: Let u(n) be a random process and u_N(n) = u(n) for n = 0, 1, ..., N-1, u_N(n) = 0 otherwise [windowing of u(n)]. The Fourier transform (FT) of u_N(n) and its conjugate are U_N(ω) = Σ_{n=0}^{N-1} u(n) e^{-jωn} and U*_N(ω) = Σ_{k=0}^{N-1} u*(k) e^{jωk}. Then, (1/N) E{|U_N(ω)|^2} = (1/N) Σ_{n=0}^{N-1} Σ_{k=0}^{N-1} r(n-k) e^{-jω(n-k)}.

37 Let l = n - k. We may rewrite the above formula as
(1/N) E{|U_N(ω)|^2} = Σ_{l=-(N-1)}^{N-1} (1 - |l|/N) r(l) e^{-jωl}. Thus, lim_{N→∞} (1/N) E{|U_N(ω)|^2} = Σ_{l=-∞}^{∞} r(l) e^{-jωl} ≡ S(ω). Thus, the FT of the autocorrelation function is called the power spectral density (PSD) of the process. Let S(ω) be a PSD. Then S(ω) dω/(2π) corresponds to the average power of the contribution to the total power from components of the process with frequencies located between ω and ω + dω.
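A numerical sketch of this limit: averaging periodograms (1/N)|U_N(ω)|^2 over many independent realizations of an assumed MA(1) example approaches its PSD S(ω) = |1 + 0.5 e^{-jω}|^2 = 1.25 + cos ω:

```python
import numpy as np

rng = np.random.default_rng(5)
n_real, N = 500, 256

# Assumed example process: MA(1), u(n) = v(n) + 0.5 v(n-1), v white Gaussian.
v = rng.standard_normal((n_real, N + 1))
u = v[:, 1:] + 0.5 * v[:, :-1]

U = np.fft.fft(u, axis=1)                        # U_N at w_k = 2*pi*k/N
periodogram = np.mean(np.abs(U) ** 2, axis=0) / N   # ensemble-averaged (1/N)|U_N|^2

w = 2 * np.pi * np.arange(N) / N
psd_true = 1.25 + np.cos(w)                      # |1 + 0.5 e^{-jw}|^2
```

The averaged periodogram tracks the true PSD across all frequency bins, up to the O(1/N) windowing bias and the residual estimation noise.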

38 Property: The PSD of a stationary process is real and nonnegative.
Property: The frequency support of the PSD is the Nyquist interval (-π, π]. Property: The PSD of a real stationary process is even (if the process is complex, this is not necessarily true).

