1 ELG5377 Adaptive Signal Processing Lecture 7: Stochastic Models: Moving Average (MA), Autoregressive (AR) and ARMA

2 Stochastic Models
The term model is used for any hypothesis that may be applied to describe the hidden laws that govern the generation of physical data.
A time series u(k) consisting of highly correlated observations may be generated by applying a series of statistically independent "shocks" to a linear filter.
– The shocks are drawn from a fixed distribution (usually Gaussian) with zero mean and constant variance.
– This time series of shocks is v(k).
– E[v(k)v*(k-i)] = σ_v² for i = 0, and 0 for i ≠ 0.
[Block diagram: v(k) → discrete-time linear filter → u(k)]
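A minimal sketch (not part of the original slides) of this generation mechanism, assuming NumPy and SciPy are available; the filter coefficients below are arbitrary illustrative values:

```python
# Illustrative sketch: correlated time series u(k) produced by passing
# statistically independent Gaussian "shocks" v(k) through a
# discrete-time linear filter. Coefficients are arbitrary examples.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)
sigma_v = 1.0                                # shock standard deviation (assumed)
v = sigma_v * rng.standard_normal(10_000)    # white, zero-mean shocks v(k)

b = [1.0, 0.5, 0.2]                          # feed-forward taps (assumed values)
a = [1.0, -0.6]                              # feedback taps, a_0 = 1 (assumed)
u = lfilter(b, a, v)                         # highly correlated output u(k)
```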

3 Types of Models
Moving Average (MA) Model: u(k) = Σ_i b_i v(k-i)
– FIR filter implementation.
Autoregressive (AR) Model: Σ_i a_i u(k-i) = v(k) (usually a_0 = 1), i.e. u(k) = v(k) − Σ_{i≠0} a_i u(k-i)
– IIR filter implementation.
ARMA: Σ_i a_i u(k-i) = Σ_i b_i v(k-i)
– Cascade of FIR and IIR filters.
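A short sketch (not from the slides) showing the three model types as filtering operations on the same shock sequence; the coefficient values are assumptions chosen only for illustration:

```python
# MA, AR and ARMA realizations driven by the same white shocks v(k).
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(1)
v = rng.standard_normal(5_000)

b = [1.0, 0.4, 0.3]                 # b_i: feed-forward (MA) coefficients, assumed
a = [1.0, -0.7, 0.2]                # a_i: feedback (AR) coefficients, a_0 = 1, assumed

u_ma   = lfilter(b, [1.0], v)       # MA:   u(k) = sum_i b_i v(k-i)          (FIR)
u_ar   = lfilter([1.0], a, v)       # AR:   sum_i a_i u(k-i) = v(k)          (IIR)
u_arma = lfilter(b, a, v)           # ARMA: sum_i a_i u(k-i) = sum_i b_i v(k-i)
```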

4 Autoregressive Models
The time series u(k), u(k-1), …, u(k-M) represents the realization of an autoregressive model of order M if it satisfies the difference equation
– u(k) + a_1 u(k-1) + … + a_M u(k-M) = v(k),
or equivalently
– u(k) = w_1 u(k-1) + … + w_M u(k-M) + v(k), where w_i = -a_i.
It is called an autoregressive model since u(k) is regressed on previous values of itself.

5 Correlation function of an asymptotically stationary AR process
Starting with Σ_{i=0}^{M} a_i u(k-i) = v(k):
– Multiply both sides by u*(k-l) and take the expectation: E[Σ_i a_i u(k-i)u*(k-l)] = E[v(k)u*(k-l)].
– The right side of the equation is 0 for l > 0, since v(k) is uncorrelated with past outputs. The left side is Σ_i a_i r(l-i).
– Therefore, for l > 0, Σ_{i=0}^{M} a_i r(l-i) = 0 (with a_0 = 1).
– Equivalently, r(l) = Σ_{i=1}^{M} w_i r(l-i) for l > 0.
– We want to find w_1, w_2, …, w_M.
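As a numerical sanity check (not from the slides), one can simulate an AR process and verify that its estimated autocorrelation satisfies the recursion r(l) = Σ_i w_i r(l-i) for l > 0; the AR(2) coefficients below are arbitrary:

```python
# Verify r(l) = w_1 r(l-1) + w_2 r(l-2) for l > 0 on a simulated AR(2) process.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(2)
w = np.array([0.5, -0.3])                     # w_i = -a_i (illustrative values)
v = rng.standard_normal(200_000)
u = lfilter([1.0], np.r_[1.0, -w], v)         # AR(2) realization

def r_hat(x, lag):
    """Biased sample autocorrelation estimate of r(lag)."""
    n = len(x)
    return np.dot(x[lag:], x[:n - lag]) / n

for l in (1, 2, 3, 4):
    lhs = r_hat(u, l)
    rhs = sum(w[i] * r_hat(u, abs(l - 1 - i)) for i in range(len(w)))
    print(l, round(lhs, 4), round(rhs, 4))    # lhs ≈ rhs for every l > 0
```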

6 Coefficients of AR model
Recall that r(-x) = r*(x).
Writing r(l) = Σ_{i=1}^{M} w_i r(l-i) for l = 1, 2, …, M, taking the complex conjugate of both sides, and collecting the M equations in matrix form, we get the Yule-Walker equations:
– r = Rw.

7 Coefficients of AR model 2
r = [r*(1) r*(2) … r*(M)]^T and w = [w_1 w_2 … w_M]^T.
R is the M×M correlation matrix of u(k).
Therefore w = R^{-1} r, with a_0 = 1 and a_i = -w_i.
Next, let l = 0.
– The right side: E[v(k)u*(k)] = E[v(k)(Σ_i w_i u(k-i) + v(k))*] = E[v(k)v*(k)] = σ_v².
– The left side becomes Σ_{i=0}^{M} a_i r*(i), which equals Σ_i a_i r(i) for a real-valued process.
Therefore σ_v² = Σ_{i=0}^{M} a_i r(i).
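A compact sketch (not part of the slides) of this solution, using a hypothetical helper yule_walker() that assumes real-valued correlation values r(0), …, r(M):

```python
# Solve the Yule-Walker equations w = R^{-1} r and recover sigma_v^2 = sum_i a_i r(i).
import numpy as np
from scipy.linalg import toeplitz

def yule_walker(r):
    """r = [r(0), r(1), ..., r(M)], real-valued. Returns (w, sigma_v2)."""
    r = np.asarray(r, dtype=float)
    R = toeplitz(r[:-1])                # M x M correlation matrix of u(k)
    w = np.linalg.solve(R, r[1:])       # w = R^{-1} r, with r = [r(1), ..., r(M)]^T
    a = np.r_[1.0, -w]                  # a_0 = 1, a_i = -w_i
    sigma_v2 = a @ r                    # sigma_v^2 = sum_{i=0}^{M} a_i r(i)
    return w, sigma_v2
```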

8 AR model example
Find a third-order AR model that produces a process with the following correlation function:
– r(i) = sinc(i/2)
Solution
– M = 3.
– r(0) = 1, r(1) = 0.637, r(2) = 0, r(3) = -0.212.
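Applying the hypothetical yule_walker() helper sketched on the previous slide to this example (np.sinc is the normalized sinc, sin(πx)/(πx)):

```python
import numpy as np

r = np.sinc(np.arange(4) / 2)        # [1.0, 0.6366, 0.0, -0.2122]
w, sigma_v2 = yule_walker(r)         # helper sketched on the previous slide
print(np.round(w, 3))                # approx [ 1.545 -1.426  0.696]
print(round(sigma_v2, 3))            # approx 0.164
```

The slide's values (1.552, -1.436, 0.703 and σ_v² ≈ 0.160) are obtained if r(1) and r(3) are first rounded to 0.637 and -0.212.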

9 AR model example continued
w = R^{-1} r.
w = [1.552, -1.436, 0.703]^T.
a_0 = 1, a_1 = -1.552, a_2 = 1.436 and a_3 = -0.703.
σ_v² = Σ_{i=0}^{3} a_i r(i) ≈ 0.160.
[Block diagram: AR(3) synthesis filter — feedback taps w_1, w_2, w_3 acting on delayed outputs, summed with v(k) to produce u(k)]
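A sketch (not from the slides) that synthesizes u(k) from the fitted AR(3) model and checks that its sample autocorrelation approaches sinc(i/2):

```python
# Generate u(k) = w_1 u(k-1) + w_2 u(k-2) + w_3 u(k-3) + v(k) and estimate r(l).
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)
w = np.array([1.552, -1.436, 0.703])         # taps from slide 9
v = np.sqrt(0.160) * rng.standard_normal(500_000)
u = lfilter([1.0], np.r_[1.0, -w], v)        # AR(3) synthesis filter

n = len(u)
r_est = [np.dot(u[l:], u[:n - l]) / n for l in range(4)]
print(np.round(r_est, 3))                    # approx [1.0, 0.637, 0.0, -0.212]
```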

10 Applying autoregressive models to nonstationary systems
Let us consider a first-order autoregressive model.
– w(k) = b_1 w(k-1) + v(k).
– We need r(0) and r(1).
– b_1 = r(1)/r(0).
– σ_v² = r(0) - r²(1)/r(0).
Next, let us consider the relationship between x(k) and d(k) in a stationary system.
– d(k) = y_o(k) + e_o(k) = w_o^H x(k) + e_o(k).
Suppose that w_o is time varying. Then the cross-correlation between d(k) and x(k) would also be time-varying.
– Nonstationary system.
– Can be represented as d(k) = w_o^H(k) x(k) + e_o(k), where w_o(k) = a·w_o(k-1) + ω(k), with ω(k) a zero-mean process-noise term.
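A minimal sketch (not part of the slides) of such a nonstationary system, in which the optimum weight vector drifts according to the first-order model w_o(k) = a·w_o(k-1) + ω(k); the parameter values and filter length are assumptions for illustration:

```python
# Desired response d(k) generated by a slowly drifting optimum weight vector.
import numpy as np

rng = np.random.default_rng(4)
M, N = 4, 2_000                         # filter length and number of samples (assumed)
a = 0.999                               # close to 1: slowly varying system
sigma_omega, sigma_e = 0.01, 0.1        # process- and measurement-noise levels (assumed)

x = rng.standard_normal((N, M))         # input vectors x(k)
w_o = np.zeros(M)                       # optimum weights at k = 0
d = np.empty(N)
for k in range(N):
    w_o = a * w_o + sigma_omega * rng.standard_normal(M)   # w_o(k) = a w_o(k-1) + omega(k)
    d[k] = w_o @ x[k] + sigma_e * rng.standard_normal()    # d(k) = w_o^H(k) x(k) + e_o(k)
```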

