
1 System Identification
Ali Karimpour, Assistant Professor, Ferdowsi University of Mashhad
Reference: "System Identification: Theory for the User", Lennart Ljung

2 lecture 9 Ali Karimpour Nov 2010
Lecture 9: Asymptotic Distribution of Parameter Estimators
Topics to be covered include:
- Central Limit Theorem
- The Prediction-Error Approach: Basic Theorem
- Expression for the Asymptotic Variance
- Frequency-Domain Expressions for the Asymptotic Variance
- Distribution of Estimates for the Correlation Approach
- Distribution of Estimates for the Instrumental Variable Methods

3 Overview
If convergence is guaranteed, then the estimate tends to a limit: as N grows, the estimate approaches a limiting value θ*. But how fast does the estimate approach the limit? What is the probability distribution of the estimate θ̂_N?
The variance analysis of this chapter will reveal that:
a) The estimate converges to θ* at a rate proportional to 1/√N.
b) The distribution of √N(θ̂_N − θ*) converges to a Gaussian distribution N(0, Q).
c) The asymptotic covariance depends on:
- the number of samples (data set size) N,
- the parameter sensitivity of the predictor, ψ(t, θ) = (d/dθ) ŷ(t|θ),
- the noise variance λ0.


5 Central Limit Theorem

6 Central Limit Theorem
The mathematical tool needed for the asymptotic variance analysis is the central limit theorem. Example: Consider two independent random variables, X and Y, with the same uniform distribution, shown in the figure below. Define another random variable Z as the sum of X and Y: Z = X + Y. We can obtain the distribution of Z as the convolution of the two uniform densities, which is a triangular distribution.

7 Further, consider the sum of three such variables, W = X + Y + Z. The resultant PDF is even closer to a Gaussian distribution. In general, the PDF of a sum of N independent random variables approaches a Gaussian distribution, regardless of the PDF of each term, as N gets larger.

8 Central Limit Theorem
Let x(t), t = 1, 2, …, be a sequence of independent, identically distributed d-dimensional random variables with:
Mean: m = E x(t),  Cov: P = Cov x(t).
Consider the normalized sum of x(t) given by:
Y_N = (1/√N) Σ_{t=1}^{N} (x(t) − m).
Then, as N tends to infinity, the distribution of Y_N converges to the Gaussian distribution N(0, P), given by the PDF:
f(y) = (2π)^(−d/2) (det P)^(−1/2) exp(−½ yᵀ P^(−1) y).
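The convergence described above is easy to check numerically. The following sketch (the uniform interval, sample sizes, and seed are our own illustrative choices, not from the slides) draws many realizations of the normalized sum of N independent Uniform(−0.5, 0.5) variables, whose mean is 0 and variance is 1/12, and checks that the normalized sum has approximately that mean and variance, as the theorem predicts:

```python
import numpy as np

rng = np.random.default_rng(0)

# X_i ~ Uniform(-0.5, 0.5): mean 0, variance 1/12.
M, N = 100_000, 30          # M realizations of a sum of N variables
x = rng.uniform(-0.5, 0.5, size=(M, N))

# Normalized sum Y_N = (1/sqrt(N)) * sum_i (x_i - mean); the mean is 0 here.
y_N = x.sum(axis=1) / np.sqrt(N)

# By the CLT, Y_N is approximately N(0, 1/12) even though each term is uniform.
print(y_N.mean(), y_N.var())
```

A histogram of `y_N` would show the familiar bell shape despite the flat distribution of each term.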

9 The Prediction-Error Approach: Basic Theorem

10 The Prediction-Error Approach
Let θ̂_N be an estimate based on the prediction-error method:
θ̂_N = arg min V_N(θ, Z^N),  V_N(θ, Z^N) = (1/N) Σ_{t=1}^{N} ½ ε²(t, θ).
Applying the Central Limit Theorem, we can obtain the distribution of the estimate as N tends to infinity. With prime denoting differentiation with respect to θ, expanding V_N′(θ̂_N, Z^N) around θ* gives:
0 = V_N′(θ̂_N, Z^N) = V_N′(θ*, Z^N) + V_N″(ξ_N, Z^N)(θ̂_N − θ*),
where ξ_N is a vector "between" θ̂_N and θ*.

11 The Prediction-Error Approach
Assume that V_N″(ξ_N, Z^N) is nonsingular; then:
θ̂_N − θ* = −[V_N″(ξ_N, Z^N)]^(−1) V_N′(θ*, Z^N),
where, as usual:
V_N′(θ, Z^N) = −(1/N) Σ_{t=1}^{N} ψ(t, θ) ε(t, θ),  ψ(t, θ) = (d/dθ) ŷ(t|θ).
To obtain the distribution of θ̂_N, the limits of V_N′(θ*, Z^N) and V_N″(ξ_N, Z^N) must be computed as N tends to infinity.

12 The Prediction-Error Approach
For simplicity, we first assume that the predictor is given by a linear regression:
ŷ(t|θ) = φᵀ(t) θ.
The actual data are generated by:
y(t) = φᵀ(t) θ0 + e(t)  (θ0 is the parameter vector of the true system),
so ε(t, θ0) = e(t) and ψ(t, θ) = φ(t). Therefore:
√N V_N′(θ0, Z^N) = −(1/√N) Σ_{t=1}^{N} φ(t) e(t).

13 Let us treat √N V_N′(θ0, Z^N) as a random variable. Its mean is zero, since e(t) is independent of φ(t):
E φ(t) e(t) = 0.
The covariance is:
Cov[√N V_N′(θ0, Z^N)] → Q = λ0 E φ(t) φᵀ(t),  where λ0 is the variance of e(t).
Applying the Central Limit Theorem:
√N V_N′(θ0, Z^N) → N(0, Q).

14 The Prediction-Error Approach
Next, compute V_N″(θ, Z^N). For the linear regression:
V_N″(θ, Z^N) = (1/N) Σ_{t=1}^{N} φ(t) φᵀ(t),
and, as N → ∞:
V_N″(θ, Z^N) → R = E φ(t) φᵀ(t).
Exercise 1: Prove (I).

15 The Prediction-Error Approach
We obtain:
√N (θ̂_N − θ0) → N(0, P),
so:
P = R^(−1) Q R^(−1) = λ0 R^(−1) = λ0 [E φ(t) φᵀ(t)]^(−1).

16 The Prediction-Error Approach
Theorem. Consider the estimate θ̂_N determined by:
θ̂_N = arg min V_N(θ, Z^N).
Assume that the model structure is linear and uniformly stable and that the data set is subject to D1. Assume also that for a unique value θ* interior to D_M we have:
V̄′(θ*) = 0  and  √N V_N′(θ*, Z^N) → N(0, Q).
Then:
√N (θ̂_N − θ*) → N(0, P_θ),  where  P_θ = [V̄″(θ*)]^(−1) Q [V̄″(θ*)]^(−1).

17 The Prediction-Error Approach
As stated formally in the previous theorem, the distribution of √N(θ̂_N − θ*) converges to a Gaussian distribution for a broad class of system identification problems. This implies that the covariance of θ̂_N asymptotically converges to P_θ/N. P_θ is called the asymptotic covariance matrix, and it depends on:
(a) the number of samples (data set size) N,
(b) the parameter sensitivity of the predictor, ψ(t, θ) = (d/dθ) ŷ(t|θ),
(c) the noise variance λ0.

18 Expression for the Asymptotic Variance

19 Expression for the Asymptotic Variance
Let us compute the covariance once again for the general case of a predictor ŷ(t|θ), with:
ε(t, θ) = y(t) − ŷ(t|θ),  ψ(t, θ) = (d/dθ) ŷ(t|θ).
Unlike the linear regression case, the sensitivity ψ(t, θ) is now a function of θ.

20 Expression for the Asymptotic Variance
So:
√N V_N′(θ0, Z^N) → N(0, Q),  Q = λ0 E ψ(t, θ0) ψᵀ(t, θ0).
Similarly:
V_N″(θ0, Z^N) → R = E ψ(t, θ0) ψᵀ(t, θ0)
(the term involving ε(t, θ0) = e(t) vanishes in the limit, since e(t) is independent of past data). Hence:
√N (θ̂_N − θ0) → N(0, P_θ),  P_θ = λ0 [E ψ(t, θ0) ψᵀ(t, θ0)]^(−1).
The asymptotic variance is therefore:
a) inversely proportional to the number of samples N,
b) proportional to the noise variance λ0, and
c) inversely related to the parameter sensitivity ψ.

21 Expression for the Asymptotic Variance
A very important and useful aspect of these expressions for the asymptotic covariance matrix is that it can be estimated from data. Since θ0 is not known, the asymptotic variance cannot be determined exactly; but having N data points and having determined θ̂_N, we may use:
λ̂_N = (1/N) Σ_{t=1}^{N} ε²(t, θ̂_N),
P̂_N = λ̂_N [ (1/N) Σ_{t=1}^{N} ψ(t, θ̂_N) ψᵀ(t, θ̂_N) ]^(−1).
In this way, an estimate of how many data samples are sufficient for a desired model accuracy may be obtained.
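As a sketch of how this plug-in covariance estimate is formed in practice (the first-order ARX system, parameter values, and seed below are our own illustrative choices, not from the slides), one can fit a linear regression by least squares and insert the residual variance and the regressor covariance into the plug-in formula:

```python
import numpy as np

rng = np.random.default_rng(2)
N, a0, b0, lam0 = 5000, 0.5, 1.0, 0.1

# Simulate y(t) = -a0*y(t-1) + b0*u(t-1) + e(t), u and e white.
u = rng.standard_normal(N)
e = rng.normal(0.0, np.sqrt(lam0), N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = -a0 * y[t - 1] + b0 * u[t - 1] + e[t]

# Linear regression y(t) = phi(t)^T theta + e(t), phi(t) = [-y(t-1), u(t-1)].
Phi = np.column_stack([-y[:-1], u[:-1]])
Y = y[1:]
theta_hat, *_ = np.linalg.lstsq(Phi, Y, rcond=None)

# Plug-in estimates: residual variance and regressor covariance.
eps = Y - Phi @ theta_hat
lam_hat = eps @ eps / len(Y)                 # estimate of lambda_0
R_hat = Phi.T @ Phi / len(Y)                 # (1/N) sum psi psi^T
P_hat = lam_hat * np.linalg.inv(R_hat)       # asymptotic covariance of sqrt(N)(theta_hat - theta0)

std_err = np.sqrt(np.diag(P_hat / len(Y)))   # estimated std of theta_hat itself
print(theta_hat, std_err)
```

The estimated standard errors shrink as 1/√N, which is exactly how one can judge whether a given data length is sufficient for a required accuracy.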

22 Example: Covariance of LS Estimates
Consider the system:
y(t) + a0 y(t−1) = b0 u(t−1) + e(t),
where e(t) and u(t) are two independent white noises with variances λ0 and μ, respectively. Suppose that the coefficient b0 of u(t−1) is known and the system is identified in the model structure:
y(t) + a y(t−1) = b0 u(t−1) + e(t).
We have ŷ(t|a) = −a y(t−1) + b0 u(t−1), or ψ(t) = −y(t−1). Hence:
Var â_N ≈ (λ0/N) [Ē y²(t)]^(−1).

23 Example: Covariance of LS Estimates
Hence Var â_N ≈ λ0 / (N Ē y²(t)). To compute this, square the first equation and take the expectation:
Ē y²(t) = a0² Ē y²(t) + b0² μ + λ0.
Multiplying the first equation by y(t−1) and taking expectation shows that the cross terms vanish: E y(t−1) e(t) = 0 and E y(t−1) u(t−1) = 0, where the last equality follows since u(t) does not affect y(t) until after the time delay. Hence:
Ē y²(t) = (b0² μ + λ0)/(1 − a0²),  so  Var â_N ≈ (λ0/N)(1 − a0²)/(b0² μ + λ0).

24 Example: Covariance of LS Estimates
The actual value of a is taken as a0 = 0.1 for the previous example. Both e(t) and u(t) are white noise with variance 1, and the number of data points is N = 10. The estimates of a from 100 independent experiments using LS are shown in the figure below. The empirical result is:
Cov(â) = 0.047,
consistent with the theoretical value (λ0/N)(1 − a0²)/(μ + λ0) ≈ 0.05.
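The experiment on this slide can be reproduced with a short Monte Carlo simulation (a sketch; the seed and the use of more than 100 repetitions to stabilize the covariance estimate are our own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
a0, lam0, mu, N, reps = 0.1, 1.0, 1.0, 10, 5000

est = np.empty(reps)
for k in range(reps):
    u = rng.normal(0.0, np.sqrt(mu), N)
    e = rng.normal(0.0, np.sqrt(lam0), N)
    y = np.zeros(N)
    for t in range(1, N):
        y[t] = -a0 * y[t - 1] + u[t - 1] + e[t]
    # LS estimate of a with the u(t-1) coefficient known (= 1):
    # minimize sum over t of (y(t) + a*y(t-1) - u(t-1))^2.
    est[k] = -np.sum(y[:-1] * (y[1:] - u[:-1])) / np.sum(y[:-1] ** 2)

print(np.var(est))   # theory: (lam0/N)*(1 - a0**2)/(mu + lam0) ~ 0.05
```

Raising `mu` to 10 or `lam0` to 10 in this script reproduces the trends reported on the next two slides.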

25 Example: Covariance of LS Estimates
Now we increase the variance of u(t) from 1 to 10, keeping the other parameters the same. The empirical result is:
Cov(â) = 0.014.
It can be seen that Cov(â) decreases as the input variance increases, as we expect.

26 Example: Covariance of LS Estimates
Now we increase the variance of e(t) from 1 to 10, set the variance of u(t) back to 1, and repeat the first experiment with a0 = 0.1. The empirical result is:
Cov(â) = 0.120.
It can be seen that Cov(â) increases as the noise variance increases, as we expect.

27 Example: Covariance of an MA(1) Parameter
Consider the system:
y(t) = e(t) + c0 e(t−1),
where e(t) is white noise with variance λ0. The MA(1) model structure is used:
y(t) = e(t) + c e(t−1).
Given the predictor (4.18), ŷ(t|c) = [c q^(−1)/(1 + c q^(−1))] y(t), differentiation w.r.t. c gives the sensitivity ψ(t, c). At c = c0 we have:
E ψ²(t, c0) = λ0/(1 − c0²).
If ĉ_N is the PEM estimate of c:
Var ĉ_N ≈ (1 − c0²)/N.
Exercise 2: Prove (I).
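The claimed variance Var ĉ_N ≈ (1 − c0²)/N can be checked by Monte Carlo (a sketch: the prediction errors are computed with the inverse filter ε(t) = y(t) − c ε(t−1), and the values c0 = 0.5, N = 200, the seed, and the number of repetitions are our own choices, not from the slides):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.signal import lfilter

def pem_ma1(y):
    """PEM estimate of c in y(t) = e(t) + c*e(t-1)."""
    def cost(c):
        # eps(t) = y(t) - c*eps(t-1): the inverse (prediction-error) filter.
        eps = lfilter([1.0], [1.0, c], y)
        return np.dot(eps, eps)
    return minimize_scalar(cost, bounds=(-0.99, 0.99), method="bounded").x

rng = np.random.default_rng(3)
c0, N, reps = 0.5, 200, 300

est = np.empty(reps)
for k in range(reps):
    e = rng.standard_normal(N + 1)
    y = e[1:] + c0 * e[:-1]          # MA(1) data
    est[k] = pem_ma1(y)

print(np.var(est), (1 - c0**2) / N)  # empirical variance vs. theory ~ 0.00375
```

The empirical spread of the 300 estimates lands close to the theoretical value, as the asymptotic formula predicts.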

28 Asymptotic Variance for General Norms
For a general norm l(ε), we have:
V_N(θ, Z^N) = (1/N) Σ_{t=1}^{N} l(ε(t, θ)),
and similarly V_N′ and V_N″ can be computed. The expression for the asymptotic covariance matrix is rather complicated in general.

29 After straightforward calculations:
P_θ = κ [E ψ(t, θ0) ψᵀ(t, θ0)]^(−1),  where  κ = E[l′(e(t))]² / [E l″(e(t))]².
The choice of norm l(·) in the criterion thus only acts as a scaling of the covariance matrix.
Exercise 3: Prove (I).

30 Frequency-Domain Expressions for the Asymptotic Variance

31 Frequency-Domain Expressions for the Asymptotic Variance
The asymptotic variance has a different expression in the frequency domain, which we will find useful for variance analysis and experiment design. Let the transfer function and the noise model be consolidated into a matrix:
T(q, θ) = [G(q, θ)  H(q, θ)].
The gradient of T, that is, the sensitivity of T to θ, is:
T′(q, θ) = (d/dθ) T(q, θ).
For a predictor, we have already defined W(q, θ) and z(t) such that:
ŷ(t|θ) = W(q, θ) z(t),  z(t) = [u(t)  y(t)]ᵀ.

32 Frequency-Domain Expressions for the Asymptotic Variance
Therefore the predictor sensitivity is given by:
ψ(t, θ) = W′(q, θ) z(t),
where W′(q, θ) = (d/dθ) W(q, θ). Substituting the expression for W into the first equation, the sensitivity can be rewritten in terms of T′; at the true system this gives:
ψ(t, θ0) = H0^(−1)(q) T′(q, θ0) χ0(t),  with  χ0(t) = [u(t)  e0(t)]ᵀ.

33 Frequency-Domain Expressions for the Asymptotic Variance
At θ = θ0 (the true system), note that ε(t, θ0) = e0(t) and χ0(t) = [u(t)  e0(t)]ᵀ. Let Φ_χ0(ω) be the spectrum matrix of χ0(t):
Φ_χ0(ω) = [ Φ_u(ω)  Φ_ue(ω) ; Φ_eu(ω)  λ0 ].
Using the familiar formula:
E ψ(t, θ0) ψᵀ(t, θ0) = (1/2π) ∫_{−π}^{π} Φ_ψ(ω) dω.
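The "familiar formula" above is Parseval's relation: a stationary signal's variance equals the integral of its spectrum over one period, divided by 2π. A quick numerical check (a sketch, reusing the first-order system of the earlier LS example, where ψ(t) = −y(t−1); the parameter values are our own choices):

```python
import numpy as np

# For y(t) = -a0*y(t-1) + u(t-1) + e(t), with u and e white (variances mu, lam0):
# Phi_y(w) = (mu + lam0) / |1 + a0*exp(iw)|^2,  and  E y^2 = (mu + lam0)/(1 - a0^2).
a0, mu, lam0 = 0.1, 1.0, 1.0

w = np.linspace(-np.pi, np.pi, 200_001)
phi_y = (mu + lam0) / np.abs(1.0 + a0 * np.exp(1j * w)) ** 2

# (1/2pi) * integral of Phi_y over [-pi, pi): rectangle rule on a periodic function.
dw = w[1] - w[0]
var_freq = np.sum(phi_y[:-1]) * dw / (2 * np.pi)
var_time = (mu + lam0) / (1 - a0**2)   # closed-form E y^2 from the earlier example

print(var_freq, var_time)
```

The two numbers agree to high precision, which is exactly the identity the frequency-domain variance expressions rest on.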

34 Frequency-Domain Expressions for the Asymptotic Variance
For the noise spectrum:
Φ_v(ω) = λ0 |H0(e^{iω})|².
Using this in the expression for E ψψᵀ above, we obtain:
P_θ = [ (1/2π) ∫_{−π}^{π} T′(e^{iω}, θ0) Φ_χ0(ω) [T′(e^{−iω}, θ0)]ᵀ / Φ_v(ω) dω ]^(−1).
This is the asymptotic variance in the frequency domain.

35 Distribution of Estimates for the Correlation Approach

36 The Correlation Approach
By Taylor expansion, as for the prediction-error approach, we obtain an expression for θ̂_N − θ* in terms of the correlation function and its derivative. We shall confine ourselves to the case studied in Theorem 8.6, that is, α(ε) = ε and linearly generated instruments. We thus have:
f_N(θ, Z^N) = (1/N) Σ_{t=1}^{N} ζ(t, θ) ε(t, θ).

37 The Correlation Approach
This is entirely analogous with the result previously obtained for the PE approach, the difference being that the predictor gradient ψ(t, θ*) is replaced by the instrument vector ζ(t, θ*).

38 The Correlation Approach
Theorem: Consider θ̂_N determined by the correlation equations. Assume that ε(t, θ) is computed from a linear, uniformly stable model structure, and that the instruments are generated by a uniformly stable family of filters. Assume also that the relevant limit matrix is nonsingular and that the data set is subject to D1.

39 The Correlation Approach
Under the assumption S ∈ M, there exists a value θ0 such that the correlation equations are satisfied. For L(q) = 1, we then obtain the asymptotic distribution below.

40 lecture 9 Ali Karimpour Nov 2010 40 The Correlation Approach

41 Example 9.5: Covariance of a Pseudolinear Regression Estimate
The Correlation Approach
Consider Example 9.2, but suppose that the estimate of c is determined by the PLR method.

42 lecture 9 Ali Karimpour Nov 2010 42 The Correlation Approach

43 lecture 9 Ali Karimpour Nov 2010 43 The Correlation Approach

44 Distribution of Estimates for the Instrumental Variable Methods

45 Instrumental Variable Methods
Suppose the true system is given as:
y(t) = G0(q) u(t) + v(t),
where e(t) is white noise with variance λ0, independent of {u(t)}. Then v(t) is independent of {u(t)}, and hence of the instruments ζ(t), if the system operates in open loop. Thus θ0 is a solution of the IV equations.

46 Instrumental Variable Methods
To get an asymptotic distribution, we shall assume that θ0 is the only solution to these equations. Introduce also a monic filter for the noise description. Inserting it into these equations:

47 lecture 9 Ali Karimpour Nov 2010 47 Instrumental Variable Methods

48 Example 9.6: Covariance of an IV Estimate
Consider the given first-order system and model structure, and let a be estimated by the IV method using a delayed-input instrument and L(q) = 1.
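The equations of this example did not survive transcription, so as an illustration assume the first-order system y(t) + a0 y(t−1) = u(t−1) + v(t) with colored noise v(t) = e(t) + 0.8 e(t−1) and the delayed input u(t−2) as instrument (these specific choices are our own, not necessarily those of the book's Example 9.6). With colored noise, LS is biased while the IV estimate remains consistent, since u(t−2) is correlated with the regressor y(t−1) but not with v(t) in open loop:

```python
import numpy as np

rng = np.random.default_rng(4)
N, a0 = 50_000, 0.5

# System: y(t) = -a0*y(t-1) + u(t-1) + v(t),  v(t) = e(t) + 0.8*e(t-1).
u = rng.standard_normal(N)
e = rng.standard_normal(N)
v = e.copy()
v[1:] += 0.8 * e[:-1]
y = np.zeros(N)
for t in range(1, N):
    y[t] = -a0 * y[t - 1] + u[t - 1] + v[t]

yt, ytm1, utm1, utm2 = y[2:], y[1:-1], u[1:-1], u[:-2]

# LS: correlate the residual with the regressor y(t-1) -> biased (v is colored).
a_ls = -np.sum(ytm1 * (yt - utm1)) / np.sum(ytm1 ** 2)

# IV: correlate the residual with the instrument u(t-2) -> consistent.
a_iv = -np.sum(utm2 * (yt - utm1)) / np.sum(utm2 * ytm1)

print(a_ls, a_iv)   # a_ls is pulled away from 0.5; a_iv stays close to 0.5
```

This is the core point of the IV analysis: replacing the noise-correlated regressor by an instrument restores consistency, at the price of a (generally) larger asymptotic variance.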

49 lecture 9 Ali Karimpour Nov 2010 49 Instrumental Variable Methods

