

  1. SYSTEMS Identification Ali Karimpour, Assistant Professor, Ferdowsi University of Mashhad

  2. Lecture 9 Asymptotic Distribution of Parameter Estimators Topics to be covered include: • Central Limit Theorem • The Prediction-Error Approach: Basic Theorem • Expression for the Asymptotic Variance • Frequency-Domain Expressions for the Asymptotic Variance • Distribution of Estimation for the Correlation Approach • Distribution of Estimation for the Instrumental Variable Methods

  3. Overview If convergence is guaranteed, then $\hat{\theta}_N \to \theta^*$ with probability 1 as $N \to \infty$. But how fast does the estimate approach the limit? What is the probability distribution of $\hat{\theta}_N - \theta^*$? • The variance analysis of this chapter will reveal: • a) The estimate converges to $\theta^*$ at a rate proportional to $1/\sqrt{N}$ • b) The distribution of $\sqrt{N}(\hat{\theta}_N - \theta^*)$ converges to a Gaussian distribution: N(0, Q) • c) The covariance matrix Q depends on • - the number of samples/data set size N, • - the parameter sensitivity of the predictor $\psi(t,\theta) = \frac{\partial}{\partial\theta}\hat{y}(t|\theta)$, • - the noise variance $\lambda_0$

  4. Central Limit Theorem Topics to be covered include: • Central Limit Theorem • The Prediction-Error Approach: Basic Theorem • Expression for the Asymptotic Variance • Frequency-Domain Expressions for the Asymptotic Variance • Distribution of Estimation for the Correlation Approach • Distribution of Estimation for the Instrumental Variable Methods

  5. Central Limit Theorem The mathematical tool needed for asymptotic variance analysis is the family of "Central Limit" theorems. Example: Consider two independent random variables, X and Y, each with the same uniform distribution. Define another random variable Z as their sum: Z = X + Y. We can obtain the distribution of Z by convolving the two uniform densities, which yields a triangular PDF.

  6. Central Limit Theorem Further, consider the sum of three such independent uniform variables, W = X + Y + Z (Z here denoting a third independent copy rather than the sum above). The resultant PDF is even closer to a Gaussian distribution. In general, the PDF of a sum of N independent random variables $X_i$ approaches a Gaussian distribution, regardless of the PDF of each $X_i$, as N gets larger.

  7. Central Limit Theorem Let $X_t,\ t = 1, 2, \ldots$ be a sequence of independent d-dimensional random variables with: Mean $E\,X_t = m$, Cov $\mathrm{Cov}(X_t) = Q$. Consider the normalized sum of $X_t$ given by: $Y_N = \frac{1}{\sqrt{N}}\sum_{t=1}^{N}(X_t - m)$. Then, as N tends to infinity, the distribution of $Y_N$ converges to the Gaussian distribution N(0, Q) given by the PDF: $f(x) = (2\pi)^{-d/2}(\det Q)^{-1/2}\exp\left(-\tfrac{1}{2}x^T Q^{-1} x\right)$.
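As a quick numerical illustration of the theorem, the following minimal Python sketch draws normalized sums of i.i.d. uniform variables and checks that their variance approaches Q; the uniform distribution, sample sizes, and seed are arbitrary choices, not taken from the slides.

```python
import numpy as np

# Monte Carlo illustration of the central limit theorem:
# normalized sums of i.i.d. uniform variables approach N(0, Q).
rng = np.random.default_rng(0)

m = 0.5            # mean of U(0, 1)
Q = 1.0 / 12.0     # variance of U(0, 1)

for N in (1, 2, 10, 100):
    # 100000 independent realizations of Y_N = (1/sqrt(N)) * sum(X_t - m)
    X = rng.uniform(0.0, 1.0, size=(100_000, N))
    Y = (X - m).sum(axis=1) / np.sqrt(N)
    # The sample variance should approach Q, and the empirical
    # distribution should look increasingly Gaussian.
    print(f"N = {N:4d}: var(Y_N) = {Y.var():.4f} (theory: Q = {Q:.4f})")
```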

  8. The Prediction-Error Approach: Basic Theorem Topics to be covered include: • Central Limit Theorem • The Prediction-Error Approach: Basic Theorem • Expression for the Asymptotic Variance • Frequency-Domain Expressions for the Asymptotic Variance • Distribution of Estimation for the Correlation Approach • Distribution of Estimation for the Instrumental Variable Methods

  9. The Prediction-Error Approach Applying the Central Limit Theorem, we can obtain the distribution of the estimate $\hat{\theta}_N$ as N tends to infinity. Let $\hat{\theta}_N$ be an estimate based on the prediction-error method: $\hat{\theta}_N = \arg\min_{\theta} V_N(\theta, Z^N)$. Then, with prime denoting differentiation with respect to θ, $V_N'(\hat{\theta}_N, Z^N) = 0$. Expanding $V_N'$ around $\theta^*$ gives: $0 = V_N'(\hat{\theta}_N, Z^N) = V_N'(\theta^*, Z^N) + V_N''(\xi_N, Z^N)\,(\hat{\theta}_N - \theta^*)$, where $\xi_N$ is a vector "between" $\hat{\theta}_N$ and $\theta^*$.

  10. The Prediction-Error Approach Assume that $V_N''(\xi_N, Z^N)$ is nonsingular; then: $\hat{\theta}_N - \theta^* = -\left[V_N''(\xi_N, Z^N)\right]^{-1} V_N'(\theta^*, Z^N)$. To obtain the distribution of $\sqrt{N}(\hat{\theta}_N - \theta^*)$, both $\sqrt{N}\,V_N'(\theta^*, Z^N)$ and $V_N''(\xi_N, Z^N)$ must be computed as N tends to infinity, where as usual: $V_N(\theta, Z^N) = \frac{1}{N}\sum_{t=1}^{N}\tfrac{1}{2}\,\varepsilon^2(t,\theta)$.

  11. The Prediction-Error Approach For simplicity, we first assume that the predictor is given by a linear regression: $\hat{y}(t|\theta) = \varphi^T(t)\,\theta$. The actual data is generated by $y(t) = \varphi^T(t)\,\theta_0 + e(t)$ ($\theta_0$ is the parameter vector of the true system and $e(t)$ is white noise). So: $\varepsilon(t,\theta) = y(t) - \varphi^T(t)\theta$ and $\varepsilon(t,\theta_0) = e(t)$. Therefore: $V_N'(\theta_0, Z^N) = -\frac{1}{N}\sum_{t=1}^{N}\varphi(t)\,e(t)$.

  12. The Prediction-Error Approach Let us treat $\sqrt{N}\,V_N'(\theta_0, Z^N) = -\frac{1}{\sqrt{N}}\sum_{t=1}^{N}\varphi(t)e(t)$ as a random variable. Its mean is zero, since $e(t)$ is zero-mean and independent of $\varphi(t)$. The covariance is $\mathrm{Cov}\left[\sqrt{N}\,V_N'(\theta_0)\right] \to \lambda_0\,\bar{E}\,\varphi(t)\varphi^T(t) =: \lambda_0\bar{R}$. Applying the Central Limit Theorem: $\sqrt{N}\,V_N'(\theta_0, Z^N) \in \mathrm{AsN}(0,\ \lambda_0\bar{R})$.

  13. The Prediction-Error Approach Next, compute $V_N''(\xi_N, Z^N)$: for the quadratic criterion with a linear regression, $V_N''(\theta, Z^N) = \frac{1}{N}\sum_{t=1}^{N}\varphi(t)\varphi^T(t)$, independent of θ. And: $V_N''(\xi_N, Z^N) \to \bar{E}\,\varphi(t)\varphi^T(t) = \bar{R}$ as $N \to \infty$.

  14. The Prediction-Error Approach We obtain: $\sqrt{N}(\hat{\theta}_N - \theta_0) = -\left[V_N''(\xi_N)\right]^{-1}\sqrt{N}\,V_N'(\theta_0) \in \mathrm{AsN}\left(0,\ \lambda_0\,\bar{R}^{-1}\right)$. So: the covariance of $\hat{\theta}_N$ is asymptotically $\frac{\lambda_0}{N}\,\bar{R}^{-1}$.
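This result can be checked numerically. The following minimal Python sketch (all numeric values are illustrative assumptions) simulates repeated identification experiments for a two-parameter FIR linear regression and compares the empirical covariance of the LS estimate with $\frac{\lambda_0}{N}\bar{R}^{-1}$.

```python
import numpy as np

# Check Cov(theta_hat) ~ (lambda0 / N) * inv(R) for a linear regression
# y(t) = phi(t)' theta0 + e(t), with phi(t) = [u(t-1), u(t-2)]'.
rng = np.random.default_rng(1)
theta0 = np.array([1.0, 0.5])   # true parameters (arbitrary)
lam0 = 0.1                      # noise variance lambda0 (arbitrary)
N, runs = 500, 2000

est = np.empty((runs, 2))
for r in range(runs):
    u = rng.normal(size=N + 2)
    phi = np.column_stack([u[1:N+1], u[0:N]])       # [u(t-1), u(t-2)]
    y = phi @ theta0 + rng.normal(scale=np.sqrt(lam0), size=N)
    est[r] = np.linalg.lstsq(phi, y, rcond=None)[0] # least-squares estimate

R = np.eye(2)  # E[phi phi'] = I for unit-variance white input
print("empirical Cov:\n", np.cov(est.T))
print("theory (lam0/N) inv(R):\n", lam0 / N * np.linalg.inv(R))
```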

  15. The Prediction-Error Approach The extended result on the distribution of the estimate is summarized in the following theorem (Theorem 9.1 in Ljung's textbook). Theorem 1 Consider the estimate $\hat{\theta}_N$ determined by: $\hat{\theta}_N = \arg\min_{\theta} V_N(\theta, Z^N)$, $V_N(\theta, Z^N) = \frac{1}{N}\sum_{t=1}^{N}\tfrac{1}{2}\varepsilon^2(t,\theta)$. Assume that the model structure is linear and uniformly stable and that the data set satisfies the quasi-stationarity requirements. Assume also that $\hat{\theta}_N$ converges with probability 1 to a unique parameter vector $\theta^*$ interior to the parameter set, at which $\bar{V}''(\theta^*)$ is nonsingular. Also we have: $\sqrt{N}\,V_N'(\theta^*, Z^N) \in \mathrm{AsN}(0, Q)$, $Q = \lim_{N\to\infty} N\,E\left[V_N'(\theta^*)\left(V_N'(\theta^*)\right)^T\right]$. And that: $V_N''(\theta)$ converges to $\bar{V}''(\theta)$ with probability 1, uniformly in a neighborhood of $\theta^*$.

  16. The Prediction-Error Approach Where $\bar{V}(\theta)$ is the ensemble mean given by: $\bar{V}(\theta) = \bar{E}\,\tfrac{1}{2}\varepsilon^2(t,\theta)$. Then, the distribution of $\sqrt{N}(\hat{\theta}_N - \theta^*)$ converges to the Gaussian distribution given by: $\sqrt{N}(\hat{\theta}_N - \theta^*) \in \mathrm{AsN}(0, P_\theta)$, $P_\theta = \left[\bar{V}''(\theta^*)\right]^{-1} Q \left[\bar{V}''(\theta^*)\right]^{-1}$.

  17. The Prediction-Error Approach As stated formally in Theorem 1, the distribution of $\sqrt{N}(\hat{\theta}_N - \theta^*)$ converges to a Gaussian distribution for a broad class of system identification problems. This implies that the covariance of $\hat{\theta}_N$ asymptotically converges to: $\mathrm{Cov}\,\hat{\theta}_N \approx \frac{1}{N}P_\theta$. • This is called the asymptotic covariance matrix, and it depends not only on • (a) the number of samples/data set size N, but also on • (b) the parameter sensitivity of the predictor: $\psi(t,\theta^*) = \frac{\partial}{\partial\theta}\hat{y}(t|\theta)\big|_{\theta=\theta^*}$ (c) the noise variance $\lambda_0$.

  18. Expression for the Asymptotic Variance Topics to be covered include: • Central Limit Theorem • The Prediction-Error Approach: Basic Theorem • Expression for the Asymptotic Variance • Frequency-Domain Expressions for the Asymptotic Variance • Distribution of Estimation for the Correlation Approach • Distribution of Estimation for the Instrumental Variable Methods

  19. Quadratic Criterion Let us compute the covariance once again for the general case: $\hat{y}(t|\theta)$ is no longer restricted to a linear regression, so unlike the linear regression case, the sensitivity $\psi(t,\theta) = \frac{\partial}{\partial\theta}\hat{y}(t|\theta)$ is a function of θ. Assume that $\varepsilon(t,\theta_0) = e(t)$ is white noise with zero mean and variance $\lambda_0$. We have: $V_N'(\theta_0, Z^N) = -\frac{1}{N}\sum_{t=1}^{N}\psi(t,\theta_0)\,e(t)$, so $Q = \lambda_0\,\bar{E}\,\psi(t,\theta_0)\psi^T(t,\theta_0)$.

  20. Quadratic Criterion Similarly, $\bar{V}''(\theta_0) = \bar{E}\,\psi(t,\theta_0)\psi^T(t,\theta_0)$. Hence: $P_\theta = \lambda_0\left[\bar{E}\,\psi(t,\theta_0)\psi^T(t,\theta_0)\right]^{-1}$. The asymptotic variance is therefore a) inversely proportional to the number of samples, b) proportional to the noise variance, and c) inversely related to the parameter sensitivity. The more a parameter affects the prediction, the smaller its variance becomes.

  21. Quadratic Criterion Since $\theta_0$ is not known, the asymptotic variance cannot be determined exactly. A very important and useful aspect of the expressions for the asymptotic covariance matrix is that it can be estimated from data. Having N data points and having determined $\hat{\theta}_N$, we may use: $\hat{\lambda}_N = \frac{1}{N}\sum_{t=1}^{N}\varepsilon^2(t,\hat{\theta}_N)$, $\hat{P}_N = \hat{\lambda}_N\left[\frac{1}{N}\sum_{t=1}^{N}\psi(t,\hat{\theta}_N)\psi^T(t,\hat{\theta}_N)\right]^{-1}$. In this way, an estimate of the model accuracy can be obtained from the same data used for identification.
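For the linear-regression case, where $\psi(t,\hat{\theta}_N) = \varphi(t)$, the estimate $\hat{P}_N$ can be computed directly from data. A minimal Python sketch (the model and all numeric values are illustrative assumptions):

```python
import numpy as np

# Estimate the asymptotic covariance from data (linear-regression case,
# where psi(t, theta_hat) = phi(t)); values below are arbitrary.
rng = np.random.default_rng(2)
N = 1000
u = rng.normal(size=N + 2)
phi = np.column_stack([u[1:N+1], u[0:N]])       # [u(t-1), u(t-2)]
theta0, lam0 = np.array([1.0, 0.5]), 0.1
y = phi @ theta0 + rng.normal(scale=np.sqrt(lam0), size=N)

theta_hat = np.linalg.lstsq(phi, y, rcond=None)[0]
eps = y - phi @ theta_hat                       # prediction errors
lam_hat = (eps ** 2).mean()                     # noise-variance estimate
P_hat = lam_hat * np.linalg.inv(phi.T @ phi / N)

print("theta_hat:", theta_hat)
print("estimated Cov(theta_hat) = P_hat / N:\n", P_hat / N)
```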

  22. Example: Covariance of LS Estimates Consider the system $y(t) + a_0\,y(t-1) = b_0\,u(t-1) + e(t)$, where $u(t)$ and $e(t)$ are two independent white noises with variances $\mu$ and $\lambda_0$, respectively. Suppose that the coefficient $b_0$ for $u(t-1)$ is known and the system is identified in the model structure $y(t) + a\,y(t-1) = b_0\,u(t-1) + e(t)$. Or, in linear-regression form: $y(t) - b_0 u(t-1) = \varphi(t)\theta + e(t)$ with $\varphi(t) = -y(t-1)$ and $\theta = a$. We have: $\mathrm{Var}\,\hat{a}_N \approx \frac{\lambda_0}{N\,\bar{E}\,y^2(t)}$.

  23. Example: Covariance of LS Estimates Hence $\mathrm{Var}\,\hat{a}_N \approx \frac{\lambda_0}{N\,\bar{E}\,y^2(t)}$. To compute the covariance, square the first equation and take the expectation: $\bar{E}\,y^2(t) = a_0^2\,\bar{E}\,y^2(t) + b_0^2\mu + \lambda_0$. Multiplying the first equation by $u(t-1)$ or $e(t)$ and taking expectation shows that the cross terms vanish; the last equality follows, since $u(t-1)$ and $e(t)$ do not affect $y(t-1)$ (due to the time delay). Hence: $\bar{E}\,y^2(t) = \frac{b_0^2\mu + \lambda_0}{1 - a_0^2}$, so $\mathrm{Var}\,\hat{a}_N \approx \frac{\lambda_0(1 - a_0^2)}{N(b_0^2\mu + \lambda_0)}$.

  24. Example: Covariance of LS Estimates Assume $a_0 = 0.1$. Estimated values of the parameter a for 100 independent experiments using the LS estimate are shown in the figure below (figure not reproduced). Cov(a) = 0.0471

  25. Example: Covariance of LS Estimates Now assume $a_0 = 0.1$ with a different choice of the input and noise variances. Cov(a) = 0.0014

  26. Example: Covariance of LS Estimates Now assume $a_0 = 0.1$ with yet another choice of the input and noise variances. Cov(a) = 0.1204
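The experiment can be reproduced in a few lines. A minimal Python sketch follows; the input and noise variances are illustrative assumptions (the slide's settings were given in images and are not recoverable), so the printed covariance will differ from the numbers above.

```python
import numpy as np

# Monte Carlo version of the LS-covariance example: estimate a in
# y(t) + a y(t-1) = b0 u(t-1) + e(t) with b0 known. The variances
# below are illustrative assumptions.
rng = np.random.default_rng(3)
a0, b0 = 0.1, 1.0
mu, lam0 = 1.0, 1.0      # Var u and Var e (assumed)
N, runs = 200, 100

a_hat = np.empty(runs)
for r in range(runs):
    u = rng.normal(scale=np.sqrt(mu), size=N)
    e = rng.normal(scale=np.sqrt(lam0), size=N)
    y = np.zeros(N)                       # zero initial condition
    for t in range(1, N):
        y[t] = -a0 * y[t-1] + b0 * u[t-1] + e[t]
    # LS with b0 known: regress y(t) - b0 u(t-1) on -y(t-1)
    z = y[1:] - b0 * u[:-1]
    a_hat[r] = (-y[:-1] @ z) / (y[:-1] @ y[:-1])

var_theory = lam0 * (1 - a0**2) / (N * (b0**2 * mu + lam0))
print(f"empirical Var(a_hat) = {a_hat.var():.5f}, theory = {var_theory:.5f}")
```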

  27. Example: Covariance of an MA(1) Parameter Consider the system $y(t) = e(t) + c_0\,e(t-1)$, where $e(t)$ is white noise with variance $\lambda_0$. The MA(1) model structure is used: $y(t) = e(t) + c\,e(t-1)$. Given the predictor (4.18): $\varepsilon(t,c) = \frac{1}{1 + cq^{-1}}\,y(t)$. Differentiation w.r.t. c gives $\psi(t,c) = -\frac{d}{dc}\varepsilon(t,c) = \frac{q^{-1}}{(1 + cq^{-1})^2}\,y(t)$. At $c = c_0$ we have: $\varepsilon(t,c_0) = e(t)$, $\psi(t,c_0) = \frac{1}{1 + c_0 q^{-1}}\,e(t-1)$, so $\bar{E}\,\psi^2(t,c_0) = \frac{\lambda_0}{1 - c_0^2}$. If $\hat{c}_N$ is the PEM estimate of c: $\sqrt{N}(\hat{c}_N - c_0) \in \mathrm{AsN}(0,\ 1 - c_0^2)$.
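The asymptotic variance $(1 - c_0^2)/N$ can be verified by simulation. A minimal Python sketch of the PEM estimate, computed by a simple grid search over c (an editor's simplification of the minimization; all numeric values are illustrative assumptions):

```python
import numpy as np
from scipy.signal import lfilter

# Empirical check that Var(c_hat) ~ (1 - c0^2)/N for the PEM estimate
# of an MA(1) parameter. All numeric values are illustrative assumptions.
rng = np.random.default_rng(4)
c0, lam0, N, runs = 0.5, 1.0, 1000, 500
grid = np.linspace(-0.99, 0.99, 397)

c_hat = np.empty(runs)
for r in range(runs):
    e = rng.normal(scale=np.sqrt(lam0), size=N)
    y = e + c0 * np.concatenate(([0.0], e[:-1]))
    # eps(t,c) = y(t) - c*eps(t-1), i.e. filtering y by 1/(1 + c q^-1)
    losses = [np.mean(lfilter([1.0], [1.0, c], y) ** 2) for c in grid]
    c_hat[r] = grid[np.argmin(losses)]

print(f"empirical Var = {c_hat.var():.5f}, theory = {(1 - c0**2)/N:.5f}")
```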

  28. Asymptotic Variance for General Norms For a general norm $\ell(\varepsilon)$ the criterion is: $V_N(\theta, Z^N) = \frac{1}{N}\sum_{t=1}^{N}\ell(\varepsilon(t,\theta))$. We have: $\sqrt{N}\,V_N'(\theta^*, Z^N) \in \mathrm{AsN}(0, Q)$ with $Q = \lim_{N\to\infty} N\,E\left[V_N'(\theta^*)(V_N'(\theta^*))^T\right]$, which now involves $\ell'$. Similarly: $V_N''(\theta)$ converges to $\bar{V}''(\theta)$, which involves both $\ell'$ and $\ell''$. We can use the asymptotic normality result in this more general form whenever required. The expression for the asymptotic covariance matrix is rather complicated in general.

  29. Asymptotic Variance for General Norms Assume that $\varepsilon(t,\theta^*) = e(t)$ is white noise, independent of $\psi(t,\theta^*)$. Then, under this assumption, after straightforward calculations: $P_\theta = \kappa(\ell)\left[\bar{E}\,\psi(t,\theta^*)\psi^T(t,\theta^*)\right]^{-1}$, $\kappa(\ell) = \frac{E\left[\ell'(e(t))^2\right]}{\left[E\,\ell''(e(t))\right]^2}$. Clearly $\kappa(\ell) = \lambda_0$ for the quadratic norm $\ell(\varepsilon) = \tfrac{1}{2}\varepsilon^2$. The choice of $\ell$ in the criterion thus only acts as a scaling of the covariance matrix.

  30. Frequency-Domain Expressions for the Asymptotic Variance Topics to be covered include: • Central Limit Theorem • The Prediction-Error Approach: Basic Theorem • Expression for the Asymptotic Variance • Frequency-Domain Expressions for the Asymptotic Variance • Distribution of Estimation for the Correlation Approach • Distribution of Estimation for the Instrumental Variable Methods

  31. Frequency-Domain Expressions for the Asymptotic Variance The asymptotic variance has a different expression in the frequency domain, which we will find useful for variance analysis and experiment design. Let the transfer function and the noise model be consolidated into a matrix: $T(q,\theta) = \left[G(q,\theta)\ \ H(q,\theta)\right]$. The gradient of T, that is, the sensitivity of T to θ, is $T'(q,\theta) = \frac{\partial}{\partial\theta}T(q,\theta)$. For the predictor, we have already defined $W(q,\theta)$ and $z(t)$ such that $\hat{y}(t|\theta) = W(q,\theta)\,z(t)$, with $W(q,\theta) = \left[H^{-1}(q,\theta)G(q,\theta)\ \ \ 1 - H^{-1}(q,\theta)\right]$ and $z(t) = \begin{bmatrix} u(t) \\ y(t) \end{bmatrix}$.

  32. Frequency-Domain Expressions for the Asymptotic Variance Therefore the predictor sensitivity is given by $\psi(t,\theta) = W'(q,\theta)\,z(t)$, where $W'(q,\theta) = \frac{\partial}{\partial\theta}W(q,\theta)$. Substituting in the first equation: $\psi(t,\theta) = H^{-1}(q,\theta)\left[G'(q,\theta)\,u(t) + H'(q,\theta)\,\varepsilon(t,\theta)\right] = H^{-1}(q,\theta)\,T'(q,\theta)\begin{bmatrix} u(t) \\ \varepsilon(t,\theta) \end{bmatrix}$.

  33. Frequency-Domain Expressions for the Asymptotic Variance At $\theta_0$ (the true system), note $y(t) = G_0(q)u(t) + H_0(q)e(t)$ and $\varepsilon(t,\theta_0) = e(t)$, so that $\psi(t,\theta_0) = H_0^{-1}(q)\,T'(q,\theta_0)\,\chi_0(t)$, where $\chi_0(t) = \begin{bmatrix} u(t) \\ e(t) \end{bmatrix}$. Let $\Phi_{\chi_0}(\omega)$ be the spectrum matrix of $\chi_0(t)$: $\Phi_{\chi_0}(\omega) = \begin{bmatrix} \Phi_u(\omega) & \Phi_{ue}(\omega) \\ \Phi_{eu}(\omega) & \lambda_0 \end{bmatrix}$. Using the familiar formula: $\bar{E}\,\psi(t,\theta_0)\psi^T(t,\theta_0) = \frac{1}{2\pi}\int_{-\pi}^{\pi}\Phi_\psi(\omega)\,d\omega$.

  34. Frequency-Domain Expressions for the Asymptotic Variance For the noise spectrum, $\Phi_v(\omega) = \lambda_0\,|H_0(e^{i\omega})|^2$. Using this in the equation below: $P_\theta = \lambda_0\left[\bar{E}\,\psi(t,\theta_0)\psi^T(t,\theta_0)\right]^{-1}$, we have: $P_\theta = \lambda_0\left[\frac{1}{2\pi}\int_{-\pi}^{\pi} T'(e^{i\omega},\theta_0)\,\Phi_{\chi_0}(\omega)\,\left[T'(e^{-i\omega},\theta_0)\right]^T \frac{d\omega}{|H_0(e^{i\omega})|^2}\right]^{-1}$. The asymptotic variance in the frequency domain.
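The "familiar formula" linking the time- and frequency-domain expressions can be checked numerically for the scalar sensitivity $\psi(t,c_0) = (1 + c_0 q^{-1})^{-1}e(t-1)$ from the MA(1) example above. A minimal Python sketch (c and $\lambda_0$ are arbitrary choices):

```python
import numpy as np

# Numeric check of E[psi^2] = (1/2pi) * integral of Phi_psi(w):
# for psi(t) = (1 + c q^-1)^-1 e(t-1), the spectrum is
# Phi_psi(w) = lam0 / |1 + c e^{iw}|^2 and E[psi^2] = lam0/(1 - c^2).
c, lam0 = 0.5, 1.0
w = np.linspace(-np.pi, np.pi, 200_000, endpoint=False)
phi_psi = lam0 / np.abs(1.0 + c * np.exp(1j * w)) ** 2
integral = phi_psi.mean()   # mean over uniform grid = (1/2pi) * integral
print(f"frequency-domain value: {integral:.6f}")
print(f"time-domain value lam0/(1-c^2): {lam0 / (1 - c**2):.6f}")
```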

  35. Distribution of Estimation for the Correlation Approach Topics to be covered include: • Central Limit Theorem • The Prediction-Error Approach: Basic Theorem • Expression for the Asymptotic Variance • Frequency-Domain Expressions for the Asymptotic Variance • Distribution of Estimation for the Correlation Approach • Distribution of Estimation for the Instrumental Variable Methods

  36. The Correlation Approach We shall confine ourselves to the case studied in Theorem 8.6, that is, $f_N(\theta, Z^N) = \frac{1}{N}\sum_{t=1}^{N}\zeta(t,\theta)\,\varepsilon(t,\theta)$ and linearly generated instruments. We thus have $\hat{\theta}_N$ defined by: $f_N(\hat{\theta}_N, Z^N) = 0$. By Taylor expansion we have: $0 = f_N(\theta^*, Z^N) + f_N'(\xi_N, Z^N)\,(\hat{\theta}_N - \theta^*)$. This is entirely analogous with the expression obtained previously for the PE approach, with the difference that $V_N'$ is replaced with $f_N$ and $V_N''$ with $f_N'$.

  37. The Correlation Approach Theorem: Consider $\hat{\theta}_N$ determined by $f_N(\hat{\theta}_N, Z^N) = 0$. Assume that $\varepsilon(t,\theta)$ is computed for a linear, uniformly stable model structure, and that the instruments $\zeta(t,\theta)$ are generated by a uniformly stable family of filters. Assume also that $\hat{\theta}_N \to \theta^*$ with probability 1, that $\bar{f}'(\theta^*)$ is nonsingular, and that $\varepsilon(t,\theta^*) = e(t)$ is white noise, independent of $\zeta(t,\theta^*)$. Then $\sqrt{N}(\hat{\theta}_N - \theta^*) \in \mathrm{AsN}(0, P_\theta)$.

  38. The Correlation Approach with $P_\theta = \lambda_0\left[\bar{f}'(\theta^*)\right]^{-1}\bar{E}\left[\zeta(t,\theta^*)\,\zeta^T(t,\theta^*)\right]\left[\bar{f}'(\theta^*)\right]^{-T}$.

  39. Example: Covariance of an MA(1) Parameter Consider the system $y(t) = e(t) + c_0\,e(t-1)$, where $e(t)$ is white noise with variance $\lambda_0$. Using the PLR method, the instrument is $\zeta(t,c) = \varepsilon(t-1,c)$. At $c = c_0$ we have: $\zeta(t,c_0) = e(t-1)$, $\bar{E}\,\zeta(t,c_0)\psi(t,c_0) = \lambda_0$ and $\bar{E}\,\zeta^2(t,c_0) = \lambda_0$, so $\sqrt{N}(\hat{c}_N - c_0) \in \mathrm{AsN}(0, 1)$, i.e. $\mathrm{Var}\,\hat{c}_N \approx 1/N$, compared with $(1 - c_0^2)/N$ for the PEM estimate.
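A minimal Python sketch of a PLR fixed-point iteration for this example (the iteration scheme and all numeric values are illustrative assumptions), comparing the empirical variance with 1/N:

```python
import numpy as np
from scipy.signal import lfilter

# Empirical check of the PLR variance for the MA(1) example:
# Var(c_hat) ~ 1/N, versus (1 - c0^2)/N for PEM. Values are assumptions.
rng = np.random.default_rng(5)
c0, N, runs = 0.5, 1000, 500

c_hat = np.empty(runs)
for r in range(runs):
    e = rng.normal(size=N)
    y = e + c0 * np.concatenate(([0.0], e[:-1]))
    c = 0.0
    for _ in range(50):  # fixed-point iteration of the PLR equation
        eps = lfilter([1.0], [1.0, c], y)          # eps(t) = y(t) - c*eps(t-1)
        zeta = np.concatenate(([0.0], eps[:-1]))   # instrument eps(t-1)
        c = (zeta @ y) / (zeta @ zeta)             # solves sum zeta*(y - c*zeta) = 0
    c_hat[r] = c

print(f"empirical Var = {c_hat.var():.5f}, theory 1/N = {1/N:.5f}")
```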

  40. Distribution of Estimation for the Instrumental Variable Methods Topics to be covered include: • Central Limit Theorem • The Prediction-Error Approach: Basic Theorem • Expression for the Asymptotic Variance • Frequency-Domain Expressions for the Asymptotic Variance • Distribution of Estimation for the Correlation Approach • Distribution of Estimation for the Instrumental Variable Methods

  41. Instrumental Variable Methods We have: $\hat{\theta}_N^{IV} = \mathrm{sol}\left\{\frac{1}{N}\sum_{t=1}^{N}\zeta(t)\left[y(t) - \varphi^T(t)\theta\right] = 0\right\}$. Suppose the true system is given as $y(t) = \varphi^T(t)\theta_0 + v(t)$, where $v(t)$ is generated from white noise $e(t)$ with variance $\lambda_0$, independent of $\{u(t)\}$. Then $v(t)$ is independent of $\{u(t)\}$, and hence of $\zeta(t)$, if the system operates in open loop. Thus $\theta_0$ is a solution to: $\bar{E}\,\zeta(t)\left[y(t) - \varphi^T(t)\theta\right] = 0$.

  42. Instrumental Variable Methods To get an asymptotic distribution, we shall assume that $\theta_0$ is the only solution to $\bar{E}\,\zeta(t)\left[y(t) - \varphi^T(t)\theta\right] = 0$. Introduce also the monic filter $K(q)$ such that $v(t) = K(q)\,e(t)$. Inserting these into the preceding equations yields the asymptotic distribution of the IV estimate, analogously to the correlation-approach theorem.

  43. Example: Covariance of IV Estimates Consider the system $y(t) + a_0\,y(t-1) = b_0\,u(t-1) + e(t)$, where $u(t)$ and $e(t)$ are two independent white noises with variances $\mu$ and $\lambda_0$, respectively. Suppose that the coefficient $b_0$ for $u(t-1)$ is known and the system is identified in the model structure $y(t) + a\,y(t-1) = b_0\,u(t-1) + e(t)$. Let a be estimated by the IV method, using an instrument $\zeta(t)$ generated from past inputs. By comparing the above system with the general IV expressions, we obtain the asymptotic variance of $\hat{a}_N$.
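A minimal Python sketch of the IV estimate for this example; the instrument $\zeta(t) = u(t-2)$ is a hypothetical choice for illustration (it is correlated with $y(t-1)$ but independent of $e(t)$; the slide's instrument was given in an image), and all numeric values are assumptions.

```python
import numpy as np

# Sketch of the IV estimate for the example above, with the hypothetical
# instrument zeta(t) = u(t-2) (correlated with y(t-1), independent of e(t)).
rng = np.random.default_rng(6)
a0, b0, mu, lam0, N = 0.1, 1.0, 1.0, 1.0, 10_000

u = rng.normal(scale=np.sqrt(mu), size=N)
e = rng.normal(scale=np.sqrt(lam0), size=N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = -a0 * y[t-1] + b0 * u[t-1] + e[t]

# Work with z(t) = y(t) - b0 u(t-1) = -a y(t-1) + e(t), for t = 2..N-1
z = y[2:] - b0 * u[1:-1]
y_lag = y[1:-1]      # y(t-1)
zeta = u[0:-2]       # instrument u(t-2)

a_iv = -(zeta @ z) / (zeta @ y_lag)   # solves sum zeta*(z + a*y_lag) = 0
print(f"IV estimate of a: {a_iv:.4f} (true value {a0})")
```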

  44. Instrumental Variable Methods
