
General Multivariate Gaussian Detection Problems






Presentation Transcript


  1. General Multivariate Gaussian Detection Problems
  ECE 7251: Spring 2004, Lecture 30, 3/29/04
  Prof. Aaron D. Lanterman
  School of Electrical & Computer Engineering, Georgia Institute of Technology
  AL: 404-385-2548 <lanterma@ece.gatech.edu>

  2. General Gaussian Problem
  • We have a data vector distributed according to a multivariate Gaussian density
  • Two hypotheses:
  • Likelihood ratio:
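
  The equations on this slide were images and are not in the transcript. A minimal sketch of the standard setup, assuming the notation y for the data vector, m_i for the hypothesis means, and R_i for the covariances (these symbol names are my assumptions, chosen to match the rest of the lecture):

```latex
% Standard general Gaussian detection setup (notation m_i, R_i assumed)
\[
H_i:\ \mathbf{y} \sim \mathcal{N}(\mathbf{m}_i, \mathbf{R}_i), \qquad
p_i(\mathbf{y}) = \frac{\exp\!\left[-\tfrac12 (\mathbf{y}-\mathbf{m}_i)^T \mathbf{R}_i^{-1} (\mathbf{y}-\mathbf{m}_i)\right]}
                       {(2\pi)^{N/2}\,\lvert \mathbf{R}_i \rvert^{1/2}}, \quad i = 0, 1,
\qquad
\Lambda(\mathbf{y}) = \frac{p_1(\mathbf{y})}{p_0(\mathbf{y})}
\ \underset{H_0}{\overset{H_1}{\gtrless}}\ \eta .
\]
```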

  3. Loglikelihood Ratio Test
  • Test looks like
    where
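
  The displayed test did not survive transcription. One common way to write it, assuming the setup sketched above (the threshold γ absorbs the determinant term and the quadratic terms in m_0, m_1):

```latex
% Log-likelihood ratio test written out, then rearranged into quadratic + linear terms
\[
\ln \Lambda(\mathbf{y})
  = \tfrac12 (\mathbf{y}-\mathbf{m}_0)^T \mathbf{R}_0^{-1} (\mathbf{y}-\mathbf{m}_0)
  - \tfrac12 (\mathbf{y}-\mathbf{m}_1)^T \mathbf{R}_1^{-1} (\mathbf{y}-\mathbf{m}_1)
  + \tfrac12 \ln \frac{\lvert \mathbf{R}_0 \rvert}{\lvert \mathbf{R}_1 \rvert}
  \ \underset{H_0}{\overset{H_1}{\gtrless}}\ \ln \eta ,
\]
and collecting terms,
\[
\tfrac12\, \mathbf{y}^T \left( \mathbf{R}_0^{-1} - \mathbf{R}_1^{-1} \right) \mathbf{y}
  + \left( \mathbf{m}_1^T \mathbf{R}_1^{-1} - \mathbf{m}_0^T \mathbf{R}_0^{-1} \right) \mathbf{y}
  \ \underset{H_0}{\overset{H_1}{\gtrless}}\ \gamma .
\]
```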

  4. Mahalanobis Distance Interpretation
  • Define a norm on the data space:
  • Emphasizes components of z that are collinear with eigenvectors of R associated with small eigenvalues
  • Can rewrite test statistic as
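
  The norm definition itself was an image; a sketch of the usual squared Mahalanobis norm (the exact symbol used on the slide is assumed):

```latex
% Squared Mahalanobis norm, expanded in the eigenbasis of R
\[
\lVert \mathbf{z} \rVert_{\mathbf{R}}^{2} \;\triangleq\; \mathbf{z}^T \mathbf{R}^{-1} \mathbf{z}
  \;=\; \sum_{k=1}^{N} \frac{(\mathbf{u}_k^T \mathbf{z})^2}{\lambda_k},
\qquad \mathbf{R} = \sum_{k=1}^{N} \lambda_k\, \mathbf{u}_k \mathbf{u}_k^T ,
\]
```
  so components of z along eigenvectors u_k with small eigenvalues λ_k are weighted by 1/λ_k and hence emphasized.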

  5. Quadratic Form Interpretation
  • Alternatively, express the test using a new statistic and a new threshold

  6. Four Kinds of Decision Regions (1)
  1. If the covariances are equal, the quadratic term vanishes, the test reduces to a linear test, and the decision boundary is a hyperplane
  2. If the quadratic-term matrix is positive definite, the H1 decision region is the interior of an ellipsoid

  7. Four Kinds of Decision Regions (2)
  3. If the quadratic-term matrix is negative definite, the H1 decision region is the exterior of an ellipsoid
  4. If none of the above apply, i.e. the matrix is neither singular, positive definite, nor negative definite, then the decision region has hyperbolic boundaries (a numerical check of all four cases is sketched below)
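
  A minimal NumPy sketch of the four cases. The helper name decision_region_type, the toy covariances, and the sign convention ΔQ = R1⁻¹ − R0⁻¹ are my assumptions; that sign is chosen so that "positive definite → interior of an ellipsoid" lines up with the ordering on these two slides (with this convention the quadratic part of the log-likelihood ratio is −½ yᵀΔQ y).

```python
# Sketch: classify which of the four decision-region types arises for given
# hypothesis covariances.  dQ = inv(R1) - inv(R0) is an assumed convention,
# chosen so the cases below match slides 6-7.
import numpy as np

def decision_region_type(R0, R1, tol=1e-10):
    dQ = np.linalg.inv(R1) - np.linalg.inv(R0)   # assumed quadratic-term matrix
    eigvals = np.linalg.eigvalsh(dQ)             # dQ is symmetric
    if np.all(np.abs(eigvals) < tol):
        return "equal covariances: linear test, hyperplane boundary"
    if np.all(eigvals > tol):
        return "positive definite: H1 region is the interior of an ellipsoid"
    if np.all(eigvals < -tol):
        return "negative definite: H1 region is the exterior of an ellipsoid"
    return "indefinite: hyperbolic boundaries (degenerate if dQ is singular)"

# Example: under H1 the data have smaller variance in every direction,
# so the H1 region should be the interior of an ellipsoid.
R0 = np.diag([4.0, 2.0])
R1 = np.diag([1.0, 0.5])
print(decision_region_type(R0, R1))
```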

  8. Known Signal in White Noise
  • Consider the familiar special case:
  • Test can be expressed as
  • Analysis from univariate case now applies
  • "Deflection ratio" or "detectability index" is
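
  The equations here were images; a sketch of the usual special case, assuming a known signal s in zero-mean white Gaussian noise of variance σ² (m_0 = 0, m_1 = s, common covariance σ²I), with constants absorbed into the threshold γ:

```latex
% White-noise special case: correlator test and its deflection ratio
\[
\mathbf{m}_0 = \mathbf{0}, \quad \mathbf{m}_1 = \mathbf{s}, \quad \mathbf{R}_0 = \mathbf{R}_1 = \sigma^2 \mathbf{I}
\;\;\Longrightarrow\;\;
\mathbf{s}^T \mathbf{y} \ \underset{H_0}{\overset{H_1}{\gtrless}}\ \gamma,
\qquad
d^2 = \frac{\lVert \mathbf{s} \rVert^2}{\sigma^2} .
\]
```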

  9. Known Signal in Colored Noise
  • Now let R be general
  • Neyman-Pearson test has the form
  • Could transform to white noise case by linearly preprocessing the data to give an equivalent test in terms of
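
  A hedged sketch of the colored-noise test and its whitened equivalent, assuming the same known-signal means as above but a general noise covariance R:

```latex
% Colored-noise Neyman-Pearson test, and the same test after prewhitening
\[
\mathbf{s}^T \mathbf{R}^{-1} \mathbf{y} \ \underset{H_0}{\overset{H_1}{\gtrless}}\ \gamma ,
\qquad\text{equivalently}\qquad
\tilde{\mathbf{s}}^T \mathbf{z} \ \underset{H_0}{\overset{H_1}{\gtrless}}\ \gamma,
\quad \mathbf{z} = \mathbf{R}^{-1/2}\mathbf{y}, \;\; \tilde{\mathbf{s}} = \mathbf{R}^{-1/2}\mathbf{s} ,
\]
```
  since the prewhitened data z have identity covariance and the problem reduces to the white-noise case.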

  10. The Prewhitening Transformation
  • Use MATLAB (or whatever) to compute the eigendecomposition
  • D has the positive eigenvalues of R along its diagonal; the columns of U are orthogonal eigenvectors of R
  • We can define
  • Our prewhitening operation is
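
  The slide mentions MATLAB; here is an equivalent sketch in NumPy (my translation, with a toy covariance) showing the eigendecomposition R = U D Uᵀ and the resulting prewhitening operator R⁻¹ᐟ² = U D⁻¹ᐟ² Uᵀ:

```python
# Sketch of the prewhitening step in NumPy (assumed translation of the MATLAB step).
import numpy as np

rng = np.random.default_rng(0)

# Toy colored-noise covariance (assumed example, not from the lecture)
A = rng.standard_normal((4, 4))
R = A @ A.T + 4 * np.eye(4)        # symmetric positive definite

eigvals, U = np.linalg.eigh(R)     # R = U @ diag(eigvals) @ U.T
R_inv_sqrt = U @ np.diag(eigvals**-0.5) @ U.T

# Check: the prewhitened covariance is (numerically) the identity
print(np.allclose(R_inv_sqrt @ R @ R_inv_sqrt.T, np.eye(4)))     # True

# Prewhitening the data y and the signal s gives the equivalent white-noise
# test statistic: (R^(-1/2) s)^T (R^(-1/2) y) = s^T R^(-1) y
s = rng.standard_normal(4)
y = rng.standard_normal(4)
s_tilde, y_tilde = R_inv_sqrt @ s, R_inv_sqrt @ y
print(np.isclose(s_tilde @ y_tilde, s @ np.linalg.solve(R, y)))  # True
```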

  11. Performance for Colored Noise
  • Test statistic for the colored noise problem is still Gaussian, so we can again use formulas from the univariate case with
  • In white noise case, performance depended only on the total power of s, not its shape
  • Here, in the colored noise case, performance depends on shape as well!
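
  The "with" at the end of the first bullet presumably points to the deflection; in one common notation (assumed here), the univariate-case formulas carry over as:

```latex
% Assumed form of the univariate carry-over: deflection and ROC for the colored-noise test
\[
d^2 = \mathbf{s}^T \mathbf{R}^{-1} \mathbf{s},
\qquad
P_D = Q\!\left( Q^{-1}(P_F) - d \right),
\qquad
Q(x) = \int_x^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt ,
\]
```
  which depends on the shape of s through R⁻¹, not just on its total energy.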

  12. Signal Design for Colored Noise
  • Problem: maximize d² subject to a constraint on the signal energy
  • Rayleigh quotient theorem gives an upper bound on d²; furthermore, we have equality if s is an eigenvector of R associated with its smallest eigenvalue
  • So, to make d² big, pick s to be that minimum-eigenvalue eigenvector
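
  A small NumPy sketch of the design rule, assuming the constraint is a fixed signal energy ||s||² = E (the exact constraint on the slide is not in the transcript); then d² = sᵀR⁻¹s ≤ E/λ_min, with equality when s points along the minimum-eigenvalue eigenvector of R:

```python
# Sketch: signal design for colored noise under an assumed energy constraint.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
R = A @ A.T + np.eye(5)            # toy noise covariance (assumed example)
E = 1.0                            # signal energy budget

eigvals, U = np.linalg.eigh(R)     # eigenvalues in ascending order
s_best = np.sqrt(E) * U[:, 0]      # eigenvector for the smallest eigenvalue

def deflection(s):
    return s @ np.linalg.solve(R, s)   # d^2 = s^T R^{-1} s

# The designed signal attains E / lambda_min and beats any other unit-energy signal
print(deflection(s_best), E / eigvals[0])
s_rand = rng.standard_normal(5)
s_rand *= np.sqrt(E) / np.linalg.norm(s_rand)
print(deflection(s_rand) <= deflection(s_best) + 1e-12)   # True
```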

  13. Unequal Covariances
  • For equal means, the test statistic is purely quadratic
  • Analysis is simplified by prefiltering to diagonalize
  • Still a total pain; the test statistic is a mixture of chi-square random variables
  • For unequal means, it's even more of a pain; the test statistic is a mixture of noncentral chi-square random variables
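
  A sketch of where the chi-square structure comes from, under the zero-mean case and with A denoting the quadratic-term matrix of the log-likelihood ratio (the symbol A is mine):

```latex
% With zero means, T(y) = y^T A y.  Under a given hypothesis write y = C z with
% C C^T that hypothesis's covariance and z ~ N(0, I); diagonalizing C^T A C gives
\[
T(\mathbf{y}) = \mathbf{y}^T \mathbf{A}\, \mathbf{y}
 = \mathbf{z}^T \mathbf{C}^T \mathbf{A}\, \mathbf{C}\, \mathbf{z}
 = \sum_{k=1}^{N} \lambda_k\, x_k^2, \qquad x_k \overset{\text{iid}}{\sim} \mathcal{N}(0,1),
\]
```
  a weighted combination of χ²₁ variables; with unequal means the x_k acquire nonzero means and the terms become noncentral χ².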

  14. A Simple Zero-Mean Case
  • Note that the signal power is time-varying
  • In our generic notation, we have
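
  The model itself did not survive transcription; a standard "simple zero-mean case" consistent with the later slides, where σ_{s,k}² (time-varying signal power) and σ_w² (white noise power) are my assumed symbols:

```latex
% Assumed model: zero-mean Gaussian signal in white Gaussian noise, independent samples
\[
H_0:\; Y_k = W_k, \qquad H_1:\; Y_k = S_k + W_k, \qquad
S_k \sim \mathcal{N}(0, \sigma_{s,k}^2), \quad W_k \sim \mathcal{N}(0, \sigma_w^2),
\]
```
  so in the generic notation, m_0 = m_1 = 0, R_0 = σ_w² I, and R_1 = σ_w² I + diag(σ_{s,1}², …, σ_{s,N}²).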

  15. A Simple Zero-Mean Case
  • Test statistic is
  • Several different interpretations
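
  A sketch of the statistic this model produces, assuming the model above (positive scale factors and constants absorbed into the threshold):

```latex
% One standard form of the statistic for the time-varying-power model
\[
T(\mathbf{y}) = \sum_{k=1}^{N} \frac{\sigma_{s,k}^{2}}{\sigma_{s,k}^{2} + \sigma_w^{2}}\, y_k^{2}
\ \underset{H_0}{\overset{H_1}{\gtrless}}\ \gamma' .
\]
```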

  16. Filter-Squarer Interpretation
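
  Only the heading survived; a hedged sketch of the filter-squarer form of the same statistic, assuming the model above:

```latex
% Filter-squarer form: apply a (time-varying) gain, square, and sum
\[
T(\mathbf{y}) = \sum_{k=1}^{N} \left( g_k\, y_k \right)^2,
\qquad g_k = \sqrt{\frac{\sigma_{s,k}^{2}}{\sigma_{s,k}^{2} + \sigma_w^{2}}} .
\]
```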

  17. Estimator-Correlator Interpretation
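
  Likewise, a hedged sketch of the estimator-correlator form: correlate the data with an MMSE estimate of the signal (same assumed model and symbols as above):

```latex
% Estimator-correlator form: correlate y_k with the MMSE estimate of S_k
\[
T(\mathbf{y}) = \sum_{k=1}^{N} y_k\, \hat{s}_k,
\qquad \hat{s}_k = \frac{\sigma_{s,k}^{2}}{\sigma_{s,k}^{2} + \sigma_w^{2}}\, y_k
 = E[\,S_k \mid y_k\,] .
\]
```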

  18. Wide-Sense-Stationary Sequences
  • Suppose Sk and Wk are wide-sense stationary Gaussian time series with power spectral densities
  • Estimator-Correlator structure generalizes
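
  The generalized structure was an image; a hedged, asymptotic (large-N, frequency-domain) sketch of the estimator-correlator for WSS signal and noise with spectra P_S(ω) and P_W(ω) (symbol names assumed; the exact expression on the slide is not in the transcript):

```latex
% Asymptotic frequency-domain estimator-correlator (a common form, not necessarily the slide's)
\[
T \;\approx\; \frac{1}{2\pi}\int_{-\pi}^{\pi}
  \frac{\hat{S}(\omega)\, Y^{*}(\omega)}{P_W(\omega)}\, d\omega,
\qquad
\hat{S}(\omega) = \frac{P_S(\omega)}{P_S(\omega) + P_W(\omega)}\, Y(\omega),
\]
```
  so the per-sample gain σ_{s,k}²/(σ_{s,k}²+σ_w²) of the white-noise case becomes the noncausal Wiener filter P_S(ω)/(P_S(ω)+P_W(ω)), whose output (a signal estimate) is correlated with the prewhitened data.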
