
Statistical modeling, classification, and sensor management






Presentation Transcript


  1. Statistical modeling, classification, and sensor management. DARPA-MURI Review 2003. Alfred Hero, Univ. of Michigan, Ann Arbor.

  2. Target Search Scenario. (Figure: high-res spot scan, low-res spot scan, and strip scan modes.)

  3. Sensor Deployment Architecture. Our research themes: sequential sensor management, image reconstruction, and adaptive detection.

  4. Research Loci
  • Image modeling and reconstruction
    • Markov random field (MRF) polarimetric models (Hory & Blatt)
    • 3D imaging with uncalibrated sensor nets (Rangarajan & Patwari)
  • Adaptive detection and classification
    • Pattern matching and modeling (Costa)
    • Distributed detection and classification (Blatt & Patwari)
  • Sequential sensor management
    • Myopic information-driven approaches (Kreucher)
    • Non-myopic approaches (Kreucher & Blatt)
  Common theme: adaptive, robust, non-parametric methods

  5. Detection: Target or Clutter Alone?

  6. Detection: Target or Clutter Alone?

  7. Target Returns Not Additive or Gaussian
  • 1 cm x 1 cm x 1 mm plate at 1 m from ground
  • Plate under forest canopy (10 deciduous trees)
  • 2 GHz SAR illumination
  • Aggregate of three look angles (azimuth = 35, 45, 55 deg; elev = 180 deg)
  (Figure panels: SNR = 0 dB, SNR = 6 dB.)

  8. Polarimetric Field Modeling and Reconstruction. (Figure: field distribution on FDTD box at 2 GHz; panels: h-pol. incidence, v-pol. incidence.)

  9. MRF Empirical Histogram. (Figure: conditional Markov transition histogram, estimated from training data.)

  10. Causal MRF Field Synthesis. (Slide contrasts the causal kNN predictor with the non-causal MRF model; see the sketch after slide 11.)

  11. Example: K-NN MRF Extrapolation
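To make the causal kNN synthesis of slides 10-11 concrete, here is a minimal sketch (our own generic reconstruction, not the authors' code): each new pixel is predicted from its causal neighborhood by matching against neighborhoods harvested from a training field and sampling one of the k nearest matches. The function names and the neighborhood shape are our assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def causal_nbhd(F, i, j, p):
    # p rows above (a (2p+1)-wide window) plus the p pixels to the left in row i
    above = F[i - p:i, j - p:j + p + 1].ravel()
    left = F[i, j - p:j]
    return np.concatenate([above, left])

def knn_mrf_synthesize(train, shape, k=5, p=2, seed=0):
    """Causal kNN MRF field synthesis (sketch).

    Assumes the output fits inside the training field so the margins
    can be seeded with training data.
    """
    rng = np.random.default_rng(seed)
    H, W = train.shape
    # Harvest (causal neighborhood, center value) pairs from the training field.
    feats, vals = [], []
    for i in range(p, H):
        for j in range(p, W - p):
            feats.append(causal_nbhd(train, i, j, p))
            vals.append(train[i, j])
    vals = np.asarray(vals)
    nn = NearestNeighbors(n_neighbors=k).fit(np.asarray(feats))

    out = np.zeros(shape)
    out[:p, :] = train[:p, :shape[1]]    # seed top margin
    out[:, :p] = train[:shape[0], :p]    # seed left margin
    out[:, -p:] = train[:shape[0], -p:]  # seed right margin
    # Synthesize the interior in raster order from already-generated pixels.
    for i in range(p, shape[0]):
        for j in range(p, shape[1] - p):
            q = causal_nbhd(out, i, j, p).reshape(1, -1)
            _, idx = nn.kneighbors(q)
            out[i, j] = vals[rng.choice(idx[0])]  # sample one of the k matches
    return out
```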

  12. Non-parametric MRF Density Estimator
  • General penalized MRF transition density estimate
  • y: observed data
  • Parameter b enforces smoothness
  • Function g sets the penalty: g(f) = |f|^2 gives standard L2 quadratic regularization; g(f) = |f| gives L1 regularization for denoising
  • w(x): smoothing within and across neighborhoods
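The estimator's formula was an image and did not survive transcription. One plausible penalized-likelihood form consistent with the bullets above (this reconstruction is our assumption, not the slide's exact equation; y is the data, b the smoothness parameter, g the penalty, w the neighborhood weights):

$$ \hat f \;=\; \arg\max_{f}\ \sum_i \log f\big(y_i \mid y_{N(i)}\big) \;-\; b \sum_x w(x)\, g\big(\nabla f(x)\big) $$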

  13. Cartoon illustration of density estimators. (Figure panels: k-nearest-neighbor estimator; penalized MRF transition density estimator.)

  14. Visual Validation of MRF Model (g(f) = |f|)

  15. MRF Transition Density Comparisons

  16. Target Modeling and Classification
  • Pattern matching in high dimensions
    • Standard techniques (histogram, density estimation) fail due to the curse of dimensionality
    • Entropic graphs recover inter-distribution distance directly
    • Robustification to outliers through graph pruning
  • Manifold learning and model reduction
    • Standard techniques (LLE, MDS, LE, HE) rely on local linear fits and provide no means of getting at the sample density
    • Our geodesic entropic graph methods fit the manifold globally
    • Computational complexity is only O(n log n)

  17. A Planar Sample and its Euclidean MST

  18. Convergence of the Euclidean MST. Beardwood-Halton-Hammersley theorem:
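The theorem's statement was an image; the standard form used in this line of work (for n i.i.d. points with density f on a d-dimensional space, with L_gamma the sum of MST edge lengths raised to a power gamma in (0, d)) is

$$ \lim_{n\to\infty} \frac{L_\gamma(X_1,\dots,X_n)}{n^{(d-\gamma)/d}} \;=\; \beta_{d,\gamma} \int f(x)^{(d-\gamma)/d}\, dx \qquad \text{a.s.,} $$

so the normalized log MST length is a consistent estimator of the Rényi entropy of order α = (d - γ)/d, which is what the entropic-graph estimators below exploit.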

  19. Pattern Matching

  20. MST Estimator of α-Jensen Affinity. (Figure panels: two well-separated classes; two overlapping classes.)

  21. MST Estimator of Friedman-Rafsky Affinity. (Figure panels: two well-separated classes; two overlapping classes.)
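A minimal sketch of the Friedman-Rafsky statistic behind slide 21, in its generic textbook form (the count of MST edges joining points with different labels; function names are ours):

```python
import numpy as np
from scipy.spatial.distance import squareform, pdist
from scipy.sparse.csgraph import minimum_spanning_tree

def friedman_rafsky_stat(X, Y):
    """Count MST edges that join the two samples.

    Few cross-sample edges -> well-separated classes;
    many -> overlapping classes (as in the two panels of slide 21).
    """
    Z = np.vstack([X, Y])
    labels = np.r_[np.zeros(len(X)), np.ones(len(Y))]
    D = squareform(pdist(Z))                 # pairwise Euclidean distances
    mst = minimum_spanning_tree(D).tocoo()   # MST over the pooled sample
    cross = np.sum(labels[mst.row] != labels[mst.col])
    return int(cross)

# Example: separated vs. overlapping Gaussian samples
rng = np.random.default_rng(0)
A, B = rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))
print(friedman_rafsky_stat(A, B))        # small count: well separated
print(friedman_rafsky_stat(A, A + 0.1))  # large count: heavy overlap
```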

  22. Target Model Reduction
  • 128x128 images of three land vehicles over 360 deg azimuth at 0 deg elevation
  • The 3 x 360 = 1080 images evolve on a lower-dimensional embedded manifold in R^16384
  Courtesy of Center for Imaging Science, JHU

  23. Target-Image Manifold

  24. 2D Manifold. (Figure flow: embedding, sampling distribution, sampling, a statistical sample.)

  25. Geodesic Entropic Graph Manifold Learning and Pattern Matching Algorithm
  • Construct geodesic edge matrix (ISOMAP, C-ISOMAP)
  • Build entropic graph over the geodesic edge matrix
    • MST: consistent estimator of manifold dimension and process alpha-entropy
    • MST-Jensen: consistent estimator of the Jensen difference between labeled vectors
  • Use bootstrap resampling and LS fitting to extract the rate of convergence (intrinsic dimension) and convergence factor (entropy); see the sketch below
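A rough sketch of this pipeline under stated assumptions (a k-NN graph supplies the geodesic distances; MST weights use γ = 1, so log L_n ≈ a log n + b with slope a = (d - 1)/d, giving d ≈ 1/(1 - a); all function names are ours, and the data set is assumed connected and at least as large as the biggest subsample size):

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path, minimum_spanning_tree
from sklearn.neighbors import kneighbors_graph

def mst_length(D):
    """Total edge length of the MST of a dense distance matrix."""
    return minimum_spanning_tree(D).sum()

def intrinsic_dim_entropy(X, k=8, n_boot=20, sizes=(50, 100, 200, 400), seed=0):
    rng = np.random.default_rng(seed)
    # 1) Geodesic distance matrix via a k-NN graph (ISOMAP-style)
    G = kneighbors_graph(X, k, mode="distance")
    geo = shortest_path(G, directed=False)
    # 2) Bootstrap MST lengths over subsamples of several sizes
    logn, logL = [], []
    for n in sizes:
        for _ in range(n_boot):
            idx = rng.choice(len(X), size=n, replace=False)
            logn.append(np.log(n))
            logL.append(np.log(mst_length(geo[np.ix_(idx, idx)])))
    # 3) LS fit of log L = a log n + b; slope a = (d-1)/d for gamma = 1
    a, b = np.polyfit(logn, logL, 1)
    d = 1.0 / (1.0 - a)
    return d, b   # d: intrinsic dimension; b encodes the entropy term
```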

  26. Illustration for 3 land Vehicles

  27. Log-log linear fit to asymptote. LS solution: d = 13, H = 120 bits.

  28. Distributed Multisensor Estimation and Detection
  • Distributed M-estimation (Blatt)
    • The ambiguity function is often multimodal: local and global maxima
    • Distributed measurements make local maxima more troublesome
    • We develop a method to discriminate between local and global maxima
    • Uses unsupervised clustering and Fisher information matching
  • Distributed change detection (Patwari)
    • Bandwidth and computation constraints
    • Multilayer vs. flat store-detect-forward architectures
    • We study the performance loss due to bandwidth constraints
    • How much information should be sent to which layers?

  29. Distributed Estimation and Detection: Flat Sensor Aggregation Architecture. (Diagram: sensors 1, 2, ..., N feed a central processing unit, which produces the final estimator.)

  30. Distributed M-Estimation. (Figure: ambiguity function for Cauchy-distributed points on a manifold.)

  31. A Slice of the Ambiguity Function. (Figure annotations: global maximum; local maxima.)

  32. Key Theoretical Result
  • The distribution of the M-estimate is asymptotically a Gaussian mixture
  • Mixture parameters: see the sketch below
  Ref: Blatt & Hero 2003
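The parameter equations are not recoverable from the transcript. The standard shape of such a result (our reconstruction; see Blatt & Hero 2003 for the exact statement) is a mixture over the local maxima θ_i, with p_i the probability that the estimate lands in the basin of attraction of θ_i, and a sandwich covariance:

$$ \hat\theta_n \;\approx\; \sum_i p_i\, \mathcal{N}\!\Big(\theta_i,\ \tfrac{1}{n}\, A_i^{-1} B_i A_i^{-1}\Big), \qquad A_i = -\mathbb{E}\big[\nabla^2 \log f_{\theta_i}\big], \quad B_i = \operatorname{Cov}\big[\nabla \log f_{\theta_i}\big]. $$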

  33. Validation of Key Result – QQ-plots M-estimates are clustered into two groups. Each group is centered according to the analytical mean and normalized according to the analytical variance.

  34. M-Estimator Aggregation Algorithm. (Flow: estimators 1, 2, ..., N -> estimation of Gaussian mixture parameters (EM) -> sample covariance analysis -> aggregation to final estimate.)

  35. Illustration
  Model:
  • 200 sensors
  • 100 snapshots per sensor
  • Snapshots are a 1D Gaussian 2-mixture
  • Known covariance
  • Unknown means
  • Sensors generate i.i.d. M-estimates of the means and forward them to a central processor
  (Figure: ambiguity function, with the global and a local maximum marked.)

  36. Local/Global Maxima Discrimination Algorithm. (Figure: clusters of good and bad estimates; each cluster's empirical covariance is compared against the inverse FIM.) A code sketch follows.
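A minimal sketch of the clustering-plus-Fisher-information-matching idea of slides 34 and 36 (a generic reconstruction, not the authors' code: fit a 2-component mixture to the sensors' M-estimates, then keep the cluster whose empirical covariance best matches the inverse FIM scaled by the per-sensor sample size):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def select_global_cluster(estimates, inv_fim, n_snapshots):
    """estimates: (N_sensors, d) array of per-sensor M-estimates.
    inv_fim: (d, d) inverse Fisher information at the global maximum.
    Returns the aggregate of the cluster consistent with inv_fim / n_snapshots.
    """
    gm = GaussianMixture(n_components=2, covariance_type="full").fit(estimates)
    target = inv_fim / n_snapshots   # CRB-scale covariance of "good" estimates
    # Keep the component whose covariance is closest to the target.
    errs = [np.linalg.norm(c - target) for c in gm.covariances_]
    good = int(np.argmin(errs))
    labels = gm.predict(estimates)
    # Aggregate: average the estimates assigned to the good cluster.
    return estimates[labels == good].mean(axis=0)
```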

  37. Addition of Other Discriminants: value added by local acquisition and transmission of likelihood values.

  38. Distributed Estimation and Detection: Hierarchical Sensor Aggregation Architecture. (Diagram: sensors 1-6 aggregate through intermediate layers to a processing unit, which produces the final estimator.)

  39. Detection: Flat vs. Hierarchical Architecture
  • 'Flat' [Rago, Willett, et al]
  • Hierarchical, with and without feedback
  • Each sensor is limited by an identical constraint r
  • At low PF, the hierarchical architecture outperforms the flat one
  (Figure: detection curves for r = 0.30, 0.10, and 0.03; legend: optimal 7-sensor, flat, hierarchical w/o feedback, hierarchical w/ feedback, optimal 1-sensor.)

  40. Sequential Adaptive Sensor Management
  • Sequential: only one sensor deployed at a time
  • Adaptive: the next sensor selection is based on present and past measurements
  • Multi-modality: sensor modes can be switched at each time step
  • Detection/classification/tracking: the task is to minimize decision error
  • Centralized decision making: the sensor has access to the entire set of previous measurements
  Single-target state vector:
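The state-vector definition itself is missing from the transcript; in this line of work the single-target state is typically planar position and velocity, so one plausible reading (our assumption, not the slide's equation) is

$$ x = [\,x,\ \dot x,\ y,\ \dot y\,]^{\mathsf T}. $$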

  41. Sequential Adaptive Sensor Management
  • Myopic information-based strategies (Kreucher)
    • Multi-target tracking capabilities
    • Fully Bayesian approach
    • Non-linear particle filtering with adaptive partitioning
    • Renyi alpha-divergence criterion
  • Non-myopic strategies (Blatt & Kreucher)
    • MDP value function approximations and rollout methods
    • Bayesian path averaging
    • Reinforcement feedback and learning

  42. Sensor Scheduling Objective Function
  • Prospective value of deploying sensor s at time t. (Equation annotations: sensor agility; prediction; retrospective value of deploying sensor s; available measurements at time t-1.)

  43. Information-Based Value Function
  • Incremental information gained from data collected using sensor s can be measured by a divergence D(s,t)
  • Requires the posterior distributions of the future target state X given future measurements Z and given present measurements Z, respectively
  • Main issues for evaluation of E[D(s,t)|Z]:
    • Computational complexity
    • Robustness to model mismatch
    • Decision-making relevance
  A sketch of the resulting scheduling rule follows.
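The value function's equations were images; a form consistent with the myopic information-driven policy described on this and the next slide (our reconstruction: deploy the sensor whose measurement maximizes the expected divergence between the updated and predicted posteriors) is

$$ s^{*} \;=\; \arg\max_{s}\ \mathbb{E}_{z_t^s}\Big[\, D_\alpha\big( p(x_t \mid Z^{t-1}, z_t^s)\ \big\|\ p(x_t \mid Z^{t-1}) \big) \,\Big|\, Z^{t-1} \Big]. $$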

  44. Value Function: Alpha Divergence
  • Properties of the Renyi divergence (defined below):
    • Simpler and more stably implementable than KL (Kreucher et al., TSP03)
    • The parameter alpha can be adapted to non-Gaussian posteriors
    • More robust to mis-specified models than KL (Kreucher et al., TSP03)
    • Related directly to decision error probability via Sanov (Hero et al., SPM02)
    • Information-theoretic interpretation
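For reference (the slide's formula was an image), the standard Rényi divergence of order α between densities p and q is

$$ D_\alpha(p\,\|\,q) \;=\; \frac{1}{\alpha-1}\, \log \int p^{\alpha}(x)\, q^{1-\alpha}(x)\, dx, $$

which recovers the KL divergence as α approaches 1.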

  45. Relevance of Alpha-Divergence to Decision Error
  • Consider testing the hypotheses H0 vs. H1
  • Sanov's theorem: the optimal decision rule has error probability decaying exponentially in the divergence (see below)
  • Implication: a nearly-optimal decision rule for H1 results if one can generate a good estimate of the alpha-divergence
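The error expression is lost; a standard statement in this spirit (our reconstruction, with D the appropriate KL divergence between the hypotheses) is

$$ P_e \;\doteq\; e^{-n D}, $$

so deciding H1 whenever a consistent estimate of the alpha-divergence exceeds a threshold is nearly optimal for large n.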

  46. Multi-Target Bayesian Filtering
  • The joint multi-target posterior density (JMPD) jointly represents all target states (Kastella)
  • The update equations must generally be approximated: a model update (prediction using the prior kinematic model) followed by a measurement update (Bayes rule), as written below
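The two update equations were images; in standard Bayes-filter form (generic, matching the slide's labels, with X_t the joint multi-target state and Z^t the measurements through time t):

$$ \text{Model update:}\quad p(X_t \mid Z^{t-1}) \;=\; \int p(X_t \mid X_{t-1})\, p(X_{t-1} \mid Z^{t-1})\, dX_{t-1} $$

$$ \text{Measurement update:}\quad p(X_t \mid Z^{t}) \;=\; \frac{p(z_t \mid X_t)\, p(X_t \mid Z^{t-1})}{p(z_t \mid Z^{t-1})} $$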

  47. Particle Filter (Metropolis) Approximation
  • Propose (draw) a set of particles from an importance (proposal) density q, chosen to be as close to the posterior as possible
  • Weight the particles using the principle of importance sampling
  • Resample the particles to avoid degeneracy
  (Figure: particle sets at time t-1 and time t.)

  48. Particle Filtering Illustration. Initialize: simulate random samples (particles) from the proposal density.

  49. Particle Filtering Illustration. Model update: propose new particles from existing particles by drawing samples from the importance density.

  50. Particle Filtering Illustration. Measurement update: reweight the particles according to the likelihood of the new measurement; resample the particles if necessary. A generic end-to-end sketch follows.
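To make slides 47-50 concrete, here is a minimal single-target SIR particle filter in one dimension: a generic textbook sketch under assumed Gaussian random-walk dynamics and Gaussian measurement noise, not the slides' multi-target JMPD filter with adaptive partitioning.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                     # number of particles
q_std, r_std = 1.0, 0.5      # process and measurement noise std devs

particles = rng.normal(0.0, 5.0, N)   # initialize from the proposal density
weights = np.full(N, 1.0 / N)

def step(particles, weights, z):
    # Model update: propose new particles via the kinematic (random-walk) model
    particles = particles + rng.normal(0.0, q_std, N)
    # Measurement update: reweight by the Gaussian measurement likelihood
    weights = weights * np.exp(-0.5 * ((z - particles) / r_std) ** 2)
    weights /= weights.sum()
    # Resample if the effective sample size has degenerated
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles, weights

# Track a slowly drifting state from noisy measurements
x_true = 0.0
for t in range(50):
    x_true += rng.normal(0.0, q_std)
    z = x_true + rng.normal(0.0, r_std)
    particles, weights = step(particles, weights, z)
print("estimate:", np.sum(weights * particles), "truth:", x_true)
```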
