
Part V: Collaborative Signal Processing Akbar Sayeed








  1. Part V: Collaborative Signal Processing Akbar Sayeed

  2. CSP Outline • Introduction and Overview • Illustrative example: single target tracking • Different modes of CSP • Event Detection • CSP principles for distributed decision making • Illustrated in the context of target classification • Data fusion versus decision fusion • Illustrative examples with real data • Other CSP applications • Target Localization • Target Tracking • Distributed compression • Information driven sensor querying • Challenges

  3. Sensor Networks from a SP Perspective • Provide a virtual map of the physical world: • Monitoring a region in a variety of sensing modalities (acoustic, seismic, thermal, …) • Two key components: • Networking and routing of information • Collaborative signal processing (CSP) for extracting and processing information from the physical world

  4. Space-Time Sampling • Sensors sample the spatial signal field in a particular modality (e.g., acoustic, seismic) • Sensor density commensurate with spatial signal variation • Sampling of time series from each sensor commensurate with signal bandwidth • Sensor field decomposed into space-time cells to enable distributed signal processing (multiple nodes per cell) [Figure: uniform vs. non-uniform space-time cells on space-time axes]

  5. An Object Moving Through the Network • A moving object corresponds to a spatial peak moving with time • Target tracking corresponds to determining the peak location over time

  6. Manager Nodes • In each region or cell, a particular node is designated manager node • The manager node is responsible for: • Coordinating the communication between different nodes in the region • Receiving data from different collaborating nodes in the region • Jointly processing the signals from different collaborating nodes for tasks that require CSP across nodes • Communicating requested information back to the querying node

  7. Illustrative Example: Single Target Tracking • Initialization: Cells A,B,C and D are put on detection alert for a specified period • Five-step procedure: • A track is initiated when a target is detected in a cell (Cell A – Active cell). Detector outputs of active nodes are sent to the manager node • Manager node estimates target location at N successive time instants using outputs of active nodes in Cell A. • Target locations are used to predict target location at M<N future time instants • Predicted positions are used to create new cells that are put on detection alert • Once a new cell detects the target it becomes the active cell
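The location-prediction step of this procedure can be sketched as follows; this is a minimal sketch assuming a constant-velocity model fit to the N most recent position estimates (the function name and toy track are illustrative, not from the tutorial):

```python
# A minimal sketch of step 3 of the tracking procedure: predict M future
# target positions from N past location estimates, assuming a
# constant-velocity model. Names and the toy track are illustrative.

def predict_locations(track, M):
    """Predict M future (x, y) positions by extrapolating the average
    velocity over the tracked positions (one position per time step)."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    steps = len(track) - 1
    vx, vy = (x1 - x0) / steps, (y1 - y0) / steps  # average velocity
    return [(x1 + vx * k, y1 + vy * k) for k in range(1, M + 1)]

# Example: a target moving one cell to the right per time step
preds = predict_locations([(0, 0), (1, 0), (2, 0), (3, 0)], M=2)
```

The predicted positions would then determine which cells to put on detection alert.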

  8. Component Signal Processing Algorithms • Single target tracking • requires detection, localization, and location prediction • Can track multiple targets if they are sufficiently separated in space and/or time • separate track for each target • Can track spatio-temporally overlapping targets with appropriate classification algorithms

  9. Why CSP? • In principle, more information about a phenomenon can be gathered from multiple measurements • Multiple sensing modalities (acoustic, seismic, etc.) • Multiple nodes • Limited local information gathered by a single node necessitates CSP • Inconsistencies between measurements, such as due to malfunctioning nodes, can be resolved • Variability in signal characteristics and environmental conditions necessitates CSP • Complementary information from multiple measurements can improve performance

  10. Categorization of CSP Algorithms Based on Communication Burden • Intra-node collaboration • Multiple sensing modalities • E.g., combining acoustic and seismic measurements • No communication burden since collaboration is at a particular node • Higher computational burden at the node • Inter-node collaboration • Combining measurements at different nodes • Higher communication burden since data is exchanged between nodes • Higher computational burden at the manager node

  11. Categorization of CSP Algorithms Based on Computational Burden • Data fusion • Time series for different measurements are combined • Higher computational burden since higher-dimensional data is jointly processed • Higher communication burden if the different measurements come from different nodes • Decision fusion • Decisions (hard or soft) based on different measurements are combined • Lower computational burden since lower-dimensional data (decisions) is jointly processed • Higher communication burden if the component decisions are made at different nodes

  12. Various Forms of CSP • Single Node, Multiple Modality (SN, MM) • Simplest form of CSP: no communication burden • Decision fusion • Data fusion (higher computational burden) • Multiple Node, Single Modality (MN, SM) • Higher communication burden • Decision fusion • Data fusion (higher computational burden) • Multiple Node, Multiple Modality (MN, MM) • Highest communication and computational burden • Decision fusion across modalities and nodes • Data fusion across modalities, decision fusion across nodes • Data fusion across modalities and nodes

  13. Event Detection • Simple energy detector • Detect a target/event when the output exceeds an adaptive threshold (CFAR) • Detector output: • At any instant is the average energy in a certain window • Is sampled at a certain rate based on a priori estimate of target velocity and signal bandwidth • Output parameters for each event: • max value (CPA – closest point of approach) • time stamps for: onset, max, offset • time series for classification • Multi-node and multi-modality collaboration

  14. Detector Output Illustration [Figure: detector output time series for several events]

  15. Constant False Alarm Rate (CFAR) Detection • Energy detector is designed to maintain a CFAR • Detector threshold is adapted to the statistics of the decision variable under the noise hypothesis • Let x[n] denote a sensor time series • Energy detector: y[n] = (1/W) sum_{m=0}^{W-1} |x[n-m]|^2, where W is the detector window length • Detector decision: y[n] > g → target present; y[n] <= g → target absent
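The sliding-window energy detector and threshold test can be sketched in Python; this is a minimal illustration with a toy signal and a fixed threshold g (the CFAR adaptation of the threshold is a separate step):

```python
# A minimal sketch of the sliding-window energy detector, assuming a
# real-valued time series and a fixed threshold gamma. The toy signal
# (noise, then a strong event, then noise) is illustrative.

def energy_detector(x, W):
    """y[n] = average energy over the last W samples."""
    return [sum(v * v for v in x[n - W + 1:n + 1]) / W
            for n in range(W - 1, len(x))]

def decide(y, gamma):
    """True where the detector output exceeds the threshold (target present)."""
    return [v > gamma for v in y]

noise = [0.1, -0.1, 0.1, -0.1]
event = [2.0, -2.0, 2.0, -2.0]
y = energy_detector(noise + event + noise, W=4)
hits = decide(y, gamma=1.0)
```

The onset, max (CPA), and offset time stamps of an event would be read off the runs of `True` values in `hits`.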

  16. CFAR Threshold Computation • Choose the threshold g so that the False Alarm Rate (FAR) is below a specified value • Under a Gaussian model for the decision variable under the noise hypothesis, the desired threshold is g = mu + sigma * Q^{-1}(FAR), where mu and sigma are the mean and standard deviation of the decision variable under noise • These estimates are continually updated to adapt the threshold
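Assuming the detector output is approximately Gaussian under the noise-only hypothesis, the threshold for a target false alarm rate can be sketched as follows (the Gaussian assumption and the function name are illustrative):

```python
# A sketch of CFAR threshold computation, assuming the detector output
# is approximately N(mu, sigma) under the noise-only hypothesis, so
# gamma = mu + sigma * Q^{-1}(FAR). NormalDist supplies the inverse CDF.

from statistics import NormalDist

def cfar_threshold(mu, sigma, far):
    """Threshold giving false-alarm rate `far` for Gaussian noise output."""
    return mu + sigma * NormalDist().inv_cdf(1.0 - far)

# Noise statistics mu, sigma would be continually re-estimated in practice
gamma = cfar_threshold(mu=0.5, sigma=0.1, far=0.01)
```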

  17. Single Target Classification: Overview • Single measurement classifiers • MAP/ML Gaussian classifiers • NN classifiers (benchmark) • Training and Performance Evaluation • Confusion matrices • Multiple measurement classifiers • Data fusion (dependent measurements) • Decision fusion (independent measurements) • Different possibilities for CSP-based classification • Single node, multiple sensing modalities (SN, MM) • Multiple nodes, single sensing modality (MN, SM) • Multiple nodes, multiple sensing modalities (MN, MM) • The basic ideas illustrate general CSP principles in distributed decision making

  18. Single Measurement Classifier • M possible target classes: j = 1, …, M • x : N-dim. (complex-valued) event feature vector • x belongs to the j-th class with probability p(j|x) • C: classifier assigns one of the classes to x • MAP: C(x) = m if p(m|x) >= p(j|x) for all j • Bayes rule: p(j|x) = p(x|j) p(j) / p(x) • Equal priors (ML): C(x) = arg max_j p(x|j)
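The MAP/ML rule can be sketched in a few lines; the toy likelihood functions below are illustrative stand-ins, not the tutorial's Gaussian models:

```python
# A minimal sketch of the MAP rule C(x) = argmax_j p(x|j) p(j); with
# equal priors this reduces to the ML rule. The two toy likelihood
# functions are illustrative, not the tutorial's Gaussian classifiers.

def map_classify(x, likelihoods, priors):
    """Return the class index maximizing p(x|j) * p(j)."""
    scores = [lik(x) * p for lik, p in zip(likelihoods, priors)]
    return scores.index(max(scores))

# Two toy classes: class 0 favors small |x|, class 1 favors large |x|
liks = [lambda x: 1.0 / (1.0 + x * x),
        lambda x: x * x / (1.0 + x * x)]
label = map_classify(2.0, liks, priors=[0.5, 0.5])
```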

  19. Single Measurement Classifier – Pictorially [Figure: an event feature vector x enters the classifier; class likelihoods for M=3 classes are computed; the decision C(x)=2 is the maximum]

  20. Effective Event Feature Vectors • Each event typically yields multiple feature vectors • An effective feature vector for each event is extracted, such as the mean

  21. Gaussian Classifiers • Assume that for class j, x has a complex Gaussian distribution with mean vector mu_j and covariance matrix Sigma_j • The mean and covariance are ensemble averages over class j • Superscript H denotes complex conjugate transpose • Likelihood function for class j: p(x|j) = exp(-(x - mu_j)^H Sigma_j^{-1} (x - mu_j)) / (pi^N det(Sigma_j))
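The complex-Gaussian likelihood can be sketched with NumPy; working in the log domain is a standard numerical choice (not spelled out on the slide) that avoids underflow for high-dimensional feature vectors:

```python
# A sketch of the complex-Gaussian log-likelihood
#   log p(x|j) = -N log(pi) - log det(Sigma_j)
#                - (x - mu_j)^H Sigma_j^{-1} (x - mu_j),
# assuming a circularly symmetric complex Gaussian model as on the slide.

import numpy as np

def gaussian_loglik(x, mu, Sigma):
    """Log-likelihood of complex feature vector x under CN(mu, Sigma)."""
    N = len(mu)
    d = x - mu
    quad = (d.conj().T @ np.linalg.solve(Sigma, d)).real
    _, logdet = np.linalg.slogdet(Sigma)  # stable log-determinant
    return -N * np.log(np.pi) - logdet - quad

mu = np.zeros(2, dtype=complex)
Sigma = np.eye(2, dtype=complex)
ll = gaussian_loglik(np.zeros(2, dtype=complex), mu, Sigma)
```

An ML Gaussian classifier evaluates this for each class j and picks the maximum.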

  22. Training and Performance Assessment • Training events are available for each class • 3-way cross validation – partition the data into three sets with an equal number of events for each class • Three sets of experiments: each set is held out for testing in turn, with the other two used for training

  23. Training and Testing • In each experiment we have: • Training phase: estimate the mean and covariance for each class j = 1, …, M from the two training data sets • Testing phase: using the means and covariances estimated from the two training data sets, test the performance of the classifier on the third (testing) set

  24. Confusion Matrix • Entry (m, j) of the confusion matrix is the number of events from class m classified as class j

  25. Probability of Detection, Probability of False Alarm, Belief • Probability of detection for class m: the fraction of class-m events correctly classified as m (diagonal entry over the m-th row sum) • Probability of false alarm for class m: the fraction of events from other classes classified as m (off-diagonal entries of the m-th column over the total number of events from other classes) • Prior belief in the classifier decisions (via training): the belief in class m given decision j is obtained by normalizing the j-th column of the confusion matrix
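These three quantities can be sketched directly from a confusion matrix (the small 2-class matrix below is a toy example):

```python
# A sketch of computing PD, PFA, and belief from a confusion matrix
# c[m][j] = number of events from class m classified as class j.

def pd_pfa_belief(c):
    M = len(c)
    total = sum(sum(row) for row in c)
    col = [sum(c[m][j] for m in range(M)) for j in range(M)]  # column sums
    # PD_m: correct class-m decisions over all class-m events (m-th row)
    pd = [c[m][m] / sum(c[m]) for m in range(M)]
    # PFA_m: other-class events decided m over all other-class events
    pfa = [(col[m] - c[m][m]) / (total - sum(c[m])) for m in range(M)]
    # belief[m][j]: P(true class m | decision j), j-th column normalized
    belief = [[c[m][j] / col[j] for j in range(M)] for m in range(M)]
    return pd, pfa, belief

c = [[8, 2], [1, 9]]  # toy 2-class confusion matrix
pd, pfa, belief = pd_pfa_belief(c)
```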

  26. Benchmark: Nearest Neighbor (NN) Classifier • Given the set of all training event feature vectors (containing all classes) and a test event feature vector to be classified: find the training feature vector that is closest to the test feature vector. Assign the label of the closest training feature vector to the test event
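The NN rule can be sketched as follows (the Euclidean distance and the toy training set are illustrative):

```python
# A minimal sketch of the nearest-neighbor rule: assign the test vector
# the label of the closest training feature vector (Euclidean distance).

import math

def nn_classify(train, x):
    """train: list of (feature_vector, label) pairs; x: test vector."""
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    _, label = min(train, key=lambda fv: dist(fv[0], x))
    return label

# Toy training set with two illustrative vehicle classes
train = [((0.0, 0.0), "wheeled"), ((1.0, 1.0), "tracked")]
label = nn_classify(train, (0.9, 0.8))
```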

  27. Multiple Measurements • K measurements (from a detected event) • Different nodes or sensing modalities • x_k : event feature vector for the k-th measurement • Classifier C assigns one of the M classes to the K event measurements • Equal priors (ML): C(x_1, …, x_K) = arg max_j p(x_1, …, x_K | j)

  28. Data Fusion – Gaussian Classifier • Assume that the different measurements (x_1, …, x_K) are jointly Gaussian and correlated • The concatenated event feature vector (KN dim.) is Gaussian with a mean and covariance that characterize the j-th class and can be estimated from training data → cross-validation, confusion matrices, PD, PFA, belief as before

  29. Multiple Measurement Classifier – Data Fusion [Figure: event feature vectors from 2 measurements are concatenated; class likelihoods for M=3 classes are computed; the decision C(x)=3 is the maximum] Higher computational burden

  30. Data Fusion – NN Classifier • Form the set of all concatenated training event feature vectors (containing all classes) • Classify the concatenated test event feature vector (NK dimensional) with the same nearest-neighbor rule

  31. Forms of Data Fusion in CSP (K modalities, P nodes) • Data fusion of multiple modalities (e.g., acoustic and seismic) at each node (SN, MM) • Higher comp. burden (NK dim. data) • No additional comm. burden • Data fusion of a single modality at multiple nodes (MN, SM) • Higher computational burden at manager node (PN dim. data) • Higher communication burden due to transmission of N dim. data from different nodes to the manager node • Data fusion of multiple modalities at multiple nodes (MN, MM) • Highest computational burden at manager node (NKP dim. data) • Highest communication burden due to transmission of KN dim. multi-modality data from different nodes to the manager node

  32. Pros and Cons of Data Fusion • Pros • Maximal exploitation of available information in multiple times series • Potentially the best performing classification scheme • Cons • High computational burden • High communication burden if data fusion across nodes • Need larger amount of data for training • Inconsistencies between measurements could cause performance degradation (malfunctioning nodes, e.g.)

  33. Decision Fusion • Suppose the different measurements are statistically independent: for each class j, p(x_1, …, x_K | j) = p(x_1 | j) ··· p(x_K | j) • This suggests combining the decisions of component classifiers for the different measurements

  34. Hard or Soft Decisions • Soft decisions (e.g., class likelihoods or posterior probabilities) for different measurements may be combined to form the final decision • Hard decisions (class labels: 1 for the chosen class, 0 otherwise) for different measurements may be combined to form the final decision

  35. Multiple Measurement Classifier – Soft Decision Fusion [Figure: event feature vectors from 2 measurements produce component decisions that are combined; the final decision C(x)=1 is the maximum of the combined scores] Lower computational burden

  36. Different Ways of Soft Decision Fusion • All based on inequalities relating the product and sum of the component posteriors • Product rule, equal priors (ML): C(x_1, …, x_K) = arg max_j prod_k p(j | x_k) • "On combining classifiers," Kittler et al., IEEE Trans. Pattern Anal. Machine Intelligence, March 1998

  37. Soft Decision Fusion: Sum Rule • The product rule suffers from the problem that one small component likelihood can heavily influence the final decision • Sum rule, equal priors (ML): C(x_1, …, x_K) = arg max_j sum_k p(j | x_k)
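The product and sum rules can be sketched side by side; the toy posteriors below illustrate the point that one small component likelihood can flip the product rule but not the sum rule:

```python
# A sketch of product-rule and sum-rule fusion of per-measurement
# posteriors, assuming equal priors. posts[k][j] = p(j | x_k).

import math

def product_rule(posts):
    scores = [math.prod(p[j] for p in posts) for j in range(len(posts[0]))]
    return scores.index(max(scores))

def sum_rule(posts):
    scores = [sum(p[j] for p in posts) for j in range(len(posts[0]))]
    return scores.index(max(scores))

# Two measurements favor class 0; one tiny component posterior (0.01)
# drags the product toward class 1 but barely moves the sum.
posts = [[0.9, 0.1], [0.9, 0.1], [0.01, 0.99]]
```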

  38. Soft Decision Fusion: Median, Min and Max Rules • Assume equally likely priors • Median rule: C = arg max_j median_k p(j | x_k) • Min rule: C = arg max_j min_k p(j | x_k) • Max rule: C = arg max_j max_k p(j | x_k)

  39. Hard Decision Fusion • Majority vote rule: each component classifier casts a hard decision (a vote of 1 for its chosen class, 0 otherwise); the class receiving the most votes is the final decision
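A minimal sketch of the majority-vote rule:

```python
# A minimal sketch of the majority-vote rule over K hard decisions.

from collections import Counter

def majority_vote(decisions):
    """Return the class label receiving the most component votes."""
    return Counter(decisions).most_common(1)[0][0]

# Three component classifiers: two vote for class 1, one for class 3
final = majority_vote([1, 3, 1])
```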

  40. Multiple Measurement Classifier – Hard Decision Fusion [Figure: event feature vectors from 3 measurements (M=3 classes) produce component hard decisions 1, 3, 1; the majority vote gives the final decision C(x)=1] Lower computational burden

  41. Hard Decision Fusion with Prior Beliefs • Product rule based on beliefs: for each class j, multiply the prior beliefs in class j given the component hard decisions • The prior belief in class j given a hard decision m can be estimated from the modality confusion matrices (the m-th column of the confusion matrix) • (Similar sum, median, min, and max rules can be based on beliefs)
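The belief-weighted product rule can be sketched as follows; the belief tables play the role of normalized confusion-matrix columns, and the toy numbers are illustrative:

```python
# A sketch of the belief-weighted product rule: combine hard decisions
# d_k by multiplying beliefs[k][j][d_k] = P(class j | classifier k
# decided d_k), estimated from each classifier's confusion matrix.

def belief_product(beliefs, decisions):
    """beliefs[k][j][d]: belief in class j given classifier k decided d."""
    M = len(beliefs[0])
    scores = [1.0] * M
    for b, d in zip(beliefs, decisions):
        for j in range(M):
            scores[j] *= b[j][d]
    return scores.index(max(scores))

# Two component classifiers; classifier 1 is more reliable (beliefs
# closer to 0/1) than classifier 0, so its decision carries more weight.
beliefs = [[[0.6, 0.4], [0.4, 0.6]],
           [[0.9, 0.1], [0.1, 0.9]]]
label = belief_product(beliefs, decisions=[1, 0])
```

Here the reliable classifier's vote for class 0 overrides the weak classifier's vote for class 1.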

  42. Multiple Measurement Classifier – Hard Decision Fusion With Prior Beliefs [Figure: component hard decisions for 2 measurements select component class beliefs, which are combined into combined class beliefs; the final decision is C(x)=1]

  43. Forms of Decision Fusion in CSP • Decision fusion of multiple modalities (e.g., acoustic and seismic) at each node (SN, MM) • Relatively low comp. burden (K dim. data) • No additional communication burden • Decision fusion of a single modality at multiple nodes (MN, SM) • Relatively low comp. burden at manager node (P dim. data) • Relatively low communication burden since decisions are exchanged • Decision fusion of multiple modalities at multiple nodes (MN, MM) • Slightly higher comp. burden at manager node (KP dim. data) • Slightly higher comm. burden due to exchange of K dim. multi-modality decisions

  44. Pros and Cons of Decision Fusion • Pros • No loss of information if different measurements are independent • Potentially as good as data fusion • Significantly lower computational and communication burden relative to data fusion • Need less training data to train each component classifier for each modality (joint training needed in data fusion) • Cons • Loss of performance if measurements are correlated • Relatively higher communication burden when decision fusion across nodes (compared to intra-node collaboration) • Inconsistencies between different modalities could cause performance degradation (less severe than data fusion)

  45. Pictorial Summary: Single Node (SN) CSP [Figure legend: event feature vector and component classifier for the p-th node, k-th modality] • 1) Single node, single modality (SN, SM): no CSP • 2) Single node, multiple modalities (SN, MM): a) decision fusion; b) data fusion (higher computational burden)

  46. Pictorial Summary: MN-SM CSP [Figure legend: event feature vector and component classifier for the p-th node, k-th modality] • 3) Multiple node, single modality (MN, SM): a) decision fusion at the manager node

  47. Pictorial Summary: MN-SM CSP [Figure legend: event feature vector and component classifier for the p-th node, k-th modality] • 3) Multiple node, single modality (MN, SM): b) data fusion at the manager node (higher computational burden, higher communication burden)

  48. Pictorial Summary: MN-MM CSP [Figure legend: event feature vector and component classifier for the p-th node, k-th modality] • 4) Multiple nodes, multiple modalities (MN, MM): a) decision fusion in modalities and over nodes at the manager node (minimal communication and computational burden)

  49. Pictorial Summary: MN-MM CSP [Figure legend: event feature vector and component classifier for the p-th node, k-th modality] • 4) Multiple nodes, multiple modalities (MN, MM): b) data fusion across modalities and decision fusion over nodes at the manager node (higher computational burden)

  50. Pictorial Summary: MN-MM CSP [Figure legend: event feature vector and component classifier for the p-th node, k-th modality] • 4) Multiple nodes, multiple modalities (MN, MM): c) data fusion across nodes in modality 1 and decision fusion in modality 2 at the manager node (higher computational and communication burden)
