
Enhancing Underwater Mine Detection Using Semi-Supervised Multi-Task Learning

This study by Lawrence Carin, Qiuhua Liu, and Xuejun Liao at Duke University, in collaboration with the Office of Naval Research, explores a novel approach for underwater mine detection. Leveraging semi-supervised multi-task learning, the researchers propose a unified sharing structure to simultaneously learn multiple classifiers from partially labeled data manifolds. The sharing structure places a joint prior distribution over the classifiers to capture inter-task relationships, enhancing classification performance within and across different underwater scenes. By representing the data manifold with Markov random walks and exploiting both intra-scene and inter-scene context, the approach aims to improve the accuracy and efficiency of mine detection in challenging underwater environments.




Presentation Transcript


  1. Multi-Task Semi-Supervised Underwater Mine Detection. Lawrence Carin, Qiuhua Liu and Xuejun Liao (Duke University); Jason Stack (Office of Naval Research)

  2. Intra-Scene Context

  3. Individual Signatures Processed by Supervised Classifiers vs. What the Analyst Processes. Message: the analyst places the classification of any given item within the context of all items in the scene, whereas a supervised classifier classifies each item in isolation.

  4. Decision surface based on labeled data (supervised) vs. decision surface based on labeled and unlabeled data (semi-supervised).
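This contrast can be made concrete with a small numpy sketch (not from the presentation; all data and parameter values are illustrative). A dense chain of unlabeled points connects a query to a distant labeled example, so propagating labels through the unlabeled data reverses the decision a purely supervised nearest-labeled-neighbor rule would make:

```python
import numpy as np

# Dense chain of points labeled "0" only at x=0; far cluster labeled "1" at x=10.
chain = np.arange(0.0, 7.0, 0.5)
cluster = np.array([9.0, 9.5, 10.0])
X = np.concatenate([chain, cluster])
labels = -np.ones(len(X), dtype=int)   # -1 marks unlabeled
labels[0] = 0                          # x = 0.0
labels[-1] = 1                         # x = 10.0

# Supervised rule: nearest labeled neighbor, ignoring unlabeled points.
query = 6.5
labeled_idx = np.where(labels >= 0)[0]
nearest = labeled_idx[np.argmin(np.abs(X[labeled_idx] - query))]
supervised_pred = int(labels[nearest])  # x=10 is closer than x=0 -> class 1

# Semi-supervised: propagate labels through a Gaussian-kernel random-walk graph.
sigma = 0.5
W = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * sigma ** 2))
A = W / W.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
F = np.zeros((len(X), 2))
F[labels >= 0] = np.eye(2)[labels[labels >= 0]]
for _ in range(500):
    F = A @ F
    F[labels >= 0] = np.eye(2)[labels[labels >= 0]]  # clamp labeled points
semi_pred = int(F[np.argmin(np.abs(X - query))].argmax())

print(supervised_pred, semi_pred)  # supervised: 1, semi-supervised: 0
```

The unlabeled chain carries class 0 across the manifold, so the semi-supervised decision at the query flips relative to the supervised one.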

  5. Inter-Scene Context

  6. Message • Humans are very good at exploiting context, both within a given scene and across multiple scenes • Intra-scene context: semi-supervised learning • Inter-scene context: multi-task and transfer learning • A major focus of machine learning these days

  7. Data Manifold Representation Based on Markov Random Walks. Given X = {x1, …, xN}, first construct a graph G = (X, W) with affinity matrix W, where the (i, j)-th element of W is defined by a Gaussian kernel: W_ij = exp(−‖x_i − x_j‖² / (2σ²)). We then consider a Markov transition matrix A, which defines a Markov random walk; its (i, j)-th element A_ij = W_ij / Σ_k W_ik gives the probability of walking from x_i to x_j in a single step. The one-step Markov random walk provides a local similarity measure between data points.
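The two matrices on this slide translate directly into numpy (a minimal sketch; the random data and the kernel width sigma are illustrative placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))   # N = 20 points in 2-D (placeholder data)
sigma = 1.0                    # Gaussian kernel width (a free parameter)

# Affinity matrix W: W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-sq_dists / (2.0 * sigma ** 2))

# Markov transition matrix A: row-normalize W so that A_ij is the
# probability of walking from x_i to x_j in one step.
A = W / W.sum(axis=1, keepdims=True)

print(np.allclose(A.sum(axis=1), 1.0))  # each row is a probability distribution
```

Row-normalizing the affinities is what turns a symmetric similarity graph into a Markov random walk: each row of A is a probability distribution over one-step destinations.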

  8. Semi-Supervised Multitask Learning (1/2) • Semi-supervised MTL: Given M partially labeled data manifolds, each defining a classification task, we propose a unified sharing structure to learn the M classifiers simultaneously. • The Sharing Prior: We consider M PNBC classifiers, each with its own task-specific parameters. The M classifiers are not independent but are coupled by a joint prior distribution.

  9. Semi-Supervised Multitask Learning (2/2). The joint prior combines a baseline prior with a prior transferred from previous tasks, weighted by a balance parameter. • The normal distributions encode the meta-knowledge indicating how the present task should be learned, based on experience with a previous task. • When there are no previous tasks, only the baseline prior is used (the balance parameter is set to 1), recovering the single-task PNBC. • Tasks are encouraged to have similar, but not exactly identical, parameters (an advantage over the Dirac delta function used in previous MTL work).
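A heavily simplified sketch of this coupling, not the paper's actual model: here the joint prior is approximated by a weighted sum of quadratic penalties (a zero-mean baseline Gaussian plus Gaussians centred on previously learned task parameters), and each task is fit by MAP logistic regression. The balance parameter pi, the widths sigma0 and sigma, and all data are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_task(X, y, prev_thetas, pi=0.5, sigma0=1.0, sigma=0.3,
             lr=0.1, steps=2000):
    """MAP logistic regression whose prior blends a zero-mean baseline
    Gaussian with Gaussians centred on previously learned parameters."""
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (sigmoid(X @ theta) - y) / len(y)  # logistic loss
        grad += pi * theta / sigma0 ** 2                # baseline prior
        for t in prev_thetas:                           # transferred prior
            grad += (1 - pi) * (theta - t) / (sigma ** 2 * len(prev_thetas))
        theta -= lr * grad
    return theta

rng = np.random.default_rng(1)
true_theta = np.array([2.0, -1.0])

# Task 1 has plenty of labels; task 2 has only a handful.
X1 = rng.normal(size=(200, 2))
y1 = (X1 @ true_theta + rng.normal(0, 0.5, 200) > 0).astype(float)
X2 = rng.normal(size=(8, 2))
y2 = (X2 @ true_theta + rng.normal(0, 0.5, 8) > 0).astype(float)

theta1 = fit_task(X1, y1, [])
theta2 = fit_task(X2, y2, [theta1])  # task 2 borrows strength from task 1
```

Because the transferred Gaussian pulls theta2 toward theta1 rather than forcing equality, the tasks end up with similar but not identical parameters, which is the soft sharing the slide contrasts with a Dirac delta.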

  10. Thanks
