
Pest Sound Classification



  1. Pest Sound Classification Yi Yu

  2. Pest Sound Classification Contest
  • Reference
    • www.cs.ucr.edu/~eamonn/CE/contest.htm
    • PDF: UCR Insect Classification Contest
  • Two stages
    • Phase I: July to November 16th, 2012
      • The task is to produce the best distance (similarity) measure for insect flight sounds.
    • Phase II: Spring 2013 to Fall 2013 (tentatively)
      • A more general insect flight sound contest (the classifier does not have to be distance based; any classifier may be used).
      • Clustering, anomaly detection, or other analyses of insect sounds
  • Phase I contest
    • 5,000 exemplars in five classes. Call this D1.
    • D1 is split into two sets: D1public and D1evaluation.
    • D1public is publicly available and contains 500 objects.
    • D1evaluation is reserved for evaluation and contains 4,500 objects.

  3. Phase I
  • Publicly available framework and metric
    • Metric: leave-one-out accuracy of the one-nearest-neighbor algorithm using the provided distance function.
    • The contest organizers provide a simple framework in MATLAB:
      • Pick the ith pest sound from the dataset and measure the distance between it and each of the remaining pest sounds.
      • If the jth pest sound has the shortest distance to the ith pest sound and their labeled class IDs match, the ith pest sound counts as correctly classified in the leave-one-out test.
    • A simple interface: take two pest sounds as input and return a distance.
  • Contest participants
    • Each team provides one MATLAB file implementing the defined interface.
    • The organizers evaluate it and return both the leave-one-out accuracy and the hold-out accuracy, but provide no further details.
  • Main focus
    • How to select features and how to define the distance metric.
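The contest framework itself is in MATLAB, but the leave-one-out 1-NN evaluation described above can be sketched in Python. The feature representation and the `distance` callable are placeholders for whatever the participant's MATLAB interface would compute:

```python
import numpy as np

def leave_one_out_accuracy(features, labels, distance):
    """Leave-one-out 1-NN accuracy, mirroring the contest's metric.

    features: list of per-sound feature vectors (hypothetical representation)
    labels:   class IDs, one per sound
    distance: callable taking two feature vectors and returning a scalar
    """
    n = len(features)
    correct = 0
    for i in range(n):
        # Find the nearest neighbor among all the other sounds.
        best_j, best_d = None, float("inf")
        for j in range(n):
            if j == i:
                continue
            d = distance(features[i], features[j])
            if d < best_d:
                best_d, best_j = d, j
        # Correct if the nearest neighbor shares the same class ID.
        if labels[best_j] == labels[i]:
            correct += 1
    return correct / n

# Toy usage with Euclidean distance on made-up 2-D features.
feats = [np.array([0.0, 0.0]), np.array([0.1, 0.0]),
         np.array([5.0, 5.0]), np.array([5.1, 5.0])]
labs = [1, 1, 2, 2]
euclid = lambda a, b: float(np.linalg.norm(a - b))
print(leave_one_out_accuracy(feats, labs, euclid))  # → 1.0
```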

  4. Feature for Pest Sound
  • The pest sound is recorded by a laser sensor.
    • When a flying insect crosses the laser beam, its wings partially occlude the light, causing small light fluctuations captured by a phototransistor.
    • This signal is filtered and amplified by a custom-designed board, and the output signal is recorded as audio data.
  • The wing-beat frequency is said to be the most critical feature.
    • In this sense, the pitch of the pest sound is the most appropriate feature.
  • Refer to: “SIGKDD Demo: Sensors and Software to Allow Computational Entomology, an Emerging Application of Data Mining”
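One simple way to estimate the wing-beat frequency mentioned above is to take the peak of the magnitude spectrum of the short recording. This is only an illustrative sketch, not the pipeline from the SIGKDD demo paper; the signal here is synthetic:

```python
import numpy as np

def wingbeat_frequency(signal, sample_rate):
    """Estimate the dominant (wing-beat) frequency of a short recording
    by picking the peak of its magnitude spectrum."""
    windowed = signal * np.hanning(len(signal))   # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[0] = 0.0                             # ignore the DC component
    return freqs[np.argmax(spectrum)]

# Synthetic 100 ms "insect" tone at 400 Hz, sampled at 16 kHz.
sr = 16000
t = np.arange(int(0.1 * sr)) / sr
sig = np.sin(2 * np.pi * 400.0 * t)
print(wingbeat_frequency(sig, sr))  # → 400.0
```

With a 100 ms window the frequency resolution is only 10 Hz, which illustrates why such short recordings make fine-grained spectral features hard to extract.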

  5. Continued
  • We have already evaluated several audio features, such as pitch, STFT, MFCC, and chroma.
  • Potential distance metrics
    • Our current choice: Euclidean distance d(a,b)
    • Tried but not useful: d(a,b) = 1 - correlation(a,b)
    • KL distance (not tried yet): requires statistical information, but a pest sound is very short, usually less than 100 ms.
  • Current results
    • The leave-one-out accuracy of the one-nearest-neighbor classifier using STFT is about 0.924.
    • Pest sounds belonging to different classes can have very similar spectra, which leads to classification failures.
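The two distance metrics compared above, Euclidean distance and 1 - correlation, can be written as a few lines over feature vectors. This is a generic sketch of the metrics, not the contest submission; the example vectors are made up:

```python
import numpy as np

def euclidean(a, b):
    """The current choice: plain Euclidean distance d(a,b)."""
    return float(np.linalg.norm(a - b))

def correlation_distance(a, b):
    """The tried-but-not-useful variant: d(a,b) = 1 - correlation(a,b)."""
    return 1.0 - float(np.corrcoef(a, b)[0, 1])

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([2.0, 4.0, 6.0, 8.0])   # perfectly correlated with a
print(euclidean(a, b))               # → 5.477... (sqrt of 30)
print(correlation_distance(a, b))    # → 0.0 (up to float rounding)
```

The example hints at why 1 - correlation can fail here: `b` is far from `a` in Euclidean terms yet has zero correlation distance, so spectra that differ only in amplitude collapse onto each other.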

  6. Result
  • Start comparing features ...
  • 37 out of 500 done, misclassified as 249, 4 --> 3
  • 51 out of 500 done, misclassified as 476, 3 --> 4
  • 62 out of 500 done, misclassified as 388, 3 --> 5
  • 77 out of 500 done, misclassified as 497, 1 --> 5
  • 122 out of 500 done, misclassified as 151, 1 --> 5
  • 135 out of 500 done, misclassified as 249, 4 --> 3
  • 151 out of 500 done, misclassified as 480, 5 --> 1
  • 165 out of 500 done, misclassified as 401, 5 --> 1
  • 205 out of 500 done, misclassified as 169, 5 --> 1
  • 206 out of 500 done, misclassified as 165, 1 --> 5
  • 210 out of 500 done, misclassified as 16, 2 --> 4
  • 215 out of 500 done, misclassified as 279, 3 --> 4
  • 218 out of 500 done, misclassified as 39, 4 --> 3
  • 225 out of 500 done, misclassified as 249, 4 --> 3
  • 249 out of 500 done, misclassified as 135, 3 --> 4
  • 272 out of 500 done, misclassified as 309, 1 --> 5
  • 273 out of 500 done, misclassified as 275, 3 --> 4
  • 275 out of 500 done, misclassified as 273, 4 --> 3
  • 301 out of 500 done, misclassified as 169, 5 --> 1
  • 316 out of 500 done, misclassified as 398, 1 --> 5
  • 327 out of 500 done, misclassified as 173, 3 --> 2
  • 334 out of 500 done, misclassified as 446, 4 --> 3
  • 362 out of 500 done, misclassified as 497, 1 --> 5
  • 398 out of 500 done, misclassified as 65, 5 --> 1
  • 401 out of 500 done, misclassified as 165, 1 --> 5
  • 403 out of 500 done, misclassified as 257, 5 --> 1
  • 424 out of 500 done, misclassified as 354, 1 --> 5
  • 431 out of 500 done, misclassified as 361, 1 --> 5
  • 458 out of 500 done, misclassified as 338, 3 --> 2
  • 469 out of 500 done, misclassified as 356, 3 --> 2
  • 471 out of 500 done, misclassified as 337, 3 --> 4
  • 472 out of 500 done, misclassified as 470, 1 --> 3
  • 473 out of 500 done, misclassified as 59, 1 --> 5
  • 480 out of 500 done, misclassified as 151, 1 --> 5
  • 497 out of 500 done, misclassified as 77, 5 --> 1
  • Evaluation results
  • The dataset you tested has 5 classes
  • The data set is of size 500.
  • The error rate was 0.07
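As a sanity check, the reported error rate follows directly from the log: it lists 35 "misclassified as" lines out of 500 test sounds.

```python
misclassified = 35          # count of "misclassified as" lines in the log
total = 500
error_rate = misclassified / total
print(error_rate)           # → 0.07
print(1 - error_rate)       # accuracy: 0.93
```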
