Annotating Music and Lyrics

Presentation Transcript


  1. Annotating Music and Lyrics Kristine Monteith CS 652 - Research Project June 11, 2009

  2. Project Goal • Find a suitable song for a given situation • Applications • Indexing songs by topic and mood • Soundtracks for movie scenes • Music therapy groups

  3. What do we want to label? • Lyrics • Themes • Moods • Music • Moods • Energy • Emotional evocativeness • Genre

  4. Song Ontology

  5. How to Extract Features: Lyrics • Current state: Keyword search • Find all occurrences of the search term in the song • Find all occurrences of search term synonyms (using WordNet synsets) • Future work: • Extend searches to phrases • Determine mood
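
A minimal sketch of the keyword search described in slide 5, assuming NLTK's WordNet interface (the corpus must be downloaded first); the function names and the sample lyric are illustrative, not taken from the project:

    # Requires: pip install nltk, then nltk.download("wordnet")
    from nltk.corpus import wordnet as wn

    def expand_query(term):
        """Return the search term plus the lemmas of its WordNet synsets."""
        terms = {term.lower()}
        for synset in wn.synsets(term):
            for lemma in synset.lemmas():
                terms.add(lemma.name().replace("_", " ").lower())
        return terms

    def count_occurrences(lyrics, term):
        """Count occurrences of the term and its synonyms in the lyrics."""
        targets = expand_query(term)
        words = (w.strip(".,!?") for w in lyrics.lower().split())
        return sum(1 for w in words if w in targets)

    # "passion" is a WordNet synonym of "love", so both words are counted.
    print(count_occurrences("I love you, my passion runs deep", "love"))  # -> 2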

  6. Supervised Learning [Diagram: labeled training data is used to train a classifier; the features of a musical selection are fed to the classifier, which outputs a prediction of the target label]
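
A minimal sketch of this flow, assuming scikit-learn; the toy feature vectors, mood labels, and the choice of a decision tree are illustrations only, not the project's actual classifier:

    from sklearn.tree import DecisionTreeClassifier

    # Labeled training data: features of musical selections plus hand-assigned labels
    X_train = [[0.2, 0.7], [0.8, 0.1], [0.3, 0.6], [0.9, 0.2]]
    y_train = ["calm", "energetic", "calm", "energetic"]

    clf = DecisionTreeClassifier().fit(X_train, y_train)

    # Predict the target label for a new, unlabeled selection
    print(clf.predict([[0.25, 0.65]]))  # -> ['calm']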

  7. Input Features • Bag of Words • Words appearing on the page and their counts • Looking for other methods to analyze documents and collect input features
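
A short bag-of-words sketch, assuming scikit-learn's CountVectorizer and two placeholder lyric snippets:

    from sklearn.feature_extraction.text import CountVectorizer

    lyrics = [
        "rain keeps falling on my heart",
        "dance all night under city lights",
    ]

    vectorizer = CountVectorizer()
    counts = vectorizer.fit_transform(lyrics)   # sparse document-term matrix

    print(vectorizer.get_feature_names_out())   # vocabulary (the "bag")
    print(counts.toarray())                     # per-song word counts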

  8. Output Labels • Derived from questionnaire • Hand-labeled by researchers

  9. How to Extract Features: Music • Target labels to predict • Moods (labeled by subject or expert) • Energy (determined by direction of change in biofeedback responses) • Emotional evocativeness (determined by extent of change in biofeedback responses) • Genre (labeled by subject, expert, or clustering)

  10. Input Features: Acoustic Properties • Spectral Centroid • Spectral Rolloff Point • Spectral Flux • Compactness • Spectral Variability • Root Mean Square • Fraction of Low Energy Windows • Zero Crossings • Strongest Beat • Beat Sum • Strength of Strongest Beat • Strongest Frequency Via Spectral Centroid • Strongest Frequency Via FFT Maximum

  11. Input Features: Acoustic Properties • MFCC • LPC • Method of Moments • Partial Based Spectral Centroid • Partial Based Spectral Flux • Peak Based Spectral Smoothness • Relative Difference Functions • Area Method of Moments
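
The names above match common audio-feature toolkits such as jAudio; purely as an illustration, a few of them can be approximated with librosa (the library and the file path are assumptions, not part of the project):

    import numpy as np
    import librosa

    # "song.wav" is a placeholder path
    y, sr = librosa.load("song.wav", sr=None, mono=True)

    features = {
        # frame-level features averaged over the whole selection
        "spectral_centroid":  np.mean(librosa.feature.spectral_centroid(y=y, sr=sr)),
        "spectral_rolloff":   np.mean(librosa.feature.spectral_rolloff(y=y, sr=sr)),
        "rms":                np.mean(librosa.feature.rms(y=y)),
        "zero_crossing_rate": np.mean(librosa.feature.zero_crossing_rate(y)),
        "mfcc":               np.mean(librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13), axis=1),
    }
    print(features)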

  12. Input Features: Symbolic Features • Tempo • Key • Mode • Musical form • Rhythmic structure • Vocalization • Instrumentation • Melodic contour • Harmonic patterns
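
A sketch of reading a few of these symbolic features (key, mode, tempo) from a MIDI file; music21 and the file path are assumptions, since the slides do not name a toolkit:

    from music21 import converter

    score = converter.parse("song.mid")          # placeholder path

    key = score.analyze("key")                   # estimated key and mode
    print("key:", key.tonic.name, "mode:", key.mode)

    tempos = list(score.flatten().getElementsByClass("MetronomeMark"))
    if tempos:
        print("tempo:", tempos[0].number, "BPM")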

  13. Output Labels: Questionnaire-based Responses

  14. Output Labels: Physiological Responses • Heart rate • Breathing rate • Perspiration • Skin temperature • Each subject will listen to one-minute segments separated by one minute of silence
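
A sketch of how these readings could produce the biofeedback-based labels from slide 9, with energy taken from the direction of change and emotional evocativeness from the extent of change between the silent minute and the musical segment; the heart-rate numbers and function name are assumptions:

    import numpy as np

    def biofeedback_labels(silence, segment):
        """silence / segment: heart-rate samples (BPM) from the silent minute
        and the one-minute musical segment that follows it."""
        change = float(np.mean(segment) - np.mean(silence))
        energy = "up" if change > 0 else "down"   # direction of change
        evocativeness = abs(change)               # extent of change
        return energy, evocativeness

    silence = np.array([68, 70, 69, 69])          # baseline readings
    segment = np.array([75, 76, 74, 75])          # readings while listening
    print(biofeedback_labels(silence, segment))   # -> ('up', 6.0)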

  15. Conclusion • Demo: Music Therapist Assistant • Any Questions/Suggestions?
