
Music Processing


Presentation Transcript


  1. Music Processing Roger B. Dannenberg

  2. Overview • Music Representation • MIDI and Synthesizers • Synthesis Techniques • Music Understanding

  3. Music Representation • Acoustic Level: sound, samples, spectra • Performance Information: timing, parameters • Notation Information: parts, clefs, stem direction • Compositional Structure: notes, chords, symbolic structure

  4. Performance Information • MIDI bandwidth is 3 KB/s, or 180 KB/min • More typical: 3 KB/minute, 180 KB/hour • Complete Scott Joplin: 1 MB • Output of 50 Composers (400 days of music): 500 MB (1 CD-ROM) • Synthesis of acoustic instruments is a problem
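
(Aside, not on the slide: the link-rate figures follow from the serial rate given on slide 9: 31,250 baud ÷ 10 bits per byte (8 data bits + start + stop) ≈ 3,125 bytes/s ≈ 3 KB/s, and 3 KB/s × 60 ≈ 180 KB/min. The "typical" figures are much lower, presumably because real performances send far fewer messages than the link can carry.)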

  5. Music Notation • Compact, symbolic representation • Does not capture performance information • Expressive “performance” not fully automated

  6. Compositional Structure • Example: Nyquist (free software!)
     (defun melody1 ()
       (seq (stretch q (note a4) (note b4) (note cs5) (note d5))))
     (defun counterpoint () …)
     (defun composition () (sim (melody1) (counterpoint)))
     (play (transpose 4 (composition)))

  7. Overview • Music Representation • MIDI and Synthesizers • Synthesis Techniques • Music Understanding

  8. MIDI: Musical Instrument Digital Interface • Musical Performance Information: • Piano Keyboard key presses and releases • “instrument” selection (by number) • sustain pedal, switches • continuous controls: volume pedal, pitch bend, aftertouch • very compact (human gesture < 100Hz bandwidth)

  9. MIDI (cont’d) • Point-to-point connections: • MIDI IN, OUT, THRU • Channels • No time stamps • (almost) everything happens in real time • Asynchronous serial, 8-bit bytes+start+stop bits, 31.25K baud = 1MHz/32

  10. MIDI Message Formats (status byte, then data bytes)
      • Key Up: 8 ch, key#, vel
      • Key Down: 9 ch, key#, vel
      • Polyphonic Aftertouch: A ch, key#, press
      • Control Change: B ch, ctrl#, value
      • Program Change: C ch, index#
      • Channel Aftertouch: D ch, press
      • Pitch Bend: E ch, lo 7, hi 7
      • System Exclusive: F0 … DATA … F7
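
Not from the slides: a minimal Python sketch of how a status byte from the table above splits into a message type (high nibble) and channel (low nibble). The message names and data-byte counts follow the table.

    # Minimal MIDI channel-voice message decoder (illustration only).
    MESSAGE_TYPES = {
        0x8: ("Key Up", 2),                # key#, vel
        0x9: ("Key Down", 2),              # key#, vel
        0xA: ("Polyphonic Aftertouch", 2), # key#, press
        0xB: ("Control Change", 2),        # ctrl#, value
        0xC: ("Program Change", 1),        # index#
        0xD: ("Channel Aftertouch", 1),    # press
        0xE: ("Pitch Bend", 2),            # lo 7, hi 7
    }

    def decode(message: bytes):
        """Decode one channel-voice message into (name, channel, data bytes)."""
        status = message[0]
        name, n_data = MESSAGE_TYPES[status >> 4]   # high nibble = type
        channel = status & 0x0F                     # low nibble = channel
        return name, channel, list(message[1:1 + n_data])

    # Example: note-on, channel 0, middle C (60), velocity 100.
    print(decode(bytes([0x90, 60, 100])))   # ('Key Down', 0, [60, 100])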

  11. Standard MIDI Files • Key point: must encode timing information • Interleave time differences with the MIDI data:
      <track data> = 1 or more <track event>
      <track event> = <delta time> <event>
      <event> = MIDI data or <meta event>
      <meta event> = FF <type> <length> <data>
      Delta times use a variable-length encoding; a delta of zero is encoded as a single 0x00 byte.
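
A sketch (not from the slides) of the variable-length delta-time encoding: seven data bits per byte, with the top bit set on every byte except the last.

    # Variable-length quantity (VLQ), as used for SMF delta times.
    def encode_vlq(value: int) -> bytes:
        out = [value & 0x7F]
        value >>= 7
        while value:
            out.append((value & 0x7F) | 0x80)   # continuation bit on all but last byte
            value >>= 7
        return bytes(reversed(out))

    def decode_vlq(data: bytes, pos: int = 0):
        """Return (value, next position) starting at data[pos]."""
        value = 0
        while True:
            byte = data[pos]
            pos += 1
            value = (value << 7) | (byte & 0x7F)
            if not byte & 0x80:
                return value, pos

    assert encode_vlq(0) == b"\x00"          # a zero delta is a single 0x00 byte
    assert encode_vlq(128) == b"\x81\x00"
    assert decode_vlq(b"\x81\x00")[0] == 128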

  12. Standard MIDI Files (cont’d) • MThd <length> <header data> (header info) • MTrk <length> <track data> • MTrk <length> <track data> • each track’s data carries up to 16 channels
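
A minimal sketch of reading that chunk layout, assuming the standard SMF convention (4-byte chunk type, 4-byte big-endian length, then data); the file name is hypothetical.

    import struct

    def read_chunks(path):
        """Yield (chunk type, data) for each chunk in a Standard MIDI File."""
        with open(path, "rb") as f:
            while True:
                header = f.read(8)
                if len(header) < 8:
                    return
                chunk_type, length = struct.unpack(">4sI", header)
                yield chunk_type, f.read(length)

    # for chunk_type, data in read_chunks("example.mid"):
    #     print(chunk_type, len(data), "bytes")   # b'MThd' 6 bytes, b'MTrk' ...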

  13. Overview • Music Representation • MIDI and Synthesizers • Synthesis Techniques • Music Understanding

  14. Music Synthesis Introduction • Primary issue is control • No control → Digital Audio (start, stop, ...) • Complete control → Digital Audio (S[0], S[1], S[2], ... ) • Parametric control → Synthesis

  15. Music Synthesis Introduction (cont’d) • What parameters? • pitch • loudness • timbre (e.g. which instrument) • articulation, expression, vibrato, etc. • spatial effects (e.g. reverberation) • Why synthesize? • high-level representation provides precision of specification and supports interactivity

  16. Additive Synthesis • amplitude A[i] and frequency f[i] specified for each partial (sinusoidal component) • potentially 2n times as many control samples as signal samples (amplitude and frequency for each of n partials)!

  17. Additive Synthesis (cont’d) • often use piece-wise linear control envelopes to save space • still difficult to control because of so many parameters • and parameters do not match perceptual attributes
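
Not from the slides: a minimal NumPy sketch of additive synthesis with piece-wise linear amplitude envelopes; the breakpoint values are made-up illustration numbers.

    import numpy as np

    SR = 44100  # sample rate (Hz)

    def pwl(breakpoints, n):
        """Piece-wise linear envelope: [(time_s, value), ...] -> n samples."""
        times, values = zip(*breakpoints)
        return np.interp(np.arange(n) / SR, times, values)

    def additive(partials, dur=1.0):
        """Sum sinusoidal partials, each given as (frequency_hz, envelope breakpoints)."""
        n = int(dur * SR)
        t = np.arange(n) / SR
        out = np.zeros(n)
        for freq, env in partials:
            out += pwl(env, n) * np.sin(2 * np.pi * freq * t)
        return out

    # Three harmonics of A4 with simple attack/decay envelopes (illustrative values).
    tone = additive([
        (440.0,  [(0.0, 0.0), (0.05, 0.50), (1.0, 0.0)]),
        (880.0,  [(0.0, 0.0), (0.05, 0.25), (0.8, 0.0)]),
        (1320.0, [(0.0, 0.0), (0.05, 0.12), (0.6, 0.0)]),
    ])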

  18. Table-Lookup Oscillators • If the signal is periodic, store one period • Control parameters: frequency (pitch), amplitude, waveform • Each sample: the phase accumulates the frequency, indexes the table, and the looked-up value is multiplied by the amplitude (phase and frequency are fixed-point or floating-point numbers) • Efficient, but the spectrum is static
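
A sketch of that structure (details assumed, not from the slide): a floating-point phase accumulator advances by table_len * freq / sample_rate each sample and wraps around; the stored waveform here is an arbitrary sum of harmonics.

    import numpy as np

    SR = 44100
    TABLE_LEN = 2048
    # One period of the stored waveform (an arbitrary sum of 7 harmonics).
    table = sum(np.sin(2 * np.pi * k * np.arange(TABLE_LEN) / TABLE_LEN) / k
                for k in range(1, 8))

    def table_osc(freq, amp, dur):
        """Table-lookup oscillator with a floating-point phase accumulator."""
        n = int(dur * SR)
        out = np.empty(n)
        phase = 0.0
        increment = TABLE_LEN * freq / SR   # table samples advanced per output sample
        for i in range(n):
            out[i] = amp * table[int(phase)]        # truncating lookup, no interpolation
            phase = (phase + increment) % TABLE_LEN  # wrap at end of table
        return out

    tone = table_osc(freq=440.0, amp=0.5, dur=0.5)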

  19. FM Synthesis • Usually use sinusoids • “carrier” and “modulator” are both at audio frequencies • If the frequencies are in a simple ratio (R), the output spectrum is periodic • Output varies from a sinusoid to a complex signal as MOD increases • out = AMPL·sin(2π·FREQ·t + MOD·sin(2π·R·FREQ·t))
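
The slide's formula in NumPy form, as a sketch; all parameter values below are made up.

    import numpy as np

    SR = 44100

    def fm(freq, ratio, mod_index, ampl, dur):
        """Two-oscillator FM: carrier at freq, modulator at ratio * freq."""
        t = np.arange(int(dur * SR)) / SR
        return ampl * np.sin(2 * np.pi * freq * t
                             + mod_index * np.sin(2 * np.pi * ratio * freq * t))

    # A simple ratio (here 2:1) gives a periodic, harmonic spectrum; raising
    # mod_index moves the output from a pure sinusoid to a complex signal.
    tone = fm(freq=220.0, ratio=2.0, mod_index=4.0, ampl=0.5, dur=1.0)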

  20. FM Synthesis (cont’d) • Interesting sounds, • Time-varying spectra, and ... • Low computation requirements • Often uses more than 2 oscillators … but … • Hard to recreate a specific waveform • No successful analysis procedure

  21. Sample-based Synthesis • Samplers store waveforms for playback • Sounds are “looped” to extend duration • Spectrum is static (as in table-lookup), so: • different samples are used for different pitches • simple effects are added: filter, vibrato, amplitude envelope • the attack portion, where the spectrum changes fastest, is added to the front; playback is attack, then loop, then loop again, ...
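
Not from the slide: a sketch of the looping idea, playing the stored attack once and then repeating the loop segment until the requested length is reached; loop points are illustrative.

    import numpy as np

    def play_looped(sample, loop_start, loop_end, n_out):
        """Play `sample` from the start; after the attack, keep repeating
        sample[loop_start:loop_end] until n_out samples are produced."""
        out = np.empty(n_out)
        loop = sample[loop_start:loop_end]
        head = sample[:loop_end]                 # attack + first pass of the loop
        n_head = min(len(head), n_out)
        out[:n_head] = head[:n_head]
        i = n_head
        while i < n_out:
            n = min(len(loop), n_out - i)
            out[i:i + n] = loop[:n]
            i += n
        return out

    # e.g. out = play_looped(recorded_note, loop_start=2000, loop_end=6000, n_out=44100)
    # (a real sampler also crossfades the loop and adds the envelope, filter,
    #  and vibrato mentioned on the slide)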

  22. Physical Models • Additive, FM, and sampling: more-or-less perception-based • Physical modeling is source-based: compute the wave equation, simulate attached reeds, bows, etc. • Example (a wind-instrument model): reed → bore → bell

  23. Physical Models (cont’d) • Difficult to control, and ... • Can be very computationally intensive … but ... • Produce “characteristic” acoustic sounds.
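
Not from the slides: the reed/bore/bell model above is beyond a short example, so here is a sketch of the Karplus-Strong plucked string, one of the simplest physical models, to illustrate the source-based approach.

    import numpy as np

    def karplus_strong(freq, dur, sr=44100, decay=0.996):
        """Karplus-Strong plucked string: a delay line initialized with noise,
        fed back through a two-point averaging (lowpass) filter."""
        n = int(dur * sr)
        period = int(sr / freq)                   # delay-line length sets the pitch
        buf = np.random.uniform(-1, 1, period)    # the "pluck" is a burst of noise
        out = np.empty(n)
        for i in range(n):
            out[i] = buf[i % period]
            # lowpass + decay models energy loss on the string
            buf[i % period] = decay * 0.5 * (buf[i % period] + buf[(i + 1) % period])
        return out

    tone = karplus_strong(freq=220.0, dur=1.0)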

  24. Overview • Music Representation • MIDI and Synthesizers • Synthesis Techniques • Music Understanding

  25. Music Understanding • Introduction • Score Following, Computer Accompaniment • Interactive Performance • Style Recognition • Conclusions

  26. What Does Music Mean? • Emotion • Formal structures • Abstract • Physical

  27. What is Music Understanding? • Translation? • Recognition? (Of what?) • Parsing? • Pattern forming? • Recognition of themes? • Music Understanding is the recognition of pattern and structure in music.

  28. Computer Accompaniment (block diagram, see Dannenberg ’84): Performance → Input Processing → Matching (against the Score for the Performer) → Accompaniment Performance (from the Score for the Accompaniment) → Music Synthesis → Accompaniment
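
This is not Dannenberg's published matcher: as a toy illustration of the Matching box only, here is a dynamic-programming alignment (longest common subsequence) of performed pitches against score pitches, which tolerates wrong and missed notes.

    def match_score(performed, score):
        """Length of the best note-for-note alignment of two pitch sequences
        (longest common subsequence), as a crude stand-in for a score matcher."""
        m, n = len(performed), len(score)
        best = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m):
            for j in range(n):
                if performed[i] == score[j]:
                    best[i + 1][j + 1] = best[i][j] + 1
                else:
                    best[i + 1][j + 1] = max(best[i][j + 1], best[i + 1][j])
        return best[m][n]

    # A wrong note (62) and a missed note still align 4 of the score's notes:
    print(match_score([60, 62, 64, 67, 69], [60, 61, 64, 65, 67, 69]))  # 4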

  29. Interactive Performance • Traditional Western Composition is carefully composed, but the result is static. Composer is central figure. • Jazz and other improvisations are not carefully composed (typically small structures), but the result is dynamic and spontaneous. Performer is central figure. • Can we integrate these two?

  30. A New Approach to Music Making • Computers let us put compositional theories into programs. • Music understanding helps us tie programs to live performance. • Result can be carefully composed and structured: Composer-oriented. • At the same time, result can be spontaneous: Performer-oriented.

  31. Style Recognition • Everyone recognizes musical style: “I don’t know anything about music, but I know what I like” • What makes something Lyrical? Syncopated?

  32. Experimental Setup • A performance is classified into one of several styles: pointillistic, lyrical, frantic, syncopated

  33. Music Understanding Conclusions • Music Understanding: the recognition of pattern or structure in music. • Music Understanding is necessary for high-level interfaces between musicians and computers.

  34. Music Summary • Rich in representations • Different representations support different tasks • Active research in: • Synthesis • Understanding • Hardware to Software
