
Motion Simulation in the Environment for Auditory Research


Presentation Transcript


  1. Motion Simulation in the Environment for Auditory Research Braxton B. Boren, Mark Ericson Nov. 1, 2011

  2. Introduction • ARL’s Environment for Auditory Research (EAR) contains state-of-the-art facilities for auditory simulations • Realistic auditory environments should contain both static and moving sources • Moving sources are much more difficult to simulate in a 57-channel audio system • Candidate tools: multichannel audio editors, Max/MSP, Matlab • Using streaming audio buffers, the EAR’s Sphere Room has been equipped to simulate moving sources by automatically generating source paths and processing each source’s motion in real time.

  3. ENVIRONMENT FOR AUDITORY RESEARCH Sphere Room • The Sphere Room is a 140 m³ (5.3 m × 5.4 m × 4.9 m) auditory virtual reality space designed to facilitate investigations of: • Integrity of auditory virtual spaces • Realism of complex auditory simulations • Effects of changes in Head-Related Transfer Functions on auditory perception • Effect of helmets and other headgear on spatial orientation • The room contains 57 loudspeakers separated vertically by 30°, constituting a sphere surrounding the listener. This configuration of loudspeakers enables virtual sound source movement and sound projection in an almost 360° sphere. Unlimited stationary or moving sound sources may be presented to any combination of the 57 loudspeakers, permitting generation of realistic and dynamically changing acoustic environments.

  4. Streaming Audio in Matlab • PortAudio API: allows low-level control of multichannel audio devices through the Matlab programming environment • - Low system latency • - High audio fidelity • Streaming Audio: short buffers update every 11.5 milliseconds with new audio data • - Loudspeaker gains are pre-calculated • - Signal processing can be enacted in real time • Static sources can be placed in background at specific positions • Additional moving sources can be added in a single virtual environment
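The slides do not show the actual EAR code, so the following is only a minimal sketch of a streaming-refill playback loop of this kind, using the PsychPortAudio driver from Psychtoolbox-3 (Kleiner et al., 2007) in Matlab. The device index, source signal, gain vector, and exact buffer length are illustrative placeholders, not the EAR’s configuration.

    % Minimal streaming-playback sketch (placeholder configuration, not the EAR's).
    InitializePsychSound(1);                   % request low-latency audio
    fs       = 44100;                          % sample rate (assumed)
    nChan    = 57;                             % loudspeaker count from slide 3
    bufLen   = round(0.0115 * fs);             % ~11.5 ms buffer (~507 samples)
    pahandle = PsychPortAudio('Open', [], 1, 2, fs, nChan);   % [] = default device

    src  = randn(1, fs * 10);                  % placeholder mono source signal
    gain = zeros(nChan, 1); gain(1) = 1;       % pre-calculated loudspeaker gains

    % Prime the hardware buffer, start playback, then keep appending short blocks.
    PsychPortAudio('FillBuffer', pahandle, gain * src(1:bufLen));
    PsychPortAudio('Start', pahandle, 0, 0, 1);
    for k = 2:floor(numel(src) / bufLen)
        idx   = (k-1)*bufLen + (1:bufLen);
        block = gain * src(idx);               % 57 x bufLen block with current gains
        % Gains and filter coefficients would be updated here before each refill.
        PsychPortAudio('FillBuffer', pahandle, block, 1);   % streaming refill
    end
    PsychPortAudio('Stop', pahandle);
    PsychPortAudio('Close', pahandle);

Because each block is assembled just before it is appended, gains and filter coefficients can change between refills without interrupting playback.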

  5. Source Motion Paths • Virtual source motion paths are defined parametrically, over time • - Circular • - Elliptical • - ‘Dogbone’
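The slides define the paths only by name, so the sketch below shows one plausible parameterization of the circular and elliptical cases; the ‘dogbone’ path is omitted because its formula is not given. The period, radii, and control rate are assumed values.

    % Parametric path sketch: one x-y position per ~11.5 ms audio buffer.
    ctrlRate = 1 / 0.0115;                % path updates per second (assumed)
    T        = 10;                        % time to traverse the path once (s, assumed)
    t        = (0:1/ctrlRate:T).';        % time vector

    r     = 2.0;                          % circular path of radius r (m)
    xCirc = r * cos(2*pi*t/T);
    yCirc = r * sin(2*pi*t/T);

    a = 3.0; b = 1.5;                     % elliptical path with semi-axes a, b (m)
    xEll = a * cos(2*pi*t/T);
    yEll = b * sin(2*pi*t/T);

    plot(xCirc, yCirc, xEll, yEll); axis equal;   % quick visual check of both paths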

  6. Panning Algorithms • Distance-Based Amplitude Panning (DBAP), Lossius et al., 2009 • Loudspeaker gains are determined by each speaker’s distance from the virtual audio source • Independent of listener position • Provides smooth motion panning for virtual sources located on the loudspeaker array • Cannot simulate sources outside the array
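A compact sketch of the DBAP gain computation described by Lossius et al. (2009); the function name, default rolloff, and spatial-blur term rs are illustrative assumptions rather than the EAR implementation.

    function g = dbapGains(srcPos, spkPos, rolloffdB, rs)
    % srcPos: 1x3 virtual source position (m); spkPos: Nx3 loudspeaker positions (m)
    if nargin < 3, rolloffdB = 6; end          % 6 dB attenuation per doubling of distance
    if nargin < 4, rs = 0.1; end               % spatial blur (m), keeps distances > 0
    a = rolloffdB / (20 * log10(2));           % rolloff exponent (a = 1 for 6 dB)
    d = sqrt(sum(bsxfun(@minus, spkPos, srcPos).^2, 2) + rs^2);  % source-to-speaker distances
    k = 1 / sqrt(sum(1 ./ d.^(2*a)));          % normalization so that sum(g.^2) == 1
    g = k ./ d.^a;                             % per-loudspeaker amplitude gains
    end

Because the gains depend only on source-to-speaker distances, the result is the same for every listener position.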

  7. Panning Algorithms • Vector Base Amplitude Panning (VBAP), Pulkki, 1997 • Defines each loudspeaker as a position vector • Given a set of three linearly independent speaker vectors, VBAP can simulate a source within the speaker triangle as a linear combination of the three vectors • The coefficient of each vector is the gain of the corresponding speaker
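A minimal sketch of the 3-D VBAP gain solve described above, following Pulkki (1997); the function name and the row-vector convention for L are assumptions, not the authors’ code.

    function g = vbapGains(p, L)
    % p: 1x3 unit vector toward the virtual source.
    % L: 3x3 matrix whose rows are the unit vectors of the three loudspeakers
    %    forming one triangle. A negative component of g means the source lies
    %    outside this triangle, so a different triangle should be selected.
    g = p / L;               % solves g * L = p, i.e. g = p * inv(L)
    g = g / norm(g);         % constant-power normalization: sum(g.^2) == 1
    end

For example, with L = eye(3) (loudspeakers on the x, y, and z axes) and p = [1 1 1]/sqrt(3), the solve returns equal gains of about 0.577 for all three loudspeakers.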

  8. Panning Algorithms • Vector Base Amplitude Panning (VBAP), Pulkki, 1997 • VBAP is more robust than DBAP given a fixed listener position • Allows efficient simulation of virtual sources outside the loudspeaker array • Requires an algorithm for detecting vector/triangle intersections

  9. Assigning triangles to sources • Parametrically define the triangle’s plane • If a given ray intersects the plane, find the parametric coordinates s and t of the intersection point • If s ≥ 0, t ≥ 0, and (s + t) ≤ 1, the intersection point lies inside the triangle, so the ray intersects the triangle (Ray-Triangle Intersection, Sunday, 2003)

  10. Assigning triangles to sources Ray-Triangle Intersection, Sunday, 2003 • Requires 5 distinct dot product operations • Not as efficient as other algorithms for dynamic environments • But it’s more efficient for static sets of triangles because the planes’ normal vectors can be pre-computed
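A Matlab sketch of this test, following Sunday (2003); the function and variable names are illustrative. For a fixed loudspeaker array, the edge vectors u and v, the normal n, and the triangle-only dot products uu, uv, and vv can all be pre-computed per triangle, leaving only the w-dependent terms to evaluate for each source update.

    function [hit, s, t] = rayTriangle(P0, dir, V0, V1, V2)
    % Ray from P0 (e.g., the listener position) along dir, tested against the
    % loudspeaker triangle with vertices V0, V1, V2 (all 1x3 vectors).
    u = V1 - V0;  v = V2 - V0;           % triangle edge vectors
    n = cross(u, v);                     % plane normal (pre-computable per triangle)
    hit = false; s = NaN; t = NaN;
    denom = dot(n, dir);
    if abs(denom) < eps, return; end     % ray is parallel to the plane
    r = dot(n, V0 - P0) / denom;         % ray parameter at the plane intersection
    if r < 0, return; end                % plane is behind the ray origin
    w = (P0 + r*dir) - V0;               % intersection point relative to V0
    % The five distinct dot products of the parametric solution:
    uu = dot(u,u); uv = dot(u,v); vv = dot(v,v); wu = dot(w,u); wv = dot(w,v);
    D = uv^2 - uu*vv;
    s = (uv*wv - vv*wu) / D;             % parametric coordinates of the
    t = (uv*wu - uu*wv) / D;             % intersection point in the triangle
    hit = (s >= 0) && (t >= 0) && (s + t <= 1);
    end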

  11. Signal Processing • High-quality vehicular recordings are available with included x-y-z coordinates • These already contain attenuation, air absorption, and Doppler shift • Position data can be read and interpolated to determine pan positions • To allow arbitrary movement of any signal, signal processing is added • Attenuation and air absorption coefficients are pre-computed • Signal gain and a one-pole filter are updated in real time before loading the streaming audio buffer • Doppler shift is more computationally expensive • Matlab is not fast enough to compute it in real time • It may later be implemented in C++ • Better constant-speed recordings of vehicular sources are needed • Current recordings of idling vehicles are unconvincing • Doppler shift is less important for slower sources whose sound does not change with velocity
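The exact filter design is not given in the slides; the sketch below shows one buffer’s worth of the real-time gain and one-pole low-pass update, with an illustrative distance-to-cutoff mapping. In the actual system the attenuation and air-absorption coefficients are pre-computed along the source path.

    % Per-buffer distance gain and one-pole air-absorption filter (illustrative).
    fs   = 44100;
    x    = randn(1, 512);                % one streaming buffer of source audio
    zi   = 0;                            % filter state carried between buffers
    d    = 25;                           % current source distance (m)
    g    = 1 / max(d, 1);                % 1/r spherical-spreading attenuation
    fc   = 20000 / (1 + 0.05*d);         % toy mapping: farther source => duller (assumed)
    b1   = exp(-2*pi*fc/fs);             % one-pole low-pass coefficient
    [y, zi] = filter(1 - b1, [1 -b1], g * x, zi);   % filtered block y feeds the panner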

  12. Discussion • With a full signal processing load, this system can process up to four independent moving sources at once • Pre-calculations can take longer if different sources’ velocities and path lengths have very high least-common multiples • Static sources can be added in specific channels to add background ambience

  13. Conclusions • Real-time streaming audio allows simulated motion along any parametric path • Two different panning algorithms have been implemented • DBAP is simpler and better for listener-independent reproduction • VBAP is more robust and better for a fixed listener position • Attenuation and air-absorption filtering can be applied in real time to give more realistic distance cues • This system will be used in a series of ongoing auditory simulations and experiments at the EAR

  14. References • Henry, P., Amrein, B., & Ericson, M., “The Environment for Auditory Research”, Acoustics Today, 5(3), 2009. • Kleiner, M., Brainard, D., & Pelli, D., “What's new in Psychtoolbox-3?”, Perception, 36 (ECVP Abstract Supplement), 2007. • Lossius, T., Baltazar, P., & de la Hogue, T., “DBAP - Distance-Based Amplitude Panning”, Proceedings of the 2009 International Computer Music Conference, 2009. • Pulkki, V., “Virtual Sound Source Positioning Using Vector Base Amplitude Panning”, J. Audio Eng. Soc., 45, 1997. • Sunday, D., “Intersections of Rays, Segments, Planes and Triangles in 3D”, 2003.
