
Expressive Tangible Acoustic Interfaces

Antonio Camurri, Corrado Canepa, and Gualtiero Volpe
InfoMus Lab, DIST-University of Genova, Viale Causa 13, 16145 Genova, Italy
Antonio.Camurri@unige.it - corrado@infomus.dist.unige.it - Gualtiero.Volpe@unige.it
www.infomus.dist.unige.it - www.eyesweb.org


Presentation Transcript


Tangible Acoustic Interfaces (TAIs)
• Objective: transform physical objects (including everyday objects, e.g. tables or chairs), augmented surfaces, and spaces into tangible-acoustic embodiments of natural, seamless, unrestricted interfaces.
• Physical objects and space act as media to bridge the gap between the virtual and physical worlds and to make information accessible through large touchable objects and ambient media.
• TAIs exploit the propagation of sound in physical objects to obtain information about where, when, and how an object is touched (a localization sketch follows at the end of this transcript).
• TAIs are integrated, in a multimodal perspective, with other sensors (e.g., video cameras).

Figure: Two examples of TAIs: a table running the Google Earth application and a sensorized chair.

Expressive Gesture Processing
• Analysis and processing of the user's movement and gestures to obtain information related to the user's affective/emotional state.
• Such information may include the way in which the user approaches a TAI: e.g., he/she can touch it in a soft and light way or in a hard and heavy way.
• Analysis is carried out with a multi-layered architecture (a sketch follows at the end of this transcript):
  • A first layer extracts information from video and audio signals and from sensor data.
  • Algorithms are then used to extract expressive features characterizing the gesture (e.g., energy, fluency, impulsiveness).
  • Finally, the extracted values are fed to machine learning modules that associate an expressive characterization with the gesture.

Figure: The EyesWeb framework: an application for analysis of expressive gesture.
Figure: The music theatre piece “Un Avatar del Diavolo” (composer R. Doati), La Biennale, Venezia, Sept. 2005. The piece exploits TAI technologies (a sensorized chair).

The EyesWeb Framework
• For multimodal analysis and processing.
• Visual environment for the design of applications.
• Input support for frame grabbers, wireless on-body sensors, audio and MIDI input, serial, network, and standard input devices.
• Math and filters (e.g., operations with scalars and matrices).
• Imaging libraries (analysis, processing, and conversion of images and video).
• Sound and MIDI libraries.
• Support for industrial standards: DirectX, ASIO, VST, FEAPI, OSC, FreeFrame.
• Connection with Pure Data, Max/MSP, Kyma (an OSC sketch follows at the end of this transcript).
• SDK for custom extensions.
• Available for free at www.eyesweb.org.

Expressive TAIs in EyesWeb
• Libraries for gathering and processing data from TAI sensors.
• Libraries for localization of touching gestures.
• Expressive Gesture Processing Libraries for analysis of expressive qualities in gestures.

Applications
• InfoMus Lab is exploring possible exploitation of expressive TAIs in: interactive music, theatre, arts, therapy and rehabilitation, interactive edutainment, applications for museums, exhibits, and science centers (authoring), interactive entertainment, ambient intelligence, tools for teaching by playing and experiencing in simulated environments, and tools for enhancing communication about new products or ideas at conventions and in “information ateliers”.
• Example: the music theatre opera “Un avatar del diavolo” (composer Roberto Doati), performed at La Biennale, Venezia, Sept. 2005. The opera includes a sensorized chair. Expressive gestures (e.g., caresses or tapping-like hand movements) of an actor on the chair control sound generation and processing in real time.

With the partial support of the EU-IST Project TAI-CHI (Tangible Acoustic Interfaces for Computer-Human Interaction).
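
The poster states that TAIs exploit sound propagation in physical objects to work out where, when, and how an object is touched. The following is a minimal sketch of one generic way to do this, locating a tap along the line between two contact microphones from the time difference of arrival; the sampling rate, propagation speed, and sensor spacing are assumed values, and this is not the TAI-CHI project's actual localization algorithm.

```python
# Hypothetical sketch: locating a tap on a surface from two contact
# microphones using time difference of arrival (TDOA). Constants are
# assumptions, not measured values from the TAI-CHI prototypes.
import numpy as np

SAMPLE_RATE = 96_000     # Hz, assumed audio sampling rate
SOUND_SPEED = 2000.0     # m/s, assumed in-material propagation speed
MIC_DISTANCE = 1.0       # m, assumed spacing between the two sensors

def estimate_delay(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    """Return the delay (seconds) of sig_a relative to sig_b via cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)   # lag in samples
    return lag / SAMPLE_RATE

def localize_tap(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    """Map the inter-sensor delay to a tap position on the line joining
    the sensors (0.0 = at sensor A, 1.0 = at sensor B)."""
    delay = estimate_delay(sig_a, sig_b)
    # Positive delay means the wave reached sensor B first, i.e. the tap
    # happened closer to B; clamp to the physically meaningful range.
    return float(np.clip(0.5 + delay * SOUND_SPEED / (2 * MIC_DISTANCE), 0.0, 1.0))

# Example: a synthetic impulse that reaches sensor B 40 samples before sensor A.
impulse = np.zeros(1024)
impulse[100] = 1.0
sig_b = impulse
sig_a = np.roll(impulse, 40)
print(localize_tap(sig_a, sig_b))   # > 0.5, i.e. closer to sensor B
```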
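The multi-layered analysis described under Expressive Gesture Processing (low-level signals, then expressive features such as energy, fluency, and impulsiveness, then machine learning) might be sketched as follows. The feature formulas, the synthetic training data, and the choice of a k-nearest-neighbours classifier are illustrative assumptions, not the InfoMus Lab pipeline.

```python
# Illustrative three-layer sketch: (1) low-level signal, (2) expressive
# features, (3) machine-learning mapping to an expressive label.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def expressive_features(velocity: np.ndarray) -> np.ndarray:
    """Layer 2: crude stand-ins for energy, fluency, and impulsiveness,
    computed from a 1-D velocity profile of a touching gesture."""
    energy = float(np.mean(velocity ** 2))                     # overall motion energy
    fluency = float(1.0 / (1.0 + np.std(np.diff(velocity))))   # smoother motion -> higher fluency
    impulsiveness = float(np.max(np.abs(np.diff(velocity))))   # largest sudden change
    return np.array([energy, fluency, impulsiveness])

# Layer 3: map feature vectors to expressive labels such as a soft/light
# versus hard/heavy touch. Training data here is purely synthetic.
rng = np.random.default_rng(0)
light = [np.sin(np.linspace(0, np.pi, 100)) * rng.uniform(0.1, 0.3) for _ in range(10)]
heavy = [rng.normal(0.0, 1.0, 100) for _ in range(10)]
X = np.array([expressive_features(v) for v in light + heavy])
y = ["light"] * 10 + ["heavy"] * 10
model = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# Classify a new (smooth, low-amplitude) gesture.
new_gesture = np.sin(np.linspace(0, np.pi, 100)) * 0.25
print(model.predict([expressive_features(new_gesture)]))   # expected: ['light']
```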
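EyesWeb's listed support for OSC and its connection with Pure Data, Max/MSP, and Kyma suggest a simple client-side pattern for streaming extracted features to a sound-synthesis patch. The sketch below uses the python-osc package; the address pattern and port are hypothetical, not EyesWeb conventions.

```python
# Minimal sketch of streaming an expressive feature to a sound-synthesis
# environment (e.g. a Pure Data or Max/MSP patch) over OSC. The address
# "/tai/energy" and port 9000 are hypothetical choices.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)   # host/port of the synthesis patch

def send_energy(energy: float) -> None:
    """Forward the current gesture-energy estimate to the audio engine."""
    client.send_message("/tai/energy", energy)

send_energy(0.42)   # example value
```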
