
Expressive gesture in interaction: the role of movement and gesture in emotion






Presentation Transcript


  1. Expressive gesture in interaction: the role of movement and gesture in emotion. Ginevra Castellano, Antonio Camurri, Gualtiero Volpe. Department of Computer Science, Systems and Telematics (DIST), University of Genoa, InfoMus Lab, http://infomus.dist.unige.it. WP6 HUMAINE Workshop, Paris, March 10-11, 2005

  2. What is expressive gesture • Gesture: a movement of the body that carries information and supports verbal communication • Expressive gesture: high-level non-verbal expressive and emotional communication (Camurri et al., 2004) • Artistic context: gesture as a conveyor of emotion-related information (dance and music performances)

  3. Expressive gesture in HCI • Aims: to communicate emotions to users and to recognize users' emotional engagement • Expressive gesture: movement as a conveyor of emotional information and as a component of an emotional process

  4. Expressive gesture analysis: a layered approach (1) (Camurri et al., 2004, 2005) • Layer 1: physical signals and video/audio pre-processing techniques (e.g. motion detection, audio filtering) • Layer 2: video and audio processing techniques (computer vision on the incoming images, signal processing on audio signals) • Layer 3: techniques for gesture segmentation and representation of gestures as trajectories in virtual, expressive spaces • Layer 4: modelling techniques for prediction of emotions (e.g., multiple regression, neural networks, decision trees)
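A rough sketch of how such a four-layer chain could look in code (Python; purely illustrative: the function names, thresholds and the simple frame-differencing step are assumptions, not the actual EyesWeb implementation):

```python
import numpy as np

def preprocess(frame: np.ndarray, prev: np.ndarray) -> np.ndarray:
    """Layer 1: physical signals -> pre-processing (here, frame differencing as crude motion detection)."""
    return np.abs(frame.astype(int) - prev.astype(int)).astype(np.uint8)

def extract_motion_features(motion: np.ndarray) -> dict:
    """Layer 2: low-level computer-vision descriptors of the motion image (amount of motion, centroid)."""
    ys, xs = np.nonzero(motion > 30)
    qom = len(xs) / motion.size                              # fraction of "moving" pixels
    centroid = (xs.mean(), ys.mean()) if len(xs) else (np.nan, np.nan)
    return {"qom": qom, "centroid": centroid}

def segment_gestures(feature_stream: list, threshold: float = 0.01) -> list:
    """Layer 3: segment the feature stream into gestures (contiguous runs of sufficient motion)."""
    gestures, current = [], []
    for f in feature_stream:
        if f["qom"] > threshold:
            current.append(f)
        elif current:
            gestures.append(current)
            current = []
    if current:
        gestures.append(current)
    return gestures

def classify_activation(gesture: list) -> str:
    """Layer 4: map gesture-level statistics to a label (placeholder rule standing in for
    regression / neural networks / decision trees)."""
    mean_qom = np.mean([f["qom"] for f in gesture])
    return "high activation" if mean_qom > 0.05 else "low activation"
```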

  5. Expressive gesture analysis: a layered approach (2) • Not only analysis but also synthesis of expressive gesture • Experiments on expressive gesture are carried out with the EyesWeb open software platform (Camurri et al., 2000) • Perspective: mapping information about users' behaviour onto real-time generation of expressive behaviour of virtual agents such as ECAs

  6. Our activity in HUMAINE • Expressive gesture as a component of an emotional process • The component-process model of emotion proposed by Klaus Scherer (GERG) has been investigated (Scherer, 1984, 2000; Scherer and Zentner, 2001) • We used the motor activation component to evaluate the emotional engagement of users exposed to emotional stimuli

  7. Music, emotion and movement • Research in collaboration with Professor Klaus Scherer's group (GERG, Geneva Emotion Research Group) • Aim: to investigate the relationship between emotions induced by musical stimuli and movement • Pilot experiment: are there correlations between the emotional characterizations of music excerpts and human movement?

  8. Continuous measures of emotions • Music as an induction technique • Music and emotion: a time-varying relationship • Several indicators (problem: conscious vs unconscious measurements): verbal report, physiological measures, coding of non-verbal behavior, subject interfaces ranging from sliders to multimodal interfaces • Idea: a laser pointer as a semi-conscious interface through which movement communicates the emotional experience generated by the music

  9. A pilot experiment (1) • 20 subjects were equipped with a laser pointer and asked to move it on a white wall in front of them while listening to music excerpts • Stimuli: a set of classical music excerpts provided by GERG, grouped into four characterizations defined on the basis of valence and energy: slow positive, slow negative, fast positive and fast negative

  10. A pilot experiment (2) • Method: each subject listened to four music excerpts, one for each emotional characterization; the trajectories performed by the subjects moving the laser pointer on the wall were recorded; a questionnaire was used to collect the emotions felt by the subjects while listening to the music

  11. Preliminary analysis • Aim: to look for correlations between features of the trajectories performed by the subjects with the laser pointer and the emotional characterization of the music excerpt a subject was listening to • Global and static analysis: integration of the laser trajectories over time, to obtain, for each video file of the laser movement, a bitmap summarizing the trajectory followed during the whole listening (sketched below) • Each bitmap represents a graphical subject response (GSR) to a single music excerpt • Is it possible to separate the GSRs into classes and to verify whether these classes correlate with the characterizations of the music excerpts? → clustering analysis
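A minimal sketch of this temporal integration into a GSR bitmap (Python/NumPy; the function name, image size and input format are assumptions, since the actual processing was done with an EyesWeb patch):

```python
import numpy as np

def trajectory_to_gsr(points, width: int = 320, height: int = 240) -> np.ndarray:
    """Integrate a laser-pointer trajectory over time into a single bitmap (a graphical
    subject response, GSR): every pixel the pointer visited is drawn in black on a white background.

    `points` is the sequence of (x, y) pointer positions detected in the video frames.
    """
    gsr = np.full((height, width), 255, dtype=np.uint8)   # white background
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            gsr[y, x] = 0                                  # mark visited pixels in black
    return gsr
```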

  12. Extraction of global trajectories • EyesWeb patch summarizing the path followed by the laser pointer: ..\Presentazione\Presentazione.eyw

  13. Identification and measurement of trajectory features • Identify a collection of descriptors related to specific features of the trajectory patterns: angularity, rarefaction, spatial occupation, vertical symmetry, horizontal symmetry, central symmetry, compactness, lateral location, vertical location, angular tendency, spatial extension • Provide measures for relevant trajectory features: manual annotation with unambiguous criteria; each pattern is rated from 0 to 4 with respect to each specific feature by five evaluators

  14. An example: angularity • The trajectories drawn by the laser can be smooth (0) or angular (4) • Smooth trajectory: wavy, soft lines • Angular trajectory: direct, sharp, nervous lines (figure: examples of a smooth and an angular trajectory)
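In the experiment angularity was rated manually on the 0-4 scale; as an illustration only, a rough automatic proxy could measure how sharply the trajectory changes direction (Python/NumPy sketch; the function name and the turning-angle proxy are assumptions, not the annotation procedure described on the slide):

```python
import numpy as np

def angularity_proxy(points: np.ndarray) -> float:
    """Rough proxy for angularity: mean absolute turning angle (radians) along the trajectory.

    `points` is an (N, 2) array of pointer positions; larger values suggest sharper,
    more nervous lines, smaller values suggest smooth, wavy lines.
    """
    v = np.diff(points, axis=0).astype(float)                 # displacement between consecutive points
    v = v[np.linalg.norm(v, axis=1) > 0]                      # drop zero-length steps
    angles = np.arctan2(v[:, 1], v[:, 0])                     # direction of each step
    turns = np.abs(np.angle(np.exp(1j * np.diff(angles))))    # turning angles, wrapped to (-pi, pi]
    return float(turns.mean()) if len(turns) else 0.0
```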

  15. An example: rarefaction • The pattern can be thick and intense (0) or rarefied (4) • Measure: white pixels / total pixels in the bounding rectangle • Thick trajectory: high degree of filling of the occupied space • Rarefied trajectory: low degree of filling of the occupied space (figure: examples of a thick and a rarefied trajectory)
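A small sketch of this measure (Python/NumPy; it assumes the GSR bitmap draws the trajectory in black on a white background, and it returns a 0-1 ratio rather than the 0-4 rating used by the evaluators):

```python
import numpy as np

def rarefaction(gsr: np.ndarray) -> float:
    """Rarefaction as defined on the slide: white (empty) pixels over total pixels
    inside the bounding rectangle of the drawn pattern."""
    ys, xs = np.nonzero(gsr == 0)                              # trajectory (black) pixels
    if len(xs) == 0:
        return 1.0                                             # empty pattern: fully rarefied
    box = gsr[ys.min():ys.max() + 1, xs.min():xs.max() + 1]    # bounding rectangle of the pattern
    return float(np.count_nonzero(box == 255) / box.size)
```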

  16. Statistical analysis: mean of all the ratings of all the features for each of the four emotional characterizations • Useful for deciding how to set up the clustering analysis
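As an illustration of that aggregation, a hypothetical layout of the ratings and the per-characterization means (Python/pandas; column names and all values are invented, not the experiment's data):

```python
import pandas as pd

# One row per rated pattern; 'emotion' is the characterization of the excerpt that produced it.
ratings = pd.DataFrame({
    "emotion":     ["slow positive", "fast negative", "fast positive", "slow negative"],
    "angularity":  [1, 4, 3, 1],
    "rarefaction": [3, 1, 1, 4],
    "compactness": [1, 3, 4, 0],
})

# Mean rating of each feature for each of the four emotional characterizations
feature_means = ratings.groupby("emotion").mean()
print(feature_means)
```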

  17. Hypotheses to be verified in the clustering analysis • Angularity, rarefaction and compactness seem to explain the motor activation captured by this static and global analysis: they are the critical features • Slow patterns: low angularity, high rarefaction, low compactness • Fast patterns: high angularity, low rarefaction, high compactness

  18. Clustering global trajectories • EyesWeb patch with a block implementing the K-means algorithm • Aim: to verify whether the grouping creates clusters that are consistent with the emotional characterizations of the music excerpts used to induce the emotions in the subjects • Best clustering configuration: two clusters, three features • Two different classifications: fast/slow and positive/negative
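Outside EyesWeb, the same grouping could be sketched, for instance, with scikit-learn (the feature values below are invented; the study itself used an EyesWeb block for K-means):

```python
import numpy as np
from sklearn.cluster import KMeans

# One row per graphical subject response: [angularity, rarefaction, compactness], rated 0-4.
X = np.array([
    [4, 1, 3],   # presumed "fast" patterns: angular, dense, compact
    [3, 0, 4],
    [1, 4, 1],   # presumed "slow" patterns: smooth, rarefied, spread out
    [0, 3, 0],
])

# Two clusters over the three critical features; the cluster labels are then compared
# with the fast/slow (or positive/negative) characterization of the inducing excerpts.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)            # cluster assignment of each pattern
print(kmeans.cluster_centers_)   # mean feature profile of each cluster
```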

  19. Results • Fast/slow patterns are explained by angularity only • Positive and negative patterns are not distinguishable from each other • Subjects, moving the laser pointer, synchronize with the rhythm of the excerpts: if the velocity of the music increases, the velocity of the arm movement increases, as does the frequency of direction changes • There appears to be a correlation between characteristics of the music listened to and the movement performed • Resonance between music and motor activation

  20. Future developments • Dynamic analysis of laser pointer trajectories: how they can be correlated with the musical structure at different time scales • Aim: to discover how rules can be established to recognize users' emotions • Possible perspective: to contribute to defining the role of attention in emotion-oriented systems such as ECAs

  21. Applications • Motor rehabilitation • Multimedia content analysis through novel affective interfaces (e.g. mobiles, embedded systems, new media) • Music industry: music information retrieval from huge databases based on emotional responses • Artistic and musical applications • Cultural applications, museums, and science centers

  22. References
  • Camurri, A., Hashimoto, S., Ricchetti, M., Trocca, R., Suzuki, K., and Volpe, G. (2000), "EyesWeb: Toward Gesture and Affect Recognition in Interactive Dance and Music Systems", Computer Music Journal, 24(1), pp. 57-69, MIT Press, Spring 2000.
  • Camurri, A., Mazzarino, B., Ricchetti, M., Timmers, R., and Volpe, G. (2004), "Multimodal Analysis of Expressive Gesture in Music and Dance Performances", in A. Camurri and G. Volpe (Eds.), Gesture-based Communication in Human-Computer Interaction, LNAI 2915, Springer Verlag, 2004.
  • Camurri, A., De Poli, G., Leman, M., and Volpe, G. (2005), "Communicating Expressiveness and Affect in Multimodal Interactive Systems", IEEE MultiMedia, January-March 2005, pp. 43-53.
  • Scherer, K.R. (1984), "On the nature and function of emotion: a component process approach", in K.R. Scherer and P. Ekman (Eds.), Approaches to Emotion (pp. 293-317). Hillsdale, NJ: Erlbaum.
  • Scherer, K.R. (2000), "Emotions as episodes of subsystem synchronization driven by nonlinear appraisal processes", in M. Lewis and I. Granic (Eds.), Emotion, Development, and Self-Organization (pp. 70-99). New York/Cambridge: Cambridge University Press.
  • Scherer, K.R., and Zentner, M.R. (2001), "Emotional effects of music: production rules", in P.N. Juslin and J.A. Sloboda (Eds.), Music and Emotion: Theory and Research (pp. 361-392). Oxford: Oxford University Press.
