
Expressive Gestures for NAO


Presentation Transcript


  1. Expressive Gestures for NAO Le Quoc Anh - Catherine Pelachaud CNRS, LTCI, Telecom-ParisTech, France NAO TechDay, 13/06/2012, Paris

  2. Objectives
  • Generate communicative gestures for the Nao robot
  • Integrate them within an existing platform for virtual agents
  • Nonverbal behaviors described symbolically
  • Synchronization of gestures and speech
  • Expressivity of gestures
  • GVLEX project (Gesture & Voice for Expressive Reading): the robot tells a story expressively
  • Partners: LIMSI (linguistic aspects), Aldebaran (robotics), Acapela (speech synthesis), Telecom ParisTech (expressive gestures)

  3. State of the art
  • Several recent initiatives:
  • Salem and Kopp (2012): ASIMO robot, the MAX virtual agent framework, gesture description with MURML
  • Aaron Holroyd and Charles Rich (2011): Melvin robot, motion scripts in BML, simple gestures, feedback to synchronize gestures and speech
  • Ng-Thow-Hing et al. (2010): ASIMO robot, gesture selection, synchronization between gestures and speech
  • Nozawa et al. (2006): HOAP-1 robot, motion scripts in MPML-HP
  • Our system: focus on expressivity and on synchronization of gestures with speech, using a common platform for Greta and for Nao

  4. Steps
  [Diagram: GRETA system pipeline — Text → Intent Planning → FML → Behavior Planning → BML → Behavior Realizer]
  • Build a library of gestures from a corpus of storytelling videos: the gesture shapes need not be identical between the human, the virtual agent and the robot, but they have to convey the same meaning
  • Use the GRETA system to generate gestures for Nao, following the SAIBA framework
  • Two representation languages: FML (Function Markup Language) and BML (Behavior Markup Language)
  • Three separate modules: plan communicative intents, select and plan gestures, and realize gestures (see the sketch below)
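  To picture this three-stage flow, here is a minimal, runnable Python sketch; all function names and data structures are hypothetical stand-ins, since the actual GRETA modules are separate components exchanging full FML and BML documents.

  # Toy sketch of the three SAIBA modules named above. All names and data
  # structures are hypothetical stand-ins for the real FML/BML exchange.

  LEXICON = {"emphasis": {"gesture": "beat_hungry", "stroke": 0.5}}

  def intent_planning(text):
      # Real system: linguistic analysis of the text, producing FML tags.
      return [{"intent": "emphasis", "time": 1.2}]

  def behavior_planning(intents):
      # Select a gesture prototype from the lexicon for each intent (BML).
      return [dict(LEXICON[i["intent"]], start=i["time"]) for i in intents]

  def behavior_realizer(bml):
      # Expand each selected gesture into timed keyframes (slides 5-8).
      return [{"gesture": g["gesture"], "stroke_at": g["start"] + g["stroke"]}
              for g in bml]

  print(behavior_realizer(behavior_planning(intent_planning("J'ai très faim !"))))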

  5. Global diagram
  [Diagram: FML/BML input → Gesture Selection (LEXICON) → Synchronization with speech → Planning of gesture duration → Modification of gesture expressivity → KEYFRAMES output]
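  The diagram's synchronization and duration-planning stages can be pictured with a small Python sketch: given the time of the emphasized word (the synch point, detailed on the next slide), the planner schedules the gesture phases around it. The phase durations below are illustrative placeholders, not GRETA's actual values.

  # Hedged sketch of duration planning: place the stroke phase on the
  # speech synch point and schedule preparation/retraction around it.

  PREP, STROKE, RETRACT = 0.4, 0.3, 0.5   # assumed phase durations (s)

  def plan_phases(synch_point):
      """Return (phase, start, end) triples; the stroke ends on the synch point."""
      stroke_start = synch_point - STROKE
      return [("preparation", stroke_start - PREP, stroke_start),
              ("stroke", stroke_start, synch_point),
              ("retraction", synch_point, synch_point + RETRACT)]

  for phase in plan_phases(synch_point=1.2):   # e.g. an emphasized word at t = 1.2 s
      print(phase)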

  6. Gesture Animation Planning
  • Synchronization with speech
  • The stroke phase coincides with or precedes the emphasized words of the speech (McNeill, 1992)
  • Gesture stroke phase timing specified by synch points
  • Expressivity of gestures: the same prototype but different animations
  • Parameters (as sketched below):
  • Spatial Extent (SPC): amplitude of movement
  • Temporal Extent (TMP): speed of movement
  • Power (PWR): acceleration of movement
  • Repetition (REP): number of times the stroke is repeated
  • Fluidity (FLD): smoothness and continuity
  • Stiffness (STF): tension/flexibility
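  As a rough illustration of how two of these parameters could modulate a single gesture prototype, here is a hedged Python sketch; the keyframe format and the linear scaling are assumptions, not GRETA's actual mapping.

  # Hedged sketch: modulating one gesture prototype with SPC and TMP.
  # Keyframe format and linear scaling are assumptions for illustration.

  def apply_expressivity(keyframes, spc=0.0, tmp=0.0):
      """keyframes: list of (time_s, {joint: offset_rad from rest pose})."""
      amp = 1.0 + spc     # SPC in [-1, 1]: negative = narrower movement
      speed = 1.0 + tmp   # TMP in [-1, 1]: negative = slower movement
      return [(t / speed, {j: a * amp for j, a in pose.items()})
              for t, pose in keyframes]

  prototype = [(0.0, {"RShoulderPitch": 0.0}),
               (0.5, {"RShoulderPitch": -0.8}),   # stroke
               (1.2, {"RShoulderPitch": 0.0})]    # retraction

  # SPC = -0.3 and TMP = -0.2 are the values used in the BML example on the next slide.
  print(apply_expressivity(prototype, spc=-0.3, tmp=-0.2))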

  7. Example
  BML input:
  <bml>
    <speech id="s1" start="0.0">
      \vce=speaker=Antoine\ \spd=180\ Et le troisième dit tristement :
      \vce=speaker=AntoineSad\ \spd=90\ \pau=200\ <tm id="tm1"/> J'ai très faim !
    </speech>
    <gesture id="beat_hungry" start="s1:tm1" end="start+1.5" stroke="0.5">
      <FLD.value>0</FLD.value>
      <OAC.value>0</OAC.value>
      <PWR.value>-1.0</PWR.value>
      <REP.value>0</REP.value>
      <SPC.value>-0.3</SPC.value>
      <TMP.value>-0.2</TMP.value>
    </gesture>
  </bml>
  Gesture lexicon entry:
  <gesture id="beat_hungry" min_time="1.0">
    <phase type="STROKE-START">
      <hand side="BOTH">
        <verticalLocation>YCC</verticalLocation>
        <horizontalLocation>XCenter</horizontalLocation>
        <distanceLocation>Zmiddle</distanceLocation>
        <handShape>OPENHAND</handShape>
        <palmOrientation>INWARD</palmOrientation>
      </hand>
    </phase>
    <phase type="STROKE-END">
      <hand side="BOTH">
        <verticalLocation>YLowerEP</verticalLocation>
        <horizontalLocation>XCenter</horizontalLocation>
        <distanceLocation>ZNear</distanceLocation>
        <handShape>OPEN</handShape>
        <palmOrientation>INWARD</palmOrientation>
      </hand>
    </phase>
  </gesture>
  Resulting keyframes:
  keyframe[1] <phase="preparation", start-time="Start", end-time="Ready", description of stroke-start's position>
  keyframe[2] <phase="stroke", start-time="Stroke-start", end-time="Stroke-end", description of stroke-end's position>
  keyframe[3] <phase="retraction", start-time="Relax", end-time="End", description of rest position>

  8. Compilation
  • Send timed key-positions to the robot using available APIs
  • Animation is obtained by interpolating between joint values with the robot's built-in proprietary procedures:
  API.AngleInterpolation(joints, values, times)
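  For reference, here is a minimal sketch of this call through the NAOqi Python SDK, where the slide's API.AngleInterpolation corresponds to ALMotion's angleInterpolation; the robot address and the joint, angle and time values are placeholders, not the values the authors used.

  # Minimal sketch: sending timed key-positions via the NAOqi Python SDK.
  # Robot address and joint/angle/time values are placeholders.
  from naoqi import ALProxy

  motion = ALProxy("ALMotion", "192.168.1.10", 9559)  # placeholder IP/port

  joints = ["RShoulderPitch", "RElbowRoll"]
  angles = [[1.0, 0.4, 1.4],        # radians, one list per joint
            [0.3, 1.2, 0.3]]
  times  = [[0.5, 1.0, 1.7],        # seconds, one timestamp per keyframe
            [0.5, 1.0, 1.7]]

  # The robot interpolates between successive key-positions internally.
  motion.angleInterpolation(joints, angles, times, True)  # True = absolute angles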

  9. Demo: « Trois petits morceaux de nuit »

  10. Conclusion
  • Conclusion
  • A gesture model has been designed and implemented for Nao, taking into account the physical constraints of the robot
  • Common platform for both the virtual agent and the robot
  • Expressivity model
  • Future work
  • Create gestures with different emotional colours and personal styles
  • Validate the model through perceptual evaluations

  11. Acknowledgment
  • This work has been funded by the ANR GVLEX project
  • It is supported by members of the TSI laboratory, Telecom-ParisTech
