
Sensate Skins for Robots
Cynthia Breazeal, Dan Stiehl (S.M. MAS student)


Presentation Transcript


  1. Sensate Skins for Robots. Cynthia Breazeal, Dan Stiehl (S.M. MAS student)

  2. Goal & Motivation • Develop full-body skin for social robot • Functions of skin and touch for human-robot interaction (HRI) • Express • Sense • Protect • Communicate • Nurture Leonardo “Touch is our most social sense” Jones & Yarborough “Tactile needs do not seem to change with aging---if anything they seem to increase” Ashley Montagu

  3. Goal & Motivation • Investigate biologically inspired algorithms for large distributed sensor networks • 1.1x10 axons distributed across surface • Efficient and quick processing of multi-modal information • Elegant principles for multi-modal integration 6

  4. Biologically Inspired Approach
  • Somatic alphabet & hierarchical organization (sketched as data structures below)
  • Model multiple types of "receptors": touch, pain, temperature, proprioception
  • Multi-modal integration based on primitives (Meredith & Stein)
    • Somatosensory (Hyvarinen et al.)
    • Motor (Bizzi & Mussa-Ivaldi)
    • Visual (Jones & Palmer, Kruger et al.)
  [Diagram: the WORLD (people and things) feeds a three-level hierarchy]
    • "Letters": joint angle (posture), FSR (indentation), capacitive (proximity), camera (vision)
    • "Words": motor cortex neuron (hard, soft, curvature), visual processing (shape, color, depth), somatosensory cortical neuron (orientation, direction)
    • "Sentences": e.g., "The soft red ball is rolling down my arm"
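The letters/words/sentences hierarchy can be pictured as nested data structures. The Python sketch below is purely illustrative; the class names and fields are assumptions for this transcript, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List

# "Letters": raw readings from individual receptor models.
@dataclass
class Letter:
    modality: str   # e.g. "fsr", "capacitive", "joint_angle", "vision"
    location: str   # body region the reading came from
    value: float    # normalized sensor value

# "Words": features computed over populations of letters.
@dataclass
class Word:
    feature: str    # e.g. "curvature", "orientation", "direction"
    value: float
    letters: List[Letter]

# "Sentences": multi-modal events assembled from words,
# e.g. "the soft red ball is rolling down my arm".
@dataclass
class Sentence:
    description: str
    words: List[Word]

def describe_contact(words: List[Word]) -> Sentence:
    """Toy integration step: join word-level features into one event."""
    summary = ", ".join(f"{w.feature}={w.value:.2f}" for w in words)
    return Sentence(description=summary, words=words)
```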

  5. Milestones to Date
  • Mechanical hands with integrated sensing and computation
  • Articulated fingers for active touch
  • 40 FSRs (force-sensitive resistors) plus capacitive sensing (a rough read-out sketch follows below)
  • Silicone skin glove
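As a rough illustration of how a single FSR "letter" might be read, the sketch below converts a 10-bit ADC count from a simple voltage-divider circuit into an approximate FSR resistance. The supply voltage, fixed resistor value, and ADC range are assumed values, not the hardware described in the slide.

```python
V_SUPPLY = 5.0      # assumed supply voltage (V)
R_DIVIDER = 10_000  # assumed fixed divider resistor (ohms)
ADC_MAX = 1023      # assumed 10-bit ADC full scale

def fsr_resistance(adc_count: int) -> float:
    """Estimate FSR resistance from the ADC count of a voltage divider.

    Assumed circuit: the FSR sits between the supply and the ADC pin,
    the fixed resistor goes from the ADC pin to ground, so
    V_out = V_supply * R_div / (R_div + R_fsr).
    """
    v_out = V_SUPPLY * adc_count / ADC_MAX
    if v_out <= 0:
        return float("inf")  # no pressure: FSR is effectively open
    return R_DIVIDER * (V_SUPPLY - v_out) / v_out
```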

  6. Milestones to Date
  • Embedded, modular, scalable sensing network
    • PIC16F876A microcontroller at 20 MHz
    • 57600 baud serial communication
    • Polls 64 sensors in 6 ms (a host-side read-out sketch follows below)
  • Hierarchical organization
    • Tactile sensors distributed over body regions
    • Based on a somatotopic map
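A host could consume the node's serial stream roughly as sketched below, using the pyserial package. The port name and the frame layout (64 one-byte samples, no framing bytes) are assumptions for illustration; the slide only specifies the PIC16F876A node, 57600 baud serial, and a 64-sensor poll in about 6 ms.

```python
import serial  # pyserial

# Assumed port name and frame format for illustration only.
PORT = "/dev/ttyUSB0"
NUM_SENSORS = 64

def read_frame(link: serial.Serial) -> list:
    """Read one hypothetical 64-byte frame of tactile samples."""
    raw = link.read(NUM_SENSORS)
    return list(raw)

if __name__ == "__main__":
    # Baud rate matches the slide; timeout keeps the read from blocking.
    with serial.Serial(PORT, baudrate=57600, timeout=0.1) as link:
        frame = read_frame(link)
        print(f"{len(frame)} samples:", frame[:8], "...")
```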

  7. Milestones to Date
  • "Fleshing out" the letters and words of the somatosensory alphabet
  • Letters: model slowly and rapidly adapting mechanoreceptors (FSR)
  • Words: model population-based cortical processing of centroid, orientation, and direction of motion (a feature-extraction sketch follows below)
  [Figures: motion, orientation, direction]
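The population-level "words" named here (centroid, orientation, direction of motion) can be approximated with simple image moments over a 2D taxel pressure map. The NumPy sketch below is one plausible formulation of those features, not the cortical-population model from the work itself.

```python
import numpy as np

def centroid(pressure: np.ndarray) -> tuple:
    """Pressure-weighted centroid (row, col) of a 2D taxel array.

    Assumes at least one taxel is active (non-zero total pressure).
    """
    total = pressure.sum()
    rows, cols = np.indices(pressure.shape)
    return ((rows * pressure).sum() / total,
            (cols * pressure).sum() / total)

def orientation(pressure: np.ndarray) -> float:
    """Contact orientation (radians) from second-order image moments."""
    cy, cx = centroid(pressure)
    rows, cols = np.indices(pressure.shape)
    mu20 = (pressure * (cols - cx) ** 2).sum()
    mu02 = (pressure * (rows - cy) ** 2).sum()
    mu11 = (pressure * (cols - cx) * (rows - cy)).sum()
    return 0.5 * np.arctan2(2 * mu11, mu20 - mu02)

def direction_of_motion(prev: np.ndarray, curr: np.ndarray) -> tuple:
    """Direction of motion as centroid displacement between two frames."""
    (r0, c0), (r1, c1) = centroid(prev), centroid(curr)
    return (r1 - r0, c1 - c0)
```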

  8. Next Steps
  • Apply the alphabet to "sentences"
    • Active touch for exploration and manipulation
    • Communicative touch for interaction with people
  • Multi-sensor integration based on primitives
    • Add vision, limb proprioception, and movement primitives
  • Spread the skin to other parts of the body according to cortical distribution
    • Arms & face
