
Vision-Based Reach-To-Grasp Movements




Presentation Transcript


  1. Vision-Based Reach-To-Grasp Movements From the Human Example to an Autonomous Robotic System Alexa Hauck

  2. Context
  • MODEL of Hand-Eye Coordination
  • ANALYSIS of human reaching movements
  • SYNTHESIS of a robotic system
  Special Research Program “Sensorimotor”
  • C1: Human and Robotic Hand-Eye Coordination
  • Neurological Clinic (Großhadern), LMU München
  • Institute for Real-Time Computer Systems, TU München

  3. The Question is ...
  How to use which visual information for motion control?
  • control strategy • representation • reaching • catching

  4. State-of-the-art Robotics
  Look-then-move (visual feedforward control):
  • easy integration with path planning
  • only little visual information needed
  • sensitive to model errors
  Visual Servoing (visual feedback control):
  • model errors can be compensated
  • convergence not assured
  • high-rate vision needed
  Impressive results ... but nowhere near human performance!
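The contrast between the two strategies can be sketched in a minimal, hypothetical one-dimensional simulation (the gain, step counts, and the 10% calibration bias are made-up values): open-loop feedforward execution inherits any model error, while closed-loop feedback iteratively removes it.

```python
import numpy as np

def look_then_move(hand, target_estimate, steps=50):
    """Feedforward: plan once from a (possibly wrong) target estimate,
    then execute without looking again."""
    path = np.linspace(hand, target_estimate, steps)
    return path[-1]  # final position equals the (biased) estimate

def visual_servo(hand, target, gain=0.3, steps=50):
    """Feedback: repeatedly measure the remaining visual error
    and command a proportional correction."""
    for _ in range(steps):
        hand += gain * (target - hand)
    return hand

true_target = 1.0
biased_estimate = 1.1   # hypothetical 10% model/calibration error

err_feedforward = abs(look_then_move(0.0, biased_estimate) - true_target)
err_feedback = abs(visual_servo(0.0, true_target) - true_target)
# the open-loop error stays at the size of the model error;
# the closed-loop error shrinks toward zero, but only if vision keeps up
```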

  5. The Human Example
  Separately controlled hand transport:
  • almost straight path
  • bell-shaped velocity profile
  Experiments with a target jump:
  • smooth on-line correction of the trajectory
  Experiments with prism glasses:
  • on-line correction using visual feedback
  • off-line recalibration of internal models
  • Use of visual information in a spatial representation
  • Combination of visual feedforward and feedback ... but how?
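The "almost straight path, bell-shaped velocity profile" observation is commonly modeled by the minimum-jerk trajectory of Flash and Hogan (1985); a short sketch (the amplitude and duration are arbitrary example values, not measured data):

```python
import numpy as np

def minimum_jerk(x0, x1, T, n=101):
    """Minimum-jerk point-to-point profile: the position follows a
    quintic polynomial and the velocity is bell-shaped, zero at both ends."""
    t = np.linspace(0.0, T, n)
    s = t / T  # normalized time in [0, 1]
    pos = x0 + (x1 - x0) * (10*s**3 - 15*s**4 + 6*s**5)
    vel = (x1 - x0) / T * (30*s**2 - 60*s**3 + 30*s**4)
    return t, pos, vel

# a 30 cm reach lasting one second
t, pos, vel = minimum_jerk(0.0, 0.3, 1.0)
# velocity starts and ends at zero and peaks at mid-movement
```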

  6. New Control Strategy

  7. Example: Point-to-point

  8. Example: Target Jump

  9. Example: Target Jump

  10. Example: Target Jump

  11. Example: Multiple Jumps

  12. Example: Multiple Jumps

  13. Example: Double Jump

  14. Hand-Eye System
  Models of the hand-eye system & objects: object model, sensor model, arm model.
  Data flow: the robot delivers images; Image Processing (sensor model) extracts features; Image Interpretation (object model) estimates the position of target & hand; Motion Planning (object model) generates a trajectory; Robot Control (arm model) converts it into commands for the robot.
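The module chain of this slide can be sketched as composed functions; all data values and representations below are placeholders for illustration, not the system's actual models.

```python
def image_processing(images):
    """Images -> 2D image features (where a sensor model would apply)."""
    return {"target_px": (330, 238), "hand_px": (301, 252)}

def image_interpretation(features):
    """Features + object model -> 3D positions of target and hand (metres)."""
    return {"target": (0.40, 0.10, 0.20), "hand": (0.35, 0.12, 0.18)}

def motion_planning(positions, steps=5):
    """Positions + object model -> hand trajectory (linear placeholder)."""
    h, t = positions["hand"], positions["target"]
    return [tuple(h[i] + (t[i] - h[i]) * s / (steps - 1) for i in range(3))
            for s in range(steps)]

def robot_control(trajectory):
    """Trajectory + arm model -> motor commands (identity placeholder)."""
    return [("move_to", p) for p in trajectory]

# one perception-action cycle through all four modules
commands = robot_control(motion_planning(image_interpretation(
    image_processing("stereo image pair"))))
```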

  15. The Robot: MinERVA
  • CCD cameras
  • pan-tilt head
  • manipulator with 6 joints

  16. Robot Vision
  Binocular stereo: corresponding points of the target and of the hand are found in both camera images and triangulated to obtain their 3D positions.
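Triangulation from corresponding points can be sketched for the idealized case of parallel cameras (a simplifying assumption; the real head's geometry is calibrated, and the focal length and baseline below are illustrative values, not MinERVA's):

```python
def triangulate(xl, yl, xr, f_px=800.0, baseline=0.12):
    """Binocular stereo depth for parallel cameras.

    xl, yl, xr: pixel coordinates of a corresponding point pair,
    measured relative to each camera's principal point.
    f_px: focal length in pixels; baseline: camera separation in metres.
    """
    disparity = xl - xr              # image shift between the two views
    Z = f_px * baseline / disparity  # depth is inversely proportional
    X = Z * xl / f_px                # back-project to 3D
    Y = Z * yl / f_px
    return X, Y, Z

# a point seen 40 px right of centre in the left image,
# 40 px left of centre in the right image
point = triangulate(xl=40.0, yl=10.0, xr=-40.0)
```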

  17. Example: Reaching

  18. Example: Reaching

  19. Example: Reaching

  20. Model Parameters
  • Arm: geometry, kinematics; 3 parameters (manufacturer data)
  • Arm-Head Relation: coordinate transformation; 3 parameters (measuring tape)
  • Head-Camera Relations: coordinate transformations; 4 parameters (HALCON calibration)
  • Cameras: pinhole camera model; 4 parameters (+ rad. distortion) (HALCON calibration)
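The pinhole camera model with radial distortion mentioned on this slide can be sketched as follows; the focal length, principal point, and distortion coefficient are illustrative, not the calibrated values of this system:

```python
def project(X, Y, Z, f_px=800.0, cx=320.0, cy=240.0, k1=-0.2):
    """Project a 3D camera-frame point to pixel coordinates.

    Pinhole model with a single radial distortion coefficient k1;
    all parameter values here are hypothetical examples.
    """
    x, y = X / Z, Y / Z          # normalized image coordinates
    r2 = x * x + y * y           # squared radius from the optical axis
    scale = 1.0 + k1 * r2        # simple radial distortion term
    u = f_px * x * scale + cx    # shift by the principal point
    v = f_px * y * scale + cy
    return u, v

centre = project(0.0, 0.0, 1.0)   # a point on the optical axis
offaxis = project(0.1, 0.0, 1.0)  # pulled inward by barrel distortion
```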

  21. Use of Visual Feedback
  correction rate | mean error | max error
  none (0)        | 8.9 cm     | 20 cm
  1 Hz            | 0.4 cm     | 1 cm

  22. Example: Vergence Error

  23. Example: Compensation

  24. Summary
  • New control strategy for hand-eye coordination
  • Extension of a biological model
  • Unification of look-then-move & visual servoing
  • Flexible, economic use of visual information
  • Validation in simulation
  • Implementation on a real hand-eye system
