
Simplifying Wheelchair Mounted Robotic Arm Control with a Visual Interface



  1. Simplifying Wheelchair Mounted Robotic Arm Control with a Visual Interface Katherine M. Tsui and Holly A. Yanco University of Massachusetts Lowell http://www.cs.uml.edu/robots

  2. Outline • Collaborators • Research Question • Hardware • Experiment • Current/Future work

  3. Collaborators • University of Central Florida: Aman Behal • Crotched Mountain Rehabilitation Center: David Kontack • Exact Dynamics: Gert Willem Romer • NSF IIS-0534364

  4. Research Question • What is the most effective user interface to manipulate a robot arm? • Our target audience is power wheelchair users, specifically: • Physically disabled, cognitively aware people. • Cognitively impaired people who do not have fine motor control.

  5. Hardware • Manus ARM by Exact Dynamics • 6 DoF • Joint encoders, slip couplings • Cameras • Manual and computer control modes • Both modes support individual joint movement and Cartesian movement of the wrist.

  6. Interface Design • Interface is compatible with single switch scanning. • Left: • Original image is quartered. • Quadrant containing the desired object is selected. • Middle: • Selection is repeated a second time. • Right: • Desired object is in 1/16th close-up view.
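
A minimal sketch of the quadrant-narrowing logic above, in Python. The function name, coordinate convention, and 640x480 frame size are illustrative assumptions, not the interface's actual code:

    # Two-stage quadrant selection: each pass keeps one quarter of the
    # current view, so two passes leave a 1/16th close-up of the image.
    def select_quadrant(region, quadrant):
        """Return the sub-region for quadrant 0..3 (row-major: NW, NE, SW, SE)."""
        x, y, w, h = region
        half_w, half_h = w // 2, h // 2
        col, row = quadrant % 2, quadrant // 2
        return (x + col * half_w, y + row * half_h, half_w, half_h)

    view = (0, 0, 640, 480)          # full camera image: x, y, width, height
    view = select_quadrant(view, 1)  # major quadrant -> (320, 0, 320, 240)
    view = select_quadrant(view, 2)  # minor quadrant -> (320, 120, 160, 120)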

  7. Demo

  8. User Testing: Hypotheses • H1: Users will prefer a visual interface to a menu-based system. • H2: With greater levels of autonomy, less user input is necessary for control. • H3: It should be faster to move to the target in computer control than in manual control.

  9. User Testing: Experiment • Participants • 12 able-bodied participants (10 male, 2 female) • Age: [18, 52] • 67% technologically capable • Computer usage per week (including job related): • 67% 20+ hours; 25% 10 to 20 hours; 8% 3 to 10 hours • 1/3 had prior robot experience: • 1 industry; 2 university courses; 1 “toy” robots

  10. User Testing: Experiment Methodology • Two tested conditions: manual and computer control. • Input device was single switch for both controls. • Each user performed 6 runs (3 manual, 3 computer). • Start control was randomized and alternated. • 6 targets were randomly chosen.
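
A sketch of one way to generate such a counterbalanced run order; the randomization scheme and the target names are assumptions consistent with the description above, not the study's actual script:

    import random

    # Each participant performs 6 runs, alternating manual and computer
    # control; which control comes first is randomized per participant.
    first = random.choice(["manual", "computer"])
    second = "computer" if first == "manual" else "manual"
    run_order = [first, second] * 3               # e.g. M, C, M, C, M, C

    # 6 targets drawn at random from a hypothetical pool of objects.
    target_pool = ["cup", "ball", "book", "phone", "remote", "bottle", "can", "box"]
    targets = random.sample(target_pool, 6)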

  11. User Testing: Experiment Methodology • The computer control implementation used during user testing included neither fine control nor depth perception. • In manual control, users were instructed to move the opened gripper “sufficiently close” to the target.

  12. User Testing: Experiment Methodology • Manual control procedure, using single switch and single switch menu: • Unfold ARM. • Using Cartesian movement, maneuver opened gripper “sufficiently close” to target.

  13. User Testing: Experiment Methodology • Computer control procedure: • Turn on ARM. • Select image using single switch. • Select major quadrant using single switch. • Select minor quadrant using single switch. • Color calibrate using single switch.
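
Each of these selections is made with single switch scanning. A minimal sketch of that input loop; the dwell time, the print stand-in for highlighting, and the switch_pressed callback are assumptions, not the study's implementation:

    import time

    # Single-switch scanning: highlight each option in turn; one switch
    # press selects whatever is highlighted when the press arrives.
    def scan_select(options, switch_pressed, dwell=1.5):
        i = 0
        while True:
            print("highlighting:", options[i])  # stand-in for the on-screen highlight
            deadline = time.time() + dwell
            while time.time() < deadline:
                if switch_pressed():
                    return options[i]
                time.sleep(0.01)                # poll the switch
            i = (i + 1) % len(options)          # wrap around and keep scanning

    # e.g., picking the major quadrant (my_switch is hypothetical):
    # quadrant = scan_select(["NW", "NE", "SW", "SE"], my_switch.is_pressed)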

  14. User Testing: Results H1: Users will prefer a visual interface to a menu-based system. • 83% of participants stated a preference for manual control in exit interviews. • Likert-scale ratings (1 to 5) of manual and computer control showed no significant difference in user experience. • H1 was not supported. • Why? Color calibration.

  15. User Testing: Results H2: With greater levels of autonomy, less user input is necessary for control. • In manual control, we counted the number of clicks each user executed during a run and divided by run time, yielding average clicks per second. • In computer control, the number of clicks is fixed. • H2 was confirmed.
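
As a sketch, the H2 metric is simply click count divided by run duration; the numbers below are illustrative, not study data:

    # H2 metric: average clicks per second over a run.
    def clicks_per_second(num_clicks, run_seconds):
        return num_clicks / run_seconds

    manual = clicks_per_second(42, 120.0)  # illustrative: user clicks throughout the run
    computer = clicks_per_second(4, 90.0)  # illustrative: fixed number of selections
    print(f"manual: {manual:.3f} clicks/s, computer: {computer:.3f} clicks/s")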

  16. User Testing: Results H3: It should be faster to move to the target in computer control than in manual control. • Distance-to-time ratio: moving distance X takes time Y. • Under computer control, the ARM moved farther in less time. • H3 was confirmed.
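
The H3 comparison likewise reduces to an effective speed, distance moved divided by elapsed time; the figures below are illustrative, not measured results:

    # H3 metric: distance-to-time ratio (effective speed toward the target).
    def speed(distance_m, time_s):
        return distance_m / time_s

    print(speed(0.45, 60.0))  # manual control example -> 0.0075 m/s
    print(speed(0.60, 40.0))  # computer control example -> 0.015 m/s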

  17. Current/Future Work • Identify specific volunteers • User interface • User testing: • H1 • Baseline evaluation • Initial testing at Crotched Mountain • Integration with power wheelchair • Depth extraction • Occlusion

  18. http://www.cs.uml.edu/robots
