Improving Human-Robot Interaction
Jill Drury, The MITRE Corporation

Presentation Transcript

  1. Improving Human-Robot Interaction
  Jill Drury, The MITRE Corporation
  Collaborators: Holly Yanco, UMass Lowell; Jean Scholtz, NIST; Mike Baker, Bob Casey, Dan Hestand, Brenden Keyes, Phil Thoren (UMass Lowell)

  2. Methodology for Evaluating HRI
  • Two approaches: field and laboratory
  • Field work: so far, robotics competitions
    • See many different user interfaces, but have no control over what the operator does
    • Difficult to collect data
    • Can see what operators did, but there isn't time to determine why
    • Best used to get an idea of the difficulties in the real world
    • Can identify "critical events," but don't know for certain whether the operator was aware of them

  3. Methodology for Evaluating HRI
  • Laboratory studies
    • Take what we learned in the real world and isolate factors to determine effects
    • Repeatability is still difficult to achieve due to the fragile nature of robots

  4. Analysis Frameworks
  • Taxonomy to define the human/robot system
  • Detailed definition of human-robot interaction awareness
  • Coding scheme/metrics for analyzing data
  • Scholtz' evaluation guidelines

  5. Taxonomy
  • Autonomy
  • Amount of intervention
  • Human-robot ratio
  • Level of shared interaction
  • Composition of robot teams
  • Available sensors
  • Sensor fusion
  • Criticality
  • Time
  • Space
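The taxonomy's dimensions can be captured as a simple record. Below is a minimal sketch in Python; the field names, value sets, and the example values are illustrative assumptions, not part of the published taxonomy:

```python
from dataclasses import dataclass

# Hypothetical encoding of the taxonomy dimensions from the slide above.
# Field names and example values are illustrative only.
@dataclass
class HRITaxonomy:
    autonomy_level: float        # fraction of time the robot acts on its own (0..1)
    intervention_amount: float   # fraction of time the operator must intervene
    human_robot_ratio: str       # e.g. "1:1" (one operator per robot)
    interaction_level: str       # level of shared interaction
    team_composition: str        # homogeneous or heterogeneous robot team
    sensors: tuple               # available sensor types
    sensor_fusion: bool          # whether sensor streams are fused for display
    criticality: str             # "high", "medium", or "low"
    time: str                    # synchronous vs. asynchronous interaction
    space: str                   # collocated vs. non-collocated

# Example classification of a hypothetical search-and-rescue system
usar = HRITaxonomy(
    autonomy_level=0.25,
    intervention_amount=0.75,
    human_robot_ratio="1:1",
    interaction_level="one human, one robot",
    team_composition="single robot",
    sensors=("color video", "infrared", "sonar"),
    sensor_fusion=True,
    criticality="high",  # search and rescue: harm or damage is possible
    time="synchronous",
    space="non-collocated",
)

# In this simple model, autonomy and intervention are complementary
assert abs(usar.autonomy_level + usar.intervention_amount - 1.0) < 1e-9
```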

  6. Awareness as a concept from CSCW
  • CSCW software:
    • "...makes the user aware that he is part of a group, while most other software seeks to hide and protect users from each other" [Lynch et al. 1990]
  • HRI software:
    • Makes humans aware of robots' status and activities via the interface

  7. What is "awareness"?
  • No standard definition in the CSCW field
    • We've seen at least 16 different definitions!
  • There are many different types of awareness, e.g.:
    • Concept awareness
    • Conversational awareness
    • Group-structural awareness
    • Informal awareness
    • Peripheral awareness
    • Situation awareness
    • Social awareness
    • Task awareness
    • Workspace awareness
  • Common thread: the understanding that participants have of each other in a shared environment

  8. But CSCW systems are different from robotic systems…
  • CSCW: multiple humans interacting via a CSCW system
  • Robotics: single or multiple humans interacting with single or multiple robots
  • Non-symmetrical relationships between humans and robots, e.g., differences in:
    • Free will
    • Cognition

  9. Tailoring an awareness definition for HRI: a base case
  • Given one human and one robot...
  • ...HRI awareness is:
    • the understanding that the human has of the location, activities, status, and surroundings of the robot; and
    • the knowledge that the robot has of the human's commands necessary to direct its activities and the constraints under which it must operate

  10. An awareness framework: general case
  • Given n humans and m robots working together on a synchronous task, HRI awareness consists of five components:
    • Human-robot awareness
    • Human-human awareness
    • Robot-human awareness
    • Robot-robot awareness
    • Humans' overall mission awareness

  11. General case: a detailed look
  • Given n humans and m robots working together on a synchronous task, HRI awareness consists of five components:
    • Human-robot: the understanding that the humans have of the locations, identities, activities, status, and surroundings of the robots; further, the understanding of the certainty with which humans know this information
    • Human-human: the understanding that the humans have of the locations, identities, and activities of their fellow human collaborators

  12. General case, concluded
    • Robot-human: the robots' knowledge of the humans' commands needed to direct activities, and any human-delineated constraints that may require command noncompliance or a modified course of action
    • Robot-robot: the knowledge that the robots have of the commands given to them, if any, by other robots; the tactical plans of the other robots; and the robot-to-robot coordination necessary to dynamically reallocate tasks among robots if necessary
    • Humans' overall mission awareness: the humans' understanding of the overall goals of the joint human-robot activities and the measurement of the moment-by-moment progress obtained against the goals
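One way to see what the five components imply is to count the pairwise relations they generate for n humans and m robots. The function below is hypothetical bookkeeping for illustration (the counting rules are my assumption, not part of the framework):

```python
def awareness_relations(n_humans: int, m_robots: int) -> dict:
    """Count the pairwise awareness relations implied by the five-component
    framework for n humans and m robots (illustrative bookkeeping only)."""
    return {
        "human-robot": n_humans * m_robots,        # each human tracks each robot
        "human-human": n_humans * (n_humans - 1),  # each human tracks every other human
        "robot-human": m_robots * n_humans,        # each robot holds each human's commands
        "robot-robot": m_robots * (m_robots - 1),  # each robot tracks every other robot
        "mission": n_humans,                       # each human holds overall mission awareness
    }

# The base case from slide 9 (one human, one robot) reduces to three relations:
# human-robot, robot-human, and the human's mission awareness.
base_case = awareness_relations(1, 1)
```

Note how quickly the relation count grows: two operators and three robots already yield six human-robot relations, which motivates interfaces that reduce the operator's tracking burden.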

  13. Coding Scheme: Problems Relating to Critical Incidents
  • Critical incident: the robot has caused, or could cause, harm or damage
  • Types of problems:
    • Local navigation
    • Global navigation
    • Obstacle encounter
    • Vehicle state
    • Victim identification (specific to search and rescue)

  14. Some Metrics for HRI
  • Time spent navigating, on UI overhead, and avoiding obstacles
  • Amount of space covered
  • Number of victims found
  • Critical incidents
    • Positive outcomes
    • Negative outcomes
  • Operator interventions
    • Amount of time the robot needs help
    • Time to acquire situation awareness
    • Reason for intervention
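Several of these metrics can be computed mechanically from a timestamped run log. A minimal sketch, assuming a hypothetical log format of (start, end, kind) intervals; the event names are illustrative, not the authors' coding scheme:

```python
def summarize_run(events, run_length_s):
    """Compute simple HRI metrics from one operator run.

    events: list of (start_s, end_s, kind) intervals; kinds here are
    assumed labels such as "navigating", "ui_overhead", "intervention".
    """
    # Total time per activity kind
    time_by_kind = {}
    for start, end, kind in events:
        time_by_kind[kind] = time_by_kind.get(kind, 0.0) + (end - start)

    interventions = [e for e in events if e[2] == "intervention"]
    mean_intervention = (
        sum(end - start for start, end, _ in interventions) / len(interventions)
        if interventions else 0.0
    )
    return {
        "pct_navigating": 100.0 * time_by_kind.get("navigating", 0.0) / run_length_s,
        "pct_ui_overhead": 100.0 * time_by_kind.get("ui_overhead", 0.0) / run_length_s,
        "num_interventions": len(interventions),
        "mean_intervention_s": mean_intervention,
    }

# Example: a 300-second run with one operator intervention
log = [
    (0, 120, "navigating"),
    (120, 150, "ui_overhead"),
    (150, 180, "intervention"),
    (180, 300, "navigating"),
]
stats = summarize_run(log, run_length_s=300)
```

The same interval log can be extended with a "reason" field per intervention to cover the qualitative metrics above.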

  15. Scholtz' Guidelines (tailored)
  • Is sufficient status and robot location information available so that the operator knows the robot is operating correctly and avoiding obstacles?
  • Is the information coming from the robots presented in a manner that minimizes operator memory load, including the amount of information fusion that needs to be performed in the operators' heads?
  • Are the means of interaction provided by the interface efficient and effective for the human and the robot (e.g., are shortcuts provided for the human)?
  • Does the interface support the operator directing the actions of more than one robot simultaneously?
  • Will the interface design allow for adding more sensors and more autonomy?

  16. Design Guidelines
  • Enhance awareness
    • Provide a map of where the robot has been
    • Provide more spatial information about the robot in the environment to make operators more aware of their robot's immediate surroundings
  • Lower cognitive load
    • Provide fused sensor information to avoid making the user fuse data mentally
    • Display important information near, or fused with, the video image

  17. Design Guidelines, concluded
  • Increase efficiency
    • Provide user interfaces that support multiple robots in a single window, if possible
    • In general, minimize the use of multiple windows and maximize use of the primary viewing area
  • Provide help in choosing robot modality
    • Give the operator assistance in determining the most appropriate level of robot autonomy at any given time

  18. Fusing Information
  • Victims can be missed in video images

  19.–21. Fusing Infrared and Color Video (image slides)
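The fusion idea on these slides can be illustrated with a toy example: blend the infrared channel into the video frame only where the IR reading is hot, so potential victims stand out. Frames here are plain nested lists of 0–255 intensities; this sketch is an assumption for illustration, not the authors' implementation:

```python
def fuse_frames(video, infrared, ir_threshold=200, alpha=0.5):
    """Blend the infrared frame into the video frame, but only at pixels
    where IR intensity meets the threshold (i.e., likely heat sources)."""
    fused = []
    for vrow, irow in zip(video, infrared):
        out = []
        for v, ir in zip(vrow, irow):
            if ir >= ir_threshold:
                # Highlight the warm pixel by alpha-blending IR into video
                out.append(round((1 - alpha) * v + alpha * ir))
            else:
                out.append(v)  # leave cool regions as plain video
        fused.append(out)
    return fused

# Example: a dark 2x3 video frame with two hot spots in the IR frame
video = [[10, 10, 10],
         [10, 10, 10]]
infrared = [[0, 250, 0],
            [0, 0, 240]]
fused = fuse_frames(video, infrared)
# The two hot pixels are brightened; the rest of the frame is unchanged
```

A real interface would apply the same per-pixel rule to camera images, typically tinting the hot regions a distinct color rather than just brightening them.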

  22. Other Sensor Modalities for USAR (urban search and rescue)
  • CO2 detection
  • Audio

  23.–24. Overlay of four sensor modalities (image slides)