
Developing Performance Predictive Tools for Supervisors of Teams and Complex Systems


Presentation Transcript


  1. Developing Performance Predictive Tools for Supervisors of Teams and Complex Systems February 22nd 2007 Yves Boussemart (yves@mit.edu) Humans & Automation Lab MIT Aeronautics and Astronautics http://halab.mit.edu

  2. Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for Supervisor • TRACS • Performance Prediction • Team Environments

  3. MIT Humans & Automation Lab • Created in 2004 • Director: Dr. Mary (Missy) Cummings • Visiting professor: Dr. Gilles Coppin • Visiting scientist: Dr. Jill Drury • Postdoctoral Associates: Dr. Stacey Scott, Dr. Jake Crandall, Dr. Mark Ashdown • Grad Students: Yves Boussemart, Sylvain Bruni, Amy Brzezinski, Hudson Graham, Jessica Marquez, Jim McGrew, Carl Nehme, Jordan Wan

  4. Research and Current Projects • Research in the Humans and Automation Lab (HAL) focuses on the multifaceted interactions of human and computer decision-making in complex socio-technical systems. • Time-Sensitive Operations for Distributed Teams • Human Supervisory Control Issues of Multiple Unmanned Vehicles (Reduced manning) • Measurement Technologies for Human-UV Teams • Collaborative Human Computer Decision Making • Integrated Sensor Decision Support • Sponsors: Office of Naval Research, Boeing, Lincoln Labs, AFOSR, Thales

  5. HAL Testing Equipment • Single-operator testing: ONR's Multi-modal Watch Station (MMWS) • Team testing: HAL Complex Operation Center • ONR Mobile Testing Lab

  6. Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for Supervisor • TRACS • Performance Prediction • Team Environments

  7. Human Supervisory Control (HSC) [Diagram: the supervisory control loop: the Human Operator (Supervisor) acts through Controls on a Computer, which drives Actuators on the Task; Sensors feed back through the Computer to the Displays the operator monitors] • Humans on the loop vs. in the loop • Supporting knowledge-based versus skill-based tasks • Network-centric operations & cognitive saturation

  8. Human Supervisory Control of Automated Systems [Images: manned aviation; process control; satellite operations; unmanned vehicle operations (Shadow UAV, Mars rover)]

  9. Major Research Area: HSC of Unmanned Vehicles • Unmanned Aerial Vehicles (UAVs): Predator UAV, Shadow UAV • Unmanned Undersea Vehicles (UUVs): VideoRay UUV, Odyssey UUV • Unmanned Ground Vehicles (UGVs, i.e., robots): Spotter UGV, Packbot UGV

  11. Motivation: Increasing Reliance on UAVs in Military Operations [Images: Predator Ground Control Station; Predator UAV] • UAVs are becoming an essential part of modern military operations • Typical UAV missions include: • Force protection • Intelligence, surveillance, and reconnaissance (ISR) • Combat search and rescue • Strike coordination and reconnaissance (SCAR)

  12. Inverting the Operator/Vehicle Ratio • Current UAV operations: 1 UAV : 2-5 operators • Semi-autonomous UAV operations: 2-5 UAVs : 1 operator • Future UAV teams

  13. Current Supervisory-Level Decision Support for Teams • Developed large-screen supervisor displays that provide current and expected mission and task progress information for team assets and operator activity • Displays integrate related information and provide emergent features for time-critical data

  14. Supervisory Information? • Individual and team performance • Stress & time pressure • Rapidly evolving situation • Excessive workload • Actions: • Adaptive automation • Operator replacement / shifts

  15. Towards Performance Prediction Tools A 4-step process (see the sketch below): • Tracking of individual actions • Pattern recognition on strategies and performance prediction • Aggregation of individual data and collaboration factors • Team-level performance predictions [Diagram: Operator → Supervisor, annotated "Is the operator using 'good' strategies?" and "Is the team doing well?"]
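A minimal sketch of how these four steps could chain together, assuming operators' logs arrive as lists of action events. Every function name, the log format, and the scoring rule below are hypothetical placeholders for illustration, not HAL's implementation.

```python
from statistics import mean

def track_actions(operator_log):
    """Step 1: extract the TRACS action sequence from one operator's log.
    Assumes each log event is a dict with an 'action' key (hypothetical)."""
    return [event["action"] for event in operator_log]

def score_individual(actions, good_transitions):
    """Step 2: crude strategy score: the fraction of consecutive action
    pairs that match a known 'good' transition (a placeholder for real
    pattern recognition)."""
    pairs = list(zip(actions, actions[1:]))
    if not pairs:
        return 0.0
    return sum(pair in good_transitions for pair in pairs) / len(pairs)

def predict_team_performance(individual_scores, collaboration_factor=1.0):
    """Steps 3-4: aggregate individual scores, weighted by a single
    collaboration factor, into a team-level prediction."""
    return collaboration_factor * mean(individual_scores)

# Example: one operator mostly evaluating, one looping on automatch
good = {("browse", "select"), ("select", "evaluate"), ("evaluate", "backtrack")}
logs = [
    [{"action": a} for a in ["browse", "select", "evaluate", "backtrack"]],
    [{"action": a} for a in ["automatch", "automatch", "evaluate"]],
]
scores = [score_individual(track_actions(log), good) for log in logs]
print(predict_team_performance(scores, collaboration_factor=0.9))
```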

  16. Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for Supervisor • TRACS • Performance Prediction • Team Environments

  17. Tracking Resource Allocation Cognitive Strategies (TRACS) • 2-dimensional space: • Level of Information Detail (LOID) • Mode (action steps) • 4 quadrants: • LOID: higher vs. lower automation/information • Mode: evaluation vs. generation of solutions • Technology disclosure for patent and licensing

  18. Example of TRACS Application • Application: Decision-Support for Tomahawk Land Attack Missile (TLAM) Strike Planning • Resource allocation task: • Match resources (missiles) with objectives (missions) • Respect Rules of Engagement • Satisfy multivariate constraints • Current system: PC-MDS, no decision-support • 3 interfaces at various levels of collaboration
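The deck does not detail the matching algorithm itself, so the following is a deliberately simple greedy sketch of the underlying resource-allocation task: assign each mission the best-scoring feasible missile. The `feasible` and `score` callbacks are invented stand-ins for the Rules of Engagement and multivariate constraint checks.

```python
def greedy_match(missiles, missions, feasible, score):
    """Greedily assign each mission the best-scoring remaining missile.
    `feasible(missile, mission)` is a hypothetical stand-in for ROE and
    constraint checks; `score` rates the quality of a pairing."""
    assignment = {}
    available = set(missiles)
    for mission in missions:
        candidates = [m for m in available if feasible(m, mission)]
        if not candidates:
            continue  # mission left unassigned; a real planner would backtrack
        best = max(candidates, key=lambda m: score(m, mission))
        assignment[mission] = best
        available.discard(best)
    return assignment

# Toy example: missiles carry a range attribute; missions need a minimum range
missiles = [("m1", 900), ("m2", 1500), ("m3", 1200)]
missions = [("target_a", 1000), ("target_b", 800)]
match = greedy_match(
    missiles, missions,
    feasible=lambda mis, obj: mis[1] >= obj[1],
    score=lambda mis, obj: -(mis[1] - obj[1]),  # prefer the tightest range fit
)
print(match)
```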

  19. Example of TRACS Representation • TRACS applied to TLAM • LOID: • Higher automation: group of criteria; individual criterion • Lower automation: group of matches; individual match; data cluster; data item • Mode: • Evaluation: evaluate; backtrack • Generation: browse; search; select; filter; automatch
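As an illustration of the 2-dimensional space, one could encode each logged action as a point with a LOID coordinate and a mode coordinate. The numeric LOID ordering and the quadrant split below are assumptions made for the sketch, not the patented TRACS encoding.

```python
from enum import Enum

class Mode(Enum):
    EVALUATION = 0   # evaluate, backtrack
    GENERATION = 1   # browse, search, select, filter, automatch

# Hypothetical ordering of LOID levels, lowest information detail first
LOID = {"data item": 1, "data cluster": 2, "individual match": 3,
        "group of matches": 4, "individual criterion": 5, "group of criteria": 6}

def quadrant(action_loid, mode, split=3):
    """Place an action in one of the four TRACS quadrants; `split` is an
    assumed boundary between lower and higher automation."""
    side = "higher automation" if LOID[action_loid] > split else "lower automation"
    return f"{side} / {mode.name.lower()}"

print(quadrant("group of criteria", Mode.EVALUATION))  # higher automation / evaluation
```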

  20. Example of TRACS Results [TRACS traces: mostly manual (Interface 1); combination (Interface 2); mostly automation (Interface 3)] Cognitive strategies emerge as visible patterns in the traces

  21. Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for Supervisor • TRACS • Performance Prediction • Team Environments

  22. Performance Prediction with TRACS • TRACS traces serve as the observable data of a Hidden Markov Model for individual users • Compute the decision transition matrices from empirical data • Bayesian prediction based on Markov chains (see the sketch below)
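A minimal sketch of the transition-matrix step, assuming TRACS actions are logged as sequences using the vocabulary from slide 19: estimate a first-order transition matrix by counting, then predict the most likely next action. This covers only the Markov-chain backbone; the hidden strategy states and the full Bayesian machinery of an HMM are omitted.

```python
from collections import defaultdict

ACTIONS = ["browse", "search", "select", "filter",
           "automatch", "evaluate", "backtrack"]

def transition_matrix(sequences, actions=ACTIONS, smoothing=1.0):
    """Count action-to-action transitions in logged operator sequences and
    normalize each row into a (Laplace-smoothed) probability distribution."""
    counts = {a: defaultdict(float) for a in actions}
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1.0
    matrix = {}
    for a in actions:
        total = sum(counts[a][b] + smoothing for b in actions)
        matrix[a] = {b: (counts[a][b] + smoothing) / total for b in actions}
    return matrix

def predict_next(matrix, current):
    """Most likely next action given the current one."""
    return max(matrix[current], key=matrix[current].get)

logs = [["browse", "select", "evaluate", "backtrack", "search", "select"],
        ["search", "filter", "automatch", "evaluate", "automatch", "evaluate"]]
print(predict_next(transition_matrix(logs), "evaluate"))
```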

  23. Performance Prediction with TRACS • TRACS + Neural Networks: • Detect patterns (cognitive strategies) with a neural network • Alert the supervisor when behavior degrades • Can poor performance be robustly predicted in advance? [Example patterns: manual-to-automatch; manual browsing; automatch loop]
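A hedged sketch of the neural-network idea, assuming each sliding window of operator activity is summarized as a vector of TRACS action frequencies. The synthetic data and the toy label rule (automatch-heavy windows marked as degrading) exist only to make the example run end to end; real labels would come from measured task performance.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic training data: each row is the frequency of the seven TRACS
# actions within one activity window; the toy label flags automatch-heavy
# windows (column 4) as likely to precede degraded performance.
X = rng.dirichlet(np.ones(7), size=200)
y = (X[:, 4] > 0.3).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)

def alert_supervisor(window_features, threshold=0.8):
    """Return (alert, probability): flag the operator for the supervisor when
    the predicted probability of degraded performance exceeds the threshold."""
    p = clf.predict_proba([window_features])[0, 1]
    return p > threshold, p

# Example: a window dominated by the automatch loop
print(alert_supervisor([0.05, 0.05, 0.05, 0.05, 0.6, 0.15, 0.05]))
```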

  24. Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for Supervisor • TRACS • Performance Prediction • Team Environments

  25. UAV Team Operations

  26. Open Research Questions • Individual: • Tracking cognitive strategies • Performance predictions • Relatively simple metrics • Team: • Team dynamics • Intra-team communication (verbal & non-verbal) • Performance metrics • Group awareness (situation and activity awareness) • Distributed cognition • Group interaction theories • Collaboration factors

  27. Critical Questions to Consider • What metrics can we use to gauge team performance? • Which factors drive those metrics? • How does time pressure affect the decision process? • How much information does a supervisor need? • Direct observation of operators' behavior? • Synthetic data only (TRACS)? • Both?

  28. Summary • The focus shifted from the individual UAV operator to the supervisor of teams of UAV operators • Proposed a performance prediction tool • Next: extend the predictions to team environments

  29. Questions?

  30. Research supervised by Prof. M. L. Cummings • Research effort sponsored by Boeing/Boeing Phantom Works • Contacts: yves@mit.edu, missyc@mit.edu • Web: http://halab.mit.edu • TRACS demo: • http://web.mit.edu/aeroastro/www/labs/halab/media.html • http://tinyurl.com/ybafp2

  31. Backup Slides

  32. Interface 1 - Manual • LOA 2: manual matching • Basic support: filtering, sorting, warning, and summarizing

  33. Interface 2 - Collaborative • LOA 3: collaborative matching • Manual matching plus "automatch" (a customizable heuristic search algorithm) • Graphical summaries of constraint satisfaction • Option to save solutions for comparison purposes • Advanced features for interactive search of a solution

  34. Interface 3 - Configural • LOA 4: automated matching with a configural display • Possibility to tweak the solution or force an assignment • No access to raw data; aggregated information only • High-level, constrained solution search

  35. Tomahawk Mission Planning • Performance on an incomplete scenario • Performance decreased when LOA increased in the single-interface setup • Best: Interface 1 and Interfaces 2 & 3; worst: Interfaces 1 & 3 • No deviation on Interface 3 • Interface 1: P = 69.7 • Interface 3: P = 68.5

  36. TRACS 3D • Problems with a 3D visualization • Loss of granularity and clutter • Occlusion effect (loss of 2D information) • Parallax effect (detrimental perspective) • Difficult to manipulate (high cognitive load) • Difficult to orient oneself (loss of SA) • Lack of emergent temporal analysis feature

  37. From TRACS 3D to TRACS 2.5D • Temporal data • TRACS 3D: time on an orthogonal axis • TRACS 2.5D: interactive timeline • Advantages • Not 3D (occlusion, parallax, and orientation problems addressed) • Familiar manipulation • Clear grouping of temporal features (granularity, clutter, emergent properties)

  38. TRACS 2.5D

  39. Mobile Advanced Command and Control Station [Humans and Automation Laboratory]

  40. Mobile Advanced Command and Control Station
