
Trust, Trustworthiness, and Trustability


Presentation Transcript


  1. Trust, Trustworthiness, and Trustability John D. Lee, Department of Industrial and Systems Engineering, University of Wisconsin-Madison

  2. Why consider trust? It seems to make a difference http://www.dailymail.co.uk/news/article-1164705/BMW-left-teetering-100ft-cliff-edge-sat-nav-directs-driver-steep-footpath.html

  3. 20+ million lines of code in the navigation system http://spectrum.ieee.org/green-tech/advanced-cars/this-car-runs-on-code

  4. Managing multiple UVs (unmanned vehicles)

  5. Human-automation interaction architectures [Diagram: three architectures, each linking human, automation, and controlled process: reliance, cooperation (shared resources), and competing goals.]
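
To make the three architectures concrete, here is a minimal Python sketch; the class names and boolean attributes are illustrative assumptions, not content from the presentation. It simply records who acts on the controlled process and whether goals are shared.

from dataclasses import dataclass
from enum import Enum, auto


class Architecture(Enum):
    RELIANCE = auto()     # human delegates control of the process to the automation
    COOPERATION = auto()  # human and automation share resources and act jointly
    COMPETITION = auto()  # human and automation pursue competing goals on the same process


@dataclass(frozen=True)
class InteractionConfig:
    # Illustrative summary of one human-automation pairing.
    architecture: Architecture
    human_acts_on_process: bool
    automation_acts_on_process: bool
    shared_goals: bool


# One configuration per panel of the diagram above (assumed readings, not slide text).
RELIANCE = InteractionConfig(Architecture.RELIANCE, False, True, True)
COOPERATION = InteractionConfig(Architecture.COOPERATION, True, True, True)
COMPETITION = InteractionConfig(Architecture.COMPETITION, True, True, False)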

  6. Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46(1), 50-80.

  7. Calibration of trust and appropriate reliance

  8. Gladwell’s problem with rankings • Stanford’s undergraduate Aeronautics program is ranked in the top 10 in the country • Trustworthiness is more than a single, context-independent certification • Trustworthiness of automation: to do what, in what context, in what way http://www.newyorker.com/reporting/2011/02/14/110214fa_fact_gladwell

  9. Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46(1), 50-80.

  10. Basis of trust: performance (M, SD, ...), process, and purpose (Lee & Moray, 1992) parallel ability, integrity, and benevolence (Mayer, Davis & Schoorman, 1995, Academy of Management Review)

  11. Basis of trust: surface and depth cues • Depth cues: algorithm characteristics • Purpose of the algorithms relative to what role • Process relative to the human: same, optimal, different • Performance, capability, and verification for what context • Surface cues: interface features that should not matter, but often do • Social response to technology, particularly with voice • Color palette and layout? • Physiological and social context manipulations (e.g., Liquid Trust)

  12. Oxytocin and the physiological basis of trust • Trust governs willingness to take risks in social exchange relationships • People often respond to technology in a social fashion • Factors influencing oxytocin levels may affect trust in decision aids Kosfeld, M., Heinrichs, M., Zak, P. J., Fischbacher, U., & Fehr, E. (2005). Oxytocin increases trust in humans. Nature, 435, 673-676.

  13. Trust in humans vs. trust in technology • Beyond purpose, process, and performance (benevolence, integrity, ability) • Reciprocity: trusting behavior is contingent on the trusting behavior of the partner • Joint risk: the trustor is at risk from the trustee’s performance, and the trustee’s reputation is at risk • Trusting outcomes: cooperation and reliance • Trust evolves through a dynamic interaction

  14. Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46(1), 50-80.

  15. PerceptionActionActionPerception Purpose Ability Process Benevolence Trust Trust Performance Integrity Trust influences the information people seek Trust develops not just in response to provided information, but a result of active exploration and experimentation Reliance action

  16. Trustable technology • Beyond trustworthy to trustable • Coordinate surface and depth features (algorithms and interface) • Display transparency: making the purpose, process, and performance visible; making the history of purpose, process, and performance visible (role and reputation) • Interactive transparency

  17. Interactive transparency • Show a family of suggested routes, not just the optimal one • Among routes with similar objective-function values, show the most dissimilar ones • Interactive objective function (e.g., sliders to adjust the weighting of constraints); a sketch follows below
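
A minimal sketch of the route-family idea above, assuming hypothetical inputs: a matrix of per-criterion route costs, a pairwise route-similarity matrix, and slider-style weights. None of these names or data come from the presentation.

import numpy as np


def score_routes(costs, weights):
    # Weighted objective for each candidate route.
    # costs: (n_routes, n_criteria) array of positive costs, e.g. [time, distance, turns]
    # weights: (n_criteria,) array, set interactively (e.g., via sliders)
    return costs @ weights


def diverse_family(costs, similarity, weights, k=3, tolerance=0.1):
    # Return up to k route indices whose weighted objectives are within
    # `tolerance` (a fraction) of the best, greedily adding the route that is
    # least similar to those already chosen.
    scores = score_routes(costs, weights)
    best = scores.min()
    near_optimal = [i for i in range(len(scores)) if scores[i] <= best * (1 + tolerance)]
    chosen = [min(near_optimal, key=lambda i: scores[i])]
    while len(chosen) < min(k, len(near_optimal)):
        remaining = [i for i in near_optimal if i not in chosen]
        # Most dissimilar route: smallest maximum similarity to anything already shown.
        chosen.append(min(remaining, key=lambda i: max(similarity[i][j] for j in chosen)))
    return chosen


# Toy example: four routes scored on [time, distance]; similarity from shared segments.
costs = np.array([[10.0, 5.0], [11.0, 4.0], [10.5, 5.2], [20.0, 2.0]])
similarity = np.array([[1.0, 0.9, 0.95, 0.1],
                       [0.9, 1.0, 0.85, 0.2],
                       [0.95, 0.85, 1.0, 0.15],
                       [0.1, 0.2, 0.15, 1.0]])
weights = np.array([0.7, 0.3])
print(diverse_family(costs, similarity, weights, k=2, tolerance=0.2))  # e.g., [0, 1]

Showing dissimilar near-optimal alternatives, and letting the user reweight the objective, exposes the algorithm's purpose and process rather than only its single recommendation.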

  18. Research issues • Trust relationships: reliance, compliance, and cooperation • Measures of trust calibration (calibration, resolution, specificity, timeliness, ...); see the sketch below • Trust and trustworthiness in context • Creating trustable, not just trustworthy, technology • Depth features: match automation algorithms to the human process, maximize performance, or minimize similarity with the human • Surface features: understand when technology engages a social response • Enhance display and interactive transparency • Support performatory and exploratory interactions with automation
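
As one concrete, purely illustrative reading of the calibration and resolution measures listed above, the sketch below treats per-trial trust ratings as forecasts of automation success and applies the Murphy decomposition of the Brier score. The measure names match the slide, but this particular formulation is an assumption, not something specified in the presentation.

import numpy as np


def calibration_and_resolution(trust, correct, n_bins=5):
    # trust: per-trial trust ratings in [0, 1], treated as forecasts
    # correct: per-trial automation success (1) or failure (0)
    # Returns (calibration, resolution) from the Murphy decomposition of the
    # Brier score: calibration is lower when mean trust in each bin tracks the
    # observed success rate; resolution is higher when trust discriminates
    # between conditions where the automation succeeds and where it fails.
    trust = np.asarray(trust, dtype=float)
    correct = np.asarray(correct, dtype=float)
    base_rate = correct.mean()
    bins = np.minimum((trust * n_bins).astype(int), n_bins - 1)

    calibration = 0.0
    resolution = 0.0
    for b in range(n_bins):
        mask = bins == b
        if not mask.any():
            continue
        weight = mask.mean()          # fraction of trials in this trust bin
        mean_trust = trust[mask].mean()
        hit_rate = correct[mask].mean()
        calibration += weight * (mean_trust - hit_rate) ** 2
        resolution += weight * (hit_rate - base_rate) ** 2
    return calibration, resolution


# Toy example: trust ratings before each use of the automation and whether it succeeded.
trust = [0.9, 0.8, 0.85, 0.3, 0.2, 0.6, 0.95, 0.1]
correct = [1, 1, 1, 0, 0, 1, 1, 0]
print(calibration_and_resolution(trust, correct, n_bins=4))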
