
Communication via the Skin: The Challenge of Tactile Displays



Presentation Transcript


  1. Communication via the Skin: The Challenge of Tactile Displays. Lynette Jones, Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, MA

  2. Spectrum of Tactile Displays • Sensory substitution – visual impairments, hearing impairments, vestibular (balance) impairments • Human-computer interactions • Navigation/orientation. Example systems shown: tactile mouse, TSAS (Rupert et al.), CyberTouch, DataGlove, EST, MIT tactile display, tactile belt

  3. Tactile Displays – CTA ADA Focus. Utilize a relatively underused sensory channel to convey information that is private and discreet • Assist in navigation or threat location in the battlefield • Increase situation awareness (SA) in virtual environments used for training • Enhance the representation of information in displays

  4. Torso-based Tactile Displays [Chart: vibrotactile sensitivity at the finger, forearm, and abdomen] • Function as an alert • Orientation and direction information • Sequential activation of array – vector conveys “movement” in environment • Effective in environments with reduced visibility – enhances situation awareness

  5. Development of Tactile Display • Actuator (tactor) selection and characterization • Development of body-based system (configuration of display, power, wireless communication) • Perceptual studies – optimize design of the display in terms of human perceptual performance • Develop a framework for creating a tactile vocabulary – tactons • Field studies – measure the efficacy of display for navigation, identifying location of environmental events, and examine robustness of system (e.g. impact of body armor)

  6. Characteristics of the Actuators Evaluated: cylindrical motor, pancake motor, Rototactor, C2 tactor, Tactaid (Jones, Lockyer & Piateski, 2006)

  7. Prototypes 2003-2007

  8. Tactile Display – Final Elements • Core components – pancake motors, Wireless Tactile Control Unit • Contact area – ~300 mm² (encased in plastic) • Input signal – 130 Hz at 3.3 V, sinusoidal waveform • Power – 9 V battery or 7.2 V Li-ion rechargeable, 2200 mAh • Display formats – vest, waist band, sleeve • Control – Visual Basic GUI
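The drive signal specified above (130 Hz sinusoid at 3.3 V) can be sketched in code. A minimal illustration of generating one sample buffer for such a burst; the function name, sample rate, and burst duration are my own assumptions, not part of the WTCU firmware described on the slides:

```python
import math

def drive_signal(freq_hz=130.0, amp_v=3.3, duration_s=0.5, sample_rate=8000):
    """Generate one burst of a sinusoidal tactor drive waveform
    (130 Hz at 3.3 V peak, per the display specification).
    Returns a list of voltage samples."""
    n_samples = int(duration_s * sample_rate)
    return [amp_v * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n_samples)]

burst = drive_signal()  # 4000 samples, peak near 3.3 V
```

In a real display this buffer would be streamed to an amplifier or motor driver; the sketch only shows how the slide's frequency and voltage parameters define the waveform.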

  9. Actuator Evaluation – Frequencies and Forces Mechanical properties not affected by encasing motors (Jones & Held, 2008)

  10. Actuator Evaluation – Tactor Spacing and Intensity • Mechanical testing of skin • Skinsim with accelerometers (Jones & Held, 2008)

  11. Transitions – MIT Tactile Display → ARL • Investigated the efficacy of tactile and multimodal alerts on decision making by Army Platoon Leaders (Krausman et al., 2005, 2007) • Analyzed the effectiveness of tactile cues in target search and localization tasks and when controlling robotic swarms (Hass, 2009) • Evaluated Soldiers’ abilities to interpret and respond to tactile cues while they navigated an Individual Movement Techniques (IMT) course (Redden et al., 2006) • Measured the effects of tactile cues on target acquisition and workload of Commanders and Gunners, and determined the detectability of vibrotactile cues while combat assault maneuvers were being performed (Krausman & White, 2006; White et al., in press). The MIT tactile displays have also been incorporated into multi-modal platforms developed by the University of Michigan, ArtisTech in the CTA test bed, and Alion MA&D for a robotics control environment.

  12. Questions addressed – MIT Research • Can tactile signals be used to provide spatial cues about the environment that are accurately localized? • How does the location and configuration of the tactile display influence the ability of the user to identify tactile patterns? • What is the maximum size of a tactile vocabulary that could be used for communication? • Which characteristics of vibrotactile signals are optimal for generating a tactile vocabulary? • Can a set of Army Hand and Arm Signals be translated into tactile signals that are accurately identified when the user is involved in concurrent tasks?

  13. Localization of Tactile Cues for Navigation and Orientation • Navigation – way-finding, location of events (real and simulated environments), control of robots • Experiments – 10 subjects in each experiment; each tactor activated 5 times (randomly); subjects indicated the location of the tactor that vibrated [Diagrams: waist and back arrays]

  14. Navigation – Tactile Belt – One-dimensional Display [Diagram: eight tactors (1–8) clockwise around the waist – 1 at the navel, 3 on the right, 5 at the spine, 7 on the left; inter-tactor distance 80–100 mm] Identification of tactor location: eight locations – 98% correct (inter-tactor spacing 80–100 mm); twelve locations – 74% correct (spacing 55–66 mm) (Jones & Ray, 2008)
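The one-dimensional belt lends itself to a simple direction-to-tactor mapping. A sketch under the slide's numbering (tactor 1 at the navel, numbers increasing clockwise, 5 over the spine); the mapping function itself is my own illustration, not the authors' software:

```python
def heading_to_tactor(bearing_deg, n_tactors=8):
    """Map a bearing (degrees clockwise from straight ahead) to the
    nearest belt tactor. Tactor 1 sits at the navel (0 deg), numbers
    increase clockwise, so tactor 5 lies over the spine (180 deg)."""
    step = 360 / n_tactors                 # 45 degrees for 8 tactors
    index = round(bearing_deg / step) % n_tactors
    return index + 1                       # tactors are numbered 1..8

print(heading_to_tactor(0))    # 1 (navel)
print(heading_to_tactor(90))   # 3 (right)
print(heading_to_tactor(180))  # 5 (spine)
```

With the slide's reported 98% localization accuracy for eight tactors, each 45-degree sector corresponds to one reliably identifiable direction cue.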

  15. Localization – Two-dimensional Display. Identification of tactor location: 16 locations – 59% correct (range across tactors: 40–82%); within one tactor location: 95%. Inter-tactor spacing: 40 mm vertical, 60 mm horizontal. The darker the shading, the more accurate the localization (Jones & Ray, 2008)

  16. Results • Spatial localization becomes more difficult as the number of tactors increases and the inter-tactor distance decreases • A two-dimensional 16-tactor array on the back cannot support precise spatial mapping, for example between a tactile location and a visual target while driving, or to highlight on-screen information • A one-dimensional array is very effective for conveying directions

  17. Questions addressed • Can tactile signals be used to provide spatial cues about the environment that are accurately localized? • How does the location and configuration of the tactile display influence the ability of the user to identify tactile patterns? • What is the maximum size of a tactile vocabulary that could be used for communication? • Which characteristics of vibrotactile signals are optimal for generating a tactile vocabulary? • Can a set of Army Hand and Arm Signals be translated into tactile signals that are accurately identified when the user is involved in concurrent tasks?

  18. Location and Configuration of Tactile Display • Tested vibrotactile pattern recognition on forearm and back • Fabricated 3x3 (arm) and 4x4 (torso) arrays both controlled by Wireless Tactile Control Unit (WTCU) • Tactile patterns varied with respect to spatial cues (location), amplitude (number of tactors simultaneously active) and spatio-temporal sequence.

  19. Tactile Patterns [Diagram: eight patterns (A–H) on the 3×3 forearm array; numbers give each row’s or column’s activation step] A – “Up”, B – “Down”, C – “Right”, D – “Left”, E – “Left, right, left”, F – “Top, bottom, top”, G – blink X-shape 3 times, H – blink center 3 times. Results: group mean percentage of correct responses, averaged across tactors – 89% (Piateski & Jones, 2005)

  20. Back – Tactile Pattern Recognition [Diagram: eight patterns (A–H) on the 4×4 back array; numbers give activation steps] A – “Right” (99%), B – “Up” (97%), C – “Down” (100%), D – “Left” (100%), E – blink corners 4 times (100%), F – “Left, right, left, right” (100%), G – “Top, bottom, top, bottom” (99%), H – blink single motor 4 times (100%) (Piateski & Jones, 2005)
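Spatio-temporal patterns like these can be represented as ordered frames of simultaneously active tactors. A sketch for a directional sweep on the 4×4 array; the frame encoding and function name are my own illustration of the activation-step numbering shown in the slide diagrams:

```python
def sweep_frames(direction, size=4):
    """Return the activation sequence for a directional sweep on a
    size x size tactor array: one frame per step, each frame a list
    of (row, col) tactors driven together. Row 0 is the top row."""
    if direction == "up":        # bottom row first, sweeping upward
        return [[(size - 1 - step, c) for c in range(size)]
                for step in range(size)]
    if direction == "down":      # top row first
        return [[(step, c) for c in range(size)] for step in range(size)]
    if direction == "right":     # leftmost column first
        return [[(r, step) for r in range(size)] for step in range(size)]
    if direction == "left":      # rightmost column first
        return [[(r, size - 1 - step) for r in range(size)]
                for step in range(size)]
    raise ValueError(f"unknown direction: {direction}")

frames = sweep_frames("up")
# first frame drives the whole bottom row: [(3, 0), (3, 1), (3, 2), (3, 3)]
```

A playback loop would drive each frame's tactors for a fixed pulse duration before advancing, which is what conveys the "movement" across the skin described on slide 4.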

  21. Tactile Vocabulary – 15–20 Tactons? [Diagram: fifteen candidate tactons (A–O) on the 4×4 torso array] Patterns included: each corner vibrates twice; right, left, right, left; up; down; corners vibrate together four times; top, bottom, top, bottom; bottom, top, bottom, top; left, right, left, right; single tactor vibrates four times; outer corners then inner twice; two corners vibrate in turn twice; middle two rows; diagonal vibrates four times. Mean identification accuracy: 96% (Jones, Kunkel, & Torres, 2007)

  22. Tactile Pattern Recognition – Effect of Stimulus Set [Diagram: the eight patterns (A–H) used in Experiments 1A and 1B on the 3×3 arm array – Up, Down, Right, Left, “Left, right, left”, “Top, bottom, top”, blink X-shape 3 times, blink center 3 times] Mean correct response rate: 62% in Expt 1A, 85% in Expt 1B; information transfer (IT): 1.48 bits vs. 2.15 bits. Confusion matrix (Expt 1A): A was misidentified as F, whereas F was misidentified as D – errors were not symmetrical. Tactile patterns that “moved” across the arm were more accurately perceived than those that “moved” along the arm (Jones, Kunkel, & Piateski, 2009)
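The information-transfer (IT) figures quoted above come from the standard absolute-identification estimate computed over a stimulus-response confusion matrix. A self-contained sketch of that calculation (the function below is my own illustration, not the authors' analysis code):

```python
import math

def information_transfer(confusion):
    """Estimate information transfer (in bits) from a confusion matrix
    whose entry [i][j] counts responses j to stimulus i:
    IT = sum_ij p_ij * log2( p_ij / (p_i * p_j) )."""
    n = sum(sum(row) for row in confusion)
    row_tot = [sum(row) for row in confusion]          # per-stimulus counts
    col_tot = [sum(col) for col in zip(*confusion)]    # per-response counts
    it = 0.0
    for i, row in enumerate(confusion):
        for j, count in enumerate(row):
            if count:  # zero cells contribute nothing
                p_ij = count / n
                it += p_ij * math.log2(count * n / (row_tot[i] * col_tot[j]))
    return it

# Perfect identification of 4 equiprobable patterns carries log2(4) bits
perfect = [[10, 0, 0, 0], [0, 10, 0, 0], [0, 0, 10, 0], [0, 0, 0, 10]]
print(information_transfer(perfect))  # 2.0
```

On this measure, the slide's jump from 1.48 to 2.15 bits between Experiments 1A and 1B reflects both the higher accuracy and the less confusable stimulus set.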

  23. Summary of Findings • Arm vs. back – both provide effective substrates for communication • Array dimensions – marked effect on spatial localization • Asymmetries in spatial processing on the skin • Need to evaluate patterns in the context of the “vocabulary” used • Tactile vocabulary size – absolute identification vs. communication • Interceptor Body Armor – no effect on performance [Illustrations: tap on shoulder, saltation, direction and orientation]

  24. Field Experiments – Navigation Path • Five subjects participated • Eight patterns with five repetitions • Familiarization with a visual analog initially • Brief training period outdoors • Navigation using only tactile cues, without feedback • 100% accuracy for 7 of the 8 patterns presented; a single error on the 8th pattern • Demonstrated that navigation is accurate using only tactile cues as directions (Jones, Lockyer & Piateski, 2006)

  25. Questions addressed • Can tactile signals be used to provide spatial cues about the environment that are accurately localized? • How does the location and configuration of the tactile display influence the ability of the user to identify tactile patterns? • What is the maximum size of a tactile vocabulary that could be used for communication? • Which characteristics of vibrotactile signals are optimal for generating a tactile vocabulary? • Can a set of Army Hand and Arm Signals be translated into tactile signals that are accurately identified when the user is involved in concurrent tasks?

  26. Tactons (Tactile Icons). Structured tactile messages that can be used to communicate information. These tactons must be intuitive and salient. [Diagram: “Assemble/rally” tacton – tactor pairs (1,2), (3,4), (5,6), (7,8) activated in sequence] Communication tactons; navigation tactons

  27. Tactons for Hand-based Communication – design variables: frequency; duration and repetition rate; waveform complexity (Jones & Sarter, 2008)
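The tacton building blocks listed on this slide (frequency, duration/repetition rate, waveform complexity) amount to a small parameter record per message. A sketch of such a record; the field names and the example values are illustrative assumptions, not parameters reported in the studies:

```python
from dataclasses import dataclass

@dataclass
class Tacton:
    """One vibrotactile 'word', parameterized by the building-block
    variables named on the slide. All values here are illustrative."""
    name: str
    frequency_hz: float   # carrier frequency
    pulse_ms: int         # duration of each pulse
    repetitions: int      # repetition count / rate
    waveform: str         # waveform complexity, e.g. "sine", "square"

# Hypothetical "halt" tacton built from the display's 130 Hz carrier
halt = Tacton("halt", frequency_hz=130, pulse_ms=200, repetitions=3,
              waveform="sine")
```

Structuring a vocabulary this way makes the design question on the slides concrete: how many distinguishable values each field supports determines the maximum vocabulary size.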

  28. Tacton building blocks: Relevant properties of each variable (Jones, Kunkel, & Piateski, 2009)

  29. Arm and Hand Signals for Ground Forces. Identify a set of structured tactile messages (tactons) that can be used to communicate information. [Diagrams: tactons on the 4×4 array for “Danger area”, “Take cover”, and “Increase speed”; numbers give activation steps]

  30. Attention, Halt, Advance or Move Out [Diagrams: tactons on the 4×4 array for “Attention”, “Halt”, and “Advance or Move Out”; numbers give activation steps] Each tactile hand signal was designed to keep some of the iconic information of the matching visual hand signal

  31. Hand and Arm Signals – Tactile-visual mapping Mean (N=10) percentage of correct responses (35 trials per subject) when identifying the hand signal with both the illustration and schematic available (black - 98% correct) and with only the illustration available (red – 75% correct). (Jones, Kunkel, & Piateski, 2009)

  32. Questions addressed • Can tactile signals be used to provide spatial cues about the environment that are accurately localized? • How does the location and configuration of the tactile display influence the ability of the user to identify tactile patterns? • What is the maximum size of a tactile vocabulary that could be used for communication? • Which characteristics of vibrotactile signals are optimal for generating a tactile vocabulary? • Can a set of Army Hand and Arm Signals be translated into tactile signals that are accurately identified when the user is involved in concurrent tasks?

  33. Field Experiments – Concurrent Activities. Identification accuracies: 91%, 91%, 93%. Signals tested: nuclear, biological and chemical attack; increase speed; take cover; advance to left; attention; assemble; danger area; halt (Jones, Kunkel, & Piateski, 2009)

  34. Conclusions • Vibrotactile patterns are easily perceived on the torso with little training and a single stimulus exposure • Demonstrated the feasibility of using body sites that are non-intrusive and do not impede movement • Shown that the ability to perceive tactile patterns is not affected by concurrent physical and cognitive activities • Directional patterns are intuitive and can readily be used as navigational and instructional cues • Two-dimensional arrays provide greater capabilities for communication, but one-dimensional arrays are effective for simple commands

  35. Acknowledgements Brett Lockyer Mealani Nakamura Erin Piateski Jacquelyn Kunkel Edgar Torres Amy Lam David Held Christa Margossian Katherine Ray Research was supported through the Advanced Decision Architectures Collaborative Technology Alliance sponsored by the U.S. Army Research Laboratory under Cooperative Agreement DAAD19-01-2-0009.

  36. References
Jones, L.A., Kunkel, J., & Piateski, E. (2009). Vibrotactile pattern recognition on the arm and back. Perception, 38, 52-68.
Jones, L.A. & Held, D.A. (2008). Characterization of tactors used in vibrotactile displays. Journal of Computing and Information Sciences in Engineering, 044501-1–044501-5.
Jones, L.A. & Ray, K. (2008). Localization and pattern recognition with tactile displays. Proceedings of the Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 33-39.
Jones, L.A. & Sarter, N. (2008). Tactile displays: Guidance for their design and application. Human Factors, 50, 90-111.
Jones, L.A., Kunkel, J., & Torres, E. (2007). Tactile vocabulary for tactile displays. Proceedings of the Second Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 574-575.
Jones, L.A., Lockyer, B., & Piateski, E. (2006). Tactile display and vibrotactile pattern recognition on the torso. Advanced Robotics, 20, 1359-1374.
Piateski, E. & Jones, L.A. (2005). Vibrotactile pattern recognition on the arm and torso. Proceedings of the First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 90-95.
