
Eye Tracking and Eye Movement-Based Interaction


Presentation Transcript


  1. Eye Tracking and Eye Movement-Based Interaction Tampere University Computer Human Interaction Group Aulikki Hyrskykari New Interaction Techniques 19.1.2001

  2. Contents • Motivation - Why eye? (2 slides) • Structure of the eye (4) • Eye movements (4) • Technique and technology (8) • Eyes as an input modality (3) • Eye movement data (7) • Eye movement based interaction (28) • Project suggestions (1) • References (3)

  3. Why eye? 1 • Constrained interface between two powerful information processors • Goal is to increase the bandwidth across the channel • Eyes are extremely rapid MOTIVATION

  4. Why eye? 2 • Target acquisition usually requires the user to look at the target first before actuating the cursor control • Direction of gaze implicitly indicates the focus of attention • Facilitates hands-free operation • Eye movements are natural and require little conscious effort – no RSI (repetitive strain injury) problems MOTIVATION

  5. Structure of the eye • Cornea is a transparent structure that covers the iris and pupil • Pupil allows varying amounts of light to enter the eye • Lens helps to focus light on the retina • Retina includes • rods (94%) (light sensitive) • cones (6%) (color sensitive) • Cones are concentrated in the center of the retina – the fovea STRUCTURE OF THE EYE [Duchowski00]

  6. Visual angle • thumbnail at arm's length – 1.5–2.0 degrees • sun or moon – 0.5 degrees STRUCTURE OF THE EYE [Duchowski00]
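As a sanity check on these figures: the visual angle subtended by an object of size s viewed from distance d is θ = 2·arctan(s / 2d). A minimal sketch; the thumbnail size and arm length are assumed values for illustration:

```python
import math

def visual_angle_deg(size: float, distance: float) -> float:
    """Visual angle (degrees) subtended by an object of `size` viewed
    from `distance` (same unit): 2 * atan(size / (2 * distance))."""
    return math.degrees(2 * math.atan2(size, 2 * distance))

# Assumed measurements: a ~1.8 cm thumbnail at ~60 cm (arm's length)
print(f"thumbnail: {visual_angle_deg(0.018, 0.60):.1f} degrees")   # ~1.7
# The moon: ~3474 km in diameter at ~384400 km distance
print(f"moon:      {visual_angle_deg(3474, 384400):.2f} degrees")  # ~0.52
```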

  7. Visual angle and receptor distribution STRUCTURE OF THE EYE [Duchowski00]

  8. Spatial vision • Vision field is divided into three regions • Fovea (foveola) provides the sharpest vision (<1.3°) • Parafovea previews foveal information (<5°) • Peripheral vision reacts to flashing objects and sudden movements • Peripheral vision has 15-50% of the acuity of the fovea and it is also less colour-sensitive STRUCTURE OF THE EYE [Antti Aaltonen]

  9. Eye movements • Are mainly used to reposition the fovea • The three most important types of eye movements needed to gain insight into overt localization of visual attention: • saccades • fixations • smooth pursuits (to a lesser extent) • Additionally there are eye movements that relate to the sense of balance, twisting and micro movements (like jitter) EYE MOVEMENTS [Antti Aaltonen]

  10. Saccades • Rapid eye movements used to reposition the fovea • Range in duration from 10 ms to 100 ms • Effectively blind during the transition • Deemed ballistic (pre-programmed) and stereotyped (reproducible) EYE MOVEMENTS [Duchowski00]

  11. Fixations • Standstills between saccades – information is transferred during fixations • Possibly the most important type of eye movement for eye-aware applications • 90% of viewing time is devoted to fixations • duration: 150–600 ms • Not technically eye movements in their own right; rather characterized by miniature eye movements (tremor, drift, microsaccades) EYE MOVEMENTS [Duchowski00]

  12. Smooth pursuits • Involved when visually tracking a moving target • Depending on the range of target motion, the eyes are capable of matching target velocity EYE MOVEMENTS [Duchowski00]

  13. Eye tracking methods • Electronic methods • Mechanical methods (e.g. the scleral coil) TECHNIQUE AND TECHNOLOGY

  14. Eye tracking methods • Video-based methods • Tracking some visible feature(s) of the eyeball, e.g. • limbus (boundary of sclera and iris) • pupil • A video camera observes the user's eye • Image-processing software analyzes the video image and traces the tracked feature TECHNIQUE AND TECHNOLOGY [Antti Aaltonen]

  15. Two-point video-based method • Two features of the eye are tracked – typically • corneal reflection • pupil • Uses IR light (invisible to the human eye) • produces the corneal reflection • causes a bright or dark pupil, which helps the system recognize the pupil in the video image (figure: bright pupil and corneal reflection) TECHNIQUE AND TECHNOLOGY [Antti Aaltonen]

  16. Set-up of video-based systems • The optics of the system can be mounted on the floor or on the head • If the optics are floor mounted, the system is not in contact with the user • Generally head movements are not restricted and they can be separated from eye movements TECHNIQUE AND TECHNOLOGY

  17. Floor (table) mounted optics [Virtual Laboratory at Clemson University; Applied Science Laboratories, TAUCHI] TECHNIQUE AND TECHNOLOGY

  18. Head mounted optics [SensoMotoric Instruments GmbH, TAUCHI; Iota AB, EyeTrace Systems; Mooij Holding] TECHNIQUE AND TECHNOLOGY

  19. Terms related to ET hardware • Spatial resolution • The smallest measurable change in eye position • Accuracy • The expected difference, in degrees of visual angle, between the true eye position and the mean computed eye position during a fixation • Because of the vision system and the physiology of the eye, accuracy is usually 0.5–1° • Temporal resolution (sampling rate) • The number of recorded eye positions per second TECHNIQUE AND TECHNOLOGY [Antti Aaltonen]
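To see what such an accuracy means on screen: a gaze estimate that is off by a degrees lands about 2·d·tan(a/2) away from the true point at viewing distance d. A small sketch under assumed viewing conditions; the distance and pixel pitch are illustrative:

```python
import math

def on_screen_error_mm(accuracy_deg: float, viewing_distance_mm: float) -> float:
    """On-screen error (mm) corresponding to a gaze estimate that is off
    by `accuracy_deg` degrees of visual angle at the given distance."""
    return 2 * viewing_distance_mm * math.tan(math.radians(accuracy_deg) / 2)

# Assumptions: 600 mm viewing distance, ~3.5 px/mm (a typical 2001-era display)
distance_mm, px_per_mm = 600, 3.5
for acc in (0.5, 1.0):
    err = on_screen_error_mm(acc, distance_mm)
    print(f"{acc:.1f} degrees -> {err:.1f} mm (~{err * px_per_mm:.0f} px)")
```

At these assumed values, 0.5–1° of error is roughly 18–37 pixels, which is why eye-controlled targets must be noticeably larger than mouse targets (see the selection slides later).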

  20. Eyes as an input modality • Problems and research issues • technological issues • HCI related issues EYES AS AN INPUT MODALITY

  21. Technological issues • Usability of the hardware • head mounted systems are more reliable but somewhat awkward • floor mounted systems are more comfortable but more sensitive to head movements • Accuracy – need for calibration • for every user at the beginning of a task • also during the task • Cost of the hardware (coming down, though) EYES AS AN INPUT MODALITY

  22. HCI related issues • Need to design and study new interaction techniques suitable for exploiting eye input • the eye is a perceptual device, not evolved into a control organ • people are not used to operating things by simply looking at them – if poorly done, it can be very annoying • Noisy data – needs to be refined in order to get useful dialogue information (fixations, eye events, intentions) • accuracy is restricted by the biological characteristics of the eye EYES AS AN INPUT MODALITY

  23. Eye movement data • Scanpaths (3 min of raw eye data) recorded while a subject was asked three different questions about the painting • examine at will • estimate wealth • estimate ages EYE MOVEMENT DATA [Yarbus67]

  24. Processing the Eye Movement Data • The data contains jitter, errors originating from the limited accuracy, failures of tracking, … • At the lowest level the raw eye position data must be filtered and the fixations identified • When analyzing eye movement data off-line, the noisy data can be refined using different filtering algorithms before counting the fixations; in real time the analysis must be simpler EYE MOVEMENT DATA

  25. Filtering the noisy data • Raw data measured with an eye tracker: x-coordinates of the eye gaze position during ~3 seconds (figure) EYE MOVEMENT DATA [Jacob]
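The slides do not name the filtering algorithms; one common, simple real-time choice is a short sliding-window average (or median) over the raw samples. A minimal sketch of that idea, not the filter used in the original material:

```python
from collections import deque

class GazeSmoother:
    """Smooths raw gaze samples with a sliding-window average; a median
    over the same window would resist outlier samples even better."""

    def __init__(self, window: int = 5):
        self.xs: deque = deque(maxlen=window)
        self.ys: deque = deque(maxlen=window)

    def update(self, x: float, y: float):
        self.xs.append(x)
        self.ys.append(y)
        return sum(self.xs) / len(self.xs), sum(self.ys) / len(self.ys)

smoother = GazeSmoother(window=5)
for sample in [(100, 200), (103, 198), (180, 195), (101, 201), (99, 199)]:
    print(smoother.update(*sample))  # the outlier (180, 195) is damped
```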

  26. Fixation identification (real-time) • A simple algorithm for identifying fixations in real time [Sibert00]: 1) A fixation starts when the eye position stays within 0.5° for more than 100 ms (the spatial and temporal thresholds filter out the jitter) 2) The fixation continues as long as the position stays within 1° 3) A failure to track the eye lasting less than 200 ms does not terminate the fixation EYE MOVEMENT DATA
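A sketch of how these three rules can be turned into code; the sample format and the candidate-window bookkeeping are assumptions, since [Sibert00] as cited here specifies only the thresholds:

```python
import math

# Thresholds from the algorithm above [Sibert00]
START_RADIUS_DEG = 0.5   # gaze must stay this close together to start a fixation
CONT_RADIUS_DEG = 1.0    # ...and this close to the center to continue it
START_TIME_MS = 100      # minimum dwell before a fixation is declared
DROPOUT_MS = 200         # tracking loss shorter than this is tolerated

class FixationDetector:
    """Real-time fixation identification with the thresholds above.
    feed() takes (t_ms, x, y) samples in degrees; x = y = None means
    a tracking failure. Returns the current fixation center or None."""

    def __init__(self):
        self.center = None      # center of the ongoing fixation, if any
        self.candidate = []     # recent samples possibly starting a fixation
        self.last_ok_ms = None  # timestamp of the last tracked sample

    def feed(self, t_ms, x, y):
        if x is None:  # tracking failure: tolerate short dropouts (rule 3)
            if self.last_ok_ms is not None and t_ms - self.last_ok_ms > DROPOUT_MS:
                self.center, self.candidate = None, []
            return self.center
        self.last_ok_ms = t_ms
        if self.center is not None:
            if math.dist(self.center, (x, y)) <= CONT_RADIUS_DEG:
                return self.center  # fixation continues (rule 2)
            self.center = None      # gaze left the 1-degree circle: fixation ends
        # Rule 1: keep only recent samples near the newest one (a simplification
        # of "stays within 0.5 degrees") and start a fixation after >100 ms
        self.candidate.append((t_ms, x, y))
        self.candidate = [s for s in self.candidate
                          if math.dist((s[1], s[2]), (x, y)) <= START_RADIUS_DEG]
        if t_ms - self.candidate[0][0] > START_TIME_MS:
            xs = [s[1] for s in self.candidate]
            ys = [s[2] for s in self.candidate]
            self.center = (sum(xs) / len(xs), sum(ys) / len(ys))
            self.candidate = []
        return self.center
```

Rule 1 is simplified here: the sketch keeps only the recent samples close to the newest one instead of computing the true dispersion of the sample window.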

  27. Visualization of scanpaths • circles are the fixations (the center is the point of gaze during the fixation) • the radius depicts the duration of the fixation • lines are the saccades between fixations EYE MOVEMENT DATA
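A minimal sketch of such a scanpath plot with matplotlib; the fixation data is made up for illustration:

```python
import matplotlib.pyplot as plt

# Hypothetical fixations: (x, y, duration_ms) in screen coordinates
fixations = [(120, 300, 250), (400, 280, 420), (380, 120, 180), (150, 140, 300)]

fig, ax = plt.subplots()
xs = [f[0] for f in fixations]
ys = [f[1] for f in fixations]
ax.plot(xs, ys, color="gray")  # saccades: lines between fixations
for x, y, dur in fixations:
    ax.add_patch(plt.Circle((x, y), dur / 10, fill=False))  # radius ~ duration
ax.set_xlim(0, 500)
ax.set_ylim(0, 400)
ax.invert_yaxis()              # screen coordinates: y grows downward
ax.set_aspect("equal")
plt.show()
```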

  28. From fixations to eye events (real-time) • The fixations are then turned into input tokens, "eye events", such as • start of fixation • continuation of fixation (repeated every X ms) • end of fixation • failure to locate eye position • entering monitored regions • The eye events are multiplexed into the event queue stream with other input events • The eye events can also carry information about the fixated screen object (using a nearest-neighbour approach) EYE MOVEMENT DATA [Sibert00]
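A sketch of what such tokens could look like in code; the type names and fields are illustrative assumptions, not the actual interface of [Sibert00]:

```python
from dataclasses import dataclass
from enum import Enum, auto
from queue import Queue
from typing import Optional

class EyeEventType(Enum):
    FIXATION_START = auto()
    FIXATION_CONTINUE = auto()  # re-issued every X ms while the fixation lasts
    FIXATION_END = auto()
    TRACKING_LOST = auto()      # failure to locate eye position
    REGION_ENTERED = auto()     # gaze entered a monitored screen region

@dataclass
class EyeEvent:
    kind: EyeEventType
    t_ms: int
    x: float = 0.0
    y: float = 0.0
    target: Optional[str] = None  # fixated screen object, nearest-neighbour match

# Eye events share one queue with the other input events
events: Queue = Queue()
events.put(EyeEvent(EyeEventType.FIXATION_START, 1000, 412, 280, target="ok_button"))
events.put(("KEY_PRESS", "space"))  # an ordinary input event, interleaved
```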

  29. From eye events to intentions (real-time) • Objective • recognizing the user's intentions by modelling eye gaze patterns in different situations • implementing a higher level programming interface for gaze-aware applications • Eye Interpretation Engine, whose objective is to identify such behaviors as [Edwards98] • the user is reading • just "looking around" • starting or stopping a search for an object (e.g. a button) • wanting to select an object EYE MOVEMENT DATA
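As a toy illustration of such pattern modelling: reading shows up as short fixations hopping rightward along a roughly constant line, with occasional return sweeps. The heuristic below is a hypothetical sketch, not the Eye Interpretation Engine of [Edwards98]:

```python
def looks_like_reading(fixations, min_run: int = 4) -> bool:
    """Hypothetical heuristic: report 'reading' when the gaze advances
    rightward along roughly one text line for `min_run` fixations.
    `fixations` is a list of (x, y) fixation centers in pixels."""
    run = 1
    for (x0, y0), (x1, y1) in zip(fixations, fixations[1:]):
        if 5 < x1 - x0 < 150 and abs(y1 - y0) < 20:  # small rightward hop, same line
            run += 1
            if run >= min_run:
                return True
        else:
            run = 1  # a return sweep or off-text saccade resets the run
    return False

print(looks_like_reading([(50, 100), (90, 102), (130, 99), (170, 101)]))  # True
```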

  30. Eye Movement Based Interaction • Jacob's taxonomy: approaches for using gaze input in the user interface, by the nature of the eye movement and of the response • A. Command based interfaces: unnatural (learned) eye movement, unnatural response • B. Noncommand interfaces: natural eye movement, unnatural response • C. Virtual environments: natural eye movement, natural response EYE MOVEMENT BASED INTERACTION

  31. Command Based Interaction • Gaze behavior is very different from that of other devices used for controlling a computer (hand, voice, feet) • intentional control of the eyes is difficult and stressful; the gaze is easily driven by external events • precise control of the eyes is difficult • the "Midas touch" problem • Most of the time the eyes are used for obtaining information, with no intent to initiate commands • Users are easily afraid of looking at the "eye active" objects or areas of the window EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED

  32. Command Based Interaction • Eye movements don't seem to be applicable as the sole (or main) input method • Supplementary input method – used alongside some other input method(s) • Some exceptions: • applications for the disabled • other exceptional circumstances in which the hands are engaged in some other activity EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED

  33. Applications for the disabled © TechnoWorks, Co. Ltd (http://www.t-works.co.jp/) EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED

  34. Eye Typing © Erica, Inc. http://www.ericainc.com EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED

  35. Eye Typing © LC Technologies, Inc. http://www.lctinc.com/doc/ecs.html EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED

  36. Environment control © LC Technologies, Inc. http://www.lctinc.com/doc/ecs.html EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED

  37. Eye controlled games © LC Technologies, Inc. http://www.lctinc.com/doc/ecs.html EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED

  38. Selection • The most obvious task is selection of an object • Experiments have shown that gaze selection is faster than mouse selection [Ware87, Jacob94, Jacob99] • Accuracy problem – target objects must be quite large EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED – SELECTION

  39. Selection – trigger? • The Midas touch problem must be resolved; different solutions: • dwell time • screen buttons • an eye movement (e.g. a wink) for selection • hardware buttons (e.g. the space bar, or a mouse button) as the trigger for performing the selection at the position of gaze • In several experiments [Ware87, Jacob98], dwell time was found to be the most convenient trigger EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED – SELECTION

  40. Selection – dwell time • Dwell time: how long should it be? • if too long, a sticky feeling (especially for expert users) • if too short, wrong selections • something more than 150 ms • Winks have also been used to implement selection for disabled users who cannot use additional control devices EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED – SELECTION
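A minimal sketch of a dwell-time trigger; the 500 ms value and the callback interface are assumptions for illustration:

```python
class DwellSelector:
    """Fires a selection once gaze has rested on one object long enough."""

    def __init__(self, dwell_ms: int = 500, on_select=print):
        self.dwell_ms = dwell_ms
        self.on_select = on_select
        self.target = None    # object currently under the gaze
        self.since_ms = 0     # when the gaze arrived on it
        self.fired = False    # selection already triggered for this dwell?

    def feed(self, t_ms: int, target):
        if target != self.target:  # gaze moved to another object: restart timing
            self.target, self.since_ms, self.fired = target, t_ms, False
        elif (target is not None and not self.fired
              and t_ms - self.since_ms >= self.dwell_ms):
            self.fired = True      # select only once per dwell
            self.on_select(target)

selector = DwellSelector(dwell_ms=500)
for t, obj in [(0, "A"), (200, "A"), (600, "A"), (700, "B")]:
    selector.feed(t, obj)          # prints "A" at t=600
```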

  41. Screen buttons – EyeCon • Midas touch and screen buttons • EyeCon – visual feedback of the imminent selection [Glenstrup95] EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED – SCREEN BUTTONS

  42. Screen buttons – Quick Glance • Quick Glance menu selection method • faster than the mouse • more errors than with the mouse • lack of a good canceling method (figure labels: selection mark, command name, selection area) [Ohno98] EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED – SCREEN BUTTONS

  43. Screen buttons – MAGIC pointing • Combined use of gaze and mouse • gaze is used to warp the cursor to the vicinity of the target object • a threshold circle: inside the circle the gaze does not affect the cursor • the fine adjustment is done with the mouse (figure: gaze position reported by the eye tracker; the target will be within the circle; the cursor is warped to the eye tracking position; eye tracking boundary with 99% confidence) [Zhai99] EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED – MAGIC POINTING
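A sketch of the warping logic described above, in its "liberal" form (see the next slide); the threshold radius and the cursor API are assumptions for illustration:

```python
import math

THRESHOLD_PX = 120  # inside this circle the gaze does not affect the cursor

class MagicPointer:
    """Liberal MAGIC pointing: warp the cursor toward every gaze position
    that falls outside the threshold circle around the current cursor;
    fine positioning is then left to the mouse."""

    def __init__(self, warp_cursor):
        self.warp_cursor = warp_cursor  # callback that moves the system cursor

    def on_gaze(self, gaze_xy, cursor_xy):
        if math.dist(gaze_xy, cursor_xy) > THRESHOLD_PX:
            # Warp to the estimated gaze point; the target should lie within
            # the circle, since tracking is only accurate to ~0.5-1 degree
            self.warp_cursor(gaze_xy)

pointer = MagicPointer(warp_cursor=lambda p: print("warp to", p))
pointer.on_gaze((800, 400), (100, 100))  # far from cursor -> warp
pointer.on_gaze((805, 402), (800, 400))  # inside the circle -> no warp
```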

  44. Screen buttons – MAGIC pointing • Two different approaches were experimented with • the cursor warps to every new object the user looks at ("liberal") • the cursor does not warp until the user actuates it ("conventional") • The conventional way was slightly slower than plain mouse selection, but the liberal way was faster than the mouse • Test participants' reactions were positive [Zhai99] EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED – MAGIC POINTING

  45. Menus, dragging • Gaze controlled pull-down menus • using dwell time did not work out very well; the time was either too long or too prone to errors • gaze + hardware button worked better • Dragging of objects (with gaze only, or with gaze + hardware button) • performed better than most of the other experiments • using gaze + a hardware button felt natural [Jacob98] EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED

  46. Scrolling, window control • Scrolling text in a window ("Here we have a text window. Usually we have to grab the mouse and click in the scrollbar when we want to read the text on the next page; now just look at the arrows.") • Listener window control • controlling the selection of the activated windows by eye [Jacob98] EYE MOVEMENT BASED INTERACTION – A. COMMAND BASED

  47. Non-command Interaction • Multimodal interfaces are heading towards task-oriented (and user-oriented) interfaces instead of command-oriented ones • In non-command interfaces the computer monitors the user's actions instead of waiting for the user's commands [Nielsen93] • In many cases the natural eye movement information could be valuable for that kind of application EYE MOVEMENT BASED INTERACTION – B. NON-COMMAND

  48. Little Prince, Ship Database • Little Prince application [Starker90] • an example of "IES media" (interest and emotion sensitive) • Ship database example [Jacob98, Sibert00] EYE MOVEMENT BASED INTERACTION – B. NON-COMMAND BASED

  49. iDict • Objective of iDict: to design and implement a gaze-assisted environment that aids the user in translating English text – or, in fact, in understanding the text [Hyrskykari00] EYE MOVEMENT BASED INTERACTION – B. NON-COMMAND BASED

