
Presentation Transcript


  1. The Schepens Eye Research Institute, An Affiliate of Harvard Medical School. The Effect of Edge Filtering on Vision Multiplexing. Henry L. Apfelbaum, Doris H. Apfelbaum, Russell L. Woods, Eli Peli. SID 2005, May 23, 2005, 41-2, Boston, MA

  2. Motivation • Our lab is developing devices to help people with low vision

  3. Motivation • Our lab is developing devices to help people with low vision • Central field loss (e.g., macular degeneration)

  4. Motivation • Our lab is developing devices to help people with low vision • Central field loss (e.g., macular degeneration) • Peripheral vision loss (“tunnel vision”)

  5. Tunnel vision

  6. Motivation • Our lab is developing devices to help people with low vision • Central field loss (e.g., macular degeneration) • Peripheral vision loss (“tunnel vision”) • Our devices employ vision multiplexing

  7. Motivation • Our lab is developing devices to help people with low vision • Central field loss (e.g., macular degeneration) • Peripheral vision loss (“tunnel vision”) • Our devices employ vision multiplexing • Two different views presented to one or both eyes simultaneously

  8. Vision multiplexing: HUD

  9. Motivation • Our lab is developing devices to help people with low vision • Central field loss (e.g., macular degeneration) • Peripheral vision loss (“tunnel vision”) • Our devices employ vision multiplexing • Two different views presented to one or both eyes simultaneously • For tunnel vision, we have spectacles with a see-through minifying display

  10. See-through minifying HMD

  11. See-through minifying HMD: Camera

  12. See-through minifying HMD: Display, Camera

  13. See-through minifying HMD: Display, Camera, Beam-splitter

  14. Motivation • Our lab is developing devices to help people with low vision • Central field loss (e.g., macular degeneration) • Peripheral vision loss (“tunnel vision”) • Our devices employ vision multiplexing • Two different views presented to one or both eyes simultaneously • For tunnel vision, we have spectacles with a see-through minifying display • We edge-filter the display to emphasize detail needed for orientation and navigation
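
A minimal software sketch of the idea above, assuming OpenCV and a digital frame source (the actual device combines the views optically with a beam-splitter and uses a DigiVision edge filter, not the Canny stand-in used here; the minification factor and thresholds are illustrative assumptions):

```python
import cv2
import numpy as np

def multiplex_frame(scene_bgr, minification=4, edge_low=50, edge_high=150):
    """Overlay a minified, edge-filtered copy of the scene on the scene itself.

    Rough software analogue of the see-through minifying HMD: the wide camera
    view is minified, reduced to edges, and superimposed on the center of the
    natural (see-through) view. Parameter values are illustrative, not the
    device's actual settings.
    """
    h, w = scene_bgr.shape[:2]
    small = cv2.resize(scene_bgr, (w // minification, h // minification),
                       interpolation=cv2.INTER_AREA)
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, edge_low, edge_high)      # white edges on black

    out = scene_bgr.copy()
    sh, sw = edges.shape
    y0, x0 = (h - sh) // 2, (w - sw) // 2             # center the inset
    roi = out[y0:y0 + sh, x0:x0 + sw]
    roi[edges > 0] = (255, 255, 255)                  # draw edge pixels as white
    return out

# Usage sketch: read one video frame and show the multiplexed result.
# cap = cv2.VideoCapture("walkthrough.mp4")
# ok, frame = cap.read()
# if ok:
#     cv2.imshow("vision multiplexing (sketch)", multiplex_frame(frame))
#     cv2.waitKey(0)
```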

  15. See-through HMD

  16. Motivation • Can the brain handle it?

  17. Neisser & Becklen experiment (1975)

  18. Count the slap attempts

  19. Did you see her?

  20. Motivation • Can the brain handle it?

  21. Motivation • Can the brain handle it? • Inattentional blindness

  22. Motivation • Can the brain handle it? • Inattentional blindness: • Failure to notice significant events in one scene while attention is focused on another scene

  23. Motivation • Can the brain handle it? • Inattentional blindness: • Failure to notice significant events in one scene while attention is focused on another scene • Hypothesis: Edge filtering can mitigate inattentional blindness

  24. Our experiment • We reproduced the Neisser and Becklen experiment, introducing edge filtering to see if unexpected events would be noticed more readily

  25. Our experiment • We reproduced the Neisser and Becklen experiment, introducing edge filtering to see if unexpected events would be noticed more readily • 4 attended/unattended scene filtering combinations:

  26. Full video over full video

  27. Filtered ballgame over full handgame: Bipolar edges

  28. Filtered ballgame over full handgame: White edges

  29. DigiVision edge filter output
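
The bipolar- and white-edge renderings above were produced by the DigiVision edge filter used in the study. A minimal software approximation, assuming OpenCV and a Laplacian-of-Gaussian stand-in for that filter (blur, gain, and threshold values are illustrative assumptions):

```python
import cv2
import numpy as np

def edge_renderings(frame_bgr, blur_sigma=1.5, gain=4.0, white_thresh=24):
    """Approximate bipolar-edge and white-edge renderings of a video frame.

    The study used a DigiVision edge filter; this stand-in applies a Laplacian
    to a Gaussian-blurred frame. Parameters are illustrative assumptions.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    blurred = cv2.GaussianBlur(gray, (0, 0), blur_sigma)
    lap = cv2.Laplacian(blurred, cv2.CV_32F)          # signed edge response

    # Bipolar edges: positive and negative responses shown around mid-gray.
    bipolar = np.clip(128.0 + gain * lap, 0, 255).astype(np.uint8)

    # White edges: response magnitude thresholded to white on black.
    white = np.where(np.abs(gain * lap) > white_thresh, 255, 0).astype(np.uint8)
    return bipolar, white
```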

  30. Filtered handgame over full ballgame

  31. Both games edge-filtered

  32. Our experiment • We reproduced the Neisser and Becklen experiment, introducing edge filtering to see if unexpected events would be noticed more readily • 4 attended/unattended scene filtering combinations

  33. Our experiment • We reproduced the Neisser and Becklen experiment, introducing edge filtering to see if unexpected events would be noticed more readily • 4 attended/unattended scene filtering combinations • 6 unexpected event scenes:

  34. Unexpected events: Umbrella woman • Juggler • Lost ball • Handshake • Ball toss • Choose-up

  35. Trials • 36 subjects • 4 practice trials • 8 scored trials • Each game attended in half of the trials • 6 showed the 6 unexpected events • 2 had no unexpected event • All 4 filtering treatments used with each game • Edge/edge combination used for the trials without unexpected events • Treatment/unexpected event pairings and presentation order were balanced across subjects
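
The slides state that treatment/unexpected-event pairings and presentation order were balanced across subjects, without giving the scheme. A minimal sketch of one generic rotation-based assignment (an assumption for illustration, not the study's actual counterbalancing design):

```python
from itertools import cycle

TREATMENTS = ["full/full", "edges/full", "full/edges", "edges/edges"]
EVENT_SCENES = ["umbrella woman", "juggler", "lost ball",
                "handshake", "ball toss", "choose-up"]

def subject_schedule(subject_index):
    """One generic rotation (not necessarily the study's exact design):
    each subject gets 6 event trials with treatments rotated by subject,
    plus 2 no-event trials using the edge/edge combination."""
    shift = subject_index % 4
    rotated = TREATMENTS[shift:] + TREATMENTS[:shift]
    treatment_cycle = cycle(rotated)
    trials = [(scene, next(treatment_cycle)) for scene in EVENT_SCENES]
    trials += [("no event", "edges/edges"), ("no event", "edges/edges")]
    return trials

# Example: print the 8 scored trials for subject 3.
for scene, treatment in subject_schedule(3):
    print(f"{scene:15s} -> {treatment}")
```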

  36. Trials (cont’d) • Subject clicked a mouse at each ball toss or hand-slap attempt in the attended game

  37. Trials (cont’d) • Subject clicked a mouse at each ball toss or hand-slap attempt in the attended game • Questions asked after each trial:

  38. Trials (cont’d) • Subject clicked a mouse at each ball toss or hand-slap attempt in the attended game • Questions asked after each trial: • How difficult was that? • Any particularly hard parts?

  39. Trials (cont’d) • Subject clicked a mouse at each ball toss or hand-slap attempt in the attended game • Questions asked after each trial: • How difficult was that? • Any particularly hard parts? • Anything in the background that distracted you or interfered with the task?

  40. Trials (cont’d) • Subject clicked a mouse at each ball toss or hand-slap attempt in the attended game • Questions asked after each trial: • How difficult was that? • Any particularly hard parts? • Anything in the background that distracted you or interfered with the task? • We scored • Number of unexpected events detected

  41. Trials (cont’d) • Subject clicked a mouse at each ball toss or hand-slap attempt in the attended game • Questions asked after each trial: • How difficult was that? • Any particularly hard parts? • Anything in the background that distracted you or interfered with the task? • We scored • Number of unexpected events detected • Hit rate (mouse click close to attended event) • Average response time to attended event “hits”
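
A hit was a mouse click close in time to an attended-game event. A minimal scoring sketch, assuming timestamps in seconds and a 1 s matching window (both assumptions; the study's actual criterion is not given in the slides):

```python
def score_trial(event_times, click_times, window=1.0):
    """Count clicks landing within `window` seconds after an attended-game
    event (a "hit") and average the hit response times.
    The 1 s window and seconds units are illustrative assumptions."""
    hits, latencies = 0, []
    remaining = sorted(click_times)
    for t_event in sorted(event_times):
        # First unused click that falls within the window after this event.
        match = next((c for c in remaining if 0.0 <= c - t_event <= window), None)
        if match is not None:
            hits += 1
            latencies.append(match - t_event)
            remaining.remove(match)
    hit_rate = hits / len(event_times) if event_times else 0.0
    mean_rt = sum(latencies) / len(latencies) if latencies else None
    return hit_rate, mean_rt

# Example: three ball tosses, subject responds to the first two.
print(score_trial([2.0, 5.5, 9.0], [2.45, 6.0]))   # -> (0.666..., 0.475)
```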

  42. Results: Unexpected event detections • 57% of the 216 unexpected events presented were detected

  43. Results: Unexpected event detections • 57% of the 216 unexpected events presented were detected • Only 2 subjects detected all 6 events shown • One subject detected none

  44. Results: Unexpected event detections Edge filtering was not significant (p = 0.67)

  45. Results: Attended task accuracy • Hit rates were high • 95.2% ballgame hit accuracy • 98.2% handgame hit accuracy

  46. Results: Attended task accuracy • Hit rates were high • 95.2% ballgame hit accuracy • 98.2% handgame hit accuracy • No significant effect of cartooning or unexpected events

  47. Results: Attended task accuracy • Hit rates were high • 95.2% ballgame hit accuracy • 98.2% handgame hit accuracy • No significant effect of cartooning or unexpected events • Hit response times

  48. Results: Attended task accuracy • Hit rates were high • 95.2% ballgame hit accuracy • 98.2% handgame hit accuracy • No significant effect of cartooning or unexpected events • Hit response times • Event scene had no significant effect (p > 0.65)

  49. Results: Attended task accuracy • Hit rates were high • 95.2% ballgame hit accuracy • 98.2% handgame hit accuracy • No significant effect of cartooning or unexpected events • Hit response times • Event scene had no significant effect (p > 0.65) • Filtering the unattended task had no significant effect (p = 0.37)

  50. Results: Attended task accuracy • Hit rates were high • 95.2% ballgame hit accuracy • 98.2% handgame hit accuracy • No significant effect of cartooning or unexpected events • Hit response times • Event scene had no significant effect (p > 0.65) • Filtering the unattended task had no significant effect (p = 0.37) • Filtering the attended task had a significant but small impact (527 vs 498 ms, p < 0.001)
