
Virtual Worlds: Immersion, Visual and Haptic Output

Presentation Transcript


  1. Virtual Worlds: Immersion, Visual and Haptic Output

  2. Visual Displays: The Human Eye
  • Range of visible light
  • Muscles in the eye focus the image on the retina: accommodation, measured in diopters, the inverse of the distance from the eye to the object. Try focusing on objects at different distances: what are the maximum and minimum, and how long does refocusing take? Accommodation is related to vergence, the inward or outward rotation of the eyes to fixate an object, at up to 10 degrees per second during smooth pursuit. The eyes also make saccades (rapid movements in response to peripheral motion) and vestibulo-ocular movements (staying fixed on an object as the head rotates)
  • The pupil controls the brightness entering the eye, adapting to bright light quickly and to dim light more slowly
  • IPD (interpupillary distance) varies between individuals and affects head-mounted displays and stereoscopic viewing
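The diopter and vergence relationships above can be sketched numerically; a minimal example, where the 6.3 cm IPD is an assumed average rather than a figure from the slides:

```python
import math

def accommodation_diopters(distance_m):
    """Accommodation demand in diopters: the inverse of the
    distance from the eye to the fixated object."""
    return 1.0 / distance_m

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Total angle between the two lines of sight for an object
    straight ahead, from the viewing-triangle geometry.
    The 0.063 m IPD is an assumed average; it varies per person."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

# Nearer objects demand more accommodation and more vergence.
print(accommodation_diopters(0.25))        # 4.0 D at 25 cm
print(accommodation_diopters(2.0))         # 0.5 D at 2 m
print(round(vergence_angle_deg(0.25), 2))  # much larger than at 2 m
```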

  3. Visual Display Depth Cues: Monocular
  • Linear perspective: spacing shrinks with distance; first identified and used in the 15th century by Italian painters
  • Interposition: occlusion signals depth when one object sits in front of another
  • Size: we perceive smaller objects as farther away, but this requires knowing something about the object
  • Texture gradients: if objects have a regular texture, a clearer and larger texture means closer
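The size cue above can be illustrated with a pinhole-projection sketch: given an object's known real size, its projected size implies its distance. The function name and the unit focal length are illustrative assumptions:

```python
def distance_from_size(real_size, image_size, focal=1.0):
    """Pinhole model: image_size = focal * real_size / distance,
    so a known real size plus a measured projected size yields
    the object's distance (the 'size' depth cue)."""
    return focal * real_size / image_size

# The same 2 m object, projected half as large, is twice as far away.
print(distance_from_size(2.0, 0.25))   # 8.0
print(distance_from_size(2.0, 0.125))  # 16.0
```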

  4. Perspective (from Slater et al)

  5. Perspective (from Slater et al)

  6. Texture Gradient (from Slater et al)

  7. Visual Display Depth Cues: Monocular (cont’d)
  • Brightness: brighter objects are perceived as closer
  • Shadows: the pattern and size of shadows give the viewer clues, but require estimating the location of the light source
  • Experience: knowing what (relative) size objects should be gives us depth cues
  • Height in the visual field: higher = farther
  • Atmospheric effects: farther objects appear hazier or cloudier

  8. Visual Display Depth Cues: Stereopsis
  • Stereoscopic cues: our two eyes see two slightly different images; the differences tell us about relative depth, and the cue is most relevant for close objects
  • Try viewing close objects with just one eye
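The "two slightly different images" idea is standard stereo triangulation; a minimal sketch, where the eye-like 6.3 cm baseline and the pixel focal length are assumed illustrative numbers:

```python
def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Stereo triangulation: depth = focal * baseline / disparity.
    A larger disparity between the two eye (or camera) images means
    a closer object, which is why stereopsis matters most up close."""
    return focal_px * baseline_m / disparity_px

# Assumed numbers: 6.3 cm baseline, 800 px focal length.
near = depth_from_disparity(0.063, 800, 100)  # large disparity
far = depth_from_disparity(0.063, 800, 5)     # small disparity
print(near, far)
```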

  9. Visual Display Depth Cues: Motion
  • When the viewer or an object moves, faster apparent motion means closer: motion parallax, e.g. looking out a car window
  • Viewer movement helps gauge depth because the viewer knows how far she moved
  • Relative motion of objects
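Motion parallax can be put in one line: for an object roughly abeam of a moving viewer, angular speed is viewer speed divided by distance. A sketch with assumed car-window numbers:

```python
def distance_from_parallax(viewer_speed_mps, angular_speed_rad_s):
    """Motion parallax: an object sweeping quickly across the visual
    field is close; one sweeping slowly is far. Assumes the object is
    roughly perpendicular to the direction of travel."""
    return viewer_speed_mps / angular_speed_rad_s

# At 20 m/s, a fence post sweeping 2 rad/s is near; a tree line
# sweeping 0.5 rad/s is farther away.
print(distance_from_parallax(20.0, 2.0))  # 10.0 m
print(distance_from_parallax(20.0, 0.5))  # 40.0 m
```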

  10. Visual Display Depth Cues: Physiological
  • Accommodation: focusing of the eye; muscles change the shape of the lens; effective for objects within about 2–3 m
  • Vergence (or convergence): rotation of the eyes to fixate an object; provides depth information to the brain

  11. Visual Displays: Visual Properties
  • Color, contrast, brightness and resolution: issues for monitors and particularly for head-mounted displays
  • Number of display channels: often two separate channels to display stereo (HMDs); with shutter glasses there is temporal multiplexing (two different images presented in quick succession, the shutter closing over the other eye each time); spectral multiplexing uses a different color for each eye with special glasses, generally one green and one red; there can be more channels for several viewers; raises health issues
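The spectral multiplexing described above can be sketched as an anaglyph-style channel merge; the left-to-red / right-to-green assignment is an illustrative assumption matching the one-red, one-green glasses mentioned:

```python
def spectral_multiplex(left_pixel, right_pixel):
    """Spectral multiplexing: encode each eye's view in a different
    color channel of one combined image, to be separated again by
    colored glasses. Pixels are (r, g, b) tuples; here the left eye's
    view goes to red and the right eye's to green (an assumed mapping)."""
    left_red = left_pixel[0]
    right_green = right_pixel[1]
    return (left_red, right_green, 0)

combined = spectral_multiplex((200, 10, 10), (10, 180, 10))
print(combined)  # (200, 180, 0)
```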

  12. Visual Properties (cont’d)
  • Focal distance and accommodation: images sit at a fixed focal plane (the distance from the eye to the screen, for example) but must appear as if they don't; this only matters within about 3 m, since we use accommodation little beyond that; objects can "jump" off the screen to appear closer than the screen itself, so conflicting depth cues need care
  • Opacity: is the real world seen fully, not at all, or partially (AR)? A key difference between HMDs and screen displays
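The conflicting-cue problem above is a mismatch between where the eyes focus (the fixed screen) and where they converge (the virtual object); a minimal sketch, with the diopter units and 3 m cutoff taken from the slides:

```python
def cue_conflict_diopters(screen_distance_m, virtual_distance_m):
    """Vergence-accommodation mismatch: the eyes accommodate to the
    fixed focal plane (the screen) but converge on the virtual
    object's apparent distance. Returns the mismatch in diopters and
    whether it matters, using the slide's ~3 m accommodation cutoff."""
    conflict = abs(1.0 / screen_distance_m - 1.0 / virtual_distance_m)
    matters = min(screen_distance_m, virtual_distance_m) < 3.0
    return conflict, matters

# An object "jumping" off a 2 m screen to appear 0.5 m away:
print(cue_conflict_diopters(2.0, 0.5))   # large conflict, and it matters
print(cue_conflict_diopters(5.0, 6.0))   # tiny conflict, beyond the cutoff
```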

  13. Visual Properties (cont’d)
  • Masking: the user's actual hand can mask virtual objects
  • Field of view: humans actually have about 200 degrees, with 120 degrees of binocular overlap; the virtual world can change this: CAVEs offer more, HMDs much less (tunnel vision)
  • Field of regard: the portion of surrounding space filled by the virtual world; very small for screens, 100% for HMDs; even CAVEs have one side open, and objects may be cut off at the edge of the screen (breaking the frame)
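The field-of-view differences between display kinds follow from simple geometry; a sketch, where the wall and monitor dimensions are assumed example numbers:

```python
import math

def horizontal_fov_deg(screen_width_m, eye_distance_m):
    """Horizontal field of view subtended by a flat display centered
    in front of the eye: 2 * atan((width / 2) / distance)."""
    return math.degrees(2.0 * math.atan((screen_width_m / 2.0) / eye_distance_m))

# Assumed numbers: a 3 m CAVE wall viewed from 1.5 m fills far more
# of the ~200 degree human field of view than a 0.5 m monitor at 0.6 m.
print(round(horizontal_fov_deg(3.0, 1.5), 1))  # 90.0 degrees
print(round(horizontal_fov_deg(0.5, 0.6), 1))  # much narrower
```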

  14. Visual Properties (cont’d)
  • Head position information: orientation is important to convey to the display; how should the screen change with movement? This differs for objects in front of, on, and behind the screen
  • Latency and lag: issues for updating the display
  • Frame rate: movies run at about 24 FPS; aim for better; below 10 FPS is very difficult
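The frame-rate figures above translate directly into a per-frame rendering budget; a small sketch:

```python
def frame_budget_ms(fps):
    """Time available to render one frame at a target frame rate."""
    return 1000.0 / fps

# Film's ~24 FPS leaves ~41.7 ms per frame; the slide's "below 10 FPS
# is very difficult" threshold corresponds to budgets over 100 ms.
for fps in (24, 10, 60):
    print(fps, "FPS ->", round(frame_budget_ms(fps), 1), "ms/frame")
```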

  15. Visual Displays: Logistic Properties
  • User mobility: cables, type of visual display
  • Tracking methods used in conjunction with the visual display: where they are placed and whether there is interference
  • Environment: size, lighting
  • Interaction with other senses (haptic, audio)

  16. Logistic Properties (cont’d)
  • Portability
  • Throughput: number of users
  • Encumbrance: weight and cables of HMDs; fit for users (HMDs, glasses, BOOM)
  • Safety: tripping, weight, headaches, eye fatigue, nausea (from lag, accommodation, conflicting cues)
  • Cost

  17. Visual Displays: Kinds
  • Fishtank (monitor)
  • Projection
  • Occlusive head-based
  • Non-occlusive head-based
  • Handheld

  18. Fishtank Visual Displays
  • For stereo, can be on screen or with the help of glasses: CrystalEyes glasses sync with the monitor and open and shut the lens over each eye
  • Tracking is important: from a video camera or trackers
  • Cheap
  • Disadvantages: limited field of regard; the user can't turn her head or move much; less immersion
  • Interfaces: mouse, trackers, keyboard, joystick, haptic

  19. Projection Visual Displays
  • Display size varies: full walls (CAVE), large monitors, also workbenches and table-tops
  • Allow some movement, possibly with glasses, and multiple viewers; more natural cues because the focal length is longer; better field of view; more immersion
  • Need floor space and powerful computers
  • Interfaces: generally no keyboard; number of screens, widgets, handheld devices

  20. Examples of Projection Displays: Duke's DiVE system; Reachin

  21. Occlusive Head-based Displays
  • Fit on the head and block out the real world
  • The world changes as the head moves
  • Examples: BOOM, HMDs, cockpits
  • Include a tracking system
  • FOV is generally more limited, but field of regard is 100%
  • Resolution can be lower
  • Lag causes problems; can be heavy; fatigue
  • More immersive
  • Interfaces more limited: the user can't see her hands

  22. Non-occlusive Head-based Displays
  • See-through: augmented reality
  • Tracking
  • Synchronization
  • Accurate occlusion is difficult
  • Lag is a problem

  23. Handheld Visual Displays
  • Screen held by the user
  • Needs tracking
  • Resolution is an issue
  • Mobility: outdoor use, GPS, phones
  • Not very immersive

  24. Haptic Displays: Terms
  • Tactile: cutaneous perception of force, temperature, etc.: the ability of the skin to detect mechanical, thermal and electrocutaneous stimuli (touching); different kinds of receptors with differing degrees of sensitivity
  • Kinesthesia: the capability to sense our limbs (both movement and position); note that movement and position are different
  • Kinesthetic: perception of movement, position and torque of the body (conducting an orchestra)
  • Force: muscular opposition that resists mechanical forces (pushing a door)

  25. Haptics: Terms (cont’d)
  • Haptic: the ability to actively experience the environment through exploration, typically with the hands
  • Somesthesis, touch, or haptics (used interchangeably): skin sensations plus the capability to sense our limbs (both movement and position)
  • Experiment: hold still; what haptic sensations are there?
  • What would loss of touch mean?

  26. Haptic Presentation Properties
  • Kinesthetic cues: joint angles, muscle length, tension
  • Tactile cues: heat, current
  • Grounding: world-grounded (absolute, tied to something external) or self-grounded (tied only to the body)
  • Number of display channels
  • Degrees of freedom (position and force)
  • Form: prop, pin display or glove (changing), pen

  27. Haptic Presentation Properties (cont’d)
  • Fidelity: range of forces and positions
  • Spatial resolution: bounded by the JND (just noticeable difference), which varies across regions of the body
  • Temporal resolution: frame rate, update rate
  • Latency or time lag (needs a high update rate)
  • Size
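The JND above is often modeled with Weber's law: a change is noticeable only if it exceeds a fixed fraction of the reference intensity. A sketch, where the 0.1 Weber fraction is an assumed illustrative value, not a figure from the slides:

```python
def is_noticeable(reference, stimulus, weber_fraction):
    """Weber's-law reading of the JND: a change registers only when
    it exceeds a body-region-dependent fraction of the reference
    intensity, which is why spatial resolution varies over the body."""
    return abs(stimulus - reference) >= weber_fraction * reference

# Assumed Weber fraction of 0.1 for some skin region:
print(is_noticeable(10.0, 10.5, 0.1))  # False: below the JND
print(is_noticeable(10.0, 12.0, 0.1))  # True: above the JND
```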

  28. Logistic Properties of Haptics
  • User mobility: generally the user must be static
  • Interference with tracking or other displays
  • Environment
  • Portability
  • Throughput: how much data can be transmitted
  • Encumbrance
  • Safety: a bigger issue for haptics, both for the machine and the user
  • Cost

  29. Tactile Haptic Displays
  • Vibrator actuators: gloves, props
  • Speakers: vibration
  • Pin actuators: surface displays
  • Generally for the fingertips
  • Give a sense of touch, but not necessarily of the full surface

  30. End-effector Displays
  • Mounted at the end of a robot arm: force displays
  • Need to sense movement and supply resistance: motors, hydraulics
  • Often both input (position and orientation, often via mechanical tracking) and output
  • Generally operate at a single point
  • Example: the Phantom
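The sense-movement-and-supply-resistance loop above is commonly implemented as penalty-based force rendering; a minimal single-point sketch, not the Phantom's actual control code:

```python
def contact_force(probe_height, surface_height, stiffness):
    """Penalty-style haptic rendering for a single-point end-effector:
    when the probe penetrates a virtual surface, push back with a
    spring force proportional to penetration depth (F = k * depth).
    The stiffness value below is an assumed illustrative constant."""
    penetration = surface_height - probe_height
    if penetration <= 0.0:
        return 0.0  # probe in free space: supply no resistance
    return stiffness * penetration

print(contact_force(0.5, 0.0, 800.0))    # above the surface: 0.0 N
print(contact_force(-0.01, 0.0, 800.0))  # 1 cm in: ~8 N of resistance
```

In a real device this runs inside the high-update-rate loop noted on the properties slides, reading the tracked probe position and commanding the motors each cycle.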

  31. Phantom from Sensable

  32. Other Haptic Devices
  • Robotically operated shape displays: robots place physical objects in front of the user; authentic, but raises issues of safety, environment, and interaction with other senses

  33. Applications for Haptic Devices
  • NASA
  • Modeling and design
  • Entertainment, games, larger environments
  • Training: surgery, flying
  • Remote medicine, medicine
  • Rehabilitation; impaired or handicapped individuals

  34. Sources
  • Understanding Virtual Reality, Sherman & Craig, Morgan Kaufmann, 2003
  • Computer Graphics and Virtual Environments, Slater et al.
  • "The Importance of the Sense of Touch in Virtual Environments," Robles-de-la-Torre, IEEE Multimedia, 2006
  • "Individualized Interactive Home-Based Haptic Telerehabilitation," Jadhav et al., IEEE Multimedia, 2006
  • "Testing Usability of Multimodal Applications with Visually Impaired Children," Raisamo et al., IEEE Multimedia, 2006
  • "Haptic Media Synchronization for Remote Surgery through Simulation," Wongwirat and Ohara, IEEE Multimedia, 2006
