
3D USER INTERFACE OUTPUT HARDWARE


Presentation Transcript


  1. 3D USER INTERFACE OUTPUT HARDWARE Andrew Flangas

  2. OVERVIEW • Visual Displays • Auditory Displays • Haptic Displays • Levels of Fidelity • Choosing Devices • Case Studies

  3. VISUAL DISPLAYS • Characteristics of Visual Displays • Field of regard and field of view • Spatial resolution • Screen geometry • Light transfer mechanism • Refresh rate • Ergonomics • Effect on depth cues
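
To make the field-of-view and spatial-resolution characteristics above concrete, here is a small illustrative Python sketch (with made-up screen dimensions) that computes the horizontal FOV subtended by a flat screen and the resulting average angular resolution in pixels per degree.

```python
import math

def horizontal_fov_deg(screen_width_m, viewing_distance_m):
    """Horizontal field of view subtended by a flat screen centered on the viewer."""
    return math.degrees(2 * math.atan(screen_width_m / (2 * viewing_distance_m)))

def pixels_per_degree(horizontal_pixels, fov_deg):
    """Average angular resolution across the horizontal field of view."""
    return horizontal_pixels / fov_deg

# Illustrative numbers: a 1.2 m wide, 3840-pixel-wide screen viewed from 1.5 m away.
fov = horizontal_fov_deg(1.2, 1.5)                       # about 43.6 degrees
print(f"FOV: {fov:.1f} deg, {pixels_per_degree(3840, fov):.1f} px/deg")
```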

  4. VISUAL DISPLAY DEVICE TYPES • Single-screen displays • Surround-screen and multiscreen displays • Workbenches and tabletop displays • Head-worn displays • Arbitrary surface displays • Autostereoscopic displays

  5. SINGLE-SCREEN DISPLAYS • Commonly used in many different kinds of 3D applications, such as video games, modeling, and scientific and information visualization. • Examples of these displays are conventional monitors, high-definition and higher-resolution televisions, and front- or rear-projection displays using a wall or screen material as the projection surface. • Stereoscopic glasses are needed for stereoscopic viewing; the left- and right-eye images are separated by temporal multiplexing, polarization multiplexing, or spectral multiplexing. • Active (temporally multiplexed) stereo is considered to achieve the highest stereo quality. • Not very immersive.
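
As a rough sketch of how temporal (active) stereo is driven in practice, the following PyOpenGL/GLUT fragment renders the scene twice per frame into the left and right back buffers of a quad-buffered context. This assumes a GPU, driver, and display that actually expose quad-buffered stereo; draw_scene() and the eye separation are placeholders, not anything from the presentation.

```python
# Frame-sequential ("active") stereo sketch using quad-buffered OpenGL via
# PyOpenGL/GLUT. Requires hardware and a driver that expose GLUT_STEREO.
# Real applications also use a per-eye (off-axis) projection, omitted here.
from OpenGL.GL import (glDrawBuffer, glClear, glLoadIdentity, glTranslatef,
                       GL_BACK_LEFT, GL_BACK_RIGHT,
                       GL_COLOR_BUFFER_BIT, GL_DEPTH_BUFFER_BIT)
from OpenGL.GLUT import (glutInit, glutInitDisplayMode, glutCreateWindow,
                         glutDisplayFunc, glutSwapBuffers, glutMainLoop,
                         GLUT_DOUBLE, GLUT_RGB, GLUT_DEPTH, GLUT_STEREO)

EYE_SEPARATION = 0.065  # interocular distance in scene units (assumed)

def draw_scene():
    pass  # placeholder: draw the 3D scene here

def display():
    # Render the scene once per eye into the matching stereo back buffer.
    for buffer, camera_shift in ((GL_BACK_LEFT, -EYE_SEPARATION / 2),
                                 (GL_BACK_RIGHT, +EYE_SEPARATION / 2)):
        glDrawBuffer(buffer)
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
        glLoadIdentity()
        glTranslatef(-camera_shift, 0.0, 0.0)  # move the camera sideways for this eye
        draw_scene()
    glutSwapBuffers()  # shutter glasses sync to the alternating left/right frames

glutInit()
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO)
glutCreateWindow(b"Active stereo sketch")
glutDisplayFunc(display)
glutMainLoop()
```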

  6. Active stereo glasses for viewing stereoscopic images. A monitor equipped with stereo glasses.

  7. SURROUND-SCREEN DISPLAYS • Visual output device that increases the FOR for a user or group of users. • Incorporates either a set of display screens, a large curved display screen, or some combination of curved and planar screens that makes use of one or more light projection devices. • Provide high spatial resolution and a large FOR. • Have a large FOV, which allows users to make use of their peripheral vision. • They are expensive and often require a large amount of space. • Users can have difficulty seeing objects in stereo under certain conditions, for example when a virtual object that should appear in front of the screen is occluded by a physical object such as the user's own hand.
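
Surround-screen systems with head tracking typically render each wall with an off-axis (asymmetric) frustum computed from the tracked eye position and the wall's corners. The sketch below follows the widely used generalized perspective projection formulation; the corner and eye values are illustrative, and the accompanying view rotation/translation into the screen's basis is omitted.

```python
# Off-axis (asymmetric) frustum for one wall of a surround-screen display.
# pa, pb, pc are the wall's lower-left, lower-right, and upper-left corners
# and eye is the tracked eye position, all in the same (tracker) coordinate
# frame. The near/far planes are passed to the projection call separately.
import numpy as np

def off_axis_frustum(pa, pb, pc, eye, near):
    vr = pb - pa
    vr /= np.linalg.norm(vr)                          # screen-right axis
    vu = pc - pa
    vu /= np.linalg.norm(vu)                          # screen-up axis
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                          # screen normal, toward the eye

    va, vb, vc = pa - eye, pb - eye, pc - eye         # eye-to-corner vectors
    d = -np.dot(va, vn)                               # perpendicular eye-screen distance

    # Frustum extents on the near plane (the l, r, b, t of a glFrustum-style matrix).
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top

# Example: a 3 m x 2.25 m front wall at z = -1.5, eye tracked 0.4 m right of center.
pa = np.array([-1.5, 0.0, -1.5])
pb = np.array([ 1.5, 0.0, -1.5])
pc = np.array([-1.5, 2.25, -1.5])
print(off_axis_frustum(pa, pb, pc, eye=np.array([0.4, 1.2, 0.0]), near=0.1))
```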

  8. An example of a surround-screen VR system; the image shows a four-sided device. Outerra Anteworld: a 180-degree projection with Immersive Display.

  9. A surround-screen display using a collection of display panels. An example of a curved, rear-projected surround-screen display system.

  10. WORKBENCHES AND TABLETOP DISPLAYS • Used to simulate and augment interaction that takes place on desks, tables, and workbenches. • These displays provide relatively high spatial resolution and make for an intuitive display for certain types of applications. • The device can accommodate multiple users, but with the same viewpoint constraints as with a large surround-screen or single-screen display. • Users have limited mobility when interacting with a workbench because the display is stationary.

  11. Workbench-style displays. A personalized workbench display that supports tracked 3D stereo and interaction.

  12. HEAD-WORN DISPLAYS • These devices are attached to the user’s head and are typically called head-mounted displays (HMDs) or head-worn displays (HWDs). • Today’s HWDs achieve high-definition quality or better resolution and can weigh less than one pound. • Tracked HWDs allow for all the monoscopic and motion parallax depth cues. • Stereoscopy is produced differently with HWDs than with projection-based displays and monitors. • The user can have complete physical visual immersion (i.e., a 360-degree FOR). • Are more portable and often less expensive compared to surround-screen and single-screen displays. • Vergence-accommodation cue conflicts can occur.
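
Because stereoscopy on an HWD comes from rendering a separate image per eye, a minimal sketch of the per-eye camera placement looks like the following; the names and the fixed IPD are assumptions, and real HWD runtimes (e.g., OpenXR) report per-eye poses and projections directly.

```python
# Per-eye camera placement for a head-worn display: each eye sits half the
# interpupillary distance (IPD) from the tracked head position along the
# head's local right axis. Illustrative sketch only.
import numpy as np

def eye_positions(head_position, head_rotation, ipd=0.063):
    """head_rotation: 3x3 matrix whose first column is the head's right axis."""
    right = head_rotation[:, 0]
    left_eye = head_position - right * (ipd / 2.0)
    right_eye = head_position + right * (ipd / 2.0)
    return left_eye, right_eye

head_pos = np.array([0.0, 1.7, 0.0])   # standing viewer, eyes ~1.7 m above the floor
head_rot = np.eye(3)                   # identity: looking straight ahead
print(eye_positions(head_pos, head_rot))
```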

  13. A head-worn display that uses a cell phone as the display engine. A head-worn display for virtual reality.

  14. An arm-mounted display called the Binocular Omni-Orientation Monitor. A head-mounted display that supports both VR and AR applications by using an attachment that makes it project onto an integral opaque or see-through surface, instead of onto surfaces in the environment.

  15. ARBITRARY SURFACE DISPLAYS • Projects imagery directly on arbitrary surfaces of any shape or size. • Known as projection mapping or spatial augmented reality. • Projecting onto 3D objects supports appropriate depth, since the images are directly placed onto the 3D surface. • If images need to appear in front of or behind the display surface, view-dependent stereoscopic projection is required. • Support for multiple head-tracked viewers is also possible.

  16. IllumiRoom, an example of projection mapping to create an extended display in a living room; the projector-camera pair correctly projects onto the various surfaces in the room. The Virtual Showcase, an augmented surface display that projects virtual information onto a physical 3D artifact, presenting a seamless integration of the physical object and virtual content.

  17. AUTOSTEREOSCOPIC DISPLAYS • Generate 3D imagery without the need for special shutters or polarized glasses. • Use lenticular, volumetric, or holographic display technology. • Other techniques such as compressive light fields, diffractive-optical elements, integral imaging, parallax illumination, and barrier grids exist as well. • There has been little 3D UI research with lenticular and holographic displays. • The same interfaces that work for rear-projected displays should also work for lenticular displays.
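
A lenticular panel shows different rendered views to different viewing zones by mapping pixel columns (or subpixels) to views under the lens array. The simplified sketch below interleaves N pre-rendered views column by column; real panels use slanted lenticules and calibrated subpixel maps, so this is only a conceptual illustration.

```python
# Simplified column interleaving of N rendered views for a lenticular panel:
# adjacent pixel columns are assigned to successive views so the lens array
# sends each view toward a different viewing zone.
import numpy as np

def interleave_views(views):
    """views: list of N images with identical shape (height, width, 3)."""
    n = len(views)
    out = np.empty_like(views[0])
    for col in range(out.shape[1]):
        out[:, col] = views[col % n][:, col]   # this column comes from view (col mod N)
    return out

# Example with 8 dummy views of a 100 x 160 image.
views = [np.full((100, 160, 3), i * 32, dtype=np.uint8) for i in range(8)]
panel_image = interleave_views(views)
```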

  18. A lenticular display. A volumetric display system that uses the swept-volume technique; on the left is the display device and on the right a volumetric image.

  19. AUDITORY DISPLAYS • 3D Sound Generation • Sound System Configurations • Audio in 3D interfaces

  20. 3D SOUND GENERATION • 3D sound sampling and synthesis: records the sound that the listener will hear in the 3D application by taking samples from a real environment, for example binaural audio recordings. • Auralization: the process of rendering the sound field of a source in space so as to simulate the binaural listening experience through physical and mathematical models; it recreates a listening environment by determining the reflection patterns of sound waves coming from a sound source as they move through the environment.
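
As a very crude illustration of the binaural idea, the sketch below approximates a source's interaural time difference with Woodworth's formula and applies a fixed level difference to a mono signal. Proper auralization instead convolves the signal with measured HRTFs and models room reflections, so treat the numbers here as placeholders.

```python
# Crude binaural cue sketch: interaural time difference (ITD) via Woodworth's
# formula plus a fixed interaural level difference. Valid only for azimuths
# within +/-90 degrees; head radius, attenuation, and the 440 Hz test tone
# are illustrative values.
import numpy as np

def spatialize(mono, azimuth_deg, sample_rate=44100,
               head_radius=0.0875, speed_of_sound=343.0):
    theta = np.radians(azimuth_deg)                 # 0 = straight ahead, positive = right
    itd = (head_radius / speed_of_sound) * (abs(theta) + np.sin(abs(theta)))
    delay = int(round(itd * sample_rate))           # far-ear delay in samples

    near_ear = mono
    far_ear = np.concatenate([np.zeros(delay), mono])[:len(mono)]  # delayed copy
    far_ear = far_ear * 0.7                         # rough interaural level difference

    # For a source on the right, the left ear is the far ear (and vice versa).
    left, right = (far_ear, near_ear) if azimuth_deg > 0 else (near_ear, far_ear)
    return np.stack([left, right], axis=1)          # (n_samples, 2) stereo buffer

tone = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)  # 1 s test tone
stereo = spatialize(tone, azimuth_deg=45)
```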

  21. SOUND SYSTEM CONFIGURATIONS • Headphones • Stereophonic headphones present different information to each ear. • A problem is inside-the-head localization (IHL): the lack of externalization of a sound source, which results in the false impression that a sound is emanating from inside the user’s head. • External Speakers • Placed at strategic locations in the environment. • The major challenge in using these for displaying 3D sound is avoiding crosstalk and making sure the listener’s left and right ears receive the appropriate signals. • Transaural audio allows for the presentation of the left and right binaural audio signals to the corresponding left and right ears using external speakers.

  22. AUDIO IN 3D INTERFACES • Localization • The generation of 3D spatial sound creates an important audio depth cue, letting users apply their localization skills and giving them an aural sense of the 3D environment. • Sonification • The process of turning information into sounds. • Ambient Effects • Provide a sense of realism in a 3D application (e.g., hearing birds chirping and the wind whistling through the trees). • Sensory Substitution • The process of substituting sound for another sensory modality, such as touch. • Annotation and Help • Recorded or synthesized speech can play a role as an annotation tool in collaborative applications and as a means to provide help to users.
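
A minimal sonification sketch, mapping a data series to pitch with one short sine tone per value; the frequency range and tone length are arbitrary choices for illustration.

```python
# Minimal sonification sketch: map a data series onto pitch, one short sine
# tone per value.
import numpy as np

def sonify(values, f_min=220.0, f_max=880.0, tone_s=0.15, sample_rate=44100):
    lo, hi = min(values), max(values)
    t = np.arange(int(tone_s * sample_rate)) / sample_rate
    tones = []
    for v in values:
        norm = (v - lo) / (hi - lo) if hi > lo else 0.5   # scale value to [0, 1]
        freq = f_min + norm * (f_max - f_min)             # higher value -> higher pitch
        tones.append(np.sin(2 * np.pi * freq * t))
    return np.concatenate(tones)

audio = sonify([3.1, 4.8, 2.0, 7.5, 6.2])   # e.g., a short series of sensor readings
```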

  23. HAPTIC DISPLAYS • Characteristics of Haptic Displays • Perceptual dimensions • Resolution • Ergonomics

  24. HAPTIC DISPLAY TYPES • Ground-referenced • Body-referenced • Tactile • In-air • Combination • Passive

  25. GROUND-REFERENCED HAPTIC DEVICES • Creates a physical link between the user and a ground point in the environment. • This can be in the form of a desktop, wall, ceiling, or floor. • Different types of these displays include force-reflecting joysticks, pen-based force-feedback devices, stringed devices, motion platforms, and large articulated robotic arms. • These displays typically use electric, pneumatic, or hydraulic actuator technology. • They can provide a fairly large range of motion for the user.
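
A classic way such force-feedback devices display contact is penalty-based rendering: when the device's proxy point penetrates a virtual surface, a spring-damper force pushes it back out, updated at roughly 1 kHz. The sketch below illustrates this for a flat virtual wall; the stiffness/damping values and the device I/O calls are hypothetical placeholders, not a real API.

```python
# Penalty-based force rendering for a flat virtual wall: when the device's
# proxy point penetrates the wall, command a spring-damper force that pushes
# it back out.
STIFFNESS = 800.0   # N/m (assumed)
DAMPING = 2.0       # N*s/m (assumed)
WALL_Y = 0.0        # wall surface at y = 0; free space is above it

def wall_force(y, y_velocity):
    penetration = WALL_Y - y
    if penetration <= 0.0:
        return 0.0                                          # no contact, no force
    return STIFFNESS * penetration - DAMPING * y_velocity   # restoring force along +y

# Haptic servo loop sketch (hypothetical device API, typically run at ~1 kHz):
# while True:
#     y, vy = device.read_position_and_velocity()
#     device.send_force(0.0, wall_force(y, vy), 0.0)
```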

  26. A ground-referenced force-feedback device.

  27. BODY-REFERENCED HAPTIC DEVICES • Places the haptic device on some part of the user’s body, so it is “grounded” to the user. • They provide the user with much more freedom of motion in the surrounding environment than do ground-referenced displays. • The user has to bear the entire weight of the device. • Can be further classified by the body locations that are actuated by the devices. • They require setup time to be put on the user and calibrated for a specific body size.

  28. Body-referenced force-feedback devices.

  29. TACTILE DISPLAYS • Present haptic information by stimulating the user’s tactile sense. • Skin is highly sensitive, so significantly less energy is required to produce strong and recognizable tactile sensations. • Are generally smaller and lighter than force displays. • All tactile displays are based on producing tactile sensations by applying physical stimuli to human skin. • Can be categorized by the physical principles of the stimulation. • They include mechanical displacement-based displays, vibrotactile displays, electrocutaneous displays, electrovibration displays, surface friction displays, and thermoelectric displays.
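
As an illustration of how a vibrotactile stimulus is typically specified, the sketch below builds a short burst of a 250 Hz carrier (near the skin's peak vibration sensitivity) shaped by ramped onsets and offsets; how the buffer reaches an actual tactor (audio output, PWM, or a vendor SDK) is device-specific and not shown.

```python
# Vibrotactile drive waveform sketch: a short burst of a 250 Hz carrier shaped
# by 10 ms ramps.
import numpy as np

def vibration_burst(duration_s=0.1, carrier_hz=250.0, sample_rate=8000, ramp_s=0.01):
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    envelope = np.minimum(1.0, t / ramp_s) * np.minimum(1.0, (duration_s - t) / ramp_s)
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

buzz = vibration_burst()   # samples in [-1, 1], ready to send to a tactor driver
```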

  30. inFORM is a mechanical displacement display that is able to produce tactile haptic shapes. A tactile device that puts vibrating actuators on the fingertips and the palm of the hand.

  31. IN-AIR HAPTICS • Most of these devices are based on using air to stimulate human skin. • Are not currently able to provide the highly detailed and responsive tactile sensations that are possible with electromechanical devices. • They offer new interactive modalities when designing 3D UIs in both desktop and room-scale settings.

  32. Vortex-based in-air tactile display.

  33. COMBINATION DEVICES AND PASSIVE HAPTICS • Combination Devices • Combining different types of feedback can create more believable and recognizable haptic sensations. • Passive Haptics • Another class of haptic interfaces is based on using passive physical representations of virtual objects to communicate their physical qualities. • They convey a constant force or tactile sensation based on the geometry and texture of the particular object. • Are very specific in that they are solid physical objects that directly mimic the virtual objects they are used to represent.

  34. A haptic device that combines ground-referenced and body-referenced force-feedback.

  35. HAPTIC DISPLAYS IN 3D INTERFACES • Presenting haptic feedback to the user in a 3D UI is a powerful tool for developing more efficient, effective, and immersive experiences. • Can help improve the realism of a 3D UI, which is important in applications such as entertainment and gaming. • Is currently difficult to do. • One use of haptic feedback is to provide feedback when grabbing and manipulating virtual objects using direct manipulation. • Tactile feedback can be used to simulate the texture of physical surfaces. • Passive haptic objects can also be used in 3D interfaces as props that provide weight and texture and represent an inexpensive way to display haptic cues to the user.

  36. CHARACTERIZING DISPLAYS BY LEVEL OF FIDELITY • A display’s level of fidelity is the degree to which the sensory stimuli produced by a display correspond to those that would be present in the real world. • Much of the research on displays is aimed at ever-increasing realism. • Talking about a display’s fidelity allows us to benchmark it with respect to the real world and to compare it to other displays in a standard way. • A display’s fidelity can have significant effects on the user’s experience with the display. • Visual display fidelity components include: FOR, FOV, stereoscopy quality, refresh rate, spatial resolution, color reproduction quality, display latency.

  37. CHOOSING OUTPUT DEVICES FOR 3D UI • There is no single rule of thumb telling developers which output devices to use. • Analyze the application’s goals and its underlying operations and tasks to obtain direction in choosing an appropriate output device. • Visual display devices with a 360-degree FOR should be used in applications in which users perform frequent turns and require spatial orientation. • For visual search and pattern identification tasks in 3D environments, choose a display with high spatial resolution.

  38. VR GAMING CASE STUDY • Choose a visual display that will be both effective and practical for your end users. • Carefully consider human factors issues related to VR displays. • Don’t forget to account for social aspects, such as non-users viewing the VR experience. Mixed reality compositing in Fantastic Contraption, allowing viewers to see both the player and the virtual world she inhabits.

  39. MOBILE AR CASE STUDY • Support a comfortable power grip to hold the system firmly, especially when the user is required to hold the device at eye height. • Allow users to vary their poses, and look into the potential of resting the arms against the body to extend usage duration. • Closely analyze the relationship between display angle and pose, as changing pose to see content clearly may result in nonergonomic usage. • Look closely at the balance of the setup: if the device tips in the wrong direction, even a good pose or grip may not be sufficient to hold the system for long. Balance may be more important than overall weight. • Try to limit the need for additional batteries for operation and compress additional cables, as they tend to take up a lot of space.

  40. Vesp’r handheld AR device setup

  41. CONCLUSION • Examined a variety of different output devices: • visual displays • auditory displays • haptic and tactile feedback devices • Went over strategies for choosing appropriate output devices and discussed the effects of display fidelity. • Covered two case studies.
