
MA Final: 2D Magic Lens Implementation using a Handheld device in a 3D Virtual Environment


Presentation Transcript


  1. MA Final: 2D Magic Lens Implementation using a Handheld device in a 3D Virtual Environment Student: Alba Huelves Director: Prof. Gudrun Klinker (Ph.D.) Supervisors: Amal Benzina and Marcus Tönnis

  2. Contents • Introduction • Related Works • System Architecture • Exploring the VE with the Hand Held device: approaches • Motion control • Viewpoint • Lens frustum computation • Rendering the Magic Lens • GUI • Averaging and thresholds • Conclusions • Video demo

  3. Introduction • Human interaction techniques with 3D VEs are important for: • Selection • Manipulation of 3D graphical information • Purpose: • Different interaction techniques using the Magic Lens metaphor to explore the VE • Obtain an alternative focus view of the scene • Intersection with the terrain surface for below-surface exploration in the future

  4. Related Works • ‘3D Magic Lenses’, Viega et al., 1996 • ‘Toolglass and Magic Lenses: The See-Through Interface’, Bier et al., 1993 • ‘Magic Lenses for Augmented Virtual Environments’, Leonard D. Brown, 2006 • ‘The Through-The-Lens Metaphor: Taxonomy and Application’, S. Stoev et al., 2002

  5. System Architecture • System elements: • Android Client running on the tablet • Glasses target • FRAVE • Fraveui0 • ART System

  6. Communication Procedure 1. User starts the Android application and data is sent to the servers (UDP wireless connection). 2. User can navigate and travel through the terrain by touching the screen, which sends a message to the servers to enable the tracking. 3. User selects Magic Lens mode and another message is sent to the servers so it is rendered. The FRAVE server sends the Magic Lens data to Fraveui0. 4. User explores the VE by touching the screen and moving the tablet. A message is sent to the servers to enable tracking. FRAVE updates Fraveui0. 5. User takes a snapshot of the selected region. A message is sent to Fraveui0 to obtain the image. 6. Fraveui0 sends the image (TCP) to the client.

  7. Exploring the VE with the Hand Held device • Tracked Android tablet controls the Magic Lens virtual avatar. • Motion control options: • Rate control • Direct avatar - position control • Viewpoint options: • Fixed viewpoint • Tracked viewpoint • Translation and orientation of the tablet are mapped to the VE depending on the motion control option used.

  8. Rate Control • Mapping from the tablet’s translation and rotation to the Magic Lens’ translation and rotation. • The initial pose of the tablet is obtained as the user touches the screen. • Delta translation and delta orientation relative to the initial pose are mapped to the Magic Lens. • The delta translation is scaled by the sensitivity factor. • When the delta translation or the delta orientation exceeds a threshold, a rate factor is increased and added to the current translation or orientation.
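The rate-control mapping above can be sketched for a single translation axis as follows (a minimal illustration: the function name and the sensitivity, threshold and rate-step values are assumptions, not the values used in the thesis):

```python
SENSITIVITY = 0.5   # scales the delta translation (assumed value)
THRESHOLD = 0.2     # metres; beyond this a rate factor kicks in (assumed)
RATE_STEP = 0.05    # rate-factor increment per frame (assumed)

def rate_control(initial, current, accumulated, rate):
    """Map the tablet's delta translation on one axis to the lens.

    Within the threshold the scaled delta is applied directly; beyond
    it, a growing rate factor is added instead, so the lens keeps
    moving faster the longer the tablet is held past the threshold.
    """
    delta = (current - initial) * SENSITIVITY
    if abs(delta) > THRESHOLD:
        rate += RATE_STEP
        accumulated += rate if delta > 0 else -rate
    else:
        rate = 0.0                 # reset once back inside the threshold
        accumulated += delta
    return accumulated, rate
```

The same scheme would apply to the delta orientation, with angular units.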

  9. Translation • Magic Lens centre position is fixed at a starting position 300 m in front of the camera: • centreStartPos = camPosition + 300 * directionCam • Translation X → Right direction • Translation Y → Up direction • Translation Z → −View direction • Centre position of the Magic Lens is updated: • centrePos = centreStartPos + translationX * rightCam + translationY * upCam − translationZ * directionCam

  10. Orientation • The Magic Lens has an initial 90-degree pitch, and zero roll and heading. • Mapping from tablet to Magic Lens: pitch → pitch, roll → heading, heading → roll = 0 (not considered). • Tablet: pitch → rotation around X, roll → rotation around Y, heading → rotation around Z (not considered). • Magic Lens: pitch → rotation around the Right vector, roll → rotation around the View direction vector, heading → rotation around the Up vector. • The delta rotation from the initial pose is obtained each frame, multiplied by the elapsed time since the last frame, and added to the accumulated rotation. • If the pitch and the heading of the Magic Lens exceed a certain threshold, the Magic Lens rotates faster.
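The per-frame accumulation step can be sketched as below (the threshold and speed-up factor are illustrative assumptions; the slide only states that the lens rotates faster beyond a certain threshold):

```python
ROT_THRESHOLD = 30.0   # degrees (assumed value)
FAST_FACTOR = 2.0      # speed-up beyond the threshold (assumed value)

def accumulate_rotation(accumulated, delta_deg, dt):
    """Advance one rotation axis by one frame.

    delta_deg: delta rotation from the initial pose (degrees);
    dt: seconds elapsed since the last frame.
    """
    step = delta_deg * dt
    if abs(accumulated) > ROT_THRESHOLD:
        step *= FAST_FACTOR        # rotate faster past the threshold
    return accumulated + step
```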

  11. Direct avatar – Position Control • Approximation of a direct avatar by tracking the left and right walls of the FRAVE. • The initial position of the Magic Lens depends on the proximity to each of the walls of the FRAVE. • The delta translation is scaled by a sensitivity factor, obtained by testing, to give the impression that the lens follows the tablet. • The delta rotation is obtained relative to the initial orientation, with zero values for pitch, heading and roll. • The virtual camera’s pitch and heading are added to those of the Magic Lens.

  12. Fixed Viewpoint / Tracked Viewpoint • Fixed viewpoint: the Magic Lens viewpoint is fixed 40 m from its centre position in the negative direction of the view vector: • viewpoint = centrePos − 40 * viewVector • Symmetric lens frustum where the Magic Lens is the near plane. • Tracked viewpoint: the Magic Lens viewpoint is set to the user’s eye, tracked by the glasses. • The relative position of the viewpoint to the centre of the tablet is mapped to the VE. The virtual viewpoint has the same relative position to the centre of the Magic Lens.

  13. Lens Frustum Computation • Relative position of the virtual viewpoint (E_virtual) to that of the lower left corner (L_left) of the Magic Lens: • R_vp = E_virtual − L_left • dNear = R_vp · Zaxis • dLeft = −R_vp · Xaxis • dRight = width − dLeft • dBottom = R_vp · Yaxis • dTop = height − dBottom • Set the projection matrix with the distances to the near, far, left, right, top and bottom planes.
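The distance computations above can be written out directly (a sketch under the assumed conventions that the lens’ local axes are unit vectors and that Zaxis points from the lens plane towards the viewpoint, so that dNear is positive; the signed dLeft/dRight values then feed an asymmetric, glFrustum-style projection):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def frustum_distances(eye, lower_left, x_axis, y_axis, z_axis,
                      width, height):
    """Off-axis frustum distances from the viewpoint and the lens'
    lower-left corner, following the slide's formulas."""
    r_vp = tuple(e - l for e, l in zip(eye, lower_left))  # R_vp
    d_near   = dot(r_vp, z_axis)
    d_left   = -dot(r_vp, x_axis)
    d_right  = width - d_left
    d_bottom = dot(r_vp, y_axis)
    d_top    = height - d_bottom
    return d_near, d_left, d_right, d_bottom, d_top
```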

  14. Rendering the Magic Lens • From the rectangle’s centre position, the upper left, upper right, lower left and lower right corners are computed. • The viewpoint is represented by a small sphere. • The surface of the pyramid formed by the viewpoint, the lens corners and the terrain intersection is shaded.

  15. Graphical User Interface • Android app running on the tablet, with two views: Navigation View and Magic Lens View. • Touching the screen, to navigate or to explore the VE with the Magic Lens, enables tracking. • Snapshot → requests the server in Fraveui0 to send the image of the lens frustum.

  16. Sending the Snapshot • Upon request from the Android client, Fraveui0 captures the displayed image, compresses it to JPEG and sends it over a TCP connection. • The TCP receiver in the client reads the stream of bytes, decodes them and displays the image on the tablet.
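A sketch of this transfer is below. The 4-byte length-prefix framing and the function names are assumptions for illustration; the slides only specify that a JPEG is sent over TCP:

```python
import socket
import struct

def send_snapshot(sock, jpeg_bytes):
    """Send the JPEG preceded by its length as a big-endian uint32."""
    sock.sendall(struct.pack("!I", len(jpeg_bytes)) + jpeg_bytes)

def recv_snapshot(sock):
    """Read the length prefix, then exactly that many image bytes."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

def _recv_exact(sock, n):
    # TCP delivers a byte stream, so loop until n bytes have arrived.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed before message complete")
        buf += chunk
    return buf
```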

  17. Averaging and Thresholds • Used to filter unintended hand or head movements, for both translation and rotation. • If the difference between the previous value and the current one does not exceed a threshold, the current value is added to a buffer. • When the difference exceeds the threshold, the average of the previously buffered values is assigned to the current value → smoothness.
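The filter described above can be sketched as follows (the class name and the threshold value are assumptions; the thesis obtained its parameters by testing):

```python
class JitterFilter:
    """Suppress sudden jumps by replacing them with the average of the
    recent small movements, as described on the slide."""

    def __init__(self, threshold=0.05):
        self.threshold = threshold
        self.samples = []      # recent values within the threshold
        self.value = 0.0       # last filtered output

    def update(self, current):
        if self.samples and abs(current - self.samples[-1]) > self.threshold:
            # Large jump: assign the average of the buffered values
            # instead of the raw input -> smooth output.
            self.value = sum(self.samples) / len(self.samples)
            self.samples = [self.value]
        else:
            self.samples.append(current)
            self.value = current
        return self.value
```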

  18. Conclusions • Three metaphors: • Rate control with fixed viewpoint • Avoids hand fatigue and does not require the user to move • A larger 3D space can be explored • Position control (direct avatar) with fixed viewpoint • Requires the user to move to explore the VE • More immersive experience: the avatar behaves as in the real world • Position control (direct avatar) with tracked viewpoint • Tracking the viewpoint allows the user to explore the terrain with head movements as well → more immersion • Not very intuitive, because the focus view is on Fraveui0 and not on the tablet • Future work: • User evaluation • Display the focus view on the tablet in real time • Below-surface exploration

  19. Video Demo
