
Construction and Presentation of a Virtual Environment Using Panoramic Stereo Images of a Real Scene and Computer Graphics Models. Jun Shimamura†, Haruo Takemura‡, Naokazu Yokoya‡, Kazumasa Yamazawa‡. †NTT Cyber Space Laboratory, ‡Nara Institute of Science and Technology, JAPAN.




Presentation Transcript


  1. Construction and Presentation of a Virtual Environment Using Panoramic Stereo Images of a Real Scene and Computer Graphics Models
  Jun Shimamura†, Haruo Takemura‡, Naokazu Yokoya‡, Kazumasa Yamazawa‡
  †NTT Cyber Space Laboratory, ‡Nara Institute of Science and Technology, JAPAN
  [Figure: bird's-eye view of a texture-mapped 3-D static scene model]

  2. Background and Approach
  Background: construction of a large-scale virtualized environment (e.g. an urban area or natural scenery).
  Requirements for constructing an immersive virtualized environment:
  • Full panoramic representation
  • Depth information from the real world
  • Digitizing and representing dynamic events
  Approach:
  • Use a video-rate omni-directional stereo imaging sensor
  • Construct a full panoramic 3-D model from a cylindrical panoramic stereo image sequence
  • Construct a prototype system for presenting a mixed environment (virtualized scene + CG)

  3. Omni-directional stereo imaging sensor
  Composed of twelve CCD cameras and two hexagonal pyramidal mirrors (an upper and a lower component).
  • Single-viewpoint constraint: the six cameras of each component share a virtual center of lens
  • High-resolution image acquisition
  • Omni-directional stereoscopic imaging at video rate
  [Figures: sensor geometry and appearance, in top and front views, showing the pyramidal mirrors, cameras, lens centers, and virtual center of lens]

  4. Generation of panoramic stereo images
  A pair of panoramic images (upper and lower) is generated by:
  • eliminating geometric distortion in the camera images,
  • adjusting the colors of the camera images, and
  • concatenating the six images of each component into a complete omni-directional image of the stereo pair (sketched after the transcript).
  Each panoramic image is 3006 x 330 pixels.

  5. Virtualizing a dynamic real scene
  Layered representation of the dynamic real scene:
  • Static scene image generation: a panoramic image of the static scene is generated by applying a temporal mode filter to a panoramic image sequence over a time interval (sketched after the transcript).
  • Extraction of moving objects: moving objects are extracted by subtracting consecutive image frames in the sequence, followed by majority filtering (sketched after the transcript).
  [Figure: panoramic image sequence including a dynamic event, subtraction of the static image, majority filtering, and the extracted moving-object regions]

  6. Depth estimation from panoramic stereo images and 3-D model generation
  • Depths of the static scene and of the moving objects are estimated by stereo matching along epipolar lines between the upper and lower panoramic images (sketched after the transcript).
  • A 3-D model is generated by applying Delaunay triangulation and expressing the result in the world cylindrical coordinate system (sketched after the transcript).
  [Figures: upper and lower panoramic images with an epipolar line; panoramic depth map of the static scene; world cylindrical coordinate system]

  7. Prototype system (CYLINDRA)
  Immersive mixed reality system:
  • Has a 330-degree cylindrical screen
  • Merges virtual objects with the virtualized real-world scene
  • Superimposes dynamic event layers onto the static scene layer (sketched after the transcript)
  Frame rate: 13 frames/sec.
  [Figures: user's appearance in the mixed environment using the CYLINDRA system; hardware configuration of the immersive mixed reality system]

  8. Presentation of the mixed environment
  ii. Mixed environment observed from different viewpoints (the original viewpoint and a new, higher viewpoint).
  iii. Superimposing dynamic event layers onto a static scene layer.
  Model size: background 13,400 polygons with 6144 x 768 pixel texture resolution; CG tree 41,340 polygons; total 54,740 polygons.
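
Sketch for slide 4 (panorama generation). This is a minimal Python/NumPy illustration of the distortion-removal / color-adjustment / concatenation step. It assumes the six camera images are already undistorted and resampled to cylindrical coordinates, and it substitutes a simple per-channel gain match against the first image for the unspecified color-adjustment method; adjust_color and build_panorama are hypothetical helper names, not part of the original system.

```python
import numpy as np

def adjust_color(img, ref):
    """Scale each color channel so its mean matches the reference image.

    A crude stand-in for the slides' color-adjustment step; the method
    actually used by the authors is not described.
    """
    img = img.astype(np.float64)
    gains = ref.mean(axis=(0, 1)) / np.maximum(img.mean(axis=(0, 1)), 1e-6)
    return np.clip(img * gains, 0, 255).astype(np.uint8)

def build_panorama(camera_images):
    """Concatenate six rectified camera images into one cylindrical panorama.

    `camera_images` is a list of six H x W x 3 uint8 arrays assumed to be
    already undistorted and warped to cylindrical coordinates, so a simple
    horizontal concatenation (e.g. six 501 x 330 strips giving a
    3006 x 330 panorama) completes the omni-directional image.
    """
    ref = camera_images[0]
    balanced = [ref] + [adjust_color(im, ref) for im in camera_images[1:]]
    return np.hstack(balanced)
```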
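
Sketch for slide 5, static scene generation. The transcript says a temporal mode filter is applied to the panoramic image sequence; below is a minimal per-pixel version, assuming a gray-scale (T, H, W) uint8 frame stack (the interval length, windowing, and color handling used by the authors are not given).

```python
import numpy as np
from scipy import stats

def static_background(frames):
    """Per-pixel temporal mode over a panoramic image sequence.

    `frames` is a (T, H, W) uint8 stack of gray-scale panoramas.  The most
    frequent value at each pixel over the interval is kept as the static
    scene, which suppresses pixels that are only briefly covered by moving
    objects.  Requires SciPy >= 1.9 for the `keepdims` argument.
    """
    mode, _count = stats.mode(frames, axis=0, keepdims=False)
    return mode.astype(np.uint8)
```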
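
Sketch for slide 5, moving-object extraction. The transcript mentions subtraction (of consecutive frames and of the static image) followed by majority filtering; this is a rough sketch of the "subtract the static panorama, then take a 3 x 3 majority vote" reading, with an arbitrary threshold.

```python
import numpy as np
from scipy import ndimage

def moving_object_mask(frame, static, diff_thresh=25):
    """Boolean mask of moving-object pixels in one panoramic frame.

    Pixels whose gray value differs from the static panorama by more than
    `diff_thresh` (an arbitrary value; the slides give none) are marked,
    then a 3 x 3 majority filter removes isolated noise: a pixel stays
    foreground only if at least 5 of the 9 pixels in its neighbourhood
    (itself included) were marked.
    """
    diff = np.abs(frame.astype(np.int16) - static.astype(np.int16))
    mask = diff > diff_thresh
    votes = ndimage.uniform_filter(mask.astype(np.float32), size=3)
    return votes >= 5.0 / 9.0
```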
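
Sketch for slide 6, stereo matching. The depths are estimated by matching the upper and lower panoramas along epipolar lines; because the two virtual viewpoints are displaced vertically, those lines are assumed here to run along image columns in the cylindrical panoramas, so a brute-force SAD block matcher only has to search vertical disparities. The matcher actually used, its window size, and its disparity range are not described in the transcript.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def column_stereo_sad(upper, lower, max_disp=40, win=5):
    """Vertical-disparity search between upper and lower panoramas.

    `upper` and `lower` are (H, W) float gray-scale panoramas.  For each
    candidate disparity d the lower panorama is shifted up by d rows,
    absolute differences are aggregated over a win x win window, and the
    disparity with the lowest cost is kept per pixel.  Rows that wrap
    around at the bottom edge are not treated specially in this sketch.
    """
    disp = np.zeros(upper.shape, dtype=np.float32)
    best = np.full(upper.shape, np.inf, dtype=np.float32)
    for d in range(max_disp + 1):
        cost = uniform_filter(np.abs(upper - np.roll(lower, -d, axis=0)), size=win)
        better = cost < best
        disp[better] = d
        best[better] = cost[better]
    return disp
```

Turning the disparity map into metric depth would additionally need the vertical baseline between the two virtual viewpoints and the panorama's vertical angular mapping, neither of which appears in the transcript.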
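
Sketch for slide 6, 3-D model generation. The slide applies Delaunay triangulation and expresses the model in a world cylindrical coordinate system. The sketch below back-projects a subsampled panoramic depth map into 3-D points, assuming columns map linearly to a full 360-degree azimuth and, for lack of the sensor's vertical mapping, treating the depth value as horizontal range and the row index as height up to an arbitrary scale. The 2-D Delaunay connectivity of the sample grid is reused as the mesh topology onto which the panorama texture would be mapped; panorama_to_mesh is a hypothetical helper.

```python
import numpy as np
from scipy.spatial import Delaunay

def panorama_to_mesh(depth, step=8):
    """Back-project a panoramic depth map and triangulate it.

    `depth` is an (H, W) array giving range at each panorama pixel.  The
    panorama is sampled every `step` pixels; column index is mapped to
    azimuth over 360 degrees and row index to height (arbitrary scale),
    which is only a stand-in for the real sensor geometry.  Returns
    (vertices, triangles): an (N, 3) float array and an (M, 3) index array.
    """
    H, W = depth.shape
    ys, xs = np.mgrid[0:H:step, 0:W:step]
    theta = 2.0 * np.pi * xs / W            # azimuth from column index
    r = depth[ys, xs]                       # horizontal range (assumed)
    X = r * np.cos(theta)
    Z = r * np.sin(theta)
    Y = H / 2.0 - ys                        # height from row index
    vertices = np.stack([X.ravel(), Y.ravel(), Z.ravel()], axis=1)
    # Triangulate the sample grid in the 2-D panorama domain and reuse the
    # connectivity for the 3-D vertices (cf. the slide's Delaunay step).
    tri = Delaunay(np.stack([xs.ravel(), ys.ravel()], axis=1))
    return vertices, tri.simplices
```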
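
Sketch for slides 7 and 8, layer superimposition. The CYLINDRA prototype superimposes dynamic event layers onto the static scene layer; in flat image terms this reduces to masked compositing with the moving-object mask from the extraction step. The real system renders the layers as textured geometry on the 330-degree screen at about 13 frames/sec, which this sketch does not attempt to reproduce.

```python
import numpy as np

def composite_layers(static_rgb, event_rgb, event_mask):
    """Overlay a dynamic-event layer on the static-scene layer.

    `static_rgb` and `event_rgb` are (H, W, 3) images on the same panoramic
    grid and `event_mask` is the (H, W) boolean moving-object mask; wherever
    the mask is set, the event layer's pixels replace the static scene.
    """
    out = static_rgb.copy()
    out[event_mask] = event_rgb[event_mask]
    return out
```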
