
Texture-Mapping Real Scenes from Photographs




  1. SIGGRAPH 2000 Course on Image-Based Surface Details Texture-Mapping Real Scenes from Photographs Yizhou Yu Computer Science Division University of California at Berkeley

  2. Basic Steps • Acquire Photographs • Recover Geometry • Align Photographs with Geometry • Map Photographs onto Geometry

  3. Camera Pose Estimation
  • Input
    • Known geometry recovered from photographs or laser range scanners
    • A set of photographs taken with a camera
  • Output
    • For each photograph, 3 rotation and 3 translation parameters of the camera with respect to the geometry
  • Requirement
    • 4 correspondences between each photograph and the geometry

  4. Recover Camera Pose with Known Correspondences
  • Least-squares solution
  • Needs a good initial estimate from human interaction
  [Figure: camera and image plane relative to the 3D geometry]
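The course describes an iterative least-squares fit of the 6 pose parameters starting from a user-supplied initial estimate. As a simpler self-contained illustration, the sketch below uses the linear Direct Linear Transform (DLT) instead, which estimates the full 3x4 projection matrix from 3D-2D correspondences without an initial guess; note the DLT needs at least 6 non-coplanar points rather than the 4 that suffice for the calibrated 6-DOF problem. All names here are illustrative, not from the course.

```python
import numpy as np

def estimate_projection_dlt(points_3d, points_2d):
    """Direct Linear Transform: fit a 3x4 projection matrix P such that
    x ~ P @ [X, 1] for each 3D-2D correspondence (needs >= 6 points,
    not all coplanar). Returned P is defined up to scale."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The right singular vector of the smallest singular value minimizes ||A p||
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, X):
    """Project a 3D point through P and apply the perspective divide."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]
```

With noise-free synthetic correspondences the recovered matrix reprojects the input points exactly (up to numerical precision); with real image measurements one would follow this with the nonlinear least-squares refinement the slides describe.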

  5. Recover Rotation Parameters Only from Known Correspondences
  • Constraints
  • Least-squares solution
  [Figure: camera and image plane]

  6. Obtaining Correspondences
  • Feature detection in 3D geometry and 2D images
  • Human interaction
    • Interactively pick corresponding points in photographs and 3D geometry
  • Automatic search
    • Combinatorial search

  7. Automatic Search for Correspondences
  • Pose estimation using calibration targets
  • Combinatorial search for the best match
  • 4 correspondences per image
  [Figure: 3D calibration targets]

  8. Camera Pose Results
  • Accuracy: consistently within 2 pixels
  [Figure: texture-mapping a single image]

  9. Texture Mapping
  • Conventional texture-mapping with texture coordinates
  • Projective texture-mapping

  10. Texture Map Synthesis I
  • Conventional texture-mapping with texture coordinates
    • Create a triangular texture patch for each triangle
    • The texture patch is a weighted average of the image patches from multiple photographs
    • Pixels that are close to image boundaries or viewed from a grazing angle receive smaller weights
  [Figure: photograph, 3D triangle, and texture map]
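The weighting rule above can be sketched as follows. The slides only state that grazing angles and boundary proximity reduce a photograph's weight; the specific falloff functions (`max(0, cos)` for the angle, a linear ramp for the border distance) and the `border_falloff` constant are assumptions for illustration.

```python
import numpy as np

def patch_weight(cos_view_angle, dist_to_border, border_falloff=20.0):
    """Weight of one photograph's contribution to a texture patch.
    cos_view_angle: cosine between the triangle normal and the view direction
    (grazing angle -> near 0). dist_to_border: pixel distance of the patch
    from the image boundary. The falloff shapes are assumptions."""
    angle_term = max(0.0, cos_view_angle)                     # grazing -> 0
    border_term = min(1.0, dist_to_border / border_falloff)   # near edge -> small
    return angle_term * border_term

def blend_patches(patches, weights):
    """Weighted average of grayscale image patches (one per photograph)."""
    patches = np.asarray(patches, dtype=float)
    w = np.asarray(weights, dtype=float).reshape(-1, 1, 1)
    return (patches * w).sum(axis=0) / w.sum()
```

A patch seen head-on from the image center gets full weight; a back-facing view contributes nothing.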

  11. Texture Map Synthesis II
  • Allocate space for texture patches from texture maps
    • A generalization of memory allocation to 2D
    • Quantize edge lengths to a power of 2
    • Sort texture patches in decreasing order of size and use the First-Fit strategy to allocate space
  [Figure: First-Fit allocation]
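The allocation strategy above can be sketched with shelf-based packing. The slides specify only power-of-two quantization and decreasing First-Fit; organizing the texture map into horizontal shelves is an assumption made to keep the sketch short.

```python
def next_pow2(n):
    """Smallest power of two >= n."""
    p = 1
    while p < n:
        p *= 2
    return p

def pack_patches(sizes, map_width):
    """First-Fit allocation of (width, height) patches into a texture map.
    Edge lengths are quantized to powers of two, patches are sorted in
    decreasing order, and each goes into the first horizontal shelf with
    room (opening a new shelf otherwise). Returns {patch_index: (x, y)}.
    Running out of map space is not handled in this sketch."""
    quant = sorted(((i, next_pow2(w), next_pow2(h)) for i, (w, h) in enumerate(sizes)),
                   key=lambda t: (t[2], t[1]), reverse=True)
    shelves = []   # each shelf: [y_top, height, x_used]
    y_next = 0
    placed = {}
    for i, w, h in quant:
        for shelf in shelves:
            if shelf[1] >= h and shelf[2] + w <= map_width:
                placed[i] = (shelf[2], shelf[0])
                shelf[2] += w
                break
        else:
            shelves.append([y_next, h, w])
            placed[i] = (0, y_next)
            y_next += h
    return placed
```

Sorting large patches first is what makes First-Fit effective here: small quantized patches fill the gaps left at the ends of shelves.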

  12. A Texture Map Packed with Triangular Texture Patches

  13. Texture-Mapping and Object Manipulation
  [Figures: original configuration and a novel configuration]

  14. Texture Map Compression I
  • The size of each texture patch is determined by the amount of color variation on its corresponding triangle in the photographs.
  • An edge detector (the derivative of the Gaussian) is used as the metric for variation.
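The idea of sizing patches by color variation can be sketched as below. The course uses a derivative-of-Gaussian edge detector; this sketch substitutes plain finite differences (`np.gradient`) for brevity, and the `min_size`/`max_size`/`scale` constants are illustrative assumptions, not values from the course.

```python
import numpy as np

def variation_score(patch):
    """Total gradient magnitude over a grayscale patch. The slides use the
    derivative of a Gaussian; plain finite differences stand in here."""
    gy, gx = np.gradient(patch.astype(float))
    return float(np.hypot(gx, gy).sum())

def patch_size_for(patch, min_size=2, max_size=64, scale=0.5):
    """Choose a power-of-two edge length that grows with color variation:
    flat triangles get tiny patches, detailed ones get large patches.
    The constants are assumptions for illustration."""
    size = min_size
    while size < max_size and size < scale * variation_score(patch):
        size *= 2
    return size
```

A constant-color patch collapses to the minimum size, which is where the compression comes from: most triangles in a real scene carry little detail.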

  15. Texture Map Compression II
  • Reuse texture patches
    • Map the same patch to multiple 3D triangles with similar color variations
    • K-means clustering to generate texture patch representatives
    • Larger penalty along triangle edges to reduce the Mach band effect
  [Figure: 3D triangles sharing patches in the texture map]
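The clustering step can be sketched with a plain Lloyd's-algorithm k-means over flattened patch pixels. Seeding the centers with the first k patches is a simplification, and the larger per-pixel penalty along triangle edges that the slides mention is omitted here.

```python
import numpy as np

def kmeans_patches(patches, k, iters=20):
    """Cluster flattened texture patches with Lloyd's algorithm.
    Returns (labels, representatives); each representative patch vector can
    then be mapped to every triangle in its cluster. Deterministic first-k
    seeding is a simplification; the course's edge-weighted distance is
    omitted."""
    X = np.asarray([p.ravel() for p in patches], dtype=float)
    centers = X[:k].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Squared distance of every patch to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

Compression follows directly: store only the k representative patches and, per triangle, an index into them.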

  16. Synthetic Images with Compressed and Uncompressed Texture Maps
  [Figures: compressed (5 texture maps) vs. uncompressed (20 texture maps)]

  17. Projective Texture-Mapping
  • Can directly use the original photographs in texture-mapping
  • Visibility processing is more complicated
  • Projective texture-mapping is implemented in hardware, so real-time rendering becomes possible
  • View-dependent effects can be added using the hardware accumulation buffer
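Projective texture-mapping assigns each vertex the coordinates of its projection into the photograph. The sketch below does the homogeneous projection and perspective divide per vertex in numpy purely for illustration; in hardware the divide happens per fragment via the texture matrix, and the camera matrix `P` is assumed already recovered by pose estimation.

```python
import numpy as np

def projective_tex_coords(P, vertices, image_w, image_h):
    """Project 3D vertices through a 3x4 camera matrix P and normalize the
    resulting pixel coordinates to [0, 1] texture coordinates. A per-vertex
    sketch of what the hardware texture matrix + per-fragment divide do."""
    V = np.hstack([vertices, np.ones((len(vertices), 1))])
    x = (P @ V.T).T                    # homogeneous image coordinates
    uv = x[:, :2] / x[:, 2:3]          # perspective divide
    return uv / np.array([image_w, image_h], dtype=float)
```

Because the divide is projective rather than affine, interpolating these coordinates naively across a triangle is incorrect; this is exactly why hardware support matters.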

  18. Motivation for Visibility Processing: Artifacts Caused by Hardware
  • Texture gets projected onto occluded and back-facing polygons
  [Figure: camera, image, and geometry]

  19. Visibility Algorithms
  • Image-space algorithms
    • Shadow buffer
    • Ray casting
  • Object-space algorithms
    • Weiler-Atherton
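The shadow-buffer idea can be sketched in a few lines: render a depth map from the camera's viewpoint, then a surface point is visible from that camera only if its depth matches the stored depth within a tolerance. The pinhole model (camera at the origin looking down +z, focal length 1) and the viewport mapping below are assumptions for the sketch.

```python
import numpy as np

def visible(point, depth_map, eps=1e-3):
    """Shadow-buffer visibility test under an assumed pinhole camera at the
    origin looking down +z with focal length 1: project the point into the
    depth map and compare depths. Points projecting off-screen count as
    not visible."""
    x, y, z = point
    h, w = depth_map.shape
    u = int(round(x / z * w / 2 + w / 2))   # assumed viewport mapping
    v = int(round(y / z * h / 2 + h / 2))
    if not (0 <= u < w and 0 <= v < h):
        return False
    return bool(z <= depth_map[v, u] + eps)
```

If the stored depth is smaller than the point's depth, some other surface lies in front, so the photograph should not be projected onto this point.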

  20. A Hybrid Visibility Algorithm
  • Occlusion testing in image space using Z-buffer hardware
    • Render polygons with their identifiers as colors
    • Retrieve occluding polygons' ids from the color buffer
  • Object-space shallow clipping to generate fewer polygons
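Rendering identifiers as colors requires an invertible mapping between polygon ids and frame-buffer colors. A common scheme, sketched below under the assumption of an 8-bit-per-channel color buffer, packs a 24-bit id into the R, G, and B channels.

```python
def id_to_rgb(poly_id):
    """Pack a polygon id (< 2**24) into an 8-bit-per-channel RGB color,
    low byte in red. Assumes a 24-bit color buffer."""
    assert 0 <= poly_id < 1 << 24
    return (poly_id & 0xFF, (poly_id >> 8) & 0xFF, (poly_id >> 16) & 0xFF)

def rgb_to_id(rgb):
    """Recover the polygon id from a color read back from the color buffer."""
    r, g, b = rgb
    return r | (g << 8) | (b << 16)
```

For this to work in practice, lighting, blending, dithering, and antialiasing must be disabled so the read-back colors are exact.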

  21. Input Photographs and Recovered Geometry from Facade

  22. Visibility Processing Results
  [Figures: the tower; the rest of the campus]

  23. Synthetic Renderings
