
Unstructured Lumigraph Rendering



Presentation Transcript


  1. Unstructured Lumigraph Rendering Chris Buehler, Michael Bosse, Leonard McMillan (MIT-LCS); Steven J. Gortler (Harvard University); Michael F. Cohen (Microsoft Research)

  2. The Image-Based Rendering Problem • Synthesize novel views from reference images • Static scenes, fixed lighting • Flexible geometry and camera configurations

  3. The ULR Algorithm • Designed to work over a range of image and geometry configurations • Designed to satisfy desirable properties [Figure: LF, ULR, and VDTM placed on axes of # of images vs. geometric fidelity]

  4. "Light Field Rendering," SIGGRAPH '96 • Desired color interpolated from "nearest cameras" [Figure: two-plane (s, u) parameterization with the desired camera]

  5. Desired Property #1: Epipole consistency "Light Field Rendering," SIGGRAPH '96 [Figure: (s, u) parameterization with the desired camera]

  6. "The Lumigraph," SIGGRAPH '96 [Figure: the scene, the desired camera, and a potential artifact]

  7. Desired Property #2: Use of geometric proxy "The Lumigraph," SIGGRAPH '96 [Figure: the scene, geometric proxy, and desired camera]

  8. "The Lumigraph," SIGGRAPH '96 [Figure: the scene and desired camera]

  9. Desired Property #3: Unstructured input images "The Lumigraph," SIGGRAPH '96 • Rebinning (note: all images are resampled) [Figure: the scene and desired camera]

  10. Desired Property #4: Real-time implementation "The Lumigraph," SIGGRAPH '96 [Figure: the scene and desired camera]

  11. View-Dependent Texture Mapping, SIGGRAPH '96, EGRW '98 [Figure: the scene and desired camera, with some reference views occluded or out of view]

  12. Desired Property #5: Continuous reconstruction View-Dependent Texture Mapping, SIGGRAPH '96, EGRW '98 [Figure: the scene and desired camera]

  13. View-Dependent Texture Mapping, SIGGRAPH '96, EGRW '98 [Figure: the scene and desired camera, with angles θ1, θ2, θ3 to the reference cameras]

  14. Desired Property #6: Angles measured w.r.t. proxy View-Dependent Texture Mapping, SIGGRAPH '96, EGRW '98 [Figure: the scene and desired camera, with angles θ1, θ2, θ3 measured at the proxy]

  15. [Figure: the scene and desired camera]

  16. Desired Property #7: Resolution sensitivity [Figure: the scene and desired camera]

  17. Previous Work • Light fields and Lumigraphs • Levoy and Hanrahan, Gortler et al., Isaksen et al. • View-dependent Texture Mapping • Debevec et al., Wood et al. • Plenoptic Modeling w/Hand-held Cameras • Heigl et al. • Many others…

  18. Unstructured Lumigraph Rendering • Epipole consistency • Use of geometric proxy • Unstructured input • Real-time implementation • Continuous reconstruction • Angles measured w.r.t. proxy • Resolution sensitivity

  19. Blending Fields color_desired = Σ_i w_i · color_i [Figure: desired camera]

  20. Blending Fields color_desired = Σ_i w(c_i) · color_i [Figure: desired camera]
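
The blending equation on slides 19–20 is a per-pixel weighted sum of reference-image colors. Below is a minimal sketch, assuming the blending field values w(c_i) for one pixel have already been computed; the NumPy array layout is an illustrative assumption, not the paper's data structure.

```python
import numpy as np

def blend_pixel(colors, weights):
    """color_desired = sum_i w(c_i) * color_i for one desired-view pixel.

    colors:  (N, 3) array, the color each of N reference cameras contributes
    weights: (N,)   array, blending field values w(c_i) at this pixel,
             assumed to already sum to 1 over the cameras in use
    """
    return (weights[:, None] * colors).sum(axis=0)

# Toy example: two cameras contributing equally.
print(blend_pixel(np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]),
                  np.array([0.5, 0.5])))   # -> [0.5, 0.0, 0.5]
```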

  21. Unstructured Lumigraph Rendering • Explicitly construct blending field • Computed using penalties • Sample and interpolate over desired image • Render with hardware • Projective texture mapping and alpha blending

  22. Angle Penalty penalty_ang(C_i) = θ_i [Figure: geometric proxy with reference cameras C1–C6, desired camera C_desired, and angles θ1–θ6 measured at the proxy]
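
A sketch of how the angle penalty above could be evaluated, measuring θ_i at the point where the desired ray meets the geometric proxy (this is what property #6, angles measured w.r.t. the proxy, refers to). The function name and vector layout are illustrative assumptions, not the paper's code.

```python
import numpy as np

def angle_penalty(proxy_point, cam_center, desired_center):
    """penalty_ang(C_i): angle at the proxy point between the direction
    toward reference camera C_i and the direction toward the desired camera."""
    to_cam = cam_center - proxy_point
    to_desired = desired_center - proxy_point
    cos_theta = np.dot(to_cam, to_desired) / (
        np.linalg.norm(to_cam) * np.linalg.norm(to_desired))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))   # radians

# Toy example: two cameras 90 degrees apart as seen from the proxy point.
p = np.array([0.0, 0.0, 0.0])
print(angle_penalty(p, np.array([1.0, 0.0, 0.0]),
                    np.array([0.0, 1.0, 0.0])))       # ~1.5708
```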

  23. Resolution Penalty penalty_res(C_i) = max(0, dist(C_i) - dist(C_desired)) [Figure: geometric proxy, a reference camera C_i, and the desired camera C_desired at different distances]

  24. Field-Of-View Penalty [Figure: penalty_fov plotted against angle]

  25. Total Penalty penalty(C_i) = α · penalty_ang(i) + β · penalty_res(i) + γ · penalty_fov(i)
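
A sketch that combines the three penalties with the weights α, β, γ from slide 25. The resolution term follows the max(0, dist(C_i) - dist(C_desired)) formula on slide 23; the field-of-view term is only a placeholder ramp, since slide 24 gives its shape as a plot rather than a formula, and in practice an out-of-view camera would receive an effectively infinite penalty.

```python
import numpy as np

def resolution_penalty(proxy_point, cam_center, desired_center):
    """penalty_res(C_i) = max(0, dist(C_i) - dist(C_desired)): penalize cameras
    that see the proxy point from farther away than the desired view does."""
    return max(0.0, float(np.linalg.norm(cam_center - proxy_point)
                          - np.linalg.norm(desired_center - proxy_point)))

def fov_penalty(angle_from_axis, half_fov, feather=0.05):
    """Placeholder ramp: zero well inside the camera's field of view,
    rising as the proxy point approaches the image border (assumption)."""
    return max(0.0, (angle_from_axis - (half_fov - feather)) / feather)

def total_penalty(p_ang, p_res, p_fov, alpha=1.0, beta=1.0, gamma=1.0):
    """penalty(C_i) = alpha*penalty_ang + beta*penalty_res + gamma*penalty_fov."""
    return alpha * p_ang + beta * p_res + gamma * p_fov
```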

  26. K-Nearest Continuous Blending • Only use cameras with the K smallest penalties • C0 continuity: contribution drops to zero as a camera leaves the K-nearest set: w̃(C_i) = 1 - penalty(C_i) / penalty(C_(K+1)st closest) • Partition of unity: normalize w(C_i) = w̃(C_i) / Σ_j w̃(C_j)
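
A sketch of the K-nearest continuous weighting on slide 26: only the K lowest-penalty cameras receive weight, each weight falls to zero as its penalty approaches that of the (K+1)-st closest camera, and the weights are normalized into a partition of unity. The array-based implementation is an illustrative assumption; it expects K to be smaller than the number of cameras and a non-zero threshold penalty.

```python
import numpy as np

def knearest_weights(penalties, k):
    """Continuous K-nearest blending weights from per-camera penalties."""
    penalties = np.asarray(penalties, dtype=float)
    order = np.argsort(penalties)
    thresh = penalties[order[k]]          # penalty of the (K+1)-st closest camera
    w = np.zeros_like(penalties)
    nearest = order[:k]
    w[nearest] = 1.0 - penalties[nearest] / thresh   # drops to 0 as a camera leaves the set
    w = np.clip(w, 0.0, None)
    total = w.sum()
    return w / total if total > 0 else w             # normalize: partition of unity

print(knearest_weights([0.1, 0.4, 0.2, 0.9], k=2))   # -> [0.6, 0.0, 0.4, 0.0]
```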

  27. Blending Field Visualization

  28. Sampling Blending Fields [Figure: blending field sampled at epipoles and a grid vs. epipoles only]

  29. Hardware Assisted Algorithm
  Sample Blending Field:
  • Select blending field sample locations
  • for each sample location j do
  •   for each camera i do: compute penalty(i) for sample location j
  •   Find K smallest penalties
  •   Compute blending weights for sample location j
  • end for
  • Triangulate sample locations
  Render with Graphics Hardware:
  • Clear frame buffer
  • for each camera i do
  •   Set current texture and projection matrix
  •   Copy blending weights to vertices' alpha channel
  •   Draw triangles with non-zero alphas
  • end for
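
A CPU-side sketch of the "Sample Blending Field" stage above: evaluate the per-camera penalties at each sample location and convert them to blending weights. The penalty and weighting callables are assumed to follow slides 22–26; the rendering stage (projective texturing and alpha blending of the triangulated samples) stays on the graphics hardware as in the pseudocode.

```python
def sample_blending_field(sample_locations, cameras, compute_penalty,
                          weights_from_penalties):
    """Return per-camera blending weights at each sample location.

    sample_locations:       points in the desired image (e.g. epipoles plus a grid)
    cameras:                the reference cameras
    compute_penalty:        callable (camera, location) -> penalty  (slides 22-25)
    weights_from_penalties: callable (penalty list) -> weights      (slide 26)
    """
    field = []
    for location in sample_locations:
        penalties = [compute_penalty(cam, location) for cam in cameras]
        field.append(weights_from_penalties(penalties))
    # The sample locations are then triangulated, the weights copied into the
    # vertices' alpha channels, and each camera's texture drawn with projective
    # texture mapping and alpha blending, as on the slide.
    return field
```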

  30. Blending Over One Triangle [Figure: blending within a single triangle, epipole-and-grid sampling vs. epipole-only sampling]

  31. Hardware Assisted Algorithm (the same two-stage algorithm as slide 29: sample the blending field, then render with graphics hardware)

  32. Demo

  33. Future Work • Optimal sampling of the camera blending field • More complete treatment of resolution effects in IBR • View-dependent geometry proxies • Investigation of geometry vs. images tradeoff

  34. Conclusions • Unstructured Lumigraph Rendering • unifies view-dependent texture mapping and lumigraph rendering methods • allows rendering from unorganized images • uses a sampled camera blending field

  35. Acknowledgements • Thanks to the members of the MIT Computer Graphics Group and the Microsoft Research Graphics and Computer Vision Groups • DARPA ITO Grant F30602-971-0283 • NSF CAREER Awards 9875859 & 9703399 • Microsoft Research Graduate Fellowship Program • Donations from Intel Corporation, Nvidia, and Microsoft Corporation
