
Introduction to Computational Photography


Presentation Transcript


  1. Introduction to Computational Photography

  2. What is Computational Photography?
  • The second breakthrough brought by information technology.
  • First: the electronic image sensor (the digital camera), a digital representation of the "image formed by the lens".
  • Second: a re-definition of the whole camera (optics and usage); the image is reconstructed by computation.
  [Figure: film camera vs. digital camera vs. computational photography; computational photography affects every part of the camera: optics, image sensor, and image processing.]

  3. What is a camera?
  • A camera is a machine that records the distribution of light in a scene.
  • How can we represent the distribution of light in the scene?
  • 3-D coordinates of the point the light passes through: X, Y, Z
  • Direction of the light: θ, Φ
  • Wavelength (color) of the light: λ
  • Time: t
  • The 7-parameter function P representing the distribution of light is called the "plenoptic function"; the distribution itself is also called the light field (light space).
  [Figure: light source, object, and the light field.]
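The seven-parameter function described on this slide is conventionally written as:

```latex
% Plenoptic function: the intensity of light at position (X, Y, Z),
% travelling in direction (\theta, \Phi), at wavelength \lambda, time t.
P = P(X, Y, Z, \theta, \Phi, \lambda, t)
```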

  4. Integration in the camera
  • A camera integrates the light over all 7 parameters:
  • Position (range of X, Y, Z): the aperture size is not zero
  • Direction (range of θ, Φ): the pixel size is not zero
  • Wavelength (range of λ): no filter passes a single wavelength
  • Exposure time (range of t): the shutter speed cannot be infinitely fast
  • Multiple samples: θ, Φ by the number of pixels; λ by RGB; t by burst shots
  • So, what is multiple sampling over X, Y, Z?
  [Figure: the light field (light rays) passing through optics (lens) onto the image sensor (pixels).]
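The integration this slide describes can be written out schematically (a sketch that ignores sensor response factors): the pixel value is the plenoptic function integrated over the aperture area A, the solid angle Ω subtended by the pixel, the color-filter passband Λ, and the exposure interval Δt.

```latex
I_{\text{pixel}} = \int_{t_0}^{t_0+\Delta t} \int_{\Lambda} \int_{\Omega} \int_{A}
    P(X, Y, Z, \theta, \Phi, \lambda, t)
    \, \mathrm{d}A \, \mathrm{d}\omega \, \mathrm{d}\lambda \, \mathrm{d}t
```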

  5. Camera arrays
  • Measuring the distribution of light at multiple positions.
  • Examples: The Stanford Multi-Camera Array (Marc Levoy, Stanford University); ProFUSION25 (ViewPlus, Inc.)

  6. Uses of camera arrays
  • Free-viewpoint images: 3-D video (Matsuyama lab, Kyoto Univ.)
  • Defocus generation by synthetic aperture (Vaish, Stanford)
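The synthetic-aperture idea can be sketched in one dimension: shift each camera's view by the disparity of a chosen depth, then average. Points at that depth re-align and stay sharp; points at other depths spread out as synthetic defocus. All geometry and constants below are illustrative, not taken from the cited systems.

```python
import numpy as np

# Toy 1-D shift-and-add refocusing with a linear camera array.
# A scene point at depth d appears displaced by disparity ~ focal * b / d
# in the camera with baseline b. Undoing the disparity of one depth and
# averaging keeps that depth sharp and defocuses everything else.
n_px = 200
baselines = np.arange(-4, 5)            # 9 cameras with unit spacing
near_depth, far_depth = 2.0, 8.0
focal = 10.0

views = []
for b in baselines:
    img = np.zeros(n_px)
    img[100 + int(round(focal * b / near_depth))] = 1.0   # near point
    img[60 + int(round(focal * b / far_depth))] = 1.0     # far point
    views.append(img)

def refocus(depth):
    acc = np.zeros(n_px)
    for b, img in zip(baselines, views):
        acc += np.roll(img, -int(round(focal * b / depth)))
    return acc / len(views)

near = refocus(near_depth)
print(near[100])   # 1.0: the near point re-aligns in every view
print(near[45])    # ~0.11: the far point spreads into 1/9-height samples
```

Refocusing at `far_depth` instead re-aligns the far point at index 60 and spreads the near one; real systems do the same on calibrated 2-D images.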

  7. Defocus Control by Uncalibrated Synthetic Aperture
  • Natsumi Kusumoto, Shinsaku Hiura and Kosuke Sato, "Uncalibrated Synthetic Aperture for Defocus Control", CVPR 2009 (Jun. 2009)

  8. Reviewing "integration"
  • Some information is lost by integration:
  • A sine wave whose period is exactly equal to the integration duration is lost.
  • Blur of an object that moves within the exposure time.
  • Defocus caused by misfocus.
  [Figure: sine wave × box exposure = 0.]
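The sine-wave example can be verified directly: integrating a sine over exactly one period gives zero, so that frequency component leaves no trace in the recorded value, while integrating over half a period does not vanish (a minimal numerical check, not from the slides).

```python
import numpy as np

# A sensor integrates light over the exposure. A sine component whose
# period equals the exposure time integrates to zero, so it leaves no
# trace in the pixel value: that frequency is lost.
T = 1.0                          # exposure duration = one sine period
n = 100_000
dt = T / n
t = np.arange(n) * dt            # uniform samples over [0, T)
wave = np.sin(2 * np.pi * t / T)

integral = np.sum(wave) * dt         # full period: vanishes
half = np.sum(wave[: n // 2]) * dt   # half period: survives (about 1/pi)

print(abs(integral))             # ~0 (machine precision)
print(half)                      # ~0.3183
```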

  9. Coded Exposure
  • Coded exposure: the exposure is coded along the time axis.
  • Flutter Shutter Camera (Raskar, MERL)
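Why fluttering the shutter helps can be seen in the frequency domain: a plain box exposure has exact nulls in its spectrum, so deblurring would have to divide by zero at those frequencies, while a broadband on/off code keeps every frequency alive. The binary code below is pseudo-random for illustration, not the actual sequence used in the Flutter Shutter Camera.

```python
import numpy as np

# Compare the frequency response of a traditional (box) exposure with a
# fluttered exposure of the same total length. Motion deblurring divides
# by these responses, so spectral nulls mean unrecoverable frequencies.
chops = 52
box = np.ones(chops)                            # shutter open throughout
rng = np.random.default_rng(1)
code = rng.integers(0, 2, chops).astype(float)  # illustrative on/off code

H_box = np.abs(np.fft.fft(box, 512))
H_code = np.abs(np.fft.fft(code, 512))

print(H_box.min())    # ~0: the box spectrum has exact nulls
print(H_code.min())   # bounded away from zero: invertible in principle
```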

  10. Coded exposure results (slide by R. Raskar)
  [Figure: image deblurred from a traditional exposure vs. image deblurred from a coded exposure, compared with an image of the static object.]

  11. Coded Aperture
  • Coded aperture: a spatial 2-D broadband mask, for focus deblurring.
  • Coded exposure: a temporal 1-D broadband code, for motion deblurring.
  (Slide by R. Raskar)

  12. Captured blurred photo (slide by R. Raskar)

  13. Refocused on the person (slide by R. Raskar)

  14. Coded Aperture
  • Levin et al., MIT (2007)
  • Depth estimation from a single image (manual operation is necessary).

  15. Coded Aperture (continued)
  • Levin et al., MIT (2007)

  16. Coded Aperture (continued)
  • Levin et al., MIT (2007)

  17. Multi-Focus Camera with Coded Aperture
  • Stabilizing depth estimation and deblurring with a coded aperture.
  • Simultaneous capture of 3 images focused at different distances.
  • Hiura et al., CVPR 1998, SSII 1999

  18. Multi-Focus Range Sensor using Coded Aperture

  19. Invariant integration
  • Defocus changes with the distance to the object; blur changes with the speed of the object.
  • Reconstruction is not easy, because the distance or the speed must be estimated first.
  • Is it possible to make defocus or blur invariant to distance or speed?

  20. Invariant integration
  • Defocus:
    • Special optics: Wavefront Coding (CDM Optics, Inc.)
    • Motion of the image sensor during exposure
  • Blur:
    • Reciprocal motion of the camera

  21. Motion of the Image Sensor for Invariant Defocus
  • H. Nagahara, S. Kuthirummal, C. Zhou, and S. K. Nayar, "Flexible Depth of Field Photography", ECCV 2008

  22. Motion of the Image Sensor for Invariant Defocus (continued)
  • H. Nagahara, S. Kuthirummal, C. Zhou, and S. K. Nayar, "Flexible Depth of Field Photography", ECCV 2008

  23. Deblurring by Reciprocal Motion of the Camera
  • A. Levin, P. Sand, T. S. Cho, F. Durand and W. T. Freeman, "Motion-Invariant Photography", SIGGRAPH 2008
  [Figure: input image and deblurred image.]

  24. Deblurring by Reciprocal Motion of the Camera (continued)
  • A. Levin, P. Sand, T. S. Cho, F. Durand and W. T. Freeman, "Motion-Invariant Photography", SIGGRAPH 2008
  [Figure: the equipment, and a conceptual figure for light sources moving at different speeds.]
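The parabolic-sweep idea behind motion-invariant photography can be checked numerically: if the camera's displacement is a parabola in time, the blur kernel (the time spent at each camera-object displacement) has the same shape for any object speed in a range, differing only by a shift, so a single deconvolution kernel deblurs all of them. The constants below are arbitrary and illustrative.

```python
import numpy as np

# Parabolic camera sweep: relative displacement x(t) = a*t^2 - v*t for
# an object moving at speed v. The blur kernel is the time spent at each
# displacement; for speeds well inside the design range it keeps the
# same shape regardless of v (differences appear only at the tails).
a, T = 1.0, 2.0
t = np.linspace(-T / 2, T / 2, 2_000_001)

def psf(v, bins):
    x = a * t**2 - v * t
    x = x - x.min()                 # align kernels at their left edge
    h, _ = np.histogram(x, bins=bins)
    return h / h.sum()

bins = np.linspace(0.0, 1.6, 81)    # common displacement grid
p_static = psf(0.0, bins)
p_moving = psf(0.5, bins)

core = slice(1, 25)                 # bins away from the kernel tails
print(np.abs(p_static[core] - p_moving[core]).max())  # ~0: same shape
```

Only the kernel tails differ between the two speeds, which is why the method is described as approximately, not exactly, invariant.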

  25. More...
  • Resources on the web:
  • Wikipedia: computational photography
  • http://computationalphotography.org/
  • http://www1.cs.columbia.edu/CAVE/projects/what_is/
  • http://projects.csail.mit.edu/photo/
  • Conferences:
  • International Conference on Computational Photography
  • SIGGRAPH, CVPR, etc.: sessions on computational photography
