
Synthetic Aperture Focusing Using Dense Camera Arrays


Presentation Transcript


  1. Synthetic Aperture Focusing Using Dense Camera Arrays Vaibhav Vaish Computer Graphics Laboratory Stanford University

  2. Cameras are Getting Cheaper … (Photo: AP)

  3. Why Camera Arrays? • High performance imaging • Virtual reality (Eyevision, Super Bowl 2001; Matrix bullet scene, Manex)

  4. Stanford Multi-camera Array • 100 cameras × 640×480 pixels × 30 frames/sec = 1 GB/sec • per-camera processing • scalable and flexible

  5. Stanford Multi-camera Array

  6. Demo #1: High Speed Video • N cameras, each running at 30Hz • Stagger the frames of cameras by 1/Nth of a frame • Align images to single perspective Video: 52 cameras, 1560 Hz
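
The staggered-trigger idea on slide 6 can be pictured with a few lines of Python. This is only an illustration of the timing arithmetic, not the array's trigger firmware; the camera and frame counts are taken from the demo caption.

```python
# Illustrative timing arithmetic for slide 6 (not the array's trigger firmware).
FRAME_RATE = 30.0              # Hz, per camera
N_CAMERAS = 52                 # as in the demo: 52 cameras -> 52 x 30 = 1560 Hz

frame_period = 1.0 / FRAME_RATE
stagger = frame_period / N_CAMERAS       # each camera offset by 1/N of a frame

def trigger_times(camera_index, n_frames):
    """Trigger times (in seconds) for one camera's frames."""
    return [camera_index * stagger + k * frame_period for k in range(n_frames)]

# Interleaving all cameras' frames by timestamp yields samples 1/(N*30) s apart,
# i.e. an effective 1560 Hz video (after aligning every camera's image to a
# single reference perspective).
events = sorted((t, cam) for cam in range(N_CAMERAS)
                for t in trigger_times(cam, n_frames=3))
```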

  7. Demo #2: High Resolution Video • 12 × 8 array of VGA cameras • total field of view = 29° wide • seamless stitching • cameras individually metered Tiled Video: 7 Megapixels

  8. Camera Array: Portable Version • 48 cameras in 16 x 3 layout • 2m wide baseline

  9. Synthetic Aperture Focusing: Scene • distance to occluder 33m • distance to targets 40m • field of view at target 3m

  10. Synthetic Aperture Focusing: Results Synthetic aperture sequence

  11. Synthetic Aperture Focusing: Results Synthetic aperture sequence

  12. Outline • Synthetic aperture focusing: basics • Technical challenges • Determining the image transformations • How to refocus efficiently • Real-time system • Future Work

  13. Synthetic Aperture Focusing

  14. Synthetic Aperture Focusing

  15. Synthetic Aperture Focusing

  16. Synthetic Aperture Focusing

  17. Synthetic Aperture Focusing

  18. Synthetic Aperture Focusing

  19. Synthetic Aperture Focusing: Properties • Focusing is a computational process (as opposed to optical) • can vary focal plane after capturing images • Can use arbitrary apertures • Averaging multiple images improves signal-to-noise ratio

  20. Focusing on one Plane • Backproject each camera image on to the focal plane • This is a 2D image warp called a homography (3×3 matrix)

  21. Focusing on one Plane • The final image is the average of all the backprojected camera images
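
As a concrete illustration of slides 20 and 21, the sketch below warps each camera image with its 3×3 homography and averages the results. It assumes the per-camera homographies are already known from calibration and uses OpenCV's warpPerspective for the 2D warp; the function and variable names are illustrative, not from the actual system.

```python
# Minimal sketch of focusing on one plane (slides 20-21), assuming the 3x3
# homographies H_i that backproject each camera image onto the focal plane
# are already available from calibration.
import numpy as np
import cv2

def focus_on_plane(images, homographies, out_size):
    """images: list of HxWx3 uint8 arrays; homographies: list of 3x3 arrays;
    out_size: (width, height) of the synthetic-aperture image."""
    acc = np.zeros((out_size[1], out_size[0], 3), dtype=np.float64)
    for img, H in zip(images, homographies):
        # Backproject this camera's image onto the focal plane (2D warp).
        acc += cv2.warpPerspective(img, H.astype(np.float64), out_size)
    # The synthetic-aperture image is the average of the backprojections.
    return (acc / len(images)).astype(np.uint8)
```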

  22. Example: Focusing on one plane • Add camera images so that points on one plane are in good focus

  23. Technical Challenges • How do we determine the projections for focusing digitally? • what camera parameters do we need to calibrate? • what are the image warps required? • Are there efficient algorithms for varying the focal plane? • a homography requires 3 adds + 1 divide/pixel • 100 video cameras = 90 million pixels/sec • computationally intensive!

  24. Varying the Focal Plane [figure: a camera image backprojected onto two focal planes, Plane 1 and Plane 2]

  25. Varying the Focal Plane [figure: camera images, the reference plane, and the focal plane]

  26. Planar Homologies • Refocusing requires projecting the image from the reference plane on to the new focal plane • This reprojection is called a planar homology • A homology is described by a matrix of the form Hi = I + μi ei l^T
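
A minimal sketch of the homology on slide 26, with the scalar μi written out explicitly (ei is the epipole, l the line of intersection of the reference and focal planes); the helper name is illustrative.

```python
# Minimal sketch of the planar homology on slide 26: H_i = I + mu_i * e_i * l^T,
# with e_i the epipole, l the line of intersection of the reference and focal
# planes, and mu_i a per-camera scalar selecting the focal plane.
import numpy as np

def homology(epipole, line, mu):
    """Return the 3x3 homology I + mu * e * l^T (epipole and line as 3-vectors)."""
    e = np.asarray(epipole, dtype=float)
    l = np.asarray(line, dtype=float)
    return np.eye(3) + mu * np.outer(e, l)
```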

  27. Focusing Algorithm • Calibration (pre-process): homographies for projection on to the reference plane and epipoles [Vaish 2004, Hartley 2000] • Project images on the reference plane • Vary the focal plane by applying the homologies given by Hi = I + μi ei l^T
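
Putting slides 26 and 27 together, here is a hedged end-to-end sketch. Calibration (not shown) is assumed to supply the per-camera reference-plane homographies Pi, the epipoles ei, and the line l; the list of per-camera scalars mus stands in for however the chosen focal plane is parameterized. This is an illustration in Python/OpenCV, not the real-time system's code.

```python
# Sketch of the two-stage focusing algorithm on slide 27. Calibration supplies
# ref_homographies P_i (camera image -> reference plane), epipoles e_i, and the
# line l; mus holds the per-camera scalars mu_i for the desired focal plane.
import numpy as np
import cv2

def refocus(images, ref_homographies, epipoles, line, mus, out_size):
    acc = np.zeros((out_size[1], out_size[0], 3), dtype=np.float64)
    l = np.asarray(line, dtype=float)
    for img, P, e, mu in zip(images, ref_homographies, epipoles, mus):
        # Stage 1 (pre-process): project the camera image onto the reference plane.
        on_ref = cv2.warpPerspective(img, np.asarray(P, dtype=float), out_size)
        # Stage 2 (per focal plane): apply the homology H_i = I + mu_i * e_i * l^T.
        H = np.eye(3) + mu * np.outer(np.asarray(e, dtype=float), l)
        acc += cv2.warpPerspective(on_ref, H, out_size)
    return (acc / len(images)).astype(np.uint8)
```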

  28. Example: varying focal planes Rotating focal plane

  29. Case 1: Parallel Reference Plane • When the reference plane is parallel to the camera plane, the epipoles ei = [xi yi 0]^T are points at infinity • Homologies Hi = I + μi ei l^T reduce to affine transforms
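
A quick numerical check of Case 1, with made-up numbers: when the epipole has the form [xi yi 0]^T, the homology's bottom row stays [0 0 1], so the warp is affine.

```python
# Illustrative check of Case 1 (slide 29): an epipole at infinity makes the
# homology affine (its bottom row remains [0, 0, 1]). Numbers are made up.
import numpy as np

e = np.array([2.0, -1.0, 0.0])   # epipole at infinity: [x_i, y_i, 0]^T
l = np.array([0.3, 0.1, 1.0])    # line of intersection (arbitrary)
mu = 0.05
H = np.eye(3) + mu * np.outer(e, l)
assert np.allclose(H[2], [0.0, 0.0, 1.0])   # bottom row unchanged -> affine warp
```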

  30. Case 2: Parallel Focal Planes • When the focal planes are parallel to the reference plane, the line l = [0 0 1]^T is at infinity • Homologies Hi = I + μi ei l^T reduce to a scale and shift

  31. Case 3: Frontoparallel Planes • When the reference plane, camera plane and focal planes are all parallel, the epipoles ei and the line l are both at infinity • Homologies Hi = I + μi ei l^T reduce to shifts [Vaish 2004]
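
Case 3 is the classic shift-and-add form of synthetic aperture focusing; a minimal sketch follows. It assumes each image's shift is proportional to the camera's offset in the camera plane times a focus parameter, and it uses np.roll for brevity, which wraps at the image borders.

```python
# Minimal shift-and-add sketch for Case 3 (slide 31): when all planes are
# frontoparallel, refocusing is a per-camera 2D shift plus averaging. The
# shift-proportional-to-camera-offset parameterization is assumed here.
import numpy as np

def shift_and_add(images, camera_offsets, focus):
    """images: list of HxW(x3) arrays; camera_offsets: (dx, dy) per camera in
    the camera plane; focus: scalar selecting the frontoparallel focal plane."""
    acc = np.zeros(images[0].shape, dtype=np.float64)
    for img, (dx, dy) in zip(images, camera_offsets):
        shift = (int(round(focus * dy)), int(round(focus * dx)))  # (rows, cols)
        acc += np.roll(img, shift, axis=(0, 1))
    return (acc / len(images)).astype(images[0].dtype)
```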

  32. Case 4: Scheimpflug Configuration • When the camera plane, reference plane and focal planes intersect in the same line, l and ei can both be mapped to infinity • Homologies Hi = I + μi ei l^T again reduce to shifts

  33. Taxonomy of Homologies 1. Affine 2. Scale + shift 3. Shift 4. Shift + post-warp of final image

  34. Real-time Implementation • Projection on to fixed reference plane (Look-up table) • Shift image for desired focal plane (FPGA) • Send MPEG stream to client PC (Firewire) • Decompress and add streams on client PC • Server adds streams from clients and displays live synthetic aperture video

  35. Real-time system: Demo

  36. Real-time System: Discussion • Reference plane is fixed • Initial projection (homographies) can be implemented via look-up table which is computed beforehand • Varying focal plane requires shifting images • Easy to realize in FPGAs (2 adds/camera) • Keeps per-camera cost low • Extensions • Implement affine warps (2 adds/pixel) • Computer assisted focusing • Study other architectures for digital focusing (GPU)
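
The look-up-table item on slide 36 can be pictured as below: the fixed reference-plane homography is evaluated once per camera into a pixel-index table, so each incoming frame is warped by a plain gather. This is only a software illustration of the idea, not the FPGA/FireWire implementation described on the slide; H_inv here is the assumed inverse mapping from output pixels back to source pixels.

```python
# Software illustration of the precomputed look-up table for the fixed
# reference-plane projection (slide 36). Nearest-neighbour sampling keeps
# the run-time warp a simple table lookup.
import numpy as np

def build_warp_lut(H_inv, out_w, out_h, src_w, src_h):
    ys, xs = np.mgrid[0:out_h, 0:out_w]
    pts = np.stack([xs, ys, np.ones_like(xs)], axis=-1).reshape(-1, 3).T
    src = np.asarray(H_inv, dtype=float) @ pts
    sx = np.clip(np.round(src[0] / src[2]).astype(int), 0, src_w - 1)
    sy = np.clip(np.round(src[1] / src[2]).astype(int), 0, src_h - 1)
    return sy.reshape(out_h, out_w), sx.reshape(out_h, out_w)

def apply_lut(frame, lut):
    sy, sx = lut
    return frame[sy, sx]   # per-frame projection is just a table lookup
```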

  37. Outline • Synthetic aperture focusing: basics • Technical challenges • Determining the image transforms • How to refocus efficiently • Real-time system • Future Work • Matted synthetic apertures • General focal surfaces

  38. Matted Synthetic Apertures

  39. Crowd scene

  40. Crowd scene

  41. Curved focal surfaces [figures: unoccluded view, planar focal surface, cylindrical focal surface]

  42. General Focal Surfaces • Can we reconstruct the correct focal depth for every pixel? • Shape from focus [figures: single camera view, “Unoccluded” view]

  43. General Focal Surfaces • Can we reconstruct the correct focal depth for every pixel? • Shape from stereo [figures: single camera view, “Unoccluded” view]

  44. Summary • Using a camera array for synthetic aperture focusing • large synthetic aperture allows seeing through partial occluders • Geometry of digital focusing • Real-time system • Future work • explore general apertures • reconstruct correct focal depths for each pixel • study design space of synthetic aperture camera arrays

  45. Acknowledgements • Sponsors • Bosch Research • NSF IIS-0219856-001 • DARPA NBCH 1030009 • Acquisition assistance • Augusto Roman, Billy Chen, Abhishek Bapna, Mike Cammarano • Listeners • Gaurav Garg, Ren Ng, Jeff Klingner, Doantam Phan, Niloy Mitra, Sriram Sankaranarayanan

  46. The Camera Array Team • Staff: Mark Horowitz, Marc Levoy, Bennett Wilburn • Students: Vaibhav Vaish, Gaurav Garg, Eino-Ville Talvala, Emilio Antunez, Andrew Adams, Neel Joshi, Georg Petschnigg, Guillaume Poncin, Monica Goyal • Collaborators: Mark Bolas, Ian McDowall, Microsoft Research • Funding: Bosch Research, Intel, Sony, Interval Research, NSF, DARPA • http://graphics.stanford.edu/projects/array

  47. Effect of feature size • see-around ability ~ a·Δz / (d·s) • it increases with aperture width (a) and separation (Δz) relative to distance (d) from the cameras and to feature size (s) • independent of number of cameras • can see around 2” leaves at 125’ using a 16” aperture (ours was 6’) • scene values: a = 6’, d = 125’, Δz = 15’, s = 2”
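
For concreteness, the slide's quantities plug into the figure of merit as follows (my own arithmetic; reading the merit as being roughly 1 at the smallest usable aperture is an assumption, not a statement from the slide).

```python
# Slide 47's quantities plugged into see_around ~ a*dz / (d*s), in feet.
s = 2.0 / 12.0            # feature size: 2 inches
d = 125.0                 # distance from the cameras: 125 feet
dz = 15.0                 # separation: 15 feet

a_used = 6.0              # aperture actually used: 6 feet
a_quoted = 16.0 / 12.0    # the 16-inch aperture quoted as sufficient

print(a_used * dz / (d * s))    # ~ 4.3 with the 6-foot aperture
print(a_quoted * dz / (d * s))  # ~ 1.0 with the 16-inch aperture
```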

  48. Effect of occluder density • see-through ability increases with number of cameras (n) relative to occluder opacity (α) • independent of aperture size • our bushes averaged 97% opaque (needs better measurement) • qualitative figure of merit depends on human perception
