
Practical Scene Illuminant Estimation via Flash/No-Flash Pairs


Presentation Transcript


  1. Practical Scene Illuminant Estimation via Flash/No-Flash Pairs Cheng Lu and Mark S. Drew Simon Fraser University {clu, mark}@cs.sfu.ca

  2. Flash/No-flash Imagery – a Brief History
  • diCarlo, Xiao & Wandell, CIC 2001: combine flash/no-flash images to produce a pure-flash image; use a dim-3 finite-dimensional model (FDM) plus knowledge of the flash SPD and sensor curves to estimate surface reflectance → most likely ambient illuminant.
  • Szeliski et al., Siggraph 2004: transfer lower-noise information from the flash image to the higher-noise ambient-light image.
  • Raskar et al., Non-Photorealistic Rendering 2004: fill in night-time imagery with daytime image information.
  • Blake et al., Poisson Image Editing, Siggraph 2003: copy edges from a cloned image region into the edge map of the target background; re-integrate.
  • Drew, Lu & Finlayson, Removing Shadows using Flash/No-flash Image Edges, ICME 2006: find the shadow mask, copy edges inside the shadow from the flash image into the ambient image, and re-integrate.

  3. This paper: Estimate the Ambient Illuminant using Flash/No-flash Pairs
  Like the diCarlo & Wandell approach, but replace knowledge of the camera sensor curves with a camera-RGB-based calibration using the difference of the with-flash and no-flash images. How? (The whole pipeline is sketched in code below.)
  • Spectrally sharpen.
  • Subtract "both" – "no-flash" → pure-flash image.
  • Take logs.
  • Project the difference of flash minus ambient into geometric-mean chromaticity color space.
  • Calibrate this to get the illuminant chromaticity.
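A minimal sketch of that pipeline in Python/NumPy, assuming linear (RAW-like) RGB arrays for the two exposures. The sharpening matrix `M_sharp` and the term `calib_offset` are hypothetical inputs standing in for the paper's calibration, and all names are illustrative rather than the authors' code:

```python
import numpy as np

def geometric_mean_log_chromaticity(rgb, eps=1e-6):
    """Per-pixel log chromaticity relative to the geometric mean of R, G, B."""
    rgb = np.clip(rgb.astype(np.float64), eps, None)
    log_rgb = np.log(rgb)
    return log_rgb - log_rgb.mean(axis=-1, keepdims=True)   # log(rho_k / rho_M)

def estimate_ambient_illuminant_feature(ambient, both, M_sharp, calib_offset):
    """Sketch of the flash/no-flash pipeline outlined on this slide.

    ambient, both : HxWx3 linear RGB images (no-flash, and ambient+flash)
    M_sharp       : 3x3 spectral-sharpening matrix (assumed known from calibration)
    calib_offset  : calibration term learned from training scenes (assumed known)
    """
    # 1. Spectrally sharpen both images.
    amb_s  = np.asarray(ambient, dtype=np.float64) @ M_sharp.T
    both_s = np.asarray(both, dtype=np.float64) @ M_sharp.T
    # 2. Subtract "both" - "no-flash" -> pure-flash image.
    flash = np.clip(both_s - amb_s, 0.0, None)
    # 3./4. Take logs and project into geometric-mean chromaticity space.
    r_ambient = geometric_mean_log_chromaticity(amb_s)
    r_flash   = geometric_mean_log_chromaticity(flash)
    diff = r_ambient - r_flash       # surface terms cancel; only illumination remains
    # 5. Compare the scene's mean difference against the calibration; how that
    #    comparison yields the illuminant is unpacked on the later slides.
    return diff.reshape(-1, 3).mean(axis=0) - calib_offset
```

The individual steps are unpacked on the slides that follow.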

  4. What's the point? We can estimate the scene (ambient) illuminant without knowing:
  • the flash SPD
  • the camera sensors
  • the surface reflectance

  5. Why estimate the illuminant? For white balance, plus the many computer vision applications that want intrinsic images with illumination removed. What's good about this method?
  • Simple
  • Fast

  6. The set-up: two images, one under ambient lighting only and another under ambient plus flash. Under ambient: image "A". Under both: image "B".

  7. The Key: the Pure-Flash Image
  • The ambient light in "A" is also present in "B".
  • Therefore, if we subtract the two we have "F", the pure-flash image: F = B − A.
  Under flash only: image "F".
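A minimal sketch of this subtraction, assuming the two exposures are registered, linear (not gamma-encoded), and taken with identical camera settings so the ambient term really is common to both; the names are illustrative:

```python
import numpy as np

def pure_flash_image(ambient, both):
    """F = B - A: remove the ambient contribution, keeping only the flash light.

    Assumes the two images are registered, linear (not gamma-encoded), and taken
    with identical camera settings, so the ambient term is identical in each.
    """
    F = np.asarray(both, dtype=np.float64) - np.asarray(ambient, dtype=np.float64)
    return np.clip(F, 0.0, None)    # small negatives are just sensor noise
```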

  8. Image “F”: the scene as imaged under Flash light only. Incidentally, note that there are now extra shadows, from the flash (since it’s offset from the lens).

  9. A simple image formation model will guide us. Assumptions: 1, 2, 3.
  1. Lambertian surface:
  $$\rho_k = \sigma \int E(\lambda)\, S(\lambda)\, Q_k(\lambda)\, d\lambda, \qquad k = R, G, B$$
  where $E$ is the illuminant SPD, $S$ the surface reflectance, $Q_k$ the camera sensors, and the shading $\sigma$ = surface normal · effective light direction.
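To make assumption 1 concrete, a small numerical sketch with made-up spectra sampled every 10 nm; the curves are illustrative, not the paper's data:

```python
import numpy as np

# Wavelength samples (nm) and toy spectra, purely illustrative (not the paper's data).
lam = np.arange(400, 701, 10, dtype=np.float64)
E = np.exp(-((lam - 560.0) / 120.0) ** 2)           # illuminant SPD (toy)
S = 0.2 + 0.6 * (lam - 400.0) / 300.0               # surface reflectance rising to the red (toy)
Q = np.stack([np.exp(-((lam - c) / 30.0) ** 2)      # three Gaussian "sensor" curves
              for c in (610.0, 540.0, 460.0)])      # R, G, B centers (toy)

sigma = 0.8                                         # shading: surface normal . light direction
rho = sigma * (E * S * Q).sum(axis=1) * 10.0        # Riemann sum over 10 nm bins approximates
print(rho)                                          # the integral; result is one RGB triple
```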

  10. 2. Narrow-band sensors: $Q_k(\lambda)$ is exactly a single-spike sensor, $Q_k(\lambda) = q_k\, \delta(\lambda - \lambda_k)$, so then
  $$\rho_k = \sigma\, E(\lambda_k)\, S(\lambda_k)\, q_k$$

  11. 3. Planckian light (in Wien's approximation):
  $$E(\lambda, T) = I\, k_1\, \lambda^{-5}\, e^{-k_2/(\lambda T)}$$
  This gives
  $$\rho_k = \sigma\, I\, k_1\, \lambda_k^{-5}\, e^{-k_2/(\lambda_k T)}\, S(\lambda_k)\, q_k$$
  But we can violate assumptions 1, 2, 3 and still succeed.
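A small numerical sketch of Wien's approximation, using the standard radiation constants; the spike wavelengths, surface values, and gains are made up for illustration:

```python
import numpy as np

K1 = 3.7418e-16   # first radiation constant c1 (W*m^2)
K2 = 1.4388e-2    # second radiation constant c2 (m*K)

def wien_spd(lam_m, T, intensity=1.0):
    """E(lambda, T) = I * k1 * lambda^-5 * exp(-k2 / (lambda * T)) (Wien)."""
    lam_m = np.asarray(lam_m, dtype=np.float64)
    return intensity * K1 * lam_m ** -5 * np.exp(-K2 / (lam_m * T))

# Spike-sensor wavelengths, surface values, and gains are made up for illustration.
lam_k = np.array([610e-9, 540e-9, 460e-9])   # sensor spike positions (m)
S_k   = np.array([0.6, 0.4, 0.3])            # surface reflectance at the spikes
q_k   = np.array([1.0, 1.0, 1.0])            # sensor gains
sigma = 0.8                                  # shading

for T in (2856.0, 6500.0):                   # tungsten-like vs daylight-like
    rho = sigma * wien_spd(lam_k, T) * S_k * q_k
    print(T, rho / rho.sum())                # chromaticity shifts with temperature
```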

  12. Now take logs, to pull the multiplications apart:
  $$\log \rho_k = \log(\sigma I) + \log w_k + \log S(\lambda_k) + e_k / T$$
  where $w_k = k_1\, \lambda_k^{-5}\, q_k$ and $e_k = -k_2 / \lambda_k$ are camera-dependent vectors, $S(\lambda_k)$ is the surface, $\log(\sigma I)$ carries the intensity and shading, and $T$ is the color temperature of the light.

  13. We'd like to remove the intensity/shading term, so form the geometric-mean chromaticity:
  $$\chi_k = \rho_k / \rho_M, \qquad \rho_M = (\rho_R\, \rho_G\, \rho_B)^{1/3}$$
  In logs:
  $$r_k \equiv \log \chi_k = \log\!\big(s_k / s_M\big) + (e_k - \bar{e}) / T$$
  where $s_k = w_k\, S(\lambda_k)$ with geometric mean $s_M$ captures the surface, $(e_k - \bar{e})$ with $\bar{e} = \tfrac{1}{3}\sum_j e_j$ is a camera-dependent vector, and $T$ is the color temperature of the light. The shading/intensity term $\log(\sigma I)$ has cancelled.
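A minimal numerical check that this chromaticity really cancels the shading/intensity term (toy RGB values only):

```python
import numpy as np

def log_geomean_chromaticity(rgb, eps=1e-12):
    """r_k = log(rho_k / (rho_R * rho_G * rho_B)^(1/3))."""
    rgb = np.clip(np.asarray(rgb, dtype=np.float64), eps, None)
    log_rgb = np.log(rgb)
    return log_rgb - log_rgb.mean(axis=-1, keepdims=True)

rho = np.array([0.42, 0.31, 0.18])           # one pixel (made-up values)
for scale in (1.0, 0.25, 3.0):               # different shading/intensity sigma*I
    print(log_geomean_chromaticity(scale * rho))
# All three printouts are identical: the log(sigma*I) term has cancelled.
```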

  14. The point: as the temperature (light color) changes, $r_k$ moves along a straight line.
  • But we have the "A" and "F" images:
  • Subtract them, and use the same chromaticity trick →
  • Only the illumination is left!

  15. Log-difference geometric-mean chromaticity: the surface term is identical under "A" and under "F", so subtracting the two chromaticities makes it cancel, and the log-log difference delivers the inverse-temperature difference:
  $$r_k^A - r_k^F = (e_k - \bar{e})\left(\frac{1}{T_A} - \frac{1}{T_F}\right)$$
  • Calibrate for $1/T_A - 1/T_F$,
  • then in a new scene obtain $T_A$! (A numerical sketch of this step follows below.)
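Here is a minimal numerical sketch of that step under the spike-sensor/Wien model; the wavelengths and the assumed flash temperature `T_F` are illustrative stand-ins for a real calibration:

```python
import numpy as np

K2 = 1.4388e-2                                   # Wien's c2 (m*K)
lam_k = np.array([610e-9, 540e-9, 460e-9])       # spike-sensor wavelengths (illustrative)
e_k = -K2 / lam_k
e_centered = e_k - e_k.mean()                    # (e_k - e_bar), camera-dependent direction

def inv_temp_difference(r_ambient, r_flash):
    """Least-squares estimate of (1/T_A - 1/T_F) from chromaticity differences.

    r_ambient, r_flash : Nx3 log geometric-mean chromaticities of the same
    surfaces under ambient light and under pure-flash light; the surface
    terms cancel in the difference, leaving only the illumination term.
    """
    d = (np.asarray(r_ambient) - np.asarray(r_flash)).reshape(-1, 3).mean(axis=0)
    return float(np.dot(d, e_centered) / np.dot(e_centered, e_centered))

T_F = 6000.0   # assumed flash color temperature, fixed by a one-off calibration

# Synthetic check: fabricate a difference for T_A = 2856 K and recover it.
T_A_true = 2856.0
d_synth = e_centered * (1.0 / T_A_true - 1.0 / T_F)
delta = inv_temp_difference(d_synth[None, :], np.zeros((1, 3)))
print(1.0 / (delta + 1.0 / T_F))                 # prints ~2856
```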

  16. What does this look like? Moved to 2D; color-matching functions in geo-mean chromaticity. (9 Planckians, Macbeth ColorChecker, spike sensors, xenon flash SPD)

  17. Sony DXC930 sensors, daylights + F2, actual xenon flash SPD: this gives the "reference locus". How to proceed:
  • Sharpen.
  • Find the closest cluster on the reference locus (a sketch of this matching step follows below).
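A minimal sketch of that nearest-cluster lookup, assuming the reference locus has been reduced to one mean chromaticity difference per calibration illuminant; the names and numbers are made up, not the paper's calibration:

```python
import numpy as np

def closest_reference_illuminant(scene_diff, reference_locus):
    """Pick the calibration illuminant whose cluster mean lies nearest.

    scene_diff      : length-3 mean log-chromaticity difference for the new scene
    reference_locus : dict mapping illuminant name -> length-3 cluster mean
    """
    names = list(reference_locus)
    means = np.array([reference_locus[n] for n in names], dtype=np.float64)
    dists = np.linalg.norm(means - np.asarray(scene_diff, dtype=np.float64), axis=1)
    return names[int(np.argmin(dists))]

# Illustrative reference locus (values made up, not the paper's calibration data).
locus = {
    "D65": np.array([ 0.010, -0.002, -0.008]),
    "F2":  np.array([-0.030,  0.004,  0.026]),
    "CWF": np.array([-0.045,  0.006,  0.039]),
}
print(closest_reference_illuminant([-0.041, 0.005, 0.036], locus))   # -> "CWF"
```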

  18. Effect of sharpening (Kodak DCS420): poor clusters → much better clusters after spectral sharpening.
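For concreteness, a minimal sketch of what "sharpening" means operationally here, a fixed 3×3 transform of camera RGB; the matrix below is a generic diagonally dominant example, not the Kodak DCS420 calibration:

```python
import numpy as np

# Illustrative sharpening matrix (diagonally dominant, rows roughly summing to 1);
# a real one comes from a calibration such as data-based spectral sharpening.
M_SHARP = np.array([
    [ 1.20, -0.15, -0.05],
    [-0.10,  1.25, -0.15],
    [-0.05, -0.20,  1.25],
])

def sharpen(rgb):
    """Apply the 3x3 sharpening transform to an HxWx3 (or Nx3) RGB array."""
    return np.asarray(rgb, dtype=np.float64) @ M_SHARP.T

# After sharpening, each effective channel responds to a narrower band of
# wavelengths, so the per-illuminant clusters in log chromaticity tighten.
demo = sharpen(np.random.rand(4, 4, 3))
```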

  19. Test: can we determine the illuminant? 102 illuminants, Sony camera, Macbeth patches; 102 illuminants, Sony camera, Munsell patches. Estimating the illuminant across charts, from Munsell to Macbeth → nearly 100% correctly identified.

  20. Application: White Balance
  Image under CWF, and under CWF + xenon flash (no flash → with flash). 4 calibration illuminants, HP camera, Macbeth chart (each cluster has 24 dots).
  • Sharpen.
  • Sample the image at 24 locations spread evenly over the image.
  • Use the same ("daylight") color balance for training and for testing.

  21. The scene overlaps best with the CWF cluster, so use the white patch of the Macbeth chart under CWF for white balance (a sketch of this correction follows below):
  • Our color balance: much closer.
  • "Auto" balance: wrong.
  • "Fluor" balance: correct.
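A minimal sketch of that diagonal, von Kries-style correction, assuming we know the RGB recorded for the Macbeth white patch under CWF; the numbers are illustrative:

```python
import numpy as np

def white_balance(image, white_patch_rgb):
    """Diagonal, von Kries-style correction: scale the channels so that the
    white patch recorded under the estimated illuminant maps to neutral."""
    w = np.asarray(white_patch_rgb, dtype=np.float64)
    gains = w.max() / w                        # normalize to the strongest channel
    return np.clip(np.asarray(image, dtype=np.float64) * gains, 0.0, 1.0)

# Illustrative usage: Macbeth white patch as recorded under CWF (made-up numbers).
img = np.random.rand(8, 8, 3)                  # stand-in for the no-flash image
balanced = white_balance(img, white_patch_rgb=[0.92, 1.00, 0.74])
```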

  22. Thanks! To Natural Sciences and Engineering Research Council of Canada
