
Depth Enhancement Technique by Sensor Fusion: MRF-based Approach
Speaker: Min-Koo Kang
March 26, 2013


Presentation Transcript


  1. Depth Enhancement Technique by Sensor Fusion: MRF-based approach. Speaker: Min-Koo Kang. March 26, 2013

  2. Outline • Review of filter-based method • Summary and limitation • Related work • MRF-based depth up-sampling framework • Introduction of state-of-the-art method • High Quality Depth Map Upsampling for 3D-TOF Cameras / ICCV 2011 • Future work • Remaining problems • Strategy

  3. Depth upsampling • Definition • Conversion of a depth map with low resolution into one with high resolution • Approach • Most state-of-the-art methods are based on sensor fusion; i.e., they use an image sensor and a range sensor together [Figures: depth map up-sampling by bicubic interpolation vs. depth map up-sampling using image and range sensors]
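The bicubic interpolation shown in the slide's left figure is the naive baseline that sensor-fusion methods improve upon; it uses only the low-resolution depth map and therefore blurs depth discontinuities. A minimal sketch, assuming `scipy` is available (function name and parameters are illustrative):

```python
import numpy as np
from scipy.ndimage import zoom

def upsample_bicubic(depth_lr, scale):
    """Upsample a low-resolution depth map by bicubic interpolation.

    Ignores the color image entirely, so depth edges get smoothed --
    the limitation sensor-fusion methods are designed to fix.
    """
    return zoom(depth_lr.astype(np.float64), scale, order=3)

depth_lr = np.random.rand(30, 40)          # stand-in low-res depth map
depth_hr = upsample_bicubic(depth_lr, 4)   # 4x upsampling
print(depth_hr.shape)                      # (120, 160)
```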

  4. Joint bilateral upsampling (JBU) • Representative formulation: S̃p = (1/kp) Σq∈N(p) Sq↓ · fS(‖p − q‖) · fI(‖I(p) − I(q)‖) • N(p): target pixel p(i, j)'s neighborhood. fS(·): spatial weighting term, applied to the pixel position p. fI(·): range weighting term, applied to the pixel value I(q). fS(·) and fI(·) are Gaussian functions with standard deviations σS and σI, respectively. [Figures: upsampled depth map; rendered 3-D view] *Kopf et al., "Joint Bilateral Upsampling", SIGGRAPH 2007
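The formulation above can be sketched directly in NumPy. This is a deliberately simple, unoptimized version: for each high-resolution pixel it sums low-resolution depth samples weighted by the two Gaussians fS and fI. Function name, neighborhood radius, and default σ values are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def jbu(depth_lr, guide, scale, sigma_s=2.0, sigma_i=10.0, radius=2):
    """Joint bilateral upsampling sketch (after Kopf et al., SIGGRAPH 2007).

    depth_lr : low-resolution depth map
    guide    : high-resolution grayscale guidance image
    scale    : integer upsampling factor
    fS and fI are Gaussians with std devs sigma_s and sigma_i,
    matching the slide's definitions.
    """
    H, W = guide.shape
    out = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            yl, xl = y / scale, x / scale       # position in low-res grid
            num, den = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    qy = int(round(yl)) + dy
                    qx = int(round(xl)) + dx
                    if 0 <= qy < depth_lr.shape[0] and 0 <= qx < depth_lr.shape[1]:
                        # spatial weight fS: distance in low-res coordinates
                        ws = np.exp(-((yl - qy) ** 2 + (xl - qx) ** 2)
                                    / (2 * sigma_s ** 2))
                        # range weight fI: intensity difference in the guide
                        gy = min(qy * scale, H - 1)
                        gx = min(qx * scale, W - 1)
                        wi = np.exp(-((guide[y, x] - guide[gy, gx]) ** 2)
                                    / (2 * sigma_i ** 2))
                        w = ws * wi
                        num += w * depth_lr[qy, qx]
                        den += w
            out[y, x] = num / den if den > 0 else 0.0
    return out
```

Note how the range weight makes depth samples across a strong guide-image edge contribute little, which is exactly what preserves sharp boundaries (and also what copies texture into smooth regions, as the next slide discusses).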

  5. Is JBU ideal enough? • Limitations of JBU: • It starts from fundamental heuristic assumptions about the relationship between depth and intensity data • Sometimes a depth discontinuity has no corresponding edge in the 2-D image • Remaining problems: • Erroneous copying of 2-D texture into actually smooth geometry within the depth map • An unwanted artifact known as edge blurring [Figures: high-resolution guidance image (red = non-visible depth discontinuities); low-resolution depth map (red = zoomed area); JBU-enhanced depth map (zoomed)]

  6. Summary of the JBU-based approach • Joint bilateral upsampling approach • Propagates properties from one modality to another • The credibility map decides system performance • Defining the blending function can be another critical factor • Many empirical parameters make practical automated use of such a fusion filter challenging • Another open question is a clear rule for when smoothing by filtering should be avoided and when a simple binary decision should be made instead

  7. MRF-Based Depth Up-sampling • Diebel et al., NIPS 2005 • Uses a multi-resolution MRF that ties together image and range data • Exploits the fact that discontinuities in range and coloring tend to co-align • Pros and cons • Robust to changes in up-sampling scale through global optimization • High computational complexity [Figure: MRF framework with data term and smoothness term]
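The MRF energy combines the two terms in the figure: a quadratic data term anchoring pixels that have range measurements, and a smoothness term whose weights shrink across strong image edges so that range and color discontinuities co-align. A minimal gradient-descent sketch in the spirit of Diebel et al.; the exponential edge weighting and all parameter values are assumptions for illustration:

```python
import numpy as np

def mrf_upsample(depth_init, guide, data_mask, lam=1.0, iters=300, step=0.05):
    """Refine a depth map by minimizing E(x) =
       lam * sum_i m_i (x_i - z_i)^2            (data term)
     + sum_(i,j) w_ij (x_i - x_j)^2             (smoothness term)
    where w_ij is small across strong guide-image edges.
    Solved by plain gradient descent (a sketch, not the paper's solver).
    """
    x = depth_init.astype(np.float64).copy()
    z = depth_init.astype(np.float64)
    # smoothness weights: near 1 in flat regions, near 0 across edges
    gy = np.exp(-np.diff(guide, axis=0) ** 2 / 100.0)
    gx = np.exp(-np.diff(guide, axis=1) ** 2 / 100.0)
    for _ in range(iters):
        grad = 2 * lam * data_mask * (x - z)     # data-term gradient
        dy = x[1:, :] - x[:-1, :]                # vertical neighbor diffs
        dx = x[:, 1:] - x[:, :-1]                # horizontal neighbor diffs
        grad[1:, :] += 2 * gy * dy
        grad[:-1, :] -= 2 * gy * dy
        grad[:, 1:] += 2 * gx * dx
        grad[:, :-1] -= 2 * gx * dx
        x -= step * grad
    return x
```

Because the energy is optimized globally rather than filtered locally, the result is less sensitive to the upsampling scale, at the cost of the iterative computation the slide notes.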

  8. A novel method based on MRF approach • High Quality Depth Map Upsampling for 3D-TOF Cameras / ICCV 2011

  9. Problem Definition

  10. System Setup and Preprocessing

  11. Evaluation on Weighting Terms

  12. The plot of PSNR accuracy • The combined weighting term consistently produces the best results across different upsampling scales.
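For reference, the PSNR accuracy plotted in the slide measures the fidelity of an upsampled depth map against the ground truth (the peak value 255 below assumes 8-bit depth maps, which is an assumption about the evaluation setup):

```python
import numpy as np

def psnr(depth_est, depth_gt, max_val=255.0):
    """Peak signal-to-noise ratio in dB between an estimated depth map
    and the ground truth; higher is better."""
    mse = np.mean((depth_est.astype(np.float64) - depth_gt) ** 2)
    if mse == 0:
        return np.inf
    return 10.0 * np.log10(max_val ** 2 / mse)
```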

  13. NLM regularization term • Thin-structure protection • Achieved by allowing pixels on the same nonlocal structure to reinforce each other within a larger neighborhood.
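The key ingredient of nonlocal-means regularization is that the weight between two pixels depends on the similarity of their surrounding patches rather than on their distance, so pixels along the same thin structure can support each other even when they are not adjacent. A sketch of that weight (patch size and σ are illustrative assumptions):

```python
import numpy as np

def nlm_weight(img, p, q, patch=3, sigma=10.0):
    """Nonlocal-means weight between pixels p and q: a Gaussian on the
    mean squared difference of the patches centered at p and q.
    Pixels on the same nonlocal structure get large mutual weights,
    which is what protects thin structures during regularization."""
    r = patch // 2

    def patch_at(c):
        y, x = c
        return img[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)

    d2 = np.mean((patch_at(p) - patch_at(q)) ** 2)
    return np.exp(-d2 / (2 * sigma ** 2))
```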

  14. User Adjustments • An additional weighting term for incorporating the extra depth-discontinuity information is defined as: • After adding the additional depth samples, our algorithm generates the new depth map using the new depth samples as a hard constraint in Equation (4)

  15. Experimental Results (Synthetic)

  16. Experimental Results (Real world)

  17. Is this method ideal enough? • Noise distribution in the depth map: • A practical depth map contains a more complicated noise distribution than Gaussian noise • Neighborhood extension to a higher dimension: • Practical depth data is a sequence of successive depth maps • Spatial domain → spatial-temporal domain

  18. Spatial-Temporal MRF-Based Depth Map Refinement • Zhu et al., CVPR 2008 • Combines a range sensor with a stereo sensor • Extends the MRF to the temporal domain to take temporal coherence into account • Pros and cons • Improves accuracy by using temporal coherence • Does not consider depth changes in time-varying scenes [Figure: spatial-temporal MRF structure with data term and smoothness term]
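Extending the MRF to the temporal domain amounts to adding one more smoothness term that links each pixel to its counterpart in the previous frame. A minimal sketch of such a term; a real system would warp the previous frame by optical flow first, whereas this simplified version compares co-located pixels (an assumption), which is precisely why depth changes in time-varying scenes are penalized incorrectly:

```python
import numpy as np

def temporal_smoothness(depth_t, depth_prev, weight=1.0):
    """Temporal smoothness energy in the spirit of Zhu et al. (CVPR 2008):
    a quadratic penalty on depth changes between consecutive frames.
    Static scenes get low energy; moving objects get penalized too,
    illustrating the limitation noted in the slide."""
    return weight * np.sum((depth_t - depth_prev) ** 2)
```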

  19. Summary of the MRF-based approach • MRF-based approach • Maintains sharp depth boundaries • Easily adopts several weighting factors • Easily cooperates with user adjustments • Possible future improvements • Considering the noise distribution of practical depth data • Considering temporal smoothness via neighborhood extension
