
LINE-BASED SINGLE VIEW METHODS FOR ESTIMATING 3D CAMERA ORIENTATION IN URBAN SCENES


Presentation Transcript


  1. LINE-BASED SINGLE VIEW METHODS FOR ESTIMATING 3D CAMERA ORIENTATION IN URBAN SCENES Ron Tal York University

  2. Outline • Introduction • Previous works • Approach • Results • Conclusion

  3. Motivation Vanishing point

  4. Manhattan Assumption • In urban imagery, linear features belong to one of 3 mutually orthogonal 3D directions • Each direction points towards a vanishing point • Recovering this ‘Manhattan frame’ is an important first stage of any single-view reconstruction system

  5. Outline • Introduction • Previous works • Approach • Results • Conclusion

  6. Early Works • Coxeter (1955): Mathematical formulation • Haralick (1980): Application to scene analysis • Barnard (1983): Gauss sphere representation • Collins & Weiss (1991): Least-squares formulation

  7. Early Works: Limitations • Assumes strong linear cues are found • Assumes data association is known • No unified framework for estimating the Manhattan frame

  8. Unified Framework: Manhattan World • Coughlan & Yuille (1999, 2003) • Define a probabilistic mixture model over gradient observations and the frame parameters • Maximum-likelihood estimate of the parameters

  9. Edge-Based Methods • Denis et al. (2008) • Argues that less is more • Sparse edge-based probabilistic framework • Accuracy improves by a factor of 2.5 • Raises the question: can we do even better with even sparser line features?

  10. Contributions • More accurate mapping of edges to lines • Artifact-free line selection • Line-based framework for recovering the Manhattan frame • Evaluation of line-based methods against the state of the art

  11. YorkUrbanDB • Contains 102 images taken using a calibrated camera • Introduced by Denis et al. (2008) • Hand labeled ground-truth lines and Manhattan frames

  12. Outline • Introduction • Previous works • Approach: from edges to lines, line selection, probabilistic framework, maximum-likelihood optimization • Results • Conclusion

  13. Hough Transform • Lines are detected using the Hough transform, Duda and Hart (1972) • Parametric representation of a line: ρ = x cos θ + y sin θ

  14. Hough Transform • Discretization of the Hough map is a trade-off between accuracy and tolerance for false positives • Observation uncertainty needs to be explicitly considered
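As a reference point for the kernel-voting scheme introduced next, here is a minimal sketch of the classical discretized Hough accumulator built from the ρ = x cos θ + y sin θ parameterization. The grid resolutions, function name, and edge points are illustrative, not values from the thesis.

```python
import numpy as np

def hough_accumulate(edge_points, n_theta=180, rho_res=1.0, rho_max=400.0):
    """Classical (discretized) Hough voting: every edge point (x, y) votes for
    all lines rho = x*cos(theta) + y*sin(theta) that pass through it."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    n_rho = int(2 * rho_max / rho_res) + 1
    accumulator = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in edge_points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        rho_idx = np.round((rhos + rho_max) / rho_res).astype(int)
        accumulator[rho_idx, np.arange(n_theta)] += 1
    return accumulator, thetas

# Three collinear points: their votes intersect in a single (rho, theta) cell.
acc, thetas = hough_accumulate([(10, 10), (20, 20), (30, 30)], rho_max=100.0)
```

The coarser the (ρ, θ) grid, the more tolerant the detector is to noise but the less accurate the recovered lines, which is exactly the trade-off the slide points out.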

  15. Probabilistic Hough Transforms • Kiryati & Bruckstein (2000) • Li and Xie (2003) • Fernandes & Oliveira (2008) • Barinova et al. (2010) • My solution: a kernel voting scheme that accurately propagates edge-observation uncertainty onto the Hough domain

  16. Observation Uncertainty • Edge observation uncertainty can be modeled explicitly, e.g. as uncertainty in the edge’s position and orientation

  17. Propagation of Uncertainty • Linear propagation of uncertainty is used to map the observation uncertainty onto the (ρ, θ) parameters: Σ_{ρθ} ≈ J Σ Jᵀ, where J is the Jacobian of the mapping from the edge observation to (ρ, θ)

  18. Propagation of Uncertainty • Substituting the Jacobian of ρ = x cos θ + y sin θ into the propagation equation yields the covariance of each (ρ, θ) observation
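A minimal sketch of this first-order propagation, assuming the edge observation is parameterized by (x, y, θ) with a 3×3 covariance; the function name and the noise values in the example are illustrative.

```python
import numpy as np

def propagate_edge_uncertainty(x, y, theta, cov_xytheta):
    """First-order (linear) propagation of an edge observation's covariance
    onto the Hough parameters (rho, theta), assuming
    rho = x*cos(theta) + y*sin(theta) with theta taken from the edge
    orientation.  cov_xytheta is the 3x3 covariance of (x, y, theta)."""
    # Jacobian of (rho, theta) with respect to (x, y, theta).
    J = np.array([
        [np.cos(theta), np.sin(theta), -x * np.sin(theta) + y * np.cos(theta)],
        [0.0,           0.0,            1.0],
    ])
    return J @ cov_xytheta @ J.T  # 2x2 covariance in the (rho, theta) domain

# Illustrative usage: ~1 px positional noise, ~2 degrees orientation noise.
cov = np.diag([1.0, 1.0, np.radians(2.0) ** 2])
cov_rho_theta = propagate_edge_uncertainty(120.0, 80.0, np.radians(30.0), cov)
```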

  19. Kernel Voting • Hough map is the accumulation of bivariate normal (BVN) kernels that correspond to edge observations

  20. Kernel Voting • Each kernel is computed according to the bivariate normal distribution defined by the propagated (ρ, θ) estimate and covariance of its edge observation
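A sketch of the kernel-voting accumulation these slides describe: each edge observation contributes one bivariate normal kernel over the (ρ, θ) map. The grid, the hand-picked covariance, and the function name are illustrative assumptions.

```python
import numpy as np

def kernel_vote(observations, rho_grid, theta_grid):
    """Kernel voting sketch: the Hough map is the sum of bivariate normal (BVN)
    kernels, one per edge observation, each centred at the observation's
    (rho, theta) estimate with its propagated 2x2 covariance."""
    T, R = np.meshgrid(theta_grid, rho_grid)        # map axes: rho x theta
    hough_map = np.zeros_like(R, dtype=float)
    for rho0, theta0, cov in observations:
        inv = np.linalg.inv(cov)
        dr, dt = R - rho0, T - theta0
        # Squared Mahalanobis distance of every map cell from the observation.
        m = inv[0, 0] * dr**2 + 2 * inv[0, 1] * dr * dt + inv[1, 1] * dt**2
        hough_map += np.exp(-0.5 * m) / (2 * np.pi * np.sqrt(np.linalg.det(cov)))
    return hough_map

# Illustrative usage with a hand-picked (rho, theta) covariance.
cov = np.array([[4.0, 0.1], [0.1, 0.01]])
hmap = kernel_vote([(50.0, 0.6, cov)],
                   np.arange(-100.0, 101.0, 1.0), np.linspace(0.0, np.pi, 180))
```

Uncertain observations spread their vote over many cells while precise ones concentrate it, which is how the scheme carries the edge-observation uncertainty into the Hough domain.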

  21. Outline • Introduction • Previous works • Approach: from edges to lines, line selection, probabilistic framework, maximum-likelihood optimization • Results • Conclusion

  22. Line Selection • Classical approaches to peak selection: • Sorted list of local maxima • Hough map smoothing • Greedy global-maximum selection, followed by non-maximum suppression (NMS) • Problem: sampling error is unavoidable

  23. Line Selection • Solution: a greedy iterative selection technique that subtracts contributions of edges that belong to detected lines
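A sketch of that greedy iterative idea, assuming hypothetical helpers `build_map` (e.g. the kernel voting above) and `assign_edges` (edge-to-line association); it outlines the control flow rather than the thesis implementation.

```python
import numpy as np

def greedy_line_selection(edges, build_map, assign_edges, n_lines):
    """Greedy iterative selection sketch: repeatedly take the global maximum of
    the voted Hough map, then rebuild the map without the edges assigned to the
    detected line, so their votes cannot create spurious secondary peaks.
    `build_map` and `assign_edges` are placeholders for the voting and the
    edge-to-line association steps."""
    remaining = list(edges)
    lines = []
    for _ in range(n_lines):
        if not remaining:
            break
        hough_map, rho_grid, theta_grid = build_map(remaining)
        r, t = np.unravel_index(np.argmax(hough_map), hough_map.shape)
        line = (rho_grid[r], theta_grid[t])
        lines.append(line)
        # Remove the supporting edges before the next iteration.
        supporters = assign_edges(remaining, line)
        remaining = [e for e in remaining if e not in supporters]
    return lines
```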

  24. Outline • Introduction • Previous works • Approach: from edges to lines, line selection, probabilistic framework, maximum-likelihood optimization • Results • Conclusion

  25. Probabilistic Framework • Given a Manhattan frame Ψ, the probability of a line l is defined as a mixture over causes m: P(l | Ψ) = Σ_m P(l | m, Ψ) P(m), where the likelihood P(l | m, Ψ) is determined via the error model and the cause prior P(m) is learned via training
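In code the mixture is just a dot product between per-cause likelihoods and priors. The four-cause split (three Manhattan directions plus an outlier process) and the numbers below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def line_probability(line_likelihoods, cause_priors):
    """Mixture-model probability of a line given the Manhattan frame:
    P(l | Psi) = sum_m P(l | m, Psi) * P(m).  `line_likelihoods[m]` comes from
    the error model; `cause_priors[m]` is learned from training data."""
    return float(np.dot(line_likelihoods, cause_priors))

# Illustrative numbers only: three Manhattan causes and one outlier cause.
p = line_probability(np.array([0.6, 0.05, 0.02, 0.1]),
                     np.array([0.3, 0.3, 0.3, 0.1]))
```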

  26. Error Model • Gauss sphere representation for extended lines • Image-plane representation for point features

  27. Probability of a Line • More specifically, the per-cause likelihoods are evaluated with the error model above and combined with the learned cause priors
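A sketch of one plausible Gauss-sphere consistency measure, assuming a calibrated camera with intrinsics K: an image line back-projects to an interpretation plane whose normal should be orthogonal to the Manhattan direction that generated it. The exact error model in the thesis may differ; this only illustrates the representation.

```python
import numpy as np

def gauss_sphere_deviation(line_img, direction_cam, K):
    """Gauss-sphere consistency sketch: the image line with homogeneous
    coordinates `line_img` back-projects to an interpretation plane with unit
    normal n ~ K^T @ line_img.  A Manhattan direction d (unit vector in the
    camera frame) that generated the line lies in that plane, so n . d = 0;
    the deviation angle |asin(n . d)| is what a Gaussian error model would
    penalise."""
    n = K.T @ line_img
    n = n / np.linalg.norm(n)
    d = direction_cam / np.linalg.norm(direction_cam)
    return float(np.abs(np.arcsin(np.clip(np.dot(n, d), -1.0, 1.0))))

# Illustrative usage: a vertical image line is consistent (zero deviation)
# with the camera's vertical (y) axis direction.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
dev = gauss_sphere_deviation(np.array([1.0, 0.0, -320.0]),
                             np.array([0.0, 1.0, 0.0]), K)
```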

  28. Outline • Introduction • Previous works • Approach: from edges to lines, line selection, probabilistic framework, maximum-likelihood optimization • Results • Conclusion

  29. Maximum-Likelihood Optimization • Given a set of line observations, we can find the frame that maximizes their joint likelihood • The rotation parameter is described by the Euler angles that map the camera frame onto the Manhattan frame

  30. Maximum-Likelihood Optimization • The rotation is a composition of three rotations about the axes of the frame
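A sketch of the maximum-likelihood search over the three Euler angles, assuming a hypothetical `line_log_likelihood(line, R)` that evaluates the mixture model above; the Nelder-Mead optimizer is an illustrative choice, not necessarily the one used in the thesis.

```python
import numpy as np
from scipy.optimize import minimize

def rotation_from_euler(angles):
    """Composition of three rotations about the camera axes (Euler angles)."""
    a, b, c = angles
    Rx = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
    Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    Rz = np.array([[np.cos(c), -np.sin(c), 0], [np.sin(c), np.cos(c), 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def estimate_manhattan_frame(lines, line_log_likelihood, init=(0.0, 0.0, 0.0)):
    """Maximum-likelihood sketch: search the three Euler angles that map the
    camera frame onto the Manhattan frame.  `line_log_likelihood(line, R)` is
    a placeholder for the mixture-model log-likelihood of a single line."""
    def neg_log_likelihood(angles):
        R = rotation_from_euler(angles)
        return -sum(line_log_likelihood(line, R) for line in lines)
    result = minimize(neg_log_likelihood, x0=np.array(init), method="Nelder-Mead")
    return rotation_from_euler(result.x), result.x
```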

  31. Outline • Introduction • Previous works • Approach • Results • Conclusion

  32. Quantitative Evaluation • Comparison with the traditional Hough transform

  33. Quantitative Evaluation • Comparison with the traditional Hough transform (continued)

  34. Quantitative Evaluation • Comparison with previous edge- and gradient-based methods

  35. Run-Time

  36. Qualitative Comparison • Edge-based vs. line-based results

  37. Outline • Introduction • Previous works • Approach • Results • Conclusion

  38. Conclusion • Accurate method for line extraction • Improved framework for estimating the Manhattan frame • The presented method outperforms the state of the art by over a factor of 2

  39. Thank You!
