
A Non-local Cost Aggregation Method for Stereo Matching



Presentation Transcript


  1. A Non-local Cost Aggregation Method for Stereo Matching Qingxiong Yang City University of Hong Kong 2012 IEEE Conference on Computer Vision and Pattern Recognition

  2. Outline • Introduction • Related Works • Method • Experimental Results • Conclusion

  3. Introduction_________________________

  4. Introduction • Goal: get a fast and accurate disparity map. • Solution: non-local cost aggregation + MST • Advantages: better in low-texture regions; low complexity

  5. Related Works_________________________

  6. Related Works [21] D. Scharstein and R. Szeliski. A taxonomy and evaluation of dense two-frame stereo correspondence algorithms. International Journal of Computer Vision (IJCV), 47:7–42, 2002.

  7. Related Works • Local methods • Steps 1 => 2 => 3 (matching cost, aggregation, disparity computation) • A local support region with winner-take-all • Implicit smoothness • Fast but inaccurate • Global methods • Steps 1 (=> 2) => 3 • Energy minimization process (GC, BP, DP, Cooperative) • Pre-processing • Explicit smoothness • Accurate but slow

  8. Comparison (rank in Middlebury)

  9. Method_________________________

  10. Method

  11. Bilateral Filter • Every sample is replaced by a weighted average of its neighbors. • These weights reflect two forces • How close the neighbor and the center sample are • How similar the neighbor and the center sample are • Edge-preserving and noise-reducing smoothing filter

  12. Bilateral Filter

  13. Bilateral Filter • Center sample: p • Neighborhood: q

  14. Bilateral Filter Total Distance

  15. Bilateral Filter [Figure: original image, Gaussian weight, bilateral weight]
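The two "forces" described on slide 11 can be made concrete with a small sketch. This is a naive, illustrative implementation of a bilateral filter (not code from the paper); the function name and parameters are chosen for this example, with sigma_s controlling spatial closeness and sigma_r controlling intensity similarity:

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Naive bilateral filter on a grayscale image with values in [0, 1].

    Each output sample is a weighted average of its neighbors; the weight
    combines spatial closeness (sigma_s) with intensity similarity (sigma_r),
    so edges are preserved while noise is smoothed.
    """
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            # Clip the neighborhood window to the image borders.
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            patch = img[y0:y1, x0:x1]
            yy, xx = np.mgrid[y0:y1, x0:x1]
            # Force 1: how close the neighbor is to the center sample.
            spatial = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma_s ** 2))
            # Force 2: how similar the neighbor is to the center sample.
            rng = np.exp(-((patch - img[y, x]) ** 2) / (2 * sigma_r ** 2))
            weights = spatial * rng
            out[y, x] = np.sum(weights * patch) / np.sum(weights)
    return out
```

On a constant image every range weight is 1, so the output equals the input; near a strong edge the range term suppresses neighbors from the other side, which is what makes the filter edge-preserving.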

  16. Minimum Spanning Tree • Kruskal's Algorithm • Scan all edges in increasing weight order; if an edge is safe, add it to F. [Figure: the original graph] PPT by Jonathan Davis

  17. [Figure: the original graph shown next to its full edge list]

  18. Sort Edges (in reality they are placed in a priority queue - not sorted - but sorting them makes the algorithm easier to visualize)

  19. Add Edge [Figure: the lowest-weight remaining edge is added to the forest]

  20. Add Edge

  21. Add Edge

  22. Add Edge

  23. Add Edge

  24. Cycle - Don't Add Edge [Figure: the candidate edge would connect two nodes already in the same tree, so it is skipped]

  25. Add Edge

  26. Add Edge

  27. Add Edge

  28. Cycle - Don't Add Edge

  29. Add Edge

  30. Minimum Spanning Tree [Figure: the original graph and the resulting minimum spanning tree]
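The edge-scanning procedure animated in slides 16-30 can be sketched directly. This is a generic textbook implementation of Kruskal's algorithm (illustrative names, not code from the paper), using union-find to detect the "cycle - don't add edge" case:

```python
def kruskal_mst(num_nodes, edges):
    """Kruskal's algorithm: scan edges in increasing weight order and add
    an edge to the forest F if it is safe (does not create a cycle).

    `edges` is a list of (weight, u, v) tuples with 0-based node indices;
    returns the list of edges in the minimum spanning tree.
    """
    parent = list(range(num_nodes))  # union-find forest

    def find(x):
        # Follow parent pointers to the component root, compressing the path.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:          # safe edge: endpoints lie in different trees
            parent[ru] = rv   # union the two components
            mst.append((w, u, v))
        # else: adding (u, v) would create a cycle, so it is skipped
    return mst
```

For a connected graph with n nodes the loop stops having accepted exactly n - 1 edges; sorting dominates the cost at O(E log E).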

  31. Cost Computation • Cd(p): matching cost for pixel p at disparity level d • CAd(p) = Σq w(p, q)·Cd(q): aggregated cost • σS and σR: constants used to adjust the spatial and range similarity

  32. Cost Aggregation on a Tree Structure • Weight between p and q: w(p, q) = |I(p) - I(q)| (image gradient) • Distance between p and q: D(p, q) = sum of the weights of the edges on the path connecting them • Similarity between p and q: S(p, q) = exp(-D(p, q)/σ) • Aggregated cost: CAd(p) = Σq S(p, q)·Cd(q)

  33. Bilateral Filter vs. Tree Structure

  34. Cost Aggregation on a MST • Claim 1. Let Tr denote a subtree rooted at node r, where r is a child of node s. The support node s receives from Tr is the sum of the support s receives from r itself, S(s, r)·Cd(r), plus S(s, r) times the support r receives from its own subtrees.

  35. Cost Aggregation on a MST • Aggregated cost from the subtree of v (leaf-to-root): CA↑d(v) = Cd(v) + Σvc S(v, vc)·CA↑d(vc), summing over the children vc of v • CA↑d(v) = Cd(v) if node v is a leaf node • P(vc) denotes the parent of node vc

  36. Cost Aggregation on a MST

  37. Cost Aggregation on a MST • Final aggregated cost (root-to-leaf): CAd(v) = S(P(v), v)·CAd(P(v)) + (1 - S²(P(v), v))·CA↑d(v)

  38. Cost Aggregation on a MST • Cost aggregation process • Aggregate the original matching cost Cd from the leaf nodes towards the root node using Eqn. (6) • Aggregate from the root node towards the leaf nodes using Eqn. (7) • Complexity • Each disparity level: 2 additions/subtractions + 3 multiplications per pixel
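The two-pass process on slides 35-38 can be sketched on an explicit tree. This is a minimal illustration (not the paper's code) assuming the MST is given as parent pointers, a root-first node ordering, and precomputed per-edge similarities S(P(v), v); all names are chosen for this example:

```python
def aggregate_on_tree(cost, parent, order, S):
    """Non-local cost aggregation on a tree for one disparity level.

    cost[v]   : matching cost Cd(v) at node v
    parent[v] : parent of v in the tree (parent[root] == root)
    order     : nodes listed root-first, so each node appears after its parent
    S[v]      : similarity exp(-D(parent[v], v)/sigma) of the edge to the parent
    """
    # Leaf-to-root pass (Eqn. 6): each node accumulates support from its
    # subtree; leaves contribute just their own matching cost.
    up = list(cost)
    for v in reversed(order):
        p = parent[v]
        if p != v:
            up[p] += S[v] * up[v]

    # Root-to-leaf pass (Eqn. 7): the root's subtree support is already the
    # full aggregated cost; every other node combines its parent's final
    # cost with its own subtree support.
    agg = list(up)
    for v in order:
        p = parent[v]
        if p != v:
            agg[v] = S[v] * agg[p] + (1 - S[v] ** 2) * up[v]
    return agg
```

Each node is visited once per pass with a constant number of operations, which matches the low per-pixel complexity claimed on slide 38. On a two-node tree with costs (1, 2) and edge similarity 0.5, the result is (1 + 0.5·2, 2 + 0.5·1) = (2.0, 2.5), i.e. each node receives the other's cost attenuated by the edge similarity.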

  39. Disparity Refinement • D: the left disparity map • Unstable pixels: occlusion, lack of texture, specularity • Median filter overlap
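A common way to detect the unstable pixels listed above (not necessarily the paper's exact procedure, which refines a new cost volume from the stable pixels) is a left-right consistency check; this sketch uses illustrative names and plain lists of disparity rows:

```python
def left_right_check(disp_left, disp_right, tol=1):
    """Mark pixels as stable when the left and right disparity maps agree.

    A pixel (y, x) in the left map with disparity d should correspond to
    pixel (y, x - d) in the right map with (approximately) the same
    disparity; pixels failing this test are flagged as unstable, which
    typically happens at occlusions, low-texture regions, and speculars.
    """
    h, w = len(disp_left), len(disp_left[0])
    stable = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            d = disp_left[y][x]
            xr = x - d  # matching column in the right view
            if 0 <= xr < w and abs(disp_right[y][xr] - d) <= tol:
                stable[y][x] = True
    return stable
```

Pixels whose match falls outside the right image (xr < 0) are automatically unstable, which is how left-border occlusions get flagged.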

  40. Experimental Results_________________________

  41. Experimental Results • Device: a MacBook Air laptop computer with a 1.8 GHz Intel Core i7 CPU and 4 GB memory • Parameter: σ = 0.1 (non-local cost aggregation) • Sources: Middlebury http://vision.middlebury.edu/stereo/, HHI database (book arrival), Microsoft i2i database (Ilkay)

  42. Experimental Results • Time: • Proposed average runtime: 90 milliseconds (1.25× slower than the box filter) • Unnormalized box filter [24] average runtime: 72 milliseconds • Local guided image filter [7] average runtime: 960 milliseconds [7] C. Rhemann, A. Hosni, M. Bleyer, C. Rother, and M. Gelautz. Fast cost-volume filtering for visual correspondence and beyond. In CVPR, 2011. [24] P. Viola and M. Jones. Robust real-time face detection. International Journal of Computer Vision, 57:137–154, 2003.

  43. [7] C. Rhemann, A. Hosni, M. Bleyer, C. Rother, and M. Gelautz. Fast cost-volume filtering for visual correspondence and beyond. In CVPR, 2011.

  44. Experimental Results

  45. Experimental Results

  46. Experimental Results

  47. Conclusion_________________________

  48. Conclusion • Contributions • Outperforms all local cost aggregation methods in both speed and accuracy. • Presents a near real-time stereo system with accurate disparity results. • Future works • Apply to parallel algorithms • Refine matching cost estimation
