
Contextual Classification with Functional Max-Margin Markov Networks



  1. Contextual Classification with Functional Max-Margin Markov Networks

  2. Problem • Geometry estimation (Hoiem et al.): sky, vertical, support • 3-D point cloud classification: wall, vegetation, ground, tree trunk (our classifications)

  3. Room For Improvement

  4. Approach: Improving CRF Learning • "Boosting" (h) in place of gradient descent (w) • + Better learn models with high-order interactions • + Efficiently handle large data & feature sets • + Enable non-linear clique potentials • Friedman et al. 2001; Ratliff et al. 2007

  5. Conditional Random Fields (Lafferty et al. 2001) • Labels y_i over observations x • Pairwise model • MAP inference
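In symbols, the pairwise model the slide refers to has the standard CRF form (notation reconstructed from common CRF formulations, not copied from the deck):

```latex
P(\mathbf{y} \mid \mathbf{x}) \;\propto\;
  \exp\!\Big( \sum_{i} \phi_i(\mathbf{x}, y_i)
  + \sum_{(i,j) \in \mathcal{E}} \phi_{ij}(\mathbf{x}, y_i, y_j) \Big),
\qquad
\mathbf{y}^{*} = \operatorname*{arg\,max}_{\mathbf{y}} \; P(\mathbf{y} \mid \mathbf{x})
```

Here φ_i are node potentials, φ_ij are edge potentials over the edge set E, and MAP inference finds the highest-scoring joint labeling y*.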

  6. Parametric Linear Model • Weights on local features that describe the label

  7. Associative/Potts Potentials • Edge potential is zero when the two labels disagree

  8. Overall Score • Overall score for a labeling y of all nodes
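The overall score can be sketched as linear node potentials plus associative (Potts) edge potentials that fire only when the two endpoint labels agree. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def overall_score(w_node, w_edge, node_feats, edge_feats, edges, labels):
    """Score of a full labeling: sum of node potentials plus Potts edge
    potentials that contribute only when the endpoints agree."""
    score = 0.0
    for i, x_i in enumerate(node_feats):
        # linear node potential: w_{y_i} . x_i
        score += w_node[labels[i]] @ x_i
    for e, (i, j) in enumerate(edges):
        if labels[i] == labels[j]:            # Potts: reward agreement only
            score += w_edge[labels[i]] @ edge_feats[e]
    return score
```

A disagreeing edge simply contributes nothing, which is what makes graph-cut style inference applicable later in the deck.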

  9. Learning Intuition • Iterate: classify with the current CRF model • If misclassified: increase the score φ( ) of the ground-truth labeling, decrease the score φ( ) of the inferred labeling • (Same update with edges)
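The iterate/update intuition can be sketched as a structured-perceptron-style loop; `map_inference` and `features` are assumed helper callables, not the authors' code.

```python
import numpy as np

def learn(w, examples, map_inference, features, epochs=10):
    """Structured-perceptron-style sketch of the learning intuition.
    map_inference(w, x) returns the current model's best labeling;
    features(x, y) is the joint feature vector for labeling y."""
    for _ in range(epochs):
        for x, y_true in examples:
            y_pred = map_inference(w, x)      # classify with current model
            if y_pred != y_true:              # misclassified
                # raise the ground-truth score, lower the inferred score
                w = w + features(x, y_true) - features(x, y_pred)
    return w
```

On a toy example with per-label feature vectors this drives the weights toward the ground-truth labeling after a single correction.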

  10. Max-Margin Structured Prediction (Taskar et al. 2003) • Minimize over w: best score from all labelings (+ margin M) minus score with the ground-truth labeling • Objective is convex
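Written out, the max-margin objective described on the slide takes the following form (notation reconstructed; f(x, y) denotes the joint feature vector and ŷ the ground-truth labeling):

```latex
\min_{\mathbf{w}} \;\;
  \underbrace{\max_{\mathbf{y}} \Big[ \mathbf{w}^{\top} \mathbf{f}(\mathbf{x}, \mathbf{y}) + M(\mathbf{y}, \hat{\mathbf{y}}) \Big]}_{\text{best score from all labelings } (+M)}
  \;-\;
  \underbrace{\mathbf{w}^{\top} \mathbf{f}(\mathbf{x}, \hat{\mathbf{y}})}_{\text{score with ground truth}}
  \;+\; \frac{\lambda}{2}\,\|\mathbf{w}\|^{2}
```

The objective is convex in w because the inner max of linear functions of w is convex, and the remaining terms are linear and quadratic.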

  11. Descending† Direction (Objective) • Labels from MAP inference vs. ground-truth labels

  12. Learned Model

  13. Update Rule • With λ = 0 and unit step size: w_{t+1} += (ground-truth features) − (inferred features)
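With λ = 0 and a unit step size, the subgradient step reduces to the update on the slide (same notation as the objective: f is the joint feature vector, ŷ the ground truth, y* the labeling from MAP inference):

```latex
\mathbf{w}_{t+1} \;=\; \mathbf{w}_{t}
  \;+\; \mathbf{f}(\mathbf{x}, \hat{\mathbf{y}})
  \;-\; \mathbf{f}(\mathbf{x}, \mathbf{y}^{*})
```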

  14. Verify Learning Intuition • Iterate • If misclassified: w_{t+1} += (ground-truth features − inferred features), which increases the score φ( ) of the ground truth and decreases the score φ( ) of the inferred labeling

  15. Alternative Update • Create training set D from the misclassified nodes & edges: ground-truth features with target +1, inferred features with target −1

  16. Alternative Update • Create training set D • Train regressor ht(∙)

  17. Alternative Update • Create training set D • Train regressor ht • Augment model: ϕ(∙) += αt ht(∙)

  18. Functional M3N Summary • Given features and labels • For T iterations: • Classify with current model • Create training set D from misclassified cliques • Train regressor/classifier ht • Augment model
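The summary above can be sketched as a functional-gradient boosting loop. This is specialized to a node-only binary model with labels in {−1, +1}; clique terms, graph-cut inference, and per-iteration step sizes α_t are simplified away (fixed α), and the linear weak learner stands in for the regressor ht. Illustrative, not the authors' code.

```python
import numpy as np

def fit_regressor(feats, targets):
    """Weak learner h_t: least-squares linear fit (a regression tree
    would be a typical choice; linear keeps the sketch short)."""
    w, *_ = np.linalg.lstsq(feats, targets, rcond=None)
    return lambda x: x @ w

def functional_m3n(X, y, T=5, alpha=0.5):
    """Functional M3N loop: classify, collect misclassified examples,
    fit h_t on them, augment the potential phi <- phi + alpha * h_t."""
    hs = []
    def score(x):                              # current model: sum_t alpha * h_t(x)
        return sum(alpha * h(x) for h in hs) if hs else np.zeros(len(x))
    for _ in range(T):
        pred = np.where(score(X) >= 0, 1, -1)  # classify with current model
        wrong = pred != y
        if not wrong.any():
            break                              # every node already correct
        # training set D from misclassified nodes: regress toward the
        # ground-truth sign (+1 / -1)
        hs.append(fit_regressor(X[wrong], y[wrong].astype(float)))
    return lambda x: np.where(score(x) >= 0, 1, -1)
```

On a small linearly separable toy set this converges after one or two rounds of boosting.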

  19. Illustration • Create training set D (targets +1, +1, −1, −1)

  20. Illustration • Train regressor h1(∙) • ϕ(∙) = α1 h1(∙)

  21. Illustration • ϕ(∙) = α1 h1(∙) • Classification with current CRF model

  22. Illustration • ϕ(∙) = α1 h1(∙) • Create training set D (targets +1, −1)

  23. Illustration • Train regressor h2(∙) • ϕ(∙) = α1 h1(∙) + α2 h2(∙)

  24. Illustration • ϕ(∙) = α1 h1(∙) + α2 h2(∙) • Stop

  25. Boosted CRF Related Work • Gradient Tree Boosting for CRFs • Dietterich et al. 2004 • Boosted Random Fields • Torralba et al. 2004 • Virtual Evidence Boosting for CRFs • Liao et al. 2007 • Benefits of Max-Margin objective • Do not need marginal probabilities • (Robust) High-order interactions • Kohli et al. 2007, 2008

  26. Using Higher Order Information Colored by elevation

  27. Region Based Model

  28. Region Based Model

  29. Region Based Model • Inference: graph-cut procedure • Pn Potts model (Kohli et al. 2007)

  30. How To Train The Model • Learning

  31. How To Train The Model Learning (ignores features from clique c)

  32. How To Train The Model • Robust Pn Potts (Kohli et al. 2008) with truncation parameter β • Learning
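The Robust Pn Potts potential relaxes the all-or-nothing clique penalty: the cost grows linearly with the number of nodes that deviate from the clique's dominant label, then truncates once more than a β fraction deviates. A simplified sketch (after Kohli et al. 2008; parameter names are illustrative):

```python
import numpy as np

def robust_pn_potts(labels_in_clique, gamma_max, beta):
    """Robust Pn Potts clique cost, simplified: zero if every node
    agrees, rising linearly with the number of deviating nodes, and
    truncated at gamma_max once more than a beta fraction deviates."""
    labels = np.asarray(labels_in_clique)
    counts = np.bincount(labels)
    n_deviate = len(labels) - counts.max()     # nodes off the dominant label
    q = beta * len(labels)                     # truncation threshold
    return gamma_max * min(n_deviate / q, 1.0)
```

The ordinary Pn Potts model is the β → 0 limit: any single deviating node already incurs the full penalty.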

  33. Experimental Analysis 3-D Point Cloud Classification Geometry Surface Estimation

  34. Random Field Description • Nodes: 3-D points • Edges: 5-nearest neighbors • Cliques: two k-means segmentations • Features: local shape, orientation (angle θ between the estimated normal and the vertical [0,0,1]), elevation
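The 5-nearest-neighbor edge set can be sketched as below. This brute-force version is for illustration only; at the 1.2M-point scale reported later, a k-d tree or similar spatial index would be used instead.

```python
import numpy as np

def knn_edges(points, k=5):
    """Build the CRF edge set by linking each 3-D point to its k nearest
    neighbors.  Returns undirected (i, j) pairs with i < j."""
    pts = np.asarray(points, dtype=float)
    # pairwise squared distances, self-distances masked out
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    edges = set()
    for i, row in enumerate(d2):
        for j in np.argsort(row)[:k]:
            edges.add((min(i, int(j)), max(i, int(j))))
    return edges
```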

  35. Qualitative Comparisons Parametric Functional (this work)

  36. Qualitative Comparisons Parametric Functional (this work)

  37. Qualitative Comparisons Parametric Functional (this work)

  38. Quantitative Results (1.2 M pts) • Macro* AP: Parametric 64.3%, Functional 71.5% • Per-class changes: +3%, +24%, +4%, +5%, −1%

  39. Experimental Analysis 3-D Point Cloud Classification Geometry Surface Estimation

  40. Random Field Description (More Robust) • Nodes: superpixels (Hoiem et al. 2007) • Edges: (none) • Cliques: 15 segmentations (Hoiem et al. 2007) • Features (Hoiem et al. 2007): perspective, color, texture, etc. • 1,000-dimensional space

  41. Quantitative Comparisons Hoiem et al. 2007 Parametric (Potts) Functional (Potts) Functional (Robust Potts)

  42. Qualitative Comparisons Parametric (Potts) Functional (Potts)

  43. Qualitative Comparisons Parametric (Potts) Functional (Robust Potts) Functional (Potts)

  44. Qualitative Comparisons Hoiem et al. 2007 Parametric (Potts) Functional (Robust Potts) Functional (Potts)

  45. Qualitative Comparisons Hoiem et al. 2007 Parametric (Potts) Functional (Robust Potts) Functional (Potts)

  46. Qualitative Comparisons Hoiem et al. 2007 Parametric (Potts) Functional (Robust Potts) Functional (Potts)

  47. Conclusion • Effective max-margin learning of high-order CRFs • Especially for high-dimensional feature spaces • Robust Potts interactions • Easy to implement • Future work • Non-linear potentials (decision tree/random forest) • New inference procedures: • Komodakis and Paragios 2009 • Ishikawa 2009 • Gould et al. 2009 • Rother et al. 2009

  48. Thank you • Acknowledgements • U.S. Army Research Laboratory • Siebel Scholars Foundation • S. K. Divvala, N. Ratliff, B. Becker • Questions?
