
Presentation Transcript


  1. The Automatic Design of Feature Spaces for Local Image Descriptors using an Ensemble of Non-linear Feature Extractors. Gustavo Carneiro

  2. Set of Matching Problems • Wide-baseline matching • Visual class recognition • Visual object recognition

  3. Set of Matching Problems. 1- Design a feature space that facilitates certain matching problems: SIFT [Lowe, ICCV99], Shape Context [Belongie et al., PAMI02], HOG [Dalal & Triggs, CVPR05]

  4. Set of Matching Problems. 2- Given a matching problem and a set of feature spaces (e.g., SIFT, Shape Context, HOG), combine them in order to minimize the probability of error (mismatches) [Varma & Ray, CVPR07] (a sketch of one such combination rule follows below)
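
The slide does not show the combination rule itself; a common form, consistent with the multiple kernel learning of [Varma & Ray, CVPR07], is a weighted sum of base kernels, one per feature space, with the weights learned to minimize matching error. This is an assumed illustration, not a formula taken from the deck:

```latex
% One base kernel K_m per feature space (SIFT, Shape Context, HOG, ...);
% the mixing weights \beta_m are learned on the target matching problem.
K(x_i, x_j) \;=\; \sum_{m=1}^{M} \beta_m \, K_m(x_i, x_j),
\qquad \beta_m \ge 0 .
```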

  5. Set of Matching Problems. 3- Given a matching problem, find the feature space and respective parameters θ that minimize the probability of error (mismatches) [Hua et al., ICCV07]. [Diagram: candidate feature transforms 1 and 2, each with parameters θ optimized to θ*, applied to the target matching problem]

  6. Set of Matching Problems. 4- Given future unknown matching problems, find the feature space that minimizes the probability of error (mismatches). [Diagram: feature transforms trained on matching problems 1-5 applied to target matching problems 1 and 2]

  7. The Universal Feature Transform • Solve random and simple matching problems • The more matching problems are solved, the easier it becomes to solve new problems • Restriction: problems should have similar feature ranges and similar class statistics

  8. (Linear) Distance Metric Learning [Chopra et al., CVPR05; Goldberger et al., NIPS04; Weinberger & Saul, JMLR09] • Image patches • Linear transform • Distance in T space (the equations on this slide were images; a reconstruction follows below)
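
The three bullets above pointed at equations that were images in the original deck. A standard linear metric-learning formulation, consistent with the cited papers, would read as follows (a reconstruction under that assumption, not the slide's own rendering):

```latex
% Image patches as raw pixel vectors:
x_i \in \mathbb{R}^{d}
% Linear feature transform:
T \in \mathbb{R}^{k \times d}, \qquad y_i = T x_i
% Distance measured in the transformed space:
d_T(x_i, x_j) = \lVert T x_i - T x_j \rVert_2
```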

  9. (Non-Linear) Distance Metric Learning [Sugiyama, JMLR07] • Rewrite the between-class and within-class scatter matrices S^(b) and S^(w) • Apply the transformation obtained from the resulting generalized eigenvalue problem • Replacing dot products with a non-linear kernel function yields the non-linear feature transform (a sketch of the derivation follows below)
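
A sketch of the derivation, following Sugiyama's local Fisher discriminant analysis (JMLR 2007); the weight matrices W^(b) and W^(w) are the class- and locality-based affinities defined in that paper:

```latex
% Pairwise forms of the between-class and within-class scatter matrices:
S^{(b)} = \tfrac{1}{2} \sum_{i,j} W^{(b)}_{ij} \, (x_i - x_j)(x_i - x_j)^{\top}
S^{(w)} = \tfrac{1}{2} \sum_{i,j} W^{(w)}_{ij} \, (x_i - x_j)(x_i - x_j)^{\top}
% The feature transform is built from solutions of the
% generalized eigenvalue problem:
S^{(b)} \varphi = \lambda \, S^{(w)} \varphi
```

Because both scatter matrices can be written in terms of dot products of the data, each dot product can be replaced by a kernel value k(x_i, x_j), which turns the linear transform into the non-linear feature transform used here.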

  10. Linear vs Non-linear DML. LINEAR: points not belonging to any class collapse at the origin. NON-LINEAR: points from the same class collapse together, while different classes stay far from each other

  11. Intuition • Train several feature transforms on random matching problems • Aggregate the distances computed in each feature space [Breiman 01] • Classify with a threshold on the aggregated distance (a minimal sketch follows below)
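
A minimal sketch of this aggregation in Python; the random linear projections below are illustrative stand-ins for the trained non-linear transforms, and the threshold value is arbitrary:

```python
import numpy as np

# Ensemble matching sketch: each feature transform f_m maps a patch
# into its own feature space; pairwise distances are averaged over
# the ensemble and compared against a threshold.

rng = np.random.default_rng(0)
d, k, M = 128, 32, 10          # patch dim, feature dim, ensemble size
# Stand-ins for the transforms trained on random matching problems:
transforms = [rng.standard_normal((k, d)) for _ in range(M)]

def aggregated_distance(x1, x2, transforms):
    """Average the distances measured in every learned feature space."""
    return np.mean([np.linalg.norm(T @ x1 - T @ x2) for T in transforms])

def match(x1, x2, transforms, threshold=5.0):
    """Threshold-based classifier: small aggregated distance => match."""
    return aggregated_distance(x1, x2, transforms) < threshold

# Example with a pair of synthetic patch vectors:
x1, x2 = rng.standard_normal(d), rng.standard_normal(d)
print(match(x1, x2, transforms))
```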

  12. Intuition. [Figure: distances of the unknown target problem T under the transform trained on random training problem 1; matching pairs yield small distances, non-matching pairs large distances, summarized by the ROC of the aggregated distances]

  13. Intuition. [Figure: the same evaluation under the transform trained on random training problem 2]

  14. Toy Example • Combining 100 feature spaces: the error decreases with the number of feature spaces, no matter the error of each individual space. [Plot: results for the original features, the NLMSL-trained features, and the UFT]

  15. Experiments • Training dataset from [Winder & Brown, CVPR07]: patches obtained by backprojecting 3D points of scene reconstructions onto the 2D images, with variations in scene location, brightness, and partial occlusion • Pre-processing similar to [Winder & Brown, CVPR07] • Train: all patch classes from the Trevi & Yosemite datasets • Test: 50K matching and 50K non-matching pairs from the Notre Dame dataset

  16. Experiments • Using cross-validation: 50 training classes for each feature space, 50 feature spaces in total • The error decreases with the number of feature spaces, no matter the error of each individual space • Error at 95% true positives: UFT 2.28% vs. SIFT 6.3%

  17. Experiments • Matching database of [Mikolajczyk & Schmid, PAMI05]

  18. Conclusion • Competitive performance • Simple ensemble classifier (can be efficiently implemented) • Adapts to new classification problems without re-training

  19. Linear vs Non-linear DML (10 runs, 100 points per class; classifier: threshold matching) • LINEAR: high bias, low variance • NON-LINEAR: low bias, high variance

  20. Combining Feature Spaces • Breiman's idea about ensemble classifiers [Breiman 01]: combine low-bias, high-variance (unstable) classifiers to produce a low-bias, low-variance classifier • Aggregated distance (the equation was an image; a plausible reconstruction follows below)
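
The truncated distance bullet above was an equation image; consistent with the aggregation described on slide 11, a plausible reconstruction is:

```latex
% Ensemble distance: average of the per-feature-space distances,
% where f_m is the transform trained on the m-th random matching problem.
d(x_i, x_j) \;=\; \frac{1}{M} \sum_{m=1}^{M}
\big\lVert f_m(x_i) - f_m(x_j) \big\rVert_2
```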
