A direct boosting algorithm for the k-nearest neighbor classifier via local warping of the distance metric
Presenter: Fen-Rou Ciou
Authors: Toh Koon Charlie Neo, Dan Ventura
2012, PRL


Presentation Transcript


  1. A direct boosting algorithm for the k-nearest neighbor classifier via local warping of the distance metric. Presenter: Fen-Rou Ciou. Authors: Toh Koon Charlie Neo, Dan Ventura. 2012, PRL.

  2. Outline • Motivation • Objectives • Methodology • Experiments • Conclusions • Comments

  3. Motivation • The k-nearest neighbor pattern classifier is an effective learning algorithm, but it can result in large model sizes.

  4. Objectives • The paper presents a direct boosting algorithm for the k-NN classifier that creates an ensemble of models with locally modified distance weighting, in order to increase accuracy and condense the model size.
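To make "locally modified distance weighting" concrete, here is a minimal sketch assuming a simple multiplicative warping: each training point carries its own weight that scales its distance to any query, so the metric is warped locally around that point. The function name and the exact form of the warping are illustrative assumptions, not the paper's definition.

```python
# Minimal sketch of k-NN under a locally warped (per-point weighted) distance.
# A larger w[i] pushes training point i "farther away"; a smaller w[i] pulls it closer.
import numpy as np

def weighted_knn_predict(X_train, y_train, w, X_query, k=3):
    """Predict labels for X_query with per-training-point distance weights w."""
    preds = []
    for x in X_query:
        # Warped distance: Euclidean distance scaled by each training point's weight.
        d = w * np.linalg.norm(X_train - x, axis=1)
        nn = np.argsort(d)[:k]                      # k nearest under the warped metric
        labels, counts = np.unique(y_train[nn], return_counts=True)
        preds.append(labels[np.argmax(counts)])     # majority vote among the neighbors
    return np.array(preds)
```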

  5. Methodology - Framework [framework diagram: an AdaBoost-style boosting loop over the k-NN model]
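The framework slide shows an AdaBoost-style outer loop around the weighted k-NN above. The sketch below is one plausible version of such a loop, assuming integer class labels and a fixed multiplicative update factor `alpha`; the update rule is an assumption for illustration, not the paper's exact procedure. Misclassified training points warp the metric locally: same-class neighbors are pulled closer, other-class neighbors are pushed away, and each round's weight vector is kept as an ensemble member.

```python
# Sketch of an AdaBoost-style boosting loop around the weighted k-NN sketch above.
# The multiplicative update with a fixed factor `alpha` is an assumption.
import numpy as np

def boosted_knn_train(X, y, k=3, rounds=10, alpha=1.1):
    n = len(X)
    w = np.ones(n)                        # start from the unwarped metric
    ensemble = []                         # one weight vector per boosting round
    for _ in range(rounds):
        for i in range(n):
            d = w * np.linalg.norm(X - X[i], axis=1)
            d[i] = np.inf                 # leave-one-out: a point is not its own neighbor
            nn = np.argsort(d)[:k]
            pred = np.bincount(y[nn]).argmax()
            if pred != y[i]:              # misclassified: warp the metric locally
                w[nn[y[nn] == y[i]]] /= alpha   # pull same-class neighbors closer
                w[nn[y[nn] != y[i]]] *= alpha   # push other-class neighbors away
        ensemble.append(w.copy())
    return ensemble
```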

  6. Methodology Sensitivity to data order - Randomized order - Batch update

  7. Methodology Sensitivity to data order - Randomized order - Batch update
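Slides 6-7 note that this kind of incremental update is sensitive to the order in which training points are visited, and name two remedies: a randomized data order and a batch update. The sketch below, under the same assumptions as above and with a hypothetical function name, contrasts updating the weights point by point with accumulating all of a pass's updates and applying them once at the end.

```python
# One training pass, either updating weights immediately per point (order-sensitive)
# or accumulating a multiplicative batch update that is applied at the end of the pass.
import numpy as np

def one_pass(X, y, w, k=3, alpha=1.1, batch=True, seed=0):
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X))        # randomized data order
    factors = np.ones_like(w)              # accumulated batch update (multiplicative)
    for i in order:
        d = w * np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf
        nn = np.argsort(d)[:k]
        if np.bincount(y[nn]).argmax() != y[i]:
            target = factors if batch else w    # where the local update is written
            target[nn[y[nn] == y[i]]] /= alpha
            target[nn[y[nn] != y[i]]] *= alpha
    return w * factors if batch else w
```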

  8. Methodology Voting mechanism - simple voting - error-weighted voting
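A minimal sketch of the two voting schemes named on the slide, reusing `weighted_knn_predict` from the earlier sketch: simple voting gives every ensemble member one vote, while error-weighted voting scales each member's vote by its training error. The log-odds weighting below is the usual AdaBoost choice and is an assumption here, as is the `errors` argument.

```python
# Combine the per-round weight vectors by voting over each query point.
import numpy as np

def ensemble_predict(ensemble, errors, X_train, y_train, X_query, k=3,
                     error_weighted=False):
    n_classes = int(y_train.max()) + 1
    scores = np.zeros((len(X_query), n_classes))
    for w, err in zip(ensemble, errors):
        pred = weighted_knn_predict(X_train, y_train, w, X_query, k)
        # Simple voting: every member counts 1. Error-weighted voting:
        # members with lower training error get a proportionally larger say.
        vote = np.log((1.0 - err) / err) if error_weighted else 1.0
        scores[np.arange(len(X_query)), pred] += vote
    return scores.argmax(axis=1)
```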

  9. Methodology Condensing the model size - optimal weights - averaged weights
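The ensemble of weight vectors can be collapsed back into a single k-NN model, which is what the slide's two options suggest: keep one "optimal" weight vector, or average the weight vectors across rounds. A small sketch, with the selection criterion assumed to be lowest training error:

```python
import numpy as np

def condense(ensemble, errors, how="average"):
    """Collapse the per-round weight vectors into a single k-NN model."""
    if how == "optimal":
        return ensemble[int(np.argmin(errors))]   # single best round (assumed criterion)
    return np.asarray(ensemble).mean(axis=0)      # averaged weights across rounds
```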

  10. Experiments

  11. Experiments

  12. Experiments Fig 8. Boosted k-NN with randomized data order. Fig 9. Boosted k-NN with batch update. Fig 10. Boosted k-NN with error-weighted voting. Fig 11. Boosted k-NN with optimal weights. Fig 12. Boosted k-NN with average weights.

  13. Experiments

  14. Conclusions • The Boosted k-NN can boost the generalization accuracy of the k-nearest neighbor algorithm. • The Boosted k-NN algorithm modifies the decision surface, producing a better solution.

  15. Comments • Advantages • The paper describes a rich set of experiments. • Applications • Classification
