
A fast nearest neighbor classifier based on self-organizing incremental neural network (SOINN)


Presentation Transcript


  1. A fast nearest neighbor classifier based on self-organizing incremental neural network (SOINN) • Presenter: Lin, Shu-Han • Authors: Shen Furao, Osamu Hasegawa • Neural Networks (NN, 2008)

  2. Outline • Introduction • Motivation • Objective • Methodology • Experiments • Conclusion • Comments

  3. Introduction – self-organizing incremental neural network (SOINN) • Distance: an input too far from its nearest nodes is inserted as a new node • Node = prototype

  4. Introduction – self-organizing incremental neural network (SOINN) • Link age: edges connect the winner and second-winner nodes, and each edge carries an age

  5. Introduction – self-organizing incremental neural network (SOINN) • Age: edges that grow too old are removed

  6. Introduction – self-organizing incremental neural network (SOINN) • Insert a node if the accumulated error is large • Cancel the insertion if it brings no benefit • Run two times (two-layer training)

  7. Introduction – self-organizing incremental neural network (SOINN) • Delete outliers: nodes without neighbors (low-density assumption) • Run two times (two-layer training) • A simplified sketch of these mechanics follows
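The mechanics above (distance-based insertion, edge aging, and node movement) can be illustrated with a minimal Python sketch. This is a hypothetical single-layer simplification, not the authors' exact two-layer algorithm: the class name, the learning rate, and the threshold approximation are all assumptions.

```python
import numpy as np

class SimpleSOINN:
    """Hypothetical, simplified single-layer SOINN-style learner
    (illustrative names; not the paper's exact two-layer algorithm)."""

    def __init__(self, age_max=50, eps=0.1):
        self.nodes = []       # prototype vectors
        self.edges = {}       # (i, j) with i < j  ->  edge age
        self.age_max = age_max
        self.eps = eps        # winner learning rate (assumed value)

    def _threshold(self, i):
        # Similarity threshold of node i: distance to its closest node.
        # (Full SOINN uses topological neighbors when they exist.)
        return min(np.linalg.norm(self.nodes[i] - self.nodes[j])
                   for j in range(len(self.nodes)) if j != i)

    def update(self, x):
        x = np.asarray(x, dtype=float)
        if len(self.nodes) < 2:               # bootstrap with first inputs
            self.nodes.append(x)
            return
        d = [np.linalg.norm(x - n) for n in self.nodes]
        s1, s2 = map(int, np.argsort(d)[:2])  # winner, second winner
        # "Distance: too far" -> the input becomes a new node (prototype)
        if d[s1] > self._threshold(s1) or d[s2] > self._threshold(s2):
            self.nodes.append(x)
            return
        # connect winner and second winner; reset that edge's age
        key = (min(s1, s2), max(s1, s2))
        self.edges[key] = 0
        # age the winner's other edges; "Age: too old" -> remove the edge
        for e in list(self.edges):
            if s1 in e and e != key:
                self.edges[e] += 1
                if self.edges[e] > self.age_max:
                    del self.edges[e]
        # move the winner slightly toward the input
        self.nodes[s1] += self.eps * (x - self.nodes[s1])
```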

  8. Motivation • The SOINN classifier (their first study, in 2005) • Uses 6 user-determined parameters • Does not address noise • Produces too many prototypes • Is unsupervised learning • Their second study (in 2007) discusses these weaknesses

  9. Objectives • Propose an improved version of SOINN: ASC (Adjusted SOINN Classifier) • FASTER: deletes prototypes, so fewer remain, in both the training phase and the classification phase • CLASSIFIER: 1-NN (prototype) rule, sketched after this list • INCREMENTAL LEARNING • ONE LAYER: the setting is easy to understand, with fewer parameters • MORE STABLE: with the help of k-means
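The classification phase is the plain 1-NN rule over the learned prototypes. A minimal sketch; the function and argument names are illustrative, not from the paper:

```python
import numpy as np

def classify_1nn(x, prototypes, labels):
    """1-NN (prototype) rule: return the label of the closest prototype."""
    d = np.linalg.norm(np.asarray(prototypes) - np.asarray(x), axis=1)
    return labels[int(np.argmin(d))]

# usage: classify_1nn([0.8, 0.7], [[0, 0], [1, 1]], ['a', 'b'])  ->  'b'
```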

  10. Methodology – Adjusted SOINN • Distance: an input too far from the nearest nodes starts a new node • A node is a cluster

  11. Methodology – Adjusted SOINN • Link age: an edge connects the winner and second winner and carries an age

  12. Methodology – Adjusted SOINN • The winner node and its neighbors are moved toward the input

  13. Methodology – Adjusted SOINN • Age: edges older than the parameter ad are removed

  14. Methodology – Adjusted SOINN • Delete outliers: nodes without neighbors (low-density assumption)

  15. Methodology – Adjusted SOINN • Lambda = the number of iterations between outlier deletions; the step is sketched below
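A sketch of this periodic denoising step: every lambda inputs, nodes that have no edge left are treated as noise and removed. Helper and argument names are illustrative, not from the paper.

```python
def prune_isolated(nodes, labels, edges):
    """Delete nodes with no remaining edge (outliers under the low-density
    assumption) and remap the edge indices to the surviving nodes."""
    connected = sorted({i for edge in edges for i in edge})
    remap = {old: new for new, old in enumerate(connected)}
    nodes = [nodes[i] for i in connected]
    labels = [labels[i] for i in connected]
    edges = {(remap[a], remap[b]): age for (a, b), age in edges.items()}
    return nodes, labels, edges

# in the training loop, e.g.:
# if (t + 1) % lam == 0:    # lam = the lambda parameter from the slide
#     nodes, labels, edges = prune_isolated(nodes, labels, edges)
```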

  16. Methodology – k-means • With the help of k-means clustering, k = number of neurons • Adjust the resulting prototypes, assuming each node should lie near the centroid of its class; a sketch follows
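One plausible reading of this adjustment, sketched below: run k-means with k equal to the number of nodes, seeded with the nodes themselves, so each prototype settles on the centroid of the training points it attracts. The paper's exact procedure may differ, and the use of scikit-learn here is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

def adjust_prototypes(prototypes, X):
    """Refine the learned prototypes with k-means (k = number of nodes),
    seeding k-means with the prototypes so each node moves toward the
    centroid of the training points it represents."""
    init = np.asarray(prototypes, dtype=float)
    km = KMeans(n_clusters=len(init), init=init, n_init=1).fit(np.asarray(X))
    return km.cluster_centers_
```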

  17. Methodology – noise reduction • With the help of the k-Edit Neighbors Classifier (ENC); k = ? (no value is suggested) • Delete nodes whose label differs from the majority vote of their k neighbors, assuming such nodes are generated by noise; a sketch follows
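A sketch of this edited-neighbors cleanup. The function name is illustrative, k = 3 is an arbitrary placeholder since the slide leaves k open, and labels are assumed to be small non-negative integers (for np.bincount).

```python
import numpy as np

def edit_noisy_nodes(nodes, labels, k=3):
    """Drop every prototype whose label disagrees with the majority vote
    of its k nearest fellow prototypes (assumed to be generated by noise)."""
    nodes = np.asarray(nodes, dtype=float)
    labels = np.asarray(labels)
    keep = []
    for i in range(len(nodes)):
        d = np.linalg.norm(nodes - nodes[i], axis=1)
        d[i] = np.inf                       # exclude the node itself
        neighbors = np.argsort(d)[:k]
        majority = np.bincount(labels[neighbors]).argmax()
        if labels[i] == majority:
            keep.append(i)
    return nodes[keep], labels[keep]
```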

  18. Methodology – center-cleaning • Delete a neuron if it has never been the nearest neuron to a sample of another class, assuming such neurons lie in the central part of their class; a sketch follows
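A sketch of center-cleaning under that assumption: a prototype that is never the nearest prototype to a training sample of a different class sits in the interior of its class and cannot shape the 1-NN decision boundary, so it is removed. All names are illustrative.

```python
import numpy as np

def clean_class_centers(nodes, labels, X, y):
    """Keep only prototypes that were the nearest prototype to at least
    one training sample of a different class (boundary prototypes)."""
    nodes = np.asarray(nodes, dtype=float)
    labels = np.asarray(labels)
    near_boundary = np.zeros(len(nodes), dtype=bool)
    for xi, yi in zip(np.asarray(X, dtype=float), np.asarray(y)):
        j = int(np.argmin(np.linalg.norm(nodes - xi, axis=1)))
        if labels[j] != yi:   # nearest prototype has a different label
            near_boundary[j] = True
    return nodes[near_boundary], labels[near_boundary]
```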

  19. Experiments: Artificial dataset (figure: the dataset, the Adjusted SOINN result, and the ASC result) • Error: same • Speed: faster

  20. Experiments: Artificial dataset (figure: the dataset, the Adjusted SOINN result, and the ASC result) • Error: same • Speed: faster

  21. Experiments: Artificial dataset (figure: the dataset, the Adjusted SOINN result, and the ASC result) • Error: better • Speed: faster

  22. Experiments: Artificial dataset (figure: the dataset, the Adjusted SOINN result, and the ASC result) • Error: better • Speed: faster

  23. Experiments: Real dataset (table: compression ratio (%) and speed-up ratio (%))

  24. Experiments: Comparison with other prototype-based classification methods • Nearest Subclass Classifier (NSC) • k-Means Classifier (KMC) • k-NN Classifier (NNC) • Learning Vector Quantization (LVQ)

  25. Experiments: Comparison with other prototype-based classification methods (results)

  26. Conclusions • ASC • Learns the number of nodes needed to determine the decision boundary • Is an incremental neural network • Is robust to noisy training data • Classifies fast • Has fewer parameters: 3

  27. Comments • Advantages: improves many aspects, and a previous paper demonstrates the weaknesses they want to fix • Drawback: no suggested parameter values • Application: a step from unsupervised learning to supervised learning
