
The Evolving Tree — Analysis and Applications


Presentation Transcript


  1. The Evolving Tree — Analysis and Applications Advisor: Dr. Hsu Presenter: Zih-Hui Lin Authors: Jussi Pakkanen, Jukka Iivarinen, and Erkki Oja, IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 17, NO. 3, MAY 2006

  2. Outline • Motivation • Objective • ETree • ETree-Analysis • Experiments • Conclusions

  3. Motivation • The self-organizing map (SOM) is a useful data analysis tool, but some of its intrinsic features make it unsuitable for analyzing very large scale problems. • The Evolving Tree (ETree) converts complexity control from a global problem to a local one, which is simpler.

  4. Objective • We have analyzed and compared the ETree against many different systems.

  5. ETree — Training [figure: a training data vector xi hops down the search tree to its BMU] 1) Find the BMU using the search tree. 2) Update the leaf node locations using the SOM training formulas, substituting tree distance for grid distance. 3) Increment the BMU's hit counter. 4) If the counter reaches the splitting threshold, split the node.
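A minimal sketch of one such training step is given below, assuming a simple node class with a weight vector, child list, and hit counter. For brevity only the BMU itself is moved; the paper's full update also moves nearby leaves using a neighborhood function over tree distance. All names (Node, find_bmu, train_step) and parameter values are illustrative, not taken from the authors' code.

```python
import numpy as np

class Node:
    """Tree node: a leaf has no children; internal nodes only route the search."""
    def __init__(self, weight, parent=None):
        self.weight = np.asarray(weight, dtype=float)
        self.parent = parent
        self.children = []   # empty list -> leaf node
        self.hits = 0        # BMU hit counter

def find_bmu(root, x):
    """Greedy top-down search: at each layer descend into the closest child."""
    node = root
    while node.children:
        node = min(node.children, key=lambda c: np.linalg.norm(c.weight - x))
    return node

def train_step(root, x, lr=0.1, split_threshold=50, fanout=2):
    x = np.asarray(x, dtype=float)
    bmu = find_bmu(root, x)                  # 1) find the BMU via the search tree
    bmu.weight += lr * (x - bmu.weight)      # 2) SOM-style update (BMU only in this sketch)
    bmu.hits += 1                            # 3) increment the hit counter
    if bmu.hits >= split_threshold:          # 4) split when the threshold is reached
        bmu.children = [Node(bmu.weight.copy(), parent=bmu) for _ in range(fanout)]
        bmu.hits = 0
```

Splitting is driven purely by the local hit counter, which is what makes complexity control a local rather than a global decision.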

  6. ETree — How to control the growth

  7. ETree-Removing Layers • One beneficial feature of most neural networks is graceful degradation.

  8. ETree — Better search for the BMU • At every layer we keep the n best subbranches instead of only one. [figure: regular BMU search compared with the n-best search]
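A hedged sketch of this n-best search: instead of committing to the single closest child at every layer, the n closest candidates are carried forward and the best leaf among all reached candidates is returned. It reuses the Node class from the earlier sketch; beam_find_bmu is an illustrative name.

```python
import numpy as np

def beam_find_bmu(root, x, n=3):
    """Keep the n closest subbranches at each layer; return the best reached leaf."""
    x = np.asarray(x, dtype=float)
    frontier, leaves = [root], []
    while frontier:
        next_layer = []
        for node in frontier:
            if node.children:
                next_layer.extend(node.children)
            else:
                leaves.append(node)          # a leaf reached on this path
        # prune: keep only the n children closest to x for the next layer
        next_layer.sort(key=lambda c: np.linalg.norm(c.weight - x))
        frontier = next_layer[:n]
    return min(leaves, key=lambda c: np.linalg.norm(c.weight - x))
```

With n = 1 this reduces to the regular greedy BMU search; larger n trades extra distance computations for a better chance of finding the true nearest leaf.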

  9. ETree — Child Node Initialization • Two strategies are used: the first perturbs the child nodes randomly. • The second is based on principal component analysis (PCA).
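A sketch of both strategies under stated assumptions: the random strategy adds small noise to the parent weight, and the PCA strategy spreads the children along the first principal component of the data vectors that mapped to the parent. The exact scaling and sampling used in the paper may differ; perturb_init and pca_init are illustrative names.

```python
import numpy as np

def perturb_init(parent_weight, fanout=2, scale=0.01, rng=None):
    """Random strategy: children start as small random perturbations of the parent."""
    rng = np.random.default_rng() if rng is None else rng
    parent_weight = np.asarray(parent_weight, dtype=float)
    return [parent_weight + scale * rng.standard_normal(parent_weight.shape)
            for _ in range(fanout)]

def pca_init(parent_weight, data, fanout=2, scale=0.5):
    """PCA strategy: spread children along the first principal component of the
    data vectors for which the parent was the BMU."""
    parent_weight = np.asarray(parent_weight, dtype=float)
    data = np.asarray(data, dtype=float)
    centered = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    pc1 = vt[0]                                   # first principal direction
    offsets = np.linspace(-scale, scale, fanout)  # symmetric spread along pc1
    return [parent_weight + t * pc1 for t in offsets]
```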

  10. ETree — Optimizing the Leaf Node Locations 1) First, we map all training vectors to leaf nodes using the established BMU search. 2) Then we move the leaf nodes to the center of mass of their respective data vectors.
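A sketch of one such refinement pass, reusing find_bmu from the training sketch; optimize_leaves is an illustrative name.

```python
import numpy as np
from collections import defaultdict

def optimize_leaves(root, data):
    """One pass: assign every training vector to its leaf BMU, then move each
    leaf to the centroid (center of mass) of the vectors assigned to it."""
    assignments = defaultdict(list)
    for x in data:
        leaf = find_bmu(root, np.asarray(x, dtype=float))  # established tree search
        assignments[id(leaf)].append((leaf, x))
    for items in assignments.values():
        leaf = items[0][0]
        vectors = np.asarray([x for _, x in items], dtype=float)
        leaf.weight = vectors.mean(axis=0)                  # move to center of mass
```

This is essentially one Lloyd-style (k-means) update restricted to the leaves, with the tree providing a fast approximate nearest-prototype assignment.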

  11. Visualization experiments [figure: data vectors with prototypes from ETree leaf nodes, SOM, and k-means]

  12. Quality of clustering without the search tree

  13. Conclusions • ETree's performance is quite close to that of classical, nonhierarchical algorithms, but it is noticeably faster. • ETree makes implementation and application easier.

  14. My opinion • Advantage: • … • Disadvantage: • … • Applications: • clustering, classification, large datasets
