
Unsupervised Learning Networks






Presentation Transcript


  1. Unsupervised Learning Networks Lecturer: 虞台文

  2. Content • Introduction • Important Unsupervised Learning NNs • Hamming Networks • Kohonen’s Self-Organizing Feature Maps • Grossberg’s ART Networks • Counterpropagation Networks • Adaptive BAM • Neocognitron • Conclusion

  3. Unsupervised Learning Networks Introduction

  4. What is Unsupervised Learning? • Learning without a teacher. • No feedback to indicate the desired outputs. • The network must discover for itself the relationships of interest in the input data, e.g., patterns, features, regularities, correlations, or categories. • Translate the discovered relationships into outputs.

  5. A Strange World

  6. A B Height C IQ Supervised Learning

  7. A B Height C IQ Try Classification Supervised Learning

  8. A B Height C IQ The Probabilities of Populations

  9. A Height B C IQ The Centroids of Clusters

  10. A Height B C IQ Try Classification The Centroids of Clusters

  11. Height IQ Unsupervised Learning

  12. Height IQ Unsupervised Learning

  13. Height IQ Categorize the input patterns into several classes based on the similarity among patterns. Clustering Analysis

  14. Height IQ Categorize the input patterns into several classes based on the similarity among patterns. Clustering Analysis How many classes may we have?

  15. Height IQ Categorize the input patterns into several classes based on the similarity among patterns. Clustering Analysis 2 clusters

  16. Height IQ Categorize the input patterns into several classes based on the similarity among patterns. Clustering Analysis 3 clusters

  17. Height IQ Categorize the input patterns into several classes based on the similarity among patterns. Clustering Analysis 4 clusters
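Slides 13–17 show the same height/IQ data partitioned into 2, 3, or 4 classes. A minimal k-means sketch (NumPy only; the height/IQ points below are made up for illustration, not data from the slides) shows how the choice of the number of clusters k changes the resulting partition:

```python
# Minimal k-means clustering sketch: assign points to the nearest
# centroid, then move each centroid to the mean of its points.
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean distance).
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Synthetic (height, IQ) data with three loose groups.
X = np.array([[150, 95], [152, 100], [155, 98],
              [170, 120], [172, 125], [168, 118],
              [185, 105], [188, 102], [183, 108]], float)
for k in (2, 3, 4):
    _, labels = kmeans(X, k)
    print(k, labels)
```

The same data supports 2, 3, or 4 clusters; nothing in the data itself forces one answer, which is the point of slide 14's question.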

  18. Unsupervised Learning Networks The Hamming Networks

  19. The Nearest Neighbor Classifier • Suppose that we have p prototypes centered at x(1), x(2), …, x(p). • Given a pattern x, it is assigned to the class label of the ith prototype if ‖x − x(i)‖ ≤ ‖x − x(j)‖ for all j = 1, 2, …, p. • Examples of distance measures include the Hamming distance and the Euclidean distance.
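The nearest-neighbor rule above can be sketched in a few lines. Euclidean distance is used here; Hamming distance works the same way for binary patterns. The prototypes and test points are made up for illustration:

```python
# Nearest-prototype classifier: assign x to the class of the
# stored prototype that minimizes ||x - x(i)||.
import numpy as np

def nearest_prototype(x, prototypes):
    """Return the index i minimizing the Euclidean distance ||x - x(i)||."""
    dists = [np.linalg.norm(x - p) for p in prototypes]
    return int(np.argmin(dists))

prototypes = [np.array([0.0, 0.0]),
              np.array([5.0, 5.0]),
              np.array([0.0, 5.0])]
print(nearest_prototype(np.array([4.0, 4.5]), prototypes))  # → 1
```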

  20. 1 2 3 4 The Stored Prototypes The Nearest Neighbor Classifier x(1) x(2) x(3) x(4)

  21. 1 2 3 4 The Nearest Neighbor Classifier x(1) x(2) Class? x(3) x(4)

  22. The Hamming Networks • Stores a set of classes represented by a set of binary prototypes. • Given an incomplete binary input, find the class to which it belongs. • Uses the Hamming distance as the distance measure. • Distance vs. Similarity.

  23. x1 x2 xn The Hamming Net MAXNET Winner-Take-All Similarity Measurement

  24. The Hamming Distance y = 1 1 −1 −1 1 −1 1 x = 1 1 1 −1 −1 −1 −1 Hamming Distance = ?

  25. The Hamming Distance y = 1 1 −1 −1 1 −1 1 x = 1 1 1 −1 −1 −1 −1 Hamming Distance = 3

  26. The Hamming Distance y = 1 1 −1 −1 1 −1 1 x = 1 1 1 −1 −1 −1 −1 component products: 1 1 −1 1 −1 1 −1 Sum = 1

  27. The Hamming Distance For bipolar vectors x, y ∈ {−1, +1}ᵐ, each agreeing component contributes +1 to the inner product and each disagreeing component contributes −1, so x·y = m − 2·HD(x, y).

  28. The Hamming Distance HD(x, y) = (m − x·y)/2. In the example above, m = 7 and x·y = 1, so HD = (7 − 1)/2 = 3.
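The inner-product identity for bipolar vectors can be checked directly. The 7-component vectors below are illustrative choices consistent with the numbers on the slides (HD = 3, component-product sum = 1):

```python
# Hamming distance for bipolar (+1/-1) vectors via the identity
# x . y = m - 2 * HD(x, y), i.e. HD(x, y) = (m - x . y) / 2.
import numpy as np

def hamming_distance(x, y):
    m = len(x)
    return (m - int(np.dot(x, y))) // 2

x = np.array([1, 1,  1, -1, -1, -1, -1])
y = np.array([1, 1, -1, -1,  1, -1,  1])
print(hamming_distance(x, y))   # differ in 3 positions
print(int((x != y).sum()))      # direct count agrees
```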

  29. y1 y2 … yn−1 yn 1 2 … n−1 n 1 2 … n−1 n x1 x2 … xm−1 xm The Hamming Net MAXNET Winner-Take-All Similarity Measurement

  30. y1 y2 … yn−1 yn 1 2 … n−1 n 1 2 … n−1 n x1 x2 … xm−1 xm The Hamming Net MAXNET Winner-Take-All WM = ? Similarity Measurement WS = ?

  31. y1 y2 … yn−1 yn 1 2 … n−1 n 1 2 … n−1 n x1 x2 … xm−1 xm The Stored Patterns MAXNET Winner-Take-All WM = ? Similarity Measurement WS = ?

  32. m/2 k … x1 x2 … xm The Stored Patterns Similarity Measurement

  33. 1 2 … n−1 n Similarity Measurement x1 x2 … xm−1 xm Weights for Stored Patterns WS = ?

  34. m/2 m/2 m/2 m/2 1 2 … n−1 n x1 x2 … xm−1 xm Weights for Stored Patterns Similarity Measurement WS = ?
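Under the standard Hamming-net construction (the m/2 terms on slides 32 and 34 are the unit biases), the similarity layer's weights are the stored prototypes scaled by 1/2 and each unit has bias m/2, so unit j outputs (x·x(j) + m)/2 = m − HD(x, x(j)), the number of bits of x that match prototype j. The prototypes below are made up for illustration:

```python
# Similarity-measurement layer of a Hamming net: WS = prototypes / 2,
# bias = m / 2, so each unit outputs the count of matching bits.
import numpy as np

prototypes = np.array([[ 1,  1,  1, -1],
                       [-1, -1,  1,  1],
                       [ 1, -1, -1, -1]])
m = prototypes.shape[1]
WS = prototypes / 2.0           # weights for stored patterns
bias = m / 2.0                  # the m/2 term on the slides

x = np.array([1, 1, 1, 1])      # an input pattern
scores = WS @ x + bias          # matching scores, one per prototype
print(scores)                   # → [3. 2. 1.]
```

The scores feed the MAXNET on top, which then selects the prototype with the most matching bits.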

  35. y1 y2 … yn−1 yn 1 2 … n−1 n 1 2 … n−1 n x1 x2 … xm−1 xm The MAXNET MAXNET Winner-Take-All Similarity Measurement

  36. Weights of MAXNET y1 y2 … yn−1 yn MAXNET Winner-Take-All 1 1 2 … n−1 n

  37. Weights of MAXNET y1 y2 … yn−1 yn self-weight 1, lateral weights −ε, 0 < ε < 1/n MAXNET Winner-Take-All 1 2 … n−1 n

  38. Updating Rule 0 < ε < 1/n −ε MAXNET Winner-Take-All 1 2 … n−1 n s1 s2 s3 … sn

  39. Updating Rule sj(t+1) = max(0, sj(t) − ε Σk≠j sk(t)), 0 < ε < 1/n MAXNET Winner-Take-All 1 2 … n−1 n s1 s2 s3 … sn

  40. Analysis of the Updating Rule Let sM(t) = maxj sj(t). If sj(t) < sM(t) now, then sj(t+1) < sM(t+1): the update preserves the ordering of the activations, and the gap to the maximum grows at each step.

  41. Analysis of the Updating Rule Every non-maximal activation therefore shrinks to zero after finitely many iterations, leaving only the unit with the largest initial activation active: the winner takes all.

  42. Example
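The MAXNET iteration of slides 36–41 can be worked through numerically (the initial scores below are made up for illustration): each unit has self-weight 1 and inhibits every other unit with weight −ε, 0 < ε < 1/n, and the iteration drives all but the largest initial score to zero:

```python
# MAXNET winner-take-all: iterate
#   s_j(t+1) = max(0, s_j(t) - eps * sum_{k != j} s_k(t))
# until at most one unit remains active.
import numpy as np

def maxnet(s, eps=0.1, max_iters=100):
    s = np.asarray(s, float).copy()
    for _ in range(max_iters):
        new = np.maximum(0.0, s - eps * (s.sum() - s))
        if np.count_nonzero(new) <= 1:
            return new
        s = new
    return s

scores = [3.0, 2.0, 1.0, 2.5]   # e.g. matching scores from the lower layer
out = maxnet(scores, eps=0.2)   # eps must satisfy 0 < eps < 1/n = 0.25
print(np.argmax(out))           # → 0 (the largest initial score wins)
```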

  43. Unsupervised Learning Networks The Self-Organizing Feature Map

  44. Feature Mapping • Map high-dimensional input signals onto a lower-dimensional (usually 1 or 2D) structure. • Similarity relations present in the original data are still present after the mapping. Dimensionality Reduction Topology-Preserving Map

  45. Somatotopic Map Illustration: The “Homunculus” The relationship between body surfaces and the regions of the brain that control them.

  46. Another Depiction of the Homunculus

  47. Phonotopic maps

  48. Phonotopic maps humppila

  49. Self-Organizing Feature Map • Developed by Professor Teuvo Kohonen. • One of the most popular neural network models. • Unsupervised learning. • A competitive-learning network.

  50. The Structure of SOM
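The competitive-learning idea behind the SOM can be sketched for a one-dimensional map (all hyperparameters below are illustrative assumptions, not values from the slides): find the best-matching unit for each input, then pull it and its map neighbors toward that input with a decaying learning rate and a shrinking neighborhood:

```python
# Minimal 1-D Kohonen SOM training sketch.
import numpy as np

rng = np.random.default_rng(0)
n_units, dim, n_steps = 10, 2, 500
W = rng.random((n_units, dim))            # map unit weights
data = rng.random((n_steps, dim))         # training inputs in [0, 1]^2

for t in range(n_steps):
    x = data[t]
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best-matching unit
    lr = 0.5 * (1 - t / n_steps)                  # decaying learning rate
    sigma = max(1.0, 3.0 * (1 - t / n_steps))     # shrinking neighborhood width
    d = np.abs(np.arange(n_units) - bmu)          # distance along the 1-D map
    h = np.exp(-(d ** 2) / (2 * sigma ** 2))      # neighborhood function
    W += lr * h[:, None] * (x - W)                # pull BMU and neighbors toward x
```

Because neighbors on the map are updated together, units that are adjacent on the map end up representing nearby regions of the input space: the topology-preserving property described on slide 44.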
