
Adaptive Resonance Theory



  1. Adaptive Resonance Theory • Unsupervised Learning

  2. Learning Objectives • Introduction • ART architecture • ART implementation • ART2 • Conclusion

  3. Introduction • Developed by Carpenter and Grossberg • ART1 - accepts binary data • ART2 - accepts continuous-valued data • ART3 - improved ART • Fuzzy ART - accepts fuzzy data

  4. ART Architecture (1/20) • 1. Brief description: • a. Accepts an input vector and classifies it into one of a number of categories. • b. The category it belongs to depends on which stored pattern it most resembles. • c. If the input vector does not match any of the stored patterns, a new category is created.

  5. ART Architecture (2/20) • d. If the input vector matches (within the vigilance level) one of the stored patterns, that pattern is adjusted (trained) to make it more like the input vector. • 2. Simplified ART architecture: • a. two layers: comparison (F1) and recognition (F2) • b. three control functions: Gain 1, Gain 2, and Reset.
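Before walking through the layers, here is a minimal sketch of steps a-d in Python. The function name, the sequential scan order, and the |C|/|X| similarity test are illustrative assumptions; the network itself realizes this behavior with the layers and control signals described on the following slides.

```python
import numpy as np

def art_classify(x, prototypes, rho=0.7):
    """Sketch of ART's accept/match/create behavior (steps a-d)."""
    for j, t in enumerate(prototypes):
        c = np.logical_and(x, t).astype(int)   # overlap between input and pattern
        if c.sum() / max(x.sum(), 1) >= rho:   # match within the vigilance level?
            prototypes[j] = c                  # step d: adjust the stored pattern
            return j
    prototypes.append(x.copy())                # step c: create a new category
    return len(prototypes) - 1
```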

  6. ART Architecture (3/20)

  7. ART Architecture (4/20) • 3. Functions of each module: • a. Comparison layer: • - accepts a binary input X • - initially passes X through to C, so C = X • - a binary vector R produced by the recognition layer is fed back to modify C

  8. ART Architecture (5/20)

  9. ART Architecture (6/20) • - each neuron in the comparison layer has 3 inputs: • X: the input vector • Pj: the weighted sum of the recognition layer outputs • Gain 1: the same signal to all neurons • - uses the "two-thirds" rule: • => at least two of a neuron's three inputs must be one; otherwise, the output is zero. • - initially, Gain 1 is set to one and all components of R are set to zero.
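The two-thirds rule reduces to a one-line test per neuron; a sketch (the helper name is hypothetical):

```python
import numpy as np

def comparison_output(x, p, gain1):
    # x, p: binary vectors; gain1: 0 or 1, broadcast to every neuron.
    # A component of C is 1 only when at least two of the three inputs are 1.
    return ((x + p + gain1) >= 2).astype(int)

# Initially gain1 = 1 and P = 0, so C = X, as the slide states:
x = np.array([1, 0, 1, 1])
print(comparison_output(x, np.zeros(4, dtype=int), 1))   # -> [1 0 1 1]
```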

  10. ART Architecture (7/20) • b. Recognition layer: • - computes the dot product of B and C • - the neuron with the largest output wins • - the winning neuron's output is set to one; all others are set to zero. • Gain 2: • - OR of the components of X
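The recognition layer's winner-take-all step might look like this (a sketch; B holds the bottom-up weight vectors Bj as rows):

```python
import numpy as np

def recognition_winner(B, c):
    net = B @ c                            # dot product B_j . C for every neuron
    j = int(np.argmax(net))                # the neuron with the largest output wins
    r = np.zeros(B.shape[0], dtype=int)
    r[j] = 1                               # winner set to one, all others zero
    return j, r
```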

  11. ART Architecture (8/20)

  12. ART Architecture (9/20) • Gain 1:

  OR of X components   G2   OR of R   G1
  ---------------------------------------
          0            0       0      0
          1            1       0      1
          1            1       1      0
          0            0       1      0
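The whole table collapses to two boolean expressions; a sketch:

```python
def gains(x, r):
    g2 = int(x.any())                      # G2 = OR of the components of X
    g1 = int(x.any() and not r.any())      # G1 = 1 only while R is still all zero
    return g1, g2
```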

  13. ART Architecture (10/20) • Reset: • - measures the similarity between X and C; it fires when the match falls below the vigilance level
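Assuming the usual ART1 measure |C|/|X| (the slide does not give the formula), the Reset test can be sketched as:

```python
def reset_fires(x, c, rho):
    # Fraction of X's ones that survive into C, compared with vigilance rho.
    return c.sum() / max(x.sum(), 1) < rho
```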

  14. ART Architecture (11/20) • 4. Operations • a. Recognition phase • i) when no input vector X is applied, G2 = 0 • ii) G2 = 0 disables all neurons in the recognition layer => R = 0 • iii) this makes sure all neurons in this layer start out in the same state.

  15. ART Architecture (13/20) • iv) Then vector X is applied; X must have at least one component equal to "1". (∵ OR of X is "1" => G2 = 1; ∵ OR of R is "0" => G1 = 1.) • v) So G1 = G2 = 1, and C = X. • vi) Compute the dot products Bj·C; the neuron j with the largest result fires "one", all others fire "zero". • vii) This jth neuron has output rj of R equal to one, and all others equal to zero.

  16. ART Architecture (14/20)

  17. ART Architecture (15/20) • b. Comparison phase • i) rj passes through the binary weights tji to each neuron in the comparison layer, providing an input signal Pi. • ii) Now that R is no longer zero, G1 = 0, and by the 2/3 rule the components of C will be one only where both X and P are one.

  18. ART Architecture (16/20) • iii) If there is a substantial mismatch between X and P, C contains many zeros while X contains ones. This triggers the Reset function, which inhibits the output of the firing neuron in the recognition layer (forcing it to zero) and disables it for the duration of the current classification.
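Steps i)-ii) of the comparison phase in the same sketch style (T holds the top-down weight vectors Tj as rows):

```python
import numpy as np

def comparison_phase(x, T, j):
    p = T[j]                               # winner's pattern fed back (step i)
    c = np.logical_and(x, p).astype(int)   # with G1 = 0, the 2/3 rule gives X AND P
    return p, c
```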

  19. ART Architecture (17/20)

  20. ART Architecture (18/20) • c. Search phase • i) If no Reset signal is generated, the match is within the tolerance level and the classification is finished. Otherwise, search the other nodes. • ii) To search other nodes, let rj = 0 => R = 0 => G1 = 1 => C = X again. • iii) A different neuron wins => a different pattern P is fed back to the comparison layer.

  21. ART Architecture (19/20) • If a matching stored pattern is found, the network enters a training cycle. • If all stored patterns have been tried and none matches, a new neuron is assigned and its Bj and Tj are set from the input (sketched below).
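Slides 20-21 together describe a loop that can be sketched like this, reusing the |C|/|X| vigilance test assumed earlier:

```python
import numpy as np

def search(x, B, T, rho):
    inhibited = np.zeros(B.shape[0], dtype=bool)
    while not inhibited.all():
        net = np.where(inhibited, -np.inf, B @ x)   # C = X while searching
        j = int(np.argmax(net))                     # next-best candidate wins
        c = np.logical_and(x, T[j]).astype(int)
        if c.sum() / max(x.sum(), 1) >= rho:
            return j, c                             # match found: train this node
        inhibited[j] = True                         # Reset: disable j, keep searching
    return None, x                                  # no match: assign a new neuron
```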

  22. ART Architecture (20/20) • d. Performance issues: • i) sequential search • ii) stabilization

  23. ART Implementation (1/3) • 1. Initialization • Tj, Bj, vigilance level ρ • Bj: • 0 < bij < L/(L-1+m) for all i, j • m: number of components in the input vector • L: a constant > 1 (L = 2, typically) • all bij take the same value • Tj: tij = 1 for all i, j • ρ: 0 < ρ < 1; a low ρ gives coarse distinctions, a high ρ fine ones.
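These choices translate directly into code (a sketch; bij = 1/(1+m) is one common value satisfying the bound for L = 2):

```python
import numpy as np

def init_art1(m, n_max, L=2.0, rho=0.7):
    # m: components per input vector; n_max: available recognition neurons
    B = np.full((n_max, m), 1.0 / (1.0 + m))   # 0 < b_ij < L/(L-1+m), all equal
    T = np.ones((n_max, m))                    # t_ij = 1 for all i, j
    return B, T, rho
```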

  24. ART Implementation (2/3) • 2. Recognition • NETj = Bj · C • OUTj = 1 if NETj > T, 0 otherwise • 3. Comparison • 4. Search • The search proceeds until a pattern is matched or no pattern matches.
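The recognition formulas as written, in code (a sketch; the threshold T here is the slide's firing threshold, not the top-down weight matrix):

```python
def recognition_outputs(B, c, threshold):
    net = B @ c                               # NET_j = B_j . C
    return (net > threshold).astype(int)      # OUT_j = 1 if NET_j > T, else 0
```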

  25. ART Implementation (3/3) • 5. Training • If the input X is matched, the winning neuron's weights are updated; for the newly stored Tj: tij = ci, and bij = L·ci / (L − 1 + Σk ck) (the standard fast-learning update).
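A sketch of that update under the standard fast-learning assumption (the slide's own formula is not shown, so this form is an assumption):

```python
def train(B, T, j, c, L=2.0):
    T[j] = c                                # T_J stores the comparison vector C
    B[j] = L * c / (L - 1 + c.sum())        # B_J gets the normalized version
    return B, T
```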

  26. ART Algorithm (1/4) • 1. Initialization • L > 1, 0 < ρ < 1 • 0 < bij < L/(L-1+m) for all i, j • tij = 1 for all i, j • 2. While the stopping condition is false, do 3-12: • 3. For each training pattern X, do 4-11: • 4. Set R = 0, C = X • 5. For each node j in the recognition layer that is not inhibited (rj ≠ -1):

  27. ART Algorithm (2/4) • 6. If rj ≠ -1, set rj = Bj · C • 7. While Reset is true, do 8-10: • 8. Find the winning neuron J. If rJ = -1, this pattern cannot be clustered. • 9. Let P = TJ, and compute C (by the 2/3 rule, ci = xi · tiJ).

  28. ART Algorithm (3/4) • 10. Test for Reset: if ||C||/||X|| < ρ, set rJ = -1 and repeat from 8; otherwise Reset is false. • 11. Update BJ; set TJ = C. • 12. Test for stopping conditions: • a. no weight change • b. maximum number of epochs reached.

  29. ART Algorithm (4/4) • ** Stopping conditions: • a. no weight change • b. maximum number of epochs reached.
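Steps 1-12 assembled into one runnable sketch. The vigilance test of step 10 and the fast-learning updates of step 11 follow the standard formulation assumed above; variable names mirror the slides.

```python
import numpy as np

def art1(X, n_max, L=2.0, rho=0.7, max_epochs=100):
    """Sketch of the full ART1 algorithm (steps 1-12) on binary rows of X."""
    m = X.shape[1]
    B = np.full((n_max, m), 1.0 / (1.0 + m))        # 1. 0 < b_ij < L/(L-1+m)
    T = np.ones((n_max, m))                         #    t_ij = 1
    labels = np.zeros(len(X), dtype=int)
    for epoch in range(max_epochs):                 # 2. while stopping condition false
        old_T = T.copy()
        for i, x in enumerate(X):                   # 3. for each training pattern
            r = B @ x                               # 4-6. C = X, r_j = B_j . C
            while True:                             # 7. while Reset is true
                j = int(np.argmax(r))               # 8. find winning neuron J
                if r[j] == -1:
                    raise RuntimeError("pattern cannot be clustered")
                c = np.minimum(x, T[j])             # 9. P = T_J, compute C
                if c.sum() / max(x.sum(), 1) < rho: # 10. vigilance test
                    r[j] = -1                       #     Reset: inhibit J, retry
                else:
                    break
            B[j] = L * c / (L - 1 + c.sum())        # 11. update B_J, T_J = C
            T[j] = c
            labels[i] = j
        if np.array_equal(T, old_T):                # 12. stop: no weight change
            break                                   #     or max epochs reached
    return B, T, labels
```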

  30. ART2 (1/9) • Several sublayers replace the F1 layer • Update functions: (see the F1 update steps in the algorithm below) • Parameters: • m: dimension of the input vector • n: number of cluster units

  31. ART2 (2/9)

  32. ART2 (3/9) • Parameters (continued): • a, b: fixed weights in F1; cannot be zero. • c: fixed weight used in testing for Reset, 0 < c < 1. • d: output activation of F2 • e: a small parameter to prevent division by zero when the norm of a vector is zero. • θ: noise suppression parameter • α: learning rate • ρ: vigilance level
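Collected in one place as a parameter record (a sketch; the default values are common textbook choices, not values given on these slides):

```python
from dataclasses import dataclass

@dataclass
class ART2Params:
    a: float = 10.0      # fixed weight in F1 (nonzero)
    b: float = 10.0      # fixed weight in F1 (nonzero)
    c: float = 0.1       # fixed weight in the Reset test, 0 < c < 1
    d: float = 0.9       # output activation of F2
    e: float = 1e-7      # guards against division by a zero norm
    theta: float = 0.2   # noise suppression parameter
    alpha: float = 0.6   # learning rate
    rho: float = 0.9     # vigilance level
```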

  33. ART2 (4/9) • Algorithm: • a. Initialization: • set a, b, c, d, e, θ, α, ρ • b. Perform a number of epochs of training • c. For each input pattern S, do d. to m. • d. Update F1

  34. ART2 (5/9) • d. Update F1 again • e. Compute the signal to F2
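The F1 update formulas on these slides are images that did not carry over; the following sketch assumes the usual textbook ART2 sublayer equations (u, w, p, x, q, v), which match the parameter list above:

```python
import numpy as np

def f(x, theta):
    # Noise suppression: components below theta are zeroed.
    return np.where(np.abs(x) >= theta, x, 0.0)

def update_F1(s, v, tJ, a, b, d, e, theta):
    # One pass through the F1 sublayers for input pattern s.
    # tJ is the winner's top-down weight vector, or None before F2 is active.
    u = v / (e + np.linalg.norm(v))
    w = s + a * u
    p = u if tJ is None else u + d * tJ       # add the winner's top-down signal
    x = w / (e + np.linalg.norm(w))
    q = p / (e + np.linalg.norm(p))
    v_new = f(x, theta) + b * f(q, theta)     # noise-suppressed combination
    return u, w, p, x, q, v_new
```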

  35. ART2 (6/9) • f. While Reset is true, do g. to h. • g. Find winning neuron J in F2 • h. Check for Reset:

  36. ART2 (7/9) • i) If ||R|| < ρ − e, then yJ = −1 (inhibit J); Reset is true; repeat f. • ii) If ||R|| >= ρ − e, then Reset is false; proceed to i.
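A sketch of this check, assuming the usual ART2 form of the reset vector R (the slide's own formula image is missing):

```python
import numpy as np

def reset_check(u, p, c, e, rho):
    # R = (u + c*p) / (e + ||u|| + c*||p||); inhibit J when ||R|| < rho - e.
    r = (u + c * p) / (e + np.linalg.norm(u) + c * np.linalg.norm(p))
    return np.linalg.norm(r) < rho - e        # True => yJ = -1 and repeat f.
```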

  37. ART2 (8/9) • i. Do j. to l. for the specified number of learning iterations. • j. Update the weights for neuron J • k. Update F1

  38. ART2 (9/9) • l. Test stopping condition for weight updates. • m. Test stopping condition for # of epochs.
