
Supervised Hebbian Learning


Presentation Transcript


  1. Supervised Hebbian Learning

  2. Hebb’s Postulate
  “When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.” (D. O. Hebb, 1949)
  [Figure: cell A synapsing onto cell B]

  3. Linear Associator
  a = W p
  Training Set: {p_1, t_1}, {p_2, t_2}, ..., {p_Q, t_Q}

  4. Hebb Rule
  General form: w_ij^new = w_ij^old + α f_i(a_iq) g_j(p_jq)  (postsynaptic signal f_i(a_iq) times presynaptic signal g_j(p_jq))
  Simplified Form: w_ij^new = w_ij^old + α a_iq p_jq
  Supervised Form: w_ij^new = w_ij^old + t_iq p_jq  (actual output replaced by the target)
  Matrix Form: W^new = W^old + t_q p_q^T
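As a sketch, the supervised Hebb rule above can be applied one training pair at a time; the input and target patterns below are hypothetical, chosen only to illustrate the update:

```python
import numpy as np

# Hypothetical training set (orthonormal input prototypes), for illustration only.
P = np.array([[1.0, 0.0],
              [0.0, 1.0]])   # columns are input prototypes p_1, p_2
T = np.array([[1.0, -1.0],
              [1.0,  1.0]])  # columns are targets t_1, t_2

# Supervised Hebb rule, one pair at a time: W_new = W_old + t_q p_q^T
W = np.zeros((2, 2))
for q in range(P.shape[1]):
    W += np.outer(T[:, q], P[:, q])

# With zero initial weights this matches the batch form W = T P^T
assert np.allclose(W, T @ P.T)
```

The loop and the batch formula agree because summing the rank-one outer products t_q p_q^T is exactly the matrix product T P^T.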

  5. Batch Operation (Zero Initial Weights)
  W = t_1 p_1^T + t_2 p_2^T + ... + t_Q p_Q^T
  Matrix Form: W = T P^T, where P = [p_1 p_2 ... p_Q] and T = [t_1 t_2 ... t_Q]

  6. Performance Analysis
  a = W p_k = (Σ_q t_q p_q^T) p_k = Σ_q t_q (p_q^T p_k)
  Case I, input patterns are orthonormal: p_q^T p_k = 0 for q ≠ k and p_k^T p_k = 1. Therefore the network output equals the target: W p_k = t_k.
  Case II, input patterns are normalized but not orthogonal: W p_k = t_k + Σ_{q≠k} t_q (p_q^T p_k), where the second term is the error.
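A minimal numerical check of the two cases; the prototype and target vectors are made up for the purpose of the demonstration:

```python
import numpy as np

# Case I: orthonormal prototypes -> recall is exact.
p1 = np.array([1.0, 0.0, 0.0])
p2 = np.array([0.0, 1.0, 0.0])
t1 = np.array([ 1.0, -1.0])
t2 = np.array([-1.0,  1.0])
W = np.outer(t1, p1) + np.outer(t2, p2)
assert np.allclose(W @ p1, t1)   # p_q^T p_k = 0 for q != k, so W p_k = t_k

# Case II: normalized but non-orthogonal prototypes -> an error term appears.
p1 = np.array([1.0, 0.0])
p2 = np.array([1.0, 1.0]) / np.sqrt(2)
W = np.outer(t1, p1) + np.outer(t2, p2)
err = W @ p1 - t1                # equals sum over q != k of t_q (p_q^T p_k)
print(err)                       # nonzero: here t2 * (p2^T p1)
```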

  7. Example
  Normalized Prototype Patterns: [Figure: banana and apple pixel patterns]
  Weight Matrix (Hebb Rule): W = T P^T
  Tests: banana, apple

  8. Pseudoinverse Rule - (1)
  Performance Index: F(W) = Σ_q ||t_q - W p_q||^2
  Matrix Form: F(W) = ||T - W P||^2 = ||E||^2, where E = T - W P and ||E||^2 = Σ_i Σ_j e_ij^2,
  with P = [p_1 p_2 ... p_Q] and T = [t_1 t_2 ... t_Q]

  9. Pseudoinverse Rule - (2)
  Minimize: F(W) = ||T - W P||^2
  If an inverse exists for P, F(W) can be made zero: W = T P^{-1}
  When an inverse does not exist, F(W) can be minimized using the pseudoinverse: W = T P^+, where P^+ = (P^T P)^{-1} P^T
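A sketch of the pseudoinverse rule using NumPy's Moore-Penrose pseudoinverse (`np.linalg.pinv`); the prototype patterns below are hypothetical, normalized but deliberately not orthogonal:

```python
import numpy as np

# Hypothetical normalized, non-orthogonal prototypes and targets.
P = np.array([[1.0, 1.0 / np.sqrt(2)],
              [0.0, 1.0 / np.sqrt(2)]])
T = np.array([[ 1.0, -1.0],
              [-1.0,  1.0]])

# Pseudoinverse rule: W = T P^+
W = T @ np.linalg.pinv(P)

# Unlike the Hebb rule, recall is exact even though the inputs are not orthogonal.
assert np.allclose(W @ P, T)
```

Here the columns of P are linearly independent, so the pseudoinverse solution drives the performance index F(W) all the way to zero.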

  10. Relationship to the Hebb Rule
  Hebb Rule: W = T P^T
  Pseudoinverse Rule: W = T P^+
  If the prototype patterns are orthonormal, then P^T P = I, so P^+ = (P^T P)^{-1} P^T = P^T and the two rules give the same weights.

  11. Example

  12. Autoassociative Memory
  The desired output equals the input (t_q = p_q), so the Hebb rule gives W = p_1 p_1^T + p_2 p_2^T + ... + p_Q p_Q^T = P P^T.

  13. Tests: 50% occluded, 67% occluded, noisy patterns (7 pixels)
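The occlusion test can be sketched as follows; the bipolar pattern is a made-up stand-in for the slide's pixel images, and `np.sign` plays the role of the hardlims output function:

```python
import numpy as np

# Autoassociative memory: targets equal the inputs, so W = sum_q p_q p_q^T.
# A single hypothetical bipolar (+1/-1) pattern stands in for the pixel images.
p = np.array([-1.0, 1.0, 1.0, -1.0, 1.0, -1.0])
W = np.outer(p, p)

# Occlude half the pattern (unknown pixels set to 0) and recall through hardlims.
probe = p.copy()
probe[:3] = 0.0
recall = np.sign(W @ probe)   # hardlims-style +1 / -1 output
assert np.allclose(recall, p)
```

Recall succeeds because W @ probe = p (p^T probe), and the surviving pixels give p^T probe > 0, so the sign of every element is restored.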

  14. Variations of Hebbian Learning
  Basic Rule: W^new = W^old + t_q p_q^T
  Learning Rate: W^new = W^old + α t_q p_q^T
  Smoothing: W^new = (1 - γ) W^old + α t_q p_q^T
  Delta Rule: W^new = W^old + α (t_q - a_q) p_q^T
  Unsupervised: W^new = W^old + α a_q p_q^T
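Of these variations, the delta rule can be sketched as an iterative update; the patterns, learning rate, and epoch count below are illustrative assumptions, not values from the slides:

```python
import numpy as np

# Hypothetical normalized, non-orthogonal prototypes and targets.
P = np.array([[1.0, 1.0 / np.sqrt(2)],
              [0.0, 1.0 / np.sqrt(2)]])
T = np.array([[ 1.0, -1.0],
              [-1.0,  1.0]])

# Delta rule: W_new = W_old + alpha * (t_q - a_q) p_q^T,
# repeated over the training set until the outputs approach the targets.
W = np.zeros((2, 2))
alpha = 0.1
for _ in range(1000):
    for q in range(P.shape[1]):
        a = W @ P[:, q]                              # current output
        W += alpha * np.outer(T[:, q] - a, P[:, q])  # move toward target

assert np.allclose(W @ P, T, atol=1e-3)
```

Unlike the one-shot Hebb rule, the delta rule keeps correcting the residual error, so it converges toward exact recall even for non-orthogonal inputs (for a small enough learning rate).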
