PSY105 Neural Networks 3/5


Presentation Transcript


  1. PSY105 Neural Networks 3/5 3. “Machines that change themselves”

  2. Lecture 1 recap • We can describe patterns at one level of description that emerge due to rules followed at a lower level of description. • Neural network modellers hope that we can understand behaviour by creating models of networks of artificial neurons.

  3. Lecture 2 recap • Simple model neurons • Transmit a signal of 0 or 1 (or values in between) • Receive information from other neurons • Weight this information • Can be used to perform any computation
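The "simple model neuron" in this recap can be sketched in a few lines of Python. This sketch, and the weights and threshold in the example, are illustrative, not part of the lecture:

```python
# A minimal sketch of a threshold logic unit (TLU): it weights its
# inputs, sums them, and fires (outputs 1) if the sum reaches a threshold.

def tlu(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs reaches the threshold, else 0."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Illustrative example: a two-input AND unit (weights and threshold chosen by hand)
print(tlu([1, 1], [1, 1], threshold=2))  # -> 1
print(tlu([1, 0], [1, 1], threshold=2))  # -> 0
```

With suitable weights and thresholds, units like this implement the other Boolean functions too, which is what underwrites the Turing-completeness claim on the next slide.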

  4. Networks of such neurons are Turing complete [Image: Alan Turing, 1912–1954]

  5. Question: How could you use these simple neurons (TLUs) to compute the NOR (‘NOT OR’) function?

  6. Computing with neurons: NOR (a clue) [Diagram: Input 1 (varies), Input 2 (varies), and a tonically active input (always = 1) feed through weights into a unit (activation ‘?’) that produces the output.]

  7. Computing with neurons: NOR (one way) [Diagram as on the previous slide, now with values filled in: threshold = 1, weight 1 = -1, weight 2 = -1, weight 3 = +1 on the tonically active input.]
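The slide's solution can be checked directly by simulating the unit over all four input patterns. The `tlu` helper below is an illustrative implementation, not course code; the weights and threshold are the ones given on the slide:

```python
# Checking the slide's NOR solution: threshold = 1, weight 1 = -1,
# weight 2 = -1, weight 3 = +1 on the tonically active input (always 1).

def tlu(inputs, weights, threshold):
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

for a in (0, 1):
    for b in (0, 1):
        out = tlu([a, b, 1], [-1, -1, 1], threshold=1)
        print(a, b, "->", out)  # fires only when both inputs are 0
```

Only the (0, 0) pattern gives a weighted sum of 1 and reaches the threshold; any active input drags the sum to 0 or below, so the unit computes NOR.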

  8. Mechanistic advantages of neural networks • Parallel rather than serial processing • Speed advantage • Robust • Graceful degradation • Learning • Do not require full designer specification

  9. Successes of neural networks http://www.youtube.com/watch?v=-KxjVlaLBmk#t=2m39s

  10. Theoretical advantages of neural networks • Biological verisimilitude • But see earlier discussion of levels • Forces scientist to specify • Both problem and solution • Learning (again) • ‘Strong representational change’ • They don’t have the answers programmed in

  11. [Image: SC 198612. Dudelange, Luxembourg. Painted white to blend with snow-covered terrain, an M-36 tank destroyer crosses a field (3 Jan 1945). Signal Corps Photo #ETO-HQ-45-5944 (Hustead). http://www.history.army.mil/reference/bulge/images.htm]

  12. Learning • Usually happens gradually • Hence ‘learning curve’ • Must be related to some physical change in the brain • Can we describe rules that explain learning?

  13. Braitenberg learning?

  14. Classical conditioning • http://www.youtube.com/watch?v=Eo7jcI8fAuI

  15. Classical conditioning • Unconditioned stimulus (UCS): taste of food • Unconditioned response (UCR): salivation • Conditioned stimulus (CS): ringing of bell

  18. Modelling classical conditioning [Diagram: a layer of stimulus units connected to a layer of response units.]

  20. Modelling classical conditioning [Diagram: stimulus units CS2, CS1, and UCS projecting to response units.]

  21. How many ways are there to implement classical conditioning?

  22. Modelling classical conditioning [Diagram: stimulus units CS2, CS1, UCS; an S-R link connects stimuli directly to responses.]

  23. Modelling classical conditioning [Diagram: stimulus units CS2, CS1, UCS; an S-S link connects stimuli to each other, with responses driven via the UCS.]

  24. Experiments [Diagram: the S-S link model (CS2, CS1, UCS), with question marks on the links from stimuli to responses.]

  25. Stimulus presentation [Diagram: a stimulus trace over time, switching between Stimulus Off and Stimulus On.]

  26. S-S or S-R link? • Experimental results can be found that support the existence of both kinds of link • Ultimately it seems that conditioning relies on internal representations, not just links

  27. Learning Rules “When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased.” Hebb, D.O. (1949), The organization of behavior, New York: Wiley

  28. Operationalising the Hebb Rule • Turn “When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased.” • …into a simple equation: a rule for changing weights according to inputs and outputs
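One common way to write such an equation (an illustrative choice, not necessarily the form this module goes on to use) is Δw = η · x · y: the weight change is proportional to the product of pre-synaptic activity x and post-synaptic activity y, scaled by a learning rate η. A toy Python sketch applies this to the CS weight from the conditioning slides; all the values here (η, the innate UCS weight, the number of trials) are assumptions for illustration:

```python
# Toy Hebbian account of conditioning: delta_w = eta * x * y.
# The CS->response weight starts at zero; pairing CS with UCS on each
# trial strengthens it, because the UCS guarantees a post-synaptic response.

eta = 0.1          # learning rate (assumed value)
w_cs = 0.0         # CS -> response weight, starts untrained
w_ucs = 1.0        # UCS -> response weight, innate and fixed here

for trial in range(20):                  # pair CS and UCS on every trial
    cs, ucs = 1.0, 1.0
    response = cs * w_cs + ucs * w_ucs   # post-synaptic activity y
    w_cs += eta * cs * response          # Hebbian update on the CS weight

print(w_cs)   # the CS alone now drives a substantial response
```

Note that this raw form only ever strengthens weights, so `w_cs` grows without bound as trials continue; that runaway growth is a well-known limitation of the plain Hebb rule.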