
Introduction to Artificial Intelligence (G51IAI)


Presentation Transcript


  1. Introduction to Artificial Intelligence (G51IAI) Dr Matthew Hyde Neural Networks • More precisely: • “Artificial Neural Networks” • Simulating, on a computer, what we understand about neural networks in the brain

  2. Lecture Outline • Biology • History • Perceptrons • Multi-Layer Networks (Lecture 1) • The Neuron’s Activation Function • Linear Separability • Learning / Training • Time Steps (Lecture 2)

  3. Biology

  4. Neural Networks

  5. Neural Networks • We have between 15 and 33 billion neurons • A single neuron may connect to as many as 10,000 other neurons • Many neurons die as we progress through life • Yet we continue to learn

  6. From a computational point of view, the fundamental processing unit of a brain is a neuron

  7. A neuron consists of a cell body (soma with a nucleus) • Each neuron has a number of dendrites which receive signals from other neurons

  8. Each neuron also has an axon, which goes out and splits into a number of strands to make a connection to other neurons • The point at which neurons join other neurons is called a synapse

  9. Signals move between neurons via electrochemical reactions • The synapses release a chemical transmitter which enters the dendrite. This raises or lowers the electrical potential of the cell body

  10. The soma sums the inputs it receives and once a threshold level is reached an electrical impulse is sent down the axon (often known as firing) • This increases or decreases the electrical potential of another cell (excitatory or inhibitory)

  11. Long-term firing patterns formed – basic learning • Plasticity of the network • Long term changes as patterns are repeated

  12. Videos • http://www.youtube.com/watch?v=sQKma9uMCFk • http://www.youtube.com/watch?v=-SHBnExxub8 (short neuron animation) • http://www.youtube.com/watch?v=vvxXnQuvTD8&NR=1 (various visualisations)

  13. History

  14. Neural Networks • McCulloch & Pitts (1943) are generally recognised as the designers of the first neural network • One neuron • The idea of a threshold • Many of their ideas are still used today

  15. Neural Networks • Hebb (1949) developed the first learning rule • The McCulloch & Pitts network has fixed weights • If two neurons were active at the same time, the strength of the connection between them should be increased • Rosenblatt (1958) • The ‘perceptron’ • Same architecture as McCulloch & Pitts, but with variable weights

  16. Neural Networks • During the 1950s and 1960s • Many researchers worked on the perceptron amidst great excitement • This model can be proved to converge to a correct set of weights (when one exists) • A more powerful learning algorithm than Hebb’s

  17. Neural Networks • 1969 saw the death of neural network research • Minsky & Papert showed that the perceptron cannot learn certain types of important functions • ANN research went into decline for about 15 years

  18. Neural Networks • Only in the mid 80s was interest revived • Parker (1985) and LeCun (1986) independently rediscovered multi-layer networks to solve problems that are not linearly separable • Bryson & Ho had published an effective learning algorithm, ‘backpropagation’, in 1969 • Werbos and others made the link to neural networks in the late 70s and early 80s

  19. Perceptrons (single layer, feed forward networks)

  20. The First Neural Networks The network consisted of: • A set of inputs (dendrites) • A set of weights (synapses) • A processing element (neuron) • A single output (axon)

  21. McCulloch and Pitts Networks [Diagram: inputs X1 and X2 (weight 2 each) and X3 (weight -1) feeding neuron Y] The activation of a neuron is binary. That is, the neuron either fires (activation of one) or does not fire (activation of zero).

  22. McCulloch and Pitts Networks θ = threshold Output function: If (input sum < Threshold) output 0 Else output 1

  23. McCulloch and Pitts Networks Each neuron has a fixed threshold. If the net input into the neuron is greater than or equal to the threshold, the neuron fires.
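The fire-or-not rule on these slides can be sketched in a few lines of Python. The function name `mp_neuron` is ours; the weights 2, 2 and -1 and the threshold of 4 follow the slide diagrams:

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: weighted sum of binary inputs vs. a fixed threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

# x1 and x2 are excitatory (weight 2), x3 is inhibitory (weight -1).
print(mp_neuron([1, 1, 0], [2, 2, -1], 4))  # net 4 >= 4, so the neuron fires: 1
print(mp_neuron([1, 1, 1], [2, 2, -1], 4))  # net 3 < 4, inhibited: 0
```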

  24. McCulloch and Pitts Networks Neurons in a McCulloch-Pitts network are connected by directed, weighted paths.

  25. McCulloch and Pitts Networks • If the weight on a path is positive the path is excitatory, otherwise it is inhibitory • x1 and x2 encourage the neuron to fire • x3 prevents the neuron from firing

  26. McCulloch and Pitts Networks The threshold is set such that any non-zero inhibitory input will prevent the neuron from firing (this rule is specific to McCulloch-Pitts networks!)

  27. McCulloch and Pitts Networks It takes one time step for a signal to pass over one connection.

  28. Worked Examples on Handout 1 [Diagram: inputs 1, 1 and 0 on paths with weights 2, 1.5 and 0.5; Threshold (θ) = 4] Does this neuron fire? Does it output a 0 or a 1? • Multiply the inputs to the neuron by the weights on their paths: 2, 1.5 and 0 • Add the results: 2 + 1.5 + 0 = 3.5 • Apply the threshold function: 3.5 < 4, so the neuron outputs 0 Threshold Function: If input sum < Threshold return 0 Else return 1
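The same arithmetic can be checked in Python. The inputs and weights below are as read off the slide diagram, and `mp_neuron` is our name for the threshold unit:

```python
def mp_neuron(inputs, weights, threshold):
    """Weighted sum of inputs compared against a fixed threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

inputs = [1, 1, 0]
weights = [2.0, 1.5, 0.5]
net = sum(x * w for x, w in zip(inputs, weights))
print(net)                            # 2 + 1.5 + 0 = 3.5
print(mp_neuron(inputs, weights, 4))  # 3.5 < 4, so the neuron outputs 0
```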

  29. Answers • Using McCulloch-Pitts model we can model some logic functions • In the exercise, you have been working on logic functions • AND • OR • NOT AND

  30. Answers AND Function [Diagram: inputs X and Y, weight 1 each, feed neuron Z; Threshold (θ) = 2] Threshold Function: If input sum < Threshold return 0 Else return 1
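The AND unit above can be sketched directly (weights 1 and 1, threshold 2, per the slide; `mp_neuron` is our name for the threshold unit):

```python
def mp_neuron(inputs, weights, threshold):
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

# AND: fires only when both inputs are 1, since 1 + 1 = 2 meets the threshold.
for x in (0, 1):
    for y in (0, 1):
        print(x, y, mp_neuron([x, y], [1, 1], 2))
```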

  31. Answers OR Function [Diagram: inputs X and Y, weight 2 each, feed neuron Z; Threshold (θ) = 2] Threshold Function: If input sum < Threshold return 0 Else return 1

  32. Answers (This one is not a McCulloch-Pitts Network) NOT AND (NAND) Function [Diagram: inputs X and Y, weight -1 each, feed neuron Z; Threshold (θ) = -1] Threshold Function: If input sum < Threshold return 0 Else return 1
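The NAND unit follows the same pattern with weights -1 and -1 and a threshold of -1. As the slide notes, the negative weights and negative threshold break the McCulloch-Pitts inhibition rule, but the threshold function itself is unchanged (`mp_neuron` is our name for it):

```python
def mp_neuron(inputs, weights, threshold):
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

# NAND: only 1 AND 1 gives net -2, which falls below the threshold of -1.
for x in (0, 1):
    for y in (0, 1):
        print(x, y, mp_neuron([x, y], [-1, -1], -1))
```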

  33. One additional example AND NOT Function [Diagram: input X with weight 2 and input Y with weight -1 feed neuron Z; Threshold (θ) = 2] Threshold Function: If input sum < Threshold return 0 Else return 1
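AND NOT (X fires, Y does not) uses weight 2 on X, weight -1 on Y and a threshold of 2, as on the slide; `mp_neuron` is our name for the threshold unit:

```python
def mp_neuron(inputs, weights, threshold):
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

# X AND NOT Y: only X=1, Y=0 gives net 2, which meets the threshold.
for x in (0, 1):
    for y in (0, 1):
        print(x, y, mp_neuron([x, y], [2, -1], 2))
```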

  34. Multi-Layer Neural Networks

  35. XOR Function [Diagram: X1 and X2 feed hidden units Y1 and Y2 (weights 2 direct, -1 crossed), which feed Z with weight 2 each] Modelling Logic Functions XOR X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)

  36. Modelling Logic Functions [Diagram: hidden unit Y2 computes X2 AND NOT X1, with weights 2 and -1] X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)

  37. Modelling Logic Functions [Diagram: hidden unit Y1 computes X1 AND NOT X2, with weights 2 and -1] X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)

  38. Modelling Logic Functions XOR [Diagram: output unit Z computes Y1 OR Y2, weight 2 each] X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)

  39. Modelling Logic Functions X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)
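The two-layer construction above can be composed in Python: two AND NOT units (weights 2 and -1, threshold 2) feeding an OR unit (weight 2 each, threshold 2). The function names are ours:

```python
def mp_neuron(inputs, weights, threshold):
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

def xor(x1, x2):
    y1 = mp_neuron([x1, x2], [2, -1], 2)   # X1 AND NOT X2
    y2 = mp_neuron([x2, x1], [2, -1], 2)   # X2 AND NOT X1
    return mp_neuron([y1, y2], [2, 2], 2)  # Y1 OR Y2

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))  # fires only when exactly one input is 1
```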

  40. Key Idea! • Perceptrons cannot learn (cannot even represent) the XOR function • We will see why next lecture • Multi-Layer Networks can, as we have just shown

  41. Worked Examples [Diagram: inputs 1, 1 and 0 on paths with weights 2, 1.5 and 0.5; Threshold (θ) = 4] Does this neuron fire? Does it output a 0 or a 1? • Multiply the inputs to the neuron by the weights on their paths: 2, 1.5 and 0 • Add the results: 2 + 1.5 + 0 = 3.5 • Apply the threshold function: 3.5 < 4, so the neuron outputs 0 Threshold Function: If input sum < Threshold return 0 Else return 1

  42. Example of Learning with a multi-layer neural network http://www.youtube.com/watch?v=0Str0Rdkxxo http://matthewrobbins.net/Projects/NeuralNetwork.html Provided courtesy of Matthew Robbins, games programmer, Melbourne, Australia

  43. Example of Learning with a multi-layer neural network [Diagram: network with two outputs, left wheel speed and right wheel speed] “Fully connected”: all neurons are connected to all the neurons in the next level
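A fully connected layer can be sketched as a weight matrix with one row per output unit: every output sees every input. The layer sizes, sensor values and random weights below are purely illustrative (the slide does not give them), and no activation function is applied, to keep the sketch minimal:

```python
import random

def layer(inputs, weights):
    """One fully connected layer: each row of weights drives one output unit."""
    return [sum(x * w for x, w in zip(inputs, row)) for row in weights]

random.seed(0)
sensors = [0.2, 0.9, 0.4]  # hypothetical sensor readings
hidden_w = [[random.uniform(-1, 1) for _ in sensors] for _ in range(4)]
out_w = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]

hidden = layer(sensors, hidden_w)
left_speed, right_speed = layer(hidden, out_w)  # two outputs: wheel speeds
print(left_speed, right_speed)
```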

  44. Lecture Summary • Biology • History (remember the names and what they did) • Perceptrons • Multi-Layer Networks
