
The Artificial Neuron


Presentation Transcript


  1. The Artificial Neuron. Ranga Rodrigo, February 8, 2014.

  2. Introduction • The basic building block of an artificial neural network is the artificial neuron. • The neuron sums the weighted inputs. • If this sum exceeds a threshold value, the neuron fires and a signal is transmitted via the axon to other neurons. • In this lecture, we learn about the artificial neuron.

  3. Artificial Neuron [Figure: diagram of an artificial neuron. Inputs x0 = 1, x1, x2, x3, …, xD with weights w0, w1, w2, w3, …, wD feed a summing node whose activation a passes through the activation function f to give the output y, i.e. $a = \sum_{d=0}^{D} w_d x_d$ and $y = f(a)$.]
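To make the slide's computation concrete, here is a minimal sketch of the forward pass in Python/NumPy. The function name neuron_output and the example weights and inputs are illustrative, not from the slides.

```python
import numpy as np

def neuron_output(x, w, f):
    """Sum the weighted inputs (with x0 = 1 prepended), then apply f."""
    x_aug = np.concatenate(([1.0], x))  # bias input x0 = 1
    a = np.dot(w, x_aug)                # activation a = sum_d w_d * x_d
    return f(a)                         # output y = f(a)

# Illustrative values: three inputs and an identity activation.
x = np.array([0.5, -1.0, 2.0])          # x1, x2, x3
w = np.array([0.1, 0.4, -0.2, 0.3])     # w0 (bias weight), w1, w2, w3
print(neuron_output(x, w, f=lambda a: a))
```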

  4. Activation Functions [Figure: plots of candidate activation functions f.]
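The plots themselves are not recoverable from the transcript. As a sketch, the code below plots a few activation functions commonly presented at this point (step, identity, sigmoid, tanh); that these were the ones on the slide is an assumption.

```python
import numpy as np
import matplotlib.pyplot as plt

# Candidate activation functions (the exact set on slide 4 is an assumption).
activations = {
    "step":     lambda a: np.where(a >= 0, 1.0, 0.0),
    "identity": lambda a: a,
    "sigmoid":  lambda a: 1.0 / (1.0 + np.exp(-a)),
    "tanh":     np.tanh,
}

a = np.linspace(-5.0, 5.0, 500)
for name, f in activations.items():
    plt.plot(a, f(a), label=name)
plt.xlabel("activation a")
plt.ylabel("f(a)")
plt.legend()
plt.show()
```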

  5. Perceptron • The perceptron is a single-layer NN with a step activation function. • Because of its step activation function, the perceptron takes only two different output values, so it can classify the vectors applied at its input into one of two classes.
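A minimal sketch of the two-valued output described here, assuming the step activation maps the weighted sum to class labels 0 and 1 (the labels and example weights are assumptions; the slides do not show them).

```python
import numpy as np

def perceptron_predict(x, w):
    """Step activation: the perceptron outputs only one of two values."""
    a = w[0] + np.dot(w[1:], x)   # weighted sum; w[0] is the bias weight
    return 1 if a >= 0 else 0     # one class on each side of the boundary

# Illustrative 2-D example: w defines a line separating the two classes.
w = np.array([-1.0, 2.0, 1.0])                        # w0, w1, w2
print(perceptron_predict(np.array([1.0, 0.5]), w))    # -> 1
print(perceptron_predict(np.array([-1.0, 0.0]), w))   # -> 0
```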

  6. Question • Sketch the perceptron for two-dimensional (2-D) data of the form $\mathbf{x} = (x_1, x_2)^T$. • How many weight parameters are there to be learned in this case?

  7. Learning • Learning means adjusting the weights. • We adjust the weights by presenting a set of input vectors with known desired (target) values. • If the desired value and the output of the NN differ, there is an error. • We present these vectors one at a time. • We adjust the weights whenever the output of the NN differs from the desired value. • We repeat the process until the sum of errors becomes smaller than a threshold.
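The loop these bullets describe can be sketched as follows. The learning rate eta, the stopping threshold, and the use of the error-driven update made explicit on slide 11 are assumptions for illustration.

```python
import numpy as np

def train(X, t, eta=0.1, threshold=1e-3, max_epochs=1000):
    """Present input vectors one at a time; adjust weights when output and target differ."""
    w = np.zeros(X.shape[1] + 1)                  # weights, including bias w0
    for _ in range(max_epochs):
        total_error = 0.0
        for x_n, t_n in zip(X, t):                # one vector at a time
            x_aug = np.concatenate(([1.0], x_n))  # x0 = 1
            y_n = np.dot(w, x_aug)                # identity activation
            error = t_n - y_n                     # desired minus actual output
            w += eta * error * x_aug              # adjust weights on error
            total_error += 0.5 * error**2
        if total_error < threshold:               # stop when the summed error is small
            return w
    return w
```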

  8. Objective Function (Error) • Here we consider the sum of squared errors as the objective function and an identity activation function. • Given a training set comprising a set of input vectors $\{\mathbf{x}_n\}$, $n = 1, \ldots, N$, together with a corresponding set of target values $\{t_n\}$, we minimize the error function $E(\mathbf{w}) = \frac{1}{2} \sum_{n=1}^{N} \left( y(\mathbf{x}_n, \mathbf{w}) - t_n \right)^2$, where $t_n$ is the true value for the nth input vector and $y(\mathbf{x}_n, \mathbf{w})$ is the output of the NN for the nth input vector.
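A direct transcription of this objective in code, assuming the identity activation the slide names; the helper name sse_error and the example values are mine.

```python
import numpy as np

def sse_error(w, X, t):
    """E(w) = 1/2 * sum_n (y(x_n, w) - t_n)^2 with identity activation."""
    X_aug = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend x0 = 1 to each row
    y = X_aug @ w                                     # y(x_n, w) for every n
    return 0.5 * np.sum((y - t) ** 2)

# Illustrative values (not from the slides).
X = np.array([[0.0, 1.0], [1.0, 0.0]])
t = np.array([1.0, 0.0])
print(sse_error(np.array([0.5, -0.5, 0.5]), X, t))
```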

  9. With the identity activation function, the output of the NN is $y(\mathbf{x}_n, \mathbf{w}) = \sum_{d=0}^{D} w_d x_{nd}$, where $x_{nd}$ is the dth component of the nth input vector.

  10. Substituting the output of the NN for the nth input vector into the error function gives $E(\mathbf{w}) = \frac{1}{2} \sum_{n=1}^{N} \left( \sum_{d=0}^{D} w_d x_{nd} - t_n \right)^2$.
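From this form of the error, the gradient with respect to the weights follows by differentiation; the vectorized rendering below is a sketch of that step, with the helper name sse_gradient my own.

```python
import numpy as np

def sse_gradient(w, X, t):
    """dE/dw_d = sum_n (y(x_n, w) - t_n) * x_nd for the identity-activation NN."""
    X_aug = np.hstack([np.ones((X.shape[0], 1)), X])  # x0 = 1 per row
    y = X_aug @ w                                     # outputs for all n
    return X_aug.T @ (y - t)                          # gradient vector, one entry per w_d
```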

  11. Gradient Descent Rule • Given a single training pattern, the weights are updated using the Widrow-Hoff learning rule: $w_d \leftarrow w_d + \eta \left( t_n - y(\mathbf{x}_n, \mathbf{w}) \right) x_{nd}$, where $\eta$ is the learning rate.
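A single-pattern sketch of this update; the function name and the default learning rate value are illustrative.

```python
import numpy as np

def widrow_hoff_update(w, x_n, t_n, eta=0.1):
    """One Widrow-Hoff step: w_d <- w_d + eta * (t_n - y_n) * x_nd."""
    x_aug = np.concatenate(([1.0], x_n))   # augmented input with x0 = 1
    y_n = np.dot(w, x_aug)                 # identity activation output
    return w + eta * (t_n - y_n) * x_aug   # move against the error gradient
```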

  12. Homework • Plot the activation functions as shown in slide 4. • Slide 12 shows the perceptron algorithm. What are the expressions that fill the blanks in this flow chart? • Write the perceptron algorithm as shown in slide 12.
