
Learning Algorithm of Multi-Layer Perceptron (MLP) Neural Networks

This document delves into the learning algorithm of Multi-Layer Perceptron (MLP) neural networks, focusing on forward and backward propagation processes. It includes computational details at each neuron and explains the role of error signals and gradients in optimizing performance. The backpropagation learning algorithm is emphasized, detailing weight adjustment rules and the significance of activation functions. Homework assignments challenge students to implement the algorithm in MATLAB, ensuring they understand practical applications and concepts such as minimizing error and achieving convergence.





Presentation Transcript


  1. Learning Algorithm of MLP Neural Networks (Multi Layer Perceptrons) [Diagram: layers of activation functions f(.); the function signal flows forward, the error signal flows backward] • Computations at each neuron j: • Neuron output, yj • Vector of error gradients, ∂E/∂wji • Forward propagation carries the function signal; backward propagation carries the error signal: the "Backpropagation Learning Algorithm"
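The per-neuron forward computation can be sketched as follows (a Python illustration; the function name and the choice of a logistic sigmoid for f(.) are ours, since the slide leaves the activation generic):

```python
import math

def neuron_output(weights, inputs):
    """Forward pass at a single neuron j: y_j = f(sum_i w_ji * x_i).
    A logistic sigmoid is assumed for f; the slides keep f(.) generic."""
    v = sum(w * x for w, x in zip(weights, inputs))  # induced local field v_j
    return 1.0 / (1.0 + math.exp(-v))                # neuron output y_j = f(v_j)
```

During backward propagation, the same neuron additionally accumulates the error-gradient components ∂E/∂wji used to adjust each incoming weight.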

  2. Neural Networks Multi Layer Perceptrons: Learning Algorithm of MLP [Diagram: network with inputs x1(i), x2(i), …, xm(i) and output yk(i)] • Goal: minimize the cost function / performance index • Weight modification rule
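The slide's formulas for the cost function and the weight modification rule did not survive transcription; in the standard form assumed here (with η a learning rate and dk(i) the desired output), they read:

```latex
E(i) = \frac{1}{2}\sum_k \bigl(d_k(i) - y_k(i)\bigr)^2,
\qquad
w_{kj} \leftarrow w_{kj} - \eta\,\frac{\partial E}{\partial w_{kj}}
```

Minimizing E by stepping each weight against its error gradient is the goal stated on the slide.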

  3. Neural Networks Multi Layer Perceptrons: Learning Algorithm of MLP [Diagram: network with inputs x1(i), x2(i), …, xm(i) and output yk(i)] • Backpropagation Learning Algorithm: • Learning on the output neuron • Learning on the hidden neurons

  4. Neural Networks Multi Layer Perceptrons: Learning Algorithm of MLP • Notations: the output of the k-th neuron of the l-th layer at the i-th time instant; the output of the j-th neuron of the (l−1)-th layer at the i-th time instant

  5. Neural Networks Multi Layer Perceptrons: Back Propagation Learning Algorithm • Learning on the output neuron • The gradient term depends on the activation function
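The slide's equations were lost in transcription; the standard delta rule for an output neuron k (assumed here, with vk the induced local field and f the activation function) is:

```latex
\delta_k(i) = \bigl(d_k(i) - y_k(i)\bigr)\, f'\bigl(v_k(i)\bigr),
\qquad
\Delta w_{kj}(i) = \eta\,\delta_k(i)\, y_j(i)
```

The factor f′(vk) is the part that "depends on the activation function."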

  6. Neural Networks Multi Layer Perceptrons: Back Propagation Learning Algorithm • Learning on the hidden neurons
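A hidden neuron j has no desired output to compare against, so its error signal is propagated back from the layer above; the standard form (assumed here, as the slide's equations were lost) is:

```latex
\delta_j(i) = f'\bigl(v_j(i)\bigr) \sum_k \delta_k(i)\, w_{kj},
\qquad
\Delta w_{ji}(i) = \eta\,\delta_j(i)\, x_i(i)
```

where the sum runs over the neurons k that receive yj as input, and xi denotes the input feeding weight wji.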

  7. Neural Networks Multi Layer Perceptrons: Back Propagation Learning Algorithm • The gradient terms for both the output layer and the hidden layers depend on the derivative of the activation function
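For the logistic sigmoid, this activation-dependent factor has a convenient closed form, f′(v) = y(1 − y), expressible through the neuron's own output. A small sketch (function names are ours):

```python
import math

def sigmoid(v):
    """Logistic activation f(v) = 1 / (1 + e^(-v))."""
    return 1.0 / (1.0 + math.exp(-v))

def sigmoid_deriv(y):
    """Derivative of the sigmoid written in terms of its output:
    f'(v) = y * (1 - y). This is the activation-dependent factor
    that appears in every delta term of backpropagation."""
    return y * (1.0 - y)
```

Reusing the already-computed output y avoids evaluating the exponential a second time during the backward pass, which is one reason the sigmoid is a common classroom choice.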

  8. Neural Networks Multi Layer Perceptrons: Back Propagation Learning Algorithm [Diagram: layers of activation functions f(.)] • Forward propagation: • Set the weights • Calculate the output • Backward propagation: • Calculate the error • Calculate the gradient vector • Update the weights
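The five steps above can be sketched for the simplest possible case, a single sigmoid neuron trained by gradient descent (a Python illustration; the sample data, learning rate, and function names are assumptions, not from the slides):

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def train_neuron(samples, epochs=2000, eta=0.5, seed=0):
    """Repeats the slide's five steps per sample: forward propagation
    (output), then backward propagation (error, gradient, update).
    Single sigmoid neuron, two inputs, no bias: a minimal sketch,
    not the full MLP."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(2)]   # 1. set the weights
    for _ in range(epochs):
        for x, d in samples:
            v = w[0] * x[0] + w[1] * x[1]
            y = sigmoid(v)                           # 2. calculate output
            e = d - y                                # 3. calculate error
            delta = e * y * (1.0 - y)                # 4. gradient factor
            w = [wj + eta * delta * xj               # 5. update the weights
                 for wj, xj in zip(w, x)]
    return w
```

Each epoch performs one forward pass and one backward pass per training sample; the same loop structure scales up to multi-layer networks once hidden-layer deltas are added.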

  9. Neural Networks Multi Layer Perceptrons: Influential Factors in Learning • Initial weights and biases • Cost function / performance index • Training data and generalization • Network structure: • Number of layers • Number of neurons • Interconnections • Learning methods: • Weight modification rule • Variable or fixed learning rate

  10. Neural Networks Multi Layer Perceptrons: Homework 4 • Write an m-file that will perform the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 2 neurons, and 1 output layer of 1 neuron, with no bias at all (all a = 1). • Be sure to obtain decreasing errors. • Note: Submit the hardcopy and softcopy of the m-file. • Hint: The number of parameters to be trained is six.
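As a rough illustration only (not a submission: the assignment asks for a MATLAB m-file, the training data is left to the student, and all names below are ours), a Python sketch of the assigned 2-2-1 structure with its six weights:

```python
import math
import random

def f(v):
    """Logistic activation; slope a = 1 as in the assignment."""
    return 1.0 / (1.0 + math.exp(-v))

def train_221(samples, epochs=5000, eta=0.5, seed=1):
    """Backpropagation for a 2-2-1 MLP with no biases: six weights in
    total (w1 holds 2x2 hidden weights, w2 holds 2 output weights).
    The training data passed in is an assumption for illustration;
    returns the per-epoch sum of squared errors."""
    rng = random.Random(seed)
    w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    w2 = [rng.uniform(-1, 1) for _ in range(2)]
    errors = []
    for _ in range(epochs):
        sse = 0.0
        for x, d in samples:
            # forward propagation
            h = [f(w1[j][0] * x[0] + w1[j][1] * x[1]) for j in range(2)]
            y = f(w2[0] * h[0] + w2[1] * h[1])
            e = d - y
            sse += 0.5 * e * e
            # backward propagation: output delta, then hidden deltas
            dk = e * y * (1.0 - y)
            dj = [dk * w2[j] * h[j] * (1.0 - h[j]) for j in range(2)]
            w2 = [w2[j] + eta * dk * h[j] for j in range(2)]
            for j in range(2):
                for i in range(2):
                    w1[j][i] += eta * dj[j] * x[i]
        errors.append(sse)
    return errors
```

With a fixed seed, the returned error sequence should decrease overall, which is the convergence check the assignment asks for.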

  11. Neural Networks Multi Layer Perceptrons: Homework 4A (Odd Student-ID) • Write an m-file that will perform the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 3 neurons, and 1 output layer of 1 neuron, with bias (all a = 1). • Be sure to obtain decreasing errors (convergence). • Note: Submit the hardcopy and softcopy of the m-file. • Hint: The number of parameters to be trained is eleven.

  12. Neural Networks Multi Layer Perceptrons: Homework 4A (Even Student-ID) • Write an m-file that will perform the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 3 neurons, and 1 output layer of 1 neuron, with bias (all a = 0.8). • Be sure to obtain decreasing errors (convergence). • Note: Submit the hardcopy and softcopy of the m-file. • Hint: The number of parameters to be trained is twelve.
