
Back Propagation Neural Networks (BPNN)



  1. Back Propagation Neural Networks (BPNN): a tutorial by KH Wong

  2. Introduction • Very popular • A high-performance multi-class classifier • Successful in handwritten optical character recognition (OCR), speech recognition, image noise removal, etc. • Easy to implement • Slow in learning • Fast in classification http://www.ninds.nih.gov/disorders/brain_basics/ninds_neuron.htm http://yann.lecun.com/exdb/mnist/

  3. Overview • Back Propagation Neural Networks (BPNN) • Part 1: feed-forward processing (classification in action / recognition process) • Part 2: feed-backward processing (training the network) • Appendix: Convolutional Neural Networks (CNN), http://cogprints.org/5869/1/cnn_tutorial.pdf

  4. Theory of Back Propagation Neural Net (BPNN) • Use many samples to train the weights (W) and biases (b), so the network can classify an unknown input into one of the learned classes • Will explain: • How to use it after training: the forward pass (classify / recognize an input) • How to train it: how to find the weights and biases (using forward and backward passes)

  5. Motivation • Biological findings inspire the development of neural nets • [Diagram: a biological neuron (dendrites as inputs, axon as output) compared with an artificial neuron: inputs, weights, logic/activation function, output]

  6. Optical character recognition (OCR) example • Training: train the system first by presenting many samples to the network • Recognition: when an image is input to the system, it tells which character it is • [Diagram: training sets the network's weights (W) and biases (b); afterwards an input image of '3' gives Output = '3']

  7. Part 1 (classification in action / recognition process) • Forward pass of the Back Propagation Neural Net (BPNN) • Assume the weights (W) and biases (b) have already been found by training (to be discussed in Part 2)

  8. Recognition: assume the weights (W) and biases (b) were found earlier • Each pixel of the input image is X(i,j) • [Diagram: for an input image of '3', Output0=0, Output1=0, Output2=0, Output3=1, and all remaining outputs are 0]

  9. Define a neuron of a feed-forward Back Propagation Neural Net (BPNN) • Inside each neuron: the inputs are weighted, summed with a bias, and passed through an activation function to produce the output (a code sketch follows) • [Diagram: inputs feeding one neuron, which produces one output]
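A minimal sketch of this neuron in Python/NumPy (the names `sigmoid` and `neuron` are illustrative, not from the slides); it computes the weighted sum of the inputs plus the bias and passes it through the sigmoid activation used throughout the tutorial:

```python
import numpy as np

def sigmoid(u):
    """Activation function f(u) = 1 / (1 + e^(-u))."""
    return 1.0 / (1.0 + np.exp(-u))

def neuron(x, w, b):
    """One BPNN neuron: inputs x, weights w, bias b -> output a = f(w.x + b)."""
    u = np.dot(w, x) + b   # internal activation u
    return sigmoid(u)      # neuron output a
```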

  10. Multi-layer structure of a BP neural network • [Diagram: a multi-layer network running from the input layer to the output layer]

  11. Neurons in the multi-layer structure • Between any two neighboring layers, a set of neurons can be found • [Diagram: each neuron connects the outputs of one layer to the inputs of the next]

  12. BPNN forward pass • The forward pass finds the output when an input is given. For example: • Assume we have used N = 60,000 images to train a network to recognize c = 10 numerals. • When an unknown image is given to the input, the output neuron corresponding to the correct answer will give the highest output level. • [Diagram: input image feeding 10 output neurons for the digits 0,1,2,...,9] (a code sketch of the forward pass follows)
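A sketch of the two-layer forward pass in Python/NumPy, following the matrix conventions of slide 13 (W1 stored as inputs x hidden, W2 as hidden x outputs); this is an assumed implementation, not code from the slides:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def forward_pass(P, W1, b1, W2, b2):
    """P: input pixel vector; W1, b1: hidden-layer weights and biases;
    W2, b2: output-layer weights and biases."""
    A1 = sigmoid(W1.T @ P + b1)    # hidden-layer outputs
    A2 = sigmoid(W2.T @ A1 + b2)   # output-layer outputs
    return A1, A2                  # np.argmax(A2) is the recognized class
```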

  13. Architecture (exercise: write the formulas for A1(i=4) and A2(k=3); they are given below) • Input: P = 9x1 vector, indexed by j • Hidden layer: 5 neurons indexed by i, with weights W1 = 9x5 and biases b1 = 5x1; neuron i receives W1(j,i)P(j) from each input j and outputs A1(i) • Output layer: 3 neurons indexed by k, with weights W2 = 5x3 and biases b2 = 3x1; neuron k receives W2(i,k)A1(i) from each hidden neuron i and outputs A2(k) • [Diagram: the fully connected 9-5-3 network with weights W1(j,i), W2(i,k) and biases b1(i), b2(k) labelled]
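The formulas the exercise asks for follow directly from the neuron definition (with f the sigmoid); in general, for hidden neuron i and output neuron k:

```latex
A_1(i) = f\left(\sum_{j=1}^{9} W_1(j,i)\,P(j) + b_1(i)\right), \qquad
A_2(k) = f\left(\sum_{i=1}^{5} W_2(i,k)\,A_1(i) + b_2(k)\right)
```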

  14. Answer (exercise: write values for A1(i=4) and A2(k=3)) • P = [0.7656 0.7344 0.9609 0.9961 0.9141 0.9063 0.0977 0.0938 0.0859] • W1(:,4) = [0.2112 0.1540 -0.0687 -0.0289 0.0720 -0.1666 0.2938 -0.0169 -0.1127] (the nine weights feeding hidden neuron i=4) • b1(4) = -0.1441 • %Find A1(i=4): A1_i_is_4 = 1/(1+exp(-(W1(:,4)'*P + b1(4)))) = 0.49
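A quick NumPy check of the arithmetic above (a sketch; the variable names are illustrative):

```python
import numpy as np

P = np.array([0.7656, 0.7344, 0.9609, 0.9961, 0.9141,
              0.9063, 0.0977, 0.0938, 0.0859])
w = np.array([0.2112, 0.1540, -0.0687, -0.0289, 0.0720,
              -0.1666, 0.2938, -0.0169, -0.1127])   # W1(:,4)
b = -0.1441                                          # b1(4)

u = w @ P + b                        # internal activation of hidden neuron i=4
A1_4 = 1.0 / (1.0 + np.exp(-u))
print(round(A1_4, 2))                # -> 0.49
```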

  15. Numerical example: architecture of the example • Input layer: 9x1 pixels • Output layer: 3x1

  16. Part 2: feed-backward processing (training the network) • Backward pass of the Back Propagation Neural Net (BPNN): training

  17. Training • How to train the weights (W) and biases (b), using forward and backward passes: • Initialize W and b randomly • For Iter = 1 to all_epochs (each 'Iter' is called an epoch): • Forward pass (same as the recognition process in Part 1) for each output neuron: use a training sample X with class label t, feed forward to find y, then Err = error_function(y - t) • Backward pass: find ΔW and Δb that reduce Err • Update: Wnew = Wold + ΔW; bnew = bold + Δb (a code sketch of this loop follows)
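A sketch of this training loop in Python/NumPy, assuming the sum-of-squares error of slide 18 and the sigmoid derivative f'(u) = f(u)(1 - f(u)) of slide 19; the learning rate `lr`, the epoch count, and the update form ΔW = -lr x gradient are assumptions, not values from the slides:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def train(samples, targets, W1, b1, W2, b2, lr=0.1, epochs=100):
    """samples: list of input vectors P; targets: one-hot class labels t."""
    for epoch in range(epochs):
        for P, t in zip(samples, targets):
            # forward pass (same as the recognition process in Part 1)
            A1 = sigmoid(W1.T @ P + b1)
            A2 = sigmoid(W2.T @ A1 + b2)
            # backward pass: deltas from the output error (y - t)
            d2 = (A2 - t) * A2 * (1.0 - A2)     # output layer
            d1 = (W2 @ d2) * A1 * (1.0 - A1)    # hidden layer
            # updates: Wnew = Wold + dW, with dW = -lr * gradient
            W2 -= lr * np.outer(A1, d2); b2 -= lr * d2
            W1 -= lr * np.outer(P, d1);  b1 -= lr * d1
    return W1, b1, W2, b2
```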

  18. The criterion used to train a network • Training is based on minimizing an overall error function, summed over the 'N' training samples and 'c' classes to be learned (the formula appeared as an image; a standard reconstruction follows)
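The standard choice of overall error function for a BPNN, consistent with the N samples and c classes mentioned (a reconstruction, not necessarily the slide's exact equation):

```latex
E = \frac{1}{2}\sum_{n=1}^{N}\sum_{k=1}^{c}\left(t_k^{(n)} - y_k^{(n)}\right)^2
```

where t_k^(n) is the target and y_k^(n) the network output for class k on sample n.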

  19. The sigmoid function f(u) and its derivative f'(u) http://mathworld.wolfram.com/SigmoidFunction.html http://link.springer.com/chapter/10.1007%2F3-540-59497-3_175#page-1
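The two formulas referenced by the links are:

```latex
f(u) = \frac{1}{1+e^{-u}}, \qquad f'(u) = f(u)\left(1-f(u)\right)
```

The second identity is what makes backpropagation with sigmoid neurons cheap: the derivative is computed directly from the neuron output that the forward pass already produced.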

  20. Feed-backward stage • Part 1: feed forward (studied before) • Part 2: feed backward

  21. Derivation • [The backward-pass derivation appeared as equation images on slides 21 and 22; a reconstructed sketch follows slide 22]

  22. Derivation (continued)
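A reconstructed sketch of the standard backpropagation derivation, using the error function of slide 18, the sigmoid derivative of slide 19, and the chain rule (the original slides' exact steps were not recoverable):

```latex
% Output layer, with y_k = f(u_k):
\delta_k = \frac{\partial E}{\partial u_k} = (y_k - t_k)\, f'(u_k)

% Hidden layer: the error is propagated back through the weights W2:
\delta_i = f'(u_i) \sum_{k} W_2(i,k)\, \delta_k

% Gradient-descent updates with learning rate \eta:
\Delta W_2(i,k) = -\eta\, \delta_k\, A_1(i), \qquad \Delta b_2(k) = -\eta\, \delta_k
\Delta W_1(j,i) = -\eta\, \delta_i\, P(j), \qquad \Delta b_1(i) = -\eta\, \delta_i
```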

  23. Procedure for training • From the last (output) layer, find (t - y), then find the deltas • Then find the weight changes ΔW and bias changes Δb for the whole network • Iterate (forward and backward passes) to generate new sets of W and b, until ΔW is small enough; then W and b are found • Takes a long time

  24. Example: a simple BPNN • Number of classes (number of output neurons) = 3 • Input: 9 pixels; each input is a 3x3 image • Training samples = 3 for each class • Number of hidden layers = 1 • Number of neurons in the hidden layer = 5 (a complete runnable sketch of this 9-5-3 network follows)
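A self-contained sketch of this 9-5-3 example in Python/NumPy; the random toy data stands in for the real 3x3 training images, and the weight initialization range, learning rate, and epoch count are assumptions, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Architecture from this slide: 9 inputs (3x3 image), 1 hidden layer of 5, 3 outputs.
n_in, n_hid, n_out = 9, 5, 3
W1 = rng.uniform(-0.5, 0.5, (n_in, n_hid));  b1 = np.zeros(n_hid)
W2 = rng.uniform(-0.5, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)

# Toy training data: 3 samples per class, 3 classes (9 samples of 9 pixels each).
X = rng.uniform(0.0, 1.0, (9, n_in))
T = np.repeat(np.eye(n_out), 3, axis=0)       # one-hot targets

lr = 0.5
for epoch in range(2000):
    for P, t in zip(X, T):
        A1 = sigmoid(W1.T @ P + b1)           # forward pass
        A2 = sigmoid(W2.T @ A1 + b2)
        d2 = (A2 - t) * A2 * (1.0 - A2)       # backward pass: output deltas
        d1 = (W2 @ d2) * A1 * (1.0 - A1)      # hidden deltas
        W2 -= lr * np.outer(A1, d2); b2 -= lr * d2
        W1 -= lr * np.outer(P, d1);  b1 -= lr * d1

# Recognition: the output neuron with the highest level gives the class.
for P, t in zip(X, T):
    A2 = sigmoid(W2.T @ sigmoid(W1.T @ P + b1) + b2)
    print(np.argmax(A2), "expected", np.argmax(t))
```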

  25. Summary • Learned what a Back Propagation Neural Network (BPNN) is • Learned the forward pass • Learned the backward pass and the training of the BPNN network
