
Notes on Backpropagation






Presentation Transcript


  1. Notes on Backpropagation Alex Churchill

  2. Feed Forward • Node C = sigmoid(A * weight_ca + B * weight_cb)  [diagram: inputs A and B feed hidden nodes C and D, which feed output node E]

  3. Feed Forward • Node C = sigmoid(0.1 * 0.1 + 0.7 * 0.5)

  4. Feed Forward • Node C = sigmoid(0.01 + 0.35) = 0.59

  5. Feed Forward • Node D = sigmoid(A * weight_da + B * weight_db)

  6. Feed Forward • Node D = sigmoid(0.1 * 0.3 + 0.7 * 0.2)

  7. Feed Forward • Node D = sigmoid(0.03 + 0.14) = 0.54

  8. Feed Forward • Node E = sigmoid(C * weight_ec + D * weight_ed)

  9. Feed Forward • Node E = sigmoid(0.59 * 0.2 + 0.54 * 0.1) = 0.542

  10. Feed Forward • Node E = sigmoid(0.59 * 0.2 + 0.54 * 0.1) = 0.542, so the network's output is 0.542
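The forward pass on slides 2–10 can be sketched as follows (a minimal sketch; the input values, weight names, and node labels are taken from the slides' worked example):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Inputs and weights from the worked example
a, b = 0.1, 0.7
w_ca, w_cb = 0.1, 0.5   # input -> hidden node C
w_da, w_db = 0.3, 0.2   # input -> hidden node D
w_ec, w_ed = 0.2, 0.1   # hidden -> output node E

c = sigmoid(a * w_ca + b * w_cb)   # sigmoid(0.36) ≈ 0.59
d = sigmoid(a * w_da + b * w_db)   # sigmoid(0.17) ≈ 0.54
e = sigmoid(c * w_ec + d * w_ed)   # ≈ 0.542
print(c, d, e)
```

The slides round each activation to two decimal places before feeding it forward, so the exact values here differ from the slides in the last digit.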

  11. Backpropagation • Calculate the error for each output neuron at the output layer (L) • For each hidden layer (L−1 down to L−n), pass the error backwards from the layer above • Update the weights connecting the last hidden layer (L−1) to the output layer (L) • Update the weights connecting each lower layer

  12. Backpropagation • Calculate the error (δ_k) for each output neuron (k) at the output layer (L). This is calculated using: δ_k = (y_k − t_k) * g'(x_k), where y_k is the neuron's output, t_k is the target, g' is the first derivative of the sigmoid (g'(x_k) = y_k * (1 − y_k)), and x_k is the pre-sigmoid input to the neuron

  13. Backpropagation • δ_k = (y_k − t_k) * g'(x_k) = (0.542 − 1) * 0.542 * (1 − 0.542) = −0.114 (target = 1, learning rate η = 1)

  14. Backpropagation • δ_k = (y_k − t_k) * g'(x_k) = (0.542 − 1) * 0.542 * (1 − 0.542) = −0.114, so δ_k = −0.114 at output node E (target = 1, learning rate η = 1)
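The output-layer error can be checked numerically (a sketch using the slides' rounded output y = 0.542 and target t = 1; for the sigmoid, g'(x) = y * (1 − y)):

```python
# Output-layer error: delta_k = (y_k - t_k) * g'(x_k),
# where g'(x) = y * (1 - y) for the sigmoid.
y, t = 0.542, 1.0
delta_k = (y - t) * y * (1 - y)
print(delta_k)   # ≈ -0.114
```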

  15. Backpropagation • 2. For each hidden layer (L−1 down to L−n), pass the error backwards from the layer above. This is calculated using: δ_j = (Σ_k w_kj * δ_k) * g'(x_j), where j is the hidden neuron and k is the output neuron (with a single output, as here, the sum has only one term)

  16. Backpropagation • δ_c = (w_ec * δ_k) * g'(x_c) = 0.2 * −0.114 * 0.59 * (1 − 0.59) = −0.0055 • δ_d = (w_ed * δ_k) * g'(x_d) = 0.1 * −0.114 * 0.54 * (1 − 0.54) = −0.0028

  17. Backpropagation • δ_c = (w_ec * δ_k) * g'(x_c) = 0.2 * −0.114 * 0.59 * (1 − 0.59) = −0.0055 • δ_d = (w_ed * δ_k) * g'(x_d) = 0.1 * −0.114 * 0.54 * (1 − 0.54) = −0.0028, so the hidden nodes now carry δ_c = −0.0055 and δ_d = −0.0028
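The hidden-layer errors above can be reproduced in the same way (a sketch using the slides' rounded values; variable names are mine, not from the slides):

```python
# Pass the output error back to hidden nodes C and D:
# delta_j = (w_kj * delta_k) * g'(x_j), with g'(x) = a_j * (1 - a_j).
delta_k = -0.114            # output-layer error from the previous step
w_ec, w_ed = 0.2, 0.1       # hidden -> output weights
a_c, a_d = 0.59, 0.54       # hidden activations from the forward pass
delta_c = (w_ec * delta_k) * a_c * (1 - a_c)
delta_d = (w_ed * delta_k) * a_d * (1 - a_d)
print(delta_c, delta_d)   # ≈ -0.0055, -0.0028
```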

  18. Backpropagation • 3. Update the weights connecting the last hidden layer (L−1) to the output layer (L). This is calculated using: w_jk ← w_jk − η * δ_k * a_j, where j is the hidden neuron, k is the output neuron, and a_j is the sigmoided output of the hidden neuron

  19. Backpropagation • w_ec ← w_ec − η * δ_k * a_c = 0.2 − 1 * −0.114 * 0.59 = 0.267 • w_ed ← w_ed − η * δ_k * a_d = 0.1 − 1 * −0.114 * 0.54 = 0.162 (δ_k = −0.114, target = 1, learning rate η = 1)

  20. Backpropagation • w_ec ← 0.2 − 1 * −0.114 * 0.59 = 0.267 • w_ed ← 0.1 − 1 * −0.114 * 0.54 = 0.162, so the connections into E now carry weights 0.267 and 0.162
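The hidden-to-output update can be sketched as (note the subtraction of a negative error is what increases both weights):

```python
# Update hidden -> output weights: w_jk <- w_jk - eta * delta_k * a_j
eta = 1.0                   # learning rate from the slides
delta_k = -0.114            # output-layer error
a_c, a_d = 0.59, 0.54       # hidden activations
w_ec, w_ed = 0.2, 0.1       # old weights
w_ec = w_ec - eta * delta_k * a_c   # 0.2 + 0.114*0.59 ≈ 0.267
w_ed = w_ed - eta * delta_k * a_d   # 0.1 + 0.114*0.54 ≈ 0.162
print(w_ec, w_ed)
```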

  21. Backpropagation • 4. Update the weights connecting each lower layer. This is calculated using the same rule, w_jk ← w_jk − η * δ_k * a_j, where j is the hidden neuron (or input neuron) in the layer below and k is the hidden neuron in the layer above (for an input neuron, a_j is the raw input value)

  22. Backpropagation • w_ca ← w_ca − η * δ_c * A = 0.1 − 1 * −0.0055 * 0.1 = 0.10055 ≈ 0.1005 • w_da ← w_da − η * δ_d * A = 0.3 − 1 * −0.0028 * 0.1 = 0.30028 ≈ 0.3003 (target = 1, learning rate η = 1)

  23. Backpropagation • w_ca ← 0.1 − 1 * −0.0055 * 0.1 ≈ 0.1005 • w_da ← 0.3 − 1 * −0.0028 * 0.1 ≈ 0.3003, so the connections from input A now carry weights 0.1005 and 0.3003 (the weights from input B are updated the same way)
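The input-to-hidden update shown on slides 22–23 can be sketched the same way (only the weights from input A, as on the slides; the weights from B follow identically with a_j = 0.7):

```python
# Update input -> hidden weights with the hidden-layer errors;
# for an input neuron, a_j is just the raw input value.
eta = 1.0
a_in = 0.1                        # value of input A
delta_c, delta_d = -0.0055, -0.0028
w_ca, w_da = 0.1, 0.3             # old weights from A
w_ca = w_ca - eta * delta_c * a_in   # 0.1 + 0.0055*0.1 = 0.10055
w_da = w_da - eta * delta_d * a_in   # 0.3 + 0.0028*0.1 = 0.30028
print(w_ca, w_da)
```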

  24. Feed forward

  25. Iris

  26. Iris
