MULTILAYER PERCEPTRON


Presentation Transcript


  1. MULTILAYER PERCEPTRON Nurochman, Teknik Informatika UIN Sunan Kalijaga Yogyakarta

  2. Review SLP (slide diagram): inputs X1, X2, X3 are multiplied by their weights wi and summed, net = Σ xi·wi; an activation function f is applied to the sum to produce the output f(y).

  3. Activation Functions • Binary step function (hard limit) • Binary step function (threshold)

  4. Activation Functions • Bipolar function • Bipolar function with threshold

  5. Activation Functions • Linear (identity) function • Binary sigmoid function
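The activation functions listed on slides 3–5 can be sketched in Python. This is an illustrative sketch, not code from the slides; the function names are mine:

```python
import math

def hard_limit(net):
    # binary step (hard limit): 1 if net >= 0, else 0
    return 1 if net >= 0 else 0

def threshold(net, theta):
    # binary step with threshold theta
    return 1 if net >= theta else 0

def bipolar(net):
    # bipolar step: outputs +1 or -1
    return 1 if net >= 0 else -1

def bipolar_threshold(net, theta):
    # bipolar step with threshold theta
    return 1 if net >= theta else -1

def identity(net):
    # linear (identity) function
    return net

def binary_sigmoid(net):
    # binary sigmoid: smooth, output in (0, 1)
    return 1.0 / (1.0 + math.exp(-net))
```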

  6. Learning Algorithm • Initialize the learning rate (α), the threshold (θ), the weights, and the bias • Compute the net input net = Σ xi·wi + b • Compute the output y = f(net)

  7. Learning Algorithm • If y ≠ target, update the weights and bias: Wi(new) = Wi(old) + α·t·Xi and b(new) = b(old) + α·t • Repeat from step 2 until no weights are updated
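The update rule above can be sketched in Python. This is an illustrative implementation, not the slides' code; it uses bipolar targets t ∈ {-1, +1} so that the α·t·Xi update moves the weights in the correct direction for both classes:

```python
def train_perceptron(data, alpha=1.0, max_epochs=100):
    """Perceptron rule from the slides: on a miss, Wi += alpha*t*Xi, b += alpha*t.

    data: list of (x, t) pairs with bipolar inputs and targets in {-1, +1}.
    """
    n = len(data[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(max_epochs):
        updated = False
        for x, t in data:
            net = sum(wi * xi for wi, xi in zip(w, x)) + b
            y = 1 if net >= 0 else -1
            if y != t:  # update only on misclassification
                w = [wi + alpha * t * xi for wi, xi in zip(w, x)]
                b += alpha * t
                updated = True
        if not updated:  # repeat until no weight changes in a full epoch
            break
    return w, b

# Example: learn the AND function in bipolar form.
and_data = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
w, b = train_perceptron(and_data)
```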

  8. Problem "OR" (weights w1 = 1, w2 = 1; Y = 1 if net >= 1, 0 if net < 1):
     X1  X2  net              Y
     1   1   1·1 + 1·1 = 2    1
     1   0   1·1 + 0·1 = 1    1
     0   1   0·1 + 1·1 = 1    1
     0   0   0·1 + 0·1 = 0    0
     The perceptron SUCCEEDS in recognizing the pattern.

  9. Problem "AND" (weights w1 = 1, w2 = 1; Y = 1 if net >= 2, 0 if net < 2):
     X1  X2  net              Y
     1   1   1·1 + 1·1 = 2    1
     1   0   1·1 + 0·1 = 1    0
     0   1   0·1 + 1·1 = 1    0
     0   0   0·1 + 0·1 = 0    0
     The perceptron SUCCEEDS in recognizing the pattern.

  10. Problem "x1 AND NOT x2" (weights w1 = 2, w2 = -1; Y = 1 if net >= 2, 0 if net < 2):
     X1  X2  net                  Y
     1   1   1·2 + 1·(-1) = 1     0
     1   0   1·2 + 0·(-1) = 2     1
     0   1   0·2 + 1·(-1) = -1    0
     0   0   0·2 + 0·(-1) = 0     0
     The perceptron SUCCEEDS in recognizing the pattern.
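The three truth tables above can be verified with a one-function single-layer perceptron; the weights and thresholds are the values from the slides:

```python
def slp(x, w, theta):
    # single-layer perceptron: Y = 1 if net >= theta, else 0
    net = sum(xi * wi for xi, wi in zip(x, w))
    return 1 if net >= theta else 0

patterns = [(1, 1), (1, 0), (0, 1), (0, 0)]
or_out     = [slp(p, (1, 1), 1) for p in patterns]   # OR:  w = (1, 1),  theta = 1
and_out    = [slp(p, (1, 1), 2) for p in patterns]   # AND: w = (1, 1),  theta = 2
andnot_out = [slp(p, (2, -1), 2) for p in patterns]  # x1 AND NOT x2: w = (2, -1), theta = 2
```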

  11. How about XOR?

  12. Problem "XOR": F(0,0) = 0, F(0,1) = 1, F(1,0) = 1, F(1,1) = 0
     X1  X2  Y
     1   1   0
     1   0   1
     0   1   1
     0   0   0
     A single-layer perceptron FAILS!

  13. Solution • XOR = (x1 ∧ ¬x2) ∨ (¬x1 ∧ x2) • A hidden layer turns out to be required: hidden units Z1 and Z2 feed the output Y (diagram weights: X1→Z1 = 2, X2→Z1 = -1, X1→Z2 = -1, X2→Z2 = 2, Z1→Y = 1, Z2→Y = 1)
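The decomposition XOR = (x1 ∧ ¬x2) ∨ (¬x1 ∧ x2) can be checked directly. A minimal sketch, assuming the slide's reading of the diagram (Z1 and Z2 use weights (2, -1) and (-1, 2) with threshold 2, and Y ORs them with threshold 1):

```python
def step(net, theta):
    # binary step activation with threshold theta
    return 1 if net >= theta else 0

def xor_mlp(x1, x2):
    # hidden unit Z1 computes x1 AND NOT x2: weights (2, -1), threshold 2
    z1 = step(2 * x1 - 1 * x2, 2)
    # hidden unit Z2 computes NOT x1 AND x2: weights (-1, 2), threshold 2
    z2 = step(-1 * x1 + 2 * x2, 2)
    # output Y computes Z1 OR Z2: weights (1, 1), threshold 1
    return step(z1 + z2, 1)
```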

  14. Table

  15. Multi-Layer Perceptron • An MLP is a feedforward neural network with at least one hidden layer (Li Min Fu) • Limitations of the Single-Layer Perceptron • Neural Networks for Nonlinear Pattern Recognition • The XOR Problem

  16. Solution for the XOR Problem (network diagram: inputs x1 and x2 connected to the hidden units with weights of +1 and -1)

  17. Solution for the XOR Problem φ(v) = +1 if v > 0, φ(v) = -1 if v ≤ 0, where φ is the sign function; the diagram's weights from x1 and x2 to the hidden and output units are all +1 or -1.

  18. Input to Hidden layer

  19. Hidden to Output layer
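Slides 18 and 19 cover the input-to-hidden and hidden-to-output computations, but the formulas themselves did not survive transcription. A minimal sketch of the two steps, assuming binary-sigmoid units (the names `v`, `w`, `forward` are illustrative, not from the slides):

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def forward(x, v, b_h, w, b_o):
    # input to hidden layer: z_j = f(sum_i v[j][i] * x[i] + b_h[j])
    z = [sigmoid(sum(vji * xi for vji, xi in zip(vj, x)) + bj)
         for vj, bj in zip(v, b_h)]
    # hidden to output layer: y_k = f(sum_j w[k][j] * z[j] + b_o[k])
    y = [sigmoid(sum(wkj * zj for wkj, zj in zip(wk, z)) + bk)
         for wk, bk in zip(w, b_o)]
    return z, y
```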

  20. Learning Algorithm • Backpropagation Algorithm • It adjusts the weights of the NN in order to minimize the average squared error • Function signals propagate in the forward step; error signals propagate in the backward step

  21. BP has two phases • Forward pass phase: computes ‘functional signal’, feedforward propagation of input pattern signals through network • Backward pass phase: computes ‘error signal’, propagates the error backwards through network starting at output units (where the error is the difference between actual and desired output values)
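The two phases above can be sketched for a small network. This is an illustrative implementation for a 2-2-1 sigmoid network on XOR; the network size, learning rate, epoch count, and initialization are my choices, not values from the slides:

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x, wh, bh, wo, bo):
    # forward pass: propagate the function signal input -> hidden -> output
    z = [sigmoid(wh[j][0] * x[0] + wh[j][1] * x[1] + bh[j]) for j in range(2)]
    y = sigmoid(wo[0] * z[0] + wo[1] * z[1] + bo)
    return z, y

def mse(wh, bh, wo, bo):
    # average squared error over the training patterns
    return sum((t - forward(x, wh, bh, wo, bo)[1]) ** 2 for x, t in DATA) / len(DATA)

def train(epochs=2000, lr=0.5, seed=1):
    rng = random.Random(seed)
    wh = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    bh = [rng.uniform(-1, 1) for _ in range(2)]
    wo = [rng.uniform(-1, 1) for _ in range(2)]
    bo = rng.uniform(-1, 1)
    before = mse(wh, bh, wo, bo)
    for _ in range(epochs):
        for x, t in DATA:
            z, y = forward(x, wh, bh, wo, bo)
            # backward pass: error signal at the output unit...
            do = (t - y) * y * (1 - y)
            # ...propagated back through the output weights to the hidden units
            dh = [do * wo[j] * z[j] * (1 - z[j]) for j in range(2)]
            for j in range(2):
                wo[j] += lr * do * z[j]
                bh[j] += lr * dh[j]
                for i in range(2):
                    wh[j][i] += lr * dh[j] * x[i]
            bo += lr * do
    return before, mse(wh, bh, wo, bo)
```

Training should drive the average squared error below its value at the random initialization, which is the criterion backpropagation minimizes.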

  22. Activation Function • Sigmoidal function (plot for v from -10 to 10: the curve steepens as the slope parameter a increases)
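The effect of increasing a can be shown numerically. A hypothetical sketch, assuming the plotted curve is the binary sigmoid with slope parameter a:

```python
import math

def sigmoid(v, a=1.0):
    # binary sigmoid with slope a: sigma(v) = 1 / (1 + exp(-a*v))
    # larger a makes the curve steeper, approaching the hard-limit step
    return 1.0 / (1.0 + math.exp(-a * v))

# sample the curve at v = 1 for increasing slopes
samples = [sigmoid(1.0, a) for a in (1, 5, 25)]
```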
