
-Artificial Neural Network- Chapter 3 Perceptron



Presentation Transcript


  1. -Artificial Neural Network- Chapter 3 Perceptron 朝陽科技大學 資訊管理系 李麗華教授

  2. Outline • History • Structure • Learning Process • Recall Process • Solving OR Problem • Solving AND Problem • Solving XOR Problem

  3. History of the Perceptron Model In 1957, Rosenblatt and several other researchers developed the perceptron, which used a network similar to the one proposed by McCulloch, together with a learning rule for training the network to solve pattern-recognition problems. However, this model was later criticized by Minsky, who proved that it cannot solve the XOR problem.

  4. Structure The network structure includes: • Input layer: input variables with binary-type information. The number of nodes depends on the problem dimension. • Processing node: uses a linear activation function, i.e., netj = Σi Wij·Xi − θj, where θj is the bias. • Output layer: the computed result is generated through the transfer function. • Transfer function: discrete type, i.e., a step function.
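The processing node's linear activation can be sketched as follows (a minimal sketch; the function and variable names are illustrative, not from the slides):

```python
import numpy as np

def net_input(w, theta, x):
    """Linear activation of a processing node: net_j = sum_i(W_ij * x_i) - theta_j."""
    return float(np.dot(w, x) - theta)
```

For example, with weights (1.0, 1.0), bias 1.5, and input (1, 1), the node's net value is 2.0 − 1.5 = 0.5.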

  5. [Figure] Perceptron network structure: inputs X1, X2 feed processing nodes f1, f2 through weights W11, W12, W21, W22; nodes f1, f2 feed output node f3 through weights W13, W23, producing output Y1.

  6. The training process The training steps (one layer at a time): 1. Choose the network layers, nodes, and connections. 2. Randomly assign the weights Wij and the bias θj. 3. Input the training sets Xi (preparing the target outputs Tj for verification). 4. Training computation: Yj = 1 if netj > 0, and Yj = 0 if netj ≤ 0.
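The training computation in step 4 (the step transfer function applied to the net value) can be sketched as (names are illustrative):

```python
def step(net_j):
    # Discrete (step) transfer function: Y_j = 1 if net_j > 0, else 0
    return 1 if net_j > 0 else 0

def forward(w, theta, x):
    """One perceptron node: linear activation followed by the step function."""
    net_j = sum(wi * xi for wi, xi in zip(w, x)) - theta
    return step(net_j)
```

With weights (1, 1) and bias 1.5, `forward` fires only when both binary inputs are 1.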

  7. The training process (cont.) 5. Training computation: if Yj ≠ Tj, update the weights and bias: ΔWij = η (Tj − Yj) Xi, Wij = Wij + ΔWij; Δθj = −η (Tj − Yj), θj = θj + Δθj, where η is the learning rate. 6. Repeat steps 3 ~ 5 until every input pattern is satisfied, i.e., Yj = Tj for all patterns.
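The update rule in step 5 can be sketched as a single function (a sketch of the standard perceptron learning rule; the default learning rate here is an assumption):

```python
def update(w, theta, x, t, y, eta=0.1):
    """Perceptron weight/bias update when output y differs from target t:
    delta_W_ij = eta * (T_j - Y_j) * x_i,  delta_theta_j = -eta * (T_j - Y_j)."""
    delta = t - y
    w = [wi + eta * delta * xi for wi, xi in zip(w, x)]
    theta = theta - eta * delta
    return w, theta
```

Note the bias moves in the opposite direction of the weights, because netj subtracts θj.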

  8. The recall process After the network has been trained as described above, any input vector X can be sent into the perceptron network to derive the computed output. The trained weights Wij and the bias θj are used to derive netj, and therefore the output Yj can be obtained for pattern recognition (or for prediction). The ratio of correctly computed outputs to the total number of outputs is treated as the prediction performance of the network.
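The recall process and the performance ratio can be sketched as (a minimal sketch, assuming binary inputs and a single output node):

```python
def recall(w, theta, patterns):
    """Apply trained weights/bias to each input vector; return the step outputs."""
    outputs = []
    for x in patterns:
        net_j = sum(wi * xi for wi, xi in zip(w, x)) - theta
        outputs.append(1 if net_j > 0 else 0)
    return outputs

def performance(outputs, targets):
    # Ratio of correctly computed outputs to the total number of outputs
    return sum(o == t for o, t in zip(outputs, targets)) / len(targets)
```

For example, weights (0.4, 0.2) with bias 0.4 recall the AND truth table exactly, giving a performance of 1.0.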

  9. Example: Solving the AND problem • This is a problem of recognizing the AND pattern. • The following training patterns (X1, X2) with targets T are used.
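Putting the training and recall steps together, the AND problem can be solved with a short script (a sketch of the procedure from slides 6 and 7; the zero initialization, learning rate, and epoch limit are assumptions, not values from the slides):

```python
def train_perceptron(patterns, targets, eta=0.2, max_epochs=100):
    """Train a single perceptron node until every pattern is satisfied (Y_j = T_j)."""
    w = [0.0, 0.0]
    theta = 0.0
    for _ in range(max_epochs):
        errors = 0
        for x, t in zip(patterns, targets):
            net_j = sum(wi * xi for wi, xi in zip(w, x)) - theta
            y = 1 if net_j > 0 else 0
            if y != t:                       # step 5: update only on a miss
                errors += 1
                delta = t - y
                w = [wi + eta * delta * xi for wi, xi in zip(w, x)]
                theta -= eta * delta
        if errors == 0:                      # step 6: every pattern satisfied
            break
    return w, theta

# AND truth table as training patterns
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 0, 0, 1]
w, theta = train_perceptron(X, T)
```

Because AND is linearly separable, this loop converges; the same loop would cycle forever on XOR, which is Minsky's criticism mentioned on slide 3.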
