Explore the history, characteristics, and learning algorithms of Artificial Neural Networks (ANN). Learn how ANN mimics the biological brain to perform tasks efficiently and handle complex data processing. Discover the supervised and unsupervised learning methods and popular applications of ANN.
What is an ANN? • A computational system inspired by the structure, processing method, and learning ability of the biological brain • The term “neural network” is also applied to models of the brain
Why ANN? • Some tasks that humans perform easily (effortlessly) are hard to solve with the conventional algorithmic approach on a von Neumann machine • Conversely, computers can perform many operations considerably faster than a human being
Why ANN? • Massive parallelism • Distributed representation • Learning ability • Generalization ability • Fault tolerance
Characteristics of ANN • A large number of very simple, neuron-like processing elements • A large number of weighted connections between the elements • Distributed representation of knowledge over the connections • Knowledge is acquired by the network through a learning process
Biological Neuron • Dendrite, receives incoming information • Soma, where the information is processed • Axon, sends impulses to other nerve cells • Synapse, the junction connecting two neurons
Artificial Neuron • [Diagram: inputs p_i are multiplied by weights w_i, summed (Σ), and passed through an activation function] • Net input: n = Σ p_i·w_i • Output: a = f(n)
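As a concrete illustration of the diagram above, here is a minimal sketch that computes the net input n = Σ p_i·w_i and applies an activation function f. The logistic sigmoid and the function name `neuron` are illustrative choices, not part of the original slides.

```python
import math

def neuron(p, w, f=lambda n: 1.0 / (1.0 + math.exp(-n))):
    """Weighted sum of the inputs followed by an activation function f
    (a logistic sigmoid is assumed here)."""
    n = sum(pi * wi for pi, wi in zip(p, w))  # n = Σ p_i · w_i
    return f(n)                               # a = f(n)

# Example: two inputs with illustrative weights
print(neuron([1.0, 2.0], [0.5, -0.3]))        # ≈ 0.475, the sigmoid of -0.1
```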
Learning • Learn the connection weights from a set of training examples • Different network architectures require different learning algorithms
Supervised Learning • The network is provided with a correct answer (output) for every input pattern • Weights are determined to allow the network to produce answers as close as possible to the known correct answers • The back-propagation algorithm belongs to this category
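A minimal sketch of supervised learning with back-propagation on a tiny 2-4-1 network learning XOR. The layer sizes, data set, learning rate, and epoch count are illustrative assumptions, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input patterns
T = np.array([[0], [1], [1], [0]], dtype=float)              # correct answers

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)    # hidden layer
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)    # output layer
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
lr = 0.5

for epoch in range(10000):
    # forward pass
    H = sigmoid(X @ W1 + b1)            # hidden activations
    Y = sigmoid(H @ W2 + b2)            # network answers
    # backward pass: propagate the error (Y - T) through the layers
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    # gradient-descent weight updates
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

print(Y.round(3))   # outputs should move toward the targets 0, 1, 1, 0
```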
Unsupervised Learning • Does not require a correct answer associated with each input pattern in the training set • Explores the underlying structure in the data, or correlations between patterns in the data, and organizes patterns into categories from these correlations • The Kohonen algorithm belongs to this category
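The sketch below shows the core idea behind Kohonen-style learning in a simplified, winner-take-all form (the neighborhood function of a full self-organizing map is omitted): each unlabeled pattern pulls the weights of its closest unit toward itself. The data, number of units, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.random((200, 2))        # unlabeled 2-D patterns
units = rng.random((5, 2))         # 5 competing units with 2-D weight vectors
lr = 0.1

for epoch in range(50):
    for x in data:
        winner = np.argmin(np.linalg.norm(units - x, axis=1))  # closest unit
        units[winner] += lr * (x - units[winner])               # pull it toward x

print(units)   # each unit comes to represent one region (category) of the data
```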
Applications • Pattern Classification • Clustering/Categorization • Function approximation • Prediction/Forecasting • Optimization • Content-addressable Memory • Control
History • The first formal ANN model was introduced by McCulloch and Pitts (1943) • 1949, Hebb proposed the Hebb network • 1958, Rosenblatt developed the perceptron for pattern classification • 1960, Widrow and Hoff developed ADALINE with the Least Mean Square (LMS) learning rule • 1974, Werbos introduced the backpropagation algorithm for multilayer perceptrons
History • 1975, Kunihiko Fukushima developed an ANN for character recognition, called the cognitron, but it failed to recognize distorted characters that were shifted or rotated • 1982, Kohonen developed unsupervised learning for mapping • 1982, Grossberg and Carpenter developed Adaptive Resonance Theory (ART, ART2, ART3) • 1982, Hopfield developed the Hopfield network for optimization
History • 1983, the cognitron (1975) was improved upon with the neocognitron • 1985, the Boltzmann algorithm for probabilistic neural networks • 1987, BAM (Bidirectional Associative Memory) was developed • 1988, Radial Basis Function networks were developed
Thank You. Any Questions?