
Neural Networks



  1. Neural Networks • Course 2L490 • Lecturer: Rudolf Mak • E-mail: r.h.mak@tue.nl • Course notes: H.M.M. ten Eikelder • Neural Networks • Webpage: Neurale Netwerken (2L490) Rudolf Mak TU/e Computer Science

  2. Today’s topics • BNNs versus ANNs • computing power and future development • BNNs • quick overview • ANNs • correspondence (with BNNs) • neuron model • learning paradigms • models • applications

  3. Neural Computing • Neuroscience • The objective is to understand the human brain • Biologically realistic models of neurons • Biologically realistic connection topologies • Neural networks • The objective is to develop computation methods • Highly simplified artificial neurons • Connection topologies that are aimed at computational effectiveness

  4. Man versus Machine (hardware)

  5. Man versus Machine (information processing)

  6. Brain versus computer • The following two slides have been taken from the paper by Hans Moravec, “When Will Computer Hardware Match the Human Brain?”

  7. Types of neurons (Kandel et al.) • Sensory neurons • Carry information for the purpose of perception and motor coordination • Motor neurons • Carry commands to control muscles and glands • Interneurons • Relay or projection • Long distance signaling • Local • Information processing

  8. Biological Neuron • A neuron has four main regions: • Cell body (soma) • Dendrites • Axon • Presynaptic terminal • excitatory • inhibitory

  9. Signaling • All nerve cells signal in the same way, through a combination of electrical and chemical processes: • Input component produces graded local signals • Trigger component initiates action potential • Conductile component propagates action potential • Output component releases neurotransmitter • All signaling is unidirectional

  10. Spike (width 0.2 – 5 ms)

  11. Pulse Trains

  12. Some animations For this topic we visit the website: Neurobiology Home Page, Blackwell Science. Subtopics: • Channel gating during action potential • Propagation of the action potential • Neurotransmitter action

  13. Summary of Neuron Firing Behavior • The behavior is binary: a neuron either fires or it does not • A neuron doesn’t fire if the accumulated activity stays below threshold • If the activity is above threshold, a neuron fires (produces a spike) • The firing frequency increases with accumulated activity until the maximum firing frequency is reached • The firing frequency is limited by the refractory period of about 1–10 ms

  14. Organization of the Brain (taken from The Computational Brain by Churchland and Sejnowski) • Central nervous system • Interregional circuits • Local circuits • Neurons • Dendritic trees • Neural microcircuits • Synapses • Molecules

  15. Neural Network

  16. ANNs as a Computational Model We can distinguish between sequential and parallel models • Sequential models: • Recursive functions (Church) • Turing machine (Turing) • Random Access Machine (Von Neumann) • Parallel models: • P(arallel)RAM • Cellular automata (Von Neumann) • Artificial Neural Nets (McCulloch/Pitts, Wiener)

  17. Advantages of ANNs • Efficient • Inherently massively parallel • Robust • Can deal with incomplete and/or noisy data • Fault-tolerant • Still works when part of the net fails • User-friendly • Learning instead of programming

  18. Disadvantages of ANNs • Difficult to design • There are no clear design rules for arbitrary applications • Hard or impossible to train • Difficult to assess internal operation • It is difficult to find out whether, and if so what, tasks are performed by different parts of the net • Unpredictable • It is difficult to estimate future network performance based on current (or training) behavior

  19. BNN-ANN Correspondence • Nodes stand for the neuron body • Linear combiners model accumulation of synaptic stimuli • Nonlinear activation function models firing behavior • Connections stand for the dendrites and axons • Synapses are modeled by attaching weights to the connections: • Positive weights for excitatory synapses • Negative weights for inhibitory synapses

  20. Artificial Neuron linear combiner transfer function
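The combiner-plus-transfer structure of this slide can be sketched in a few lines of Python (function names and the example weights are illustrative, not from the course):

```python
import numpy as np

def neuron(x, w, f):
    """Artificial neuron: a linear combiner followed by a transfer function.

    x: input vector; w: weight vector (one weight per synapse);
    f: transfer (activation) function applied to the combined input.
    """
    z = np.dot(w, x)   # linear combiner: weighted sum of synaptic stimuli
    return f(z)        # transfer function models the firing behavior

# Heaviside step transfer with threshold 0: fires (1) iff the
# accumulated stimulus exceeds the threshold.
step = lambda z: 1 if z > 0 else 0

neuron(np.array([1.0, 0.5]), np.array([0.6, -0.4]), step)  # → 1
```

A positive weight plays the role of an excitatory synapse, a negative weight that of an inhibitory one, as on slide 19.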

  21. Discrete asymmetric transfer Heaviside step-function: f(c, x) = ( x > c ? 1 : 0 ) Transfer functions are also called activation or squashing functions

  22. Discrete symmetric transfer Sign function: f(x) = ( x > 0 ? 1 : -1 ) Used with bipolar state encoding

  23. Continuous asymmetric transfer f(z) = 1 / (1 + e^(−c·z)) sigmoid function, also called the logistic function

  24. Continuous symmetric transfer f(z) = (e^(c·z) − e^(−c·z)) / (e^(c·z) + e^(−c·z)) hyperbolic tangent (tanh)

  25. Piecewise-Linear Transfer f(c, z) = ( z < –c ? –1 : ( z > c ? 1 : z / c ) )

  26. Local transfer function • f(z) = (1 / sqrt(2π)) exp(−z²/2), the density of the standard normal distribution N(0, 1)
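The transfer functions of slides 21–26 can be collected in one Python sketch (the function names are mine; c is the steepness/width parameter from the slides):

```python
import math

def heaviside(z, c=0.0):          # discrete asymmetric (slide 21)
    return 1 if z > c else 0

def sign(z):                      # discrete symmetric, bipolar (slide 22)
    return 1 if z > 0 else -1

def logistic(z, c=1.0):           # continuous asymmetric sigmoid (slide 23)
    return 1.0 / (1.0 + math.exp(-c * z))

def tanh_transfer(z, c=1.0):      # continuous symmetric, tanh (slide 24)
    return (math.exp(c * z) - math.exp(-c * z)) / (math.exp(c * z) + math.exp(-c * z))

def piecewise_linear(z, c=1.0):   # saturating linear (slide 25)
    return -1 if z < -c else (1 if z > c else z / c)

def gaussian(z):                  # local transfer, N(0, 1) density (slide 26)
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
```

Note the symmetric functions range over (−1, 1) while the asymmetric ones range over (0, 1), matching the bipolar versus binary state encodings.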

  27. Probabilistic Neurons • Neurons are in one of two states • x = 0 or x = 1 • The transfer function P(z) only determines the probability of finding the output node in a certain state • y = 1 with probability P(z) • y = 0 with probability 1 − P(z) • Common choice for P(z) is • P(z) = 1 / (1 + exp(−z / T))

  28. Specific neuron models • McCulloch-Pitts neuron • Discrete (0 - 1) inputs • Heaviside activation function • Only weights 1 (excitatory) and -1 (inhibitory) • Adaline (Widrow & Hoff) • Continuous inputs • Identity (no) activation function • Continuous weights • x0 = 1, w0 is bias
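The two specific models can be sketched as follows (the AND example and the threshold value are my own illustration, using the strict Heaviside function of slide 21):

```python
def mcculloch_pitts(x, w, threshold):
    """McCulloch-Pitts neuron: binary (0/1) inputs, weights restricted
    to +1 (excitatory) or -1 (inhibitory), Heaviside activation."""
    assert all(xi in (0, 1) for xi in x)
    assert all(wi in (1, -1) for wi in w)
    z = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if z > threshold else 0

def adaline(x, w):
    """Adaline: continuous inputs and weights, identity activation.
    By convention x[0] = 1, so w[0] acts as the bias."""
    return sum(wi * xi for wi, xi in zip(w, x))

# AND of two binary inputs with a McCulloch-Pitts neuron:
mcculloch_pitts([1, 1], [1, 1], threshold=1.5)  # → 1
mcculloch_pitts([1, 0], [1, 1], threshold=1.5)  # → 0
```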

  29. Artificial Neural Networks • Layered net with • n input nodes • m output nodes • zero or more hidden layers (one shown)
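A forward pass through such a layered net can be sketched with one weight matrix per layer (the sizes and random weights below are illustrative only):

```python
import numpy as np

def forward(x, weight_matrices, f):
    """One forward pass through a layered net: each layer multiplies its
    input by a weight matrix and applies the transfer function f
    elementwise. With zero hidden layers the list holds one matrix."""
    a = x
    for W in weight_matrices:
        a = f(W @ a)
    return a

# n = 3 input nodes, one hidden layer of 4 nodes, m = 2 output nodes
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # input  -> hidden
W2 = rng.standard_normal((2, 4))   # hidden -> output
y = forward(np.ones(3), [W1, W2], np.tanh)
print(y.shape)  # (2,)
```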

  30. ANN Models • Feedforward networks (FANN) • Single-layer perceptrons (SLP, SLFF) (Rosenblatt) • Multi-layer perceptrons (MLP, MLFF) (Rumelhart, …) • Radial basis function networks (RBFN) • Functional link nets (FLN) • (Neo-)Cognitron (Fukushima)

  31. ANN Models (continued) • Recurrent networks (RNN) • Hopfield networks (Hopfield, Amari) • Boltzmann machines (Hinton, Sejnowski) • Bidirectional associative memory (Kosko) • Competitive learning networks (CLN) • Simple competitive learning networks • Self-organizing feature maps (Kohonen) • Adaptive resonance theory (Grossberg)

  32. Hebb’s Postulate of Learning • Biological formulation • When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency as one of the cells firing B is increased. • Mathematical formulation • Δw_ij = η · y_i · x_j (learning rate η > 0)

  33. Hebb’s Postulate: revisited • Stent (1973), and Changeux and Danchin (1976) have expanded Hebb’s rule such that it also models inhibitory synapses: • If two neurons on either side of a synapse are activated simultaneously (synchronously), then the strength of that synapse is selectively increased. • If two neurons on either side of a synapse are activated asynchronously, then that synapse is selectively weakened or eliminated.
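Both the original rule and the signed extension can be illustrated for a single output neuron (η and the example values are my own; bipolar ±1 activities are assumed, since then asynchronous activation yields a negative product and weakens the synapse):

```python
def hebb_update(w, x, y, eta=0.1):
    """Hebb's rule for one output neuron: delta w_j = eta * x_j * y.
    With bipolar (+1/-1) activities, synchronous activation (x_j * y > 0)
    strengthens the weight and asynchronous activation weakens it, as in
    the Stent / Changeux-Danchin extension."""
    return [wj + eta * xj * y for wj, xj in zip(w, x)]

w = hebb_update([0.0, 0.0], x=[1, -1], y=1)
# input 0 fires with the output -> strengthened; input 1 is
# asynchronous -> weakened: w == [0.1, -0.1]
```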

  34. Learning Methods • Supervised learning • Reinforcement learning • Corrective learning • Unsupervised learning • Competitive learning • Self-organizing learning • Off-line versus adaptive learning

  35. Learning Tasks • Association • Classification • Clustering • Pattern recognition • Function approximation • Control • Adaptive filtering • Data compression • Prediction

  36. Application areas (just a few) • Finance, Banking, Insurance • Loan approval, stock prediction, claim prediction, fraud detection • Business, Marketing • Sale prediction, customer profiling, data mining • Medicine • Diagnosis and treatment • Industry • Quality control • Machine/plant control • Telecommunication • Adaptive filtering (equalizing) • Speech recognition and synthesis

  37. NN for setting target corn yields

  38. (Optical) Character Recognition

  39. Brief history • Early stages • 1943 McCulloch-Pitts: neuron as computing element • 1948 Wiener: cybernetics • 1949 Hebb: learning rule • 1958 Rosenblatt: perceptron • 1960 Widrow-Hoff: least mean square algorithm • Recession • 1969 Minsky-Papert: limitations of the perceptron model • Revival • 1982 Hopfield: recurrent network model • 1982 Kohonen: self-organizing maps • 1986 Rumelhart et al.: backpropagation

  40. Literature • The authoritative text on neural science is: Principles of Neural Science, fourth edition, eds. E.R. Kandel, J.H. Schwartz, T.M. Jessell, McGraw-Hill, 2000. • The authoritative text on neural networks is: Neural Networks: A Comprehensive Foundation, second edition, Simon Haykin, Prentice-Hall, 1999.
