
Neural Network


Presentation Transcript


  1. Neural Network Ming-Feng Yeh (葉明豐) Department of Electrical Engineering Lunghwa University of Science and Technology E-mail: mfyeh@mail.lhu.edu.tw Office: F412-III Tel: #5518

  2. COURSE OBJECTIVE • This course gives an introduction to basic neural network architectures and learning rules. • Emphasis is placed on the mathematical analysis of these networks, on methods of training them and on their application to practical engineering problems in such areas as pattern recognition, signal processing and control systems.

  3. SYLLABUS • Textbook: Hagan, Demuth, Beale, Neural Network Design, PWS Publishing Company • Midterm Exam: 30% • Final Exam: 30% • Projects: 40%

  4. CONTENTS • Ch 1. Introduction • Ch 2. Neuron Model & Neural Architecture • Ch 3&4. Perceptron (感知機) Learning Rule • Ch 7. Supervised (監督式) Hebbian Learning • Ch 10. Widrow-Hoff Learning • Ch 11&12. Back-propagation (倒傳遞) • Ch 13. Associative (關聯) Learning • Ch 14. Competitive (競爭) Networks • Ch 15. Grossberg Networks • Ch 16. Adaptive Resonance (自適應) Theory • Ch 18. Hopfield Network

  5. Information • Review: Ch 5 – Signal and Weight Vector Spaces; Ch 6 – Linear Transformations for Neural Networks

  6. CHAPTER 1 Introduction

  7. Objectives • As you read these words you are using a complex biological neural network. You have a highly interconnected set of 10^11 neurons to facilitate your reading, breathing, motion and thinking. • In the artificial neural network, the neurons are not biological. They are extremely simple abstractions of biological neurons, realized as elements in a program or perhaps as circuits made of silicon.

  8. History -1 • Pre-1940: von Helmholtz, Mach & Pavlov • General theories of learning, vision, conditioning • No specific mathematical models of neuron operation • 1940s: Hebb, McCulloch & Pitts • Mechanism for learning in biological neurons (Hebb) • Neural-like networks can compute any arithmetic or logical function (McCulloch & Pitts) • 1950s: Rosenblatt, Widrow & Hoff • First practical networks and learning rules: the perceptron network and its associated learning rule (Rosenblatt) & the Widrow-Hoff learning rule • These researchers could not successfully modify their learning rules to train more complex networks.

  9. History -2 • 1960s: Minsky & Papert • Demonstrated limitations of existing neural networks • Neural network research was largely suspended • 1970s: Kohonen, Anderson & Grossberg • Kohonen and Anderson independently developed neural networks that could act as memories • Self-organizing networks (Grossberg) • 1980s: Hopfield, Rumelhart & McClelland • The use of statistical mechanics to explain the operation of a recurrent network used as an associative memory (Hopfield) • Backpropagation algorithm (Rumelhart & McClelland)

  10. Applications • The applications are expanding because neural networks are good at solving problems, not just in engineering, science and mathematics, but in medicine, business, finance and literature as well.

  11. Biological Inspiration • The human brain consists of a large number (about 10^11) of highly interconnected elements (about 10^4 connections per element) called neurons (神經元). • The three principal components of a neuron are the dendrites, the cell body and the axon. • The point of contact between two neurons is called a synapse.

  12. Biological Neurons • Dendrites (樹突): carry electrical signals into the cell body • Cell body (細胞體): sums and thresholds these incoming signals • Axon (軸突): carries the signal from the cell body out to other neurons • Synapse (突觸): the point of contact between an axon of one cell and a dendrite of another cell
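The sum-and-threshold behaviour of the cell body is exactly what the artificial neurons of later chapters abstract. A minimal sketch in Python, purely illustrative (the function name, weights and threshold are assumptions, not taken from the textbook):

```python
# Minimal artificial-neuron sketch: dendrites -> inputs, synapses -> weights,
# cell body -> weighted sum plus threshold, axon -> output sent to other neurons.

def artificial_neuron(inputs, weights, bias, threshold=0.0):
    """Return 1 (fire) if the weighted sum of inputs plus bias exceeds the threshold."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias  # cell body: summation
    return 1 if net > threshold else 0                        # cell body: thresholding

# Two inputs arriving through two "synapses" with different strengths
print(artificial_neuron(inputs=[1.0, 0.5], weights=[0.8, -0.4], bias=0.1))  # prints 1
```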

  13. Neural Networks • Neural networks are a promising new generation of information processing systems that usually operate in parallel and demonstrate the ability to learn, recall, and generalize from training patterns or data. • Basic models, learning rules, and distributed representations of neural networks will be discussed.

  14. Supplementary Material (補充資料) • An artificial neural network (類神經網路 or 人工神經網路) is an information processing system that imitates a biological neural network. • It is a computing system, including both software and hardware, that uses a large number of simple, interconnected artificial neurons to imitate the abilities of biological neural networks. An artificial neuron is a simple model of a biological neuron: it takes information from the external environment or from other artificial neurons, performs simple computations on it, and sends the result to the external environment or to other artificial neurons.

  15. Fuzzy Logic • Fuzzy set theory was first proposed by Lotfi Zadeh in 1965. • A mathematical way to represent vagueness in linguistics • A generalization of classical set theory
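To make the "generalization of classical set theory" concrete, here is a minimal Python sketch contrasting crisp and fuzzy membership; the "tall" example and the 160–190 cm ramp are assumed values for illustration only:

```python
# Classical (crisp) set: an element is either in (1) or out (0).
# Fuzzy set: membership is a degree in [0, 1], which can express linguistic vagueness.

def crisp_tall(height_cm):
    """Classical set 'tall': all-or-nothing membership."""
    return 1.0 if height_cm >= 180 else 0.0

def fuzzy_tall(height_cm):
    """Fuzzy set 'tall': membership ramps up linearly between 160 cm and 190 cm."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30.0

print(crisp_tall(175), fuzzy_tall(175))  # prints 0.0 0.5
```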

  16. Fuzzy Systems vs. Neural Networks • Fuzzy logic is based on the way the brain deals with inexact information. • Neural networks are modeled after the physical architecture of the brain. • Fuzzy systems and neural networks are both numerical, model-free estimators and dynamical systems. • They share the common ability to improve the intelligence of systems working in an uncertain, imprecise and noisy environment.

  17. Machine Intelligence • Neural networks provide fuzzy systems with learning ability. • Fuzzy systems provide neural networks with a structured framework for high-level fuzzy IF-THEN rule thinking and reasoning.
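For a concrete picture of what a fuzzy IF-THEN rule looks like, here is a minimal Python sketch; the rule, the temperature ramp and the function names are illustrative assumptions, not part of the course material:

```python
# One fuzzy rule: IF temperature is HIGH THEN fan speed is FAST.
# The rule fires to the degree that its antecedent is true.

def mu_high_temperature(t_celsius):
    """Membership of 'temperature is HIGH', ramping from 0 at 25 C to 1 at 35 C."""
    return max(0.0, min(1.0, (t_celsius - 25.0) / 10.0))

def fan_fast_firing_strength(t_celsius):
    """Firing strength of the rule equals the truth degree of its antecedent."""
    return mu_high_temperature(t_celsius)

print(fan_fast_firing_strength(31.0))  # prints 0.6: the rule fires with strength 0.6
```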

  18. Fuzzy Neural Integrated System • Neural fuzzy systems: use of neural networks as tools in fuzzy models. • Fuzzy neural networks: fuzzification of conventional neural network models. • Fuzzy-neural hybrid systems: incorporation of fuzzy logic technology and neural networks into hybrid systems.

  19. Soft / Hard Computing • Hard computing is computing whose prime desiderata are precision, certainty, and rigor. • Soft computing is tolerant of imprecision, uncertainty, and partial truth. (Lotfi Zadeh) • The primary aim of soft computing is to exploit such tolerance to achieve tractability, robustness, a high level of machine intelligence, and a low cost in practical applications. • Its principal constituents are fuzzy logic, neural networks (including CMAC), and probabilistic reasoning (genetic algorithms, evolutionary programming, and chaotic systems).

  20. Soft Computing

  21. Computational Intelligence • Fuzzy logic, neural networks, genetic algorithms, and evolutionary programming are also considered the building blocks of computational intelligence. (James Bezdek) • Computational intelligence is low-level cognition in the style of the human brain and stands in contrast to conventional (symbolic) artificial intelligence (AI).
