Optimization-Neural Networks Learning from Data
Optimization-Neural Networks: Learning from Data
Theodore B. Trafalis, School of Industrial Engineering, University of Oklahoma, Norman, OK
Why Artificial Neural Networks?
• Massive parallelism
• Distributed representation and computation
• Learning ability
• Adaptivity
• Inherent contextual information processing
• Fault tolerance
• Low energy
Challenging Problems
• Pattern classification
• Clustering/categorization
• Function approximation
• Prediction/forecasting
• Optimization
• Content-addressable memory
• Control
Neural Network Components
• Architecture: ANNs can be viewed as weighted directed graphs in which artificial neurons are the nodes and directed edges (with weights) connect neuron outputs to neuron inputs.
• Feed-forward networks: the graph has no loops.
• Recurrent (feedback) networks: the graph contains loops.
• Learning: the procedure for adjusting the connection weights from data.
Feed-forward Architecture
We use a layered feed-forward architecture (the architecture diagram from the original slide is not included in this transcript).
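A layered feed-forward network can be sketched as repeated matrix-vector products followed by a nonlinearity. The layer sizes, sigmoid activation, and random weights below are illustrative assumptions, not details taken from the slide:

```python
import numpy as np

def sigmoid(z):
    """Logistic activation, applied elementwise."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights):
    """Propagate input x through each weighted layer in turn (no loops,
    so this is a feed-forward pass over the directed acyclic graph)."""
    a = x
    for W in weights:
        a = sigmoid(W @ a)
    return a

# Example: a 2-3-1 network with arbitrary random weights.
rng = np.random.default_rng(0)
weights = [rng.normal(size=(3, 2)),   # input layer -> hidden layer
           rng.normal(size=(1, 3))]   # hidden layer -> output layer
y = forward(np.array([0.5, -0.2]), weights)
```

Because the edges carry the weights and the graph is acyclic, one pass over the layer list computes the network's output.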
Learning
• Supervised: correct outputs are provided along with the training patterns.
• Unsupervised: outputs are not provided; the network must discover structure in the inputs.
• Reinforcement: the network is provided with only a critique of the correctness of its outputs, not the correct answers.
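Supervised learning, where target outputs are provided, can be sketched with a single sigmoid neuron trained by gradient descent. The OR task, learning rate, and iteration count below are illustrative choices, not content from the presentation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy supervised task: the targets t for the OR function are provided.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 1, 1, 1], dtype=float)

w = np.zeros(2)   # connection weights
b = 0.0           # bias
lr = 0.5          # learning rate
for _ in range(2000):
    y = sigmoid(X @ w + b)            # network outputs for all patterns
    err = y - t                       # deviation from provided targets
    w -= lr * (X.T @ err) / len(t)    # gradient step (cross-entropy loss)
    b -= lr * err.mean()

pred = (sigmoid(X @ w + b) > 0.5).astype(float)
```

In the unsupervised and reinforcement settings the vector `t` would be unavailable; only the input patterns (or a scalar critique of the outputs) could drive the weight updates.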
Fundamental Issues of Learning Theory
• Sample complexity: how many training patterns are needed for valid generalization?
• Capacity: how many patterns can be stored, and what functions and decision boundaries can a network form?
• Computational complexity: the time required for a learning algorithm to estimate a solution from training patterns.
• Designing new, efficient algorithms for neural network learning is a very active research area.