
Back-Propagation MLP Neural Network Optimizer



Presentation Transcript


  1. Back-Propagation MLP Neural Network Optimizer ECE 539 Andrew Beckwith

  2. Back-Propagation MLP Network Optimizer • Purpose • Methods • Features

  3. Purpose • Configuring a neural network and its parameters is often a long, experimental process involving a great deal of guesswork. • Let the computer do it for you. • Design and implement a program that can test multiple network configurations with easy setup. • Allow the user to modify the data appropriately by emphasizing important features and de-emphasizing features that contribute little or are detrimental.

  4. Methods • Use the back-propagation algorithm with momentum (a sketch of the momentum update follows this slide). • To test multiple configurations, use a brute-force search and keep track of the most successful configuration. • The only parameter the user cannot control is the number of neurons per hidden layer. • Each configuration is tested with 2, 3, 5, and 10 neurons per hidden layer; a final test uses a random size between 1 and 10 neurons for each layer. • Use the hyperbolic tangent activation function for hidden neurons and the sigmoidal activation function for output neurons; this can be changed in the source code if desired.
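The original program's source is not shown here, so the following is only a minimal sketch in Python/NumPy of the update rule and activation choices the slide describes: one batch gradient step with momentum for a single-hidden-layer MLP, tanh hidden units, sigmoid outputs. The squared-error loss, the absence of bias terms, and every name below are illustrative assumptions, not the author's code.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def train_epoch(X, T, W1, W2, V1, V2, eta=0.1, mu=0.8):
    """X: (n, d) inputs; T: (n, k) targets in [0, 1].
    W1: (d, h) input-to-hidden weights; W2: (h, k) hidden-to-output weights.
    V1, V2: velocities holding the previous weight changes.
    eta: learning rate; mu: momentum constant."""
    # Forward pass: tanh hidden layer, sigmoid output layer.
    H = np.tanh(X @ W1)
    Y = sigmoid(H @ W2)
    # Backward pass for squared error: sigmoid' = y(1 - y), tanh' = 1 - h^2.
    dY = (Y - T) * Y * (1.0 - Y)
    dH = (dY @ W2.T) * (1.0 - H ** 2)
    # Momentum update: new step = -eta * gradient + mu * previous step.
    V2 = -eta * (H.T @ dY) + mu * V2
    V1 = -eta * (X.T @ dH) + mu * V1
    return W1 + V1, W2 + V2, V1, V2
```

Under the same assumptions, the brute-force search over the hidden-layer sizes the slide lists (2, 3, 5, 10, plus one random size from 1 to 10) might look like this, with toy data substituted so the sketch runs end to end:

```python
# Toy data standing in for a real data file (shapes are assumptions).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))        # n=100 samples, d=4 features
T = (X[:, :1] > 0).astype(float)         # k=1 binary target
d, k, max_epoch = X.shape[1], T.shape[1], 200

best = None
for h in [2, 3, 5, 10, int(rng.integers(1, 11))]:   # random size in 1..10
    W1, V1 = 0.1 * rng.standard_normal((d, h)), np.zeros((d, h))
    W2, V2 = 0.1 * rng.standard_normal((h, k)), np.zeros((h, k))
    for _ in range(max_epoch):
        W1, W2, V1, V2 = train_epoch(X, T, W1, W2, V1, V2)
    # Keep the configuration with the best classification rate.
    rate = (sigmoid(np.tanh(X @ W1) @ W2).round() == T).mean()
    if best is None or rate > best[0]:
        best = (rate, h)
print("best classification rate %.3f with %d hidden neurons" % best)
```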

  5. Features • Allow the user to open a data file and view the mean and standard deviation of each feature for each class, to guide feature modification (a sketch of this computation follows this slide). • Allow the user to enter ranges and numbers of trials for parameters such as maximum epoch, epoch size, learning rate, momentum constant, and number of hidden layers. • Allow the user to set a tolerance for achieving the maximum classification rate. • Allow the user to view the entire network: network configuration, weight values, etc.
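A minimal sketch (all names assumed, not the program's API) of the per-class feature statistics the feature list describes: the mean and standard deviation of every feature, computed separately for each class label.

```python
import numpy as np

def class_feature_stats(X, labels):
    """X: (n, d) feature matrix; labels: (n,) integer class labels.
    Returns {class: (mean_vector, std_vector)}, which the user can inspect
    before deciding which features to emphasize or de-emphasize."""
    return {int(c): (X[labels == c].mean(axis=0), X[labels == c].std(axis=0))
            for c in np.unique(labels)}
```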
