
Weekly Report on Genetic Algorithms and Neural Networks Research Progress

This report outlines the work completed by Bang-Xuan Huang during the week of February 15, 2012. Key accomplishments include a survey of genetic algorithms as applied in DLM, covering the algorithm's three primary steps: reproduction (selection via a fitness function, e.g., roulette wheel or tournament selection), crossover (single-point, double-point, and average), and mutation. The report also summarizes Chapter 7 of R. Rojas's Neural Networks, covering feed-forward computation, back-propagation to the output and hidden layers, and weight updates, and sketches a proposed architecture that adds relevance information to the network's inputs. Future work will focus on incorporating contextual information.



Presentation Transcript


  1. 2012.02.15 Weekly Report, Bang-Xuan Huang

  2. What I have done this week
  • Survey: genetic algorithms used in DLM
  • The three steps of a genetic algorithm (a minimal sketch of these operators follows below):
  1. Reproduction (selection via a fitness function): roulette wheel selection (which we may be able to use in MDLM) or tournament selection
  2. Crossover (single-point, double-point, or average)
  3. Mutation
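A minimal sketch of the three steps named above, assuming real-valued chromosomes and a placeholder fitness function (neither is the DLM setup from the survey; all names here are illustrative):

    import random

    def roulette_wheel_select(pop, fitness):
        # Fitness-proportionate selection: spin a "wheel" whose slots
        # are sized by each individual's (non-negative) fitness.
        total = sum(fitness)
        r = random.uniform(0, total)
        acc = 0.0
        for individual, f in zip(pop, fitness):
            acc += f
            if acc >= r:
                return individual
        return pop[-1]

    def tournament_select(pop, fitness, k=2):
        # Sample k individuals at random and keep the fittest of them.
        contenders = random.sample(range(len(pop)), k)
        return pop[max(contenders, key=lambda i: fitness[i])]

    def single_point_crossover(a, b):
        # Swap tails after one random cut point.
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def double_point_crossover(a, b):
        # Swap the segment between two random cut points.
        i, j = sorted(random.sample(range(1, len(a)), 2))
        return a[:i] + b[i:j] + a[j:]

    def average_crossover(a, b):
        # For real-valued genes: the child is the element-wise mean.
        return [(x + y) / 2.0 for x, y in zip(a, b)]

    def mutate(chrom, rate=0.05, scale=0.1):
        # Perturb each gene with a small probability.
        return [g + random.gauss(0.0, scale) if random.random() < rate else g
                for g in chrom]

    # One generation: select parents, cross, mutate.
    pop = [[random.random() for _ in range(8)] for _ in range(20)]
    fit = [sum(c) for c in pop]  # placeholder fitness, not a DLM objective
    next_pop = [mutate(average_crossover(roulette_wheel_select(pop, fit),
                                         tournament_select(pop, fit)))
                for _ in range(len(pop))]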

  3. What I have done this week
  • Read Chapter 7 of R. Rojas, Neural Networks, Springer-Verlag, Berlin, 1996

  4. Proposed method: [network diagram] input, hidden, and output layers; the previous state is fed back with a delay; the output is the class

  5. Past method: [network diagram] input, hidden, and output layers with an additional relevance input; the previous state is fed back with a delay; the output is the class (a sketch of the delayed-state feedback follows below)
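Both diagrams feed a delayed copy of the previous state back into the network. A minimal sketch of that feedback, assuming an Elman-style hidden-state recurrence with sigmoid units (the function name, weight names, and activation choice are illustrative assumptions, not taken from the slides):

    import numpy as np

    def elman_step(x, h_prev, W_in, W_rec, W_out):
        # One step: the delayed previous state h_prev is combined with the
        # current input, as the "previous state, delayed" arrows suggest.
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
        h = sigmoid(W_in @ x + W_rec @ h_prev)  # new state from input + old state
        y = sigmoid(W_out @ h)                  # class output
        return y, h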

  6. Feed-forward / Back-propagation: [network diagram] sentence inputs Sen1-Sen3 mapped through weights w1-w5 to units a1-a2 and b1-b3, with example feature vectors as inputs
  • Proposed method: add relevance information (a sketch follows below)
  • Future work: add context information
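One way to realize "add relevance information" is to append relevance features to the network input before the feed-forward pass. A hedged sketch assuming that concatenation-based augmentation (the function name, the relevance vector, and the layer sizes are hypothetical, not from the slides):

    import numpy as np

    def forward_with_relevance(x, relevance, W1, W2):
        # Hypothetical augmentation: concatenate the relevance features to
        # the raw input so additional first-layer weights can learn from them.
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
        x_aug = np.concatenate([x, relevance])
        h = sigmoid(W1 @ x_aug)  # W1 needs x.size + relevance.size columns
        return sigmoid(W2 @ h)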

  7. Back-propagation algorithm (a worked sketch of these four steps appears below):
  1. Feed-forward computation
  2. Back-propagation to the output layer
  3. Back-propagation to the hidden layer
  4. Weight updates
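A minimal sketch of those four steps for a single-hidden-layer sigmoid network with a squared-error loss (the function name, layer sizes, and learning rate are illustrative assumptions; Rojas's chapter derives the general case):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def backprop_step(x, t, W1, W2, lr=0.1):
        # 1. Feed-forward computation.
        h = sigmoid(W1 @ x)                   # hidden activations
        y = sigmoid(W2 @ h)                   # output activations

        # 2. Back-propagation to the output layer.
        delta_out = (y - t) * y * (1.0 - y)   # squared-error derivative

        # 3. Back-propagation to the hidden layer (uses W2 before updating).
        delta_hid = (W2.T @ delta_out) * h * (1.0 - h)

        # 4. Weight updates by gradient descent.
        W2 -= lr * np.outer(delta_out, h)
        W1 -= lr * np.outer(delta_hid, x)
        return y

    # Tiny usage example with illustrative sizes (4 inputs, 3 hidden, 2 outputs).
    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.5, size=(3, 4))
    W2 = rng.normal(scale=0.5, size=(2, 3))
    for _ in range(1000):
        backprop_step(np.array([1.0, 0.0, 1.0, 0.0]), np.array([1.0, 0.0]), W1, W2)

Note that the hidden-layer deltas are computed with the old W2 before it is updated, matching the order of the four steps on the slide.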
