
A Hybrid Self-Organizing Neural Gas Network



  1. A Hybrid Self-Organizing Neural Gas Network James Graham and Janusz Starzyk School of EECS, Ohio University Stocker Center, Athens, OH 45701 USA IEEE World Conference on Computational Intelligence (WCCI’08) June 1-6, 2008 Hong Kong

  2. Introduction • Self-Organizing Networks • Useful for representation building in unsupervised learning • Useful for clustering, visualization, and feature maps • Numerous applications in surveillance, traffic monitoring, flight control, rescue missions, reinforcement learning, etc. • Some Types of Self-Organizing Networks • Traditional Self-Organizing Map (SOM) • Parameterless SOM • Neural Gas Network • Growing Neural Gas (GNG) • Self-Organizing Neural Gas (SONG)

  3. Description of the Approach - Fritzke’s GNG Network Algorithm Highlights • GNG starts with a set A of two units a and b at random positions wa and wb in Rn. • In the set A, find the two nearest neighbors s1 and s2 to the input signal x. • Connect s1 and s2 with an edge and set the edge age to zero. • Adjust the positions of s1 and its neighborhood by a constant times (x − w): εb for s1 and εn for the neighborhood. • Remove edges in the neighborhood that are older than amax. • Place a new node every λ cycles between the node with the greatest error and its nearest neighbor. • Reduce the error of the maximum-error node and of its nearest neighbor by α%, and add the removed error to the new node. • Reduce the error of all nodes by a constant (β) times their current error.
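
The adaptation loop in the bullets above can be sketched as follows. This is a minimal illustration of one GNG step only (nearest-unit search, edge reset, error accumulation, position update, and old-edge removal); the λ-cycle node insertion and global error decay are omitted, and the data layout (arrays plus an edge set) is an assumption, not Fritzke's original pseudocode.

```python
import numpy as np

def gng_step(nodes, errors, edges, ages, x, eps_b=0.05, eps_n=0.0006, a_max=88):
    """One GNG adaptation step for input x (a sketch, not the full algorithm).

    nodes:  (N, d) array of unit positions
    errors: (N,) accumulated error per unit
    edges:  set of frozenset({i, j}) connections
    ages:   dict mapping edge -> age
    """
    # 1. Find the two nearest units s1, s2 to the input signal x.
    d = np.linalg.norm(nodes - x, axis=1)
    s1, s2 = (int(i) for i in np.argsort(d)[:2])
    # 2. Connect s1 and s2 and reset the edge age to zero.
    e = frozenset((s1, s2))
    edges.add(e)
    ages[e] = 0
    # 3. Accumulate squared error at the winner.
    errors[s1] += d[s1] ** 2
    # 4. Move s1 (rate eps_b) and its topological neighbors (rate eps_n)
    #    toward x, aging every edge incident to s1.
    nodes[s1] += eps_b * (x - nodes[s1])
    for edge in list(edges):
        if s1 in edge:
            n = next(iter(edge - {s1}))
            nodes[n] += eps_n * (x - nodes[n])
            ages[edge] += 1
            # 5. Remove edges older than a_max.
            if ages[edge] > a_max:
                edges.discard(edge)
                del ages[edge]
    return s1
```

Calling this once per input signal, with periodic node insertion every λ steps, reproduces the overall structure of Fritzke's loop.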

  4. Example Example of Fritzke’s network results for 40,000 iterations with the following constants: εb = 0.05, εn = 0.0006, amax = 88, λ = 200, α = 0.5, β = 0.0005.

  5. Description of the Approach - Proposed Hybrid SONG Network Algorithm Highlights • SONG starts with a random, pre-generated network of a fixed size. • Connections get “stiffer” with age, making their weights harder to change. • Error is calculated after the node position updates rather than before. • Weight adjustment and error distribution are functions of distance rather than arbitrary, hard-to-set constants. • Edge connections are removed only under the following conditions: • When a connection is added and the node has a long connection more than 2× its average connection length, the long edge is removed. • When a node is moved and has at least 2 connections (after attaching to its destination node), its longest connection is removed.
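
The "stiffness" idea above can be illustrated with a toy update rule. The slide does not give the exact functional form, so the 1/(1 + age) decay and the base rate below are assumptions chosen purely to show the qualitative behavior: an older connection yields a smaller effective step.

```python
import numpy as np

def stiffness_update(w, x, age, base_rate=0.5):
    """Illustrative SONG-style weight update (the exact formula is not
    given on the slide; the 1/(1 + age) stiffness factor is an assumption).

    Older connections are "stiffer": the effective learning rate decays
    with age, so a weight moves less the longer its connection has existed.
    """
    rate = base_rate / (1.0 + age)   # stiffness grows with age
    return w + rate * (x - w)
```

A young weight (age 0) moves halfway toward the input here, while a weight of age 9 moves only 5% of the way, without any edge ever being deleted at a hard age cutoff.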

  6. Description of the Approach - Modification of the New Data Neighborhood [Figure: a new data point x and its nearest neighbor s, with distances |w1 − x|, |w2 − x|, …, |wN − x|, |ws − x| to the neighborhood nodes; the connection to a distant neighbor (> 2× the mean connection length) is removed, and a node is removed if orphaned.] “Force” calculations drive the weight adjustment, the error increase, and the age increase (by 1).
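
The pruning rule in this slide (drop connections longer than twice a node's mean connection length, and discard nodes left with no connections) can be sketched directly. The edge representation and the choice to keep a node's only remaining edge are assumptions for illustration.

```python
import numpy as np

def prune_long_edges(nodes, edges, factor=2.0):
    """Sketch of the slide's pruning rule: drop a node's connections that
    are more than `factor` times its mean connection length, then report
    nodes left with no connections (candidates for removal).

    nodes: (N, d) array of positions; edges: set of frozenset({i, j}).
    """
    length = {e: float(np.linalg.norm(nodes[min(e)] - nodes[max(e)]))
              for e in edges}
    kept = set(edges)
    for i in range(len(nodes)):
        incident = [e for e in kept if i in e]
        if len(incident) < 2:
            continue  # keep a node's only edge (assumption)
        mean_len = sum(length[e] for e in incident) / len(incident)
        for e in incident:
            if length[e] > factor * mean_len:
                kept.discard(e)
    # Nodes that had edges but lost them all are orphaned.
    orphans = {i for i in range(len(nodes))
               if any(i in e for e in edges) and not any(i in e for e in kept)}
    return kept, orphans
```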

  7. Description of the Approach - Node Replacement [Figure: the minimum-error node sk and the maximum-error node sq.] Select the node sk with the minimum error Esk, spread Esk to its neighborhood, and move sk toward sq.

  8. Description of the Approach - Node Replacement (cont.) [Figure: sk inserted into the neighborhood of sq; the longest connection is removed.] Insert sk into the neighborhood of sq using weights, remove the longest connection, and spread half of the sq neighborhood error to sk.
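
Slides 7-8 together describe one relocation step, which can be sketched as below. The slides give only the idea, so several details are assumptions: the relocated node is placed at the centroid of sq and its neighbors, exactly half of sq's error is transferred, and the removal of sq's longest connection is omitted for brevity.

```python
import numpy as np

def relocate_min_error_node(nodes, errors, neighbors):
    """Sketch of the SONG node-replacement step: move the minimum-error
    node s_k next to the maximum-error node s_q (spread weights and the
    centroid placement are assumptions; the slide gives only the idea).

    neighbors: dict node -> set of neighbor node indices
    """
    sk = int(np.argmin(errors))
    sq = int(np.argmax(errors))
    # Spread s_k's error to its old neighborhood before the move.
    old = neighbors[sk] - {sq}
    for n in old:
        errors[n] += errors[sk] / max(len(old), 1)
        neighbors[n].discard(sk)
    # Insert s_k into the neighborhood of s_q (here: at their centroid).
    group = [sq] + sorted(neighbors[sq])
    nodes[sk] = nodes[group].mean(axis=0)
    neighbors[sk] = {sq}
    neighbors[sq].add(sk)
    # Give s_k half of s_q's accumulated error.
    errors[sk] = errors[sq] / 2
    errors[sq] /= 2
    return sk, sq
```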

  9. Results • Initial network structure with 1 random connection per node (for 200 nodes)

  10. Results (cont.) • Structure resulting from 1 initial random connection.

  11. Results (cont.) • Connection equilibrium reached for 1 initial connection.

  12. Results (cont.) • Structure resulting from 16 initial random connections.

  13. Results (cont.) • Connection equilibrium for 16 initial connections.

  14. Video of Network Progression [Video: side-by-side progression of the hybrid SONG network and Fritzke’s GNG network.]

  15. Results (cont.) - 2-D Comparison with the SOM Network Salient features of the SOM algorithm: • The SOM network starts as a predefined grid and is adjusted over many iterations. • Connections are fixed; nodes are not inserted, moved, or relocated out of their preexisting grid. • Weight adjustments occur over the entire grid and are controlled by the weighted distance to the data point.
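
For contrast with GNG/SONG, the classical SOM step the slide compares against can be sketched as below: a fixed lattice, no structural changes, and a grid-distance-weighted update of every node. The Gaussian neighborhood and the learning-rate value are standard SOM choices assumed here, not taken from the slide.

```python
import numpy as np

def som_step(grid, x, lr=0.1, sigma=1.0):
    """One SOM update (sketch): every node in the fixed grid moves toward
    the input, weighted by a Gaussian of its grid distance to the winner.

    grid: (H, W, d) array of node weights on a fixed 2-D lattice
    """
    h, w, _ = grid.shape
    d = np.linalg.norm(grid - x, axis=2)
    bi, bj = np.unravel_index(np.argmin(d), (h, w))      # best-matching unit
    ii, jj = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    grid_dist2 = (ii - bi) ** 2 + (jj - bj) ** 2
    theta = np.exp(-grid_dist2 / (2 * sigma ** 2))       # neighborhood weight
    grid += lr * theta[..., None] * (x - grid)           # update whole grid
    return (int(bi), int(bj))
```

Note that, unlike SONG, no edge is ever created or removed: the lattice topology is the fixed grid itself.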

  16. Growing SONG Network • The number of nodes in SONG can be obtained automatically. • The SONG network starts with a few randomly placed nodes and builds itself up until an equilibrium is reached between the network size and the error. • A node is added every λ cycles if MaxError > AveError + Constant. • Equilibrium appears to be reached at ~200 nodes.
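
The growth criterion stated above is simple enough to write down directly. The λ and Constant values below are placeholders, not the values used in the paper.

```python
def should_add_node(errors, cycle, lam=100, constant=0.5):
    """Growth test from the slide: every lam cycles, add a node when the
    maximum node error exceeds the average error plus a constant margin
    (lam and constant values here are placeholders, not the paper's)."""
    if cycle % lam != 0:
        return False
    avg = sum(errors) / len(errors)
    return max(errors) > avg + constant
```

Growth therefore stops on its own: once the error is spread evenly enough that no node exceeds the average by the margin, the network size has reached its equilibrium.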

  17. Growing SONG Network (cont.) • Error handling in the growing SONG network was modified. • The error is “reset” and recomputed after equilibrium is reached. • The network continues to learn, reaching a new equilibrium. • Approximation accuracy varies from run to run.

  18. Growing SONG Network (cont.) • The results of a growing SONG network run (on the right) compared to the simpler static approach (on the left).

  19. Other Applications - Sparsely Connected Hierarchical Sensory Network • The major features of the SONG algorithm, such as the weight adjustment, error calculation, and neighborhood selection, are utilized in building self-organizing, sparsely connected hierarchical networks. • The sparse hierarchical network is locally connected based on the neurons’ firing correlations. • Feedback and time-based correlation are used for invariant object recognition.
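
The correlation-based local wiring mentioned above can be illustrated with a toy sketch. The slide gives no formula, so both the use of plain Pearson correlation and the 0.8 threshold are assumptions; a real implementation would also restrict candidate pairs to a local wiring area.

```python
import numpy as np

def correlation_wiring(activity, threshold=0.8):
    """Sketch of correlation-based wiring: connect neuron pairs whose
    firing traces correlate above a threshold (threshold value and plain
    Pearson correlation are assumptions; the slide gives no formula).

    activity: (N, T) array, one firing trace per neuron
    """
    corr = np.corrcoef(activity)
    n = activity.shape[0]
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if corr[i, j] > threshold}
```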

  20. Other Applications - Sparsely Connected Hierarchical Sensory Network (cont.) [Figure: wiring area and a correlation/PDF example.]

  21. Other Applications - Sparsely Connected Hierarchical Network (cont.) • Correlation-based wiring • Declining neurons’ activations • Sparse hierarchical representations

  22. Conclusions • The SONG algorithm is more biologically plausible than Fritzke’s GNG algorithm. Specifically: • Weight and error adjustments are not parameter-based. • Connections become stiffer with age rather than being removed at a maximum age, as in Fritzke’s method. • The network has all of its neurons from the beginning. • SONG approximates the data distribution faster than the other methods tested. • Connectivity between neurons is obtained automatically and depends on the parameter that controls edge removal and on the network size. • The number of neurons can be obtained automatically in growing SONG to achieve the desired accuracy.

  23. Future Work • Adapt the SONG algorithm to large input spaces (high dimensionality, e.g., images). • Adapt the SONG algorithm to a hierarchical network. • Possible applications in feature extraction, representation building, and shape recognition. • Insert new nodes as needed to reduce error. • Optimize the network design.

  24. Questions ? starzyk@bobcat.ent.ohiou.edu
