
Sparse Coding in Sparse Winner Networks

Sparse Coding in Sparse Winner Networks. ISNN 2007: The 4th International Symposium on Neural Networks. Janusz A. Starzyk¹, Yinyin Liu¹, David Vogel². ¹School of Electrical Engineering & Computer Science, Ohio University, USA. ²Ross University School of Medicine.





Presentation Transcript


  1. Sparse Coding in Sparse Winner Networks. ISNN 2007: The 4th International Symposium on Neural Networks. Janusz A. Starzyk¹, Yinyin Liu¹, David Vogel². ¹School of Electrical Engineering & Computer Science, Ohio University, USA. ²Ross University School of Medicine, Commonwealth of Dominica.

  2. Outline [Figure: labeled cortical areas of the human brain] • Sparse Coding • Sparse Structure • Sparse winner network with winner-take-all (WTA) mechanism • Sparse winner network with oligarchy-take-all (OTA) mechanism • Experimental results • Conclusions

  3. Sparse Coding [Figures: Richard Axel, 1995; Kandel Fig. 30-1; Kandel Fig. 23-5, body map with hip, trunk, arm, hand, foot, face, tongue, larynx] • How do we take in sensory information and make sense of it?

  4. Sparse Coding • Neurons become active to represent objects and concepts, producing a sparse neural representation: "sparse coding" • Metabolic demands of the human sensory system and brain • Statistical properties of the environment: not every single bit of information matters (http://gandalf.psych.umn.edu/~kersten/kersten-lab/CompNeuro2002/) • "Grandmother cell" (J.V. Lettvin): only one neuron on the top level represents and recognizes an object (extreme case) • A small group of neurons on the top level represents an object (C. Connor, "Friends and grandmothers", Nature, Vol. 435, June 2005)

  5. Sparse Structure • The roughly 10^12 neurons in the human brain are sparsely connected • On average, each neuron is connected to other neurons through about 10^4 synapses, i.e. to only about 10^4 / 10^12 = 10^-8 of all neurons • A sparse structure enables efficient computation and saves energy and cost

  6. Sparse Coding in Sparse Structure [Figure: hierarchy of levels above the sensory input, with the adaptability of connections increasing toward the top] • Cortical learning: unsupervised learning • Finding the sensory input activation pathway • Competition is needed: find the neurons with stronger activities and suppress the ones with weaker activities • Winner-take-all (WTA): a single winning neuron • Oligarchy-take-all (OTA): a group of strongly active neurons as winners (the two rules are contrasted in the sketch below)
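
To make the two competition rules concrete, here is a minimal Python sketch (not from the paper) contrasting WTA and OTA selection on a vector of neuron activations; the `fraction` criterion used for OTA membership is an illustrative assumption:

```python
import numpy as np

def winner_take_all(activations):
    """Return the index of the single strongest neuron."""
    return int(np.argmax(activations))

def oligarchy_take_all(activations, fraction=0.8):
    """Return the indices of all neurons whose activity exceeds a
    fraction of the maximum activity (an illustrative criterion)."""
    threshold = fraction * np.max(activations)
    return np.flatnonzero(activations >= threshold).tolist()

acts = np.array([0.10, 0.90, 0.85, 0.20, 0.88])
print(winner_take_all(acts))     # -> 1 (a single winner)
print(oligarchy_take_all(acts))  # -> [1, 2, 4] (a small group of winners)
```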

  7. Outline [Figure: labeled cortical areas of the human brain] • Sparse Coding • Sparse Structure • Sparse winner network with winner-take-all (WTA) mechanism • Sparse winner network with oligarchy-take-all (OTA) mechanism • Experimental results • Conclusions

  8. Sparse winner network with winner-take-all (WTA) [Figure: R-net with a primary layer and a secondary layer] • Local network model of cognition: R-net • Primary layer and secondary layer • Random sparse connections • Intended for associative memories, not for feature extraction • Not a hierarchical structure (David Vogel, "A neural network model of memory and higher cognitive functions in the cerebrum")

  9. Sparse winner network with winner-take-all (WTA) [Figure: hierarchical network with primary level h, secondary level s, and primary level h+1; the overall number of neurons increases toward the top; the input pattern enters at the bottom and the winner emerges at the top] • Hierarchical learning network: • Secondary neurons provide "full connectivity" within the sparse structure (illustrated in the sketch below) • More secondary levels increase the sparsity • Primary levels and secondary levels • Finding neuronal representations: • Find the global winner, the neuron with the strongest signal strength • For a large number of neurons this is very time-consuming
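
The role of the secondary level can be illustrated with a small toy example in Python. The layer sizes, the fan-in of 8, and the random wiring below are illustrative assumptions rather than the paper's construction; the point is that routing sparse links through a secondary level raises the effective connectivity between the two primary levels well above the direct fan-in:

```python
import numpy as np

rng = np.random.default_rng(0)
n_h, n_s, n_h1 = 64, 256, 64   # primary level h, secondary level s, primary level h+1
fan_in = 8                     # random links per post-synaptic neuron (illustrative)

def random_sparse(pre, post, fan_in, rng):
    """0/1 connection matrix: each post-synaptic neuron picks
    `fan_in` random pre-synaptic neurons."""
    m = np.zeros((pre, post), dtype=int)
    for j in range(post):
        m[rng.choice(pre, size=fan_in, replace=False), j] = 1
    return m

W1 = random_sparse(n_h, n_s, fan_in, rng)    # level h -> secondary level s
W2 = random_sparse(n_s, n_h1, fan_in, rng)   # secondary level s -> level h+1

# Effective connectivity between the two primary levels when links are
# routed through the secondary level: nonzero entries of W1 @ W2.
effective = (W1 @ W2) > 0
print("direct fan-in per neuron:", fan_in)
print("fraction of (h, h+1) pairs reachable via level s:", round(effective.mean(), 3))
```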

  10. Sparse winner network with winner-take-all (WTA) [Figure: network with primary level h, secondary levels s1 and s2, and primary level h+1; the global winner at the top and the input pattern at the bottom] • Finding the global winner using localized WTA: • Data transmission: feed-forward computation • Winner-tree finding: local competition and feedback • Winner selection: feed-forward computation and weight adjustment

  11. Sparse winner network with winner-take-all (WTA) [Figure: transfer function plotting output against input with an activation threshold; the input pattern enters at the bottom of the network] • Data transmission: feed-forward computation • Signal calculation • Transfer function with an activation threshold (a sketch follows below)
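
A minimal sketch of the feed-forward signal calculation, assuming a simple threshold transfer function and column-normalized weights; the paper's exact transfer function and normalization may differ:

```python
import numpy as np

def transfer(x, threshold=0.5):
    """Illustrative transfer function: the output passes the activation
    only above the activation threshold (an assumption of this sketch)."""
    return np.where(x > threshold, x, 0.0)

def feed_forward(signal_in, weights, threshold=0.5):
    """Signal calculation for one level: a weighted sum of the
    pre-synaptic signals followed by the transfer function."""
    activation = weights.T @ signal_in
    return transfer(activation, threshold)

rng = np.random.default_rng(1)
pattern = rng.random(64)                 # an 8 x 8 input pattern, flattened
W = rng.random((64, 256))                # level h -> level h+1 weights
out = feed_forward(pattern, W / W.sum(axis=0), threshold=0.5)
print(out.shape)                         # (256,)
```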

  12. Sparse winner network with winner-take-all (WTA) • Winner-tree finding: local competition and feedback • Local competition: current-mode WTA circuit (the signal is represented as a current) • Local competitions on the network: within a local neighborhood, the set of post-synaptic neurons of a level-h neuron competes, and the local competition selects a local winner • Branches to the losing neurons are logically cut off • Example: N4(level+1) is the winner among the post-synaptic neurons {4, 5, 6, 7, 8} of N4(level), so the branch l2 from N4(level) to N4(level+1) is kept while branches l1 and l3 are cut off [Figure: two adjacent levels with numbered neurons showing the local neighborhood, the retained branch l2, and the cut-off branches l1 and l3]
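
A small software sketch of the local competition (the paper proposes a current-mode WTA circuit; this is only a functional approximation). Given the signal strengths on the next level and a 0/1 connection matrix, each pre-synaptic neuron keeps only the branch to its strongest post-synaptic neighbor, and the remaining branches are logically cut off:

```python
import numpy as np

def local_competition(signal_next, connections):
    """For every pre-synaptic neuron, keep only the branch to its
    strongest post-synaptic neuron (the local winner); all other
    branches are logically cut off."""
    kept = np.zeros_like(connections)
    for i in range(connections.shape[0]):        # pre-synaptic neurons
        post = np.flatnonzero(connections[i])    # its post-synaptic set
        if post.size:
            winner = post[np.argmax(signal_next[post])]
            kept[i, winner] = 1                  # retain the winning branch
    return kept

rng = np.random.default_rng(2)
C = (rng.random((6, 4)) < 0.5).astype(int)   # sparse 0/1 connections
s = rng.random(4)                            # signal strengths on the next level
print(local_competition(s, C))
```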

  13. Sparse winner network with winner-take-all (WTA) [Figure: winner tree across the network levels; legend: input neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron] • The winner network is found: all the neurons directly or indirectly connected with the global winner neuron form the winner tree

  14. Sparse winner network with winner-take-all (WTA) • Winner selection: feed-forward computation and weight adjustment • Signals are recalculated through the logically connected links • Weights are adjusted using the concept of Hebbian learning (a sketch follows below)
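
A hedged sketch of the weight adjustment on the winner network, using a generic Hebbian rule; the learning rate and the column normalization are assumptions of this sketch, not the paper's exact update:

```python
import numpy as np

def hebbian_update(weights, pre, post, lr=0.1):
    """Hebbian-style adjustment on the links of the winner network:
    weights grow with correlated pre- and post-synaptic activity and
    are then column-normalized to stay bounded (normalization and
    learning rate are assumptions of this sketch)."""
    weights = weights + lr * np.outer(pre, post)
    return weights / np.linalg.norm(weights, axis=0, keepdims=True)

rng = np.random.default_rng(3)
W = rng.random((8, 3))
pre = rng.random(8)    # recalculated signals on the lower level
post = rng.random(3)   # signals of the winning neurons on the upper level
W = hebbian_update(W, pre, post)
print(np.linalg.norm(W, axis=0))   # each column now has unit norm
```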

  15. Sparse winner network with winner-take-all (WTA) [Figure: number of global winners versus number of input links] • The number of global winners found is typically 1 when there are sufficient input links • 64-256-1028-4096 network • A single global winner is found with over 8 connections

  16. Outline [Figure: labeled cortical areas of the human brain] • Sparse Coding • Sparse Structure • Sparse winner network with winner-take-all (WTA) mechanism • Sparse winner network with oligarchy-take-all (OTA) mechanism • Experimental results • Conclusions

  17. Sparse winner network with oligarchy-take-all (OTA) [Figure: network activity under OTA; legend: active neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron] • Signals propagate through the network layer by layer • Local competition is performed after each layer is reached • Local WTA • Multiple local winner neurons on each level • Multiple winner neurons on the top level: oligarchy-take-all • The oligarchy represents the sensory input • Provides coding redundancy • More reliable than WTA (a sketch follows below)
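
A minimal sketch of one OTA level, assuming fixed local neighborhoods of four neurons in which a local WTA keeps the strongest unit, so that several winners remain active and the set reaching the top level forms the oligarchy; the neighborhood grouping and sizes are illustrative assumptions:

```python
import numpy as np

def ota_layer(signal_in, weights, group_size=4, keep=1):
    """One OTA level: after the feed-forward computation, a local WTA
    runs inside fixed neighborhoods of `group_size` neurons and keeps
    the `keep` strongest in each, so several winners stay active."""
    activation = weights.T @ signal_in
    output = np.zeros_like(activation)
    for start in range(0, activation.size, group_size):
        idx = np.arange(start, min(start + group_size, activation.size))
        winners = idx[np.argsort(activation[idx])[-keep:]]
        output[winners] = activation[winners]
    return output

rng = np.random.default_rng(4)
x = rng.random(64)             # a 64-bit input pattern
W = rng.random((64, 16))
top = ota_layer(x, W)
print("active neurons (the oligarchy):", np.flatnonzero(top))
```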

  18. Outline [Figure: labeled cortical areas of the human brain] • Sparse Coding • Sparse Structure • Sparse winner network with winner-take-all (WTA) • Sparse winner network with oligarchy-take-all (OTA) • Experimental results • Conclusions

  19. Experimental Results • WTA scheme in the sparse network [Figure: original image; input size: 8 x 8]

  20. Experimental Results • OTA scheme in the sparse network, 64-bit input • On average, 28.3 active neurons represent an object • The number varies from 26 to 34 neurons

  21. Experimental Results [Figure: recognition accuracy of WTA and OTA compared with the accuracy level of random recognition] • OTA has better fault tolerance than WTA

  22. Conclusions & Future work • Sparse coding is built in sparsely connected networks • WTA scheme: local competitions accomplish the global competition using primary and secondary layers, allowing efficient hardware implementation • OTA scheme: local competition reduces neuronal activity • OTA provides redundant coding: more reliable and robust • WTA & OTA: a learning memory for developing machine intelligence • Future work: • Introducing temporal sequence learning • Building a motor pathway on such a learning memory • Combining with a goal-creation pathway to build an intelligent machine
