
1 Sparse Coding in Sparse Winner Networks
Janusz A. Starzyk 1, Yinyin Liu 1, David Vogel 2
1 School of Electrical Engineering & Computer Science, Ohio University, USA
2 Ross University School of Medicine, Commonwealth of Dominica
ISNN 2007: The 4th International Symposium on Neural Networks

2 Outline
Sparse Coding
Sparse Structure
Sparse winner network with winner-take-all (WTA) mechanism
Sparse winner network with oligarchy-take-all (OTA) mechanism
Experimental results
Conclusions
[Figure: brain diagram labeling Broca's area, pars opercularis, motor cortex, somatosensory cortex, sensory associative cortex, primary auditory cortex, Wernicke's area, visual associative cortex, and visual cortex]

3 Sparse Coding
How do we take in sensory information and make sense of it?
[Figures: Kandel Fig. 23-5 (Richard Axel, 1995) and Kandel Fig. 30-1, the somatosensory map (foot, hip, trunk, arm, hand, face, tongue, larynx)]

4 Sparse Coding
Neurons become active to represent objects and concepts
Metabolic demands of the human sensory system and brain
Statistical properties of the environment: not every single bit of information matters
"Grandmother cell" (J.V. Lettvin): only one neuron on the top level represents and recognizes an object (extreme case)
A small group of neurons on the top level represents an object
This produces a sparse neural representation, i.e. "sparse coding"
C. Connor, "Friends and grandmothers", Nature, Vol. 435, June 2005
http://gandalf.psych.umn.edu/~kersten/kersten-lab/CompNeuro2002/

5 Sparse Structure
The ~10^12 neurons in the human brain are sparsely connected
On average, each neuron connects to other neurons through about 10^4 synapses
A sparse structure enables efficient computation and saves energy and cost
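As a rough, back-of-the-envelope illustration (not taken from the slides), these two figures imply an extremely low connection density:

    # Rough arithmetic behind the sparsity figures above (illustrative values only)
    neurons = 1e12               # ~10^12 neurons in the human brain
    synapses_per_neuron = 1e4    # ~10^4 synapses per neuron on average

    # Fraction of all possible neuron-to-neuron connections that is actually realized
    connection_fraction = synapses_per_neuron / neurons
    print(f"Each neuron reaches about {connection_fraction:.0e} of all other neurons")  # ~1e-08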

6 Sparse Coding in Sparse Structure
Cortical learning: unsupervised learning
Finding the sensory input activation pathway
Competition is needed: find neurons with stronger activities and suppress those with weaker activities
Winner-take-all (WTA): a single winning neuron
Oligarchy-take-all (OTA): a group of strongly active neurons as winners

7 Outline
Sparse Coding
Sparse Structure
Sparse winner network with winner-take-all (WTA) mechanism
Sparse winner network with oligarchy-take-all (OTA) mechanism
Experimental results
Conclusions

8 Sparse winner network with winner-take-all (WTA)
Local network model of cognition: R-net
Primary layer and secondary layer
Random sparse connections
For associative memories, not for feature extraction
Not a hierarchical structure
[Figure: R-net with a primary layer and a secondary layer]
David Vogel, "A neural network model of memory and higher cognitive functions in the cerebrum"

9 Sparse winner network with winner-take-all (WTA)
Hierarchical learning network:
Primary levels and secondary levels
Secondary neurons provide "full connectivity" within the sparse structure
More secondary levels can increase the sparsity
Finding neuronal representations:
Find the global winner, the neuron with the strongest signal strength
For a large number of neurons, a global search is very time-consuming
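A minimal structural sketch in Python of such a sparse, hierarchically layered network, assuming illustrative layer sizes and a fixed random fan-in per neuron; the paper's exact primary/secondary wiring rule is not reproduced here:

    import numpy as np

    rng = np.random.default_rng(0)

    def sparse_weights(n_pre, n_post, fan_in, rng):
        """Random sparse weight matrix: each post-synaptic neuron receives
        connections from `fan_in` randomly chosen pre-synaptic neurons."""
        w = np.zeros((n_post, n_pre))
        for j in range(n_post):
            pre = rng.choice(n_pre, size=fan_in, replace=False)
            w[j, pre] = rng.random(fan_in)     # random initial connection weights
        return w

    # Illustrative layer sizes and fan-in (hypothetical, not the paper's exact topology)
    layer_sizes = [64, 256, 1024, 4096]
    weights = [sparse_weights(layer_sizes[i], layer_sizes[i + 1], fan_in=8, rng=rng)
               for i in range(len(layer_sizes) - 1)]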

10 Sparse winner network with winner-take-all (WTA)
Finding the global winner using localized WTA:
1. Data transmission: feed-forward computation
2. Winner tree finding: local competition and feedback
3. Winner selection: feed-forward computation and weight adjustment
[Figure: hierarchical network showing signals s1 at level h and s2 at level h+1, the input pattern at the bottom, and the global winner at the top]

11 Sparse winner network with winner-take-all (WTA)
Data transmission: feed-forward computation
Signal calculation: each neuron applies a transfer function with an activation threshold to its input
[Figure: transfer function (input vs. output) with the activation threshold marked, and the input pattern feeding the network]
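A minimal sketch of this feed-forward signal calculation in Python, assuming a simple saturating transfer function and an illustrative threshold value (neither is specified on the slide):

    import numpy as np

    def neuron_output(inputs, weights, threshold=0.5):
        """Weighted-sum activation passed through a thresholded, saturating
        transfer function (threshold value and tanh are illustrative choices)."""
        activation = float(np.dot(weights, inputs))
        if activation < threshold:      # below the activation threshold: neuron stays silent
            return 0.0
        return np.tanh(activation)      # saturating output for active neurons

    def feed_forward(signals, weight_matrix, threshold=0.5):
        """One level of the feed-forward computation: every post-synaptic neuron
        computes its output signal from the signals of the level below."""
        return np.array([neuron_output(signals, w, threshold) for w in weight_matrix])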

12 Sparse winner network with winner-take-all (WTA)
Winner tree finding: local competition and feedback
Local competition: current-mode WTA circuit (the signal is carried as a current)
Local competitions across the network
Local neighborhood: the set of post-synaptic neurons of a neuron on one level, or the set of pre-synaptic neurons of a neuron on the next level
Local competition selects a local winner; the losing branches are logically cut off
[Figure: example in which neuron N4 on level+1 is the winner among its pre-synaptic neurons 4-8; the losing links (l1, l3) are cut off and the signal on the remaining link reaches the local winner]
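A minimal sketch of the localized competition, assuming each neighborhood is given as an index array of competing neurons; the signal values and neighborhood below are illustrative, not the slide's:

    import numpy as np

    def local_wta(signals, neighborhoods):
        """Run one local competition per neighborhood and return a boolean mask
        of the neurons that remain active; losers are 'logically cut off'."""
        active = np.zeros(len(signals), dtype=bool)
        for hood in neighborhoods:                   # e.g. the post-synaptic set of one neuron
            winner = hood[np.argmax(signals[hood])]  # local winner: strongest signal in the hood
            active[winner] = True                    # all other neurons in the hood stay cut off
        return active

    # Illustrative example: neurons 4..8 compete in one local neighborhood
    signals = np.array([0.1, 0.3, 0.2, 0.4, 0.9, 0.5, 0.2, 0.6, 0.3])
    print(local_wta(signals, [np.array([4, 5, 6, 7, 8])]))  # only neuron 4 survives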

13 Sparse winner network with winner-take-all (WTA)
The winner network is found: all the neurons directly or indirectly connected to the global winner neuron
[Figure: winner tree rooted at the global winner; legend: input neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron]

14 Sparse winner network with winner-take-all (WTA)
Winner selection: feed-forward computation and weight adjustment
Signals are recalculated through the logically connected links
Weights are adjusted using the concept of Hebbian learning
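A minimal sketch of the weight adjustment, assuming a basic Hebbian rule with an illustrative learning rate and a boolean mask marking the logically connected links:

    import numpy as np

    def hebbian_update(weights, pre_signals, post_signals, active_links, lr=0.01):
        """Increase w_ij in proportion to the product of pre- and post-synaptic
        activity, restricted to the logically connected links of the winner tree.
        `active_links` is a boolean mask with the same shape as `weights`."""
        delta = lr * np.outer(post_signals, pre_signals)   # Hebbian correlation term
        weights = weights.copy()
        weights[active_links] += delta[active_links]
        return weights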

15 Sparse winner network with winner-take-all (WTA)
The number of global winners found is typically 1 when there are sufficient input links
In a 64-256-1028-4096 network, a single global winner is found once neurons have more than 8 connections

16 Outline
Sparse Coding
Sparse Structure
Sparse winner network with winner-take-all (WTA) mechanism
Sparse winner network with oligarchy-take-all (OTA) mechanism
Experimental results
Conclusions

17 Sparse winner network with oligarchy-take-all (OTA)
The signal propagates layer by layer; local competition (local WTA) is performed as each layer is reached
Multiple local winner neurons on each level
Multiple winner neurons on the top level: oligarchy-take-all
The oligarchy represents the sensory input
Provides coding redundancy, making it more reliable than WTA
[Figure: OTA network; legend: active neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron]
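A minimal sketch of the OTA propagation, assuming one local winner per neighborhood on each level; the partitioning into neighborhoods and the weight matrices are left abstract:

    import numpy as np

    def ota_layer(signals, neighborhoods):
        """Keep one winner per local neighborhood, so several neurons stay active
        on the same level; losers are silenced."""
        winners = np.zeros_like(signals, dtype=float)
        for hood in neighborhoods:
            w = hood[np.argmax(signals[hood])]
            winners[w] = signals[w]
        return winners

    def ota_network(x, weight_matrices, neighborhoods_per_level):
        """Propagate layer by layer with local competitions; the active neurons
        left on the top level form the oligarchy that represents the input."""
        s = np.asarray(x, dtype=float)
        for W, hoods in zip(weight_matrices, neighborhoods_per_level):
            s = ota_layer(W @ s, hoods)
        return np.flatnonzero(s)        # indices of the oligarchy neurons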

18 Outline
Sparse Coding
Sparse Structure
Sparse winner network with winner-take-all (WTA)
Sparse winner network with oligarchy-take-all (OTA)
Experimental results
Conclusions

19 Experimental Results
WTA scheme in the sparse network
Input size: 8 x 8 original image

20 Experimental Results
OTA scheme in the sparse network
64-bit input
On average, 28.3 active neurons represent an object, varying from 26 to 34 neurons

21 Experimental Results
OTA has better fault tolerance than WTA
[Figure: recognition accuracy of OTA and WTA compared with the accuracy level of random recognition]

22 Conclusions & Future Work
Sparse coding is built in sparsely connected networks
WTA scheme: local competitions accomplish the global competition using primary and secondary layers, allowing efficient hardware implementation
OTA scheme: local competition produces a reduction of neuronal activity
OTA provides redundant coding: more reliable and robust
WTA & OTA: a learning memory for developing machine intelligence
Future work:
Introducing temporal sequence learning
Building a motor pathway on such a learning memory
Combining with a goal-creation pathway to build an intelligent machine

