Sparse Coding in Sparse Winner Networks. Janusz A. Starzyk 1, Yinyin Liu 1, David Vogel 2. 1 School of Electrical Engineering & Computer Science, Ohio University, USA. 2 Ross University School of Medicine, Commonwealth of Dominica. ISNN 2007: The 4th International Symposium on Neural Networks.

2 Outline: Sparse coding; Sparse structure; Sparse winner network with winner-take-all (WTA) mechanism; Sparse winner network with oligarchy-take-all (OTA) mechanism; Experimental results; Conclusions. (Background figure: labeled cortical areas.)

3 Sparse Coding. How do we take in sensory information and make sense of it? (Figures: Richard Axel, 1995; cortical map labels, foot, hip, trunk, arm, hand, face, tongue, larynx, Kandel Fig. 30-1.)

4 Sparse Coding. Neurons become active to represent objects and concepts (C. Connor, "Friends and grandmothers", Nature, Vol. 435, June 2005). Motivations: the metabolic demands of the human sensory system and brain, and the statistical properties of the environment (not every single bit of information matters). The "grandmother cell" of J. V. Lettvin, a single top-level neuron representing and recognizing an object, is the extreme case. Instead, a small group of neurons on the top level represents an object, producing a sparse neural representation: "sparse coding".

5 Sparse Structure. Neurons in the human brain are sparsely connected: on average, each neuron is connected to other neurons through about 10^4 synapses. A sparse structure enables efficient computation and saves energy and cost.
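To make the storage argument concrete, the sketch below (illustrative only; the network size and fan-in are made-up, scaled-down numbers, not the brain's) builds a random sparse connectivity matrix and compares its memory footprint with a dense one.

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
n_neurons = 5000   # hypothetical network size, far smaller than a real brain
fan_in = 50        # connections per neuron, scaled down from ~10^4

# Each neuron receives 'fan_in' randomly chosen inputs with random weights.
rows = np.repeat(np.arange(n_neurons), fan_in)
cols = rng.integers(0, n_neurons, size=n_neurons * fan_in)
vals = rng.random(n_neurons * fan_in)
W = sparse.csr_matrix((vals, (rows, cols)), shape=(n_neurons, n_neurons))

dense_mb = n_neurons * n_neurons * 8 / 1e6                          # float64 dense matrix
sparse_mb = (W.data.nbytes + W.indices.nbytes + W.indptr.nbytes) / 1e6
print(f"dense: {dense_mb:.0f} MB, sparse: {sparse_mb:.1f} MB")
```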

6 Sparse Coding in Sparse Structure. Cortical learning is unsupervised learning: finding the activation pathway of the sensory input. Competition is needed: find the neurons with stronger activities and suppress the ones with weaker activities. Winner-take-all (WTA): a single winner neuron. Oligarchy-take-all (OTA): a group of neurons with strong activities as winners.
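For illustration, a minimal sketch of the two selection rules (the activation values and the OTA rule used here, keeping every neuron above a fraction of the maximum, are assumptions rather than the paper's exact mechanism):

```python
import numpy as np

def winner_take_all(activations):
    """WTA: return the index of the single strongest activation."""
    return int(np.argmax(activations))

def oligarchy_take_all(activations, fraction=0.8):
    """OTA (sketch): return every neuron whose activation exceeds a
    fraction of the maximum -- a small group of strong winners."""
    threshold = fraction * np.max(activations)
    return np.flatnonzero(activations >= threshold)

acts = np.array([0.10, 0.70, 0.65, 0.20, 0.68])
print(winner_take_all(acts))      # 1: a single winner neuron
print(oligarchy_take_all(acts))   # [1 2 4]: an oligarchy of strong neurons
```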

7 Outline: Sparse coding; Sparse structure; Sparse winner network with winner-take-all (WTA) mechanism; Sparse winner network with oligarchy-take-all (OTA) mechanism; Experimental results; Conclusions.

8 Sparse winner network with winner-take-all (WTA). A local network model of cognition: the R-net (David Vogel, "A neural network model of memory and higher cognitive functions in the cerebrum"). It has a primary layer and a secondary layer with random sparse connections; it is intended for associative memories, not for feature extraction, and it is not a hierarchical structure. (Figure: primary and secondary layers.)

9 Sparse winner network with winner-take-all (WTA). Hierarchical learning network with primary levels and secondary levels: secondary neurons are used to provide "full connectivity" in a sparse structure, and more secondary levels increase the sparsity. Finding neuronal representations means finding the global winner, the neuron with the strongest signal strength; for a large number of neurons, searching globally is very time-consuming.

10 Sparse winner network with winner-take-all (WTA). Finding the global winner using localized WTA proceeds in three steps: (1) data transmission: feed-forward computation; (2) winner tree finding: local competition and feedback; (3) winner selection: feed-forward computation and weight adjustment. (Figure: hierarchical network with the input pattern at the bottom and the global winner at the top; levels labeled h, s1, h+1, s2.)

11 Sparse winner network with winner-take-all (WTA). Data transmission: feed-forward computation. Signals are calculated through a transfer function with an activation threshold. (Figure: transfer function, input vs. output, with activation threshold; input pattern at the bottom of the network.)
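A minimal sketch of this feed-forward computation (the piecewise-linear form of the transfer function, the threshold value, and the level sizes are assumptions; the slide only shows the transfer function graphically):

```python
import numpy as np

def transfer(net_input, threshold=0.5):
    """Assumed transfer function: zero below the activation threshold,
    linear above it."""
    return np.where(net_input > threshold, net_input - threshold, 0.0)

def feed_forward(pattern, weight_matrices):
    """Propagate an input pattern upward level by level; each entry of
    'weight_matrices' is a (fan_out x fan_in) array for one level."""
    signal = pattern
    for W in weight_matrices:
        signal = transfer(W @ signal)
    return signal

rng = np.random.default_rng(1)
pattern = rng.random(8)                               # input pattern
weights = [rng.random((6, 8)), rng.random((4, 6))]    # made-up level sizes
print(feed_forward(pattern, weights))
```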

12 Sparse winner network with winner-take-all (WTA). Winner tree finding: local competition and feedback. Local competitions run across the network within local neighborhoods; each local competition yields a local winner, and the losing branches are logically cut off. The competition can be implemented with a current-mode WTA circuit (the signal is a current). (Figure: a local neighborhood formed by the set of post-synaptic neurons of N_4^level and the set of pre-synaptic neurons of N_4^(level+1); N_4^(level+1) is the winner among neurons 4, 5, 6, 7, 8, so the signal on N_4^level goes to N_4^(level+1): the local winner link l2 is kept while l1 and l3 are cut off.)
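A small sketch of one such local competition (illustrative; the neighborhood is given simply as a list of candidate links, and the logical cut-off is modeled by a keep-mask over the links):

```python
import numpy as np

def local_wta(signals, candidate_links):
    """Within one local neighborhood, keep only the candidate link carrying
    the strongest signal; the losing branches are logically cut off."""
    candidate_links = np.asarray(candidate_links)
    winner = candidate_links[np.argmax(signals[candidate_links])]
    keep = np.zeros_like(signals, dtype=bool)
    keep[winner] = True      # only the local winner stays logically connected
    return winner, keep

signals = np.array([0.2, 0.9, 0.4, 0.7, 0.1])
winner, keep = local_wta(signals, candidate_links=[1, 3, 4])
print(winner, keep)          # winner is link 1; links 3 and 4 are cut off
```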

13 Sparse winner network with winner-take-all (WTA). The winner network (winner tree) is found: all the neurons directly or indirectly connected with the global winner neuron. (Figure: winner tree; legend: input neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron.)
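The winner tree itself can be collected by a simple traversal from the global winner over the links that survived the local competitions (a sketch; the adjacency-list representation is an assumption):

```python
from collections import deque

def winner_tree(global_winner, kept_links):
    """Collect all neurons directly or indirectly connected to the global
    winner through the links kept after local competition.
    'kept_links' maps each neuron id to the ids it remains connected to."""
    tree, frontier = {global_winner}, deque([global_winner])
    while frontier:
        n = frontier.popleft()
        for m in kept_links.get(n, []):
            if m not in tree:
                tree.add(m)
                frontier.append(m)
    return tree

# Toy example: neuron 0 is the global winner; neurons 5 and 6 are disconnected.
links = {0: [1, 2], 1: [3], 2: [], 5: [6]}
print(winner_tree(0, links))   # {0, 1, 2, 3}
```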

14 Sparse winner network with winner-take-all (WTA). Winner selection: feed-forward computation and weight adjustment. Signals are recalculated through the logically connected links, and weights are adjusted using the concept of Hebbian learning. With a sufficient number of links, the number of global winners found is typically 1.
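A minimal sketch of the Hebbian weight adjustment restricted to the logically connected links (the learning rate, the outer-product form, and the row normalization are assumed details, not taken from the paper):

```python
import numpy as np

def hebbian_update(W, pre, post, winning_links, lr=0.1):
    """Strengthen weights between co-active pre- and post-synaptic neurons,
    but only on the logically connected (winning) links."""
    W = W + lr * np.outer(post, pre) * winning_links
    # Row normalization keeps the weights bounded (an assumed choice).
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W / np.maximum(norms, 1e-12)

rng = np.random.default_rng(2)
W = rng.random((3, 4))                 # weights from 4 pre- to 3 post-synaptic neurons
pre = rng.random(4)                    # pre-synaptic activities
post = rng.random(3)                   # post-synaptic activities
winning = rng.random((3, 4)) > 0.5     # which links survived the local competitions
print(hebbian_update(W, pre, post, winning))
```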

15 Sparse winner network with winner-take-all (WTA). (Figure: number of global winners found vs. number of input links; with sufficient input links the number of global winners found is typically 1, e.g., a single global winner with over 8 connections.)

16 Outline: Sparse coding; Sparse structure; Sparse winner network with winner-take-all (WTA) mechanism; Sparse winner network with oligarchy-take-all (OTA) mechanism; Experimental results; Conclusions.

17 Sparse winner network with oligarchy-take-all (OTA). The signal propagates layer by layer, and a local competition (local WTA) is performed after each layer is reached. This leaves multiple local winner neurons on each level and multiple winner neurons on the top level: oligarchy-take-all. The oligarchy represents the sensory input and provides coding redundancy, making it more reliable than WTA. (Figure legend: active neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron.)
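Putting the pieces together, an OTA forward pass might look like the sketch below (the layer sizes, the threshold, and the fraction-of-maximum rule for keeping multiple local winners are all assumptions for illustration):

```python
import numpy as np

def ota_forward(pattern, weight_matrices, fraction=0.8, threshold=0.0):
    """Propagate layer by layer; after each layer a local competition keeps
    every neuron above a fraction of that layer's maximum (the 'oligarchy')."""
    signal = pattern
    for W in weight_matrices:
        signal = np.maximum(W @ signal - threshold, 0.0)
        cutoff = fraction * signal.max()
        signal = np.where(signal >= cutoff, signal, 0.0)  # multiple local winners
    return np.flatnonzero(signal)        # indices of the top-level oligarchy

rng = np.random.default_rng(3)
pattern = rng.random(64)                                  # e.g. an 8 x 8 input
weights = [rng.random((32, 64)), rng.random((16, 32))]    # made-up layer sizes
print(ota_forward(pattern, weights))
```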

18 Outline: Sparse coding; Sparse structure; Sparse winner network with winner-take-all (WTA); Sparse winner network with oligarchy-take-all (OTA); Experimental results; Conclusions.

19 Experimental Results: WTA scheme in the sparse network. Input size: 8 x 8 original image.

20 Experimental Results: OTA scheme in the sparse network. 64-bit input. On average, 28.3 neurons are active to represent an object, varying from 26 to 34 neurons.

21 Experimental Results. OTA has better fault tolerance than WTA. (Figure: recognition accuracy of WTA and OTA, with the accuracy level of random recognition shown as a baseline.)

22 Conclusions & Future Work. Sparse coding is built in sparsely connected networks. WTA scheme: local competitions accomplish the global competition using primary and secondary layers, allowing efficient hardware implementation. OTA scheme: local competitions reduce neuronal activity; its redundant coding is more reliable and robust. WTA and OTA provide a learning memory for developing machine intelligence. Future work: introducing temporal sequence learning, building a motor pathway on such a learning memory, and combining it with a goal-creation pathway to build an intelligent machine.