Biological Modeling of Neural Networks Week 5 NETWORKS of NEURONS and ASSOCIATIVE MEMORY Wulfram Gerstner EPFL, Lausanne, Switzerland 5.1 Introduction.


Biological Modeling of Neural Networks Week 5 NETWORKS of NEURONS and ASSOCIATIVE MEMORY Wulfram Gerstner EPFL, Lausanne, Switzerland 5.1 Introduction - networks of neuron - systems for computing - associative memory 5.2 Classification by similarity 5.3 Detour: Magnetic Materials 5.4 Hopfield Model 5.5 Learning of Associations 5.6 Storage Capacity Week 5: Networks of Neurons-Introduction

Systems for computing and information processing: Brain vs. Computer. Von Neumann architecture: CPU, memory, input; about 10^9 transistors, 1 CPU. Brain: distributed architecture, about 10^10 processing elements (neurons); no separation of processing and memory.

Systems for computing and information processing: within 1 mm^3 of cortex, on the order of 10^5 neurons and about 3 km of 'wire' (axons and dendrites).

Systems for computing and information processing: Brain: distributed architecture, about 10^10 neurons with about 10^4 connections per neuron; no separation of processing and memory; within 1 mm^3, on the order of 10^5 neurons and 3 km of wire.

Associations, Associative Memory Read this text NOW! I find it rea*l* amazin* t*at y*u ar* abl* to re*d t*is tex* despit* th* fac* *hat more t*an t*ent* perc*n* of t** char*cte*s a*e mis*ing. *his mean* t*at you* brai* i* abl* ** fill in missin* info*matio*.

Noisy word: pattern completion / word recognition. Input: brai*. List of words: atom, brave, brain, brass. Output the closest one. Your brain fills in missing information: 'associative memory'.
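The word-completion idea on this slide can be sketched in a few lines: pick the stored word that agrees with the corrupted input at the most known positions. The word list and scoring rule below are illustrative assumptions, not the lecture's definition.

```python
# Sketch: complete the noisy word "brai*" ('*' = unknown character)
# by choosing the vocabulary word with the most matching characters.

def closest_word(noisy, vocabulary):
    """Return the vocabulary word agreeing with `noisy` at the most known positions."""
    def score(word):
        return sum(a == b for a, b in zip(noisy, word) if a != '*')
    return max(vocabulary, key=score)

words = ["atom", "brave", "brain", "brass"]
print(closest_word("brai*", words))  # -> brain
```

This is exactly the "output the closest one" operation; the Hopfield network of Section 5.4 performs an analogous nearest-prototype computation in a distributed way.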


5.2 Classification by similarity: pattern recognition. Classification: comparison with prototypes (A, B, T, Z). A noisy image is compared with the prototype images.

5.2 Classification by similarity: pattern recognition. Prototypes; noisy image. Classification by the closest prototype. (Blackboard.)

5.2 Pattern recognition and pattern completion. Aim: understand associative memory. Associative memory / collective computation: brain-style computation. Noisy image -> full image; partial word -> full word.

Quiz 5.1: Connectivity. A typical neuron in the brain makes connections - to 6-20 neighbors - to neurons nearby - to more than 1000 neurons nearby - to more than 1000 neurons nearby or far away. In a typical crystal in nature, each atom interacts - with 6-20 neighbors - with atoms nearby - with more than 1000 atoms nearby - with more than 1000 atoms nearby or far away.


5.3 Detour: magnetism. [Bar magnet with S and N poles.]

5.3 Detour: magnetism. Noisy magnet vs. pure magnet.

5.3 Detour: magnetism. Elementary magnet: S_i = +1 or S_i = -1. Dynamics: S_i(t+1) = sgn( sum_j w_ij S_j(t) ), sum over all interactions with i. Blackboard: example.

5.3 Detour: magnetism. Elementary magnet: S_i = +1 or S_i = -1. Ferromagnet: w_ij = +1; anti-ferromagnet: w_ij = -1. Dynamics: S_i(t+1) = sgn( sum_j w_ij S_j(t) ), sum over all interactions with i. (Blackboard.)

5.3 Magnetism and memory patterns. Elementary pixel: S_i = +1 or S_i = -1. Weights: w_ij = +1 if pixels i and j are equal in the pattern, w_ij = -1 if they differ. Dynamics: S_i(t+1) = sgn( sum_j w_ij S_j(t) ), sum over all interactions with i. Hopfield model: several patterns -> next section.

Exercise 1: Associative memory (1 pattern). Elementary pixel: S_i = +1 or S_i = -1; w_ij = +1; 9 neurons. Dynamics: S_i(t+1) = sgn( sum_j w_ij S_j(t) ), sum over all interactions with i. - Define appropriate weights. - What happens if one neuron is wrong? - What happens if n neurons are wrong? Next lecture at 10h15.
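A minimal numerical sketch of this exercise, under the assumption of a single stored 3x3 pattern with weights w_ij = p_i p_j and no self-coupling (the checkerboard pattern below is an arbitrary choice):

```python
import numpy as np

# One stored pattern on a 3x3 grid (values +1/-1); weights w_ij = p_i * p_j,
# diagonal set to zero (no self-coupling).
p = np.array([1, -1, 1, -1, 1, -1, 1, -1, 1])   # a 3x3 "checkerboard"
W = np.outer(p, p)
np.fill_diagonal(W, 0)

def update(S, W):
    """Synchronous deterministic update S_i <- sgn(sum_j w_ij S_j)."""
    return np.where(W @ S >= 0, 1, -1)

S = p.copy()
S[0] = -S[0]                  # corrupt one neuron
S = update(S, W)
print(np.array_equal(S, p))   # one wrong pixel is corrected -> True
```

With n corrupted neurons the input to each unit is h_i = p_i (9 - 2n) - p_i S_i, so the pattern is recovered as long as fewer than half of the 9 neurons are wrong.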


5.4 Hopfield Model of Associative Memory. Prototypes p^1, p^2, ... Interactions: w_ij = (1/N) sum_mu p_i^mu p_j^mu, sum over all prototypes. Dynamics: S_i(t+1) = sgn( sum_j w_ij S_j(t) ), sum over all interactions with i.

5.4 Hopfield Model of Associative Memory. Random patterns, fully connected (DEMO). Interactions: w_ij = (1/N) sum_mu p_i^mu p_j^mu, sum over all prototypes. Dynamics: S_i(t+1) = sgn( sum_j w_ij S_j(t) ), all interactions with i. This rule is very good for random patterns; it does not work well for correlated patterns.

Overlap: m^mu = (1/N) sum_i p_i^mu S_i. (Blackboard.)

5.4 Hopfield Model of Associative Memory. The dynamics find the closest prototype, i.e., the one with maximal overlap (similarity). Interacting neurons perform the computation - without a CPU, without an explicit memory unit.
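The "finds the prototype with maximal overlap" claim can be checked numerically. The sketch below stores a few random prototypes with the Hebbian rule w_ij = (1/N) sum_mu p_i^mu p_j^mu, corrupts one of them, and runs asynchronous updates; network size, number of patterns, corruption level, and seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                                  # network size, number of prototypes
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N                  # w_ij = (1/N) sum_mu p_i^mu p_j^mu
np.fill_diagonal(W, 0)

def overlap(S):
    """Overlaps m^mu = (1/N) sum_i p_i^mu S_i with all stored prototypes."""
    return patterns @ S / N

S = patterns[2].copy()                         # start near prototype 2 ...
flip = rng.choice(N, size=N // 5, replace=False)
S[flip] *= -1                                  # ... with 20 percent of pixels flipped
for _ in range(3):                             # asynchronous update sweeps
    for i in range(N):
        S[i] = 1 if W[i] @ S >= 0 else -1
print(overlap(S)[2])                           # close to 1: prototype 2 retrieved
```

Since p/N = 0.025 here, the network is far below its capacity limit and retrieval is essentially perfect.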

Exercise 3 (now). Assume 4 patterns. At time t=0 the state has overlap with pattern 3 and no overlap with the other patterns. Discuss the temporal evolution of the overlaps (assume that the patterns are orthogonal). Dynamics: S_i(t+1) = sgn( sum_j w_ij S_j(t) ), sum over all interactions with i. Next lecture at 11h15.


5.5 Learning of Associations. Where do the connections come from? Hebbian learning: pre (j) -> post (i). "When an axon of cell j repeatedly or persistently takes part in firing cell i, then j's efficiency as one of the cells firing i is increased." (Hebb, 1949) - local rule - simultaneously active (correlations).

5.5 Hebbian Learning of Associations

item memorized 5.5 Hebbian Learning of Associations

item recalled Recall: Partial info 5.5 Hebbian Learning: Associative Recall

5.5 Associative Recall. Be as fast as possible: Tell me the object shape for the following list of 5 items. Tell me the color for the following list of 5 items.

5.5 Associative Recall. Tell me the color for the following list of 5 items, as fast as possible: Red Blue Yellow Green Red. Stroop effect: slow response - it is hard to work against natural associations.

Hierarchical organization of Associative memory animals birds fish Name as fast as possible an example of a bird swan (or goose or raven or …) Write down first letter: s for swan or r for raven … 5.5 Associative Recall

5.5 Associative Recall. Name as fast as possible an example of a: tool (hammer), color (red), fruit (apple), musical instrument (violin). [The same task in French: name as quickly as possible an example of a tool, a color, a fruit, a musical instrument.]

Week 5-6: Storage Capacity

5.6 Storage capacity: learning of prototypes. Q: How many prototypes can be stored? Interactions (1): w_ij = (1/N) sum_mu p_i^mu p_j^mu, sum over all prototypes. Dynamics (2): S_i(t+1) = sgn( sum_j w_ij S_j(t) ), all interactions with i.

Q: How many prototypes can be stored? Interactions (1), dynamics (2), random patterns. Minimal condition: each pattern is a fixed point of the dynamics. - Assume we start directly in one pattern. - The pattern stays. Attention: retrieval requires more (pattern completion). (Blackboard.)

Exercise 4 (now): Associative memory. Q: How many prototypes can be stored? Random patterns -> random walk. a) Show the relation to the erf function: importance of the ratio p/N. b) Network of 1000 neurons - allow at most 1 wrong pixel? c) Network of N neurons - at most 1 per mille (0.1 percent) wrong pixels? Interactions (1), dynamics (2). End of lecture; exercise + computer exercise: 12:00.
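The erf relation of part a) can be turned into a small numerical estimate. The sketch below uses the standard signal-to-noise argument (cross-talk from the other patterns treated as Gaussian), which gives a one-step flip probability of (1/2)[1 - erf(sqrt(N/2p))]; the stopping criterion "expected number of wrong pixels below 1" implements part b).

```python
import math

def p_error(N, p):
    """Probability that a single pixel is wrong after one update,
    for p random patterns in a network of N neurons
    (Gaussian cross-talk approximation -> erf)."""
    return 0.5 * (1.0 - math.erf(math.sqrt(N / (2.0 * p))))

# b) N = 1000: largest p such that the expected number of wrong
#    pixels, N * p_error, stays below 1.
N = 1000
p = 1
while N * p_error(N, p + 1) < 1.0:
    p += 1
print(p)   # roughly 100, i.e. a loading p/N of about 10 percent
```

Relaxing the criterion from "1 wrong pixel in 1000" toward "a small fixed fraction of wrong pixels" raises the admissible loading toward the classical capacity limit of the Hopfield model.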

Week 6 Review: storage capacity of the Hopfield model. Interactions (1): w_ij = (1/N) sum_mu p_i^mu p_j^mu. Dynamics (2): S_i(t+1) = sgn( sum_j w_ij S_j(t) ). Random patterns. Minimal condition: each pattern is a fixed point of the dynamics. - Assume we start directly in one pattern. - The pattern stays. Attention: retrieval requires more (pattern completion).

Q: How many prototypes can be stored? [Figure: retrieval quality drops from 1 to 0 as the number of stored patterns increases.]

Biological Modeling of Neural Networks Week 6 Hebbian LEARNING and ASSOCIATIVE MEMORY Wulfram Gerstner EPFL, Lausanne, Switzerland Week 6: Hopfield model continued 6.1 Stochastic Hopfield Model 6.2 Energy landscape 6.3 Low-activity patterns 6.4 Attractor memories - spiking neurons - experimental data

6.1 Review: Hopfield model. Prototypes p^1, p^2, ... Interactions (1): w_ij = (1/N) sum_mu p_i^mu p_j^mu, sum over all prototypes. Deterministic dynamics: S_i(t+1) = sgn( sum_j w_ij S_j(t) ).

6.1 Stochastic Hopfield model. Interactions (1) as before; random patterns. Stochastic dynamics (2): the sign function is replaced by a probabilistic update, Prob[ S_i(t+1) = +1 ] = g( beta h_i(t) ) with h_i = sum_j w_ij S_j and a sigmoidal g. (Blackboard.)

6.1 Stochastic Hopfield model: memory retrieval

6.1 Stochastic Hopfield model. Dynamics (2), stochastic. Assume that there is only overlap with pattern 17; then there are two groups of neurons: those that should be 'on' and those that should be 'off'. (Blackboard.)
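The stochastic update can be sketched concretely. The sigmoid below is the common Glauber-dynamics choice g(beta h) = 1/(1 + exp(-2 beta h)); this specific g, the sizes, the inverse temperature beta, and the seed are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, beta = 300, 3, 4.0                # sizes and inverse temperature (assumed)
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def glauber_sweep(S):
    """Asynchronous stochastic update: P(S_i = +1) = 1/(1 + exp(-2*beta*h_i))."""
    for i in rng.permutation(N):
        h = W[i] @ S
        S[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * h)) else -1
    return S

S = patterns[0].copy()                  # start in pattern 0
for _ in range(10):
    S = glauber_sweep(S)
m = patterns[0] @ S / N                 # overlap with pattern 0
print(m)                                # stays close to 1 at low noise
```

At large beta (low noise) the pattern remains a stable attractor despite occasional stochastic flips; lowering beta eventually destroys retrieval.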

6.1 Stochastic Hopfield model: memory retrieval


6.2 memory retrieval

6.2 Symmetric interactions: energy picture. E = -(1/2) sum_{i,j} w_ij S_i S_j.
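The key property of the energy picture is that asynchronous deterministic updates with symmetric weights never increase E. A minimal numerical check (one stored pattern, random initial state; sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100
p = rng.choice([-1, 1], size=N)
W = np.outer(p, p).astype(float) / N   # symmetric weights, one pattern
np.fill_diagonal(W, 0)

def energy(S):
    """E = -(1/2) sum_ij w_ij S_i S_j (symmetric weights, zero diagonal)."""
    return -0.5 * S @ W @ S

S = rng.choice([-1, 1], size=N)        # random initial state
energies = [energy(S)]
for i in range(N):                     # one asynchronous sweep
    S[i] = 1 if W[i] @ S >= 0 else -1
    energies.append(energy(S))
print(all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:])))  # True
```

Each single-neuron update changes E by -h_i (S_i_new - S_i_old) <= 0, so the dynamics descend in the energy landscape until a local minimum (an attractor) is reached.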

Exercise 2 (now): energy picture E. Next lecture 11:25.


6.3 Attractor memory

Memory with spiking neurons -Mean activity of patterns? -Separation of excitation and inhibition? -Modeling? -Neural data? 6.3 attractor memory with spiking neurons

6.3 Attractor memory with low-activity patterns. Neurons i = 1, 2, ..., N; patterns mu = 1, 2, 3, ... Random patterns +/-1 with zero mean -> 50 percent of neurons are active in each pattern.

Neurons i = 1, 2, ..., N; patterns mu = 1, 2, 3, ... Random patterns with low activity (mean a < 0) -> e.g. 20 percent of neurons active in each pattern. Weights: w_ij = c sum_mu (p_i^mu - a)(p_j^mu - b), with some constant c.
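The low-activity construction can be sketched numerically. The version below uses 0/1 patterns with mean activity a and the covariance-style rule w_ij = c sum_mu (xi_i^mu - a)(xi_j^mu - a); the sizes, the choice b = a, the normalization c, and the threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P, a = 400, 3, 0.2                          # 20 percent of neurons active (assumed)
xi = (rng.random((P, N)) < a).astype(float)    # patterns in {0,1}, mean ~ a
c = 2.0 / (a * (1 - a) * N)                    # normalization constant (one common choice)
W = c * (xi - a).T @ (xi - a)                  # w_ij = c sum_mu (xi_i^mu - a)(xi_j^mu - a)
np.fill_diagonal(W, 0)

# One threshold update starting in pattern 0 (theta = 0.5, assumed):
x = (W @ xi[0] > 0.5).astype(float)
print(np.array_equal(x, xi[0]))                # stored pattern is a fixed point -> True
```

Subtracting the mean activity a from the patterns removes the large common bias in the cross-talk, which is what makes low-activity patterns storable at all.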


6.4 Attractor memory with spiking neurons. [Network diagram: excitatory population Exc with threshold theta; inhibitory populations Inh1 and Inh2.] Hebb rule: neurons that are active together are wired together.

Spike raster Overlap with patterns 1 … 3

Spike raster; overlaps with patterns 1 ... 11 (80 patterns stored!).

Memory with spiking neurons -Low activity of patterns? -Separation of excitation and inhibition? -Modeling? -Neural data? All possible

6.4 Memory data. Human hippocampus ('Sydney opera' neuron). Quiroga, R. Q., Reddy, L., Kreiman, G., Koch, C., and Fried, I. (2005). Invariant visual representation by single neurons in the human brain. Nature, 435:1102-1107.

6.4 Memory data. Delayed Matching to Sample Task: sample (1 s) ... match. Animal experiments.

6.4 Memory data. [Firing rate up to 20 Hz during sample (1 s) and match.] Miyashita, Y. (1988). Neuronal correlate of visual associative long-term memory in the primate temporal cortex. Nature, 335:817-820.

6.4 Memory data. [Firing rate (Hz, up to 20) around sample and match; time in ms.] Rainer, G. and Miller, E. K. (2002). Timecourse of object-related neural activity in the primate prefrontal cortex during a short-term memory task. Europ. J. Neurosci., 15.

Exercise 3 (NOW) - from Hopfield to spikes. In the Hopfield model, neurons are characterized by a binary variable S_i = +/-1. For an interpretation in terms of spikes it is, however, more appealing to work with a binary variable x_i which is 0 or 1. (i) Write S_i = 2x_i - 1 and rewrite the Hopfield model in terms of x_i. What are the conditions so that the input potential is unchanged? (ii) Repeat the same calculation for low-activity patterns and weights w_ij = c sum_mu (p_i^mu - a)(p_j^mu - b), with some constants a and b.
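Part (i) can be sketched as a short calculation (a sketch; the threshold theta_i is introduced here only for bookkeeping):

```latex
h_i = \sum_j w_{ij} S_j
    = \sum_j w_{ij}\,(2x_j - 1)
    = 2\sum_j w_{ij} x_j \;-\; \sum_j w_{ij}.
```

Since x_i = (S_i + 1)/2, the update S_i(t+1) = sgn(h_i) becomes x_i(t+1) = Theta( sum_j w_ij x_j(t) - theta_i ) with theta_i = (1/2) sum_j w_ij: the two formulations agree if each neuron's threshold absorbs the constant offset sum_j w_ij.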

The end