Week 6 Hebbian LEARNING and ASSOCIATIVE MEMORY Wulfram Gerstner EPFL, Lausanne, Switzerland 6.1 Synaptic Plasticity - Hebbian Learning - Short-term Plasticity.


1 Week 6 Hebbian LEARNING and ASSOCIATIVE MEMORY Wulfram Gerstner EPFL, Lausanne, Switzerland 6.1 Synaptic Plasticity - Hebbian Learning - Short-term Plasticity - Long-term Plasticity - Reinforcement Learning 6.2 Models of synaptic plasticity - Hebbian learning rules - Bienenstock-Cooper-Munro rule 6.3 Hopfield Model - probabilistic - energy landscape 6.4 Attractor memories 6.5 Online learning of memories Biological Modeling of Neural Networks. Reading for week 6: NEURONAL DYNAMICS, Ch. 17.2.5 – 17.4; Ch. 19.1-19.2; Ch. 3.1.3, Cambridge Univ. Press

2 pre post i j Synapse 6.1 Synaptic plasticity

3 pre j post i When an axon of cell j repeatedly or persistently takes part in firing cell i, then j’s efficiency as one of the cells firing i is increased Hebb, 1949 k - local rule - simultaneously active (correlations) 6.1 Review from week 5

4 Hebbian Learning 6.1 Synaptic plasticity: Hebbian Learning

5 item memorized 6.1 Synaptic plasticity: Hebbian Learning

6 item recalled Recall: Partial info 6.1 Synaptic plasticity: Hebbian Learning

7 - Hebbian Learning - Experiments on synaptic plasticity - Mathematical Formulations of Hebbian Learning (6.2) - Back to the Hopfield Model (6.3-6.5) 6.1 Synaptic plasticity: program for today

8 Hebbian Learning in experiments (schematic): a presynaptic spike of j alone evokes an EPSP but no spike of i; after both neurons have been simultaneously active, the EPSP amplitude u is increased. 6.1 Synaptic plasticity

9 Standard LTP pairing experiment. Test stimulus at 0.1 Hz; LTP induction: tetanus at 100 Hz while the neuron is depolarized to -40 mV (control: neuron held at -70 mV). 6.1 Classical paradigm of LTP induction – pairing. Fig. from Nature Neuroscience 5, 295-296 (2002), D. S.F. Ling, … & Todd C. Sacktor. See also: Bliss and Lomo (1973), Artola, Brocher, Singer (1990), Bliss and Collingridge (1993)

10 6.1 Spike-timing dependent plasticity (STDP) pre j post i Pre before post Markram et al, 1995,1997 Zhang et al, 1998 review: Bi and Poo, 2001 60 repetitions

11 pre post i j - Induction of changes - fast (if stimulated appropriately) - slow (homeostasis) Persistence of changes - long (LTP/LTD) - short (short-term plasticity) Functionality - useful for learning a new behavior/forming new memories - useful for development (wiring for receptive field development) - useful for activity control in network: homeostasis - useful for coding 6.1 Classification of synaptic changes

12 pre j post i +50ms Changes - induced over 0.5 sec - recover over 1 sec 20Hz Data: Silberberg,Markram Fit: Richardson (Tsodyks-Markram model) Short-term plasticity/fast synaptic dynamics Thomson et al. 1993 Markram et al 1998 Tsodyks and Markram 1997 6.1 Classification of synaptic changes: Short-term plasticity
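The fast synaptic dynamics fitted on this slide can be sketched with the Tsodyks-Markram model: a facilitation variable u and a resource variable x set the amplitude of each postsynaptic response. This is a minimal sketch; the parameter values (U, tau_d, tau_f) and the function name are illustrative choices, not the fit of Richardson to the Silberberg-Markram data.

```python
import numpy as np

def tsodyks_markram(spike_times, U=0.2, tau_d=0.5, tau_f=0.05):
    """Relative PSC amplitude at each presynaptic spike (Tsodyks-Markram model).

    u: utilization of resources (facilitation), decays to 0 between spikes.
    x: fraction of available resources (depression), recovers to 1.
    Parameter values are illustrative, not fitted to data.
    """
    u, x = 0.0, 1.0
    t_prev = None
    amps = []
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            u *= np.exp(-dt / tau_f)                   # facilitation decays
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_d)  # resources recover
        u += U * (1.0 - u)       # each spike increases utilization
        amps.append(u * x)       # released fraction sets the PSC amplitude
        x -= u * x               # release depletes resources
        t_prev = t
    return amps

# 20 Hz presynaptic train: amplitudes depress over ~1 s, as on the slide
amps = tsodyks_markram(np.arange(0.0, 0.5, 0.05))
```

With these parameters depression dominates facilitation, matching the depressing 20 Hz traces shown on the slide.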

13 pre j post i +50ms Changes - induced over 3 sec - persist over 1 – 10 hours 20Hz Long-term plasticity/changes persist 30 min (or longer?) 6.1 Classification of synaptic changes: Long-term plasticity

14 Short-Term vs. Long-Term. Short-term plasticity: changes induced over 0.1-0.5 sec, recover over 1 sec; protocol: presynaptic spikes; model well established (Tsodyks, Pawelzik, Markram; Abbott, Dayan). Long-term plasticity (LTP/LTD/Hebb): changes induced over 0.5-5 sec, remain over hours; protocol: presynaptic spikes + …; model: we will see. 6.1 Classification of synaptic changes

15 Hebbian Learning = unsupervised learning pre post i j 6.1 Classification of synaptic changes: unsupervised learning

16 Reinforcement Learning = reward + Hebb. SUCCESS signal: broadly diffused neuromodulator (a global factor combined with the local pre/post factors). 6.1 Classification of synaptic changes: Reinforcement Learning

17 Unsupervised vs. reinforcement learning. LTP/LTD/Hebb (pre j, post i): theoretical concept - passive changes, exploit statistical correlations; functionality - useful for development (wiring for receptive fields). Reinforcement Learning (pre j, post i, success signal): theoretical concept - conditioned changes, maximise reward; functionality - useful for learning a new behavior. 6.1 Classification of synaptic changes

18 6.1 Three-factor rule of Hebbian Learning = Hebb-rule gated by a neuromodulator. Neuromodulator signals: interestingness, surprise, attention, novelty (a global factor on top of the local pre/post factors)

19 Quiz 6.1: Synaptic Plasticity and Learning Rules Long-term potentiation [ ] has an acronym LTP [ ] takes more than 10 minutes to induce [ ] lasts more than 30 minutes [ ] depends on presynaptic activity, but not on state of postsynaptic neuron Short-term potentiation [ ] has an acronym STP [ ] takes more than 10 minutes to induce [ ] lasts more than 30 minutes [ ] depends on presynaptic activity, but not on state of postsynaptic neuron Learning rules [ ] Hebbian learning depends on presynaptic activity and on state of postsynaptic neuron [ ] Reinforcement learning depends on neuromodulators such as dopamine indicating reward

20 Week 6 Hebbian LEARNING and ASSOCIATIVE MEMORY 6.1 Synaptic Plasticity - Hebbian Learning - Short-term Plasticity - Long-term Plasticity - Reinforcement Learning 6.2 Models of synaptic plasticity - Hebbian learning rules - Bienenstock-Cooper-Munro rule 6.3 Hopfield Model - probabilistic - energy landscape 6.4 Attractor memories 6.5 Online learning of memories Biological Modeling of Neural Networks Wulfram Gerstner EPFL, Lausanne, Switzerland

21 6.2 Hebbian Learning (rate models) pre j post i When an axon of cell j repeatedly or persistently takes part in firing cell i, then j’s efficiency as one of the cells firing i is increased Hebb, 1949 k - local rule - simultaneously active (correlations) active = high rate = many spikes per second Rate model :
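The rate model on this slide can be sketched in a few lines: the weight grows in proportion to the product of pre- and postsynaptic rates. This is a minimal illustration assuming a discrete-time update; the learning rate is illustrative, and a hard clip stands in for the weight-dependent factor a(w_ij) developed on the next slide.

```python
# Minimal rate-based Hebbian rule in discrete time:
#   w <- w + gamma * nu_pre * nu_post
# gamma and w_max are illustrative; the slide's weight-dependent factor
# a(w_ij) would replace the hard clip with a soft bound.
def hebb_step(w, nu_pre, nu_post, gamma=1e-4, w_max=1.0):
    w = w + gamma * nu_pre * nu_post   # grows only if pre AND post are active
    return min(w, w_max)               # crude stand-in for a soft bound a(w)

w = 0.1
for _ in range(100):                   # both neurons firing at 20 Hz
    w = hebb_step(w, nu_pre=20.0, nu_post=20.0)
# if the presynaptic neuron is silent (nu_pre = 0), the weight is unchanged
```

The rule is local (only quantities available at the synapse appear) and requires joint activity, the two properties highlighted in Hebb's postulate above.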

22 6.2 Rate-based Hebbian Learning a = a(w ij ) a(w ij ) w ij Blackboard pre j post i

23 6.2 Rate-based Hebbian Learning. [table: weight change for the four pre/post combinations - pre on & post on: +; pre on & post off: 0; pre off & post on: 0; pre off & post off: 0] pre j post i k

24 Review from week 5: Hebbian Learning

25 6.2 Rate-based Hebbian Learning. [table: several candidate learning rules, each assigning +, -, or 0 to the four combinations of pre on/off and post on/off] pre j post i k

26 pre j post i k presynaptically gated BCM Bienenstock, Cooper Munro, 1982 homeostasis 6.2 Bienenstock-Cooper-Munro rule
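A minimal sketch of the BCM update: the change is presynaptically gated and switches sign at a postsynaptic threshold theta. The learning rate and threshold values here are illustrative; in the full rule theta slides with the average postsynaptic rate, which is what provides homeostasis.

```python
# BCM rule (Bienenstock, Cooper & Munro, 1982), rate-based sketch:
#   dw/dt = eta * nu_pre * nu_post * (nu_post - theta)
# Post rates above theta give potentiation, below theta give depression.
# eta and theta are illustrative values.
def bcm_step(w, nu_pre, nu_post, theta, eta=1e-4):
    return w + eta * nu_pre * nu_post * (nu_post - theta)

w = 1.0
w_up = bcm_step(w, nu_pre=3.0, nu_post=3.0, theta=2.0)   # above theta: LTP
w_dn = bcm_step(w, nu_pre=1.0, nu_post=1.0, theta=2.0)   # below theta: LTD
```

Making theta a function of the averaged postsynaptic rate (as in part c of Exercise 1 below) prevents all weights from growing or shrinking together.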

27 Hebbian Learning detects correlations in the input: fixed rate vs. jointly varying rate. Development of Receptive Fields (see also course: Unsupervised and Reinforcement Learning) 6.2 Functional Consequence of Hebbian Learning

28 Exercise 1 now: Bienenstock-Cooper-Munro. BCM rule: assume 2 groups of 10 neurons each; all weights equal 1. a) Group 1 fires at 3 Hz, then group 2 at 1 Hz. What happens? b) Group 1 fires at 3 Hz, then group 2 at 2.5 Hz. What happens? c) As in b, but make theta a function of the averaged rate. What happens? Take 5 minutes; discussion of the exercise at 9:58

29 Initial: random connections, unselective output neurons. With correlated input, output neurons specialize: receptive fields. BCM leads to specialized neurons (developmental learning); Bienenstock et al. 1982. Development and learning rules: Willshaw & Malsburg, 1976; Linsker, 1986; K.D. Miller et al., 1989. 6.2 Synaptic Changes for Development of Cortex

30 Week 6 Hebbian LEARNING and ASSOCIATIVE MEMORY 6.1 Synaptic Plasticity - Hebbian Learning - Short-term Plasticity - Long-term Plasticity - Reinforcement Learning 6.2 Models of synaptic plasticity - Hebbian learning rules - Bienenstock-Cooper-Munro rule 6.3 Hopfield Model - probabilistic - energy landscape 6.4 Attractor memories 6.5 Online learning of memories Biological Modeling of Neural Networks Wulfram Gerstner EPFL, Lausanne, Switzerland

31 Prototype p 1 Prototype p 2 interactions Sum over all prototypes 6.3 Review of week 5: Deterministic Hopfield model Input potential Sum over all inputs to neuron i prototypes

32 Exercise 2 now: learning of prototypes. Prototype p 1, prototype p 2; interactions: sum over all prototypes. a) Show that (1) corresponds to a rate learning rule. Assume that weights are zero at the beginning; each pattern is presented (enforced) during 0.5 sec, one after the other. Note that [equation] but [equation (2)]. b) Compare with: [equation]. c) Is this unsupervised learning? Take 8 minutes; start the exercise. Next lecture at 10:25

33 6.3 Stochastic Hopfield model: overlap / correlation Overlap: similarity between state S(t) and pattern Correlation: overlap between one pattern and another Orthogonal patterns Image: Neuronal Dynamics, Gerstner et al., Cambridge Univ. Press (2014),
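The overlap on this slide is easy to compute numerically: m is the normalized scalar product between the network state and a stored pattern. A sketch with illustrative sizes and a fixed random seed:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 200, 3
patterns = rng.choice([-1, 1], size=(M, N))   # random +/-1 prototypes

def overlap(S, p):
    """Overlap m = (1/N) sum_i p_i S_i: similarity of state S and pattern p."""
    return float(p @ S) / len(S)

S = patterns[0].copy()
m_same = overlap(S, patterns[0])    # = 1 for a perfectly retrieved pattern
m_other = overlap(S, patterns[1])   # ~ 0: random patterns are near-orthogonal
```

The cross-overlap m_other fluctuates around zero with standard deviation of order 1/sqrt(N), which is the sense in which random patterns are approximately orthogonal.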

34 Prototype p 1, prototype p 2; interactions: sum over all prototypes. Deterministic dynamics. 6.3 Review of week 5: Deterministic Hopfield model. Input potential: sum over all inputs to neuron i; prototypes. Similarity measure: overlap with pattern 17

35 6.3 Hopfield model: memory retrieval (attractor model) Overlap (definition)

36 Prototype p 1 Prototype p 2 Interactions (1) Dynamics (2) Random patterns 6.3 Stochastic Hopfield model
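The interactions (1) and the stochastic dynamics (2) can be simulated directly: Hebbian weights from random patterns, then updates in which each neuron fires with a sigmoidal probability of its input potential. A sketch assuming Glauber-type parallel updates; the network size, pattern number, inverse temperature beta, and cue noise are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 300, 5
p = rng.choice([-1, 1], size=(M, N))       # random prototype patterns
W = (p.T @ p) / N                          # Hebbian weights w_ij (interactions (1))
np.fill_diagonal(W, 0.0)                   # no self-coupling

beta = 4.0                                 # inverse temperature: noise level
flip = rng.choice([1, -1], size=N, p=[0.7, 0.3])
S = p[0] * flip                            # noisy cue: expected overlap ~0.4

for _ in range(20):                        # stochastic dynamics (2)
    h = W @ S                              # input potential h_i = sum_j w_ij S_j
    prob_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))   # P(S_i = +1)
    S = np.where(rng.random(N) < prob_up, 1, -1)

m = float(p[0] @ S) / N                    # overlap with the cued pattern
```

At this low memory load and moderate noise the overlap grows from ~0.4 toward 1: the partial cue is completed, which is the retrieval behavior analyzed in Exercise 3 below.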

37 6.3 Stochastic Hopfield model: firing probability (blackboard)

38 Dynamics (2) blackboard Assume that there is only overlap with pattern 17: two groups of neurons: those that should be ‘on’ and ‘off’ Overlap (definition) 6.3 Stochastic Hopfield model

39 Exercise 3 now: Stochastic Hopfield. Overlap (definition). Suppose the initial overlap with pattern 17 is 0.4; find the equation for the overlap at time (t+1), given the overlap at time (t). Hint: use the result from the blackboard and consider 4 groups of neurons: those that should be ON and are ON; those that should be ON and are OFF; those that should be OFF and are ON; those that should be OFF and are OFF. 12 minutes; try to get as far as possible. Next lecture at 11:15

40 6.3 Stochastic Hopfield model: memory retrieval, overlap picture. Overlap: neurons that should be ‘on’ vs. neurons that should be ‘off’

41 6.3 Stochastic Hopfield model = attractor model Image: Neuronal Dynamics, Gerstner et al., Cambridge Univ. Press (2014),

42 E 6.3 Symmetric interactions: Energy picture

43 Exercise 4 (later): energy E Assume symmetric interaction, Assume deterministic update Show that energy always decreases
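A numerical check of this exercise (a complement, not the analytic proof asked for): with symmetric weights and deterministic asynchronous updates, the recorded energy never increases, because each single flip changes E by a non-positive amount. Network size, random weights, and the seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100
W = rng.normal(size=(N, N))
W = (W + W.T) / 2.0                    # enforce symmetric interactions
np.fill_diagonal(W, 0.0)

def energy(S):
    """E = -(1/2) * sum_ij w_ij S_i S_j"""
    return -0.5 * float(S @ W @ S)

S = rng.choice([-1, 1], size=N)
energies = [energy(S)]
for _ in range(5):                     # a few asynchronous sweeps
    for i in range(N):                 # deterministic update, one neuron at a time
        S[i] = 1 if W[i] @ S >= 0 else -1
    energies.append(energy(S))
# energies is non-increasing: aligning S_i with its field h_i can only lower E
```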

44 Week 6 Hebbian LEARNING and ASSOCIATIVE MEMORY 6.1 Synaptic Plasticity - Hebbian Learning - Short-term Plasticity - Long-term Plasticity - Reinforcement Learning 6.2 Models of synaptic plasticity - Hebbian learning rules - Bienenstock-Cooper-Munro rule 6.3 Hopfield Model - probabilistic - energy landscape 6.4 Attractor memories 6.5 Online learning of memories Biological Modeling of Neural Networks Wulfram Gerstner EPFL, Lausanne, Switzerland

45 6.4 Attractor memory ‘attractor model’: memory retrieval = flow to fixed point

46 Memory with spiking neurons -Mean activity of patterns? -Better neuron model? -Separation of excitation and inhibition? -Modeling with integrate-and-fire model? -Neural data? 6.4 attractor memory with spiking neurons

47 Patterns μ = 1, 2, 3 over neurons i = 1, 2, … N. Random patterns +/-1 with zero mean → 50 percent of neurons should be active in each pattern. 6.4 attractor memory with ‘balanced’ activity patterns

48 Patterns μ = 1, 2, 3 over neurons i = 1, 2, … N. Random patterns +/-1 with low activity (mean a < 0) → 20 percent of neurons should be active in each pattern (in the weight formula, a is the mean activity and c some constant). 6.4 attractor memory with ‘low’ activity patterns
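The low-activity construction can be sketched as a covariance-type Hebb rule that subtracts the mean activity a before taking the product of pattern entries. The normalization constant c below is one common choice, standing in for the unspecified constant on the slide.

```python
import numpy as np

# Low-activity +/-1 patterns: a fraction f = 0.2 of neurons is active,
# so the mean is a = 2*f - 1 = -0.6 < 0.  Covariance-type weights:
#   w_ij = (c / N) * sum_mu (p_i^mu - a) * (p_j^mu - a)
# c is an illustrative normalization, not necessarily the slide's constant.
rng = np.random.default_rng(3)
N, M, f = 200, 3, 0.2                      # f: fraction of active neurons
a = 2 * f - 1                              # mean activity of a pattern
p = np.where(rng.random((M, N)) < f, 1.0, -1.0)
c = 1.0 / (1 - a**2)                       # illustrative choice of c
W = c / N * (p - a).T @ (p - a)
np.fill_diagonal(W, 0.0)                   # no self-coupling
```

Subtracting a keeps the mean drive balanced, so sparse patterns can be stored without all neurons being pulled toward the majority (inactive) state.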

49 Exercise 5 NOW - from Hopfield to spikes. In the Hopfield model, neurons are characterized by a binary variable S i = +/-1. For an interpretation in terms of spikes it is, however, more appealing to work with a binary variable x i which is zero or 1. (i) Write S i = 2 x i - 1 and rewrite the Hopfield model in terms of x i. What are the conditions so that the input potential is [equation]? (ii) Interpretation: can you also restrict the weights to excitation only? 5 minutes; try to get as far as possible

50 Inh1 Inh2 theta Exc Inh1 Inh2 Hebb-rule: Active together 6.4 Separation of excitation and inhibition Image: Neuronal Dynamics, Gerstner et al., Cambridge Univ. Press (2014)

51 Spike raster Overlap with patterns 1 … 3

52 Spike raster and overlaps: overlap with patterns 1 … 11 (80 patterns stored!)

53 Memory with spiking neurons -Low activity of patterns? -Separation of excitation and inhibition? -Modeling with integrate-and-fire? -Neural data? All possible

54 Sydney Opera House; Human Hippocampus. Quiroga, R. Q., Reddy, L., Kreiman, G., Koch, C., and Fried, I. (2005). Invariant visual representation by single neurons in the human brain. Nature, 435:1102-1107. 6.4 memory data (review from week 5)

55 Delayed Matching to Sample Task: sample stimulus, 1 s delay, match stimulus. Animal experiments. 6.4 memory data: delayed match to sample

56 [figure: firing rate (up to ~20 Hz) during sample, 1 s delay, and match] Miyashita, Y. (1988). Neuronal correlate of visual associative long-term memory in the primate temporal cortex. Nature, 335:817-820. 6.4 memory data: delayed match-to-sample

57 [figure: firing rate (~20 Hz) from sample at 0 ms to match at 1650 ms] Rainer and Miller (2002). Timecourse of object-related neural activity in the primate prefrontal cortex during a short-term memory task. Europ. J. Neurosci., 15:1244-1254. 6.4 memory data: delayed match-to-sample

58 Week 6 Hebbian LEARNING and ASSOCIATIVE MEMORY 6.1 Synaptic Plasticity - Hebbian Learning - Short-term Plasticity - Long-term Plasticity - Reinforcement Learning 6.2 Models of synaptic plasticity - Hebbian learning rules - Bienenstock-Cooper-Munro rule 6.3 Hopfield Model - probabilistic - energy landscape 6.4 Attractor memories 6.5 Online learning of memories Biological Modeling of Neural Networks Wulfram Gerstner EPFL, Lausanne, Switzerland

59 Memory - lasts (sometimes) - stream of inputs What do we remember? Examples: -Highway -Traumatic memories 6.5 online learning of memories

60 Synapse Neurons Synaptic Plasticity =Change in Connection Strength Behavioral Learning – and synaptic plasticity

61 item recalled Recall: Partial info 6.5 Review: Hebbian Learning/Assemblies

62 6.5 Preconfigured memory: bistable network (e.g., the groups of Hopfield, Amit, Brunel, Fusi, Sompolinsky, Tsodyks). [figure: 4096 spiking neurons; firing rate in Hz switches from background to a memory state after a stimulus, then back to background]

63 6.5 Learning the memory: very hard. Fusi et al., Amit et al., Mongillo et al., 1995-2005 (LTP/LTD, STDP); stimulus-induced LTP: Zenke et al., 2015

64 Synapse Synaptic Plasticity Learning Algorithms - Functional or Behavioral Consequences Memory formation Memory retention Network stability 6.5 Learning: the task of modeling

65 pre j, post i, k. Weight changes depend on the pair of pre and post rates: local rule, simultaneously active. 6.5 Review: Rate models of Hebbian learning

66 pre j post i k - homosynaptic/Hebb (‘pre’ and ‘post’) - heterosynaptic plasticity (pure ‘post’-term) - transmitter-induced (pure ‘pre’-term) 6.5 Induction of Plasticity

67 [figure: 30x stimulation protocol] Experiments: Chen et al. 2013, Chistiakova et al. 2014; see also Lynch et al. 1977. Model: Zenke et al. (2015). 6.5 Heterosynaptic Plasticity (exper. and model)

68 - nonlinear Hebb for potentiation - pre-post for depression - heterosynaptic plasticity (pure ‘post’) - transmitter-induced (pure ‘pre’) Bienenstock et al., 1982 Pfister and Gerstner, 2006 6.5 Induction of Plasticity (rate-based)
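The four terms listed on this slide can be combined into a single rate-based update. The coefficients and exponents below are illustrative placeholders chosen only to show the structure, not the fitted values of Pfister and Gerstner (2006) or Zenke et al. (2015).

```python
# Sketch of a rate-based plasticity rule combining the slide's four terms:
#   dw/dt =  A * nu_pre * nu_post**2       (nonlinear Hebb: potentiation)
#          - B * nu_pre * nu_post          (pre-post: depression)
#          - C * (w - w_ref) * nu_post**4  (heterosynaptic, pure 'post')
#          + D * nu_pre                    (transmitter-induced, pure 'pre')
# All coefficients and w_ref are illustrative, not fitted values.
def dw_dt(w, nu_pre, nu_post, w_ref=0.5,
          A=1e-3, B=1e-3, C=1e-4, D=1e-5):
    return (A * nu_pre * nu_post**2
            - B * nu_pre * nu_post
            - C * (w - w_ref) * nu_post**4
            + D * nu_pre)
```

The heterosynaptic term pulls weights back toward a reference value whenever the postsynaptic neuron is strongly active, even at synapses with no presynaptic input, which is what stabilizes the network model on the next slide.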

69 → Self-stabilizing! Heterosynaptic plasticity must act on the same time scale. Zenke & Gerstner, PLoS Comput. Biol. 2013; Zenke et al., Nat. Comm., 2015. [figure: weight evolution for w = z, w > z, w >> z] 6.5 Plasticity model in network

70 6.5. Plasticity in feedforward /recurrent connections Zenke et al., Nat. Comm. (2015)

71 Stable memory recall despite - ongoing plasticity - ongoing activity Zenke et al., Nat. Comm. (2015) 6.5 Plasticity model in network

72 pre post i j - Induction of changes - fast (if stimulated appropriately) - slow (homeostasis) Persistence of changes - long (LTP/LTD) - short (short-term plasticity) Functionality - useful for learning a new behavior/new memories - useful for development (wiring for receptive field development) - useful for activity control in network (homeostasis) - useful for coding 6.5 Synaptic changes – review and summary

73 The end

