Lecture 13: Associative Memory

References:
- D. Amit & N. Brunel, Cerebral Cortex 7, 237-252 (1997)
- N. Brunel, Network 11, 261-280 (2000)
- N. Brunel, Cerebral Cortex 13, 1151-1161 (2003)
- J. Hertz, in Models of Neural Networks IV (L. van Hemmen, J. Cowan and E. Domany, eds.), Springer Verlag, 2002, Sect. 1.4

What is associative memory?

- "Patterns": firing activity of specific sets of neurons (Hebb: "assemblies")
- "Store" patterns in synaptic strengths
- Recall: given an input (initial activity pattern) not equal to any stored pattern, the network dynamics should take it to the "nearest" (most similar) stored pattern (categorization, error correction, ...)

Implementation in a balanced excitatory-inhibitory network

Model (Amit & Brunel):
- p non-overlapping excitatory subpopulations, each of size n = fN (fp < 1)
- stronger connections within subpopulations ("assemblies")
- weakened connections between subpopulations
- looking for selective states: higher rates in a single assembly

Model

Like the Amit-Brunel model (Lecture 9) except for the exc-exc synapses:
- from within the same assembly: strengthened ("Hebb" rule)
- from outside the assembly: weakened ("anti-Hebb")
- otherwise: no change
- to conserve the average synaptic strength, the weakening is chosen to balance the strengthening
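
The synaptic-strength values themselves were on image-only slides. Below is a minimal sketch (not the lecture's code) of what this structured exc-exc connectivity looks like, assuming a potentiation factor g_+ within an assembly, a depression factor g_- for all other excitatory inputs onto assembly neurons, and the conservation rule f*g_+ + (1-f)*g_- = 1 as one common way of keeping the average strength fixed; all names and numbers are illustrative.

```python
import numpy as np

def exc_exc_weights(N_E=1000, p=5, f=0.1, J=0.2, g_plus=3.0):
    """Structured exc-exc weight matrix: W[i, j] is the synapse from j onto i."""
    n = int(f * N_E)                            # assembly size
    assert p * n <= N_E                         # fp < 1: assemblies fit into the population
    g_minus = (1.0 - f * g_plus) / (1.0 - f)    # assumed conservation of the mean strength

    labels = -np.ones(N_E, dtype=int)           # -1 marks non-selective neurons
    for mu in range(p):
        labels[mu * n:(mu + 1) * n] = mu

    W = np.full((N_E, N_E), J)                  # baseline strength (unchanged synapses)
    sel = labels >= 0
    W[sel, :] = g_minus * J                     # onto a selective neuron from outside its assembly: weakened
    for mu in range(p):
        idx = labels == mu
        W[np.ix_(idx, idx)] = g_plus * J        # within-assembly synapses: strengthened
    np.fill_diagonal(W, 0.0)                    # no autapses
    return W, labels

W, labels = exc_exc_weights()
print(W[labels == 0][:, labels == 0].mean(), W[labels == 0][:, labels == 1].mean())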

Mean field theory

Rates: active assembly, inactive assemblies, rest of the excitatory neurons, inhibitory neurons, external input neurons.

Mean input currents:
- to neurons in the active assembly
- to the rest of the assemblies
- to the other excitatory neurons
- to the inhibitory neurons
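
The current expressions themselves were images and are missing here. As a rough reconstruction (an assumption: C_E, C_I, C_ext, J, J_I, J_ext are borrowed from the Lecture-9 balanced network rather than from these slides, and r_act, r_+, r_1, r_2, r_ext denote the five rates listed above), the mean current to a neuron in the active assembly would have the form

$$\mu_{\rm act} \;=\; \tau\, C_E J \Big[\, f g_+\, r_{\rm act} \;+\; g_-\big((p-1) f\, r_+ + (1 - p f)\, r_1\big) \Big] \;-\; \tau\, C_I J_I\, r_2 \;+\; \tau\, C_{\rm ext} J_{\rm ext}\, r_{\rm ext},$$

with analogous expressions (different weightings of g_+ and g_-) for the inactive assemblies, the non-selective excitatory neurons, and the inhibitory neurons.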

Mean field theory (2)

Noise variances (white-noise approximation):

Rate of an I&F neuron driven by white noise:
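
The rate formula itself was an image on the slide; it is presumably the standard first-passage ("Siegert") result used by Amit & Brunel (1997) and Brunel (2000) for a leaky integrate-and-fire neuron receiving white-noise input with mean mu and standard deviation sigma. A sketch, with purely illustrative parameter values:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

def lif_rate(mu, sigma, tau_m=0.020, tau_rp=0.002, theta=0.020, V_r=0.010):
    """Firing rate (Hz) of an LIF neuron (threshold theta, reset V_r, in volts)
    driven by white noise with mean mu and standard deviation sigma."""
    integrand = lambda u: np.exp(u**2) * (1.0 + erf(u))
    integral, _ = quad(integrand, (V_r - mu) / sigma, (theta - mu) / sigma)
    return 1.0 / (tau_rp + tau_m * np.sqrt(np.pi) * integral)

# rate vs mean drive at a fixed noise level
for mu in (0.010, 0.015, 0.020, 0.025):
    print(f"mu = {mu * 1e3:.0f} mV  ->  rate = {lif_rate(mu, sigma=0.005):.1f} Hz")
```

This function is the building block for the self-consistency calculations on the following slides: each population's rate is this function evaluated at that population's own mean current and noise variance.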

Spontaneous activity

All assemblies inactive: the input-current expression for the active assembly reduces to that of the inactive assemblies, and similarly for the other populations and for the noise variances. Solve for the self-consistent rates.
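
As a concrete (and heavily simplified) illustration of what "solve for the self-consistent rates" means, here is a sketch that ignores the assembly structure and keeps a single excitatory and a single inhibitory population of identical neurons; the connectivity numbers follow the generic sparsely connected balanced network of Brunel (2000) and are my own illustrative choices, not the lecture's parameters.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

def lif_rate(mu, sigma, tau_m=0.020, tau_rp=0.002, theta=0.020, V_r=0.010):
    g = lambda u: np.exp(u**2) * (1.0 + erf(u))
    integral, _ = quad(g, (V_r - mu) / sigma, (theta - mu) / sigma)
    return 1.0 / (tau_rp + tau_m * np.sqrt(np.pi) * integral)

C_E, C_I, C_ext = 800, 200, 800        # excitatory, inhibitory, external inputs per neuron
J, g_rel, J_ext = 2e-4, 5.0, 2e-4      # EPSP size (V), relative IPSP strength, external EPSP
nu_ext, tau_m = 5.0, 0.020             # external rate (Hz), membrane time constant (s)

nu = 5.0                               # initial guess for the common E/I rate (Hz)
for _ in range(200):
    mu = tau_m * (C_E * J * nu - C_I * g_rel * J * nu + C_ext * J_ext * nu_ext)
    var = tau_m * (C_E * J**2 * nu + C_I * (g_rel * J)**2 * nu + C_ext * J_ext**2 * nu_ext)
    nu += 0.1 * (lif_rate(mu, np.sqrt(var)) - nu)   # damped fixed-point iteration
print(f"self-consistent spontaneous rate ~ {nu:.2f} Hz")
```

In the full calculation the same iteration runs over the rates of all the populations at once (active assembly, inactive assemblies, non-selective excitatory, inhibitory), each with its own mean current and variance.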

Simplified model (Brunel 2000)

- pf << 1
- g_+ ~ 1/f >> 1
- variances sigma_+ = sigma_act, sigma_1 as in the spontaneous-activity state
- define L = f J_11 g_+

Then:
(1) the spontaneous-activity state has r_+ = r_1;
(2) in a recall state with r_act > r_+, r_1 and r_2 are the same as in the spontaneous-activity state;
(3) r_act is determined by a single self-consistency equation (solved graphically below).

Graphical solution

[Figure: graphical solution of the self-consistency equation for the recall rate; in the figure, "this L" = (our L) x m, and r denotes the rate.]

A one-assembly memory/recall state is stable for big enough L (or g_+); it roughly describes "working memory" activity in prefrontal cortex.
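
A sketch of this graphical construction (my own reduction, not the slides' exact equation): treat the recall rate as the only unknown and look for solutions of r = F(r), where F feeds the assembly's rate back into its own mean input with gain L. The decomposition mu0 + L_eff * r and every number below are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

def lif_rate(mu, sigma, tau_m=0.020, tau_rp=0.002, theta=0.020, V_r=0.010):
    g = lambda u: np.exp(u**2) * (1.0 + erf(u))
    integral, _ = quad(g, (V_r - mu) / sigma, (theta - mu) / sigma)
    return 1.0 / (tau_rp + tau_m * np.sqrt(np.pi) * integral)

mu0, sigma0 = 0.012, 0.004              # assembly input apart from its own feedback
rates = np.linspace(0.1, 250.0, 500)    # candidate values of r_act (Hz)

for L_eff in (1e-4, 2e-4, 3e-4):        # feedback strength (V per Hz), standing in for L
    F = np.array([lif_rate(mu0 + L_eff * r, sigma0) for r in rates])
    crossings = rates[np.where(np.diff(np.sign(F - rates)) != 0)[0]]
    print(f"L_eff = {L_eff:.0e}: fixed points near r = {np.round(crossings, 1)} Hz")

# Per the slide: for small L only the low (spontaneous) solution exists; for big
# enough L (or g_+) additional crossings appear, the highest of which is the
# stable recall ("working memory") state.
```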

Capacity problem

In this model the memory assemblies were non-overlapping. This is unrealistic.

Alternative model: the neurons in each assembly are chosen independently, so a single neuron can belong to many assemblies.

How many patterns can be stored using N neurons before interference between patterns destroys the ability to recall them?

Here: solve this for a simplified model (binary neurons, which can be either excitatory or inhibitory; "Hebbian" synapse formula).

Model

- N binary neurons
- Assemblies/patterns: p sets of n = fN neurons with S_i = 1
- (Synchronous) dynamics
- Synapses: Hebbian, with a global inhibitory term that makes the average J_ij = 0
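
The update rule and coupling formula were images on the slides. The sketch below simulates one plausible version (an assumption): 0/1 neurons, p random sparse patterns with n = fN active units each, Hebbian couplings with a uniform inhibitory offset chosen so the average J_ij vanishes, and synchronous threshold updates. The threshold value is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
N, f, p = 2000, 0.05, 40
n = int(f * N)

# p random sparse patterns, each with exactly n active neurons
xi = np.zeros((p, N))
for mu in range(p):
    xi[mu, rng.choice(N, n, replace=False)] = 1.0

# Hebbian couplings plus a global inhibitory term so that <J_ij> = 0
J = (xi.T @ xi) / N - p * f**2 / N
np.fill_diagonal(J, 0.0)

theta = 0.5 * f                               # firing threshold (illustrative)

def update(S):
    """One synchronous step: S_i <- Theta(sum_j J_ij S_j - theta)."""
    return (J @ S > theta).astype(float)

# start from a corrupted copy of pattern 1 and let the dynamics run
S = xi[0].copy()
flip = rng.choice(N, N // 20, replace=False)  # flip 5% of the neurons
S[flip] = 1.0 - S[flip]

for _ in range(20):
    S = update(S)
m = (xi[0] @ S) / n                           # normalized overlap with pattern 1
Q = S.mean()                                  # total average activity
print(f"overlap m = {m:.3f}, mean activity Q = {Q:.3f}")
```

With these (small) values of p and f the dynamics fall back onto pattern 1; pushing p up eventually destroys recall, which is exactly the capacity question analyzed next.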

Order parameters

- (normalized) overlap with pattern 1
- total average activity
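
The formulas were lost with the slide images; a standard choice consistent with the wording (and with the quantities printed at the end of the simulation sketch above) would be

$$m \;=\; \frac{1}{fN}\sum_i \xi^1_i\, S_i, \qquad\qquad Q \;=\; \frac{1}{N}\sum_i S_i,$$

so that m = 1 when the network state coincides with pattern 1.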

Net input to neuron i:
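
With the couplings above, the natural (assumed) form is

$$h_i \;=\; \sum_{j \ne i} J_{ij}\, S_j .$$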

Fluctuations

with α = p/N (recall nf = O(1))
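
A quick numerical check (my own, using the same assumed couplings as in the simulation sketch above) that the crosstalk from the other stored patterns behaves like Gaussian noise whose variance grows linearly with the load α = p/N: clamp the network to pattern 1 and measure the spread of the inputs to the neurons outside that pattern, which receive no signal, only crosstalk.

```python
import numpy as np

rng = np.random.default_rng(1)
N, f = 2000, 0.05
n = int(f * N)

def crosstalk_std(p):
    xi = np.zeros((p, N))
    for mu in range(p):
        xi[mu, rng.choice(N, n, replace=False)] = 1.0
    J = (xi.T @ xi) / N - p * f**2 / N
    np.fill_diagonal(J, 0.0)
    h = J @ xi[0]                    # inputs with the network clamped to pattern 1
    return h[xi[0] == 0].std()       # neurons outside pattern 1: pure crosstalk

for p in (20, 80, 320):
    print(p, crosstalk_std(p))       # the spread should roughly double as p quadruples
```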

Mean field equations

For neurons in pattern 1, h = m + Gaussian noise => a self-consistency equation for m, with the corresponding noise width.

For the other neurons, h = Gaussian noise => an equation for their mean activity.

Solve for m and Q.
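
The explicit equations were images, but their structure follows from the slide text: the fraction of pattern-1 neurons above threshold is a Gaussian tail weight evaluated at (theta - m)/sigma, the fraction of the remaining neurons above threshold is the tail weight at theta/sigma, and together these give m and Q. The sketch below solves that pair by iteration; the noise width sigma is set by hand here, whereas in the full calculation it is itself determined by the load α = p/N through the fluctuation analysis (the missing formula).

```python
from math import erfc, sqrt

def H(x):
    """Gaussian tail weight: probability that a standard normal exceeds x."""
    return 0.5 * erfc(x / sqrt(2.0))

def solve_mQ(theta=0.5, sigma=0.1, f=0.05, n_iter=200):
    m = 1.0                                      # start from perfect recall of pattern 1
    for _ in range(n_iter):
        m = H((theta - m) / sigma)               # active fraction of pattern-1 neurons
    Q = f * m + (1.0 - f) * H(theta / sigma)     # total mean activity
    return m, Q

for sigma in (0.05, 0.2, 0.4):
    m, Q = solve_mQ(sigma=sigma)
    print(f"sigma = {sigma:.2f}  ->  m = {m:.3f}, Q = {Q:.3f}")
```

As the noise width grows the retrieved overlap degrades; that degradation is the mechanism behind the capacity limit on the following slides.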

Graphical interpretation

Capacity estimate

The weight in the tail (h > m) of the big Gaussian centered at 0 must be less than the weight in the small Gaussian centered at m.

Imposing this condition and using the asymptotic form of H gives the capacity estimate.
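
The algebra was on image-only slides; a hedged reconstruction of its shape, in my notation: with the Gaussian tail weight and its asymptotic form

$$H(x) \;\equiv\; \int_x^{\infty} \frac{dt}{\sqrt{2\pi}}\, e^{-t^2/2} \;\simeq\; \frac{e^{-x^2/2}}{x\sqrt{2\pi}} \qquad (x \gg 1),$$

the condition that the roughly N neurons outside the pattern put less weight beyond h = m than the fN pattern neurons reads N H(m/σ) < fN, i.e. H(m/σ) < f, and the asymptotic form turns this into

$$\frac{m^2}{2\sigma^2} \;\gtrsim\; \ln\frac{1}{f}$$

up to logarithmic corrections. Since the crosstalk variance σ² grows linearly with the number of stored patterns, this is an upper bound on p: the capacity falls off only logarithmically with the sparseness f, with the remaining N- and f-dependence set by the prefactor in σ².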
