
1
Feedback Networks and Associative Memories

2
Content Introduction Discrete Hopfield NNs Continuous Hopfield NNs Associative Memories – Hopfield Memory – Bidirectional Memory

3
Feedback Networks and Associative Memories Introduction

4
Feedforward/Feedback NNs Feedforward NNs – The connections between units do not form cycles. – Usually produce a response to an input quickly. – Most feedforward NNs can be trained using a wide variety of efficient algorithms. Feedback or recurrent NNs – There are cycles in the connections. – In some feedback NNs, each time an input is presented, the NN must iterate for a potentially long time before it produces a response. – Usually more difficult to train than feedforward NNs.

5
Supervised-Learning NNs Feedforward NNs – Perceptron – Adaline, Madaline – Backpropagation (BP) – Artmap – Learning Vector Quantization (LVQ) – Probabilistic Neural Network (PNN) – General Regression Neural Network (GRNN) Feedback or recurrent NNs – Brain-State-in-a-Box (BSB) – Fuzzy Cognitive Map (FCM) – Boltzmann Machine (BM) – Backpropagation through time (BPTT)

6
Unsupervised-Learning NNs Feedforward NNs – Learning Matrix (LM) – Sparse Distributed Associative Memory (SDM) – Fuzzy Associative Memory (FAM) – Counterpropagation (CPN) Feedback or Recurrent NNs – Binary Adaptive Resonance Theory (ART1) – Analog Adaptive Resonance Theory (ART2, ART2a) – Discrete Hopfield (DH) – Continuous Hopfield (CH) – Discrete Bidirectional Associative Memory (BAM) – Kohonen Self-organizing Map/Topology-preserving map (SOM/TPM)

7
The Hopfield NNs In 1982, Hopfield, a Caltech physicist, mathematically tied together many of the ideas from previous research. A fully connected, symmetrically weighted network in which each node functions as both an input and an output node. Used for – Associative memories – Combinatorial optimization

8
Associative Memories An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns. Two types of associative memory: autoassociative and heteroassociative. Auto-association – retrieves a previously stored pattern that most closely resembles the current pattern. Hetero-association – the retrieved pattern is, in general, different from the input pattern not only in content but possibly also in type and format.

9
Associative Memories Auto-association A A Hetero-association Niagara Waterfall memory

10
Optimization Problems Costs are associated with the energy function of a Hopfield network – the cost must be expressible in quadratic form. The Hopfield network settles into a locally satisfactory solution; it does not search over a set of candidate solutions. It finds local optima, not global ones.

11
Feedback Networks and Associative Memories Discrete Hopfield NNs

12
The Discrete Hopfield NNs [Network diagram: n fully interconnected units with states v_1, …, v_n, external inputs I_1, …, I_n, and connection weights w_ij from unit j to unit i.]

13
The Discrete Hopfield NNs [Same network diagram, annotated with the weight constraints: w_ij = w_ji (symmetric weights) and w_ii = 0 (no self-connections).]

15
State Update Rule Asynchronous mode: a single neuron is selected and updated at each time step. Update rule: v_k(t+1) = sgn(H_k(t)), where H_k is the net input to neuron k. Is the network stable?
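The update formula itself was lost in extraction. The following is a minimal sketch of the standard asynchronous rule for bipolar states v_i ∈ {−1, +1}; the function name and the use of NumPy are illustrative, not from the slides:

```python
import numpy as np

def hopfield_update_async(W, I, v, rng=None):
    """One asynchronous sweep: neurons are updated one at a time, in
    random order, using the net input H_k = sum_j w_kj v_j + I_k."""
    if rng is None:
        rng = np.random.default_rng()
    v = v.copy()
    for k in rng.permutation(len(v)):
        h = W[k] @ v + I[k]      # net input H_k(t)
        if h > 0:
            v[k] = 1
        elif h < 0:
            v[k] = -1
        # h == 0: the state is left unchanged
    return v
```

Repeated sweeps are applied until the state stops changing; the energy argument on the following slides shows this always terminates.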

16
Energy Function Fact: E is bounded below (above). If E is monotonically decreasing (increasing), the system is stable.
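The energy formula did not survive extraction. A sketch of the standard discrete Hopfield energy, assuming the bipolar convention (not code from the slides):

```python
import numpy as np

def hopfield_energy(W, I, v):
    """E = -(1/2) v^T W v - I^T v: bounded below for fixed W and I,
    and non-increasing under asynchronous updates."""
    return -0.5 * v @ W @ v - I @ v
```

A quick check: a pattern stored by the outer-product rule sits at an energy minimum, and (as a later slide notes) its complement has the same energy.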

17
The Proof Suppose that at time t + 1, the k th neuron is selected for update.

18
The Proof Suppose that at time t + 1, the k th neuron is selected for update. The values of the other neurons are not changed at time t + 1.

19
The Proof
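The equations of this proof did not survive extraction. A standard reconstruction, assuming the bipolar convention v_i ∈ {−1, +1}: since only neuron k changes at time t + 1,

```latex
\Delta E = E(t+1) - E(t)
         = -\bigl(v_k(t+1) - v_k(t)\bigr)\, H_k(t),
\qquad
H_k(t) = \sum_{j \ne k} w_{kj}\, v_j(t) + I_k .
```

Because the update sets v_k(t+1) = sgn(H_k(t)), the factor v_k(t+1) − v_k(t) always has the same sign as H_k(t) (or is zero), so ΔE ≤ 0. Combined with the lower bound on E, this proves convergence.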

20
[Table: case analysis for the updated neuron k. If H_k(t) > 0, then v_k(t+1) = 1 and ΔE ≤ 0; if H_k(t) < 0, then v_k(t+1) = −1 and ΔE ≤ 0; if v_k(t+1) = v_k(t), then ΔE = 0. In every case ΔE ≤ 0: stable.]

21
Feedback Networks and Associative Memories Continuous Hopfield NNs

22
The Neuron of Continuous Hopfield NNs [Circuit diagram: neuron i sums the weighted inputs w_i1 v_1, …, w_in v_n and an external current I_i; a capacitance C_i and conductance g_i govern the internal state u_i; insets plot the activation v_i = a(u_i) and its inverse u_i = a^{−1}(v_i).]

23
The Dynamics [Same neuron circuit as the previous slide, with the total conductance G_i indicated.]

24
The Continuous Hopfield NNs
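The governing equations on this slide were lost in extraction. The standard continuous Hopfield dynamics, in the notation of the circuit diagrams (capacitance C_i, conductance g_i, external current I_i), are:

```latex
C_i \frac{du_i}{dt} = -g_i\, u_i + \sum_j w_{ij}\, v_j + I_i,
\qquad v_i = a(u_i),
```

where a is a monotonically increasing sigmoid with inverse u_i = a^{-1}(v_i).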

25
Stable?

26
Equilibrium Points Consider the autonomous system dy/dt = f(y). Equilibrium points y* satisfy f(y*) = 0.

27
Lyapunov Theorem The system is asymptotically stable if there exists a positive-definite function E(y) whose derivative along trajectories is negative (dE/dt < 0). Such an E(y) is called an energy function.

28
Lyapunov Energy Function
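The formula itself was lost in extraction; the standard Lyapunov (energy) function for the continuous network is:

```latex
E = -\frac{1}{2}\sum_i \sum_j w_{ij}\, v_i\, v_j
    - \sum_i I_i\, v_i
    + \sum_i g_i \int_0^{v_i} a^{-1}(v)\, dv .
```

Differentiating along trajectories and substituting the dynamics gives dE/dt = −Σ_i C_i (a^{−1})′(v_i) (dv_i/dt)² ≤ 0, since a^{−1} is monotonically increasing; hence E is a valid Lyapunov function and the network is stable.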

29
[Full-network circuit: n amplifier neurons with capacitances C_1, …, C_n, conductances g_1, …, g_n, and external currents I_1, …, I_n, interconnected through the symmetric weights w_ij; insets plot the activation a and its inverse a^{−1}.]

30
Stability of Continuous Hopfield NNs Dynamics

31
Stability of Continuous Hopfield NNs Dynamics [Derivation step: because a^{−1} is monotonically increasing, its derivative is > 0.]

32
Stability of Continuous Hopfield NNs Stable

33
Feedback Networks and Associative Memories Associative Memories

34
Associative Memories Also called content-addressable memory. Autoassociative Memory – Hopfield Memory Heteroassociative Memory – Bidirectional Associative Memory (BAM)

35
Associative Memories Stored patterns: an autoassociative memory stores pairs (x_k, x_k); a heteroassociative memory stores pairs (x_k, y_k).

36
Feedback Networks and Associative Memories Associative Memories Hopfield Memory Bidirectional Memory

37
Hopfield Memory 12 × 10 = 120 neurons Fully connected 120 × 120 = 14,400 weights

38
Example [Figure: a set of stored patterns; the memory associates a corrupted input with the closest stored pattern.]

40
Example [Figure: stored patterns and memory association.] How to Store Patterns?

41
The Storage Algorithm Suppose the set of stored patterns is {x_1, …, x_p}, each of dimension n. W = ?

42
The Storage Algorithm
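The storage formulas on this slide were lost in extraction. A sketch of the usual outer-product (Hebbian) rule, consistent with the constraints w_ij = w_ji and w_ii = 0 from the earlier slides (names illustrative):

```python
import numpy as np

def store_patterns(patterns):
    """Outer-product (Hebbian) storage for a discrete Hopfield net.

    patterns : sequence of (n,) vectors with entries in {-1, +1}
    Returns W = sum_k x_k x_k^T with the diagonal zeroed, so that
    w_ij = w_ji and w_ii = 0.
    """
    X = np.asarray(patterns, dtype=float)
    W = X.T @ X                 # sum of outer products x_k x_k^T
    np.fill_diagonal(W, 0.0)    # enforce w_ii = 0
    return W
```

For patterns that are mutually orthogonal (as in this sketch's test), each stored pattern is an exact fixed point of the update rule.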

43
Analysis Suppose that the input x is one of the stored patterns x_i.

44
Example

45
[Diagram: a four-neuron example network.]

46
Example [State diagram for the four-neuron network: states at energies E = 4, E = 0, and E = −4; the minimum-energy states (E = −4) are stable.]

48
Problems of Hopfield Memory Complement memories (the complement of each stored pattern is also a stable state) Spurious stable states Limited capacity

49
Capacity of Hopfield Memory The number of storable patterns w.r.t. the size of the network. Study methods: – Experiments – Probability – Information theory – Radius of attraction (ρ)

50
Capacity of Hopfield Memory The number of storable patterns w.r.t. the size of the network. Hopfield (1982) demonstrated that the maximum number of patterns that can be stored in the Hopfield model of n nodes before the error in the retrieved pattern becomes severe is around 0.15n. The memory capacity of the Hopfield model can be increased, as shown in later work by Andrecut.
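As a rough illustration of the 0.15n figure, the bit-error rate can be measured empirically. This sketch (all names and parameters are illustrative, not from the slides) stores m random bipolar patterns in an n-neuron net and checks one-step recall of a stored pattern:

```python
import numpy as np

def retrieval_errors(n, m, trials=20, seed=0):
    """Average bit-error rate of one-step recall when m random bipolar
    patterns are stored in an n-neuron Hopfield net via the
    outer-product rule (weights scaled by 1/n, diagonal zeroed)."""
    rng = np.random.default_rng(seed)
    errors = 0
    for _ in range(trials):
        X = rng.choice([-1, 1], size=(m, n))
        W = X.T @ X / n
        np.fill_diagonal(W, 0.0)
        recalled = np.where(W @ X[0] >= 0, 1, -1)   # one synchronous update
        errors += np.sum(recalled != X[0])
    return errors / (trials * n)
```

Well below capacity (e.g. m = 0.05n) the error rate is essentially zero; well above it (e.g. m = 0.6n) retrieval degrades sharply, matching the slide's claim.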

51
Radius of attraction ρ (0 < ρ ≤ 1/2): inputs within Hamming distance ρn of a stored pattern are attracted to it. [Figure: basins of attraction and false memories.]

52
Feedback Networks and Associative Memories Associative Memories Hopfield Memory Bidirectional Memory

53
Bidirectional Memory [Diagram: two layers, an X layer (x_1, …, x_m) and a Y layer (y_1, …, y_n), fully connected through W = [w_ij] (n × m); the forward pass maps X to Y through W, the backward pass maps Y back to X through W^T.]

54
Bidirectional Memory [Same two-layer diagram as the previous slide.] How are the patterns stored?

55
The Storage Algorithm Stored Patterns
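The BAM storage formula was lost in extraction; the standard rule is W = Σ_k y_k x_kᵀ, with recall alternating forward and backward passes until the pair (x, y) stabilizes. A sketch (function names illustrative):

```python
import numpy as np

def bam_store(pairs):
    """W = sum_k y_k x_k^T: outer-product storage for a BAM.
    pairs : sequence of (x, y) bipolar vectors of dimensions m and n."""
    return sum(np.outer(y, x) for x, y in pairs).astype(float)

def bam_recall(W, x, steps=5):
    """Alternate forward (Y = sgn(W x)) and backward (X = sgn(W^T y))
    passes, returning the stabilized pair (x, y)."""
    y = np.where(W @ x >= 0, 1, -1)        # forward pass
    for _ in range(steps):
        x = np.where(W.T @ y >= 0, 1, -1)  # backward pass
        y = np.where(W @ x >= 0, 1, -1)    # forward pass
    return x, y
```

Presenting a stored x on the X layer then recalls its associated y on the Y layer, and vice versa.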

56
Analysis Suppose x_k is one of the stored vectors.

57
Energy Function

58
Bidirectional Memory Energy Function:
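The expression itself did not survive extraction; the standard BAM energy, in the notation above, is:

```latex
E(\mathbf{x}, \mathbf{y}) = -\mathbf{y}^{\top} W\, \mathbf{x}
  = -\sum_{i=1}^{n} \sum_{j=1}^{m} w_{ij}\, y_i\, x_j .
```

Each forward or backward pass can only decrease E, and E is bounded below, so the bidirectional iteration converges.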
