Presentation transcript: Hopfield Network (Chapter 18)

Slide 1: Hopfield Network

Slide 2: Hopfield Model

Slide 3: Equations of Operation
$n_i$ - input voltage to the $i$th amplifier
$a_i$ - output voltage of the $i$th amplifier
$C$ - amplifier input capacitance
$I_i$ - fixed input current to the $i$th amplifier

Slide 4: Network Format
Define the weight matrix $\mathbf{W}$, the bias vector $\mathbf{b}$, and the time constant $\epsilon$ in terms of the circuit parameters. Vector form of the operating equation:
$$\epsilon \frac{d\mathbf{n}(t)}{dt} = -\mathbf{n}(t) + \mathbf{W}\mathbf{a}(t) + \mathbf{b}, \qquad \mathbf{a}(t) = \mathbf{f}(\mathbf{n}(t))$$
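
As an illustration of how this operating equation behaves, here is a minimal simulation sketch in Python (not from the slides): it integrates the vector-form circuit equation with a simple Euler rule, assuming the arctan amplifier characteristic used later in the chapter; the two-neuron weights, bias, gain, and initial condition are made-up illustrative values.

```python
import numpy as np

def f(n, gain=1.4):
    """Amplifier transfer function a = (2/pi) * arctan(gain*pi*n/2)."""
    return (2.0 / np.pi) * np.arctan(gain * np.pi * n / 2.0)

def simulate(W, b, n0, gain=1.4, eps=1.0, dt=0.01, steps=5000):
    """Euler integration of eps * dn/dt = -n + W @ f(n) + b."""
    n = np.asarray(n0, dtype=float).copy()
    outputs = [f(n, gain)]
    for _ in range(steps):
        n = n + dt * (-n + W @ f(n, gain) + b) / eps
        outputs.append(f(n, gain))
    return np.array(outputs)

# Hypothetical two-neuron network (values chosen only for illustration).
W = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.zeros(2)
outputs = simulate(W, b, n0=[0.5, -0.1])
print(outputs[-1])   # the output settles at a stable equilibrium point
```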

Slide 5: Hopfield Network

Slide 6: Lyapunov Function
$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T \mathbf{W} \mathbf{a} + \sum_{i=1}^{S}\left[\int_0^{a_i} f^{-1}(u)\,du\right] - \mathbf{b}^T\mathbf{a}$$

Slide 7: Individual Derivatives
First term (quadratic term, with $\mathbf{W}$ symmetric):
$$\frac{d}{dt}\left[-\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a}\right] = -\mathbf{a}^T\mathbf{W}\frac{d\mathbf{a}}{dt}$$
Second term (integral term), element by element:
$$\frac{d}{dt}\left[\int_0^{a_i} f^{-1}(u)\,du\right] = \frac{d}{da_i}\left[\int_0^{a_i} f^{-1}(u)\,du\right]\frac{da_i}{dt} = f^{-1}(a_i)\frac{da_i}{dt} = n_i\frac{da_i}{dt}$$
Summing over the neurons gives the vector form:
$$\frac{d}{dt}\sum_{i=1}^{S}\left[\int_0^{a_i} f^{-1}(u)\,du\right] = \mathbf{n}^T\frac{d\mathbf{a}}{dt}$$
Third term (linear term):
$$\frac{d}{dt}\left[-\mathbf{b}^T\mathbf{a}\right] = -\mathbf{b}^T\frac{d\mathbf{a}}{dt}$$

Slide 8: Complete Lyapunov Derivative
Combining the three terms:
$$\frac{dV}{dt} = \left[-\mathbf{W}\mathbf{a} + \mathbf{n} - \mathbf{b}\right]^T\frac{d\mathbf{a}}{dt}$$
From the system equations we know:
$$\epsilon\frac{d\mathbf{n}}{dt} = -\mathbf{n} + \mathbf{W}\mathbf{a} + \mathbf{b} \quad\Rightarrow\quad -\mathbf{W}\mathbf{a} + \mathbf{n} - \mathbf{b} = -\epsilon\frac{d\mathbf{n}}{dt}$$
So, since $n_i = f^{-1}(a_i)$, the derivative can be written:
$$\frac{dV}{dt} = -\epsilon\left(\frac{d\mathbf{n}}{dt}\right)^T\frac{d\mathbf{a}}{dt} = -\epsilon\sum_{i=1}^{S}\left[\frac{d}{da_i}f^{-1}(a_i)\right]\left(\frac{da_i}{dt}\right)^2$$
If $\dfrac{d}{da_i}f^{-1}(a_i) > 0$, then $\dfrac{dV}{dt} \le 0$.
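
A quick numerical check of this result (a sketch, not from the slides): simulate the same hypothetical two-neuron circuit as above and verify that V(a) never increases along the trajectory, using the closed-form integral term for the arctan amplifier.

```python
import numpy as np

gain, eps, dt = 1.4, 1.0, 0.01
W = np.array([[0.0, 1.0], [1.0, 0.0]])   # hypothetical weights, b = 0
b = np.zeros(2)
f = lambda n: (2.0 / np.pi) * np.arctan(gain * np.pi * n / 2.0)

def lyapunov(a):
    """V(a) = -0.5 a'Wa + sum_i int_0^{a_i} f^-1(u) du - b'a,
    with int_0^{a_i} f^-1(u) du = -(4/(gain*pi^2)) * log(cos(pi*a_i/2))."""
    integral = -(4.0 / (gain * np.pi ** 2)) * np.log(np.cos(np.pi * a / 2.0))
    return -0.5 * a @ W @ a + integral.sum() - b @ a

n = np.array([0.5, -0.1])
V_prev = lyapunov(f(n))
for _ in range(5000):
    n = n + dt * (-n + W @ f(n) + b) / eps   # Euler step of the circuit equation
    V_now = lyapunov(f(n))
    assert V_now <= V_prev + 1e-6            # dV/dt <= 0: V is non-increasing
    V_prev = V_now
print("V decreased monotonically to", V_prev)
```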

Slide 9: Invariant Sets
This will be zero only if the neuron outputs are not changing:
$$\frac{d\mathbf{a}}{dt} = \mathbf{0}$$
Therefore, the system energy is not changing only at the equilibrium points of the circuit. Thus, all points in the invariant set Z (the set of equilibrium points) are potential attractors.

Slide 10: Example

Slide 11: Example Lyapunov Function
$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T \mathbf{W} \mathbf{a} + \sum_{i=1}^{S}\left[\int_0^{a_i} f^{-1}(u)\,du\right] - \mathbf{b}^T\mathbf{a}$$

Slide 12: Example Network Equations

Slide 13: Lyapunov Function and Trajectory (surface and contour plots of V(a) over the $a_1$-$a_2$ plane, with a network trajectory)

Slide 14: Time Response (plots of $a_1$, $a_2$, and V(a) versus time $t$)

Slide 15: Convergence to a Saddle Point (trajectory in the $a_1$-$a_2$ plane)

Slide 16: Hopfield Attractors
The potential attractors of the Hopfield network satisfy $d\mathbf{a}/dt = \mathbf{0}$. How are these points related to the minima of V(a)? The minima must satisfy:
$$\nabla V(\mathbf{a}) = \left[\frac{\partial V}{\partial a_1}\;\; \frac{\partial V}{\partial a_2}\;\; \cdots\;\; \frac{\partial V}{\partial a_S}\right]^T = \mathbf{0}$$
where the Lyapunov function is given by:
$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T \mathbf{W} \mathbf{a} + \sum_{i=1}^{S}\left[\int_0^{a_i} f^{-1}(u)\,du\right] - \mathbf{b}^T\mathbf{a}$$

Slide 17: Hopfield Attractors
Using previous results, we can show that:
$$\nabla V(\mathbf{a}) = -\epsilon\frac{d\mathbf{n}}{dt}$$
The $i$th element of the gradient is therefore:
$$\frac{\partial V(\mathbf{a})}{\partial a_i} = -\epsilon\frac{dn_i}{dt} = -\epsilon\frac{d}{dt}f^{-1}(a_i) = -\epsilon\left[\frac{d}{da_i}f^{-1}(a_i)\right]\frac{da_i}{dt}$$
Since the transfer function and its inverse are monotonic increasing, $\frac{d}{da_i}f^{-1}(a_i) > 0$. All points for which $\frac{da_i}{dt} = 0$ will also satisfy $\frac{\partial V(\mathbf{a})}{\partial a_i} = 0$. Therefore all attractors will be stationary points of V(a).
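
To illustrate the conclusion numerically (a sketch using the same hypothetical two-neuron network as above): relax the circuit to an equilibrium and evaluate the gradient of V there, using $\partial V/\partial a_i = -(\mathbf{W}\mathbf{a})_i + f^{-1}(a_i) - b_i$.

```python
import numpy as np

gain = 1.4
W = np.array([[0.0, 1.0], [1.0, 0.0]])   # hypothetical weights, b = 0
b = np.zeros(2)
f     = lambda n: (2.0 / np.pi) * np.arctan(gain * np.pi * n / 2.0)
f_inv = lambda a: (2.0 / (gain * np.pi)) * np.tan(np.pi * a / 2.0)

# Relax the circuit to (near) equilibrium with Euler steps.
n = np.array([0.5, -0.1])
for _ in range(20000):
    n = n + 0.01 * (-n + W @ f(n) + b)

# Gradient of V at the attractor: dV/da_i = -(W a)_i + f^-1(a_i) - b_i.
a = f(n)
grad_V = -W @ a + f_inv(a) - b
print(grad_V)   # essentially zero: the attractor is a stationary point of V(a)
```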

Slide 18: Effect of Gain
Amplifier transfer function (plotted for several values of the gain $\gamma$):
$$a = f(n) = \frac{2}{\pi}\tan^{-1}\!\left(\frac{\gamma\pi n}{2}\right)$$

Slide 19: Lyapunov Function
$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T \mathbf{W} \mathbf{a} + \sum_{i=1}^{S}\left[\int_0^{a_i} f^{-1}(u)\,du\right] - \mathbf{b}^T\mathbf{a}$$
For this transfer function the integral term is:
$$\int_0^{a_i} f^{-1}(u)\,du = \int_0^{a_i}\frac{2}{\gamma\pi}\tan\!\left(\frac{\pi u}{2}\right)du = -\frac{4}{\gamma\pi^2}\log\!\left[\cos\!\left(\frac{\pi a_i}{2}\right)\right]$$
(The slide plots this term for gains $\gamma = 0.14$, $1.4$, and $14$.)

Slide 20: High Gain Lyapunov Function
The integral term is proportional to $1/\gamma$, so as $\gamma \to \infty$ the Lyapunov function reduces to:
$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a} - \mathbf{b}^T\mathbf{a}$$
The high-gain Lyapunov function is quadratic, with Hessian:
$$\nabla^2 V(\mathbf{a}) = -\mathbf{W}$$
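
As a small numerical illustration (a sketch; the gain values match the plot on the previous slide), the closed-form integral term shrinks in proportion to $1/\gamma$, which is why only the quadratic part survives in the high-gain limit.

```python
import numpy as np

def integral_term(a_i, gain):
    """int_0^{a_i} f^-1(u) du = -(4/(gain*pi^2)) * log(cos(pi*a_i/2)) for the arctan amplifier."""
    return -(4.0 / (gain * np.pi ** 2)) * np.log(np.cos(np.pi * a_i / 2.0))

for gain in (0.14, 1.4, 14.0, 1400.0):
    print(gain, integral_term(0.9, gain))   # the term shrinks like 1/gain as the gain grows
```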

Slide 21: Example (surface and contour plots of the high-gain V(a) over the $a_1$-$a_2$ plane)

Slide 22: Hopfield Design
The Hopfield network will minimize the following (high-gain) Lyapunov function:
$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a} - \mathbf{b}^T\mathbf{a}$$
Choose the weight matrix W and the bias vector b so that V takes on the form of a function you want to minimize.

Slide 23: Content-Addressable Memory
Content-addressable memory: retrieves stored memories on the basis of part of the contents.
Prototype patterns: $\{\mathbf{p}_1, \mathbf{p}_2, \ldots, \mathbf{p}_Q\}$ (bipolar vectors).
Proposed performance index:
$$J(\mathbf{a}) = -\frac{1}{2}\sum_{q=1}^{Q}\left([\mathbf{p}_q]^T\mathbf{a}\right)^2$$
For orthogonal prototypes, if we evaluate the performance index at a prototype:
$$J(\mathbf{p}_j) = -\frac{1}{2}\sum_{q=1}^{Q}\left([\mathbf{p}_q]^T\mathbf{p}_j\right)^2 = -\frac{1}{2}S^2$$
J(a) will be largest when a is not close to any prototype pattern, and smallest when a is equal to a prototype pattern.
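
A small worked example of the performance index (a sketch; the prototypes are hypothetical orthogonal bipolar vectors in S = 4 dimensions, not from the slides):

```python
import numpy as np

# Hypothetical orthogonal bipolar prototypes (S = 4, Q = 2).
p1 = np.array([1,  1,  1,  1])
p2 = np.array([1, -1,  1, -1])
prototypes = [p1, p2]

def J(a):
    """Proposed performance index J(a) = -1/2 * sum_q (p_q' a)^2."""
    return -0.5 * sum(float(p @ a) ** 2 for p in prototypes)

print(J(p1))                         # -S^2/2 = -8.0 at a prototype (smallest value)
print(J(np.array([1, 1, -1, -1])))   # 0.0 for a vector orthogonal to both prototypes (largest value)
```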

Slide 24: Hebb Rule
If we use the supervised Hebb rule to compute the weight matrix (with $\mathbf{b} = \mathbf{0}$):
$$\mathbf{W} = \sum_{q=1}^{Q}\mathbf{p}_q\mathbf{p}_q^T$$
the Lyapunov function will be:
$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T\left[\sum_{q=1}^{Q}\mathbf{p}_q\mathbf{p}_q^T\right]\mathbf{a}$$
This can be rewritten:
$$V(\mathbf{a}) = -\frac{1}{2}\sum_{q=1}^{Q}\left([\mathbf{p}_q]^T\mathbf{a}\right)^2 = J(\mathbf{a})$$
Therefore the Lyapunov function is equal to our performance index for the content-addressable memory.
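
The equivalence can be checked directly (a sketch reusing the hypothetical prototypes from the previous block):

```python
import numpy as np

p1 = np.array([1,  1,  1,  1])   # hypothetical bipolar prototypes (S = 4, Q = 2)
p2 = np.array([1, -1,  1, -1])
prototypes = [p1, p2]

# Supervised Hebb rule: W = sum_q p_q p_q' (outer products), with b = 0.
W = sum(np.outer(p, p) for p in prototypes)

rng = np.random.default_rng(0)
a = rng.choice([-1, 1], size=4)
V = -0.5 * a @ W @ a                                    # high-gain Lyapunov function
J = -0.5 * sum(float(p @ a) ** 2 for p in prototypes)   # CAM performance index
print(np.isclose(V, J))                                 # True: V(a) = J(a)
```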

Slide 25: Hebb Rule Analysis
If we apply prototype $\mathbf{p}_j$ to the network:
$$\mathbf{W}\mathbf{p}_j = \left[\sum_{q=1}^{Q}\mathbf{p}_q\mathbf{p}_q^T\right]\mathbf{p}_j = \mathbf{p}_j\left(\mathbf{p}_j^T\mathbf{p}_j\right) = S\,\mathbf{p}_j$$
(using the orthogonality of the prototypes). Therefore each prototype is an eigenvector, and they have a common eigenvalue of S. The eigenspace for the eigenvalue $\lambda = S$ is therefore:
$$X = \operatorname{span}\{\mathbf{p}_1, \mathbf{p}_2, \ldots, \mathbf{p}_Q\}$$
a Q-dimensional space of all vectors which can be written as linear combinations of the prototype vectors.

Slide 26: Weight Matrix Eigenspace
The entire input space can be divided into two disjoint sets:
$$\mathbb{R}^S = X \cup X^{\perp}$$
where $X^{\perp}$ is the orthogonal complement of X. For vectors a in the orthogonal complement we have:
$$\mathbf{p}_q^T\mathbf{a} = 0, \quad q = 1, 2, \ldots, Q$$
Therefore:
$$\mathbf{W}\mathbf{a} = \left[\sum_{q=1}^{Q}\mathbf{p}_q\mathbf{p}_q^T\right]\mathbf{a} = \mathbf{0} = 0\cdot\mathbf{a}$$
The eigenvalues of W are S and 0, with corresponding eigenspaces X and $X^{\perp}$. For the Hessian matrix $\nabla^2 V = -\mathbf{W}$, the eigenvalues are -S and 0, with the same eigenspaces.
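
These eigenvalues and eigenspaces are easy to verify numerically (a sketch with the same hypothetical prototypes, S = 4 and Q = 2):

```python
import numpy as np

p1 = np.array([1,  1,  1,  1])   # hypothetical prototypes: S = 4, Q = 2
p2 = np.array([1, -1,  1, -1])
W = np.outer(p1, p1) + np.outer(p2, p2)

print(np.sort(np.linalg.eigvalsh(W)))    # [0, 0, 4, 4]: eigenvalue S on X, 0 on X-perp
print(np.sort(np.linalg.eigvalsh(-W)))   # Hessian of the high-gain V: eigenvalues -S and 0
```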

Slide 27: Lyapunov Surface
The high-gain Lyapunov function is a quadratic function. Therefore, the eigenvalues of the Hessian matrix determine its shape. Because the first eigenvalue is negative, V will have negative curvature in X. Because the second eigenvalue is zero, V will have zero curvature in $X^{\perp}$. Because V has negative curvature in X, the trajectories of the Hopfield network will tend to fall into the corners of the hypercube $\{\mathbf{a} : -1 \le a_i \le 1\}$ that are contained in X.

Slide 28: Example

Slide 29: Zero Diagonal Elements
We can zero the diagonal elements of the weight matrix:
$$\mathbf{W}' = \mathbf{W} - Q\,\mathbf{I}$$
The prototypes remain eigenvectors of this new matrix, but the corresponding eigenvalue is now (S - Q):
$$\mathbf{W}'\mathbf{p}_j = (\mathbf{W} - Q\,\mathbf{I})\,\mathbf{p}_j = S\,\mathbf{p}_j - Q\,\mathbf{p}_j = (S - Q)\,\mathbf{p}_j$$
The elements of $X^{\perp}$ also remain eigenvectors of this new matrix, with a corresponding eigenvalue of (-Q):
$$\mathbf{W}'\mathbf{a} = (\mathbf{W} - Q\,\mathbf{I})\,\mathbf{a} = -Q\,\mathbf{a}, \qquad \mathbf{a}\in X^{\perp}$$
The Lyapunov surface will have negative curvature in X and positive curvature in $X^{\perp}$, in contrast with the original Lyapunov function, which had negative curvature in X and zero curvature in $X^{\perp}$.
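
A numerical check of the modified matrix (a sketch, same hypothetical prototypes): subtracting Q from the diagonal leaves the prototypes as eigenvectors with eigenvalue S - Q and gives the orthogonal complement the eigenvalue -Q.

```python
import numpy as np

p1 = np.array([1,  1,  1,  1])   # hypothetical prototypes: S = 4, Q = 2
p2 = np.array([1, -1,  1, -1])
Q  = 2
W  = np.outer(p1, p1) + np.outer(p2, p2)
Wp = W - Q * np.eye(4)           # W' = W - Q*I zeroes the diagonal

print(np.diag(Wp))                       # all zeros
print(Wp @ p1)                           # (S - Q) * p1 = 2 * p1: prototypes are still eigenvectors
print(np.sort(np.linalg.eigvalsh(Wp)))   # eigenvalues S - Q = 2 and -Q = -2
```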

Slide 30: Example
If the initial condition falls exactly on the line $a_1 = -a_2$ and the weight matrix W is used, the network output will remain constant. If the same initial condition is used with the zero-diagonal weight matrix W', the network output will converge to the saddle point at the origin.

