CS623: Introduction to Computing with Neural Nets (lecture-11)
Pushpak Bhattacharyya
Computer Science and Engineering Department, IIT Bombay

Hopfield net (recap of main points)
- Inspired by associative memory: memory retrieval is not by address, but by part of the data (content addressing).
- Consists of N neurons, fully connected with symmetric weights: wij = wji.
- No self-connection, so the weight matrix is 0-diagonal and symmetric.
- Each computing element (neuron) is a linear threshold element with threshold = 0.
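As a minimal sketch of these rules in code (the function names and the convention that a net input of exactly 0 maps to +1 are my own assumptions, not from the lecture):

```python
import numpy as np

def net_input(W, x, i):
    """Weighted sum of inputs to neuron i. W is symmetric with a zero
    diagonal, so a neuron's own state never feeds back into itself."""
    return W[i] @ x

def update_neuron(W, x, i):
    """Asynchronous update of neuron i with threshold 0: output +1 if
    the net input is >= 0, else -1. Returns True if the state flipped."""
    new_state = 1.0 if net_input(W, x, i) >= 0 else -1.0
    flipped = (new_state != x[i])
    x[i] = new_state
    return flipped
```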

Connection matrix of the network, 0-diagonal and symmetric
[Fig.: an n x n matrix with rows and columns labelled by the neurons n1, n2, n3, …, nk; cell (i, j) holds wij; the diagonal entries are all 0.]

Example
w12 = w21 = 5, w13 = w31 = 3, w23 = w32 = 2
At time t = 0: s1(0) = 1, s2(0) = -1, s3(0) = 1
This is an unstable state: neuron 1 will flip, because its net input w12·s2 + w13·s3 = 5·(-1) + 3·1 = -2 is below the threshold 0 while its state is 1.
A stable pattern is called an attractor for the net.

Concept of Energy
Energy at state s is given by the equation:

E(s) = -(1/2) Σ_i Σ_{j≠i} wij xi xj

(all thresholds are 0, so there is no threshold term; the factor 1/2 compensates for each pair being counted twice).
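A direct transcription of this energy into code, evaluated on the running example (a sketch; `energy` is my own name for the function):

```python
import numpy as np

def energy(W, x):
    """Hopfield energy E = -1/2 * sum over i,j of w_ij x_i x_j,
    for symmetric W with zero diagonal and a +/-1 state vector x."""
    return -0.5 * x @ W @ x

# Weights of the running example: w12 = 5, w13 = 3, w23 = 2.
W = np.array([[0, 5, 3],
              [5, 0, 2],
              [3, 2, 0]], dtype=float)
x = np.array([1, -1, 1], dtype=float)
print(energy(W, x))   # 4.0 -- not a minimum, as the next slides show
```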

Relation between weight matrix W and state vector X
For example, in Fig. 1 (the three-neuron net of the example, with weights 5, 3, 2), at time t = 0 the state of the neural network is s(0) = <1, -1, 1>, and the corresponding matrix and transposed state vector are:

W = | 0  5  3 |        XT = |  1 |
    | 5  0  2 |             | -1 |
    | 3  2  0 |             |  1 |

W·XT gives the inputs to the neurons at the next time instant:

W·XT = <0·1 + 5·(-1) + 3·1, 5·1 + 0·(-1) + 2·1, 3·1 + 2·(-1) + 0·1>T = <-2, 7, 1>T

The input to neuron 1 is -2, below the threshold 0, while its current state is +1. This shows that the n/w will change state.
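The same product in numpy (a sketch, reusing W and x from the example):

```python
import numpy as np

W = np.array([[0, 5, 3],
              [5, 0, 2],
              [3, 2, 0]], dtype=float)
x = np.array([1, -1, 1], dtype=float)

inputs = W @ x    # net input to each neuron at the next time instant
print(inputs)     # [-2.  7.  1.]
# The input to neuron 1 (-2) disagrees in sign with its state (+1),
# so the network will change state: neuron 1 flips.
```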

Theorem
In the asynchronous mode of operation, the energy of the Hopfield net always decreases.
Proof:

Proof
Let neuron 1 change state (by summing its inputs and comparing with the threshold) between time t1 and time t2. We get the following equation for the energy at each instant:

E(t1) = -(1/2) Σ_i Σ_{j≠i} wij xi(t1) xj(t1)
E(t2) = -(1/2) Σ_i Σ_{j≠i} wij xi(t2) xj(t2)

Proof: note that only neuron 1 changes state
Since only neuron 1 changes state, xj(t1) = xj(t2), j = 2, 3, 4, …, n. Every term of the energy that does not involve neuron 1 therefore cancels in the difference, and hence

ΔE = E(t2) - E(t1) = -[x1(t2) - x1(t1)] · Σ_{j=2..n} w1j xj(t1) = (D) · (S)

where (D) = -[x1(t2) - x1(t1)] and (S) = Σ_{j=2..n} w1j xj(t1) is the net input to neuron 1.

Proof (continued)
Observations:
- When the state changes from -1 to 1, the net input (S) has to be +ve (that is what makes the neuron switch to 1), and (D) = -(1 - (-1)) = -2 is -ve; so ΔE becomes negative.
- When the state changes from 1 to -1, (S) has to be -ve and (D) = -(-1 - 1) = +2 is +ve; so ΔE becomes negative.
Therefore, the energy decreases for any state change.
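A numerical spot-check of the identity ΔE = (D)·(S) on random weights (a sketch; the network size, seed, and choice of neuron are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W, x):
    return -0.5 * x @ W @ x

n = 8
A = rng.normal(size=(n, n))
W = (A + A.T) / 2                 # symmetric weights
np.fill_diagonal(W, 0.0)          # no self-connections
x = rng.choice([-1.0, 1.0], size=n)

i = 3                             # asynchronously update one neuron
S = W[i] @ x                      # (S): net input to neuron i
x_new = x.copy()
x_new[i] = 1.0 if S >= 0 else -1.0
D = -(x_new[i] - x[i])            # (D)

dE = energy(W, x_new) - energy(W, x)
print(np.isclose(dE, D * S))      # True: Delta-E = (D) * (S)
print(dE <= 0)                    # True: the energy never increases
```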

The Hopfield net has to "converge" in the asynchronous mode of operation
As the energy E goes on decreasing, it has to hit a bottom: the weights are fixed and the state vector ranges over finitely many values (2^n possible states), so E is bounded below and can decrease only a finite number of times. That is, the Hopfield net has to converge to an energy minimum. Hence the Hopfield net reaches stability.
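Running the asynchronous dynamics on the three-neuron example until no neuron flips (a sketch; the sweep order and the tie-break at net input 0 are my own choices):

```python
import numpy as np

def energy(W, x):
    return -0.5 * x @ W @ x

W = np.array([[0, 5, 3],
              [5, 0, 2],
              [3, 2, 0]], dtype=float)
x = np.array([1, -1, 1], dtype=float)

changed = True
while changed:                    # sweep until a full pass makes no flip
    changed = False
    for i in range(len(x)):
        new = 1.0 if W[i] @ x >= 0 else -1.0
        if new != x[i]:
            x[i] = new
            changed = True
            print(f"neuron {i + 1} flips, E = {energy(W, x)}")

print("attractor:", x)            # [-1. -1. -1.] with E = -10.0
```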

Training of Hopfield Net
An early training rule was proposed by Hopfield, inspired by the concept of electron spin. It is Hebb's rule of learning: if two neurons i and j have activations xi and xj respectively, then the weight wij between the two neurons is directly proportional to the product xi·xj, i.e.,

wij ∝ xi · xj

Hopfield Rule
Training by the Hopfield rule means training the net for a specific memory behaviour: storing memory elements. How are patterns stored?

Hopfield Rule
To store a pattern <xn, xn-1, …, x3, x2, x1>, make

wij = (1/(n-1)) · xi · xj,   i ≠ j

Storing a pattern is equivalent to making that pattern the stable state of the net.
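In code this rule is an outer product scaled by 1/(n-1), with the diagonal zeroed (a sketch; `store_pattern` is my own name for it):

```python
import numpy as np

def store_pattern(x):
    """Hopfield rule: w_ij = x_i * x_j / (n - 1), with w_ii = 0."""
    n = len(x)
    W = np.outer(x, x) / (n - 1)
    np.fill_diagonal(W, 0.0)
    return W

W = store_pattern(np.array([1, -1, 1], dtype=float))
print(W)
# [[ 0.  -0.5  0.5]
#  [-0.5  0.  -0.5]
#  [ 0.5 -0.5  0. ]]
```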

Training of Hopfield Net
Establish that <xn, xn-1, …, x3, x2, x1> is a stable state of the net. To show the stability, impress the pattern on the net at t = 0:

xi(0) = xi for every neuron i

Training of Hopfield Net
Consider neuron i at t = 1. Its net input is

Σ_{j≠i} wij xj(0) = Σ_{j≠i} (1/(n-1)) xi xj · xj = (1/(n-1)) xi Σ_{j≠i} xj²

Establishing stability
Since xj² = 1 for ±1 activations, the sum over the n-1 neurons j ≠ i gives

net input to neuron i = (1/(n-1)) · xi · (n-1) = xi

so xi(1) = xi: no neuron changes state, and the impressed pattern is a stable state of the net.

Example
We want <1, -1, 1> as stored memory (neurons A, B, C with xA = 1, xB = -1, xC = 1).
Calculate all the wij values:
wAB = 1/(3-1) · 1 · (-1) = -1/2
Similarly wBC = -1/2 and wCA = 1/2.
Is <1, -1, 1> stable? After the weights are assigned, the net input to A is wAB·xB + wCA·xC = (-1/2)(-1) + (1/2)(1) = 1, which matches the sign of xA = 1; likewise B receives -1 and C receives 1. So the pattern is stable.
[Fig.: the three-neuron net A, B, C, before and after the weights -0.5, -0.5, 0.5 are assigned.]
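Verifying this example in code (a sketch; `store_pattern` is repeated from above so the snippet runs standalone):

```python
import numpy as np

def store_pattern(x):
    n = len(x)
    W = np.outer(x, x) / (n - 1)
    np.fill_diagonal(W, 0.0)
    return W

x = np.array([1, -1, 1], dtype=float)      # pattern <xA, xB, xC>
W = store_pattern(x)

inputs = W @ x                             # net input to each neuron
print(inputs)                              # [ 1. -1.  1.]
print(np.array_equal(np.sign(inputs), x))  # True: the pattern is stable
```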

Observations
- How much deviation can the net tolerate?
- What if more than one pattern is to be stored?

Storing k patterns
Let the patterns be:
P1 : <xn, xn-1, …, x3, x2, x1>1
P2 : <xn, xn-1, …, x3, x2, x1>2
.
.
Pk : <xn, xn-1, …, x3, x2, x1>k
The generalized Hopfield rule is:

wij = (1/(n-1)) Σ_{p=1..k} xi^p · xj^p

where the superscript p denotes the pth pattern.
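The generalized rule is the sum of the single-pattern outer products (a sketch; `store_patterns` is my own name, and the two example patterns are arbitrary):

```python
import numpy as np

def store_patterns(patterns):
    """Generalized Hopfield rule:
    w_ij = (1/(n-1)) * sum over p of x_i^p * x_j^p, with w_ii = 0.
    `patterns` is a (k, n) array of +/-1 vectors."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / (n - 1)   # sum of outer products
    np.fill_diagonal(W, 0.0)
    return W

P = np.array([[ 1, -1,  1, -1,  1],       # pattern P1
              [-1, -1,  1,  1,  1]],      # pattern P2
             dtype=float)
W = store_patterns(P)
print(W.shape)   # (5, 5)
```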

Storing k patterns
Study the stability of one of the stored patterns, say the qth: <xn, xn-1, …, x3, x2, x1>q. Impress the vector at t = 0 and observe the network dynamics. Looking at neuron i at t = 1, we have the net input

neti = Σ_{j≠i} wij xj^q = (1/(n-1)) Σ_{j≠i} Σ_{p=1..k} xi^p xj^p xj^q

Examining stability of the qth pattern
Separate the p = q term from the rest:

neti = (1/(n-1)) Σ_{j≠i} xi^q (xj^q)² + (1/(n-1)) Σ_{j≠i} Σ_{p≠q} xi^p xj^p xj^q
     = xi^q + (1/(n-1)) Σ_{j≠i} Σ_{p≠q} xi^p xj^p xj^q

The first term is the signal: it reproduces xi^q exactly. The second term is the crosstalk from the other k-1 stored patterns. Small when k << n: for roughly uncorrelated ±1 patterns the crosstalk is then too small to change the sign of neti, so the qth pattern remains stable.

Storing k patterns
The condition for the patterns to be stable on a Hopfield net with n neurons is k << n. The storage capacity of the Hopfield net is therefore very small, and hence it is not a practical memory element.
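A small experiment illustrating the k << n condition (a sketch; the network size, pattern counts, and stability test are my own choices):

```python
import numpy as np

rng = np.random.default_rng(42)

def store_patterns(patterns):
    n = patterns.shape[1]
    W = patterns.T @ patterns / (n - 1)
    np.fill_diagonal(W, 0.0)
    return W

n = 100
for k in (2, 5, 10, 20, 40):
    P = rng.choice([-1.0, 1.0], size=(k, n))   # k random patterns
    W = store_patterns(P)
    # A pattern is stable if no neuron flips when the pattern is impressed.
    stable = sum(
        np.array_equal(np.where(W @ p >= 0, 1.0, -1.0), p) for p in P
    )
    print(f"k = {k:3d}: {stable}/{k} patterns stable")
# As k grows toward n, the crosstalk swamps the signal and fewer and
# fewer of the stored patterns survive as stable states.
```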