Lecture 39 Hopfield Network

Outline
- Fundamentals of the Hopfield net
- Analog implementation
- Associative retrieval
- Solving optimization problems
(C) 2001-2003 by Yu Hen Hu

Fundamentals of Hopfield Net
Proposed by J. J. Hopfield. A fully connected, feedback, fixed-weight network. Each neuron accepts input from the outputs of all other neurons and from its own external input:
- Net function: u_i = Σ_j w_ij V_j + I_i − T_i
- Output: V_i = sgn(u_i)
[Figure: three-amplifier circuit with external inputs I1, I2, I3, thresholds T1, T2, T3, and outputs V1, V2, V3.]

Discrete Time Formulation
Define V = [V1, V2, ..., Vn]^T, T = [T1, T2, ..., Tn]^T, I = [I1, I2, ..., In]^T, and the weight matrix W = [w_ij]. Then

V(t+1) = sgn{ W V(t) + I(t) − T(t) }
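The synchronous update rule above can be sketched in a few lines. The weight matrix below is illustrative (not the slide's own, which is not reproduced in the transcript): it is the outer-product matrix W = b b^T − I for b = [1, 1, 1, −1]^T, with zero inputs and thresholds, and sgn(0) taken as +1 since the slides do not state a tie rule.

```python
import numpy as np

def hopfield_step(V, W, I, T):
    """One synchronous update: V(t+1) = sgn{W V(t) + I - T}.
    sgn(0) is taken as +1 (an assumption; the slides give no tie rule)."""
    return np.where(W @ V + I - T >= 0, 1, -1)

# Illustrative 4-neuron network: outer-product weights for b = [1, 1, 1, -1]^T.
b = np.array([1, 1, 1, -1])
W = np.outer(b, b) - np.eye(4, dtype=int)   # symmetric, zero diagonal
I = np.zeros(4)
T = np.zeros(4)

V = np.array([1, -1, 1, -1])   # a perturbed version of b
V = hopfield_step(V, W, I, T)  # recovers b in one step
print(V)                       # [ 1  1  1 -1]
```

Applying `hopfield_step` again leaves V unchanged, i.e., b is a fixed point of the update.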

Example
[The slide's weight matrix and the worked update steps were images and are not reproduced in this transcript.]

Example (continued)
[1 1 1 −1]^T and [−1 −1 −1 1]^T are the two stable attractors. Note that each attractor v* satisfies sgn{ W v* + I − T } = v*, so the update leaves it unchanged.

Observations
Let v* = [1 1 1 −1]^T. For any v(0) such that v^T(0) v* ≠ 0, v(t) converges to v* or −v* (matching the sign of v^T(0) v*). Otherwise, v(t) will oscillate between ±v(0). Exercise: try v(0) = [1 1 1 1]^T.
Discussion:
- Synchronous update: all neurons are updated together. Suitable for digital implementation.
- Asynchronous update: some neurons are updated faster than others; not all neurons are updated simultaneously. Most natural for analog implementation.

Lyapunov Function for Stability
Consider a scalar function E(V) satisfying:
(i) E(V*) = 0;
(ii) E(V) > 0 for V ≠ V*;
(iii) dE/dt = 0 at V = V*, and dE/dt < 0 for V ≠ V* along the system's trajectories.
If such an E(V) can be found, it is called a Lyapunov function, and the system is asymptotically stable (i.e., V → V* as t → ∞).

Hopfield Net Energy Function
E(v) = −(1/2) Σ_i Σ_j w_ij v_i v_j − Σ_i I_i v_i + Σ_i T_i v_i
Hence, the Hopfield net dynamic equation minimizes E(v) along the descending gradient direction: each update v_i = sgn(Σ_j w_ij v_j + I_i − T_i) can only lower E.
Stability of the Hopfield net: if w_ij = w_ji and w_ii = 0, the output converges to a local minimum of E (instead of oscillating).
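The energy and its monotone decrease under asynchronous updates can be checked numerically. The energy expression is the standard quadratic form consistent with the update rule V(t+1) = sgn{W V + I − T}; the slide's own formula is not reproduced in the transcript, so treat this form as an assumption:

```python
import numpy as np

def energy(V, W, I, T):
    """E(V) = -1/2 V^T W V - I^T V + T^T V (standard form, assumed here)."""
    return -0.5 * V @ W @ V - I @ V + T @ V

b = np.array([1, 1, 1, -1])
W = np.outer(b, b) - np.eye(4, dtype=int)   # symmetric, w_ii = 0
I = np.zeros(4); T = np.zeros(4)

V = np.array([1, -1, 1, -1])
e0 = energy(V, W, I, T)
# One asynchronous update of neuron 1 toward the sign of its net input:
V[1] = 1 if W[1] @ V + I[1] - T[1] >= 0 else -1
e1 = energy(V, W, I, T)
assert e1 <= e0   # energy never increases under asynchronous updates
```

With the symmetric, zero-diagonal W above, flipping neuron 1 drops the energy from 0 to −6, landing on the attractor b.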

Associative Retrieval
Goal: store a set of binary vectors {b_m; 1 ≤ m ≤ M} such that when a perturbed b'_m is presented as the input I, the binary output converges to V = b_m.
Weight matrix (outer-product rule, assuming binary values ±1):
W = Σ_m b_m b_m^T − M·I_n   (I_n is the n-by-n identity, so w_ii = 0)

Example
b1 = [1 1 1 −1]^T, b2 = [1 1 −1 −1]^T. Let I = V(0) = [−1 1 −1 −1]^T; this input differs from b2 in only the first component, and the iteration V(t+1) = sgn{ W V(t) + I } converges to V = b2.
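The retrieval above can be reproduced directly, assuming the weight matrix follows the outer-product rule with zeroed diagonal (the slide's own matrix is not shown in the transcript):

```python
import numpy as np

# Store b1, b2 with the outer-product rule, zero diagonal.
b1 = np.array([1, 1, 1, -1])
b2 = np.array([1, 1, -1, -1])
W = np.outer(b1, b1) + np.outer(b2, b2) - 2 * np.eye(4, dtype=int)

# Present the perturbed input I = [-1, 1, -1, -1]^T (one bit away from b2)
# and iterate V(t+1) = sgn{W V(t) + I}.
I = np.array([-1, 1, -1, -1])
V = I.copy()
for _ in range(5):
    V = np.where(W @ V + I >= 0, 1, -1)
print(V)   # [ 1  1 -1 -1]  ==  b2
```

The network corrects the flipped first bit and settles on the stored pattern b2 after a single step.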

Hopfield Net Solution to TSP (Hopfield and Tank)
Use an n-by-n matrix V to represent a tour: V_xi = 1 if city x is visited as the i-th stop. Each entry is a neuron!
[Figure: example city/tour matrix for five cities A-E and stops 1-5.]

Energy Function
E = (A/2) Σ_x Σ_i Σ_{j≠i} V_xi V_xj + (B/2) Σ_i Σ_x Σ_{y≠x} V_xi V_yi + (C/2) (Σ_x Σ_i V_xi − n)^2 + (D/2) Σ_x Σ_{y≠x} Σ_i d_xy V_xi (V_y,i+1 + V_y,i−1)
The first three terms force V to be a permutation matrix; the last term minimizes the tour distance.
- Validity of the solution: depends on the choice of the A, B, C, D coefficients in the TSP energy.
- Quality of the solution: the initial condition affects which local minimum the network settles into.
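The decomposition into validity terms plus a distance term can be sketched as follows. This is the standard Hopfield-Tank textbook form; the slide's exact coefficients and notation are not reproduced in the transcript, so the expression is an assumption:

```python
import numpy as np

def tsp_energy(V, d, A=1.0, B=1.0, C=1.0, D=1.0):
    """Hopfield-Tank TSP energy. V[x, i] = 1 iff city x is the i-th stop;
    d is the symmetric city-distance matrix. Coefficients are placeholders."""
    n = V.shape[0]
    # Row term: zero iff each city occupies at most one stop.
    row = 0.5 * A * np.sum(V.sum(axis=1) ** 2 - (V ** 2).sum(axis=1))
    # Column term: zero iff each stop holds at most one city.
    col = 0.5 * B * np.sum(V.sum(axis=0) ** 2 - (V ** 2).sum(axis=0))
    # Global term: exactly n entries are "on".
    glob = 0.5 * C * (V.sum() - n) ** 2
    # Distance term: sums d_xy over consecutive stops (cyclic tour).
    nbrs = np.roll(V, -1, axis=1) + np.roll(V, 1, axis=1)
    dist = 0.5 * D * np.sum(d * (V @ nbrs.T))
    return row + col + glob + dist

# A valid tour is a permutation matrix: the three penalty terms vanish
# and only the tour length remains.
V = np.eye(4)                     # tour 0 -> 1 -> 2 -> 3 -> 0
d = np.array([[0, 1, 2, 1],
              [1, 0, 1, 2],
              [2, 1, 0, 1],
              [1, 2, 1, 0]])
print(tsp_energy(V, d))           # 4.0: length of the cyclic tour
```

Any invalid configuration (e.g., the all-ones matrix) picks up large positive penalty terms, which is what drives the network toward permutation matrices.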