Lecture 39 Hopfield Network

Outline
- Fundamentals of Hopfield Net
- Analog Implementation
- Associative Retrieval
- Solving Optimization Problems

(C) 2001 by Yu Hen Hu

Fundamentals of Hopfield Net
Proposed by J. J. Hopfield (1982). A fully connected, feed-back, fixed-weight network. Each neuron accepts its input from the outputs of all other neurons and its own external input:
- Net function: u_i(t) = Σ_{j≠i} w_ij V_j(t) + I_i – T_i
- Output: V_i(t+1) = sgn(u_i(t))
[Figure: a three-amplifier analog circuit with external inputs I1, I2, I3, thresholds T1, T2, T3, and outputs V1, V2, V3, each output fed back to the inputs of the other amplifiers.]

Discrete Time Formulation
Define V = [V1, V2, …, Vn]^T, T = [T1, T2, …, Tn]^T, I = [I1, I2, …, In]^T, and the weight matrix W = [w_ij] (symmetric, with zero diagonal: w_ij = w_ji, w_ii = 0). Then

V(t+1) = sgn{ W V(t) + I(t) – T(t) }
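A minimal sketch of this synchronous update in Python/NumPy (the function names and the sgn(0) = +1 convention are assumptions, not taken from the slides):

```python
import numpy as np

def sgn(u):
    # Sign nonlinearity with the convention sgn(0) = +1,
    # so each neuron always outputs +1 or -1.
    return np.where(u >= 0, 1, -1)

def hopfield_sync_step(W, V, I=0.0, T=0.0):
    # One synchronous step: V(t+1) = sgn{ W V(t) + I - T }.
    return sgn(W @ V + I - T)
```

Iterating hopfield_sync_step until V stops changing (or starts oscillating) traces out the trajectories discussed on the next two slides.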

Example
Let W be a 4×4 symmetric weight matrix with zero diagonal; one choice consistent with the attractors on the next slide is w_ij = b_i b_j for i ≠ j, with b = [1 1 1 –1]^T. Then iterate V(t+1) = sgn{ W V(t) }.

Example (continued)
[1 1 1 –1]^T and [–1 –1 –1 1]^T are the two stable attractors. Note that they are inverses of each other: with I = T = 0, sgn{ W(–V) } = –sgn{ W V }, so whenever V* is a fixed point, –V* is too.

Observations
Let v* = [1 1 1 –1]^T. For any v(0) such that v^T(0) v* ≠ 0, v(t) converges to v* or –v*. Otherwise, v(t) will oscillate between ±v(0). Exercise: try an initial state orthogonal to v*, e.g. v(0) = [1 1 –1 1]^T or [–1 –1 1 –1]^T.

Discussion:
- Synchronous update: all neurons are updated together. Suitable for digital implementation.
- Asynchronous update: some neurons are updated faster than others; not all neurons are updated simultaneously. Most natural for analog implementation.
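The asynchronous rule is sketched below under the same assumptions as before (one neuron at a time, visited in random order, each update seeing the latest values of all the others; the helper name is illustrative):

```python
import numpy as np

def hopfield_async_step(W, V, I=None, T=None, rng=None):
    # One asynchronous sweep: neurons are updated one at a time in a
    # random order, so each update sees the latest state of the others.
    n = len(V)
    I = np.zeros(n) if I is None else I
    T = np.zeros(n) if T is None else T
    rng = np.random.default_rng() if rng is None else rng
    V = V.copy()
    for i in rng.permutation(n):
        u = W[i] @ V + I[i] - T[i]
        V[i] = 1 if u >= 0 else -1
    return V
```

Unlike the synchronous rule, asynchronous updates cannot oscillate when W is symmetric with zero diagonal, as the energy argument two slides below shows.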

Lyapunov Function for Stability
Consider a scalar function E(V) satisfying:
(i) E(V*) = 0;
(ii) E(V) > 0 for V ≠ V*;
(iii) dE/dt = 0 at V = V*, and dE/dt < 0 for V ≠ V*.
If such an E(V) can be found, it is called a Lyapunov function, and the system is asymptotically stable (i.e. V → V* as t → ∞).

Hopfield Net Energy Function
E(V) = –(1/2) Σ_i Σ_j w_ij V_i V_j – Σ_i I_i V_i + Σ_i T_i V_i
Hence, the Hopfield net dynamic equation minimizes E(V) along a descending gradient direction. Stability of the Hopfield net: if w_ij = w_ji and w_ii = 0, the output will converge to a local minimum of E (instead of oscillating).
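A small self-contained check, in the same sketch style (the stored pattern and Hebbian weight matrix are illustrative assumptions), that asynchronous updates never increase this energy:

```python
import numpy as np

def energy(W, V, I, T):
    # E(V) = -1/2 V^T W V - I^T V + T^T V
    return -0.5 * V @ W @ V - I @ V + T @ V

# Illustrative 4-neuron net storing b = [1, 1, 1, -1] (Hebbian, zero diagonal).
b = np.array([1, 1, 1, -1])
W = np.outer(b, b) - np.eye(4)
I = np.zeros(4)
T = np.zeros(4)

rng = np.random.default_rng(0)
V = np.array([-1, 1, -1, -1])
E_prev = energy(W, V, I, T)
for _ in range(5):                      # a few asynchronous sweeps
    for i in rng.permutation(4):        # one neuron at a time
        u = W[i] @ V + I[i] - T[i]
        V[i] = 1 if u >= 0 else -1
    E_now = energy(W, V, I, T)
    assert E_now <= E_prev              # E never increases under async updates
    E_prev = E_now
print(V, E_prev)                        # settles into a local minimum of E
```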

Associative Retrieval
Want to store a set of binary vectors {b_m; 1 ≤ m ≤ M} such that when a perturbed b'_m is presented as the input I, the binary output converges to V = b_m.
Weight matrix (assuming binary values ±1, i.e. the Hebbian outer-product rule with zero diagonal):
w_ij = Σ_{m=1}^{M} b_{m,i} b_{m,j} for i ≠ j, and w_ii = 0.

Example
b1 = [1 1 1 –1]^T, b2 = [1 1 –1 –1]^T. Let I = V(0) = [–1 1 –1 –1]^T. Then the iteration V(t+1) = sgn{ W V(t) + I } converges to V = b2 = [1 1 –1 –1]^T, the stored pattern closest in Hamming distance to the perturbed input.
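This retrieval can be reproduced with the earlier sketch conventions (the Hebbian weights below are the assumed construction from the previous slide):

```python
import numpy as np

b1 = np.array([1, 1, 1, -1])
b2 = np.array([1, 1, -1, -1])
W = np.outer(b1, b1) + np.outer(b2, b2)
np.fill_diagonal(W, 0)                 # Hebbian rule with zero diagonal

I = np.array([-1, 1, -1, -1])          # perturbed probe, held as external input
V = I.copy()
for _ in range(10):                    # synchronous updates until stable
    V_next = np.where(W @ V + I >= 0, 1, -1)
    if np.array_equal(V_next, V):
        break
    V = V_next
print(V)                               # [ 1  1 -1 -1], i.e. b2 is retrieved
```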

Hopfield Net Solution to TSP (Hopfield and Tank)
Use an n-by-n matrix to represent a tour: V_xi = 1 when city x is the i-th stop of the tour. Each entry is a neuron!
[Figure: a 5×5 city/tour matrix for cities A–E and stops 1–5, with a single 1 in each row and each column marking each city's position in the tour.]

Energy Function
E(V) = (A/2) Σ_x Σ_i Σ_{j≠i} V_xi V_xj + (B/2) Σ_i Σ_x Σ_{y≠x} V_xi V_yi + (C/2) (Σ_x Σ_i V_xi – n)² + (D/2) Σ_x Σ_{y≠x} Σ_i d_xy V_xi (V_y,i+1 + V_y,i–1)
The first three terms make V a permutation matrix (one stop per city, one city per stop, exactly n entries on); the last term minimizes the tour distance.
- Validity of the solution: depends on the choice of the A, B, C, D coefficients in the TSP energy.
- Quality of the solution: the initial condition will affect which local minimum the network settles into.
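A sketch of this energy as code (the coefficient defaults are illustrative, not from the slides; d is the intercity distance matrix):

```python
import numpy as np

def tsp_energy(V, d, A=500.0, B=500.0, C=200.0, D=1.0):
    # V: n x n matrix, V[x, i] ~ 1 if city x is the i-th stop.
    # d: n x n symmetric matrix of intercity distances.
    n = V.shape[0]
    rows = A / 2 * sum(V[x, i] * V[x, j]
                       for x in range(n) for i in range(n)
                       for j in range(n) if j != i)     # one stop per city
    cols = B / 2 * sum(V[x, i] * V[y, i]
                       for i in range(n) for x in range(n)
                       for y in range(n) if y != x)     # one city per stop
    total = C / 2 * (V.sum() - n) ** 2                  # exactly n entries on
    tour = D / 2 * sum(d[x, y] * V[x, i] * (V[y, (i + 1) % n] + V[y, (i - 1) % n])
                       for x in range(n) for y in range(n)
                       for i in range(n) if y != x)     # tour-length term
    return rows + cols + total + tour
```

For a valid tour (V a permutation matrix) the first three terms vanish and E equals D times the tour length, so minimizing E trades off validity against quality exactly as described above.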