Neural networks 3: Hopfield network (HN) model


1 Neural networks 3

2 Hopfield network (HN) model
A Hopfield network is a form of recurrent artificial neural network invented by John Hopfield in 1982. The network is fully interconnected: all neurons are both inputs and outputs, and every unit is a binary threshold unit with activation
F(x) = 1 if x > 0, and -1 otherwise
These are single-layered recurrent networks.
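A minimal sketch of this bipolar threshold activation in Python (the standalone function name F is mine, chosen to match the slide's notation):

```python
def F(x):
    """Bipolar threshold from the slide: returns +1 when x > 0, otherwise -1."""
    return 1 if x > 0 else -1
```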

3 Hopfield network model
Applications:
– Recalling or reconstructing corrupted patterns
– Large-scale computational intelligence systems
– Handwriting recognition software
– Optimization problems such as the travelling salesman problem, task scheduling, etc.
Practical applications of HNs are limited because the number of training patterns can be at most about 14% of the number of nodes in the network. If the network is overloaded (trained with more than this maximum acceptable number of attractors), it will not converge to clearly defined attractors.

4 Hopfield network model
Hopfield nets serve as content-addressable memory systems with binary threshold nodes (associative memory).
– The information is distributed and stored in the network connections; if one element disappears, the information is not lost.
– Information is accessed by its content (CAM: content-addressable memory), not by an address.
The network builds its attractors progressively:
– In the learning phase, neurons adjust their synapses based on their activities and predefined rules:
– Hebb rule
– Widrow-Hoff rule (delta rule)

5 Hopfield network model: associative memory and learning (figure)

6 Hopfield network model
Learning (Hebb rule): the weights are symmetric (wi,j = wj,i) with zero self-connections (wi,i = 0), and are set from the training patterns by
wi,j = 1/|P| ( Oi(1) × Oj(1) + Oi(2) × Oj(2) + … + Oi(|P|) × Oj(|P|) )
where |P| is the number of training patterns (input combinations) and Oi(p) is the ith bit of pattern p.

7 Hopfield network model
Let us train this network on the following patterns:
Pattern 1: Oa(1) = -1, Ob(1) = -1, Oc(1) = 1
Pattern 2: Oa(2) = 1, Ob(2) = -1, Oc(2) = -1
Pattern 3: Oa(3) = -1, Ob(3) = 1, Oc(3) = 1
w1,1 = 0
w1,2 = 1/3 ( OA(1) × OB(1) + OA(2) × OB(2) + OA(3) × OB(3) ) = 1/3 ( (-1) × (-1) + 1 × (-1) + (-1) × 1 ) = 1/3 (-1) = -1/3
w1,3 = 1/3 ( OA(1) × OC(1) + OA(2) × OC(2) + OA(3) × OC(3) ) = 1/3 ( (-1) × 1 + 1 × (-1) + (-1) × 1 ) = 1/3 (-3) = -1
w2,2 = 0
w2,1 = 1/3 ( OB(1) × OA(1) + OB(2) × OA(2) + OB(3) × OA(3) ) = 1/3 ( (-1) × (-1) + (-1) × 1 + 1 × (-1) ) = 1/3 (-1) = -1/3
w2,3 = 1/3 ( OB(1) × OC(1) + OB(2) × OC(2) + OB(3) × OC(3) ) = 1/3 ( (-1) × 1 + (-1) × (-1) + 1 × 1 ) = 1/3 (1) = 1/3
w3,3 = 0
w3,1 = 1/3 ( OC(1) × OA(1) + OC(2) × OA(2) + OC(3) × OA(3) ) = 1/3 ( 1 × (-1) + (-1) × 1 + 1 × (-1) ) = 1/3 (-3) = -1
w3,2 = 1/3 ( OC(1) × OB(1) + OC(2) × OB(2) + OC(3) × OB(3) ) = 1/3 ( 1 × (-1) + (-1) × (-1) + 1 × 1 ) = 1/3 (1) = 1/3
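As a quick check of the arithmetic above, here is a small NumPy sketch (variable names are mine, not from the slides) that builds the same 3×3 weight matrix with the Hebb rule:

```python
import numpy as np

# The three bipolar training patterns (Oa, Ob, Oc) from the slide.
patterns = np.array([
    [-1, -1,  1],   # pattern 1
    [ 1, -1, -1],   # pattern 2
    [-1,  1,  1],   # pattern 3
])

# Hebb rule: w_ij = (1/|P|) * sum over patterns of Oi(p) * Oj(p), zero diagonal.
W = patterns.T @ patterns / len(patterns)
np.fill_diagonal(W, 0)

print(W)
# [[ 0.         -0.33333333 -1.        ]
#  [-0.33333333  0.          0.33333333]
#  [-1.          0.33333333  0.        ]]
```

The printed matrix matches the hand-computed weights: w1,2 = w2,1 = -1/3, w1,3 = w3,1 = -1, w2,3 = w3,2 = 1/3.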

8 Utilization:
– A pattern is entered into the network by setting all nodes (neurons) to a specific value, or by setting only part of the nodes.
– The network then goes through a number of iterations using asynchronous or synchronous updating.
– Updating is stopped after a while, and the neuron states are read out to see which pattern is in the network.

9 Hopfield network model
There are two ways of updating the neurons (both are sketched in code below):
– Asynchronous: pick one neuron, calculate its weighted input sum, and update it immediately. This can be done in a fixed order, or neurons can be picked at random, which is called asynchronous random updating.
– Synchronous: the weighted input sums of all neurons are calculated without updating the neurons; then all neurons are set to their new values according to their weighted input sums.
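A sketch of the two update schemes in NumPy (the function names, and the choice to map a zero input sum to -1, are my assumptions, not from the slides):

```python
import numpy as np

def sign(x):
    """Bipolar threshold applied element-wise: +1 where x > 0, otherwise -1."""
    return np.where(x > 0, 1, -1)

def update_async(W, s, rng):
    """One sweep of asynchronous random updating: neurons are visited one at a
    time in random order and updated immediately, so later updates in the
    sweep already see the new states."""
    s = s.copy()
    for i in rng.permutation(len(s)):
        s[i] = 1 if W[i] @ s > 0 else -1
    return s

def update_sync(W, s):
    """Synchronous updating: all weighted input sums are computed from the
    current state, then every neuron is set to its new value at once."""
    return sign(W @ s)

# Example usage with the 3-neuron network trained above:
# rng = np.random.default_rng(0)
# s = update_async(W, np.array([1, 1, -1]), rng)
```

Asynchronous updating is the variant covered by the energy argument on slide 11; synchronous updating can oscillate between two states instead of settling.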

10 Hopfield network model
Neuron state modification: a neuron i updates its state to
Oi ← F( Σj wi,j × Oj )
where F is the threshold function defined on slide 2.

11 Hopfield network model
Network energy: the stable states have low energy. The energy of a state is
E = -1/2 Σi Σj wi,j × Oi × Oj
This value is called the "energy" because the definition ensures that when units are randomly chosen to update, the energy E will either decrease in value or stay the same.
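A minimal sketch of this energy computation (the helper name is mine):

```python
import numpy as np

def energy(W, s):
    """Hopfield energy E = -1/2 * sum over i, j of w_ij * s_i * s_j
    (no bias term, matching the slide). Asynchronous updates never
    increase this value, which is why the network settles into attractors."""
    return -0.5 * s @ W @ s
```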

12 Hopfield network model
Shortcomings of HNs:
– The number of training patterns can be at most about 14% of the number of nodes in the network. If more patterns are used, then:
– the stored patterns become unstable;
– spurious stable states appear (i.e., stable states that do not correspond to stored patterns).
– The network can sometimes misinterpret a corrupted pattern.

13 Learning HNs through example
Moving on to a slightly more complex problem, described in Haykin's Neural Networks book: the book uses N = 120 neurons and trains the network with 120-pixel images, where each pixel is represented by one neuron. Eight patterns (shown on the slide) were used to train the network.
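As a rough check against the capacity limit from slide 12 (an estimate, not stated on the slides): 14% of 120 neurons is 0.14 × 120 ≈ 17 patterns, so storing 8 patterns keeps this network comfortably below the nominal limit.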

14 Flow chart summarizing the overall process (see the sketch after this list):
1. Train the HN using the standard patterns.
2. Update the weight vectors of the network.
3. Run the trained network with a corrupted pattern.
4. The network returns the reconstructed pattern.
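Putting the flow together on the small three-neuron example from slide 7 (the corrupted input chosen here is my own illustration, not from the slides):

```python
import numpy as np

# 1-2. Train: the Hebb rule over the stored patterns gives the weight matrix.
patterns = np.array([[-1, -1, 1], [1, -1, -1], [-1, 1, 1]])
W = patterns.T @ patterns / len(patterns)
np.fill_diagonal(W, 0)

# 3. Run the trained network with a corrupted pattern
#    (pattern 2 with its second bit flipped).
rng = np.random.default_rng(0)
s = np.array([1, 1, -1])
for _ in range(5):                       # a few asynchronous sweeps
    for i in rng.permutation(len(s)):
        s[i] = 1 if W[i] @ s > 0 else -1

# 4. The network returns the reconstructed pattern.
print(s)                                 # [ 1 -1 -1]  -> stored pattern 2
```

Note that with three patterns stored in only three neurons this toy network already exceeds the 14% rule of thumb from slide 12, so not every stored pattern is a clean attractor; pattern 2 happens to be stable and is recovered here.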

15 Learning HNs through example (figure)

16 Learning HNs through example (figure)

17 Shortcomings of HNs (figure)

18 Project (exam)