RECURRENT NEURAL NETWORKS OR ASSOCIATIVE MEMORIES
Ranga Rodrigo
February 24, 2014


INTRODUCTION
In a recurrent network, the signal received at the output is sent back to the network input. Such circulation of the signal is called feedback, and neural networks containing it are called recurrent neural networks. Examples of recurrent neural networks:
– Hopfield neural network
– Hamming neural network
– Real Time Recurrent Network (RTRN)
– Elman neural network
– Bidirectional Associative Memory (BAM)

HOPFIELD NEURAL NETWORK

[Figure: structure of the Hopfield network: n neurons with weights w_kj (k, j = 1, ..., n), bias weights w_k0, a unit-delay feedback z^{-1} from the outputs back to the inputs, and outputs y_1, ..., y_n.]

HOPFIELD STRUCTURE
It is a one-layer network with a regular structure, made of many neurons connected to one another. The output is fed back to the input after a one-step time delay. There is no feedback within the same neuron (no self-connections). During learning, the weights w_kj are modified depending on the value of the learning vector x. In retrieval mode, the input signal stimulates the network which, through the feedback, repeatedly receives the output signal at its input until the answer stabilizes.
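The retrieval mode just described takes only a few lines of code. A minimal sketch in Matlab (the language the slides use for the pseudoinverse rule below), assuming a weight matrix W and a bipolar probe vector x are already defined; the update here is synchronous (all neurons at once), which is one of several possible schemes the slides do not pin down:

```matlab
% Hopfield retrieval: feed the output back to the input until it stabilizes.
% Assumes W (n-by-n weights, zero diagonal) and x (n-by-1 bipolar probe).
y = x;
for t = 1:100                  % iteration cap guards against oscillation
    yNew = sign(W * y);        % synchronous update of all neurons
    yNew(yNew == 0) = 1;       % break ties: map sign(0) to +1
    if isequal(yNew, y)        % answer has stabilized (fixed point reached)
        break
    end
    y = yNew;
end
```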


LEARNING METHODS FOR HOPFIELD: HEBB
The generalized Hebbian rule for M learning vectors of the form x^{(i)} = [x_1^{(i)}, x_2^{(i)}, ..., x_n^{(i)}]^T, i = 1, ..., M, sets the weights to

w_{kj} = (1/n) \sum_{i=1}^{M} x_k^{(i)} x_j^{(i)}, with w_{kk} = 0.

The maximum number of patterns which the network is able to memorize using this rule is only about 13.8% of the number of neurons (roughly 0.138n patterns for n neurons).
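As a sketch, the rule above in Matlab, assuming the M bipolar learning vectors are stored as the columns of an n-by-M matrix X:

```matlab
% Hebbian storage: sum of outer products of the learning vectors, scaled
% by 1/n, with the diagonal zeroed so no neuron feeds back to itself.
[n, M] = size(X);              % one learning vector per column of X
W = (X * X') / n;              % equals (1/n) * sum_i x_i * x_i'
W(1:n+1:end) = 0;              % enforce w_kk = 0 (no self-connections)
```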

LEARNING METHODS FOR HOPFIELD: PSEUDOINVERSE
For M learning vectors of the form x^{(i)} = [x_1^{(i)}, ..., x_n^{(i)}]^T, form a matrix of learning vectors X = [x^{(1)} x^{(2)} ... x^{(M)}], one vector per column. The n × n weight matrix W is found as W = X X^+, where X^+ is the Moore-Penrose pseudoinverse of X. In Matlab: W = X*pinv(X).
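Putting storage and retrieval together, a small self-contained Matlab example; the two patterns and the corrupted probe below are illustrative choices, not taken from the slides:

```matlab
% Store two bipolar patterns with the pseudoinverse rule, then retrieve
% pattern 1 from a probe with one corrupted bit.
X = [ 1  1  1 -1 -1 -1;        % learning vector x1 (written as rows,
      1 -1  1 -1  1 -1]';      % transposed so patterns become columns)
W = X * pinv(X);               % W = X X+ : projection onto the patterns
x = [1 1 -1 -1 -1 -1]';        % probe: x1 with its third bit flipped
y = sign(W * x)                % yields x1, the stored pattern
```

For a heavily corrupted probe a single pass may not stabilize; the retrieval loop sketched earlier can then be run on y until it does.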

HAMMING NEURAL NETWORK


OPERATION OF THE HAMMING NN
In the first layer, there are p neurons, which determine the Hamming distance between the input vector and each of the p desired vectors coded in the weights of this layer. The second layer is called MAXNET. It is a layer corresponding to the Hopfield network, except that here self-feedback connections (feedback covering the same neuron) are added. The weights of these self-feedback connections are equal to 1, while the weights between the other neurons of this layer are selected so that they inhibit one another. Thus, in the MAXNET layer all outputs are extinguished except the one that was strongest in the first layer. The neuron of this layer identified as the winner then, through the weights of the output neurons with a linear activation function, retrieves the output vector associated with the vector coded in the first layer.
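A sketch of the MAXNET competition in Matlab; the example first-layer outputs a and the inhibition strength epsInh are assumptions for illustration (the slides only state that the self-weight is 1 and the remaining in-layer weights are inhibitory):

```matlab
% MAXNET winner-take-all: each neuron keeps its own activation (weight 1)
% and is inhibited by every other neuron (weight -epsInh).
a = [0.3 0.9 0.5 0.7]';        % example first-layer outputs (similarities)
p = numel(a);
epsInh = 1 / (2 * p);          % assumed inhibition strength, epsInh < 1/p
while sum(a > 0) > 1           % iterate until a single neuron stays active
    a = max(0, a - epsInh * (sum(a) - a));  % inhibition from all others
end
[~, winner] = max(a)           % index of the surviving neuron
```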