IE 585 Associative Network

2 Associative Memory NN
Single-layer net in which the weights are determined in such a way that the net can store a set of pattern associations. The net not only learns the specific pattern pairs that were used for training, but is also able to recall the desired response pattern when given an input stimulus that is similar, but not identical, to a training input.

3 Types of Associative Memory
Autoassociative memory: each target vector t (output) is the same as the input vector s with which it is associated.
Heteroassociative memory: the t's are different from the s's.

4 Discrete Hopfield Net
Developed by Hopfield (binary version: 1982, bipolar version: 1984)
Recurrent, autoassociative
Fully interconnected neural network
Symmetric weights with no self-connections
Asynchronous updating: only one unit updates its activation at a time; each unit continues to receive an external signal in addition to the signals from the other units in the net

5 Architecture of Hopfield Net
[Figure: a single layer of units Y1, ..., Yn, each receiving an external input x1, ..., xn, with every unit connected to every other unit and no self-connections.]

6 Procedure of the Discrete Hopfield Net
Initialize weights to store patterns
For each input vector x:
set y_i = x_i for each unit Y_i
for each unit Y_i (in random order): compute y_in_i = x_i + sum_j y_j w_ji, set y_i = f(y_in_i), and broadcast y_i to all other units
Continue until the net converges
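A minimal Python sketch of this recall loop (illustrative only; the function and variable names are mine, not the slides'):

```python
import numpy as np

def hopfield_recall(W, x, theta=0.0, max_iters=100, rng=None):
    """Asynchronous recall in a discrete bipolar Hopfield net.

    W: symmetric weight matrix with zero diagonal.
    x: external input pattern, also used as the initial activations.
    """
    rng = np.random.default_rng(rng)
    y = np.array(x, dtype=float)
    for _ in range(max_iters):
        changed = False
        # Update units one at a time, in random order (asynchronous).
        for i in rng.permutation(len(y)):
            y_in = x[i] + W[i] @ y      # external signal + signals from other units
            if y_in > theta:
                new = 1.0
            elif y_in < theta:
                new = -1.0
            else:
                new = y[i]              # no change when y_in equals the threshold
            if new != y[i]:
                y[i], changed = new, True
        if not changed:                 # converged: no unit changed this sweep
            break
    return y
```

Given Hebb-rule weights for a stored bipolar pattern, a recall from a one-bit-corrupted input settles back onto the stored pattern.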

7 Hebb Learning
Binary patterns s^p: w_ij = sum_p (2 s_i^p - 1)(2 s_j^p - 1) for i != j
Bipolar patterns s^p: w_ij = sum_p s_i^p s_j^p for i != j, with w_ii = 0
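In code, the two Hebb rules above can be sketched as follows (the slides' own formulas were on lost figures; this is my reconstruction of the standard rules):

```python
import numpy as np

def hebb_weights(patterns, binary=False):
    """Hebb-rule weight matrix for a discrete Hopfield net.

    With binary=True, {0, 1} components are first mapped to {-1, +1},
    i.e. w_ij = sum_p (2 s_i^p - 1)(2 s_j^p - 1); bipolar patterns
    (binary=False) use w_ij = sum_p s_i^p s_j^p directly.
    """
    P = np.asarray(patterns, dtype=float)
    if binary:
        P = 2.0 * P - 1.0
    W = P.T @ P                  # sum of outer products over all patterns
    np.fill_diagonal(W, 0.0)     # no self-connections
    return W
```

Note that a binary pattern and its bipolar equivalent produce the same weight matrix, and the result is always symmetric with a zero diagonal.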

8 Transfer Function (Step Function)
Binary: f(y_in) = 1 if y_in > θ, 0 if y_in < θ, unchanged if y_in = θ
Bipolar: f(y_in) = 1 if y_in > θ, -1 if y_in < θ, unchanged if y_in = θ

9 The threshold θ is usually 0
The order in which the units are updated is random
The order of the patterns in the learning set does not affect the resulting weights
Extended to continuous activations, for both pattern association and constrained optimization, by Hopfield & Tank (1985, the Hopfield-Tank net)

10 Hopfield Net Example

11 Energy Function for the Discrete Hopfield Net
Also called a Lyapunov function (after Aleksandr Lyapunov, Russian mathematician)
Asynchronous updating of the Hopfield net allows such a function to be defined
E = -0.5 sum_{i != j} y_i y_j w_ij - sum_i x_i y_i + sum_i θ_i y_i
A non-increasing function of the state of the system; it proves that the net converges to a stable limit point (a pattern of activation of the units)
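The energy can be computed directly from the current state; a small sketch (names and the example weights are mine), which can be used to check that an asynchronous update never increases E:

```python
import numpy as np

def hopfield_energy(W, y, x, theta=0.0):
    """E = -1/2 y^T W y - x . y + sum_i theta_i y_i,
    for a symmetric W with zero diagonal (so the double sum over i != j
    equals y^T W y)."""
    y = np.asarray(y, dtype=float)
    return -0.5 * y @ W @ y - x @ y + np.sum(theta * y)
```

For the Hebb weights of the stored pattern (1, -1, 1) and input x = (1, 1, 1), flipping the wrong middle unit to -1 lowers the energy, as the Lyapunov property requires.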

12 Storage Capacity of Hopfield Net
Binary: P ≈ 0.15 n
Bipolar: P ≈ n / (2 log2 n)
P: # of patterns that can be stored and recalled in a net with reasonable accuracy
n: # of neurons in the net
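Plugging in n = 100 neurons as a quick check of these estimates:

```python
import math

n = 100
p_binary = 0.15 * n                  # binary estimate: about 15 patterns
p_bipolar = n / (2 * math.log2(n))   # bipolar estimate: about 7.5 patterns
print(p_binary, round(p_bipolar, 1))
```

So even a 100-neuron net reliably holds only a handful of patterns; capacity grows far more slowly than the number of neurons.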

13 Spurious Memory
A stable state of the energy function whose activation vector is not one of the stored patterns
Because of this, a discrete Hopfield net can be used to determine whether an input vector is a "known" vector or an "unknown" vector

14 Hamming Distance (HD)
HD = # of differing bits between two binary or bipolar vectors
Orthogonal vectors: HD = n/2, where n is the # of bits
Mutually orthogonal patterns let the net store the maximum # of patterns
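The distance itself is a one-liner (a sketch; the function name is mine):

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length vectors differ."""
    return sum(ai != bi for ai, bi in zip(a, b))
```

For example, the bipolar vectors (1, 1, -1, -1) and (1, -1, 1, -1) have zero dot product (orthogonal) and differ in exactly 2 = n/2 of their n = 4 positions.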

15 Bi-directional Associative Memory (BAM)
Developed by Kosko (1988)
Recurrent, heteroassociative
Two layers of neurons connected by bidirectional weighted connection paths
Symmetric weights (the same weights serve both directions) with no self-connections
Signals are sent from one layer to the other only at alternating steps of the process, never simultaneously in both directions

16 Architecture of BAM
[Figure: an X-layer of units X1, ..., Xn and a Y-layer of units Y1, ..., Ym, with bidirectional weighted connections between every X unit and every Y unit.]

17 Procedure of the BAM
Initialize weights to store patterns
Initialize all activations to 0
For each testing input vector:
present input x to the X-layer (or y to the Y-layer)
compute y_in_j = sum_i x_i w_ij and y_j = f(y_in_j), then send signals to the Y-layer
compute x_in_i = sum_j w_ij y_j and x_i = f(x_in_i), then send signals back to the X-layer
Continue until the activations converge

18 Hebb Learning
Binary pattern pairs (s^p, t^p): w_ij = sum_p (2 s_i^p - 1)(2 t_j^p - 1)
Bipolar pattern pairs (s^p, t^p): w_ij = sum_p s_i^p t_j^p

19 Transfer Function (Step Function)
Binary: f(y_in) = 1 if y_in > θ, 0 if y_in < θ, unchanged if y_in = θ
Bipolar: f(y_in) = 1 if y_in > θ, -1 if y_in < θ, unchanged if y_in = θ

20 BAM Example
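Since the slide's worked example was on a lost figure, here is a hypothetical bipolar BAM example in the same spirit (the pattern pairs and all names are mine, not the slides'):

```python
import numpy as np

def bam_weights(pairs):
    """Bipolar Hebb weights for a BAM: W = sum over pairs of outer(s, t),
    giving an n x m matrix used in both directions (transposed for Y -> X)."""
    return sum(np.outer(s, t) for s, t in pairs)

def _step(net, prev):
    # Bipolar step: +1 above 0, -1 below 0, unchanged exactly at 0.
    return np.where(net > 0, 1.0, np.where(net < 0, -1.0, prev))

def bam_recall(W, x, max_iters=25):
    """Bounce signals between the X- and Y-layers until both stop changing."""
    x = np.asarray(x, dtype=float)
    y = _step(x @ W, np.zeros(W.shape[1]))   # first X -> Y pass
    for _ in range(max_iters):
        x_new = _step(W @ y, x)              # Y -> X
        y_new = _step(x_new @ W, y)          # X -> Y
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y
```

With two stored pairs whose s-vectors are orthogonal, presenting either s recalls its associated t, and a one-bit-corrupted s still recalls the correct t.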

21 Storage Capacity of the BAM
P ≤ min(n, m)
P: # of patterns that can be stored and recalled in a net with reasonable accuracy
n: # of inputs
m: # of outputs