
7-1 Supervised Hebbian Learning

7-2 Hebb’s Postulate

“When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.” (D. O. Hebb, 1949)

(The slide illustrates the postulate with two neurons, A and B, connected by a synapse.)

7-3 Linear Associator

The linear associator computes $\mathbf{a} = \mathbf{W}\mathbf{p}$, i.e. $a_i = \sum_{j=1}^{R} w_{ij} p_j$.

Training set: $\{\mathbf{p}_1, \mathbf{t}_1\}, \{\mathbf{p}_2, \mathbf{t}_2\}, \dots, \{\mathbf{p}_Q, \mathbf{t}_Q\}$
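As a minimal sketch (not from the slides; NumPy, with illustrative sizes of my choosing), the linear associator is a single matrix-vector product:

```python
import numpy as np

# Linear associator: a = W p, i.e. a_i = sum_j w_ij * p_j.
R, S = 3, 1                     # R inputs, S outputs (illustrative sizes)
W = np.zeros((S, R))            # weights, to be set by a learning rule
p = np.array([1.0, -1.0, 1.0])  # an example input pattern
a = W @ p                       # output is all zeros until W is trained
print(a)
```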

7-4 Hebb Rule

General form (presynaptic signal $p_{jq}$, postsynaptic signal $a_{iq}$):
$w_{ij}^{new} = w_{ij}^{old} + \alpha f_i(a_{iq})\, g_j(p_{jq})$

Simplified form: $w_{ij}^{new} = w_{ij}^{old} + \alpha\, a_{iq}\, p_{jq}$

Supervised form: $w_{ij}^{new} = w_{ij}^{old} + t_{iq}\, p_{jq}$

Matrix form: $\mathbf{W}^{new} = \mathbf{W}^{old} + \mathbf{t}_q \mathbf{p}_q^T$
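In code, the supervised matrix form is a single outer-product update; a minimal sketch (function name and data are mine):

```python
import numpy as np

def hebb_step(W, p, t):
    """One supervised Hebb update: W_new = W_old + t p^T."""
    return W + np.outer(t, p)

W = np.zeros((1, 3))
W = hebb_step(W, np.array([1.0, -1.0, 1.0]), np.array([1.0]))
print(W)   # [[ 1. -1.  1.]]
```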

7-5 Batch Operation

Matrix form (zero initial weights):
$\mathbf{W} = \mathbf{t}_1\mathbf{p}_1^T + \mathbf{t}_2\mathbf{p}_2^T + \cdots + \mathbf{t}_Q\mathbf{p}_Q^T = \mathbf{T}\mathbf{P}^T$

where $\mathbf{T} = [\mathbf{t}_1\ \mathbf{t}_2\ \cdots\ \mathbf{t}_Q]$ and $\mathbf{P} = [\mathbf{p}_1\ \mathbf{p}_2\ \cdots\ \mathbf{p}_Q]$.
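The batch rule is one matrix product; a short sketch with two orthonormal patterns (toy data of my own):

```python
import numpy as np

# Columns of P are the input patterns, columns of T the targets.
P = np.array([[1.0, 0.0],
              [0.0, 1.0]])            # two orthonormal inputs
T = np.array([[1.0, -1.0]])           # corresponding targets

W = T @ P.T                           # batch Hebb rule, zero initial weights
print(np.allclose(W @ P, T))          # True: orthonormal inputs are recalled exactly
```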

7-6 Performance Analysis

$\mathbf{a} = \mathbf{W}\mathbf{p}_k = \left(\sum_{q=1}^{Q} \mathbf{t}_q \mathbf{p}_q^T\right)\mathbf{p}_k = \sum_{q=1}^{Q} \mathbf{t}_q \left(\mathbf{p}_q^T \mathbf{p}_k\right)$

Case I: the input patterns are orthonormal (orthogonal and normalized), so $\mathbf{p}_q^T \mathbf{p}_k = 1$ for $q = k$ and $0$ for $q \neq k$. Therefore the network output equals the target: $\mathbf{a} = \mathbf{t}_k$.

Case II: the input patterns are normalized, but not orthogonal:
$\mathbf{a} = \mathbf{t}_k + \sum_{q \neq k} \mathbf{t}_q \left(\mathbf{p}_q^T \mathbf{p}_k\right)$
where the second term is the error (crosstalk from the other stored patterns).
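The Case II crosstalk is easy to see numerically; a sketch with made-up unit-length, non-orthogonal patterns:

```python
import numpy as np

p1 = np.array([1.0, 0.0])                    # unit length
p2 = np.array([np.sqrt(0.5), np.sqrt(0.5)])  # unit length, p1.p2 ~ 0.707
P = np.column_stack([p1, p2])
T = np.array([[1.0, -1.0]])

W = T @ P.T
print(W @ p1)   # ~0.293 = t1 + t2*(p2.p1) = 1 - 0.707, not the target 1
```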

7-7 Example

Normalized prototype patterns (banana and apple):
$\mathbf{p}_1 = [-0.5774\ \ 0.5774\ \ {-0.5774}]^T$ (banana, $t_1 = -1$), $\mathbf{p}_2 = [0.5774\ \ 0.5774\ \ {-0.5774}]^T$ (apple, $t_2 = 1$)

Weight matrix (Hebb rule):
$\mathbf{W} = \mathbf{T}\mathbf{P}^T = (-1)\mathbf{p}_1^T + (1)\mathbf{p}_2^T = [1.1548\ \ 0\ \ 0]$

Tests:
Banana: $\mathbf{W}\mathbf{p}_1 = -0.6668$ (correct sign, but not the exact target, since the patterns are not orthogonal)
Apple: $\mathbf{W}\mathbf{p}_2 = 0.6668$
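A quick check of this example in code (the prototype values follow my reading of the slide, so treat them as assumptions):

```python
import numpy as np

p1 = np.array([-1.0, 1.0, -1.0]) / np.sqrt(3)   # banana, t1 = -1
p2 = np.array([ 1.0, 1.0, -1.0]) / np.sqrt(3)   # apple,  t2 = +1
P = np.column_stack([p1, p2])
T = np.array([[-1.0, 1.0]])

W = T @ P.T                # Hebb rule -> approximately [1.155, 0, 0]
print(W @ p1, W @ p2)      # ~[-0.667], ~[0.667]: right signs, inexact magnitudes
```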

7-8 Pseudoinverse Rule (1)

Performance index:
$F(\mathbf{W}) = \|\mathbf{T} - \mathbf{W}\mathbf{P}\|^2$, where $\|\mathbf{E}\|^2 = \sum_i \sum_j e_{ij}^2$

Matrix form: $\mathbf{T} = [\mathbf{t}_1\ \mathbf{t}_2\ \cdots\ \mathbf{t}_Q]$, $\mathbf{P} = [\mathbf{p}_1\ \mathbf{p}_2\ \cdots\ \mathbf{p}_Q]$

7-9 Pseudoinverse Rule (2)

Minimize $F(\mathbf{W}) = \|\mathbf{T} - \mathbf{W}\mathbf{P}\|^2$.

If an inverse exists for $\mathbf{P}$, $F(\mathbf{W})$ can be made zero: $\mathbf{W} = \mathbf{T}\mathbf{P}^{-1}$

When an inverse does not exist, $F(\mathbf{W})$ can be minimized using the pseudoinverse:
$\mathbf{W} = \mathbf{T}\mathbf{P}^{+}$, where $\mathbf{P}^{+} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T$ when $\mathbf{P}$ has full column rank.
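A small sketch comparing the full-column-rank closed form to NumPy's SVD-based pseudoinverse (toy matrix of my choosing):

```python
import numpy as np

P = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])             # full column rank
P_plus = np.linalg.inv(P.T @ P) @ P.T  # closed form (P^T P)^{-1} P^T
print(np.allclose(P_plus, np.linalg.pinv(P)))   # True
```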

7-10 Relationship to the Hebb Rule

Hebb rule: $\mathbf{W} = \mathbf{T}\mathbf{P}^T$
Pseudoinverse rule: $\mathbf{W} = \mathbf{T}\mathbf{P}^{+}$, with $\mathbf{P}^{+} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T$

If the prototype patterns are orthonormal, then $\mathbf{P}^T\mathbf{P} = \mathbf{I}$, so
$\mathbf{P}^{+} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T = \mathbf{P}^T$
and the two rules give the same weights.
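A one-line numerical check of this identity (toy orthonormal columns, my example):

```python
import numpy as np

P = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])            # orthonormal columns: P^T P = I
print(np.allclose(np.linalg.pinv(P), P.T))   # True: P^+ = P^T
```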

7-11 Example

Applying the pseudoinverse rule to the same banana/apple patterns:
$\mathbf{W} = \mathbf{T}\mathbf{P}^{+} = [1.7321\ \ 0\ \ 0]$

Tests: $\mathbf{W}\mathbf{p}_1 = -1$, $\mathbf{W}\mathbf{p}_2 = 1$; unlike the Hebb rule, the targets are now reproduced exactly.
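The same result in code (reusing the banana/apple patterns assumed above):

```python
import numpy as np

p1 = np.array([-1.0, 1.0, -1.0]) / np.sqrt(3)
p2 = np.array([ 1.0, 1.0, -1.0]) / np.sqrt(3)
P = np.column_stack([p1, p2])
T = np.array([[-1.0, 1.0]])

W = T @ np.linalg.pinv(P)    # pseudoinverse rule, ~[1.7321, 0, 0]
print(W @ P)                 # [[-1.  1.]]: exact targets
```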

7-12 Autoassociative Memory

In an autoassociative memory the desired output equals the input ($\mathbf{t}_q = \mathbf{p}_q$), so the Hebb rule gives
$\mathbf{W} = \mathbf{p}_1\mathbf{p}_1^T + \mathbf{p}_2\mathbf{p}_2^T + \cdots + \mathbf{p}_Q\mathbf{p}_Q^T = \mathbf{P}\mathbf{P}^T$

(The slide stores pixel patterns for the digits 0, 1 and 2, each represented as a vector of ±1 values.)
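A minimal autoassociative sketch (random ±1 prototypes stand in for the slide's digit bitmaps; hardlims is the usual ±1 threshold):

```python
import numpy as np

def hardlims(n):
    """Symmetric hard limit: +1 for n >= 0, else -1."""
    return np.where(n >= 0, 1, -1)

rng = np.random.default_rng(0)
P = rng.choice([-1, 1], size=(30, 2))   # two random 30-"pixel" prototypes
W = P @ P.T                             # autoassociative Hebb rule: W = P P^T
recalled = hardlims(W @ P[:, 0])        # present the first stored pattern
print(np.array_equal(recalled, P[:, 0]))  # usually True when prototypes are near-orthogonal
```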

7-13 Tests

(The slide shows recall results for the stored digit patterns when the input is 50% occluded, 67% occluded, or corrupted by noise on 7 pixels.)
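A toy analogue of the occlusion test (my own single-pattern setup, not the slide's digit data):

```python
import numpy as np

def hardlims(n):
    return np.where(n >= 0, 1, -1)

rng = np.random.default_rng(1)
p = rng.choice([-1, 1], size=30)
W = np.outer(p, p)                  # store one prototype autoassociatively

occluded = p.copy()
occluded[:15] = -1                  # blank out 50% of the "pixels"
print(np.array_equal(hardlims(W @ occluded), p))   # typically True
```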

7-14 Variations of Hebbian Learning

Basic rule: $\mathbf{W}^{new} = \mathbf{W}^{old} + \mathbf{t}_q\mathbf{p}_q^T$
Learning rate: $\mathbf{W}^{new} = \mathbf{W}^{old} + \alpha\,\mathbf{t}_q\mathbf{p}_q^T$
Smoothing (decay): $\mathbf{W}^{new} = (1-\gamma)\,\mathbf{W}^{old} + \alpha\,\mathbf{t}_q\mathbf{p}_q^T$, with $0 < \gamma < 1$
Delta rule: $\mathbf{W}^{new} = \mathbf{W}^{old} + \alpha\,(\mathbf{t}_q - \mathbf{a}_q)\mathbf{p}_q^T$
Unsupervised: $\mathbf{W}^{new} = \mathbf{W}^{old} + \alpha\,\mathbf{a}_q\mathbf{p}_q^T$
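The variations differ only in the update line; a hedged sketch with assumed names alpha (learning rate) and gamma (decay):

```python
import numpy as np

def hebb(W, p, t, alpha=1.0):
    return W + alpha * np.outer(t, p)               # basic rule / learning rate

def hebb_smoothing(W, p, t, alpha=0.5, gamma=0.1):
    return (1 - gamma) * W + alpha * np.outer(t, p) # old weights decay away

def delta_rule(W, p, t, alpha=0.5):
    a = W @ p                                       # actual output
    return W + alpha * np.outer(t - a, p)           # step toward the target

def hebb_unsupervised(W, p, alpha=0.5):
    a = W @ p
    return W + alpha * np.outer(a, p)               # no target needed
```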