Supervised Hebbian Learning

Hebb’s Postulate: “When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.” (D. O. Hebb, 1949)

Linear Associator: the network output is $\mathbf{a} = \mathbf{W}\mathbf{p}$. Training set: $\{\mathbf{p}_1, \mathbf{t}_1\}, \{\mathbf{p}_2, \mathbf{t}_2\}, \ldots, \{\mathbf{p}_Q, \mathbf{t}_Q\}$.

Hebb Rule. Simplified form: $w_{ij}^{new} = w_{ij}^{old} + a_{iq}\,p_{jq}$, where $p_{jq}$ is the presynaptic signal and $a_{iq}$ the postsynaptic signal. Supervised form (use the target in place of the actual output): $w_{ij}^{new} = w_{ij}^{old} + t_{iq}\,p_{jq}$. Matrix form: $\mathbf{W}^{new} = \mathbf{W}^{old} + \mathbf{t}_q\mathbf{p}_q^T$.
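A minimal NumPy sketch of the supervised update in matrix form; the network size, pattern, and target below are hypothetical placeholders, not the chapter's example.

```python
import numpy as np

def hebb_update(W, p, t):
    """Supervised Hebb rule, matrix form: W_new = W_old + t p^T."""
    return W + np.outer(t, p)

# Hypothetical single-output linear associator with three inputs,
# starting from zero weights.
W = np.zeros((1, 3))
p = np.array([1.0, -1.0, 1.0]) / np.sqrt(3.0)  # normalized input pattern
t = np.array([1.0])                            # target output
W = hebb_update(W, p, t)
print(W)  # each weight has grown by t_i * p_j
```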

Batch Operation (zero initial weights): $\mathbf{W} = \mathbf{t}_1\mathbf{p}_1^T + \mathbf{t}_2\mathbf{p}_2^T + \cdots + \mathbf{t}_Q\mathbf{p}_Q^T = \sum_{q=1}^{Q}\mathbf{t}_q\mathbf{p}_q^T$. Matrix form: $\mathbf{W} = \mathbf{T}\mathbf{P}^T$, where $\mathbf{P} = [\,\mathbf{p}_1\ \mathbf{p}_2\ \cdots\ \mathbf{p}_Q\,]$ and $\mathbf{T} = [\,\mathbf{t}_1\ \mathbf{t}_2\ \cdots\ \mathbf{t}_Q\,]$.
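In NumPy the batch form is a single matrix product; the two orthonormal patterns and their targets below are assumptions chosen only to make the shapes concrete.

```python
import numpy as np

# Columns of P are prototype inputs; matching columns of T are targets.
P = np.array([[1.0,  1.0],
              [-1.0, 1.0]]) / np.sqrt(2.0)   # two orthonormal 2-D patterns
T = np.array([[ 1.0, -1.0],
              [-1.0, -1.0]])                 # hypothetical target vectors

W = T @ P.T                                  # batch Hebb rule, zero initial weights
print(W @ P[:, 0])                           # reproduces the first target column
```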

Performance Analysis: $\mathbf{a} = \mathbf{W}\mathbf{p}_k = \left(\sum_{q=1}^{Q}\mathbf{t}_q\mathbf{p}_q^T\right)\mathbf{p}_k = \sum_{q=1}^{Q}\mathbf{t}_q\,(\mathbf{p}_q^T\mathbf{p}_k)$. Case I, input patterns are orthogonal (and unit length): $\mathbf{p}_q^T\mathbf{p}_k = 1$ for $q = k$ and $0$ for $q \neq k$, therefore the network output equals the target, $\mathbf{a} = \mathbf{W}\mathbf{p}_k = \mathbf{t}_k$. Case II, input patterns are normalized but not orthogonal: $\mathbf{a} = \mathbf{W}\mathbf{p}_k = \mathbf{t}_k + \sum_{q \neq k}\mathbf{t}_q\,(\mathbf{p}_q^T\mathbf{p}_k)$, where the second term is the error (cross-talk between patterns).
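A numerical check of both cases, using small hypothetical pattern sets rather than the chapter's banana/apple vectors: orthonormal prototypes give exact recall, while merely normalized ones leave a cross-talk error.

```python
import numpy as np

def hebb_weights(P, T):
    """Batch Hebb rule with zero initial weights: W = T P^T."""
    return T @ P.T

# Case I: orthonormal prototypes -> exact recall.
P1 = np.eye(3)                       # trivially orthonormal columns
T1 = np.array([[1.0, -1.0, 1.0]])
W1 = hebb_weights(P1, T1)
print(W1 @ P1 - T1)                  # zero error

# Case II: normalized but not orthogonal -> cross-talk error.
P2 = np.array([[1.0, 1.0],
               [0.0, 1.0],
               [0.0, 0.0]])
P2 = P2 / np.linalg.norm(P2, axis=0)
T2 = np.array([[1.0, -1.0]])
W2 = hebb_weights(P2, T2)
print(W2 @ P2 - T2)                  # nonzero: sum over q != k of t_q (p_q^T p_k)
```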

Example: normalized prototype patterns for the banana and the apple, a weight matrix computed with the Hebb rule, and tests on the banana and apple inputs.

Pseudoinverse Rule - (1). Performance index: $F(\mathbf{W}) = \sum_{q=1}^{Q} \|\mathbf{t}_q - \mathbf{W}\mathbf{p}_q\|^2$. Matrix form: $F(\mathbf{W}) = \|\mathbf{T} - \mathbf{W}\mathbf{P}\|^2 = \|\mathbf{E}\|^2 = \sum_i \sum_j e_{ij}^2$, where $\mathbf{T} = [\,\mathbf{t}_1\ \mathbf{t}_2\ \cdots\ \mathbf{t}_Q\,]$, $\mathbf{P} = [\,\mathbf{p}_1\ \mathbf{p}_2\ \cdots\ \mathbf{p}_Q\,]$, and $\mathbf{E} = \mathbf{T} - \mathbf{W}\mathbf{P}$.

Pseudoinverse Rule - (2). Minimize $F(\mathbf{W}) = \|\mathbf{T} - \mathbf{W}\mathbf{P}\|^2$. If an inverse exists for $\mathbf{P}$, $F(\mathbf{W})$ can be made zero by solving $\mathbf{W}\mathbf{P} = \mathbf{T}$: $\mathbf{W} = \mathbf{T}\mathbf{P}^{-1}$. When an inverse does not exist, $F(\mathbf{W})$ can be minimized using the pseudoinverse: $\mathbf{W} = \mathbf{T}\mathbf{P}^{+}$, where $\mathbf{P}^{+} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T$.
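With NumPy the pseudoinverse rule is a call to np.linalg.pinv; the patterns below are hypothetical (normalized but not orthogonal), chosen so that the Hebb rule leaves an error while the pseudoinverse rule removes it.

```python
import numpy as np

P = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])
P = P / np.linalg.norm(P, axis=0)      # normalized, not orthogonal
T = np.array([[1.0, -1.0]])

W_hebb = T @ P.T                       # Hebb rule: leaves cross-talk error here
W_pinv = T @ np.linalg.pinv(P)         # pseudoinverse rule: W = T P^+

print(np.linalg.norm(T - W_hebb @ P))  # > 0
print(np.linalg.norm(T - W_pinv @ P))  # ~ 0
```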

Relationship to the Hebb Rule. Hebb rule: $\mathbf{W} = \mathbf{T}\mathbf{P}^T$. Pseudoinverse rule: $\mathbf{W} = \mathbf{T}\mathbf{P}^{+}$, with $\mathbf{P}^{+} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T$. If the prototype patterns are orthonormal, $\mathbf{P}^T\mathbf{P} = \mathbf{I}$, so $\mathbf{P}^{+} = \mathbf{P}^T$ and the two rules produce the same weight matrix.
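A quick check of that claim under an assumed orthonormal prototype matrix: when $\mathbf{P}^T\mathbf{P} = \mathbf{I}$, the pseudoinverse equals the transpose.

```python
import numpy as np

# Hypothetical matrix with orthonormal columns (P^T P = I).
P = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [0.0,  0.0]]) / np.sqrt(2.0)
print(np.allclose(np.linalg.pinv(P), P.T))  # True: P^+ = P^T, so T P^+ = T P^T
```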

Example

Autoassociative Memory: the desired output is the input itself ($\mathbf{t}_q = \mathbf{p}_q$), so the Hebb-rule weight matrix becomes $\mathbf{W} = \sum_{q=1}^{Q}\mathbf{p}_q\mathbf{p}_q^T = \mathbf{P}\mathbf{P}^T$.

Tests of the autoassociative memory: recall from patterns that are 50% occluded, 67% occluded, and noisy patterns (7 pixels changed).
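A sketch of an autoassociative test of this kind, assuming a hypothetical 6-element bipolar pattern in place of the slide's pixel images: the weight matrix stores the pattern with the Hebb rule ($\mathbf{t}_q = \mathbf{p}_q$), and a symmetric hard limit on the output recovers it from a partially occluded probe.

```python
import numpy as np

def hardlims(n):
    """Symmetric hard limit: +1 for n >= 0, -1 otherwise."""
    return np.where(n >= 0, 1.0, -1.0)

# Hypothetical bipolar prototype (stands in for the slide's pixel patterns).
p = np.array([1.0, -1.0, 1.0, 1.0, -1.0, 1.0])
W = np.outer(p, p)                 # autoassociative Hebb rule: t_q = p_q

probe = p.copy()
probe[:3] = -1.0                   # occlude half of the pattern
recall = hardlims(W @ probe)
print(np.array_equal(recall, p))   # True: the stored pattern is recovered
```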

Variations of Hebbian Learning. Basic rule: $\mathbf{W}^{new} = \mathbf{W}^{old} + \mathbf{t}_q\mathbf{p}_q^T$. Learning rate: $\mathbf{W}^{new} = \mathbf{W}^{old} + \alpha\,\mathbf{t}_q\mathbf{p}_q^T$. Smoothing (weight decay): $\mathbf{W}^{new} = (1-\gamma)\,\mathbf{W}^{old} + \alpha\,\mathbf{t}_q\mathbf{p}_q^T$. Delta rule: $\mathbf{W}^{new} = \mathbf{W}^{old} + \alpha\,(\mathbf{t}_q - \mathbf{a}_q)\,\mathbf{p}_q^T$. Unsupervised: $\mathbf{W}^{new} = \mathbf{W}^{old} + \alpha\,\mathbf{a}_q\mathbf{p}_q^T$.
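A compact sketch of these variants; the function names, step sizes, and the tiny convergence demo are illustrative assumptions, not notation from the chapter.

```python
import numpy as np

def hebb_lr(W, p, t, alpha=0.1):
    """Hebb rule with a learning rate: W <- W + alpha * t p^T."""
    return W + alpha * np.outer(t, p)

def hebb_smooth(W, p, t, alpha=0.1, gamma=0.01):
    """Smoothing (weight decay): W <- (1 - gamma) W + alpha * t p^T."""
    return (1.0 - gamma) * W + alpha * np.outer(t, p)

def delta_rule(W, p, t, alpha=0.1):
    """Delta rule: W <- W + alpha * (t - W p) p^T."""
    return W + alpha * np.outer(t - W @ p, p)

def hebb_unsupervised(W, p, alpha=0.1):
    """Unsupervised Hebb rule: W <- W + alpha * (W p) p^T."""
    a = W @ p
    return W + alpha * np.outer(a, p)

# Repeated delta-rule presentations drive the output toward the target.
W = np.zeros((1, 3))
p = np.array([1.0, -1.0, 1.0]) / np.sqrt(3.0)
t = np.array([1.0])
for _ in range(100):
    W = delta_rule(W, p, t)
print(W @ p)   # close to the target 1.0
```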