An Illustrative Example.


Apple/Banana Sorter

Prototype Vectors

Each fruit is represented by a three-element measurement vector:

Shape: {1 : round ; -1 : elliptical}
Texture: {1 : smooth ; -1 : rough}
Weight: {1 : > 1 lb. ; -1 : < 1 lb.}

One prototype vector is stored for the banana and one for the apple.
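Written out under that encoding, the two prototypes are short sign vectors (a quick sketch; the variable names are ours):

```python
# Prototype vectors, ordered [shape, texture, weight], per the encoding above.
# Banana: elliptical (-1), smooth (+1), under 1 lb. (-1).
# Apple:  round (+1),      smooth (+1), under 1 lb. (-1).
p_banana = [-1, 1, -1]
p_apple = [1, 1, -1]
print(p_banana, p_apple)
```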

Perceptron

Two-Input Case
Decision Boundary

Apple/Banana Example. The decision boundary should separate the prototype vectors. The weight vector should be orthogonal to the decision boundary and should point in the direction of the vector that should produce an output of 1. The bias determines the position of the boundary.

Testing the Network: Banana, Apple, and a “Rough” Banana.
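The perceptron test above can be sketched in a few lines. The weight vector [1, 0, 0] and zero bias are one choice that separates the two prototypes along the shape measurement, with +1 taken to mean apple and -1 banana; note that a "rough" banana differs from the prototype only in its texture element, so it is still classified correctly:

```python
def hardlims(n):
    """Symmetric hard-limit transfer function: +1 if n >= 0, else -1."""
    return 1 if n >= 0 else -1

def perceptron(p, w=(1, 0, 0), b=0):
    """Single-neuron perceptron: a = hardlims(w . p + b).
    w = (1, 0, 0) looks only at the shape measurement."""
    n = sum(wi * pi for wi, pi in zip(w, p)) + b
    return hardlims(n)

# Measurement vectors, ordered [shape, texture, weight].
banana = [-1, 1, -1]
apple = [1, 1, -1]
rough_banana = [-1, -1, -1]   # a banana with a rough surface

print(perceptron(banana))        # -1 (banana)
print(perceptron(apple))         # 1 (apple)
print(perceptron(rough_banana))  # -1 (still classified as banana)
```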

Hamming Network

Feedforward Layer for Banana/Apple Recognition

Recurrent Layer

Hamming Operation, First Layer. Input: “Rough” Banana.

Hamming Operation, Second Layer.
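The two Hamming layers can be sketched as follows (a minimal sketch: the first-layer bias equals R = 3, the number of measurements, so the scores are non-negative, and eps = 0.5 is one admissible lateral-inhibition strength, satisfying eps < 1/(S-1) for S = 2 prototype neurons):

```python
def poslin(n):
    """Positive-linear transfer function: n if n > 0, else 0."""
    return n if n > 0 else 0

# Prototypes as rows of the feedforward weight matrix; bias = R = 3.
prototypes = [[-1, 1, -1],   # banana
              [1, 1, -1]]    # apple
R = 3

def feedforward(p):
    """First layer: each neuron's output is (prototype . p) + R,
    a score that is largest for the closest prototype."""
    return [sum(wi * pi for wi, pi in zip(proto, p)) + R
            for proto in prototypes]

def compete(a, eps=0.5, steps=10):
    """Recurrent second layer: each neuron inhibits the other
    until only the strongest (winner-take-all) stays positive."""
    for _ in range(steps):
        a = [poslin(a[0] - eps * a[1]), poslin(a[1] - eps * a[0])]
    return a

rough_banana = [-1, -1, -1]
a1 = feedforward(rough_banana)
print(a1)           # [4, 2]: the banana prototype matches better
a2 = compete(a1)
print(a2)           # banana neuron stays positive; apple neuron driven to 0
```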

Hopfield Network

Apple/Banana Problem. Test input: “Rough” Banana; the network output converges to the Banana prototype.
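One way to realize this behavior is a diagonal Hopfield design (a sketch under our own parameter choices, not necessarily the chapter's exact values): since both prototypes share texture = +1 and weight = -1, the biases push those two elements toward the shared values, while a gain greater than 1 amplifies whatever shape the input presents until it saturates at +1 or -1:

```python
def satlins(n):
    """Saturating linear transfer function: clips n to [-1, 1]."""
    return max(-1.0, min(1.0, n))

# Illustrative diagonal weights and biases, ordered [shape, texture, weight]:
# shape is amplified (gain 1.2), texture is driven toward +1 (bias +0.9),
# and weight is driven toward -1 (bias -0.9).
W_diag = [1.2, 0.2, 0.2]
b = [0.0, 0.9, -0.9]

def hopfield(p, steps=10):
    """Iterate a(t+1) = satlins(W a(t) + b) until the output settles."""
    a = list(p)
    for _ in range(steps):
        a = [satlins(w * ai + bi) for w, ai, bi in zip(W_diag, a, b)]
    return a

rough_banana = [-1, -1, -1]
print(hopfield(rough_banana))  # converges to [-1, 1, -1], the banana prototype
```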

Summary

Perceptron
- Feedforward Network
- Linear Decision Boundary
- One Neuron for Each Decision

Hamming Network
- Competitive Network
- First Layer: Pattern Matching (Inner Product)
- Second Layer: Competition (Winner-Take-All)
- # Neurons = # Prototype Patterns

Hopfield Network
- Dynamic Associative Memory Network
- Network Output Converges to a Prototype Pattern
- # Neurons = # Elements in Each Prototype Pattern

The End of Chapter III