An Illustrative Example.


An Illustrative Example

Apple/Orange Sorter
A neural network reads three sensor values and sorts each fruit as an apple or an orange: Sensors → Network → Sorter → Apple or Orange.
Shape: {1 : round ; -1 : elliptical}
Texture: {1 : smooth ; -1 : rough}
Weight: {1 : > 1 lb. ; -1 : < 1 lb.}
(Note: "elliptical" means oval-shaped; "lb." abbreviates pound, 1 lb. = 0.454 kg, 1 kg = 2.205 lb.)

Prototype Vectors
Sensor encoding: Shape: {1 : round ; -1 : elliptical}; Texture: {1 : smooth ; -1 : rough}; Weight: {1 : > 1 lb. ; -1 : < 1 lb.}
The two prototype vectors are p1 = [1, -1, -1]^T (orange: round, rough, light) and p2 = [1, 1, -1]^T (apple: round, smooth, light).
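For use in the sketches below, a minimal NumPy encoding of these prototypes (taken from the standard textbook example this deck follows):

```python
import numpy as np

# Sensor encoding: [shape, texture, weight], each feature is +1 or -1.
p_orange = np.array([1, -1, -1])  # round, rough, weighs < 1 lb.
p_apple  = np.array([1,  1, -1])  # round, smooth, weighs < 1 lb.
```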

Perceptron
Transfer function hardlims (symmetric hard limit): a = 1 if n ≥ 0, a = -1 if n < 0.

Perceptron (cont.)
Let W = [-1, 1] (a 1×2 weight row); a bias of b = -1 makes both computations below consistent, so n = Wp + b.
If (p1, p2) = (-1, 2), then n = (-1)(-1) + (1)(2) - 1 = 2 and a = hardlims(2) = 1.
If (p1, p2) = (1, -3), then n = (-1)(1) + (1)(-3) - 1 = -5 and a = hardlims(-5) = -1.
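A short sketch of this two-input perceptron in NumPy, using W as given and the bias b = -1 inferred from the worked computations above:

```python
import numpy as np

def hardlims(n):
    """Symmetric hard limit: +1 for n >= 0, -1 otherwise."""
    return 1 if n >= 0 else -1

W = np.array([-1, 1])   # 1xR weight row, R = 2 inputs
b = -1                  # bias inferred from the worked example

for p in ([-1, 2], [1, -3]):
    n = W @ np.array(p) + b
    print(p, "-> n =", n, ", a =", hardlims(n))
# [-1, 2] -> n = 2 , a = 1
# [1, -3] -> n = -5 , a = -1
```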

Apple/Orange Example
Following the textbook example this deck is based on, a single perceptron with W = [0, 1, 0] and b = 0 separates the two prototypes: the texture sensor alone decides (n = Wp + b is +1 for the apple prototype and -1 for the orange prototype).

Apple/Orange Example (cont.)
For the oblong orange test input p = [-1, -1, -1]^T, n = Wp + b = -1, so a = hardlims(-1) = -1 and the fruit is classified as an orange.
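A quick check of this classification in NumPy, with W and b as reconstructed above:

```python
import numpy as np

def hardlims(n):
    """Symmetric hard limit: +1 for n >= 0, -1 otherwise."""
    return 1 if n >= 0 else -1

W = np.array([0, 1, 0])          # only the texture sensor matters
b = 0

p_test = np.array([-1, -1, -1])  # oblong, rough, light fruit
a = hardlims(W @ p_test + b)
print(a)                         # -1: classified as an orange
```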

Hamming Network

Feedforward Layer for Orange/Apple Recognition
S = 2 neurons, one per prototype. Transfer function purelin: a = n. Each row of W1 is a prototype pattern and each bias equals R = 3 (the input dimension), so a1 = W1 p + b1 is largest for the prototype closest to the input.
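A minimal sketch of this layer in NumPy, assuming the prototypes above and the oblong-orange test input used later in the deck:

```python
import numpy as np

p_orange = np.array([1, -1, -1])
p_apple  = np.array([1,  1, -1])

W1 = np.vstack([p_orange, p_apple])  # 2x3: one prototype per row
b1 = np.array([3, 3])                # each bias = R = 3

p_test = np.array([-1, -1, -1])      # oblong, rough, light test fruit
a1 = W1 @ p_test + b1                # purelin: a = n
print(a1)                            # [4 2]: closer to the orange prototype
```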

Feedforward Layer (cont.)
Why is it called a Hamming network? The Hamming distance between two vectors is the number of elements in which they differ. E.g., the Hamming distance between [1, -1, -1] and [1, 1, 1] is 2, and the Hamming distance between [1, 1, -1] and [1, 1, 1] is 1.
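For ±1 vectors of length R, the inner product and the Hamming distance H are related by pᵀq = R - 2H, which is why the inner-product layer above ranks prototypes by Hamming distance. A quick check on the slide's two examples:

```python
import numpy as np

def hamming(p, q):
    """Number of positions where two +/-1 vectors differ."""
    return int(np.sum(p != q))

for p, q in [([1, -1, -1], [1, 1, 1]), ([1, 1, -1], [1, 1, 1])]:
    p, q = np.array(p), np.array(q)
    R, H = len(p), hamming(p, q)
    assert p @ q == R - 2 * H      # inner product encodes Hamming distance
    print(p, "vs", q, ": H =", H)  # H = 2, then H = 1
```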

Recurrent Layer
Transfer function poslin (positive linear): a = 0 if n < 0, a = n if n ≥ 0.

Hamming Operation: First Layer
Input: the test pattern p = [-1, -1, -1]^T (an oblong, rough, light fruit). First-layer output: a1 = W1 p + b1 = [4, 2]^T.

Hamming Operation: Second Layer
The recurrent layer uses W2 = [1, -ε; -ε, 1] with ε = 0.5 and iterates a2(t+1) = poslin(W2 a2(t)) until only one output stays positive. Starting from a1 = [4, 2]^T, the first neuron wins, so the network classifies the input as an orange.
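A sketch of this winner-take-all competition, assuming the a1 = [4, 2] first-layer output from above:

```python
import numpy as np

def poslin(n):
    """Positive linear: 0 for n < 0, n for n >= 0."""
    return np.maximum(n, 0)

eps = 0.5
W2 = np.array([[1.0, -eps],
               [-eps, 1.0]])     # lateral inhibition between the 2 neurons

a2 = np.array([4.0, 2.0])        # first-layer output a1
while np.count_nonzero(a2) > 1:  # compete until one neuron survives
    a2 = poslin(W2 @ a2)
print(a2)                        # [3. 0.]: neuron 1 (orange) wins
```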

Hopfield Network

Apple/Orange Problem
Transfer function satlins (symmetric saturating linear): a = -1 if n < -1, a = n if -1 ≤ n ≤ 1, a = 1 if n > 1.

Apple/Orange Problem (cont.)
Test: starting from the oblong-orange input [-1, -1, -1]^T, the network output converges to the orange prototype [1, -1, -1]^T.
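A minimal sketch of this Hopfield run. The diagonal weights W = diag(0.2, 1.2, 0.2) and bias b = [0.9, 0, -0.9] are the values used in the Hagan-Demuth-Beale example this deck follows; treat them as one illustrative choice, not the only one:

```python
import numpy as np

def satlins(n):
    """Symmetric saturating linear: clip n to [-1, 1]."""
    return np.clip(n, -1, 1)

W = np.diag([0.2, 1.2, 0.2])      # illustrative textbook weights
b = np.array([0.9, 0.0, -0.9])

a = np.array([-1.0, -1.0, -1.0])  # oblong-orange test input
for _ in range(20):               # iterate a(t+1) = satlins(W a(t) + b)
    a_new = satlins(W @ a + b)
    if np.allclose(a_new, a):
        break                     # converged
    a = a_new
print(a)                          # [ 1. -1. -1.]: the orange prototype
```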

Summary
Perceptron: feedforward network; linear decision boundary; one neuron for each decision.
Hamming Network: competitive network; first layer performs pattern matching (inner product); second layer performs competition (winner-take-all); # neurons = # prototype patterns.
Hopfield Network: dynamic associative memory network; the network output converges to a prototype pattern; # neurons = # elements in each prototype pattern.