An Illustrative Example.


1 An Illustrative Example

2 Apple/Banana Sorter

3 Prototype Vectors
Measurement vector, prototype banana, and prototype apple, with each measurement encoded as:
Shape: {1 : round ; -1 : elliptical}
Texture: {1 : smooth ; -1 : rough}
Weight: {1 : > 1 lb. ; -1 : < 1 lb.}
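Using this encoding, each fruit becomes a three-element measurement vector. As a sketch, the values below assume a banana is elliptical, smooth, and light, and an apple is round, smooth, and light, so the two prototypes differ only in shape:

```python
import numpy as np

# Measurement vector: [shape, texture, weight]
# Assumed readings for the two prototypes:
p_banana = np.array([-1, 1, -1])  # elliptical, smooth, < 1 lb
p_apple  = np.array([ 1, 1, -1])  # round, smooth, < 1 lb
```

With these assumptions, shape is the only feature that distinguishes the two fruits, which is what the later networks exploit.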

4 Perceptron

5 Two-Input Case Decision Boundary

6 Apple/Banana Example
The decision boundary should separate the prototype vectors. The weight vector should be orthogonal to the decision boundary and should point in the direction of the vectors that should produce an output of 1. The bias determines the position of the boundary.

7 Testing the Network: Banana, Apple, “Rough” Banana
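The test can be sketched with a single hard-limit neuron. The weight matrix and bias below are assumptions chosen so that the boundary is the shape = 0 plane, with output 1 meaning "banana" and -1 meaning "apple"; the "rough" banana flips only the texture element, so the perceptron still classifies it as a banana:

```python
import numpy as np

def hardlims(n):
    return np.where(n >= 0, 1, -1)  # symmetric hard-limit transfer function

# Assumed weights: decision boundary is the shape = 0 plane.
W = np.array([-1.0, 0.0, 0.0])
b = 0.0

banana       = np.array([-1.0,  1.0, -1.0])
apple        = np.array([ 1.0,  1.0, -1.0])
rough_banana = np.array([-1.0, -1.0, -1.0])  # texture flipped to "rough"

for name, p in [("banana", banana), ("apple", apple),
                ("rough banana", rough_banana)]:
    a = hardlims(W @ p + b)
    print(name, "->", "banana" if a == 1 else "apple")
```

Since the rough banana differs from the banana prototype only in an element the weight vector ignores, the perceptron is insensitive to that noise.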

8 Hamming Network

9 Properties of Hamming Network
- It was designed for binary pattern recognition problems: each input element is either 1 or -1.
- It has a feedforward layer and a recurrent layer; the number of neurons is the same in both layers.
- It is intended to decide which prototype vector is closest to the input.
- The feedforward layer computes the inner product between each of the prototype patterns and the input pattern; the rows of its weight matrix W are set to the prototype patterns.
- The feedforward layer uses a linear transfer function (purelin).
- Each element of the bias vector equals R, the number of elements in the input vector. This ensures that the input to the recurrent layer is always non-negative.
- The recurrent layer is known as the "competitive layer"; it has one neuron for each prototype pattern, and its inputs are the outputs of the feedforward layer.
- When the recurrent layer converges, there is one neuron with a non-zero output, and that neuron indicates the prototype closest to the input vector.

10 Feedforward Layer for Banana/Apple Recognition
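A minimal sketch of this layer, assuming the prototype encodings from the earlier slide (banana [-1, 1, -1], apple [1, 1, -1]): the rows of W1 are the prototypes and every bias equals R = 3, so the layer scores how close the input is to each prototype.

```python
import numpy as np

# Assumed prototype encodings: [shape, texture, weight]
p_banana = np.array([-1.0, 1.0, -1.0])
p_apple  = np.array([ 1.0, 1.0, -1.0])

# Rows of W1 are the prototypes; each bias equals R, the input dimension.
W1 = np.vstack([p_banana, p_apple])
b1 = np.full(2, 3.0)

p = np.array([-1.0, -1.0, -1.0])   # "rough" banana input
a1 = W1 @ p + b1                   # purelin: output equals net input
print(a1)                          # [4. 2.] -- banana neuron scores higher
```

The scores are non-negative by construction, so they are a valid starting point for the competitive layer.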

11 Recurrent Layer (S is the number of neurons in the recurrent layer)

12 Hamming Operation: First Layer, Input (“Rough” Banana)

13 Hamming Operation Second Layer
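The second-layer competition can be sketched as follows. The starting vector [4, 2] is the first-layer output for the "rough" banana under the assumptions above, and eps = 0.5 is an assumed lateral-inhibition weight satisfying eps < 1/(S-1) for S = 2 neurons:

```python
import numpy as np

def poslin(n):
    return np.maximum(n, 0.0)   # positive linear transfer function

eps = 0.5                       # assumed; must satisfy eps < 1/(S-1), S = 2
W2 = np.array([[1.0, -eps],
               [-eps, 1.0]])    # each neuron inhibits the other

a = np.array([4.0, 2.0])        # first-layer output for the "rough" banana
while True:
    a_next = poslin(W2 @ a)     # compete until the output stops changing
    if np.array_equal(a_next, a):
        break
    a = a_next
print(a)  # [3. 0.] -- only the banana neuron survives
```

The weaker (apple) neuron is inhibited to zero within a couple of iterations, leaving the banana neuron as the single non-zero output.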

14 Hopfield Network

15 Properties of Hopfield Network
- The neurons are initialized with the input vector.
- The network iterates until the output converges; if the network works correctly, the converged output is one of the prototype vectors.
- The satlins (symmetric saturating linear) transfer function is used in the recurrent layer.
- Computing W and b is complex in general; for now we choose them so that the elements in which the prototypes differ are reinforced, while the elements the prototypes share decay toward their common values.

16 Apple/Banana Problem Test: “Rough” Banana (Banana)
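This test can be sketched as below. The diagonal W and the bias b are assumptions consistent with the design rule above: the shape element (where the prototypes differ) gets a gain greater than 1, while texture and weight are pushed by the biases toward the values both prototypes share (smooth, under 1 lb):

```python
import numpy as np

def satlins(n):
    return np.clip(n, -1.0, 1.0)  # symmetric saturating linear

# Assumed parameters: reinforce the distinguishing (shape) element,
# drive the shared elements toward the common prototype values.
W = np.diag([1.2, 0.2, 0.2])
b = np.array([0.0, 0.9, -0.9])

a = np.array([-1.0, -1.0, -1.0])   # initialize with the "rough" banana
for _ in range(50):                 # iterate until the output converges
    a_next = satlins(W @ a + b)
    if np.array_equal(a_next, a):
        break
    a = a_next
print(a)  # converges to [-1. 1. -1.], the banana prototype
```

Unlike the perceptron, the Hopfield network does not just label the input; its output converges to the full banana prototype, repairing the corrupted texture element along the way.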

17 Summary
Perceptron
- Feedforward network
- Linear decision boundary
- One neuron for each decision
Hamming Network
- Competitive network
- First layer: pattern matching (inner product)
- Second layer: competition (winner-take-all)
- Number of neurons = number of prototype patterns
Hopfield Network
- Dynamic associative memory network
- Network output converges to a prototype pattern
- Number of neurons = number of elements in each prototype pattern

