Multi-Layer Perceptron


1 Multi-Layer Perceptron
Ranga Rodrigo February 8, 2014

2 Introduction A perceptron by itself can only act as a linear classifier.
To go beyond linear decision boundaries, we can build a network of neurons (perceptron-like units) with an input layer, one or more hidden layers, and an output layer. Each layer consists of many neurons, and the outputs of one layer are fed as inputs to all neurons of the next layer.
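As a rough sketch of this layered structure (the layer sizes 3-4-2 and the use of NumPy are assumptions for illustration, not from the slides), two adjacent layers of N1 and N2 neurons are connected by an N1 x N2 block of weights, plus one bias weight per neuron:

```python
import numpy as np

# Hypothetical layer sizes: 3 inputs, one hidden layer of 4 neurons, 2 outputs.
layer_sizes = [3, 4, 2]

# One weight matrix per pair of adjacent layers; the extra "+ 1" column holds the bias weight.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((n_out, n_in + 1)) * 0.1
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

for k, W in enumerate(weights, start=1):
    print(f"layer {k}: weight matrix shape {W.shape}")  # (N_k, N_{k-1} + 1)
```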

3 [Figure: MLP architecture with Layer 1, Layer 2, ..., Layer k, ..., Layer L (the output layer); two adjacent layers of N1 and N2 neurons are connected by N1 x N2 weights]

4 Description of the MLP In each layer $k$, $k = 1, \ldots, L$, there are $N_k$ elements (neurons); the $i$-th neuron of layer $k$ is denoted $N_i^k$. Each neuron may be a sigmoidal neuron. There are $N_0$ inputs, to which the signals $x_1(t), \ldots, x_{N_0}(t)$ are applied, written as the vector $\mathbf{x}(t) = [x_1(t), \ldots, x_{N_0}(t)]^T$. The output signal of the $i$-th neuron in the $k$-th layer is denoted $y_i^{(k)}(t)$.

5 Description of Parameters
$\mathbf{x}^{(k)}(t)$: input vector for the $k$-th layer. Its components come from the outputs of the $(k-1)$-th layer, $x_i^{(k)}(t) = y_i^{(k-1)}(t)$ (except for $k = 1$, where the external inputs $x_i(t)$ are applied, and $i = 0$, typically the constant bias input). $\mathbf{w}_i^{(k)} = [w_{i0}^{(k)}, w_{i1}^{(k)}, \ldots, w_{iN_{k-1}}^{(k)}]^T$: weights of neuron $N_i^k$.

6 [Figure: the i-th neuron in the k-th layer]
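The figure itself is not in the transcript; in the notation above (and assuming the sigmoidal activation mentioned on slide 4), the $i$-th neuron of the $k$-th layer is usually written as a weighted sum of its inputs followed by the activation:

```latex
s_i^{(k)}(t) = \sum_{j=0}^{N_{k-1}} w_{ij}^{(k)} \, x_j^{(k)}(t),
\qquad
y_i^{(k)}(t) = f\!\bigl(s_i^{(k)}(t)\bigr),
\qquad
f(s) = \frac{1}{1 + e^{-s}}
```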

7 Forward Pass In the forward pass, the input is propagated layer by layer to produce the output signals of the $L$-th layer, $y_i^{(L)}(t)$, which are then compared with the desired output signals $d_i(t)$.
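A minimal sketch of this forward pass in NumPy (the names `sigmoid` and `forward` are illustrative, the `weights` list is the one from the sketch after slide 2, and the 0.5 squared-error convention is an assumption; the slide's own formulas are images and not reproduced here):

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def forward(weights, x):
    """Propagate input x through all layers; return the activations of every layer."""
    activations = [x]
    y = x
    for W in weights:                      # W has shape (N_k, N_{k-1} + 1)
        y = np.concatenate(([1.0], y))     # prepend the bias input x_0 = 1
        y = sigmoid(W @ y)                 # output signals of layer k
        activations.append(y)
    return activations

# Example: error between the output layer and the desired signals d (hypothetical data).
# y_L = forward(weights, np.array([0.2, -0.1, 0.5]))[-1]
# error = 0.5 * np.sum((np.array([1.0, 0.0]) - y_L) ** 2)
```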

8 Backpropagation Weight update: the output error is propagated backwards through the layers, and the weights of every layer are adjusted to reduce it.
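The update formulas on the slide are not reproduced in the transcript; below is a minimal sketch of a standard gradient-descent backpropagation step for the sigmoid network above (it reuses `forward` and `sigmoid` from the previous sketch; the learning rate `eta` and the squared-error loss $E = \tfrac{1}{2}\sum_i (d_i - y_i^{(L)})^2$ are assumptions):

```python
import numpy as np

def backprop_update(weights, x, d, eta=0.1):
    """One gradient-descent step on the squared error 0.5 * ||d - y_L||^2."""
    # Forward pass, keeping every layer's output (uses forward/sigmoid from above).
    activations = forward(weights, x)

    # Output-layer delta: (y_L - d) * f'(s_L), with f'(s) = y * (1 - y) for the sigmoid.
    y_L = activations[-1]
    delta = (y_L - d) * y_L * (1.0 - y_L)

    # Walk backwards through the layers, updating weights and propagating deltas.
    for k in range(len(weights) - 1, -1, -1):
        y_prev = np.concatenate(([1.0], activations[k]))   # inputs to layer k (+ bias)
        grad = np.outer(delta, y_prev)                     # dE/dW for layer k
        if k > 0:
            # Propagate the error through the old weights (excluding the bias column).
            back = weights[k][:, 1:].T @ delta
            y_k = activations[k]
            delta = back * y_k * (1.0 - y_k)
        weights[k] -= eta * grad                           # weight update
    return weights

# Example usage (hypothetical data):
# x = np.array([0.2, -0.1, 0.5]); d = np.array([1.0, 0.0])
# weights = backprop_update(weights, x, d, eta=0.1)
```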

