Neural Networks.


1 Neural Networks

2 Introduction Artificial Neural Networks (ANN)
Connectionist computation
Parallel distributed processing
Biologically inspired computational models
Machine learning
Artificial intelligence: "the study and design of intelligent agents", where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success.

3 History McCulloch and Pitts introduced a simplified mathematical model of the biological neuron in 1943; Rosenblatt introduced the Perceptron in 1958.
The drawback in the late 1960s (Minsky and Papert): perceptron limitations.
The solution in the mid 1980s: the multi-layer perceptron and back-propagation training.

4 Summary of Applications
Function approximation
Pattern recognition / classification
Signal processing
Modeling
Control
Machine learning

5 Biologically Inspired.
Electro-chemical signals
Threshold output firing
Human brain: about 100 billion (10^11) neurons and 100 trillion (10^14) synapses

6 The Perceptron Sum of weighted inputs passed through a threshold activation function.
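
A minimal MATLAB sketch of this computation; the input x, weights w and bias b are illustrative assumptions, not values from the slides:

% Perceptron: weighted sum of the inputs followed by a hard threshold.
x = [0.5; -1.2; 0.3];    % example input vector (assumed)
w = [0.4; 0.1; -0.7];    % example weights (assumed)
b = 0.2;                 % example bias (assumed)
n = w' * x + b;          % sum of weighted inputs
y = double(n >= 0);      % threshold activation (the toolbox equivalent is hardlim(n))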

7 Activation Function The sigmoid function: logsig (MATLAB)

8 Activation Function The tanh function: tansig (MATLAB)
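
A small MATLAB sketch of the two activation functions named on these slides; the input range n is only an illustrative assumption:

% logsig and tansig applied to the same inputs.
n  = -5:0.1:5;       % example input range (assumed)
a1 = logsig(n);      % sigmoid: 1 ./ (1 + exp(-n)), output in (0, 1)
a2 = tansig(n);      % tanh-shaped: 2 ./ (1 + exp(-2*n)) - 1, output in (-1, 1)
% Without the toolbox the same curves are 1./(1+exp(-n)) and tanh(n).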

9 The multi layer perceptron (MLP)
[Diagram: three-layer feed-forward network from zin to zout; weight matrices W1, W2, W3, layer functions F1, F2, F3, layer inputs/outputs X1..X3 and Y0..Y3, and a constant bias input 1 feeding every layer.]

10 The multi layer perceptron (MLP)
[Diagram: the same network in compact block form, zin -> (W1, F1) -> (W2, F2) -> (W3, F3) -> zout, with layer inputs X1..X3, outputs Y1..Y3 and a bias input 1.]
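
A hedged sketch of the forward pass through the three-layer MLP in the diagram, assuming the bias is folded in as a constant input 1, tansig hidden layers and a linear output layer; all sizes and values are illustrative assumptions:

% Forward pass zin -> zout through three layers.
zin = rand(3, 1);                                     % example input (assumed)
W1 = randn(5, 4); W2 = randn(5, 6); W3 = randn(2, 6); % assumed weight matrices
y1 = tansig(W1 * [zin; 1]);                           % first hidden layer (F1)
y2 = tansig(W2 * [y1; 1]);                            % second hidden layer (F2)
zout = W3 * [y2; 1];                                  % linear output layer (F3)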

11 Supervised Learning Learning a function from supervised training data: a set of input vectors Zin and the corresponding desired output vectors Zout. The performance function measures how close the network outputs come to the desired outputs; the usual choice is the mean squared error (sketched below).
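
A minimal sketch of a mean-squared-error performance function; the desired outputs zout and the network outputs y below are assumed example values:

zout = [1 0 1; 0 1 0];               % desired outputs, one column per sample (assumed)
y    = [0.9 0.2 0.8; 0.1 0.7 0.2];   % network outputs (assumed)
perf = mean((zout(:) - y(:)).^2);    % mean squared error over all outputs and samples
% This is the measure the toolbox uses as its default performance function ('mse').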

12 Supervised Learning Gradient descent backpropagation
The back-propagation-of-error (BPE) algorithm
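
A hedged sketch of one gradient-descent/back-propagation step for a network with one tansig hidden layer and a linear output, trained on the squared error; the data, the layer sizes and the learning rate lr are illustrative assumptions:

% One batch step of gradient-descent back-propagation.
x  = rand(3, 10);  t = rand(1, 10);           % assumed inputs and targets
W1 = 0.1*randn(5, 4);  W2 = 0.1*randn(1, 6);  % assumed weights, bias folded in
lr = 0.01;                                    % assumed learning rate
a0 = [x; ones(1, size(x, 2))];                % inputs with a constant bias row
a1 = tansig(W1 * a0);                         % hidden-layer outputs
a1b = [a1; ones(1, size(a1, 2))];             % hidden outputs with bias row
y  = W2 * a1b;                                % linear output layer
e  = y - t;                                   % output error
d2 = e;                                       % output sensitivity (linear layer)
d1 = (W2(:, 1:end-1)' * d2) .* (1 - a1.^2);   % back-propagate through tansig
W2 = W2 - lr * (d2 * a1b');                   % gradient-descent weight updates
W1 = W1 - lr * (d1 * a0');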

13 BPE learning.
[Diagram: the network drawn as summation (S) and activation (f) units, from the input zin through weight layers W1 and W2 to the output zout; the error at zout is propagated backwards along the same connections.]

14 Neural Networks
0 Collect data.
1 Create the network.
2 Configure the network.
3 Initialize the weights.
4 Train the network.
5 Validate the network.
6 Use the network.
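
A hedged end-to-end sketch of this workflow with the MATLAB Neural Network Toolbox; the data and the hidden-layer size of 10 are illustrative assumptions:

x = rand(3, 100);              % 0 collect data: assumed example inputs
t = sin(sum(x));               %   assumed example targets
net = feedforwardnet(10);      % 1 create: one hidden layer with 10 neurons
net = configure(net, x, t);    % 2 configure input/output sizes and ranges
net = init(net);               % 3 initialize the weights
[net, tr] = train(net, x, t);  % 4 train (uses the held-out subsets for validation)
y = net(x);                    % 5 validate: simulate and inspect the errors
perf = perform(net, t, y);
ynew = net(rand(3, 1));        % 6 use the network on new inputs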

15 Lack of information in the training data.
Collect data. Lack of information in the training data is the main problem!
Use as few neurons in the hidden layer as possible.
Only use the network in working points represented in the training data.
Use validation and test data.
Normalize inputs/targets to fall in the range [-1, 1] or to have zero mean and unit variance (a sketch follows this list).
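
A short sketch of the normalization advice above, using two toolbox preprocessing functions; the raw matrices p and t are assumed example data:

p = 100 * rand(4, 50);            % assumed raw inputs with an arbitrary scale
t = 10 * rand(1, 50);             % assumed raw targets
[pn, ps] = mapminmax(p);          % each row scaled to the range [-1, 1]
[tn, ts] = mapminmax(t);
[pz, pzs] = mapstd(p);            % or: each row to zero mean and unit variance
% New inputs must reuse the stored settings: pnewn = mapminmax('apply', pnew, ps);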

16 Create the network. Configure the network. Initialize the weights.
[Diagram: feed-forward network built from summation (S) and activation (f) units with a single hidden layer and a constant bias input 1.]
Only one hidden layer.
The number of neurons in the hidden layer must be chosen.

17 Train the network. Validate the network.
Dividing the data into three subsets:
Training set (e.g. 70%)
Validation set (e.g. 15%)
Test set (e.g. 15%)
Training functions (MATLAB):
trainlm: Levenberg-Marquardt
trainbr: Bayesian Regularization
trainbfg: BFGS Quasi-Newton
trainrp: Resilient Backpropagation
trainscg: Scaled Conjugate Gradient
traincgb: Conjugate Gradient with Powell/Beale Restarts
traincgf: Fletcher-Powell Conjugate Gradient
traincgp: Polak-Ribière Conjugate Gradient
trainoss: One Step Secant
traingdx: Variable Learning Rate Gradient Descent
traingdm: Gradient Descent with Momentum
traingd: Gradient Descent
Also choose the number of iterations (epochs); a combined sketch follows this list.
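
A hedged sketch of the data division and the choice of training function; the ratios follow the slide, while the data and the network size are illustrative assumptions:

x = rand(3, 200);  t = sum(x.^2);         % assumed example data
net = feedforwardnet(10, 'trainlm');      % pick a training function, e.g. Levenberg-Marquardt
net.divideFcn = 'dividerand';             % random division into three subsets
net.divideParam.trainRatio = 0.70;        % training set
net.divideParam.valRatio   = 0.15;        % validation set (stops training early)
net.divideParam.testRatio  = 0.15;        % test set
net.trainParam.epochs = 1000;             % maximum number of iterations
[net, tr] = train(net, x, t);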

18 Other types of Neural networks
The RCE net: only for classification.
[Scatter plot: two classes of samples, x and o, in the (X1, X2) plane.]

19 Other types of Neural networks
The RCE net: only for classification.
[Diagram: the same (X1, X2) scatter plot with the RCE net's prototype cells covering the x-class regions and feeding a summation unit.]
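
The RCE (Restricted Coulomb Energy) net stores prototype cells, each with a centre and an influence radius, and classifies a point by the prototypes whose region covers it. A much simplified classification sketch; the prototypes, radii and query point below are made-up example values, and training (adding cells and shrinking radii on conflicts) is not shown:

C = [0.2 0.3; 0.8 0.7; 0.5 0.9];   % prototype centres in the (X1, X2) plane (assumed)
r = [0.25; 0.20; 0.15];            % influence radius of each prototype (assumed)
c = [1; 2; 2];                     % class label of each prototype (assumed)
xq = [0.75 0.65];                  % query point (assumed)
d = sqrt(sum((C - xq).^2, 2));     % distance from the query to every prototype
hit = find(d <= r);                % prototypes whose region covers the point
if isempty(hit)
    label = NaN;                   % no cell fires: "unknown" region
else
    label = mode(c(hit));          % majority class among the firing cells
end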

20 Parzen Estimator
[Diagram: one Gaussian unit G per training point (X, Y); for an input Xin the output Yout is the normalized sum (G / S) of the units, giving a smooth estimate through the plotted data points.]
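
The diagram describes a Parzen-window estimate: one Gaussian unit G per stored training point, and the output Yout for an input Xin formed as the normalized, weighted sum of the stored targets. A minimal sketch; the data and the width sigma are assumed example values (the toolbox's newgrnn builds a network of this kind):

X = [0.1 0.3 0.5 0.7 0.9];             % stored input points (assumed)
Y = [0.2 0.8 0.9 0.4 0.1];             % stored target values (assumed)
sigma = 0.1;                           % width of the Gaussian units (assumed)
xin = 0.6;                             % query input (assumed)
g = exp(-(X - xin).^2 / (2*sigma^2));  % one Gaussian unit per stored point
yout = sum(g .* Y) / sum(g);           % normalized weighted sum (the G / S stage)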

