Presentation on theme: "Generalization"— Presentation transcript:

1 Generalization

2 One of the major advantages of neural nets is their ability to generalize. This means that a trained net can classify data from the same class as the learning data even when it has never seen that data before. In other words, the generalization of a neural network is its ability to handle unseen data.

3 In real-world applications, developers normally have only a small part of all possible patterns available for training a neural net. To reach the best generalization, the dataset should be split into three parts: (1) the training set, used to train the neural net; the error on this set is minimized during training. (2) The validation set, used to determine the performance of the network on patterns it was not trained on. (3) The test set, used for a final check of the overall performance of the neural net.
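The three-way split described above can be sketched as follows (the 70/15/15 ratios are illustrative, not prescribed by the slides):

```python
import random

def split_dataset(data, train=0.7, val=0.15, seed=0):
    """Shuffle the data and split it into training, validation, and test sets."""
    data = list(data)
    random.Random(seed).shuffle(data)          # fixed seed for reproducibility
    n_train = int(len(data) * train)
    n_val = int(len(data) * val)
    train_set = data[:n_train]
    val_set = data[n_train:n_train + n_val]
    test_set = data[n_train + n_val:]          # remainder goes to the test set
    return train_set, val_set, test_set

train_set, val_set, test_set = split_dataset(range(100))
print(len(train_set), len(val_set), len(test_set))  # 70 15 15
```

Shuffling before splitting matters: if the data is ordered by class, a sequential split would put whole classes into only one of the subsets.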

4 Generalization error The generalization error (also known as the out-of-sample error) is a measure of how accurately an algorithm is able to predict outcome values for previously unseen data.
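A minimal sketch of estimating the out-of-sample error: the model below is a hypothetical fixed line standing in for a trained network, and the error measure is the mean squared error on held-out points.

```python
def mse(model, points):
    """Mean squared error of a model over (x, y) pairs."""
    return sum((model(x) - y) ** 2 for x, y in points) / len(points)

# Stand-in for a trained model: a fixed line y = 2x.
model = lambda x: 2 * x

train_points = [(0, 0), (1, 2), (2, 4)]   # seen during fitting
unseen_points = [(3, 6.5), (4, 7.5)]      # held out

in_sample_error = mse(model, train_points)        # 0.0 on the training data
generalization_error = mse(model, unseen_points)  # estimated out-of-sample error
```

The gap between the two numbers is the point: a model can fit its training data perfectly while still making errors on data it has never seen.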

5 Methods for improving the generalization of neural networks
Two methods for improving generalization are implemented in Neural Network Toolbox™ software: regularization and early stopping. A third practice, retraining neural networks, is also covered below.

6 (1) Regularization Another method for improving generalization is called regularization. This involves modifying the performance function, which is normally chosen to be the mean of the sum of squares of the network errors on the training set (the mean squared error).

7 It is possible to improve generalization if you modify the performance function by adding a term that consists of the mean of the sum of squares of the network weights and biases.
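A sketch of this modified performance function: a weighted combination of the mean squared error (mse) and the mean squared weights (msw), with a performance ratio gamma. The value gamma = 0.9 here is an illustrative assumption, not taken from the slides.

```python
def regularized_performance(errors, weights, gamma=0.9):
    """Regularized performance: gamma * mse + (1 - gamma) * msw.

    mse: mean of the squared network errors on the training set.
    msw: mean of the squared network weights and biases.
    """
    mse = sum(e ** 2 for e in errors) / len(errors)
    msw = sum(w ** 2 for w in weights) / len(weights)
    return gamma * mse + (1 - gamma) * msw

# Large weights are penalized even when the training error is small:
perf = regularized_performance(errors=[0.1, -0.2], weights=[3.0, -4.0, 1.0])
```

Penalizing large weights pushes the network toward a smoother response, which is why this modification tends to reduce overfitting.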

8 (2) Early stopping The default method for improving generalization is called early stopping. In this technique, the available data is divided into three subsets: the training set, the validation set, and the test set.
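The early-stopping loop can be sketched as follows: training stops once the validation error has not improved for a fixed number of epochs (the patience value and the synthetic validation curve are illustrative assumptions).

```python
def train_with_early_stopping(train_step, val_error, max_epochs=100, patience=5):
    """Stop when the validation error has not improved for `patience` epochs."""
    best_val, best_epoch = float("inf"), 0
    for epoch in range(max_epochs):
        train_step(epoch)             # one pass over the training set
        err = val_error(epoch)        # error on the validation set
        if err < best_val:
            best_val, best_epoch = err, epoch
        elif epoch - best_epoch >= patience:
            break                     # validation error rising: likely overfitting
    return best_epoch, best_val

# Synthetic validation curve: error falls, then rises as the net overfits.
curve = [1.0, 0.8, 0.6, 0.5, 0.55, 0.6, 0.7, 0.8, 0.9, 1.0, 1.1, 1.2]
epoch, err = train_with_early_stopping(
    train_step=lambda e: None,        # stand-in for one training epoch
    val_error=lambda e: curve[e],
    max_epochs=len(curve),
)
print(epoch, err)  # 3 0.5 — the weights from epoch 3 would be kept
```

In practice the weights from the best epoch are saved and restored, so the network returned is the one with the lowest validation error, not the last one trained.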

9 (3) Retraining Neural Networks
Typically, each backpropagation training session starts with different initial weights and biases, and different divisions of the data into training, validation, and test sets. These different conditions can lead to very different solutions for the same problem. It is therefore a good idea to train several networks to ensure that a network with good generalization is found.
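A sketch of this best-of-several-runs practice, where `train_once` is a hypothetical stand-in for one backpropagation session (its synthetic "validation error" just models that different initializations give different solutions):

```python
import random

def train_once(seed):
    """Stand-in for one backpropagation run from a given initialization.

    Returns (net, validation_error); here the 'net' is just the seed and
    the error is synthetic, standing in for a real training outcome.
    """
    rng = random.Random(seed)
    return seed, rng.uniform(0.1, 1.0)   # different init -> different solution

def best_of_n(n=5):
    """Train n networks from different initializations; keep the best one."""
    runs = [train_once(seed) for seed in range(n)]
    return min(runs, key=lambda run: run[1])  # lowest validation error wins

net, val_err = best_of_n(5)
```

Selecting by validation error (not training error) is what ties this back to generalization: the run that memorized its training set best is not necessarily the one that handles unseen data best.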
