
1 Paper study: Saichon Jaiyen, Chidchanok Lursinsap, Suphakant Phimoltares. IEEE Transactions on Neural Networks, vol. 21, no. 3, March 2010.

2 OUTLINE
Introduction
VEBF Neural Network
Example for Training
Experimental Results

3 OUTLINE: Introduction

4 Introduction
Most current training algorithms require both the new incoming data and the previously trained data together in order to correctly learn the whole data set. This paper proposes a very fast training algorithm that learns a data set in only one pass. The proposed neural network consists of three layers, but its structure is flexible and can be adjusted during the training process.

5 OUTLINE: VEBF Neural Network

6 VEBF Neural Network

7 VEBF: versatile elliptic basis function
Outline of the learning algorithm (a sketch follows below):
1. Present a training datum to the VEBF neural network.
2. Decide whether to create a new neuron. If one is created, set its parameters; if the datum can join an existing neuron, update that neuron's parameters.
3. Check the merge condition.
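To make the outline concrete, here is a minimal one-pass sketch in Python. The names (Neuron, vebf_psi, train_one_pass), the Euclidean-distance search for the closest neuron, and the default initial widths are illustrative assumptions; the paper's exact parameter initialization and merge test are not reproduced here.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Neuron:
    center: np.ndarray   # c, the mean vector of the data captured so far
    widths: np.ndarray   # a_1 ... a_n, the axis widths of the hyperellipsoid
    axes: np.ndarray     # rows are the orthonormal axis vectors u_1 ... u_n
    label: int           # class served by this hidden neuron
    count: int = 1       # number of data captured by this neuron

def vebf_psi(x, nu):
    """psi(x) = sum_i ((x - c)^T u_i)^2 / a_i^2 - 1 (negative inside the ellipsoid)."""
    proj = nu.axes @ (x - nu.center)
    return float(np.sum((proj / nu.widths) ** 2) - 1.0)

def train_one_pass(data, labels, init_width=3.0):
    """Every datum is seen exactly once; the hidden layer grows or is updated as needed."""
    neurons = []
    for x, y in zip(data, labels):
        x = np.asarray(x, dtype=float)
        same_class = [nu for nu in neurons if nu.label == y]
        closest = min(same_class, key=lambda nu: np.linalg.norm(x - nu.center)) if same_class else None
        if closest is None or vebf_psi(x, closest) > 0:
            # Step 2a (create): new neuron centered at x, default widths, identity axes.
            n = x.size
            neurons.append(Neuron(center=x.copy(), widths=np.full(n, init_width),
                                  axes=np.eye(n), label=y))
        else:
            # Step 2b (update): incremental mean; the covariance/axis updates
            # shown on later slides are omitted in this skeleton.
            closest.center = (closest.count * closest.center + x) / (closest.count + 1)
            closest.count += 1
        # Step 3: a check for merging overlapping same-class neurons would go here.
    return neurons
```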

8 VEBF Neural Network
For each vector x = [x_1, x_2, …, x_n]^T in R^n and an orthonormal basis {u_1, u_2, …, u_n} for R^n, let x_i = x^T u_i. The hyperellipsoidal equation, unrotated and centered at the origin, is defined as
\sum_{i=1}^{n} \frac{x_i^2}{a_i^2} = 1,
where a_i is the width of the i-th axis of the hyperellipsoid. Substituting x_i = x^T u_i, this can be written as
\sum_{i=1}^{n} \frac{(x^T u_i)^2}{a_i^2} = 1.
Define a new basis function as
\psi(x) = \sum_{i=1}^{n} \frac{(x^T u_i)^2}{a_i^2} - 1.

9 VEBF Neural Network
If the original axes of the hyperellipsoidal equation are translated from the origin to the coordinates of c = [c_1, c_2, …, c_n]^T, the new coordinates of the vector x with respect to the translated axes, denoted by x' = [x'_1, x'_2, …, x'_n]^T, can be written as
x' = x - c,  i.e.,  x'_i = (x - c)^T u_i.
The hyperellipsoidal equation then becomes
\sum_{i=1}^{n} \frac{((x - c)^T u_i)^2}{a_i^2} = 1.

10 VEBF Neural Network
The VEBF is defined as
\psi(x) = \sum_{i=1}^{n} \frac{((x - c)^T u_i)^2}{a_i^2} - 1,
where {u_1, u_2, …, u_n} is the orthonormal basis, the constant a_i, i = 1, …, n, is the width of the i-th axis of the hyperellipsoid, and the center vector c = [c_1, c_2, …, c_n]^T refers to the mean vector.
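The formula above translates directly into a few lines of numpy; vebf_psi is an illustrative name, and the rows of `axes` play the role of u_1, …, u_n.

```python
import numpy as np

def vebf_psi(x, c, widths, axes):
    """psi(x) = sum_i ((x - c)^T u_i)^2 / a_i^2 - 1; the rows of `axes` are u_1 ... u_n."""
    proj = np.asarray(axes, float) @ (np.asarray(x, float) - np.asarray(c, float))
    return float(np.sum((proj / np.asarray(widths, float)) ** 2) - 1.0)

# With c = (0, 0), widths (2, 1) and the standard axes, psi is negative inside
# the ellipse x^2/4 + y^2 = 1 and positive outside it.
print(vebf_psi([1.0, 0.0], [0.0, 0.0], [2.0, 1.0], np.eye(2)))  # -0.75 (inside)
print(vebf_psi([3.0, 0.0], [0.0, 0.0], [2.0, 1.0], np.eye(2)))  #  1.25 (outside)
```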

11 VEBF Neural Network

12 VEBF Neural Network
Let cs be the index of the closest hidden neuron. If \psi_{cs}(x) > 0, a new hidden neuron is allocated and added into the network. If \psi_{cs}(x) < 0, the datum joins the closest hidden neuron.
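A small sketch of this allocation rule, assuming the closest hidden neuron is chosen by Euclidean distance between the incoming datum and the neuron centers; the tuple layout (center, widths, axes) is only for illustration.

```python
import numpy as np

def psi(x, center, widths, axes):
    proj = axes @ (x - center)
    return float(np.sum((proj / widths) ** 2) - 1.0)

def decide(x, neurons):
    """neurons: list of (center, widths, axes) tuples of the same class as x."""
    if not neurons:
        return "create", None
    # cs: index of the neuron whose center is closest to x.
    cs = min(range(len(neurons)), key=lambda i: np.linalg.norm(x - neurons[i][0]))
    if psi(x, *neurons[cs]) > 0:   # x falls outside the closest ellipsoid
        return "create", cs
    return "join", cs              # x is covered, so neuron cs is updated instead
```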

13 VEBF Neural Network
Mean computation. The recursive relation can be written as
c_{new} = \frac{N c_{old} + x}{N + 1},
where c_{new} is the new mean vector, c_{old} is the current mean vector, x is the new incoming datum, and N is the number of data previously captured by the neuron.
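The recursion above is the standard incremental mean, so it can be checked directly against a batch computation; the data below are the class-0 points from the training example later in the slides.

```python
import numpy as np

def update_mean(c_old, x, n):
    """New mean after adding x to a neuron that already holds n data with mean c_old."""
    return (n * c_old + x) / (n + 1)

data = np.array([[5.0, 16.0], [10.0, 18.0], [11.0, 16.0]])  # class-0 points from the example
c = data[0].copy()
for k in range(1, len(data)):
    c = update_mean(c, data[k], k)          # k data were captured before this one
print(c, data.mean(axis=0))                 # both give [8.6667, 16.6667]
```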

14 VEBF Neural Network
Covariance matrix computation. The recursive relation updates the covariance matrix of a neuron from its current covariance matrix, its current mean vector, the new incoming datum, and the number of data it has already captured, so no previously seen data need to be stored. The orthonormal axis vectors are computed as the eigenvectors of the covariance matrix, sorted by eigenvalue.
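A sketch of this step, assuming the slide's (omitted) recursion is the standard incremental update of the population covariance; the eigenvectors of the updated matrix, sorted by eigenvalue, then serve as the orthonormal axes. How the widths a_i are set from the eigenvalues is not shown on the slide and is left out here.

```python
import numpy as np

def update_covariance(cov_old, mean_old, x, n):
    """Covariance of n+1 data from the covariance and mean of the first n and the new x."""
    d = x - mean_old
    return (n / (n + 1)) * cov_old + (n / (n + 1) ** 2) * np.outer(d, d)

def sorted_axes(cov):
    """Rows are eigenvectors of cov, sorted by decreasing eigenvalue."""
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order].T, eigvals[order]

# Quick check against numpy's batch covariance on the class-0 example points.
data = np.array([[5.0, 16.0], [10.0, 18.0], [11.0, 16.0]])
cov, mean = np.zeros((2, 2)), data[0].copy()
for k in range(1, len(data)):
    cov = update_covariance(cov, mean, data[k], k)
    mean = (k * mean + data[k]) / (k + 1)
print(np.allclose(cov, np.cov(data.T, bias=True)))   # True
```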

15-16 VEBF Neural Network - merge

17 OUTLINE: Example for Training

18 Example for Training
Suppose that X = {((5,16)^T, 0), ((15,6)^T, 1), ((10,18)^T, 0), ((5,6)^T, 1), ((11,16)^T, 0)} is a set of training data in R^2. The training data in class 0 are illustrated by "+" while the training data in class 1 are illustrated by "*".
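The toy training set can be written out and plotted exactly as the slide describes, with "+" for class 0 and "*" for class 1; matplotlib is used only for the illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

X = np.array([[5, 16], [15, 6], [10, 18], [5, 6], [11, 16]], dtype=float)
y = np.array([0, 1, 0, 1, 0])

plt.scatter(*X[y == 0].T, marker="+", label="class 0")
plt.scatter(*X[y == 1].T, marker="*", label="class 1")
plt.legend()
plt.show()
```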

19 Example for Training
1. The training datum ((5,16)^T, 0) is presented to the VEBF neural network. It belongs to class 0, for which no neuron exists yet, so a new neuron is created.

20 Example for Training
2. The training datum ((15,6)^T, 1) is fed to the VEBF neural network. It belongs to class 1, for which no neuron exists yet, so a new neuron is created.

21 Example for Training 21 3. The training data (10,18) T,0) is fed to the VEBF neural network. class 0 find the closest neuron detect the distance update the parameters

22 Example for Training 22 3. The training data (10,18) T,0) is fed to the VEBF neural network. class 0 find the closest neuron detect the distance update the parameters

23-24 Example for Training
4. The training datum ((5,6)^T, 1) is fed to the VEBF neural network. It belongs to class 1: the closest neuron of that class is found, the distance is checked, and a new neuron is created.

25-26 Example for Training
5. The training datum ((11,16)^T, 0) is fed to the VEBF neural network. It belongs to class 0: the closest neuron of that class is found, the distance is checked, and that neuron's parameters are updated.

27 OUTLINE: Experimental Results

28 Experimental Results
For the multiclass classification problems, the results are compared with the conventional RBF neural network with Gaussian RBF and with the multilayer perceptron (MLP). For the two-class classification problems, the results are also compared with the support vector machine (SVM).

29 Experimental Results
The data sets used for training and testing are collected from the University of California at Irvine (UCI) Machine Learning Repository.

30 Experimental Results
Multiclass classification. Data set: Iris; attributes: 4; classes: 3; instances: 150.

31 Experimental Results
Multiclass classification. Data set: E.coli; attributes: 8; classes: 8; instances: 336.

32 Experimental Results
Multiclass classification. Data set: Yeast; attributes: 8; classes: 10; instances: 1484.

33 Experimental Results
Multiclass classification. Data set: Image Segmentation; attributes: 19; classes: 7; instances: 2310.

34 Experimental Results
Multiclass classification. Data set: Waveform; attributes: 21; classes: 3; instances: 5000.

35-36 Experimental Results
Two-class classification. Data set: Heart; attributes: 13; classes: 2; instances: 270.

37-38 Experimental Results
Two-class classification. Data set: Spambase; attributes: 57; classes: 2; instances: 4601.

39-40 Experimental Results
Two-class classification. Data set: Sonar; attributes: 60; classes: 2; instances: 208.

41-42 Experimental Results
Two-class classification. Data set: Liver; attributes: 7; classes: 2; instances: 345.

