
1 Co-operative Training in Classifier Ensembles
Rozita Dara, PAMI Lab, University of Waterloo

2 Outline
- Introduction
- Sharing Training Resources
  - Sharing Training Patterns
  - Sharing Training Algorithms
  - Sharing Training Information
- Sharing Training Information: An Algorithm
- Experimental Study
- Discussion and Conclusions

3 Introduction
Multiple Classifier Systems provide:
- Improved performance
- Better reliability and generalization
Motivations for Multiple Classifier Systems include:
- Empirical observation of improved results
- Problems that decompose naturally, e.g. when data come from various sensors
- Avoiding commitment to arbitrary initial conditions or parameters

4 Introduction (cont'd)
"Combining identical classifiers will not lead to improved performance."
- Hence the importance of creating diverse classifiers
- How does the amount of "sharing" between classifiers affect performance?

5 Sharing Training Resources
A measure of the degree of co-operation between classifiers. Three forms:
- Sharing Training Patterns
- Sharing Training Algorithms
- Sharing Training Information

6 Sharing Training Patterns

7 Sharing Training Algorithms

8 Sharing Training Information

9 Training
Training each component independently has drawbacks:
- Optimizing individual components may not lead to overall improvement
- Collinearity: high correlation between classifiers
- Components may be under-trained or over-trained

10 Training (cont'd)
Adaptive training is:
- Selective: reduces correlation between components
- Focused: re-training concentrates on misclassified patterns
- Efficient: determines the duration of training

11 Adaptive Training: Main Loop
- Share training information between members of the ensemble
- Incremental learning
- Evaluation of training to determine the re-training set
A sketch of this loop follows.
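A minimal sketch of the main loop, assuming ensemble members with a scikit-learn-style partial_fit/predict interface (e.g. SGDClassifier). All names here are illustrative, not the authors' code, and the re-training rule at the end is a deliberately crude placeholder that the data-selection slide below refines.

```python
import numpy as np

def adaptive_training_loop(classifiers, train_sets, classes, n_rounds=10):
    """Hypothetical main loop; see the lead-in for assumptions."""
    for _ in range(n_rounds):
        # Incremental learning: each member trains a little further on
        # its current (possibly re-composed) training set.
        for clf, (X, y) in zip(classifiers, train_sets):
            clf.partial_fit(X, y, classes=classes)

        # Share training information: record which patterns each member
        # still misclassifies after this round.
        error_masks = [clf.predict(X) != y
                       for clf, (X, y) in zip(classifiers, train_sets)]

        # Evaluate training to determine the re-training set. As a crude
        # placeholder, duplicate the misclassified patterns so the next
        # round focuses on them; the data-selection slide refines this.
        train_sets = [(np.concatenate([X, X[m]]), np.concatenate([y, y[m]]))
                      for (X, y), m in zip(train_sets, error_masks)]
    return classifiers
```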

12 Adaptive Training: Training
- Save a classifier when it performs well on the evaluation set
- Determine when to terminate training for each module
A sketch of this control logic follows.
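One plausible reading of these two rules as code: checkpoint a member whenever its evaluation-set accuracy improves, and stop its training after a fixed number of rounds without improvement. The accuracy criterion and the `patience` threshold are assumptions; the slide does not specify them.

```python
import copy

def train_with_checkpointing(clf, train_set, eval_set, classes,
                             max_rounds=50, patience=5):
    X, y = train_set
    X_ev, y_ev = eval_set
    best_score, best_clf, stale = -1.0, None, 0
    for _ in range(max_rounds):
        clf.partial_fit(X, y, classes=classes)
        score = (clf.predict(X_ev) == y_ev).mean()  # eval-set accuracy
        if score > best_score:
            # Save the classifier: it performs well on the evaluation set.
            best_score, best_clf, stale = score, copy.deepcopy(clf), 0
        else:
            stale += 1
        if stale >= patience:
            break  # terminate training for this module
    return best_clf, best_score
```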

13 Adaptive Training: Evaluation
- Train the aggregation modules
- Evaluate the training sets for each classifier
- Compose new training data
A sketch of one simple trainable aggregation module follows.
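To illustrate where "train aggregation modules" fits, here is a sketch of the simplest trainable combiner: a weighted average with weights proportional to each member's accuracy on the evaluation set. The aggregation modules compared in the results (Borda, Nash, Bayesian, Choquet integral, etc.) are more elaborate; accuracy-proportional weights are an assumption made only for this sketch.

```python
import numpy as np

def train_weighted_average(classifiers, eval_set):
    # Weight each member by its accuracy on the evaluation set.
    X_ev, y_ev = eval_set
    accs = np.array([(clf.predict(X_ev) == y_ev).mean()
                     for clf in classifiers])
    return accs / accs.sum()

def weighted_average_predict(classifiers, weights, X):
    # Combine per-class probability outputs with the learned weights.
    # Assumes integer class labels 0..K-1 so argmax is the label.
    probs = sum(w * clf.predict_proba(X)
                for clf, w in zip(classifiers, weights))
    return probs.argmax(axis=1)
```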

14 Adaptive Training: Data Selection
New training data are composed by concatenating:
- Error_i: the misclassified entries of the training data for classifier i
- Correct_i: a random choice of ⌈R·(P·δ_i)⌉ correctly classified entries of the training data for classifier i
A sketch of this selection step follows.
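A direct sketch of this selection rule. The slide does not define R, P or δ_i; the reading below (P = training-set size, δ_i = classifier i's error rate, R = a sampling ratio in [0, 1]) is an assumption.

```python
import math
import numpy as np

def compose_retraining_set(X, y, y_pred, R, delta_i, rng=None):
    rng = rng or np.random.default_rng()
    wrong = y_pred != y
    # Error_i: all misclassified entries of classifier i's training data.
    X_err, y_err = X[wrong], y[wrong]
    # Correct_i: ceil(R * (P * delta_i)) randomly chosen correct entries,
    # where P is taken here to be the training-set size (an assumption).
    correct_idx = np.flatnonzero(~wrong)
    n_keep = min(len(correct_idx), math.ceil(R * (len(X) * delta_i)))
    keep = rng.choice(correct_idx, size=n_keep, replace=False)
    # New training data: the concatenation of Error_i and Correct_i.
    return (np.concatenate([X_err, X[keep]]),
            np.concatenate([y_err, y[keep]]))
```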

15 Results
Experimental setup:
- Five one-hidden-layer backpropagation (BP) classifiers
- Training used partially disjoint data sets (see the sketch below)
- No optimization was performed on the trained networks
- The same network parameters were maintained for all trained classifiers
Three data sets:
- 20 Class Gaussian
- Satimages
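The slides do not say how the partially disjoint training sets were drawn; the sketch below shows one common construction (a core shared by all members plus a private, disjoint slice per member), purely as an illustration. The shared fraction is an assumption.

```python
import numpy as np

def partially_disjoint_splits(X, y, n_members=5, shared_frac=0.2, seed=0):
    # Shuffle once, reserve a shared core, then split the remainder
    # into disjoint per-member slices.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_shared = int(shared_frac * len(X))
    shared, rest = idx[:n_shared], idx[n_shared:]
    return [(X[np.concatenate([shared, part])],
             y[np.concatenate([shared, part])])
            for part in np.array_split(rest, n_members)]
```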

16 Results (cont'd)
Classification error (%), mean ± standard deviation (lower is better):

Method             20 Class                        Satimages
                   Without Sharing  With Sharing   Without Sharing  With Sharing
Majority           14.25 ± 0.51     13.48 ± 0.32   13.42 ± 0.88     13.39 ± 0.90
Maximum            14.34 ± 0.99     14.00 ± 0.34   14.87 ± 1.90     14.59 ± 1.54
Average            13.88 ± 0.41     13.23 ± 0.09   13.31 ± 1.01     13.26 ± 0.93
Nash               13.75 ± 0.43     13.08 ± 0.19   17.36 ± 4.15     16.28 ± 4.44
Borda              14.00 ± 0.62     13.06 ± 0.25   13.97 ± 1.33     13.90 ± 1.03
Weighted Average   13.46 ± 0.56     12.84 ± 0.17   13.17 ± 0.94     13.09 ± 0.91
Bayesian           13.12 ± 0.20     12.66 ± 0.10   13.63 ± 0.80     13.76 ± 1.03
Choquet Integral   14.39 ± 0.97     14.12 ± 0.30   14.85 ± 2.07     14.57 ± 1.43
Best Classifier    16.20 ± 4.03     15.57 ± 3.27   16.52 ± 2.80     17.13 ± 1.03
Oracle              3.74 ± 0.32      3.88 ± 0.31    5.08 ± 0.13      5.41 ± 0.23

17 Conclusions
- Exchange of information during training allows for a more informed fusion process
- Sharing enhances diversity amongst classifiers
- Algorithms that share training information can improve overall classification accuracy

18 Conclusions (cont'd)


