
1 Cooperative Classifiers
Rozita Dara
Supervisor: Prof. Kamel
Pattern Analysis and Machine Intelligence Lab, University of Waterloo

2 Combining Classifiers
Goals:
- Improve performance over the constituent classifiers.
- Maximize the use of available information.
- Obtain a reliable system.
Challenges:
- Intelligent combination that exploits complementary information.
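The slide states the goal of combining classifiers without naming a specific combiner. Purely as an illustration, and not as the method used in this work, a majority-vote combination of heterogeneous base classifiers could be sketched with scikit-learn as follows; the synthetic dataset and the choice of base learners are placeholder assumptions:

# Minimal majority-vote combination of heterogeneous base classifiers.
# Dataset and base learners are illustrative placeholders only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import VotingClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three complementary base classifiers combined by hard (majority) voting.
ensemble = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("logreg", LogisticRegression(max_iter=1000)),
        ("nb", GaussianNB()),
    ],
    voting="hard",
)
ensemble.fit(X_train, y_train)
print("combined accuracy:", ensemble.score(X_test, y_test))

The combined score can then be compared against each base classifier's individual test accuracy to check whether the combination actually improves on its constituents.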

3 Problem
- What type of cooperation between classifiers is the most effective?
- What criteria are important when designing a multiple classifier system?
- Which combination method is best for a specific problem?

4 Objectives
- Enhance understanding of combination methods and their applications.
- Obtain insights into designing and developing new architectures.
- Examine the usefulness and efficiency of our findings for document categorization.

5 Proposed Approach
- A thorough understanding of cooperation among the components of a multiple classifier system provides guidelines for optimizing the system.
- Different levels of sharing:
  - Training Level
  - Feature Level
  - Architecture Level
  - Decision Level

6 Proposed Approach (cont'd)
- Training Level
  - Sharing training patterns
  - Sharing the training algorithm
- Feature Level
  - Sharing features
- Architecture Level
  - Sharing information
- Decision Level
  - Sharing classifiers' decisions
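The slide only names the levels at which sharing can occur. As a hypothetical sketch of feature-level sharing (not the actual design from this work), two base classifiers could be trained on overlapping feature subsets and fused at the decision level; the subset boundaries, data, and fusion rule below are illustrative assumptions:

# Illustrative sketch of feature-level sharing: each base classifier sees an
# overlapping subset of the feature space. Subset choices are arbitrary here.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=12, random_state=1)

# Overlapping feature subsets (features 4-7 are shared by both classifiers).
subsets = [np.arange(0, 8), np.arange(4, 12)]
members = []
for cols in subsets:
    clf = LogisticRegression(max_iter=1000).fit(X[:, cols], y)
    members.append((cols, clf))

# Decision-level fusion of the two members by averaging class posteriors.
probs = np.mean([clf.predict_proba(X[:, cols]) for cols, clf in members], axis=0)
print("fused training accuracy:", np.mean(probs.argmax(axis=1) == y))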

7 Key Accomplishments: Training Level
- Training data: disjoint, overlapped, and identical
- Training data size: small, medium, and large
- Data dimensionality: small and large
- Type of data: large and small interclass distances
- Architectures: ensemble and modular
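For reference, the three training-data sharing schemes named on this slide (disjoint, overlapped, and identical) could be generated as index partitions along the following lines; the partition sizes, overlap fraction, and the helper name make_partitions are illustrative assumptions, not part of the original work:

# Sketch of the three training-data sharing schemes named on the slide:
# disjoint, overlapped, and identical partitions for N base classifiers.
import numpy as np

def make_partitions(n_samples, n_classifiers, scheme, overlap=0.5, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    if scheme == "identical":
        return [idx.copy() for _ in range(n_classifiers)]
    chunks = np.array_split(idx, n_classifiers)
    if scheme == "disjoint":
        return list(chunks)
    if scheme == "overlapped":
        # Each classifier keeps its own chunk plus a random fraction of the rest.
        parts = []
        for i, chunk in enumerate(chunks):
            rest = np.concatenate([c for j, c in enumerate(chunks) if j != i])
            extra = rng.choice(rest, size=int(overlap * len(chunk)), replace=False)
            parts.append(np.concatenate([chunk, extra]))
        return parts
    raise ValueError(scheme)

parts = make_partitions(1000, 4, "overlapped")
print([len(p) for p in parts])

Swapping the scheme argument between "disjoint", "overlapped", and "identical" reproduces the three data-sharing conditions listed at the training level.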

8 Research in Progress
- Sharing the training algorithm: architectures
- Sharing at the feature level: overlapped, identical, and disjoint feature sets
- Sharing at the architecture level: shared information
- Sharing at the decision level: classifiers' outputs
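As an illustration of decision-level sharing, two common fusion rules that operate only on the classifiers' outputs (posterior averaging and majority voting) are sketched below; the function names, weights, and toy output matrix are assumptions for illustration only:

# Illustrative decision-level fusion rules operating only on classifiers' outputs.
import numpy as np

def average_fusion(posteriors, weights=None):
    """posteriors: array of shape (n_classifiers, n_samples, n_classes)."""
    return np.average(posteriors, axis=0, weights=weights).argmax(axis=1)

def majority_vote(posteriors):
    votes = posteriors.argmax(axis=2)        # (n_classifiers, n_samples)
    n_classes = posteriors.shape[2]
    counts = np.apply_along_axis(np.bincount, 0, votes, minlength=n_classes)
    return counts.argmax(axis=0)             # winning class per sample

# Toy outputs from 3 classifiers on 2 samples over 2 classes.
p = np.array([[[0.9, 0.1], [0.4, 0.6]],
              [[0.6, 0.4], [0.3, 0.7]],
              [[0.2, 0.8], [0.7, 0.3]]])
print(average_fusion(p), majority_vote(p))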

9 Research in Progress (cont'd)
- The advantages of using multiple classifiers in document analysis have been recognized in recent years.
- Document data:
  - high dimensionality
  - large number of classes
  - large number of input patterns
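As a toy sketch of why document data is a natural fit for multiple classifiers, the snippet below builds high-dimensional TF-IDF features and combines two text classifiers by soft voting; the corpus, labels, and model choices are made-up placeholders rather than the experimental setup of this work:

# Toy document categorization with an ensemble over high-dimensional TF-IDF features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import VotingClassifier

docs = [
    "stock market shares rise", "bank reports quarterly earnings",
    "team wins championship game", "player scores winning goal",
]
labels = ["finance", "finance", "sports", "sports"]

vec = TfidfVectorizer()
X = vec.fit_transform(docs)                   # sparse, high-dimensional features

ensemble = VotingClassifier(
    estimators=[("nb", MultinomialNB()),
                ("logreg", LogisticRegression(max_iter=1000))],
    voting="soft",                            # combine the classifiers' posterior outputs
)
ensemble.fit(X, labels)
print(ensemble.predict(vec.transform(["goal scored in the final game"])))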

