Cooperative Classifiers Rozita Dara Supervisor: Prof. Kamel Pattern Analysis and Machine Intelligence Lab University of Waterloo.

Presentation transcript:

Cooperative Classifiers
Rozita Dara
Supervisor: Prof. Kamel
Pattern Analysis and Machine Intelligence Lab, University of Waterloo

Combining Classifiers
Goals:
 Improve performance over the constituent classifiers.
 Maximize information use.
 Obtain a reliable system.
Challenges:
 Intelligent combination that exploits complementary information.
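The benefit of complementary information can be illustrated with a minimal majority-voting sketch. This is not the system described in the slides, just a toy example with made-up classifier outputs: each constituent classifier errs on a different sample, so the vote corrects every individual mistake.

```python
from collections import Counter

def majority_vote(predictions):
    """Return the most common label among the classifiers'
    predictions for a single sample."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical hard outputs of three constituent classifiers on five samples.
clf_a = [1, 0, 0, 1, 0]   # one error (sample 3)
clf_b = [1, 1, 1, 0, 0]   # two errors (samples 2 and 4)
clf_c = [0, 0, 1, 1, 0]   # one error (sample 1)
truth = [1, 0, 1, 1, 0]

# Combine per-sample votes; the errors do not overlap, so the
# ensemble is perfect even though no single classifier is.
combined = [majority_vote(votes) for votes in zip(clf_a, clf_b, clf_c)]
accuracy = sum(c == t for c, t in zip(combined, truth)) / len(truth)
```

If the classifiers made correlated errors instead, the vote would inherit them, which is why the combination must be designed to exploit complementary information.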

Problem  What type of cooperation between classifiers is the most effective?  What important criteria should be considered when designing a multiple classifier system?  What combination method is the best for a specific problem?

Objectives  Enhance understanding of the combination methods and their applications.  Obtain insights into designing and developing new architectures.  Examine the usefulness and efficiency of our finding for document categorization.

Proposed Approach  A thorough understanding of cooperation among Multiple classifiers System components provides guidelines for optimization of the system.  Different levels of sharing  Training Level  Feature Level  Architecture Level  Decision Level

Proposed Approach (cont'd)
 Training Level
   Sharing training patterns
   Sharing the training algorithm
 Feature Level
   Sharing features
 Architecture Level
   Sharing information
 Decision Level
   Sharing the classifiers' decisions
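Decision-level sharing, the last item above, can be sketched in a few lines. This is a generic illustration (averaging of per-class posterior estimates, sometimes called the mean rule), with invented numbers, not the specific combination rule studied in this work.

```python
def average_combine(posteriors):
    """Decision-level sharing: average the per-class posterior
    estimates of several classifiers and pick the class with
    the highest averaged score."""
    n_clf = len(posteriors)
    n_classes = len(posteriors[0])
    avg = [sum(p[c] for p in posteriors) / n_clf for c in range(n_classes)]
    return max(range(n_classes), key=avg.__getitem__), avg

# Hypothetical soft outputs (class posteriors) of three classifiers
# for one two-class sample: two lean toward class 0, one strongly
# toward class 1, and the average tips the decision to class 1.
label, avg = average_combine([[0.6, 0.4], [0.3, 0.7], [0.55, 0.45]])
```

Averaging uses more of the classifiers' information than hard voting, since a confident classifier can outweigh two lukewarm ones.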

Key Accomplishments: Training Level
 Training data: disjoint, overlapped, and identical
 Training data size: small, medium, and large
 Data dimensionality: small and large
 Type of data: large and small interclass distances
 Architectures: ensemble and modular
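The three training-data sharing schemes compared above (disjoint, overlapped, identical) can be sketched as follows. The function name, the `overlap` fraction, and the sampling details are illustrative assumptions, not the exact protocol used in the experiments; it also assumes the samples are unique.

```python
import random

def make_partitions(data, n_learners, scheme, overlap=0.5, seed=0):
    """Split a training set among n_learners under three sharing schemes:
    'disjoint'   - each learner sees a distinct chunk,
    'overlapped' - each chunk is padded with a fraction `overlap`
                   of samples drawn from outside it,
    'identical'  - every learner trains on the full set."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    if scheme == "identical":
        return [shuffled[:] for _ in range(n_learners)]
    chunk = len(shuffled) // n_learners
    parts = [shuffled[i * chunk:(i + 1) * chunk] for i in range(n_learners)]
    if scheme == "overlapped":
        extra = int(chunk * overlap)
        # Pad each chunk with samples other learners also see.
        parts = [p + rng.sample([x for x in shuffled if x not in p], extra)
                 for p in parts]
    return parts
```

Disjoint partitions maximize diversity between the learners, identical partitions maximize each learner's individual accuracy, and overlapped partitions trade between the two.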

Research in Progress  Sharing training algorithm  architectures  Sharing at feature level  overlapped, identical, disjoint  Sharing at architecture level  share information  Sharing at decision level  classifiers ’ output

Research in Progress (cont'd)
 The advantages of using multiple classifiers in document analysis have been recognized in recent years.
 Document data:
   high dimensionality
   large number of classes
   large number of input patterns