On Utilizing LVQ3-Type Algorithms to Enhance Prototype Reduction Schemes Sang-Woon Kim and B. John Oommen* Myongji University, Carleton University*

Workshop on PRIS’ 2002 Outline of the Study Introduction Overview of the Prototype Reduction Schemes The Proposed Reduction Method Experiments & Discussions Conclusions

Workshop on PRIS’ 2002 Introduction (1) The Nearest Neighbor (NN) Classifier : A widely used classifier, which is simple and yet one of the most efficient classification rules in practice. However, its application often suffers from the computational complexity caused by the huge amount of information.

Workshop on PRIS’ 2002 Introduction (2) Solving strategies to the problem : Reducing the size of the design set without sacrificing the performance. Accelerating the speed of computation by eliminating the necessity of calculating many distances. Increasing the accuracy of the classifiers designed with limited samples.

Workshop on PRIS’ 2002 Motivation of the Study zIn NN classifications, prototypes near the boundary play more important roles. zThe prototypes need to be moved or adjusted towards the classification boundary. zThe proposed approach is based on this philosophy, namely that of creating and adjusting.

Workshop on PRIS’ 2002 Prototype Reduction Schemes - Conventional Approaches - The Condensed Nearest Neighbor (CNN) : The RNN, SNN, ENN, mCNN rules The Prototypes for Nearest Neighbor (PNN) classifiers The Vector Quantization (VQ) & Bootstrap (BT) techniques The Support Vector Machines (SVM)

Workshop on PRIS’ 2002 A Graphical Example (PNN)

Workshop on PRIS’ 2002 LVQ3 Algorithm An improved LVQ algorithm : Learning Parameters : Initial vectors Learning rates : Iteration numbers Training Set = Placement + Optimizing:

Workshop on PRIS’ 2002 Support Vector Machines (SVM) The SVM has a capability of extracting vectors which support the boundary between two classes, and they can satisfactorily represent the global distribution structure.

Workshop on PRIS’ 2002 Extension by Kernels

Workshop on PRIS’ 2002 The Proposed Method First, the CNN, PNN, VQ, SVM are employed to select initial prototype vectors. Next, an LVQ3-type learning is performed to adjust the prototypes: Perform the LVQ3 with Tip to select w Perform the LVQ3 with Tip to select e Repeat the above steps to obtain the best w* and e* Finally, determine the best prototypes by invoking the learning n times with Tip and Tio.

Workshop on PRIS’ 2002 Experiments The proposed method is tested with artificial and real benchmark design data sets, and compared with the conventional methods. The one-against-all NN classifier is designed. Benchmark data sets :

Workshop on PRIS’ 2002

Experimental Results (3)

Workshop on PRIS’ 2002 Experimental Results (4)

Workshop on PRIS’ 2002 Data Compression Rates

Workshop on PRIS’ 2002 Classification Error Rates (%) - Before Adjusting -

Workshop on PRIS’ 2002 Classification Error Rates (%) - After Adjusting with LVQ3 -

Workshop on PRIS’ 2002 Conclusions The method provides a principled way of choosing prototype vectors for designing NN classifiers. The performance of a classifier trained with the method is better than that of the CNN, PNN, VQ, and SVM classifier. The future work is to expand this study into large data set problems such as data mining and text categorization.