“Study on Parallel SVM Based on MapReduce” Kuei-Ti Lu 03/12/2015

Support Vector Machine (SVM)
Used for:
– Classification
– Regression
Applied in:
– Network intrusion detection
– Image processing
– Text classification
– …

libSVM
A library for support vector machines; it integrates different types of SVMs.

Types of SVMs Supported by libSVM
For support vector classification:
– C-SVC
– Nu-SVC
For support vector regression:
– Epsilon-SVR
– Nu-SVR
For distribution estimation:
– One-class SVM
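For orientation, a small usage sketch assuming libSVM's Python interface (the svmutil module; the exact import path depends on how the bindings are installed). The -s option selects the SVM type listed above; the toy data and parameter values are illustrative, not from the paper.

from libsvm.svmutil import svm_train, svm_predict

# Toy training set: two positive and two negative points.
y = [1, 1, -1, -1]
x = [[0.9, 0.8], [0.7, 0.9], [0.1, 0.2], [0.2, 0.1]]

# -s selects the SVM type: 0 = C-SVC, 1 = nu-SVC, 2 = one-class SVM,
# 3 = epsilon-SVR, 4 = nu-SVR.  -t 2 selects the RBF kernel, -c sets C.
model = svm_train(y, x, '-s 0 -t 2 -c 1')
labels, accuracy, values = svm_predict(y, x, model)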

C-SVC
Goal: find the separating hyperplane that maximizes the margin.
Support vectors: the data points closest to the separating hyperplane.

C-SVC
– Primal form
– Dual form (derived using Lagrange multipliers)
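The equations on this slide are not reproduced in the transcript; for reference, a standard statement of the C-SVC primal and dual problems (following the usual LIBSVM formulation) is:

\begin{aligned}
\textbf{Primal:}\quad & \min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \ \tfrac{1}{2}\lVert \mathbf{w} \rVert^{2} + C \sum_{i=1}^{n} \xi_i \\
& \text{s.t. } y_i\left(\mathbf{w}^{\top} \phi(\mathbf{x}_i) + b\right) \ge 1 - \xi_i, \quad \xi_i \ge 0, \quad i = 1, \dots, n \\[4pt]
\textbf{Dual:}\quad & \max_{\boldsymbol{\alpha}} \ \sum_{i=1}^{n} \alpha_i - \tfrac{1}{2} \sum_{i,j} \alpha_i \alpha_j\, y_i y_j\, K(\mathbf{x}_i, \mathbf{x}_j) \\
& \text{s.t. } 0 \le \alpha_i \le C, \quad \sum_{i=1}^{n} \alpha_i y_i = 0
\end{aligned}

Here $\phi$ is the feature map, $K(\mathbf{x}_i, \mathbf{x}_j) = \phi(\mathbf{x}_i)^{\top}\phi(\mathbf{x}_j)$ is the kernel, and the $\alpha_i$ are the Lagrange multipliers; training points with $\alpha_i > 0$ are the support vectors.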

Speedup
Computation and storage requirements increase rapidly as the number of training vectors (also called training samples or training points) increases.
Efficient algorithms and implementations are needed to apply SVMs to large-scale data mining => parallel SVM.

Parallel SVM Methods
Message Passing Interface (MPI)
– Efficient for computation-intensive problems, e.g. simulation
MapReduce
– Can be used for data-intensive problems
…

Other Speedup Techniques
Chunking: optimize subsets of the training data iteratively until the global optimum is reached
– Ex. Sequential Minimal Optimization (SMO), which uses a chunk size of 2 vectors
Eliminate non-support vectors early

This Paper’s Approach
1. Partition and distribute the data to the nodes.
2. Map class: train each subSVM to find the support vectors for its subset of the data.
3. Reduce class: combine the support vectors of each pair of subSVMs.
4. If more than 1 subSVM remains, go to step 2.
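Below is a minimal single-process sketch of this cascade, with scikit-learn's SVC standing in for LIBSVM and ordinary Python functions standing in for Twister's map and reduce tasks; the function names, kernel, and C value are illustrative assumptions, not the paper's actual code.

import numpy as np
from sklearn.svm import SVC

def map_train(X, y):
    # Map task: train one subSVM on a partition and keep only its support vectors.
    clf = SVC(kernel="rbf", C=1.0).fit(X, y)
    return X[clf.support_], y[clf.support_]

def reduce_merge(part_a, part_b):
    # Reduce task: merge the support vectors of two subSVMs into one training subset.
    (Xa, ya), (Xb, yb) = part_a, part_b
    return np.vstack([Xa, Xb]), np.concatenate([ya, yb])

def cascade_svm(partitions):
    # Iterate map/reduce layers until a single training subset remains (step 4 above).
    while len(partitions) > 1:
        trained = [map_train(X, y) for X, y in partitions]
        partitions = [reduce_merge(trained[i], trained[i + 1])
                      for i in range(0, len(trained) - 1, 2)]
        if len(trained) % 2 == 1:          # an odd partition is carried to the next layer
            partitions.append(trained[-1])
    X, y = partitions[0]
    return SVC(kernel="rbf", C=1.0).fit(X, y)   # final SVM on the merged support vectors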

Twister
Supports iterative MapReduce.
More efficient than Hadoop or Dryad/DryadLINQ for iterative MapReduce workloads.

Computation Complexity

Evaluations
Number of nodes
Training time
Accuracy = (# correctly predicted data / # total testing data) × 100%

Adult Data Analysis
Binary classification.
The correlation between each attribute variable X and the class variable Y is used to select attributes.
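A minimal sketch of correlation-based attribute selection, assuming a simple Pearson-correlation threshold; the paper does not spell out its exact selection rule here, so the threshold value and function name are illustrative.

import numpy as np

def select_attributes(X, y, threshold=0.1):
    # Keep attributes whose absolute Pearson correlation with the class variable
    # exceeds the threshold (the threshold value is illustrative).
    corr = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    keep = np.abs(corr) > threshold
    return X[:, keep], keep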

Adult Data Analysis
The computation cost is concentrated in training; the data transfer cost is minor.
The last-layer computation time depends on α and β rather than on the number of nodes (only 1 node is used in the last layer).
Feature selection reduces computation greatly but does not reduce accuracy very much.

Forest Cover Type Classification
Multiclass classification
– Use k(k − 1)/2 binary SVMs as a k-class SVM
– One binary SVM for each pair of classes
– Use maximum voting to determine the class
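A minimal sketch of this one-vs-one scheme, again with scikit-learn's SVC as a stand-in binary SVM; the kernel and helper names are assumptions, not taken from the paper.

import numpy as np
from itertools import combinations
from collections import Counter
from sklearn.svm import SVC

def train_one_vs_one(X, y):
    # Train k(k-1)/2 binary SVMs, one for each pair of classes.
    models = {}
    for a, b in combinations(np.unique(y), 2):
        mask = np.isin(y, [a, b])
        models[(a, b)] = SVC(kernel="rbf").fit(X[mask], y[mask])
    return models

def predict_one_vs_one(models, x):
    # Every pairwise SVM votes for one class; the class with the most votes wins
    # (ties are broken arbitrarily in this sketch).
    votes = Counter(clf.predict(x.reshape(1, -1))[0] for clf in models.values())
    return votes.most_common(1)[0][0]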

Forest Cover Type Classification
The correlation between each attribute variable X and the class variable Y is used to select attributes.
Attribute variables are normalized to [0, 1].
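A minimal sketch of the [0, 1] normalization mentioned above (per-attribute min-max scaling); the guard against constant columns is an added assumption.

import numpy as np

def normalize_01(X):
    # Rescale every attribute (column) of X to the range [0, 1].
    mins = X.min(axis=0)
    ranges = X.max(axis=0) - mins
    ranges[ranges == 0] = 1.0   # guard against constant attributes
    return (X - mins) / ranges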

Forest Cover Type Classification
The last-layer computation time depends on α and β rather than on the number of nodes (only 1 node is used in the last layer).
Feature selection reduces computation greatly but does not reduce accuracy very much.

Heart Disease Classification
Binary classification.
The data are replicated different numbers of times to compare results for different sample sizes.

Heart Disease Classification
When the sample size is too large, it cannot be processed on 1 node because of the memory constraint.
Training time decreases little once the number of nodes exceeds 8.

Conclusion
Classical SVM training is impractical for large-scale data; a parallel SVM is needed.
This paper proposes a model based on iterative MapReduce and shows that the model is efficient for data-intensive problems.

References
[1] Z. Sun and G. Fox, “Study on Parallel SVM Based on MapReduce,” in Proc. PDPTA, Las Vegas, NV, 2012.
[2] C. Lin et al., “Anomaly Detection Using LibSVM Training Tools,” in Proc. ISA, Busan, Korea, 2008.

Q & A