Machine Learning Using Support Vector Machines (Paper Review) Presented to: Prof. Dr. Mohamed Batouche Prepared By: Asma B. Al-Saleh Amani A. Al-Ajlan.

1 Machine Learning Using Support Vector Machines (Paper Review) Presented to: Prof. Dr. Mohamed Batouche Prepared By: Asma B. Al-Saleh (427220094), Amani A. Al-Ajlan (427220111) King Saud University, The College of Computer & Information Science, Computer Science Department (Master) Neural Networks and Machine Learning Applications (CSC 563), Spring 2008

2 Paper Information  Title: Machine Learning Using Support Vector Machines.  Authors: Abdul Rahim Ahmad. Marzuki Khalid. Rubiyah Yusof.  Publisher: MSTC 2002, Johor Bahru.  Date: September 2002.

3 Review Outlines  Introduction  Artificial Neural Network (ANN)  Support Vector Machine (SVM) Support Vectors Theory of SVM Quadratic Programming Non-linear SVM SVM Implementations SVM for Multi-class Classification  Handwriting Recognition Experimental Results  ANN vs. SVM  Conclusion

4 Introduction

5  The aims of this paper are to: Present SVM as an alternative to ANN. Explain the concept of SVM by providing some of its details.

6 Machine Learning

7 Machine Learning (ML)  Constructing computer programs that automatically improve their performance with experience.

8 Machine Learning (ML) Applications 1. Data mining programs. 2. Information filtering systems. 3. Autonomous vehicles. 4. Pattern recognition systems: speech recognition, handwriting recognition, face recognition, text categorization.

9 Artificial Neural Network

10 Artificial Neural Network (ANN)  Massively parallel computing systems consisting of an extremely large number of simple processors with many interconnections.

11 Artificial Neural Network (ANN)  The main characteristics of ANN are: 1. The ability to learn complex nonlinear input-output relationships. 2. The use of sequential training procedures to update (adapt) the network architecture and connection weights so that the network can work efficiently.

12 Artificial Neural Network (ANN)  In the area of pattern classification, the feed-forward network is the most popularly used. Pattern Classification: Multilayer Perceptron (MLP), Radial Basis Function (RBF) networks. Data Clustering: Kohonen Self-Organizing Map (SOM).

13 ANN and Pattern Recognition  ANN has low dependence on domain-specific knowledge compared to rule-based approaches.  Efficient learning algorithms are available to use.

14 Support Vector Machine

15 Support Vector Machine (SVM)  SVM was introduced in 1992 by Vapnik and his coworkers.  SVM in its original form is a binary classifier: it separates two classes and is designed for linear and separable data sets.  SVM is used for both classification and regression.
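The binary, linearly separable case described above can be sketched in a few lines (an illustrative example, not from the paper; it assumes scikit-learn is available, and the toy data points are invented for the demonstration):

```python
# Minimal sketch of a binary linear SVM on a toy, linearly separable
# 2-D data set: two well-separated clusters, labels -1 and +1.
from sklearn.svm import SVC

X = [[0, 0], [1, 1], [1, 0], [0, 1],   # class -1 cluster
     [4, 4], [5, 5], [4, 5], [5, 4]]   # class +1 cluster
y = [-1, -1, -1, -1, 1, 1, 1, 1]

clf = SVC(kernel="linear")  # linear kernel: the original SVM form
clf.fit(X, y)

# Classify one unseen point near each cluster.
print(clf.predict([[0.5, 0.5], [4.5, 4.5]]))
```

A point near the first cluster is assigned label -1 and a point near the second +1, since the learned hyperplane separates the two clusters with maximum margin.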

16 Support Vector Machine – (SVM)

17 Theory of SVM Constraints: 1. No data points lie between the hyperplanes H1 and H2. 2. The margin between H1 and H2 is maximized.

18 Support Vectors  The solution is expressed as a linear combination of the support vectors: the subset of training patterns that lie close to the decision boundary.

19 Theory of SVM  training data: {,, ……, } Where: SVM Class or label Input features

20 Theory of SVM  Class 1 (yi = +1): w · xi + b ≥ +1.  Class 2 (yi = −1): w · xi + b ≤ −1.

21 Theory of SVM  Learn a linear separating hyperplane classifier: f(x) = sign(w · x + b).
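The linear hyperplane classifier above, f(x) = sign(w · x + b), can be written out directly (a sketch: the values of w and b below are illustrative, not learned from data):

```python
# The linear decision rule f(x) = sign(w . x + b).
import numpy as np

def classify(x, w, b):
    # Returns +1 on one side of the hyperplane w . x + b = 0, -1 on the other.
    return 1 if np.dot(w, x) + b >= 0 else -1

w = np.array([1.0, 1.0])
b = -3.0                      # hyperplane: x1 + x2 = 3

print(classify(np.array([0.0, 0.0]), w, b))   # point below the plane -> -1
print(classify(np.array([4.0, 4.0]), w, b))   # point above the plane -> +1
```

Training an SVM amounts to choosing the particular w and b that maximize the margin between the two classes.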

22 Quadratic Programming  To maximize the margin, we need to minimize: ½ ‖w‖², subject to yi (w · xi + b) ≥ 1.  This quadratic programming problem is solved by introducing Lagrange multipliers.

23 Lagrange Multipliers  Maximize L(α) = Σi αi − ½ Σi Σj αi αj yi yj (xi · xj), where w and b have been eliminated, subject to: αi ≥ 0 and Σi αi yi = 0.
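One of the Lagrange-multiplier constraints, Σi αi yi = 0, can be checked numerically on a trained SVM (a sketch assuming scikit-learn, whose `dual_coef_` attribute holds the products αi yi for the support vectors):

```python
# Verify the dual constraint sum_i alpha_i * y_i = 0 on a trained
# linear SVM.  scikit-learn exposes alpha_i * y_i for the support
# vectors as clf.dual_coef_; their sum should be (numerically) zero.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [1.0, 1.0], [3.0, 3.0], [4.0, 4.0]])
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear").fit(X, y)
s = clf.dual_coef_.sum()          # = sum over support vectors of alpha_i * y_i
print(abs(s) < 1e-6)
```

The SMO-style solvers used in practice maintain this linear constraint throughout training, so the sum is zero up to floating-point error.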

24 Theory of SVM  Discriminant function: f(x) = sign(Σi αi yi (xi · x) + b).

25 Non-linear SVM 1. SVM maps the data sets of the input space into a higher-dimensional feature space. 2. The linear large-margin learning algorithm is then applied there.

26 Non-linear SVM

27  If the mapping function is φ, we just solve the same problem with φ(xi) in place of xi.  However, the mapping can be done implicitly through kernel functions: K(xi, xj) = φ(xi) · φ(xj).

28 Non-linear SVM  Discriminant function: f(x) = sign(Σi αi yi K(xi, x) + b).

29 Kernel  There are many kernels that can be used this way.  Any kernel that satisfies Mercer’s condition can be used.

30 Kernel - Examples  Polynomial kernels: K(x, y) = (x · y + 1)^d.  Hyperbolic tangent: K(x, y) = tanh(κ (x · y) − δ).  Radial basis function (Gaussian kernel): K(x, y) = exp(−‖x − y‖² / 2σ²).
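The three example kernels above can be written out directly in NumPy (a sketch; the parameter values d, κ, δ, and σ below are illustrative assumptions, not values from the paper):

```python
# The three standard kernels listed above, as plain NumPy functions.
import numpy as np

def poly_kernel(x, y, d=2):
    # Polynomial kernel: (x . y + 1)^d
    return (np.dot(x, y) + 1) ** d

def tanh_kernel(x, y, kappa=1.0, delta=0.0):
    # Hyperbolic tangent kernel: tanh(kappa * (x . y) - delta)
    return np.tanh(kappa * np.dot(x, y) - delta)

def rbf_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel: exp(-||x - y||^2 / (2 sigma^2))
    return np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))

x = np.array([1.0, 2.0])
print(rbf_kernel(x, x))   # a Gaussian kernel of a point with itself is 1.0
```

Note the RBF kernel always equals 1 when its two arguments coincide, which is one quick sanity check on an implementation.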

31 Non-separable Input Space  In real-world problems, there is always noise.  Noise → non-separable data.  A slack variable ξi is introduced for each input: yi (w · xi + b) ≥ 1 − ξi, with ξi ≥ 0.  The penalty parameter C controls overfitting.

32 Non-separable Input Space

33 SVM for Multi-class Classification  Basic SVM is a binary classifier; it separates two classes.  In the real world, more than two classes are usually needed, e.g. handwriting recognition.

34 SVM for Multi-class Classification Methods: 1. Modifying the binary formulation to incorporate multi-class learning. 2. Combining binary classifiers: One vs. One (K (K − 1) / 2 classifiers) or One vs. All (K classifiers).

35 SVM for Multi-class Classification  One vs. One and DAGSVM (Directed Acyclic Graph) are the best choices for practical use.  They are: less complex, easier to construct, and faster to train. (Tapia et al., 2005)
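The classifier counts for the two combination schemes follow directly from the formulas above (a small sketch; the digit-recognition figure assumes K = 10 classes):

```python
# How many binary SVMs each combination scheme needs for K classes.

def one_vs_one_count(k):
    # One binary classifier per unordered pair of classes: K(K-1)/2.
    return k * (k - 1) // 2

def one_vs_all_count(k):
    # One binary classifier per class (that class vs. the rest): K.
    return k

# Handwritten digit recognition: K = 10 classes (digits 0-9).
print(one_vs_one_count(10), one_vs_all_count(10))  # 45 10
```

One vs. One trains more classifiers, but each is trained on only two classes' worth of data, which is part of why it remains practical.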

36 SVM Implementation  SVM training requires quadratic programming (QP), which is computationally intensive.  However, many decomposition methods have been proposed that avoid the full QP and make SVM learning practical for many current problems.  Ex: Sequential Minimal Optimization (SMO).

37 Results of Experimental Studies

38 Data  Handwritten digit databases: MNIST dataset. USPS dataset (more difficult; its human recognition error rate is as high as 2.5%).

39 Error rate comparison of ANN, SVM and other algorithms for MNIST and USPS database.

40 Results of Experimental Studies 1. The SVM error rate is significantly lower than that of most other algorithms, except the LeNet 5 NN. 2. Training time for SVM was significantly longer, but the higher recognition rate (lower error rate) justifies its usage. 3. SVM usage should increase and replace ANN in the area of handwriting recognition, where faster methods of implementing SVM have been introduced recently.

41 SVM vs. ANN  SVM: a multi-class implementation needs to be performed; does not overfit data (Structural Risk Minimization); finds a global minimum.  ANN: naturally handles multi-class classification; known to overfit data unless cross-validation is applied; may get stuck in a local minimum.

42 Conclusion  SVM is powerful and a useful alternative to neural networks.  SVM finds a global, unique solution.  Two key concepts of SVM: maximizing the margin and the choice of kernel.  Performance depends on the choice of kernel and parameters; this is still a research topic.  Training is memory-intensive due to QP.

43 Conclusion  Much active research is taking place in areas related to SVM.  Many SVM implementations are available on the Web:  SVMLight  LIBSVM

44 Thank you ….. Questions?

