
1 Proximal Support Vector Machine Classifiers KDD 2001 San Francisco August 26-29, 2001 Glenn Fung & Olvi Mangasarian Data Mining Institute University of Wisconsin - Madison

2 Support Vector Machines: Maximizing the Margin between Bounding Planes
[Figure: the two classes A+ and A− separated by two parallel bounding planes, with the margin between the planes maximized.]

3 Proximal Support Vector Machines: Fitting the Data Using Two Parallel Bounding Planes
[Figure: the two classes A+ and A− clustered around two parallel proximal planes that the points fit closely.]

4 Standard Support Vector Machine Formulation
Solve the quadratic program for some $\nu > 0$:
$$(\text{QP})\quad \min_{w,\gamma,y}\ \nu e'y + \tfrac{1}{2}w'w \quad \text{s.t.}\quad D(Aw - e\gamma) + y \ge e,\ \ y \ge 0,$$
where $D$ is a diagonal matrix with entries $+1$ or $-1$ denoting membership in $A+$ or $A-$. The margin between the bounding planes is maximized by minimizing $\tfrac{1}{2}w'w$.

5 PSVM Formulation
We have from the QP SVM formulation, with the slack $y$ made quadratic, $\gamma^2$ added to the objective, and the inequality constraint replaced by an equality:
$$(\text{QP})\quad \min_{w,\gamma,y}\ \tfrac{\nu}{2}\|y\|^2 + \tfrac{1}{2}(w'w + \gamma^2) \quad \text{s.t.}\quad D(Aw - e\gamma) + y = e.$$
This simple but critical modification changes the nature of the optimization problem tremendously! Solving for $y = e - D(Aw - e\gamma)$ in terms of $w$ and $\gamma$ gives the unconstrained problem:
$$\min_{w,\gamma}\ \tfrac{\nu}{2}\|e - D(Aw - e\gamma)\|^2 + \tfrac{1}{2}(w'w + \gamma^2).$$

6 Advantages of New Formulation
- Objective function remains strongly convex.
- An explicit exact solution can be written in terms of the problem data.
- PSVM classifier is obtained by solving a single system of linear equations in the usually small dimensional input space.
- Exact leave-one-out correctness can be obtained in terms of the problem data.

7 Linear PSVM
We want to solve:
$$\min_{w,\gamma}\ \tfrac{\nu}{2}\|e - D(Aw - e\gamma)\|^2 + \tfrac{1}{2}(w'w + \gamma^2).$$
- Setting the gradient equal to zero gives a nonsingular system of linear equations.
- Solution of the system gives the desired PSVM classifier.

8 Linear PSVM Solution
$$\begin{bmatrix} w \\ \gamma \end{bmatrix} = \left(\tfrac{I}{\nu} + E'E\right)^{-1} E'De, \quad \text{where } E = [A\ \ {-e}].$$
- The linear system to solve depends on $E'E$, which is of size $(n+1)\times(n+1)$.
- $n$, the number of input features, is usually much smaller than $m$, the number of data points.

9 Linear Proximal SVM Algorithm
Input: $A$, $D$, $\nu$.
Define: $E = [A\ \ {-e}]$.
Solve: $\left(\tfrac{I}{\nu} + E'E\right)\begin{bmatrix} w \\ \gamma \end{bmatrix} = E'De$.
Calculate: $w$, $\gamma$.
Classifier: $\operatorname{sign}(x'w - \gamma)$.
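The linear PSVM algorithm above can be sketched in a few lines of NumPy. This is a minimal illustrative implementation, not the authors' code; the function names and the representation of $D$ as a label vector `d` are my own choices.

```python
import numpy as np

def linear_psvm(A, d, nu=1.0):
    """Linear PSVM: solve (I/nu + E'E) [w; gamma] = E'De.

    A  : (m, n) data matrix, one point per row
    d  : (m,) vector of +1/-1 labels (the diagonal of D, so De = d)
    nu : regularization parameter nu > 0
    """
    m, n = A.shape
    E = np.hstack([A, -np.ones((m, 1))])          # E = [A  -e]
    M = np.eye(n + 1) / nu + E.T @ E              # (n+1) x (n+1) system matrix
    z = np.linalg.solve(M, E.T @ d)               # z = [w; gamma]
    return z[:-1], z[-1]

def linear_classify(X, w, gamma):
    """Classifier: sign(x'w - gamma) for each row x of X."""
    return np.sign(X @ w - gamma)
```

Note that the system solved is only $(n+1)\times(n+1)$, so the cost is dominated by forming $E'E$, which is linear in the number of data points $m$.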

10 Nonlinear PSVM Formulation
Linear PSVM (linear separating surface $x'w = \gamma$):
$$(\text{QP})\quad \min_{w,\gamma,y}\ \tfrac{\nu}{2}\|y\|^2 + \tfrac{1}{2}(w'w + \gamma^2) \quad \text{s.t.}\quad D(Aw - e\gamma) + y = e.$$
By QP "duality", $w = A'Du$. Maximizing the margin in the "dual space" gives:
$$\min_{u,\gamma}\ \tfrac{\nu}{2}\|e - D(AA'Du - e\gamma)\|^2 + \tfrac{1}{2}(u'u + \gamma^2).$$
Replace $AA'$ by a nonlinear kernel $K(A, A')$:
$$\min_{u,\gamma}\ \tfrac{\nu}{2}\|e - D(K(A,A')Du - e\gamma)\|^2 + \tfrac{1}{2}(u'u + \gamma^2).$$

11 The Nonlinear Classifier
The nonlinear classifier:
$$\operatorname{sign}\bigl(K(x', A')Du - \gamma\bigr),$$
where $K$ is a nonlinear kernel, e.g. the Gaussian (radial basis) kernel:
$$(K(A, A'))_{ij} = \exp\bigl(-\mu\|A_i - A_j\|^2\bigr).$$
The $ij$-entry of $K(A, A')$ represents the "similarity" of data points $A_i$ and $A_j$.
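The Gaussian kernel matrix above can be computed without explicit loops via the expansion $\|a - b\|^2 = \|a\|^2 + \|b\|^2 - 2a'b$. A minimal sketch, assuming NumPy (the parameter name `mu` follows the slide's kernel width $\mu$):

```python
import numpy as np

def gaussian_kernel(A, B, mu=1.0):
    """Gaussian kernel matrix: K_ij = exp(-mu * ||A_i - B_j||^2).

    A : (m, n) matrix, rows are data points
    B : (k, n) matrix, rows are data points
    """
    # Squared pairwise distances via ||a||^2 + ||b||^2 - 2 a'b
    sq = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * (A @ B.T))
    return np.exp(-mu * np.maximum(sq, 0.0))  # clamp tiny negatives from roundoff
```

Each diagonal entry of `gaussian_kernel(A, A)` equals 1 (a point is maximally similar to itself), and entries decay toward 0 as points move apart.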

12 Nonlinear PSVM
Defining $E$ slightly differently, $E = [K(A,A')D\ \ {-e}]$, and setting the gradient equal to zero as in the linear case, we obtain:
$$\begin{bmatrix} u \\ \gamma \end{bmatrix} = \left(\tfrac{I}{\nu} + E'E\right)^{-1} E'De.$$
- Here, the linear system to solve is of size $(m+1)\times(m+1)$, which can be large.
- However, reduced kernel techniques (RSVM) can be used to reduce the dimensionality.

13 Nonlinear Proximal SVM Algorithm
Input: $A$, $D$, $\nu$, kernel $K$.
Define: $E = [K(A,A')D\ \ {-e}]$.
Solve: $\left(\tfrac{I}{\nu} + E'E\right)\begin{bmatrix} u \\ \gamma \end{bmatrix} = E'De$.
Calculate: $u$, $\gamma$.
Classifier: $\operatorname{sign}\bigl(K(x', A')Du - \gamma\bigr)$.
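The nonlinear algorithm differs from the linear one only in the definition of $E$ and in the classifier. A hypothetical NumPy sketch (again not the authors' code; labels are passed as a $\pm 1$ vector `d` standing in for the diagonal of $D$, and a Gaussian kernel is assumed):

```python
import numpy as np

def gaussian_kernel(A, B, mu=1.0):
    """K_ij = exp(-mu * ||A_i - B_j||^2)."""
    sq = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * (A @ B.T))
    return np.exp(-mu * np.maximum(sq, 0.0))

def nonlinear_psvm(A, d, nu=1.0, mu=1.0):
    """Nonlinear PSVM: solve (I/nu + E'E)[u; gamma] = E'De with E = [K D  -e]."""
    m = A.shape[0]
    K = gaussian_kernel(A, A, mu)
    E = np.hstack([K * d[None, :], -np.ones((m, 1))])   # K @ diag(d), then -e
    M = np.eye(m + 1) / nu + E.T @ E                    # (m+1) x (m+1) system
    z = np.linalg.solve(M, E.T @ d)                     # z = [u; gamma]
    return z[:-1], z[-1]

def nonlinear_classify(X, A_train, d, u, gamma, mu=1.0):
    """Classifier: sign(K(x', A') D u - gamma)."""
    return np.sign(gaussian_kernel(X, A_train, mu) @ (d * u) - gamma)
```

Unlike the linear case, the system here is $(m+1)\times(m+1)$, which is why the slide points to reduced kernel (RSVM) techniques for large $m$.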
