
1 Lecture 14

2 Outline: Support Vector Machine
1. Overview of SVM
2. Problem setting of linear separators
3. Soft Margin Method
4. Lagrange Multiplier Method to find solutions

3 1. Support Vector Machines (SVM)
- Invented by Vladimir Vapnik and co-workers
- Introduced at the Computational Learning Theory (COLT) 1992 conference
- Derived from statistical learning theory

4 Support Vector Machines (SVM)
- Empirically good performance: successful applications in many fields (bioinformatics, text, image recognition, ...)
- Quite popular, though now largely superseded by deep neural networks

5 Support Vector Machines (SVM)
- Linear classification: use hyperplanes to separate two classes
- Based on the idea of the maximum margin, determined by the "support" vectors

6 1. Support Vector Machines If the two classes can be separated perfectly by a line in the input space x, how do we choose the "best" line?

7 Support Vector Machines
[Slides 7-10 are figure-only: candidate separating lines for the two classes; no text content.]

11 One solution is to choose the line (hyperplane) with the largest margin. The margin is the distance between the two parallel lines on either side.
[Figure: two candidate hyperplanes B1 and B2, each with parallel margin boundaries (b11, b12 and b21, b22) and the margin between them.]
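
As a quick numerical illustration of the margin (a minimal sketch in Python; the hyperplane and point below are hypothetical, not from the slides):

```python
import numpy as np

# Hypothetical separating hyperplane w.x + b = 0 (illustration only).
w = np.array([2.0, 1.0])
b = -3.0

# Margin = distance between the parallel planes w.x + b = +1 and w.x + b = -1.
margin = 2.0 / np.linalg.norm(w)
print(f"margin = {margin:.4f}")  # 2 / sqrt(5) ~ 0.8944

# Signed distance of a point from the separator; its sign gives the class side.
x = np.array([1.0, 2.0])
dist = (w @ x + b) / np.linalg.norm(w)
print(f"signed distance of {x} = {dist:.4f}")
```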

12 2. Optimization Problem setting

13 This can be formulated as a constrained optimization problem:
- We want to maximize the margin $2 / \lVert \mathbf{w} \rVert$.
- This is equivalent to minimizing $\frac{1}{2} \lVert \mathbf{w} \rVert^2$.
- We have the constraints $y_i(\mathbf{w} \cdot \mathbf{x}_i + b) \ge 1$ for all training examples $(\mathbf{x}_i, y_i)$, with $y_i \in \{-1, +1\}$.
- So we have a quadratic objective function with linear constraints, which means it is a convex optimization problem and we can use Lagrange multipliers.
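
For reference, carrying out the Lagrange multiplier step named here yields the standard dual problem (a textbook derivation; the slide itself only names the method):

$$ L(\mathbf{w}, b, \boldsymbol{\alpha}) = \frac{1}{2}\lVert\mathbf{w}\rVert^2 - \sum_i \alpha_i \big[ y_i(\mathbf{w}\cdot\mathbf{x}_i + b) - 1 \big], \qquad \alpha_i \ge 0. $$

Setting $\partial L / \partial \mathbf{w} = 0$ and $\partial L / \partial b = 0$ gives $\mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i$ and $\sum_i \alpha_i y_i = 0$; substituting back leaves the dual:

$$ \max_{\boldsymbol{\alpha}} \; \sum_i \alpha_i - \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j \, \mathbf{x}_i \cdot \mathbf{x}_j \quad \text{s.t.} \quad \alpha_i \ge 0, \;\; \sum_i \alpha_i y_i = 0. $$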

14 2. Linear SVM
- Maximum margin becomes a constrained optimization problem
- Quadratic programming optimization problem
- Can apply Lagrange multipliers

15 Read Example 5.5 on page 264.

16 3. Soft Margin for Linear SVM What to do when complete linear separation is impossible?

17 3. Linear SVMs: Soft Margin method. Corinna Cortes and Vladimir Vapnik proposed (1995) a modification that allows for mislabeled examples using "slack variables".

18 What if the problem is not linearly separable? Then we can introduce slack variables $\xi_i$:
- Minimize $\frac{1}{2} \lVert \mathbf{w} \rVert^2 + C \sum_i \xi_i$
- Subject to $y_i(\mathbf{w} \cdot \mathbf{x}_i + b) \ge 1 - \xi_i$ and $\xi_i \ge 0$ for all $i$
- How to penalize mistakes? If the data are not separable, the slack term acts as a penalty: $\sum_i \xi_i$ upper-bounds the number of training mistakes
- Choose $C$ based on cross-validation
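
A minimal sketch of the soft margin in practice, including choosing $C$ by cross-validation (scikit-learn and the toy dataset are assumptions for illustration; the slide does not name a library):

```python
# Soft-margin linear SVM with C chosen by cross-validation.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Toy two-class data that is not perfectly linearly separable.
X, y = make_blobs(n_samples=100, centers=2, cluster_std=2.0, random_state=0)

# Larger C -> heavier penalty on slack (fewer margin violations allowed);
# smaller C -> wider margin, more tolerance for mistakes.
grid = GridSearchCV(
    SVC(kernel="linear"),
    param_grid={"C": [0.01, 0.1, 1.0, 10.0, 100.0]},
    cv=5,
)
grid.fit(X, y)
print("best C:", grid.best_params_["C"])
print("cv accuracy:", grid.best_score_)
```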

19 4. Use a quadratic solver to find the solution.
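
As an illustration of handing the dual from slide 13 to an off-the-shelf QP solver (a minimal sketch; cvxopt and the toy data are assumptions, not part of the lecture):

```python
# Solving the hard-margin SVM dual with a generic QP solver (cvxopt).
import numpy as np
from cvxopt import matrix, solvers

# Tiny linearly separable toy set; labels must be -1/+1.
X = np.array([[1.0, 1.0], [2.0, 2.0], [2.0, 0.0],
              [0.0, 0.0], [-1.0, 0.0], [0.0, -1.0]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
n = len(y)

# Dual: minimize 1/2 a^T P a - sum(a), with P_ij = y_i y_j x_i . x_j,
# subject to a_i >= 0 and sum(a_i y_i) = 0.
P = matrix(np.outer(y, y) * (X @ X.T))
q = matrix(-np.ones(n))
G = matrix(-np.eye(n))        # -a_i <= 0  <=>  a_i >= 0
h = matrix(np.zeros(n))
A = matrix(y.reshape(1, -1))
b = matrix(0.0)

solvers.options["show_progress"] = False
sol = solvers.qp(P, q, G, h, A, b)
alpha = np.ravel(sol["x"])

# Recover w = sum_i alpha_i y_i x_i and b from the support vectors.
w = (alpha * y) @ X
sv = alpha > 1e-6
b_off = np.mean(y[sv] - X[sv] @ w)
print("w =", w, "b =", b_off, "margin =", 2 / np.linalg.norm(w))
```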

20 Online lessons for the Lagrange and simplex methods and optimization: https://modelsim.wordpress.com/modules/optimization/ (Mathematical Modeling and Simulation, Module 2, Lessons 2-6).

21 Exercise in Geometry: Prove that the distance between the two parallel planes $\mathbf{w} \cdot \mathbf{x} + b = 1$ and $\mathbf{w} \cdot \mathbf{x} + b = -1$ is $2 / \lVert \mathbf{w} \rVert$. Hint: pick two points P1 and P2, one on each plane, and project the vector P1P2 onto the normal n; the distance is the length of that projection.
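
A sketch of the projection argument suggested by the hint (standard geometry, filled in for reference):

$$ \mathbf{w} \cdot P_1 + b = 1, \quad \mathbf{w} \cdot P_2 + b = -1 \;\Rightarrow\; \mathbf{w} \cdot (P_1 - P_2) = 2, $$
$$ d = \bigl|\operatorname{proj}_{\mathbf{n}}(P_1 - P_2)\bigr| = \frac{\lvert \mathbf{w} \cdot (P_1 - P_2) \rvert}{\lVert \mathbf{w} \rVert} = \frac{2}{\lVert \mathbf{w} \rVert}, \qquad \mathbf{n} = \frac{\mathbf{w}}{\lVert \mathbf{w} \rVert}. $$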

22 The Midge Classification Challenge (MCM Problem A, 1989; adapted from Dr. Ben Fusaro). Biologists W. L. Grogan of Salisbury University and W. W. Wirth of the Smithsonian Institution do research on biting midges.

23 The Midge Classification Challenge: Grogan and Wirth were doing field work and captured 18 biting midges. They agreed that nine of the midges belonged to an antenna-dominated species, Ma, and six belonged to a wing-dominated species, Mw. They were sure that each of the three left-overs (red dots) belonged to one of the two species, but which one? The challenge: take a look at their antenna-wing data and see if you can help them out. (Midge Classification, Problem A (Continuous) from the 1989 MCM.)

24 [Scatter plot of the antenna vs. wing measurements: one marker = Ma, the other = Mw.] The three unknowns: (1.24, 1.80), (1.28, 1.84), (1.40, 2.04).
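
As a closing sketch tying the challenge back to SVMs: the 15 labeled training measurements below are placeholders, not the actual MCM data; only the three unknown points come from the slide.

```python
# Classifying the three unknown midges with a soft-margin linear SVM.
# NOTE: the labeled training arrays are PLACEHOLDERS; substitute the actual
# 9 Ma + 6 Mw (antenna, wing) measurements from the 1989 MCM statement.
import numpy as np
from sklearn.svm import SVC

X_train = np.array([  # placeholder (antenna, wing) values, NOT the real data
    [1.14, 1.78], [1.18, 1.96], [1.20, 1.86], [1.26, 2.00], [1.28, 2.00],
    [1.30, 1.96], [1.18, 1.80], [1.22, 1.90], [1.24, 1.96],   # "Ma"
    [1.78, 1.14], [1.96, 1.18], [1.86, 1.20], [2.00, 1.26], [2.00, 1.28],
    [1.96, 1.30],                                             # "Mw"
])
y_train = np.array(["Ma"] * 9 + ["Mw"] * 6)

# The three unknown midges (these coordinates ARE from the slide).
unknowns = np.array([[1.24, 1.80], [1.28, 1.84], [1.40, 2.04]])

clf = SVC(kernel="linear", C=10.0)  # soft margin, as in section 3
clf.fit(X_train, y_train)
print(clf.predict(unknowns))
```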

