
Presentation on theme: "Reformulated ε-SVR as a Constrained Minimization Problem"— Presentation transcript:

1 Reformulated ε-SVR as a Constrained Minimization Problem  The reformulation is a minimization problem in n+1+2m variables subject to 2m constraints, which enlarges the problem size and the computational complexity of solving it.
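The equations on this slide were not captured in the transcript. As a sketch of what was likely shown, the standard ε-SVR primal with this variable count (w ∈ Rⁿ, the bias b, and two slack vectors ξ, ξ̂ ∈ Rᵐ; the 2m tube inequalities are the constraints counted above) reads:

```latex
\min_{w,\,b,\,\xi,\,\hat{\xi}}\;
  \tfrac{1}{2}\|w\|^{2} + C\sum_{i=1}^{m}\bigl(\xi_i + \hat{\xi}_i\bigr)
\quad\text{subject to}\quad
\begin{cases}
  (w \cdot x_i + b) - y_i \le \varepsilon + \xi_i,\\
  y_i - (w \cdot x_i + b) \le \varepsilon + \hat{\xi}_i,\\
  \xi_i,\,\hat{\xi}_i \ge 0, \qquad i = 1,\dots,m.
\end{cases}
```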

2 SV Regression by Minimizing the Quadratic ε-Insensitive Loss  Replacing the linear ε-insensitive loss with its quadratic counterpart gives a closely related constrained minimization problem.
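The problem referred to here (equation lost in the transcript) has the standard form below; with a quadratic penalty the nonnegativity constraints on the slacks are no longer needed, since negative slack values can never help:

```latex
\min_{w,\,b,\,\xi,\,\hat{\xi}}\;
  \tfrac{1}{2}\|w\|^{2} + \tfrac{C}{2}\sum_{i=1}^{m}\bigl(\xi_i^{2} + \hat{\xi}_i^{2}\bigr)
\quad\text{subject to}\quad
\begin{cases}
  (w \cdot x_i + b) - y_i \le \varepsilon + \xi_i,\\
  y_i - (w \cdot x_i + b) \le \varepsilon + \hat{\xi}_i, \qquad i = 1,\dots,m.
\end{cases}
```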

3 Primal Formulation of SVR for the Quadratic ε-Insensitive Loss  Extremely important: at the solution, the two slack variables of each point cannot both be nonzero, since a point cannot lie on both sides of the ε-tube at once.

4 Dual Formulation of SVR for the Quadratic ε-Insensitive Loss  The dual is a quadratic program in the multipliers, subject to a single equality constraint.
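A standard form of this dual (the slide's own notation is lost), writing β_i for the difference of the two multipliers attached to point i, is:

```latex
\max_{\beta}\;\; \sum_{i=1}^{m} y_i\beta_i
  \;-\; \varepsilon\sum_{i=1}^{m}|\beta_i|
  \;-\; \tfrac{1}{2}\sum_{i,j=1}^{m}\beta_i\beta_j
        \Bigl(x_i\cdot x_j + \tfrac{1}{C}\,\delta_{ij}\Bigr)
\quad\text{subject to}\quad \sum_{i=1}^{m}\beta_i = 0,
```

where δ_ij is the Kronecker delta; the extra 1/C term on the diagonal comes from eliminating the quadratic slack penalty.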

5 KKT Complementarity Conditions  The KKT complementarity conditions pair each primal constraint with its Lagrange multiplier.  Don't forget the relations obtained from the primal stationarity conditions.
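In standard form, with f(x) = w·x + b and multipliers α_i, α̂_i for the two tube constraints of point i, the complementarity conditions read:

```latex
\alpha_i\bigl[\varepsilon + \xi_i - f(x_i) + y_i\bigr] = 0,\qquad
\hat{\alpha}_i\bigl[\varepsilon + \hat{\xi}_i + f(x_i) - y_i\bigr] = 0,\qquad
\alpha_i\,\hat{\alpha}_i = 0,\qquad \xi_i\,\hat{\xi}_i = 0,
```

and for the quadratic loss the stationarity conditions also give ξ_i = α_i/C and ξ̂_i = α̂_i/C, which is presumably the relation the slide warns not to forget.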

6 Simplified Dual Formulation of SVR  In the case ε = 0, the problem reduces to least squares linear regression with a weight decay factor.

7 Kernel in the Dual Formulation for SVR  Suppose the multipliers solve the QP problem.  Then the regression function is defined by a kernel expansion over the training points, with the bias chosen so that the complementarity conditions hold.
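In terms of a dual solution β* and a kernel K, the regression function typically takes the form:

```latex
f(x) = \sum_{i=1}^{m}\beta_i^{*}\,K(x_i, x) + b^{*},
\qquad\text{with } b^{*} \text{ chosen so that }
y_i - f(x_i) = \varepsilon + \beta_i^{*}/C
\text{ for some } i \text{ with } \beta_i^{*} > 0,
```

the last relation being the complementarity condition at a point with a nonzero multiplier.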

8 Kernel Ridge Regression  Consider least squares linear regression with a weight decay factor (i.e., quadratic 0-insensitive loss regression).  We ignore the bias term.
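A minimal sketch of kernel ridge regression as described here, with no bias term; the function names and the toy data are ours, not the slide's. The dual coefficients solve (K + λI)α = y, and predictions are kernel expansions in α:

```python
import numpy as np

def kernel_ridge_fit(K, y, lam):
    """Solve (K + lam*I) alpha = y for the dual coefficients alpha."""
    m = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(m), y)

def kernel_ridge_predict(alpha, K_test):
    """f(x) = sum_i alpha_i k(x_i, x); K_test has shape (n_test, m)."""
    return K_test @ alpha

# Toy 1-D data with a linear kernel and no bias term, as on the slide.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 2.0, 3.0])        # y = x exactly
K = X @ X.T                               # Gram matrix of the linear kernel
alpha = kernel_ridge_fit(K, y, lam=1e-6)  # small weight decay factor
pred = kernel_ridge_predict(alpha, K)     # fitted values, approximately y
```

With the weight decay factor λ near zero, the fit essentially interpolates the data; larger λ shrinks the solution toward zero.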

9 General Issues for Solving the Problem in Dual Form  General strategies:
- Start with any feasible point.
- Increase the dual objective function value iteratively, always staying in the feasible region.
- Stop when a stopping criterion is satisfied.
The stopping criterion is derived from properties of the convex optimization problem.

10 Three Ways to Get the Stopping Criterion
- Monitor the growth of the dual objective: stop when the fractional rate of increase falls below a small tolerance. This can deliver very poor results.
- Monitor the KKT conditions: they are necessary and sufficient for optimality.
- Monitor the duality gap: it vanishes only at the optimal point.

11 1-Norm Soft Margin Dual Formulation  The Lagrangian for the 1-norm soft margin combines the primal objective with multipliers for the margin and slack constraints.  Setting the partial derivatives with respect to the primal variables to zero yields the dual.
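The standard form of this Lagrangian (the slide's equations are lost) is:

```latex
L(w, b, \xi, \alpha, r) =
  \tfrac{1}{2}\|w\|^{2} + C\sum_{i=1}^{m}\xi_i
  - \sum_{i=1}^{m}\alpha_i\bigl[y_i(w\cdot x_i + b) - 1 + \xi_i\bigr]
  - \sum_{i=1}^{m} r_i\xi_i,
\qquad \alpha_i, r_i \ge 0,
```

and setting the partial derivatives to zero gives:

```latex
w = \sum_{i=1}^{m}\alpha_i y_i x_i,
\qquad \sum_{i=1}^{m}\alpha_i y_i = 0,
\qquad C - \alpha_i - r_i = 0 .
```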

12 Introduce the Kernel in the Dual Formulation for the 1-Norm Soft Margin  Suppose the multipliers solve the QP problem.  Then the decision rule is defined through the kernel, which implicitly defines the feature space.
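The QP problem and decision rule in question are, in their standard kernelized form:

```latex
\max_{\alpha}\;\; \sum_{i=1}^{m}\alpha_i
  - \tfrac{1}{2}\sum_{i,j=1}^{m} y_i y_j\,\alpha_i\alpha_j\, K(x_i, x_j)
\quad\text{subject to}\quad
\sum_{i=1}^{m} y_i\alpha_i = 0,\qquad 0 \le \alpha_i \le C,
```

with the decision rule

```latex
f(x) = \operatorname{sgn}\Bigl(\sum_{i=1}^{m} y_i\alpha_i^{*}\, K(x_i, x) + b^{*}\Bigr).
```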

13 Introduce the Kernel in the Dual Formulation for the 1-Norm Soft Margin  We use a gradient ascent method to solve the problem.  Setting the bias to a fixed value removes the equality constraint from the QP problem.  The resulting method is easy to understand but extremely slow.

14 Gradient Ascent Algorithm for the Relaxation QP  Given a training set S and a learning rate, repeatedly sweep over the multipliers, updating each one along its partial derivative while staying inside the box constraints, until a stopping criterion is satisfied.
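The slide's pseudocode did not survive the transcript; the following is a sketch of the algorithm under our reading of it (function and variable names are ours). With the bias fixed at 0, the dual objective is W(α) = Σᵢ αᵢ − ½ Σᵢⱼ αᵢαⱼ yᵢyⱼ K(xᵢ, xⱼ), and each sweep moves every αᵢ along ∂W/∂αᵢ, then clips it back into [0, C]:

```python
import numpy as np

def gradient_ascent_qp(K, y, C=1.0, eta=0.1, tol=1e-6, max_iter=1000):
    """Coordinate-wise gradient ascent on the fixed-bias dual QP."""
    m = len(y)
    a = np.zeros(m)                          # feasible starting point
    for _ in range(max_iter):
        old = a.copy()
        for i in range(m):
            grad_i = 1.0 - y[i] * np.sum(a * y * K[i])   # dW/da_i
            a[i] = np.clip(a[i] + eta * grad_i, 0.0, C)  # stay feasible
        if np.max(np.abs(a - old)) < tol:    # simple stopping criterion
            break
    return a

# Toy separable 1-D problem with a linear kernel and bias fixed at 0.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
K = X @ X.T
alpha = gradient_ascent_qp(K, y, C=10.0, eta=0.05)
scores = (alpha * y) @ K                     # f(x_i) with b = 0
```

As the slides note, this scheme is easy to follow but converges very slowly compared with decomposition methods such as SMO.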

