Support Vector Regression: Presentation transcript

1 Support Vector Regression (Linear Case)  Given the training set S = {(xᵢ, yᵢ) : i = 1, …, m}  Find a linear function f(x) = x'w + b, where (w, b) is determined by solving a minimization problem that guarantees the smallest overall empirical error made by f  Motivated by SVM: ||w|| should be as small as possible  Errors smaller than ε should be discarded

2 ε-Insensitive Loss Function  The loss made by the estimation function f at the data point (xᵢ, yᵢ) is |yᵢ − f(xᵢ)|_ε  If |yᵢ − f(xᵢ)| ≤ ε the loss is zero; otherwise it is |yᵢ − f(xᵢ)| − ε, i.e. |y − f(x)|_ε = max{0, |y − f(x)| − ε}
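The loss just defined can be sketched in a few lines of numpy (a minimal illustration; the function name and the default ε = 0.1 are our own choices, not from the slides):

```python
import numpy as np

def eps_insensitive_loss(y, f_x, eps=0.1):
    """epsilon-insensitive loss |y - f(x)|_eps = max(0, |y - f(x)| - eps):
    deviations inside the eps-tube around the prediction cost nothing."""
    return np.maximum(0.0, np.abs(y - f_x) - eps)

print(eps_insensitive_loss(1.0, 1.05))  # inside the tube: loss 0.0
print(eps_insensitive_loss(1.0, 1.30))  # outside the tube: about 0.2
```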

3 ε-Insensitive Linear Regression  Find (w, b) with the smallest overall error: minimize over (w, b) the sum Σᵢ |yᵢ − (xᵢ'w + b)|_ε

4 ε-insensitive Support Vector Regression Model  Motivated by SVM: ||w|| should be as small as possible  Errors smaller than ε are discarded: minimize (1/2)||w||² + C Σᵢ (ξᵢ + ξᵢ*), where the slack variables ξᵢ, ξᵢ* measure how far each point lies above or below the ε-tube

5 Reformulated ε-SVR as a Constrained Minimization Problem: minimize over (w, b, ξ, ξ*) the objective (1/2)w'w + C Σᵢ (ξᵢ + ξᵢ*) subject to yᵢ − xᵢ'w − b ≤ ε + ξᵢ, xᵢ'w + b − yᵢ ≤ ε + ξᵢ*, ξᵢ, ξᵢ* ≥ 0. This is a minimization problem with n + 1 + 2m variables and 2m constraints (plus nonnegativity), which enlarges the problem size and the computational complexity of solving it.

6 SV Regression by Minimizing Quadratic ε-Insensitive Loss  We minimize ||w||² and the quadratic ε-insensitive loss at the same time  Occam's razor: the simplest is the best  We have the following (nonsmooth) problem: minimize over (w, b) the objective (1/2)(w'w + b²) + (C/2) Σᵢ |yᵢ − (xᵢ'w + b)|_ε², where |·|_ε is the ε-insensitive loss  Including b² in the objective gives strong convexity of the problem
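The nonsmooth objective on this slide can be evaluated directly (a sketch; the function name and default parameter values are our own, with A holding the inputs row-wise as in the slides' notation):

```python
import numpy as np

def quad_eps_objective(w, b, A, y, C=1.0, eps=0.1):
    """0.5*(||w||^2 + b^2) + (C/2) * sum_i |y_i - (A_i w + b)|_eps^2:
    the quadratic eps-insensitive objective (nonsmooth because of the max)."""
    r = np.maximum(0.0, np.abs(y - (A @ w + b)) - eps)  # eps-insensitive residuals
    return 0.5 * (w @ w + b * b) + 0.5 * C * np.sum(r ** 2)

# A perfect fit inside the tube leaves only the regularization term:
A = np.array([[1.0], [2.0]])
y = np.array([1.0, 2.0])
print(quad_eps_objective(np.array([1.0]), 0.0, A, y))  # 0.5
```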

7 ε-insensitive Loss Function

8 Quadratic ε-insensitive Loss Function

9 Use the smooth p-function to replace the plus function (x)₊ in the quadratic ε-insensitive function. The p-function is defined by p(x, α) = x + (1/α) log(1 + exp(−αx)), with smoothing parameter α > 0.
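A minimal numpy sketch of the p-function (the name and α values are illustrative; for very large α·|x| the naive exp can overflow, which a production version would guard against):

```python
import numpy as np

def p_func(x, alpha=5.0):
    """Smooth approximation of the plus function (x)_+ = max(0, x):
    p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x))."""
    return x + np.log1p(np.exp(-alpha * x)) / alpha

# As alpha grows, p approaches the plus function from above:
print(p_func(2.0, alpha=50.0))   # close to 2.0
print(p_func(-2.0, alpha=50.0))  # close to 0.0
```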


11 ε-insensitive Smooth Support Vector Regression (ε-SSVR)  This problem is a strongly convex minimization problem without any constraints  The objective function is twice differentiable, so we can use a fast Newton-Armijo method to solve it
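To make the unconstrained smooth formulation concrete, here is a toy numpy sketch that minimizes the smoothed objective with plain gradient descent, standing in for the Newton-Armijo method named on the slide; the data, C, ε, α, and learning rate are all illustrative choices:

```python
import numpy as np

def p(x, a):
    # smooth plus function: p(x, a) = x + (1/a) * log(1 + exp(-a*x))
    return x + np.log1p(np.exp(-a * x)) / a

def dp(x, a):
    # derivative of p: the sigmoid 1 / (1 + exp(-a*x))
    return 1.0 / (1.0 + np.exp(-a * x))

# toy data on the line y = 2x (one feature, no noise)
X = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * X
C, eps, a, lr = 100.0, 0.05, 20.0, 1e-3

# objective: 0.5*(w^2 + b^2) + 0.5*C * sum[p(r-eps)^2 + p(-r-eps)^2], r = y - wX - b
w, b = 0.0, 0.0
for _ in range(5000):
    r = y - (w * X + b)
    g = p(r - eps, a) * dp(r - eps, a) - p(-r - eps, a) * dp(-r - eps, a)
    grad_w = w - C * np.sum(X * g)   # gradient of the smooth objective in w
    grad_b = b - C * np.sum(g)       # gradient in b
    w -= lr * grad_w
    b -= lr * grad_b

# w and b should end up close to the true slope 2 and intercept 0
print(w, b)
```

A real implementation would follow the slide and use Newton's method with an Armijo line search, which converges much faster on this strongly convex, twice-differentiable objective.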

12 Nonlinear ε-SVR  Based on the duality theorem and the KKT optimality conditions  In the nonlinear case, the inner product x'w is replaced by a kernel expression K(x', A')u

13 Nonlinear SVR  Let K = K(A, A')  Nonlinear regression function: f(x) = K(x', A')u + b
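For concreteness, the kernel matrix and the nonlinear regression function can be sketched as follows (function names and the default gamma are our own; the Gaussian kernel K(A, B')ᵢⱼ = exp(−γ ||Aᵢ − Bⱼ||²) is the one the experiments on the later slides use):

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix: K[i, j] = exp(-gamma * ||A_i - B_j||^2),
    with A, B holding one data point per row."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)  # squared distances
    return np.exp(-gamma * d2)

def predict(x, A, u, b, gamma=1.0):
    """Nonlinear regression function f(x) = K(x', A') u + b."""
    return gaussian_kernel(np.atleast_2d(x), A, gamma) @ u + b
```

In practice u and b come from solving the (smooth) ε-SVR problem with K in place of the data matrix; here they are just inputs.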

14 Nonlinear Smooth Support Vector ε-insensitive Regression

15 Numerical Results  Training set and testing set are generated by the slice method  A Gaussian kernel is used to generate the nonlinear ε-SVR in all experiments  The reduced kernel technique is utilized when the training dataset is bigger than 1000  Error measure: 2-norm relative error ||y − ŷ||₂ / ||y||₂, where y denotes the observations and ŷ the predicted values
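The error measure on this slide is a two-line helper (the function name is ours):

```python
import numpy as np

def rel_error(y, y_pred):
    """2-norm relative error ||y - y_pred||_2 / ||y||_2 between
    observations y and predicted values y_pred."""
    return np.linalg.norm(y - y_pred) / np.linalg.norm(y)

print(rel_error(np.array([3.0, 4.0]), np.array([3.0, 4.0])))  # 0.0
```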

16 101 Data Points in Nonlinear SSVR with Gaussian Kernel  Target function plus noise (noise: mean = 0)  Parameter:  Training time: 0.3 sec.

17 First Artificial Dataset  Random noise with mean = 0, standard deviation 0.04  ε-SSVR: training time 0.016 sec., error 0.059  LIBSVM: training time 0.015 sec., error 0.068

18 481 Data Points  Original function vs. estimated function (figures)  Noise: mean = 0  Parameter:  Training time: 9.61 sec.  Mean Absolute Error (MAE) over the 49x49 mesh points: 0.1761

19 Using Reduced Kernel  Original function vs. estimated function (figures)  Noise: mean = 0  Parameter:  Training time: 22.58 sec.  MAE over the 49x49 mesh points: 0.0513

20 Real Datasets

21 Linear ε-SSVR: Tenfold Numerical Results

22 Nonlinear ε-SSVR: Tenfold Numerical Results (1/2)

23 Nonlinear ε-SSVR: Tenfold Numerical Results (2/2)

