Financial time series forecasting using support vector machines. Author: Kyoung-jae Kim. 2003, Elsevier B.V.


Outline Introduction to SVM Introduction to datasets Experimental settings Analysis of experimental results

Linear separability – In general, two groups are linearly separable in n-dimensional space if they can be separated by an (n − 1)-dimensional hyperplane (e.g., a line in 2-D, a plane in 3-D).

Support Vector Machines – An SVM separates the two classes with the maximum-margin hyperplane: the hyperplane whose distance to the nearest training example of either class is as large as possible.

Formalization
– Training data: D = {(x_i, c_i)}, i = 1, …, n, with labels c_i ∈ {−1, 1}
– Separating hyperplane: w·x − b = 0
– Parallel bounding hyperplanes: w·x − b = 1 and w·x − b = −1

Objective Minimize (in w, b) ||w||, subject to c_i(w·x_i − b) ≥ 1 (for any i = 1, …, n)

A 2-D case In 2-D:
– Training data: points x_i with labels c_i (the slide's data table did not survive extraction)
– Bounding hyperplanes: −2x + 2y + 1 = −1 and −2x + 2y + 1 = 1
– Separating hyperplane: −2x + 2y + 1 = 0
– So w = (−2, 2), b = −1, and margin = 2/||w|| = sqrt(2)/2
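To make the arithmetic above concrete, here is a minimal scikit-learn sketch. The four training points are hypothetical, chosen to lie exactly on the two bounding hyperplanes (they are not the slide's lost data); with a very large C, the soft-margin SVC approximates the hard-margin SVM and recovers the w, b, and margin above.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical points chosen to lie on the two bounding hyperplanes
# -2x + 2y + 1 = +/-1 (the slide's actual data table was lost).
X = np.array([[0.0, 0.0], [1.0, 1.0],    # on -2x + 2y + 1 = +1, label +1
              [1.0, 0.0], [2.0, 1.0]])   # on -2x + 2y + 1 = -1, label -1
c = np.array([1, 1, -1, -1])

# A very large error cost C approximates the hard-margin SVM.
clf = SVC(kernel="linear", C=1e6).fit(X, c)

w = clf.coef_[0]                 # ~ [-2.  2.]
b = clf.intercept_[0]            # ~ +1 (scikit-learn uses w.x + b, the slide uses w.x - b)
print(w, b)
print(2 / np.linalg.norm(w))     # margin = 2/||w|| ~ 0.7071 = sqrt(2)/2
```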

Not linearly separable – No hyperplane can separate the two groups.

Soft Margin
– Choose a hyperplane that splits the examples as cleanly as possible, while still maximizing the distance to the nearest cleanly split examples
– Introduce an error cost C: an example that ends up on the wrong side of its margin at distance d contributes a penalty of d·C
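A small illustration of the error cost, using made-up points (not the paper's data): one deliberately mislabeled example makes the data non-separable, and sweeping C trades margin width against training error.

```python
import numpy as np
from sklearn.svm import SVC

# Made-up data: the last point is mislabeled noise inside the other class.
X = np.array([[0, 0], [1, 1], [1, 0], [2, 1], [0.9, 0.9]], dtype=float)
c = np.array([1, 1, -1, -1, -1])

for C in (0.1, 1.0, 1000.0):
    clf = SVC(kernel="linear", C=C).fit(X, c)
    # Larger C penalizes margin violations more heavily, which typically
    # grows ||w|| (i.e., narrows the margin) to chase the noisy point.
    print(C, np.linalg.norm(clf.coef_))
```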

Higher dimensions – After mapping the data into a higher-dimensional space, separation might be easier.

Kernel Trick
– Building maximal-margin hyperplanes in the high-dimensional feature space depends on the data only through inner products, which are expensive to compute there
– Instead, use a kernel function that is evaluated in the low-dimensional input space but behaves like an inner product in the high-dimensional feature space

Kernels
– Polynomial: K(p, q) = (p·q + c)^d
– Radial basis function: K(p, q) = exp(−γ||p − q||²)
– Gaussian radial basis: K(p, q) = exp(−||p − q||²/(2δ²))
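A quick numerical check of the kernel trick, using the polynomial kernel above with c = 1 and d = 2: for 2-D inputs this kernel equals an ordinary inner product under an explicit 6-dimensional feature map (the feature map below is the standard one, not something from the slides).

```python
import numpy as np

p, q = np.array([1.0, 2.0]), np.array([3.0, 0.5])

def poly_kernel(p, q, c=1.0, d=2):
    # K(p, q) = (p.q + c)^d, computed entirely in the 2-D input space
    return (p @ q + c) ** d

def phi(x):
    # Explicit degree-2 feature map for 2-D input with c = 1
    return np.array([x[0]**2, x[1]**2,
                     np.sqrt(2) * x[0] * x[1],
                     np.sqrt(2) * x[0], np.sqrt(2) * x[1],
                     1.0])

print(poly_kernel(p, q))   # 25.0
print(phi(p) @ phi(q))     # 25.0 -- same value, computed as a 6-D inner product
```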

Tuning parameters
– Error cost: C
– Kernel parameters: δ² (RBF width), d and c (polynomial degree and constant)

Underfitting & Overfitting [Figure: decision boundaries illustrating underfitting, overfitting, and, between them, a model with high generalization ability]

Datasets
– Input variables: 12 technical indicators
– Target attribute: the Korea Composite Stock Price Index (KOSPI)
– 2928 trading days: 80% for training, 20% for holdout
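A sketch of the data handling for this setup, with placeholder arrays standing in for the 12 indicators and the KOSPI labels (the slide gives only the split ratio; a chronological split is assumed here to avoid look-ahead bias).

```python
import numpy as np

# Placeholder arrays: 12 technical indicators and a daily up/down label
# for 2928 trading days (random values, not the paper's data).
n_days, n_features = 2928, 12
rng = np.random.default_rng(0)
X_all = rng.standard_normal((n_days, n_features))
y_all = rng.integers(0, 2, n_days)

# 80/20 split, kept in time order (assumed; only the ratio is stated).
cut = int(0.8 * n_days)                          # 2342 training days
X_train, y_train = X_all[:cut], y_all[:cut]
X_hold, y_hold = X_all[cut:], y_all[cut:]
```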

Settings (1/3) SVM
– Kernels: polynomial kernel and Gaussian radial basis function
– Kernel parameter: δ²
– Error cost: C
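A hedged sketch of the SVM sweep using scikit-learn and the split above. The candidate grids are illustrative, not the paper's exact values (the results slides mention δ² = 25 and C = 78); note that scikit-learn parameterizes the RBF kernel as exp(−γ||p − q||²), so the Gaussian width δ² maps to γ = 1/(2δ²).

```python
from sklearn.svm import SVC

# Gaussian RBF sweep over the kernel width delta^2 and error cost C.
for delta_sq in (1, 25, 50, 100):        # illustrative grid
    for C in (1, 10, 78, 100):           # illustrative grid
        gamma = 1.0 / (2.0 * delta_sq)   # exp(-||p - q||^2 / (2 delta^2))
        svm = SVC(kernel="rbf", gamma=gamma, C=C).fit(X_train, y_train)
        print(delta_sq, C, svm.score(X_hold, y_hold))

# Polynomial variant: SVC(kernel="poly", degree=d, coef0=c, C=C)
```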

Settings (2/3) BP-Network
– Layers: 3
– Input nodes: 12
– Number of hidden nodes: 6, 12, 24
– Learning epochs per training example: 50, 100, 200
– Learning rate: 0.1
– Momentum: 0.1
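A rough scikit-learn stand-in for the BP-network benchmark, reusing the split above. MLPClassifier is not the exact backpropagation setup from the paper, and "learning epochs per training example" does not map one-to-one onto max_iter, so treat this purely as a sketch.

```python
from sklearn.neural_network import MLPClassifier

# Three-layer network: 12 inputs, one hidden layer, one output layer.
for n_hidden in (6, 12, 24):
    bpn = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                        solver="sgd",            # plain gradient descent
                        learning_rate_init=0.1,  # slide's learning rate
                        momentum=0.1,            # slide's momentum
                        max_iter=200)            # stands in for the epoch counts
    bpn.fit(X_train, y_train)
    print(n_hidden, bpn.score(X_hold, y_hold))
```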

Settings (3/3) Case-Based Reasoning
– k-NN with k = 1, 2, 3, 4, 5
– Distance evaluation: Euclidean distance
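Under these settings, the CBR benchmark reduces to k-nearest-neighbour retrieval with Euclidean distance; a minimal sketch reusing the same split:

```python
from sklearn.neighbors import KNeighborsClassifier

# Case retrieval as k-NN over the stored training cases.
for k in (1, 2, 3, 4, 5):
    cbr = KNeighborsClassifier(n_neighbors=k, metric="euclidean")
    cbr.fit(X_train, y_train)
    print(k, cbr.score(X_hold, y_hold))
```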

Experimental results
– Results of SVMs with various C, where δ² is fixed at 25
– Too small a C causes underfitting; too large a C causes overfitting*
* F.E.H. Tay, L. Cao, Application of support vector machines in financial time series forecasting, Omega 29 (2001) 309–317

Experimental results
– Results of SVMs with various δ², where C is fixed at 78
– A small value of δ² causes overfitting; a large value of δ² causes underfitting*
* F.E.H. Tay, L. Cao, Application of support vector machines in financial time series forecasting, Omega 29 (2001) 309–317

Experimental results and conclusion
– SVM outperforms BPN and CBR
– SVM minimizes structural risk rather than empirical risk
– SVM provides a promising alternative for financial time-series forecasting
– Open issue: parameter tuning