Neural Network Application for Predicting Stock Index Volatility Using High Frequency Data
Project No. CFWin03-32
Presented by: Venkatesh Manian
Professor: Dr. Ruppa K. Thulasiram
May 30, 2003

Outline
Introduction and Motivation
Background
Problem Statement
Solution Strategy and Implementation
Results
Conclusion and Future Work

Introduction and Motivation
An index is defined as "a statistical measure of the changes in a portfolio of stocks representing a portion of the overall market" [3].
Hol and Koopman [2] calculate volatility using high-frequency intraday returns; the noise present in the daily squared-return series decreases as the sampling frequency of the returns increases.
Cizeau et al. [7] report that price changes are correlated only over a short period of time, whereas the "absolute value has the ability to show correlation on time up to many years".
The predictive capability of neural networks motivates their use in this project.

Background
Schwert [9] points out the importance of intraday data on stock prices for keeping track of market trends:
– the market decline on October 13.
Refenes [8] describes the different problems available and their solution strategies, and says that neural networks are used in cases where the behavior of the system cannot otherwise be predicted.

Problem Statement
The goal of this project is to predict the volatility of a stock index using Radial Basis Function (RBF) neural networks. The project focuses on the following aspects:
– Using high-frequency intraday returns so as to reduce the noise present in the input.
– Using RBF networks that can determine at runtime the number of hidden nodes needed for predicting volatility, so as to reduce the problems involved in using too many or too few hidden nodes.
Prediction of stock index volatility is also tested using a multilayer feedforward network; in that case a sigmoid activation function is used.

Solution Strategy and Implementation
Collect every five-minute value of the stock index.
Intraday returns are calculated by subtracting successive log prices.
Overnight returns are calculated in the same way as the intraday returns, using the closing price of the index and the price at which the market opens on the following day.
The daily realized volatility is calculated as the cumulative sum of the squared intraday returns.
The realized volatility is used as input to the neural network, and the future volatility of the stock index is predicted. (A sketch of this preprocessing step is given below.)
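A minimal sketch of this preprocessing step, written in Python with NumPy; the function and variable names are illustrative assumptions, not taken from the project's code:

```python
import numpy as np

def realized_volatility(prices):
    """Daily realized volatility from five-minute index levels.

    prices: 1-D array of intraday index levels for a single trading day.
    Returns the cumulative sum of squared five-minute log returns.
    """
    log_prices = np.log(np.asarray(prices, dtype=float))
    intraday_returns = np.diff(log_prices)   # successive log-price differences
    return np.sum(intraday_returns ** 2)     # cumulative squared intraday returns

def overnight_return(close_today, open_next_day):
    """Overnight return from today's closing price to the next day's opening price."""
    return np.log(open_next_day) - np.log(close_today)
```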

Algorithm – Radial Basis Function Networks
[Slide diagram: an RBF network with input nodes I1 … In, hidden nodes H1–H4, and a single output node o.]
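The equations on this slide are not preserved in the transcript. The following is a hedged sketch of a standard Gaussian RBF forward pass of the kind the diagram depicts; the centers, widths, and weights are placeholders, not the project's learned values:

```python
import numpy as np

def rbf_forward(x, centers, widths, weights, bias=0.0):
    """Forward pass of a Gaussian radial basis function network.

    x:       input vector of length nx
    centers: array of shape (k, nx), one center per hidden node
    widths:  array of length k, one Gaussian width per hidden node
    weights: array of length k, hidden-to-output weights
    """
    # Hidden-node activations: Gaussian of the distance to each center
    dists = np.linalg.norm(centers - x, axis=1)
    hidden = np.exp(-(dists ** 2) / (2.0 * widths ** 2))
    # Single linear output node
    return float(np.dot(weights, hidden) + bias)
```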


Cont.
Calculated intraday values and their corresponding realized volatility.
Normalized input values.

Cont.
Normalization is done using the following equation: z = (x − mean) / standard deviation.
[Slide table "Input Data" with columns Day and Volatility; the values are not preserved in the transcript.]
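A one-function illustration of that normalization step, assuming the input series is a NumPy array (the function name is an assumption):

```python
import numpy as np

def normalize(x):
    """Z-score normalization: center on the mean, scale by the standard deviation."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()
```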

Prediction Using the RBF Network
Configuration of the network:
– Number of input nodes is ten.
– Initially the number of hidden nodes is set to zero.
– Number of output nodes is set to one.
Due to the high computational complexity of the system, the size of the network has to be kept minimal:
– The number of input nodes cannot be increased beyond 15,
– because for each hidden node added to the network, the number of parameters to be updated in each equation of the Extended Kalman Filter is k(nx + ny + 1) + ny,
– where k is the number of hidden nodes, nx is the number of inputs, and ny is the number of outputs (one in this case). A worked count is shown below.
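As an illustrative check of that parameter count, using the configuration above (nx = 10, ny = 1) and the k = 4 hidden nodes reported later in the deck (only the arithmetic is added here):

```latex
% Parameters updated per Extended Kalman Filter step
k(n_x + n_y + 1) + n_y = 4\,(10 + 1 + 1) + 1 = 49
```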

Cont. – Learning in the RBF Network
Learning in this network involves assigning centers and then fine-tuning those centers, based on the difference between the expected value and the network output.
A window size is set up to check whether the normalized output value of each hidden node remains below a threshold value for a particular duration. If the normalized output value of a particular hidden node is below the threshold value for a duration equal to the window size, that hidden node is pruned. (A sketch of this pruning rule follows.)
The major problem caused by the presence of noise in the input data is overfitting, which shows up as an increase in the number of hidden nodes as the number of patterns increases. The root-mean-square value of the output error is calculated to overcome this overfitting problem.
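A hedged sketch of that pruning criterion; the threshold, window size, and array layout are illustrative assumptions rather than the project's actual sequential-learning code:

```python
import numpy as np

def nodes_to_prune(hidden_outputs, threshold=0.01, window=50):
    """Indices of hidden nodes whose normalized output stayed below `threshold`
    for the last `window` training patterns.

    hidden_outputs: array of shape (n_patterns, k), the output of each of the
                    k hidden nodes for every pattern seen so far.
    """
    recent = hidden_outputs[-window:]
    # Normalize each pattern's hidden outputs by the largest activation
    scale = np.max(np.abs(recent), axis=1, keepdims=True) + 1e-12
    normalized = np.abs(recent) / scale
    below = np.all(normalized < threshold, axis=0)
    return np.flatnonzero(below)   # candidate nodes to remove
```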

Cont. – Problems Encountered
Initially I did not use normalized inputs; instead I reduced the scale by dividing each input by a constant. This experiment gave somewhat favorable results: the number of hidden nodes learned was four, with 200 input patterns and 10 input nodes.
Since normalization is the usual way to reduce the range of the input values, each input was then normalized with respect to the mean and standard deviation of the data. After normalizing, the network started to overfit the data. I tried updating the values of different parameters, but I was unable to control this problem. Hence I used a different network for prediction, with the sigmoid function as the activation function.

ANN Using the Sigmoid Function – Algorithm
In this case all connections have associated weights. The weighted sum is passed to each node of the next layer, which applies the sigmoid function. The output received from the output node is compared with the expected value and the output error is calculated. This error value is propagated back into the network to adjust the weights.
[Slide diagram: a feedforward network with input nodes I1 … In, hidden nodes H1–H4, and a single output node o.]
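A hedged, minimal sketch of that forward pass and weight update for a single hidden layer; the layer sizes, learning rate, and loss are textbook backpropagation choices, not the project's code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, target, w_hidden, w_out, lr=0.01):
    """One forward/backward pass for a one-hidden-layer sigmoid network.

    x:        input vector of shape (nx,)
    w_hidden: input-to-hidden weights of shape (k, nx)
    w_out:    hidden-to-output weights of shape (k,)
    Returns updated (w_hidden, w_out) and the network output.
    """
    # Forward pass: weighted sums followed by sigmoid activations
    hidden = sigmoid(w_hidden @ x)
    output = sigmoid(w_out @ hidden)

    # Output error and gradients (squared-error loss)
    error = output - target
    delta_out = error * output * (1.0 - output)
    delta_hidden = delta_out * w_out * hidden * (1.0 - hidden)

    # Propagate the error back and adjust the weights
    w_out = w_out - lr * delta_out * hidden
    w_hidden = w_hidden - lr * np.outer(delta_hidden, x)
    return w_hidden, w_out, output
```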

Results
I trained the network so as to get a minimum error in the testing phase. MAPE (mean absolute percentage error) is used as the evaluation measure in this case; the standard MAPE computation is sketched below.
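For reference, a short sketch of the MAPE computation assumed here (this is the standard definition; the array names are placeholders):

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))
```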

Results – Using Test Data
[Slide table: the output of the network on the test data; the values are not preserved in the transcript.]

Conclusion and Future Work
I used high-frequency intraday data for predicting volatility, with a neural network as the prediction method. Since I did not get favorable results in this case, I would take some help in solving the problem caused by overfitting of the data. I will also try to find a way to get better results using the ANN with a sigmoid activation function, and I would work out a better algorithm that can overcome the memory problem involved in using a large amount of data.

References
1. Andersen, T. and T. Bollerslev (1997). "Intraday periodicity and volatility persistence in financial markets". Journal of Empirical Finance 4.
2. Eugene Hol and Siem Jan Koopman. "Stock Index Volatility Forecasting with High Frequency Data". Tinbergen Institute Discussion Papers, Tinbergen Institute.
3. Investopedia.com.
4. JingTao Yao and Chew Lim Tan. "Guidelines for Financial Forecasting with Neural Networks". In Proceedings of the International Conference on Neural Information Processing, Shanghai, China.
5. Iebeling Kaastra and Milton S. Boyd. "Forecasting Futures Trading Volume Using Neural Networks". Journal of Futures Markets, 15(8), December.
6. P. Saratchandran, N. Sundararajan and Lu Ying Wei. "Radial Basis Function Neural Networks with Sequential Learning". World Scientific Publishing Co. Pte. Ltd., March.
7. Pierre Cizeau, Yanhui Liu, Martin Meyer, C.-K. Peng and H. Eugene Stanley. "Volatility distribution in the S&P 500 stock index". arXiv:cond-mat, August 1997.

8. Apostolos-Paul Refenes. "Neural Networks in the Capital Markets". John Wiley and Sons, London.
9. G. William Schwert. "Stock Market Volatility". Financial Analysts Journal, pages 23-34, May-June.
10. Yahoo Finance.

Thank You

Network Training
I have considered two types of network in this project:
– Radial Basis Function (RBF) network
– Artificial neural network with a sigmoid activation function