Presented By Dr. Paul Cottrell Company: Reykjavik.


Introduction
Cortical computing is defined as the use of neural networks to store and process data, similar to how the neocortex (gray matter) performs its activities.
Subcortical computing is defined as the use of primitive neural activity to store and process data, similar to how the limbic system (emotion) performs its activities.
Cortical and subcortical computing can be combined for algorithmic trading of financial markets.

Literature Review
Sparse distributed representations can code patterns from input signals (Numenta, 2011).
Hierarchical temporal memory can be utilized for pattern recognition and prediction (Numenta, 2011).
An artificial intelligence system can be developed utilizing neurological processes similar to the neural connections of the human brain (Hawkins and Blakeslee, 2004).
Artificial neural networks can be utilized for machine learning when using backpropagation (Rumelhart and McClelland, 1986).

Literature Review
Evolutionary algorithms became computationally feasible (Goldberg, 1989).
The Selfridge pandemonium is a pathway to higher semantic understanding of inputs (Minsky and Papert, 1969); it is similar to Numenta's hierarchical temporal memory architecture.
The use of a flexible probability between knowledge nodes can allow for pattern recognition and neural plasticity (Goertzel et al., 2012).

Methodology
Longitudinal study.
Independent variables: SDR variables (neurotransmitter charts, time, profit, trading type, etc.).
Dependent variable: performance relative to a buy-and-hold strategy.
Sample: West Texas Intermediate (WTI) from January 3, 2005 through April 21, 2015.
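The dependent variable above can be made concrete with a small sketch. This is not the study's code; the price series, position signals, and function names are illustrative assumptions showing how a strategy's return would be measured against the buy-and-hold benchmark.

```python
# Hypothetical sketch: comparing a strategy's compound return against a
# buy-and-hold benchmark. The prices and positions below are made up,
# not the WTI data from the study.

def buy_and_hold_return(prices):
    """Total return of holding from the first to the last observation."""
    return prices[-1] / prices[0] - 1.0

def strategy_return(prices, positions):
    """Compound return of a strategy; positions[i] is 1 (long), 0 (flat),
    or -1 (short) held over the interval from prices[i] to prices[i+1]."""
    equity = 1.0
    for i in range(len(prices) - 1):
        period_return = prices[i + 1] / prices[i] - 1.0
        equity *= 1.0 + positions[i] * period_return
    return equity - 1.0

prices = [50.0, 52.0, 49.0, 55.0, 53.0]   # illustrative daily closes
positions = [1, 0, 1, -1]                 # illustrative trading signals
bh = buy_and_hold_return(prices)
strat = strategy_return(prices, positions)
print(f"buy-and-hold: {bh:+.2%}, strategy: {strat:+.2%}")
```

A strategy "performs better than buy and hold" in this framing when `strategy_return` exceeds `buy_and_hold_return` over the same sample.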

The Dataset
Time series of daily closings from January 3, 2005 through April 21, 2015.
Source: Federal Reserve Economic Data (FRED).

New Cognitive Architecture
[Diagram: Algo #n chooses buy/sell/hold; the resulting profit/loss feeds back as a +/- association into the neurotransmitter chart and the SDR* data.]
*Sparse Distributed Representations (SDR)
Theoretical framework: pattern recognition via emotional states and trial and error.

Neurotransmitters
Neurotransmitter Chart

Association   Value Range
Change        0 or 1
Stay          0 or 1
Fear          1-99
Danger        1-99
Surprise      1-99
Trauma        1-99
Protective    1-10 (1 = high risk aversion)
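The chart above can be sketched as a small data structure. The class and method names are assumptions; only the association names and value ranges come from the slide's table.

```python
# Minimal sketch of the neurotransmitter chart; enforces the value
# ranges listed on the slide.

class NeurotransmitterChart:
    # (low, high) ranges taken from the slide's table
    RANGES = {
        "change": (0, 1), "stay": (0, 1),
        "fear": (1, 99), "danger": (1, 99),
        "surprise": (1, 99), "trauma": (1, 99),
        "protective": (1, 10),   # 1 = high risk aversion
    }

    def __init__(self, **levels):
        self.levels = {}
        for name, value in levels.items():
            self.set(name, value)

    def set(self, name, value):
        low, high = self.RANGES[name]
        if not low <= value <= high:
            raise ValueError(f"{name} must be in [{low}, {high}]")
        self.levels[name] = value

chart = NeurotransmitterChart(fear=40, danger=10, protective=3, change=1)
print(chart.levels)
```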

SDR Data
Each SDR is stored in a data structure containing:
Node name
+/- association of the result on account value
Behavior (buy/sell/hold)
Neurotransmitter chart (emotional state)
Time stamp
Data from the algorithm: delta of forecast and market price; variable values and parameters for the algorithm
The SDR keeps a complete history; fresher SDR data is favored when referencing decisions.
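The fields listed above can be sketched as one record type. Field names are assumptions, and the exponential recency weight is one plausible reading of "fresher SDR data used in referencing decisions", not the study's actual rule.

```python
# Sketch of one SDR record with the fields the slide lists, plus an
# illustrative recency weight for favoring fresher records.

from dataclasses import dataclass, field
import time

@dataclass
class SDRRecord:
    node_name: str
    association: int          # +1 / -1 effect of the result on account value
    behavior: str             # "buy", "sell", or "hold"
    neurotransmitters: dict   # emotional state, per the neurotransmitter chart
    forecast_delta: float     # delta of forecast and market price
    parameters: dict          # variable values and parameters for the algorithm
    timestamp: float = field(default_factory=time.time)

def recency_weight(record, now, half_life=86400.0):
    """Exponentially down-weight older records (illustrative choice:
    weight halves every `half_life` seconds)."""
    age = now - record.timestamp
    return 0.5 ** (age / half_life)
```

Under this weighting, a record one half-life old counts half as much as a brand-new one when the system references past decisions.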

Artificial Intelligence - EEG

SDR Slice

Oil Trading Performance
The AI performs better than a buy-and-hold strategy.

Conclusion
When trading the oil market, an artificial intelligence system utilizing cortical and subcortical computing can perform better than a buy-and-hold strategy between January 3, 2005 and April 21, 2015.
A cortical and subcortical algorithmic system can recognize patterns in a dataset, and therefore can be utilized to make automatic adjustments against an objective function.
The computer-code overhead is not substantial for trading individual markets.
Adding cortical tissue layers seems to allow for more semantic understanding of the environment and better trading decisions.

References
Goertzel, B., Pennachin, C., & Geisweiller, N. (2012). Building better minds: Artificial general intelligence via the CogPrime architecture.
Goldberg, D. E. (1989). Genetic algorithms in search, optimization, and machine learning. Reading, MA: Addison-Wesley.
Hawkins, J., & Blakeslee, S. (2004). On intelligence: How a new understanding of the brain will lead to the creation of truly intelligent machines. New York, NY: Henry Holt and Company.
Minsky, M., & Papert, S. (1969). Perceptrons: An introduction to computational geometry. Cambridge, MA: MIT Press.
Numenta (2011). Hierarchical temporal memory (white paper).
Rumelhart, D., & McClelland, J. (1986). Parallel distributed processing. Cambridge, MA: MIT Press.

Contact Information and Publications
Dr. Paul Cottrell (Twitter)
Paul Cottrell (YouTube)