An overview of Reservoir Computing: theory, applications and implementations
Benjamin Schrauwen, David Verstraeten and Jan Van Campenhout
Electronics and Information Systems Department, Ghent University – Belgium
April – ESANN

2/19 Intro
In ML and pattern recognition: mostly feed-forward structures (neural networks, Bayesian models, kernel methods, …)
- Well understood, but non-temporal
Many applications lie in the temporal domain:
- Time series prediction, financial data
- Dynamic systems and control
- Robotics
- Vision, speech, …
Two ways in: an explicit embedding (Takens' theorem), or introducing recurrence (loopy belief propagation, RNNs)

3/19 Intro
Recurrent Neural Networks:
- Hopfield (1982): specific topologies with symmetric weights; information stored in attractors
- Werbos (1974): BackProp Through Time (and all its improvements)
  - Problem of vanishing gradients, mathematically difficult
  - Few applications, difficult to master
- Special topologies: LSTM (Schmidhuber)
- RNNs are universal approximators (ESANN special session, 2005)

4/19 Intro
Recurrent structures without the training: Reservoir Computing
- Early related work by Buonomano (1995) and Lourenco (1994)
- Independently discovered:
  - Jaeger (2001): Echo State Networks (engineering)
  - Maass (2002): Liquid State Machines (neuroscience)
  - Shortly afterwards, Steil (2003): the weight dynamics of Atiya–Parlos learning are equivalent
- RC: a fixed (random) topology operated in the correct dynamic regime, with a linear "readout" function which is trained

5/19 Reservoir Computing

6/19 Reservoir Computing
Properties of the reservoir:
- Exact topology, connectivity and weights: not important
- Has to have fading memory: the case when it is not chaotic
- Longest memory at the edge of stability: memory capacity = number of nodes
- Reservoir size can be large: no over-fitting
Training with linear regression (pseudo-inverse, ridge regression; sketched below):
- No local minima, no problems with the recurrent structure, one-shot learning
- Can do regression, classification and prediction
- On-line learning is also possible, with LMS and RLS
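To make the training step concrete: a minimal sketch of one-shot readout training, assuming the reservoir states have already been recorded in a matrix X and the desired outputs in Y. The function name and the NumPy formulation are illustrative, not taken from the slides.

```python
import numpy as np

def train_readout(X, Y, ridge=1e-6):
    """One-shot linear readout training via ridge regression.

    X: (timesteps, reservoir_size) recorded reservoir states
    Y: (timesteps, n_outputs) desired outputs
    Returns readout weights of shape (reservoir_size, n_outputs).
    """
    n = X.shape[1]
    # Closed-form solution of  min_W ||X W - Y||^2 + ridge ||W||^2 :
    # a single linear solve, hence no local minima and no gradients
    # propagated through the recurrent structure.
    return np.linalg.solve(X.T @ X + ridge * np.eye(n), X.T @ Y)
```

Setting ridge to zero recovers the plain pseudo-inverse solution; a small positive value gives the ridge-regression regularizer mentioned on the slide.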

7/19 Reservoir Computing
- RC does on-line computing: a prediction at every time-step
- Theoretically: any time-invariant filter with fading memory can be learned
- But: unable to implement generic FSMs
- Recently, Maass (2006): when adding output feedback (see the sketch below)
  - also non-fading-memory filters: generic FSMs
  - the ability to simulate any n-th order dynamical system
  - Turing universal
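A minimal sketch of what running with output feedback looks like, assuming a tanh reservoir and an already-trained readout; the feedback matrix W_fb, the function name and all shapes are illustrative assumptions, not from the talk.

```python
import numpy as np

def free_run(W, W_fb, W_out, x0, n_steps):
    """Run the reservoir autonomously with output feedback.

    W: (n, n) reservoir weights, W_fb: (n, n_out) feedback weights,
    W_out: (n, n_out) trained readout weights, x0: (n,) initial state.
    """
    x = x0.copy()
    y = x @ W_out                 # initial output, shape (n_out,)
    outputs = []
    for _ in range(n_steps):
        # Feeding y back closes a loop around the trained readout, so
        # the memory no longer fades: the network can hold a state
        # indefinitely, which is what enables FSM-like behaviour.
        x = np.tanh(W @ x + W_fb @ y)
        y = x @ W_out
        outputs.append(y)
    return np.array(outputs)
```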

8/19 RC: tone generation example (taken from H. Jaeger)

9/19 Usual setup and training
1. Create random weight matrices
2. Rescale the reservoir weights so that the maximum absolute eigenvalue is close to one (edge of stability)
3. Excite the reservoir with the input and record all states
4. Train the readouts by minimizing ||Aw − b||² (sketched below)
[figure: state matrix A (space × time), readout weights w, targets b]
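The four steps translate almost directly into code. A minimal NumPy sketch; the sizes, the tanh nonlinearity, the target spectral radius of 0.95 and the toy one-step-memory task are illustrative choices, not prescribed by the slide.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in, n_steps = 100, 1, 500

# 1. Create random weight matrices.
W = rng.standard_normal((n_res, n_res))
W_in = rng.standard_normal((n_res, n_in))

# 2. Rescale so the largest absolute eigenvalue is close to one
#    (edge of stability).
W *= 0.95 / np.max(np.abs(np.linalg.eigvals(W)))

# 3. Excite the reservoir with the input and record all states.
u = rng.standard_normal((n_steps, n_in))   # example input signal
A = np.zeros((n_steps, n_res))             # state matrix: time x space
x = np.zeros(n_res)
for t in range(n_steps):
    x = np.tanh(W @ x + W_in @ u[t])
    A[t] = x

# 4. Train the readout by minimizing ||A w - b||^2, here with the
#    pseudo-inverse; b is a toy target (reproduce the previous input).
b = np.roll(u, 1, axis=0)
w, *_ = np.linalg.lstsq(A, b, rcond=None)
```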

10/19 Influence of parameters
Not very important: connection fraction, exact topology, weight distribution
[plots: error vs. timescale; error vs. dynamic regime (towards chaos); error vs. reservoir size]

11/19 A useful ESANN analogy
[diagram: input → reservoir → reservoir state → compute output]

12/19 State space view

13/19 Link to kernel machines
[figure: points in a 2-D input space (x, y) mapped by a kernel into a higher-dimensional feature space (x', y', z')]
- Kernel: projection of the input space into a high-dimensional feature space
- Conventional methods rely on the "kernel trick" to avoid explicitly going to feature space
- Reservoir computing works in feature space, but the reservoir state also contains temporal information: a temporal-to-spatial transformation
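The contrast can be written out explicitly. In common echo state network notation (an assumption here; the slide fixes no notation), the reservoir implements an explicit, temporal feature map on which the readout is linear:

```latex
x(t) = f\bigl(W\,x(t-1) + W_{\mathrm{in}}\,u(t)\bigr), \qquad
y(t) = W_{\mathrm{out}}\,x(t)
```

The state x(t) is the explicitly computed feature-space image of the whole input history u(1), …, u(t); kernel machines, by contrast, only ever evaluate inner products in such a space.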

14/19 Link to FSMs
[figure: an FSM side by side with RC using output feedback]

15/19 RC: Applications
- Chaotic time series prediction: an order of magnitude better than the state of the art
- Speech recognition on a small vocabulary: outperforms an HMM-based recognizer (Sphinx)
- Digit recognition: better than the state of the art
- Robot control
- System identification
- Noise removal/modelling
- …

16/19 Larger example: speech
[pipeline: speech → pre-processing (with downsampling) → reservoir → linear readouts → post-processing: the readout outputs are averaged over time (mean) and a winner-take-all (WTA) stage picks the recognized digit, e.g. '6']
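A sketch of that post-processing stage, assuming the trained readout emits one output channel per digit class at every time-step; the helper name is hypothetical, not from the talk.

```python
import numpy as np

def classify_utterance(Y):
    """Temporal mean followed by winner-take-all (WTA).

    Y: (timesteps, n_classes) readout outputs for one utterance,
       one channel per digit class.
    Returns the index of the recognized digit.
    """
    class_scores = Y.mean(axis=0)        # average each channel over time
    return int(np.argmax(class_scores))  # winner-take-all decision
```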

17/19 RC: a novel computing paradigm
RC presents a novel way of looking at computation: "random" dynamic systems can be used by training only a linear readout layer
- RC has already been used to show the general computing capabilities of:
  - the microcolumn structure in the cortex
  - a gene regulatory network
  - the visual cortex of a real cat
- Implementations:
  - a toolbox (freely available online)
  - "bucket of water", aVLSI, digital hardware
  - photonics (in progress)

18/19 Current research topics
- Theoretical: a proper understanding of the importance of the dynamics; regularisation
- Reservoir optimisation:
  - intrinsic plasticity: unsupervised reservoir adaptation, based on infomax, to set the dynamic regime
  - timescales
  - modular reservoirs
- The generic reservoir idea
- Applications

19/19 This session
- Reservoir optimisation
  - intrinsic plasticity: Steil; Verstraeten et al.; Wardermann et al.
  - reservoir pruning: Dutoit et al.
- Alternate reservoir ideas: Gao et al.; Lourenco