Prediction of a nonlinear time series with feedforward neural networks
Mats Nikus, Process Control Laboratory

The time series

A closer look

Another look

Studying the time series Some features seem to repeat themselves over and over, but not entirely "deterministically". Let's study the autocovariance function.

The autocovariance function
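As a sketch (not from the original slides), the sample autocovariance could be estimated in Python along these lines, assuming the series is stored in a NumPy array y (a hypothetical variable name):

    import numpy as np

    def autocovariance(y, max_lag):
        """Biased sample autocovariance of y for lags 0..max_lag."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        yc = y - y.mean()          # remove the mean first
        return np.array([np.sum(yc[:n - k] * yc[k:]) / n
                         for k in range(max_lag + 1)])

Peaks at nonzero lags indicate that past values carry information about future ones, i.e. that there are dynamics worth modelling.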

Studying the time series The autocovariance function tells the same story: there are certainly some dynamics in the data. Let's now make a phase plot of the data. In a phase plot the signal is plotted against itself with some lag. With one lag we get:

Phase plot

3D phase plot
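A sketch of how such phase plots can be drawn (the plotting code is mine, not the slides'; y is again the assumed series array):

    import matplotlib.pyplot as plt

    # 2D phase plot: y(k) against y(k-1)
    plt.figure()
    plt.plot(y[1:], y[:-1], '.', markersize=2)
    plt.xlabel('y(k)'); plt.ylabel('y(k-1)')

    # 3D phase plot: y(k) against y(k-1) and y(k-2)
    ax = plt.figure().add_subplot(projection='3d')
    ax.scatter(y[2:], y[1:-1], y[:-2], s=2)
    ax.set_xlabel('y(k)'); ax.set_ylabel('y(k-1)'); ax.set_zlabel('y(k-2)')
    plt.show()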

The phase plots tell us to use two lagged values. The first lagged value describes a parabola. Let's build a neural network for predicting the time series based on these findings.

The neural network The network takes y(k) and y(k-1) as inputs and produces the prediction ŷ(k+1). Let's try 3 hidden nodes: 2 for the "parabola" and one for the "rest".
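A minimal sketch of such a network, here with scikit-learn's MLPRegressor (the library choice and the train/test split point are my assumptions, not from the slides); inputs are y(k) and y(k-1), the target is y(k+1):

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Regression matrix from the lagged series: rows are [y(k), y(k-1)]
    X = np.column_stack([y[1:-1], y[:-2]])
    t = y[2:]                                 # targets y(k+1)

    net = MLPRegressor(hidden_layer_sizes=(3,), activation='tanh',
                       solver='lbfgs', max_iter=5000, random_state=0)
    net.fit(X[:1500], t[:1500])               # train on the first part
    residuals = t[1500:] - net.predict(X[1500:])   # test residuals

With only 3 tanh (sigmoidal) hidden units, the batch lbfgs solver typically converges quickly and reliably.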

Prediction results

Residuals (on test data)

A more difficult case If the time series is time-variant (i.e. its dynamic behaviour changes over time) and the measurement data are noisy, the prediction task becomes more challenging.

Phase plot for a noisy, time-variant case

Residuals with the model

Use a Kalman filter to update the weights We can improve the predictions by using a Kalman filter. Assume that the process we want to predict is described by:
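The model equations are an image in the original slides and did not survive into the transcript; a standard formulation, assumed here, treats the network weights as the state of a random-walk process:

    w(k+1) = w(k) + v(k)
    y(k)   = g(x(k), w(k)) + e(k)

where g is the network mapping, x(k) collects the lagged inputs, and v(k) and e(k) are process and measurement noise.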

Kalman filter Use the following recursive equations. The gradient needed in C(k) is fairly simple to calculate for a sigmoidal network.
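The recursive equations are likewise an image in the original slides; the usual extended-Kalman-filter form of such a weight update (my reconstruction, not verbatim from the slides) is:

    C(k) = dg(x(k), w)/dw  evaluated at w = w(k-1)
    K(k) = P(k-1) C(k)^T [ C(k) P(k-1) C(k)^T + R ]^(-1)
    w(k) = w(k-1) + K(k) [ y(k) - g(x(k), w(k-1)) ]
    P(k) = P(k-1) - K(k) C(k) P(k-1) + Q

with P the weight covariance, R the measurement-noise variance and Q the process-noise covariance. Since C(k) is just the gradient of the network output with respect to the weights, it is cheap to compute for a sigmoidal network.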

Residuals

Neural network parameters

Hénon series The time series is actually described by:
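The defining equation is an image in the original slides; the Hénon map in its delayed scalar form, with the commonly used parameter values, reads:

    y(k+1) = 1 - a y(k)^2 + b y(k-1),    a = 1.4, b = 0.3

The quadratic term in y(k) explains the parabola seen in the phase plot. A small generator sketch:

    import numpy as np

    def henon_series(n, a=1.4, b=0.3):
        """Generate n points of the Henon map in delayed scalar form."""
        y = np.zeros(n)
        y[0], y[1] = 0.1, 0.1          # arbitrary start near the attractor
        for k in range(1, n - 1):
            y[k + 1] = 1.0 - a * y[k]**2 + b * y[k - 1]
        return y

    y = henon_series(2000)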