Computational Intelligence Winter Term 2011/12 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS 11) Fakultät für Informatik TU Dortmund

Plan for Today
● Application Fields of ANNs
  - Classification
  - Prediction
  - Function Approximation
● Radial Basis Function Nets (RBF Nets)
  - Model
  - Training
● Recurrent MLPs
  - Elman Nets
  - Jordan Nets

Application Fields of ANNs: Classification
given: a set of training patterns (input/output), where the output is a label (e.g. class A, class B, ...)
phase I: train the network; the inputs and outputs of the training patterns are known, and the weights are learnt
phase II: apply the trained network to unknown inputs for classification; its output is the guessed label

Application Fields of ANNs: Prediction of Time Series
time series $x_1, x_2, x_3, \ldots$ (e.g. temperatures, exchange rates, ...)
task: given a subset of historical data, predict the future
training patterns: historical data where the true output is known; error per pattern = deviation between predicted and true output (e.g. the squared error $(\hat{x}_t - x_t)^2$)
phase I: train the network as a predictor on the historical data
phase II: apply the network to the most recent historical inputs to predict unknown future values

Application Fields of ANNs: Prediction of Time Series, Example for Creating Training Data
given: time series 10.5, 3.4, 5.6, 2.4, 5.9, 8.4, 3.9, 4.4, 1.7
time window: k = 3
first input/output pair: known input (10.5, 3.4, 5.6), known output 2.4
further input/output pairs:
(3.4, 5.6, 2.4) → 5.9
(5.6, 2.4, 5.9) → 8.4
(2.4, 5.9, 8.4) → 3.9
(5.9, 8.4, 3.9) → 4.4
(8.4, 3.9, 4.4) → 1.7
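The same windowing scheme expressed as a minimal Python sketch (the function name and layout are illustrative, not part of the lecture): it slides a window of length k over the series and emits exactly the pairs listed above.

```python
def make_training_pairs(series, k):
    """Turn a time series into (input window, next value) training pairs."""
    return [(tuple(series[i:i + k]), series[i + k])
            for i in range(len(series) - k)]

series = [10.5, 3.4, 5.6, 2.4, 5.9, 8.4, 3.9, 4.4, 1.7]
for x, y in make_training_pairs(series, k=3):
    print(x, "->", y)   # e.g. (10.5, 3.4, 5.6) -> 2.4
```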

Application Fields of ANNs: Function Approximation (the general case)
task: given training patterns (input/output), approximate the unknown function
→ the net should give outputs close to the true unknown function for arbitrary inputs
values between training patterns are interpolated; values outside the convex hull of the training patterns are extrapolated
[figure: scattered input training patterns, one input pattern whose output is to be interpolated, and one input pattern whose output is to be extrapolated]

Radial Basis Function Nets (RBF Nets)
Definition: A function $\Phi: \mathbb{R}^n \to \mathbb{R}$ is termed a radial basis function iff $\exists\, \varphi: \mathbb{R} \to \mathbb{R}$ such that $\forall\, x \in \mathbb{R}^n: \Phi(x; c) = \varphi(\|x - c\|)$. □
Typically, $\|x\|$ denotes the Euclidean norm of the vector $x$.
examples (standard forms, up to scaling conventions):
- Gaussian: $\varphi(r) = e^{-r^2/\sigma^2}$ (unbounded support)
- Epanechnikov: $\varphi(r) = \max\{0,\, 1 - r^2/\sigma^2\}$ (bounded support)
- Cosine: $\varphi(r) = \cos\big(\tfrac{\pi r}{2\sigma}\big)$ for $r \le \sigma$, else $0$ (bounded support)
Definition: an RBF is local iff $\varphi(r) \to 0$ as $r \to \infty$. □ All three examples above are local.
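A small numpy sketch of these three basis functions, assuming the scaling conventions written above (textbooks differ in the constants, so treat sigma's role as an assumption):

```python
import numpy as np

def gaussian(r, sigma=1.0):
    return np.exp(-(r / sigma) ** 2)                  # unbounded support, local

def epanechnikov(r, sigma=1.0):
    return np.maximum(0.0, 1.0 - (r / sigma) ** 2)    # bounded support, local

def cosine(r, sigma=1.0):
    return np.where(r <= sigma,
                    np.cos(np.pi * r / (2 * sigma)),  # bounded support, local
                    0.0)

# Phi(x; c) = phi(||x - c||): the value depends only on the distance to c.
x, c = np.array([1.0, 2.0]), np.array([0.0, 0.0])
print(gaussian(np.linalg.norm(x - c)))
```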

Radial Basis Function Nets (RBF Nets)
Definition: A function $f: \mathbb{R}^n \to \mathbb{R}$ is termed a radial basis function net (RBF net) iff
$f(x) = w_1\, \varphi(\|x - c_1\|) + w_2\, \varphi(\|x - c_2\|) + \ldots + w_q\, \varphi(\|x - c_q\|)$. □
properties:
- layered net, 1st layer fully connected
- no weights in the 1st layer
- activation functions differ between the RBF neurons (each has its own centre $c_k$)
[figure: inputs $x_1, \ldots, x_n$ feed the q RBF neurons $\varphi(\|x - c_k\|)$, whose outputs are combined in a weighted sum with weights $w_1, \ldots, w_q$]
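A minimal sketch of this forward pass, assuming the Gaussian basis function and a shared radius sigma (both choices are illustrative):

```python
import numpy as np

def rbf_net(x, centers, weights, sigma=1.0):
    """f(x) = sum_k w_k * phi(||x - c_k||), with Gaussian phi assumed."""
    r = np.linalg.norm(centers - x, axis=1)        # distances ||x - c_k||
    return weights @ np.exp(-(r / sigma) ** 2)     # weighted sum of activations

centers = np.array([[0.0, 0.0], [1.0, 1.0]])       # c_1, c_2 (q = 2 neurons)
weights = np.array([0.5, -0.3])                    # w_1, w_2
print(rbf_net(np.array([0.5, 0.5]), centers, weights))
```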

Radial Basis Function Nets (RBF Nets)
given: N training patterns $(x_i, y_i)$ and q RBF neurons
find: weights $w_1, \ldots, w_q$ with minimal error
solution: we know that $f(x_i) = y_i$ should hold for $i = 1, \ldots, N$, i.e.
$\sum_{k=1}^{q} w_k\, \varphi(\|x_i - c_k\|) = y_i \quad (i = 1, \ldots, N)$
or equivalently, with the known values $p_{ik} = \varphi(\|x_i - c_k\|)$ and the unknown weights $w_k$:
$\sum_{k=1}^{q} p_{ik}\, w_k = y_i$
⇒ N linear equations with q unknowns

Radial Basis Function Nets (RBF Nets)
in matrix form: $P w = y$ with $P = (p_{ik})$, where $P: N \times q$, $y: N \times 1$, $w: q \times 1$
case N = q: $w = P^{-1} y$, if P has full rank
case N < q: many solutions, but of no practical relevance
case N > q: $w = P^{+} y$, where $P^{+}$ is the Moore-Penrose pseudo inverse:
$P w = y$  | multiply by $P'$ from the left hand side ($P'$ is the transpose of $P$)
$P' P w = P' y$  | multiply by $(P'P)^{-1}$ from the left hand side
$(P'P)^{-1} P' P w = (P'P)^{-1} P' y$  | $(P'P)^{-1} P'P$ simplifies to the unit matrix
$w = \underbrace{(P'P)^{-1} P'}_{=\,P^{+}} y$
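A sketch of the N > q case with made-up data; in practice numpy's pinv (or lstsq) is numerically safer than explicitly forming $(P'P)^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
N, q = 50, 5
P = rng.random((N, q))       # stands in for p_ik = phi(||x_i - c_k||)
y = rng.random(N)

w_explicit = np.linalg.inv(P.T @ P) @ P.T @ y   # w = (P'P)^{-1} P' y, as derived
w_pinv = np.linalg.pinv(P) @ y                  # w = P+ y via the pseudo inverse
print(np.allclose(w_explicit, w_pinv))          # same solution for full-rank P
```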

Radial Basis Function Nets (RBF Nets)
complexity (naive) of $w = (P'P)^{-1} P' y$:
- product $P'P$: $O(q^2 N)$
- inversion of the $q \times q$ matrix: $O(q^3)$
- product $P'y$: $O(qN)$
- final multiplication: $O(q^2)$
in total: $O(q^2 N)$, since $N > q$
remark: if N is large, then numerical inaccuracies in $P'P$ are likely
⇒ compute the analytic solution first, then continue with gradient descent starting from this solution; this requires differentiable basis functions!
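A hedged sketch of that refinement step: start from the analytic weights and run gradient descent on the squared error $E(w) = \|Pw - y\|^2$, whose gradient is $2P'(Pw - y)$. Only the weight update is shown; the same idea extends to centres and radii, which is why the basis functions must be differentiable. The step size is an assumed value that needs tuning.

```python
import numpy as np

rng = np.random.default_rng(1)
P, y = rng.random((50, 5)), rng.random(50)

w = np.linalg.pinv(P) @ y            # analytic starting point
eta = 1e-3                           # step size (assumed, needs tuning)
for _ in range(100):
    w -= eta * 2 * P.T @ (P @ w - y) # gradient step on ||P w - y||^2
```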

Radial Basis Function Nets (RBF Nets)
so far, we tacitly assumed that the RBF neurons are given
⇒ the centres $c_k$ and radii $\sigma$ were considered given and known
how to choose $c_k$ and $\sigma$?
- uniform covering of the input space with centres [figure: regular grid of centres over the training patterns]
- if the training patterns are inhomogeneously distributed, then first run a cluster analysis: choose the centre of a basis function from each cluster and use the cluster size for setting $\sigma$
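A sketch of the clustering idea with a bare-bones k-means (illustrative, not the lecture's prescribed algorithm): the cluster means become the centres $c_k$, and the mean within-cluster distance is one common heuristic for the radius $\sigma_k$.

```python
import numpy as np

def kmeans(X, q, iters=20, seed=0):
    """Plain k-means; assumes no cluster runs empty during the iterations."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), q, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - centers, axis=2), axis=1)
        centers = np.array([X[labels == k].mean(axis=0) for k in range(q)])
    return centers, labels

X = np.random.default_rng(2).random((200, 2))          # made-up training inputs
centers, labels = kmeans(X, q=4)
sigmas = np.array([np.linalg.norm(X[labels == k] - centers[k], axis=1).mean()
                   for k in range(4)])                 # cluster size -> sigma_k
```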

Radial Basis Function Nets (RBF Nets)
advantages:
- additional training patterns require only a local adjustment of the weights
- optimal weights are determinable in polynomial time
- regions not supported by the RBF net can be identified by zero outputs
disadvantages:
- the number of neurons increases exponentially with the input dimension
- unable to extrapolate, since there are no centres outside the training region and local RBFs decay to zero there

Recurrent MLPs
Jordan nets (1986)
Jordan net = MLP + one context neuron for each output; the context neurons are fully connected to the input layer
context neuron: reads the output of some neuron at step t and feeds the value back into the net at step t+1
[figure: MLP with inputs $x_1, x_2$ and outputs $y_1, y_2$; each output is fed back through a context neuron]
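A minimal sketch of one Jordan-net step, assuming a single hidden layer with tanh activation (the architecture details are illustrative): the previous output is appended to the current input via the context neurons.

```python
import numpy as np

def jordan_step(x, context, W_in, W_out):
    z = np.tanh(W_in @ np.concatenate([x, context]))  # hidden layer
    y = W_out @ z                                     # network output
    return y, y                                       # new context = output

rng = np.random.default_rng(3)
n_in, n_hid, n_out = 3, 5, 2
W_in = rng.standard_normal((n_hid, n_in + n_out))     # input + context weights
W_out = rng.standard_normal((n_out, n_hid))
context = np.zeros(n_out)
for x in rng.standard_normal((4, n_in)):              # a short input sequence
    y, context = jordan_step(x, context, W_in, W_out)
```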

Recurrent MLPs
Elman nets (1990)
Elman net = MLP + one context neuron for the output of each MLP neuron; the context neurons are fully connected to their associated MLP layer
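A sketch of one Elman-net step in the classic reduced variant where only the hidden layer gets context neurons (the slide's full version attaches context neurons to every layer): the previous hidden state is fed back into the hidden layer at the next step.

```python
import numpy as np

def elman_step(x, h_prev, W_in, W_ctx, W_out):
    h = np.tanh(W_in @ x + W_ctx @ h_prev)   # hidden state with feedback
    return W_out @ h, h                      # output and new context

rng = np.random.default_rng(4)
n_in, n_hid, n_out = 3, 5, 2
W_in = rng.standard_normal((n_hid, n_in))
W_ctx = rng.standard_normal((n_hid, n_hid))  # context-to-hidden weights
W_out = rng.standard_normal((n_out, n_hid))
h = np.zeros(n_hid)
for x in rng.standard_normal((4, n_in)):
    y, h = elman_step(x, h, W_in, W_ctx, W_out)
```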

Recurrent MLPs
Training?
⇒ unfolding in time (“loop unrolling“): identical MLPs are serially connected (finitely often); this results in a large MLP with many hidden (inner) layers
backpropagation may take a long time, but is reasonable if the most recent past is more important than layers far away
Why use backpropagation at all? ⇒ use Evolutionary Algorithms directly on the recurrent MLP! (later!)
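A sketch of what unfolding means structurally: the recurrent step is applied T times with the same shared weights, which is exactly a deep feed-forward net of T serially connected MLP copies. The toy step below is a stand-in; the elman_step above could be used with its weights bound.

```python
import numpy as np

def unrolled_forward(xs, h0, step):
    """Apply the same step function (same weights) once per time step."""
    h, ys = h0, []
    for x in xs:                      # one MLP copy per time step
        y, h = step(x, h)
        ys.append(y)
    return ys

toy_step = lambda x, h: (x + h.sum(), np.tanh(h + x))   # toy stand-in for one copy
print(unrolled_forward([1.0, 0.5, -0.2], np.zeros(3), toy_step))
```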