Modeling of Dynamic Systems (Dünaamiliste süsteemide modelleerimine): Identification for control in a nonlinear system world. Eduard Petlenkov, Automaatikainstituut, 2013.



Control Design (block diagram): Nominal Model, Regulator. © Lennart Ljung

Control Design (block diagram): True System, Regulator. © Lennart Ljung

Control Design (block diagram): Nominal Model, Regulator, Model error model. © Lennart Ljung

Control Design (block diagram): Nominal Model, Regulator, Model error model, True system. © Lennart Ljung

Control Design (block diagram): Nominal Model, Regulator, Model error model, Nominal closed loop system. © Lennart Ljung

Model Error Models: ε = y − y_model, driven by the input u. © Lennart Ljung
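In code, the residual that drives the model error model is simply the difference between the measured output and the model output. A minimal sketch with hypothetical data:

```python
# Model error: eps = y - y_model, the residual the model error model
# must account for. Measured and simulated outputs are hypothetical.
y       = [1.00, 1.50, 1.90, 2.20]   # measured system output
y_model = [0.90, 1.45, 2.00, 2.20]   # nominal model output

eps = [yi - ymi for yi, ymi in zip(y, y_model)]
print(eps)  # residual sequence
```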

Identification for Control: identification for control is the art and technique of designing identification experiments and regulator design methods so that the model error model matches the nominal closed loop system in a suitable way. © Lennart Ljung

A Data Set (plot): input and output signals of a nonlinear process. © Lennart Ljung

Feedback Neural Networks. Identification of dynamic systems (1)
External feedback: past outputs of the system (or of the model itself) are fed back to the network input, e.g. ŷ(k) = f(y(k−1), …, y(k−n), u(k−1), …, u(k−m)).
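A minimal sketch of external feedback, where a placeholder function stands in for the trained network and the model's own past outputs are fed back as inputs (the orders n, m and the linear stand-in are hypothetical):

```python
# External feedback (NARX-style): the model's past outputs re-enter
# as inputs for the next prediction. f() stands for the trained
# network; a hypothetical linear map is used so the sketch runs.
def f(y_hist, u_hist):
    return 0.5 * y_hist[-1] + 0.3 * u_hist[-1]

def simulate(u, n=2, m=2, y0=0.0):
    y = [y0] * n                      # initial conditions
    for k in range(n, len(u)):
        y.append(f(y[k - n:k], u[k - m:k]))
    return y

u = [1.0] * 10                        # step input
y = simulate(u)
print(y[-1])                          # approaches 0.3 / (1 - 0.5) = 0.6
```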

Feedback Neural Networks. Identification of dynamic systems (2)
Elman network — internal feedback: the recurrent layer's previous activations are fed back to its own input, where r is the number of neurons in the recurrent layer.
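The internal feedback of an Elman network can be sketched as context units that store the recurrent layer's previous activations; the weights below are hypothetical placeholders, not trained values:

```python
import math

# Elman network forward pass (internal feedback): the hidden layer's
# previous activations are copied to context units and fed back.
# All weights are hypothetical fixed values to make the sketch run.
r = 3                                    # neurons in the recurrent layer
W_in  = [[0.2], [0.1], [-0.3]]           # input -> hidden (1 input)
W_ctx = [[0.1] * r for _ in range(r)]    # context -> hidden
W_out = [0.5, -0.4, 0.3]                 # hidden -> output

def step(u, context):
    hidden = []
    for i in range(r):
        s = W_in[i][0] * u + sum(W_ctx[i][j] * context[j] for j in range(r))
        hidden.append(math.tanh(s))
    y = sum(W_out[i] * hidden[i] for i in range(r))
    return y, hidden                     # new context = current hidden state

context = [0.0] * r
for u in [1.0, 1.0, 1.0]:
    y, context = step(u, context)
print(round(y, 4))
```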

Identification with ANNs (1): u is the input of the system, y is the output of the system, and ŷ is the output of the model.

Identification with ANNs (2)

Identification with ANNs (3)
1. Getting experimental training data: an input signal is applied to the system and the corresponding output reaction is measured; the obtained input–output data is used to train the NN-based model;
2. Choosing the NN structure: number of inputs, outputs, hidden layers and neurons; activation functions;
3. Choosing the initial weighting coefficients (usually random).

Identification with ANNs (4)
4. Calculation of NN outputs corresponding to the training input set;
5. Comparison of the outputs with the reference outputs from the training set: calculation of the model error;
6. Calculation of updated weighting coefficients using the selected training algorithm.
Steps 4–6 are repeated until a predefined number of iterations (epochs) is reached or until the desired accuracy is achieved.
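Steps 4–6 can be sketched as a plain gradient-descent loop; a single sigmoid neuron stands in for the full network, and the training data and learning rate are hypothetical:

```python
import math

# Steps 4-6 as a minimal training loop: forward pass, model error,
# weight update. A single sigmoid neuron stands in for the network;
# the training set and learning rate are hypothetical.
X = [0.0, 0.25, 0.5, 0.75, 1.0]        # training inputs
T = [0.1, 0.3, 0.5, 0.7, 0.9]          # reference (desired) outputs
w, b, lr = 0.0, 0.0, 1.0

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

for epoch in range(5000):              # repeat until epoch limit reached
    for x, t in zip(X, T):
        y = sigmoid(w * x + b)         # step 4: compute NN output
        e = t - y                      # step 5: model error
        g = e * y * (1.0 - y)          # delta rule (sigmoid derivative)
        w += lr * g * x                # step 6: update weights
        b += lr * g

mse = sum((t - sigmoid(w * x + b)) ** 2 for x, t in zip(X, T)) / len(X)
print(mse)
```

In practice the loop would also stop early once the error falls below the desired accuracy, as in step 7 of the slides.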

NN-based control of dynamic systems (Dünaamiliste süsteemide juhtimine tehisnärvivõrkudega)

Predictive Control

Inverse model

Inverse model

Inverse model based control. Here the identified inverse model is connected in series with the system, so that ideally the mapping from reference to output becomes the identity and the output tracks the reference.
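A minimal sketch of the idea, using a hypothetical static plant y = u³ and its exact analytic inverse in place of a trained NN inverse model:

```python
import math

# Inverse model based control: the controller is the plant's inverse,
# so plant(inverse(r)) = r. Hypothetical static nonlinear plant.
def plant(u):
    return u ** 3

def inverse_model(r):
    # exact inverse here; in practice a NN trained on (y, u) pairs
    return math.copysign(abs(r) ** (1.0 / 3.0), r)

for r in [0.5, 1.0, 2.0, -1.5]:        # reference values
    u = inverse_model(r)               # control action
    y = plant(u)                       # plant output tracks the reference
    print(r, round(y, 6))
```

With a real dynamic plant the inverse model must also account for past inputs and outputs, which is where the NN-based inverse identification comes in.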

Adaptive inverse model based control

Adaptive Predictive Control

NN-based pattern recognition and classification (Mustrite tuvastamine ja klassifitseerimine tehisnärvivõrkudega)

Self-organizing networks
A self-organizing network is capable of adjusting its parameters on the basis of its inputs and a chosen optimization criterion alone; it does not have any reference outputs. A classical example is the Kohonen map.
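A sketch of the Kohonen map's unsupervised update, finding the best-matching unit and pulling it and its neighbors toward the input; the map size, data and learning parameters are hypothetical:

```python
import random

# Kohonen self-organizing map (1-D) on scalar inputs: no reference
# outputs; the weights organize themselves from the inputs alone.
random.seed(0)
weights = [random.random() for _ in range(5)]   # 1-D map, 5 units

def train(data, epochs=100, lr=0.2, radius=1):
    for _ in range(epochs):
        for x in data:
            # best-matching unit: weight closest to the input
            bmu = min(range(len(weights)), key=lambda i: abs(weights[i] - x))
            for i in range(len(weights)):       # update BMU and neighbors
                if abs(i - bmu) <= radius:
                    weights[i] += lr * (x - weights[i])

data = [0.05, 0.1, 0.5, 0.55, 0.9, 0.95]        # three input clusters
train(data)
print([round(w, 2) for w in weights])
```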

Example 1: Character recognition (1)

Example 1: Character recognition (2)

Example 2: Number recognition

Example 2: Number recognition – NN structure

Etalon_1=[ ]'; Etalon_2=[ ]'; Etalon_3=[ ]'; Etalon_4=[ ]'; Etalon_5=[ ]';
Etalon_6=[ ]'; Etalon_7=[ ]'; Etalon_8=[ ]'; Etalon_9=[ ]'; Etalon_0=[ ]';
net = newff(minmax(P), [25 10], {'logsig' 'logsig'}, 'traingda');
net.trainParam.goal = 0;
net.trainParam.epochs = 10000;
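For readers without MATLAB, the structure that newff call creates (25 logsig hidden neurons, 10 logsig outputs, one per digit) can be sketched in Python; the weights are random placeholders rather than traingda-trained values, and the 35-element input bitmap size is a hypothetical choice:

```python
import math, random

# Structure of the newff network above: input pattern -> 25 logsig
# hidden neurons -> 10 logsig outputs (one per digit); the recognized
# digit is the index of the largest output. Weights are random
# placeholders (the MATLAB code trains them with traingda).
random.seed(1)
n_in, n_hid, n_out = 35, 25, 10          # 35 = hypothetical 7x5 bitmap
W1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
W2 = [[random.uniform(-1, 1) for _ in range(n_hid)] for _ in range(n_out)]

def logsig(s):
    return 1.0 / (1.0 + math.exp(-s))

def forward(p):
    h = [logsig(sum(w * x for w, x in zip(row, p))) for row in W1]
    o = [logsig(sum(w * x for w, x in zip(row, h))) for row in W2]
    return o

pattern = [random.randint(0, 1) for _ in range(n_in)]  # fake bitmap
outputs = forward(pattern)
digit = outputs.index(max(outputs))      # winning output = recognized digit
print(digit, round(max(outputs), 3))
```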

Example 2: Number recognition – testing on the training set

Example 2: Number recognition – testing on the validation set
[Table: network output values (0–1) for each validation-set digit; the largest output in each row indicates the recognized digit. Numeric layout lost in transcription.]

Databases for benchmark problems:
- MNIST database of handwritten digits: 28x28 px, labeled training and test images
- Semeion database of handwritten digits from around 80 persons: 16x16 px