Soft Computing Colloquium 2: Selection of neural networks, hybrid neural networks.

Presentation transcript:

Soft Computing Colloquium 2: Selection of neural networks, hybrid neural networks.

Objectives
– Why are there so many models of neural networks (NN)?
– Classes of tasks and classes of NN
– Hybrid neural networks
– A hybrid model based on MLP and ART-2
– Paths to improving neural networks

Questions submitted for discussion. Paths to improving neural networks:
– Development of growing neural networks with feedback and delays
– Development of the theory of spiking neurons and construction of associative memory based on them
– Development of neural networks in which logical (verbal) inference emerges from associative memory during learning

Why are there so many models of neural networks (NN)? Models of neural networks simulate separate aspects of how the brain works (e.g. associative memory), but how the brain works as a whole is unknown to us. Questions: 1) What is consciousness? 2) What is the role of emotions? 3) How are different areas of the brain coordinated? 4) How are associative links transformed and used in logical inference and calculations?

Classes of tasks (neuromathematics): prediction, classification, data association, data conceptualization, data filtering.

Classes of Neural Networks:
– Multi-layer networks
  – Multi-Layer Perceptron (MLP): supervised learning
  – Radial Basis Function networks (RBF networks): supervised learning
  – Recurrent Neural Networks (Elman, Jordan): supervised learning, reinforcement learning
  – Counterpropagation network: supervised learning
– One-layer networks
  – Self-Organizing Map (SOM): unsupervised learning
  – Adaptive Resonance Theory (ART): unsupervised learning
  – Hamming network: supervised learning
– Fully interconnected networks
  – Hopfield network: supervised learning
  – Boltzmann machine: supervised learning
  – Bi-directional associative memory: supervised learning
– Spiking networks: supervised learning, unsupervised learning, reinforcement learning

Counterpropagation network

Network Selector Table

Hybrid Neural Networks. A hybrid neural network includes:
– a main neural network
– other neural networks for preprocessing and postprocessing
Some models of neural networks consist of several layers working in different ways, so such networks may also be viewed as hybrid neural networks (composed of more elementary networks). Some authors call "hybrid neural networks" those models which combine the paradigms of neural networks and knowledge engineering.

Hybrid Neural Network based on the models of the Multi-Layer Perceptron and Adaptive Resonance Theory (A. Gavrilov, 2005). Aims:
– to keep the capabilities of ART (plasticity and stability)
– to add to ART the capability of the MLP to obtain, during learning, complex secondary features from primary features (to approximate any function)

Disadvantages of the ART-2 model for image recognition:
– It uses a metric on the primary features of images to recognize a class or to create a new class
– Transformations of graphic images (shift, rotation and others) strongly influence the distance between input vectors
– So it is unsuitable for the control system of a mobile robot

Architecture of the hybrid neural network (figure): input vector (x1 … xn) → input layer (input variables) → hidden layer of the perceptron → output layer of the perceptron, which is also the input layer of ART-2 (y1 … ym) → output layer: clusters → output vector.
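The sketch below (Python with NumPy) shows one way to read this architecture in code: a one-hidden-layer perceptron whose output vector serves as the input vector of an ART-2-like clustering layer. The class and method names, the weight initialization, the Euclidean distance metric, and the exact form of the rational sigmoid are illustrative assumptions, not taken from the original presentation.

```python
import numpy as np

def rational_sigmoid(s, a=1.0):
    # Rational sigmoid, assumed here as f(s) = s / (a + |s|)
    return s / (a + np.abs(s))

class HybridMLPART:
    """Sketch of the MLP + ART-2 hybrid: a one-hidden-layer perceptron
    whose output layer doubles as the input layer of an ART-2-like clusterer."""

    def __init__(self, n_in, n_hidden, n_out, radius):
        rng = np.random.default_rng(0)
        self.W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input -> hidden
        self.W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))  # hidden -> output (ART-2 input)
        self.clusters = []          # weight vectors of the cluster (output) neurons
        self.radius = radius        # cluster radius R (vigilance)

    def perceptron_forward(self, x):
        h = rational_sigmoid(self.W1 @ x)
        y = rational_sigmoid(self.W2 @ h)
        return h, y                 # y is fed to the ART-2 part

    def nearest_cluster(self, y):
        # Winner selection: minimal (here Euclidean) distance to stored cluster weights
        if not self.clusters:
            return None, np.inf
        d = [np.linalg.norm(y - c) for c in self.clusters]
        k = int(np.argmin(d))
        return k, d[k]
```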

Algorithm of learning without a teacher:
1. Set the initial weights of the neurons; N_out := 0.
2. Input an example image and calculate the outputs of the perceptron.
3. If N_out = 0, then form a new cluster (output neuron).
4. If N_out > 0, then calculate the distances between the weight vectors of ART-2 and the output vector of the perceptron, select the minimum of them (selection of the winner output neuron), and decide whether or not to create a new cluster.
5. If a new cluster is not created, then calculate new values of the weights of the winner output neuron and calculate new weights of the perceptron with the "error back propagation" algorithm.
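A minimal sketch of this learning loop, reusing the hypothetical HybridMLPART class from the previous sketch. The slide does not state the target used for error back propagation or the cluster learning rate; here the winner cluster's weight vector is assumed as the desired output, and lr_cluster is an assumed parameter.

```python
def backprop_step(net, x, target, lr=1.0, a=1.0):
    # One "error back propagation" step for the two-layer perceptron
    # (learning step 1 and rational sigmoid with a = 1, as in the experiments below).
    s1 = net.W1 @ x
    h = rational_sigmoid(s1, a)
    s2 = net.W2 @ h
    y = rational_sigmoid(s2, a)
    # Derivative of f(s) = s / (a + |s|) is a / (a + |s|)^2
    d2 = (y - target) * a / (a + np.abs(s2)) ** 2
    d1 = (net.W2.T @ d2) * a / (a + np.abs(s1)) ** 2
    net.W2 -= lr * np.outer(d2, h)
    net.W1 -= lr * np.outer(d1, x)

def learn_without_teacher(net, images, lr_cluster=0.5, bp_iterations=1):
    # Follows the steps on the slide: start with no clusters (N_out = 0), then for
    # each image either create a new cluster or adapt the winner and the perceptron.
    for x in images:
        _, y = net.perceptron_forward(x)
        k, d = net.nearest_cluster(y)
        if k is None or d > net.radius:
            # No clusters yet, or the winner is farther away than the radius R: new cluster
            net.clusters.append(y.copy())
        else:
            # Move the winner's weight vector toward the current output vector...
            net.clusters[k] += lr_cluster * (y - net.clusters[k])
            # ...and adapt the perceptron toward the winner by back propagation
            for _ in range(bp_iterations):
                backprop_step(net, x, target=net.clusters[k])
```

With the parameters of the experiments described below, the network would be created roughly as HybridMLPART(n_in=10000, n_hidden=20, n_out=10, radius=R).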

Illustration of the algorithm (figure; R1 denotes the cluster radius).

Images and parameters used in the experiments:
– Number of input neurons (pixels): 10000 (100×100)
– Number of neurons in the hidden layer of the perceptron: 20
– Number of output neurons of the perceptron (in the input layer of ART-2), N_out: 10
– The cluster radius R was set in different ways in the experiments: 1) adapted and then fixed; 2) calculated for every image by the formula R = S / (2 N_out), where S is the average input signal and N_out is the number of output neurons of the perceptron; 3) calculated as R = 2 D_min, where D_min is the minimal distance between the input vector of ART-2 and the weight vectors for the previous image
– The activation function of the perceptron neurons is the rational sigmoid with parameter a = 1
– The learning step of the perceptron is 1
– The number of iterations of recalculation of the perceptron weights is from 1 to 10
(The slide also shows example images labelled 1), 2), 3).)
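For concreteness, a sketch of the three ways of setting the cluster radius R listed above (function names are illustrative; NumPy assumed):

```python
def radius_fixed(r0):
    # Variant 1: the radius is adapted by hand and then fixed for the run
    return r0

def radius_from_signal(y, n_out):
    # Variant 2: R = S / (2 * N_out), with S the average input signal of ART-2
    return float(np.mean(y)) / (2 * n_out)

def radius_from_previous_distance(d_min_prev):
    # Variant 3: R = 2 * D_min, where D_min is the minimal distance between the
    # ART-2 input vector and the cluster weight vectors for the previous image
    return 2.0 * d_min_prev
```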

Series of images

Program for experiments

Results for the sequence of images from series 1, 2, 1, 2 (dark points correspond to the 2nd way of calculating the vigilance radius, light points to the 1st).

Results for the sequence of images from series 1 at different numbers of iterations of the EBP (error back propagation) algorithm: 1, 3, 5, 7, 9.

Paths to improving neural networks:
– Development of growing neural networks with feedback and delays
– Development of the theory of spiking neural networks and construction of associative memory based on them
– Development of neural networks in which logical (verbal) inference emerges from associative memory during learning