Soft Competitive Learning without Fixed Network Dimensionality Jacob Chakareski and Sergey Makarov Rice University, Worcester Polytechnic Institute.



Algorithms
- Neural Gas
- Competitive Hebbian Learning
- Neural Gas + Competitive Hebbian Learning
- Growing Neural Gas

Neural Gas
- Sorts the network units based on their distance from the input signal
- Adapts a certain number of units, based on this "rank order"
- The number of adapted units and the adaptation strength are decreased according to a fixed schedule

The algorithm
- Initialize a set A with N units c_i
- Sort the network units
- Adapt the network units
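A minimal sketch of one Neural Gas adaptation step. The function and parameter names (`neural_gas_step`, `lam`, `eps`) are assumptions for illustration; in the full algorithm both parameters decay over time according to the fixed schedule mentioned above.

```python
import math

def neural_gas_step(units, x, lam, eps):
    """One Neural Gas adaptation step (sketch, assumed names).
    lam controls how many units are effectively adapted; eps is
    the step size. Both would decay over time in the full algorithm."""
    # Sort the units by distance from the input signal x (rank 0 = winner)
    order = sorted(range(len(units)),
                   key=lambda i: math.dist(units[i], x))
    rank = {i: k for k, i in enumerate(order)}
    # Adapt each unit toward x with strength decreasing in its rank
    return [
        [w + eps * math.exp(-rank[i] / lam) * (xj - w)
         for w, xj in zip(units[i], x)]
        for i in range(len(units))
    ]

units0 = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
x = [0.5, 0.5]
units1 = neural_gas_step(units0, x, lam=1.0, eps=0.2)
```

Because every unit moves some fraction of the way toward x, the winner by the largest fraction, the best unit's distance to the signal strictly decreases after the step.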

Simulation Results

Competitive Hebbian Learning
- Usually not used on its own, but in conjunction with other methods
- It does not change reference vectors w_j at all
- It only generates a number of neighborhood edges between the units of the network

The algorithm
- Initialize a set A with N units c_i and the connection set C
- Determine the two units s_1 and s_2 closest to the input signal
- Create a connection between s_1 and s_2
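A sketch of one CHL step under the same conventions (the name `chl_step` and the edge representation as a set of frozenset pairs are assumptions). Note that the reference vectors are never moved; only an edge is created.

```python
import math

def chl_step(units, edges, x):
    """One Competitive Hebbian Learning step (sketch).
    Reference vectors stay fixed; only an edge between the two
    units closest to the input signal x is created."""
    # Determine the two nearest units s1 and s2
    s1, s2 = sorted(range(len(units)),
                    key=lambda i: math.dist(units[i], x))[:2]
    # Create (or keep) the connection between s1 and s2
    edges.add(frozenset((s1, s2)))
    return edges

units = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
edges = set()
edges = chl_step(units, edges, [0.1, 0.1])
```

Using an unordered pair (frozenset) for each edge makes the connection symmetric, so repeated presentations of similar signals do not create duplicate edges.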

Simulation Results

Neural Gas + CHL
- A superposition of NG and CHL
- Sometimes denoted as "topology-representing networks"
- A local edge aging mechanism is implemented to remove edges which are no longer valid

The algorithm
- Set the age of the connection between s_1 and s_2 to zero ("refresh" the edge)
- Increment the age of all edges emanating from s_1
- Remove edges with an age larger than the current age threshold T(t)
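The edge-aging step above can be sketched as follows (names assumed; the just-refreshed edge is skipped during the increment so it is not immediately re-aged):

```python
def refresh_and_age(ages, s1, s2, max_age):
    """Edge-aging step of NG+CHL (sketch). `ages` maps a frozenset
    pair of unit indices to an integer edge age; `max_age` plays the
    role of the threshold T(t)."""
    refreshed = frozenset((s1, s2))
    # "Refresh" the edge between the two winners s1 and s2
    ages[refreshed] = 0
    # Increment the age of all other edges emanating from s1
    for e in list(ages):
        if s1 in e and e != refreshed:
            ages[e] += 1
    # Remove edges older than the current threshold
    for e in [e for e, a in ages.items() if a > max_age]:
        del ages[e]
    return ages

ages = {frozenset((0, 1)): 0, frozenset((0, 2)): 3, frozenset((1, 2)): 5}
ages = refresh_and_age(ages, s1=0, s2=1, max_age=3)
```

In the example, the edge (0, 2) ages past the threshold and is removed along with the already-stale edge (1, 2), while the refreshed winner edge (0, 1) survives.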

Simulation Results

Growing Neural Gas
- Number of units changes (mostly increases) during the self-organization process
- Starting with very few units, new units are added successively
- Local error measures are gathered to determine where to insert new units
- Each new unit is inserted near the unit with the largest accumulated error

The algorithm
- Add the squared distance between the input signal and the winner to a local error variable
- Adapt the winner and its neighbors
- If the number of input signals generated so far is an integer multiple of a parameter, insert a new unit:

- Determine the unit q with the maximum accumulated error
- Determine the neighbor f of q with the maximum accumulated error
- Add a new unit r to the network
- Insert edges connecting r with q and f, and remove the original edge between q and f
- Decrease the error variables of q and f
- Interpolate the error variable of r from q and f
- Decrease the error variables of all units

Simulation Results

Applications: Web/Database Maps