CUNY Graduate Center December 15 Erdal Kose

Outline Define SOMs Application Areas Structure of SOMs (Basic Algorithm) Learning Algorithm Simulation and Results Conclusion References

SOMs The best-known and most popular model of self-organizing networks is the topology-preserving map proposed by Teuvo Kohonen, known as the Kohonen network. SOMs provide a way of representing multidimensional data in a much lower-dimensional space, such as one or two dimensions.

Applications Image compression Data mining Bibliographic classification Image browsing systems Medical diagnosis Speech recognition Clustering

Structure of SOM A self-organizing map consists of components called nodes. Associated with each node is a weight vector of the same dimension as the input data vectors, and a position in the map space. The usual arrangement of nodes is a regular spacing in a hexagonal or rectangular grid.
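
A minimal Python/NumPy sketch of this structure (the names rows, cols, and dim are illustrative, not from the slides): each node owns a weight vector of the same dimension as the inputs, plus a fixed position in the grid.

```python
import numpy as np

rows, cols, dim = 10, 10, 3                # e.g. a 10x10 map of 3-D input vectors
weights = np.random.rand(rows, cols, dim)  # one weight vector per node (random init)

# Each node also has a fixed position in map space; these positions are what
# the neighborhood radius is measured against during training.
positions = np.array([[(r, c) for c in range(cols)] for r in range(rows)])
```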

Learning Algorithm
1. Each node's weights are initialized.
2. A vector is chosen at random from the set of training data and presented to the lattice.
3. Every node is examined to determine which one's weights are most like the input vector. The winning node is commonly known as the Best Matching Unit (BMU).
4. The radius of the neighborhood of the BMU is calculated. This value starts large, typically set to the 'radius' of the lattice, and diminishes each time-step. Any nodes found within this radius are deemed to be inside the BMU's neighborhood.
5. Each neighboring node's weights (the nodes found in step 4) are adjusted to make them more like the input vector. The closer a node is to the BMU, the more its weights are altered.
6. Repeat from step 2 for N iterations.
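
The following is a compact sketch of that loop, assuming a rectangular grid, a Gaussian neighborhood, and exponentially decaying radius and learning rate (common choices; the slides do not fix these details). The function name train_som is hypothetical.

```python
import numpy as np

def train_som(weights, data, n_iters=1000, alpha0=0.1):
    """weights: (rows, cols, dim) array; data: (n, dim) array of training vectors."""
    rows, cols, _ = weights.shape
    sigma0 = max(rows, cols) / 2.0                        # step 4: initial radius ~ lattice 'radius'
    grid = np.stack(np.mgrid[0:rows, 0:cols], axis=-1)    # node positions in map space
    for t in range(n_iters):
        v = data[np.random.randint(len(data))]            # step 2: random training vector
        dists = np.linalg.norm(weights - v, axis=2)       # step 3: compare v to every node
        bmu = np.unravel_index(dists.argmin(), dists.shape)  # the Best Matching Unit
        sigma = sigma0 * np.exp(-t / n_iters)             # step 4: radius diminishes over time
        alpha = alpha0 * np.exp(-t / n_iters)             # learning rate also decays
        d2 = ((grid - np.array(bmu)) ** 2).sum(axis=2)    # squared grid distance to the BMU
        h = np.exp(-d2 / (2 * sigma ** 2))                # step 5: closer nodes move more
        weights += alpha * h[..., None] * (v - weights)   # pull neighbors toward v; step 6: loop
    return weights
```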

The Learning Algorithm in Detail Random initialization means simply that random values are assigned to the weight vectors. This is appropriate when little or nothing is known about the input data at the time of initialization. In one training step, one sample vector is drawn randomly from the input data set. This vector is fed to all units in the network, and a similarity measure is calculated between the input sample and all the weight vectors.

Cont. The best matching unit is the node c whose weight vector is closest to the input vector in Euclidean distance, i.e. ||V - W_c|| = min_i ||V - W_i||. After finding the best-matching unit, the weight vectors of units in the SOM are updated.
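
As a one-function sketch (hypothetical helper name), BMU selection is just an argmin over Euclidean distances:

```python
import numpy as np

def best_matching_unit(weights, v):
    # Flatten the grid, find the weight vector with the smallest ||v - w_i||,
    # then map the flat index back to grid coordinates.
    flat = weights.reshape(-1, weights.shape[-1])
    c = np.linalg.norm(flat - v, axis=1).argmin()
    return np.unravel_index(c, weights.shape[:2])
```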

Cont. The neighborhood function combines two parts: the learning rate function α(t), a decreasing function of time, and a kernel function that dictates the shape of the neighborhood around the winning unit.
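
One common concrete choice, not prescribed by the slide, pairs an exponentially decaying learning rate with a Gaussian kernel whose width also shrinks over time:

```python
import numpy as np

def alpha(t, alpha0=0.1, tau=1000.0):
    return alpha0 * np.exp(-t / tau)                 # learning rate: decreasing in time

def neighborhood(dist_to_bmu, t, sigma0=5.0, tau=1000.0):
    sigma = sigma0 * np.exp(-t / tau)                # neighborhood radius shrinks over time
    return np.exp(-dist_to_bmu**2 / (2 * sigma**2))  # Gaussian form of the neighborhood
```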

Adjusting the Weights Every node within the best matching unit's (BMU) neighborhood (including the BMU itself) has its weight vector adjusted according to the following equation: W(t+1) = W(t) + α(t)(V(t) - W(t)), where t represents the time-step and α(t) is the learning rate, which decreases with time. In other words, the new weight for the node equals the old weight W plus a fraction α of the difference between the input vector V and the old weight.
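
A worked numeric instance of the update equation, with made-up values:

```python
import numpy as np

W = np.array([0.2, 0.6])   # old weight vector of one node
V = np.array([1.0, 0.0])   # current input vector
a = 0.5                    # learning rate at this time-step
W_new = W + a * (V - W)    # -> [0.6, 0.3]: the weight moves halfway toward V
```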

Visualization: World Poverty Map A SOM has been used to classify statistical data describing various quality-of-life factors such as state of health, nutrition, and educational services.

Conclusion The Kohonen Feature Map was first introduced by the Finnish professor Teuvo Kohonen (Helsinki University of Technology) in 1982. The "heart" of this type of network is the feature map: a layer of neurons that organize themselves according to certain input values. Such networks can learn without supervision.

References
R. Ohta and T. Saito, "A Growing Self-Organizing Algorithm for Dynamic Clustering," Hosei University, Japan (IEEE, 2001).
T. Takahashi and T. Kurita, "A Self-Organization Model of Feature Columns and Face Responsive Neurons in the Temporal Cortex," National Institute of Advanced Industrial Science and Technology (IEEE, 2001).
sample.html