Lecture 22 Clustering (3).


Outline
- Self-Organizing Map: structure
- Feature map
- Training algorithm
- Examples
(C) 2001 by Yu Hen Hu

Introduction to SOM Both SOM and LVQ were proposed by T. Kohonen. Biological motivation: different regions of the brain (cerebral cortex) appear to be tuned to different tasks, and the particular location of a neural response in the "map" often corresponds directly to a specific modality and quality of sensory signal. SOM is an unsupervised clustering algorithm that creates a spatially organized "internal representation" of the features of the input signals and their abstractions. LVQ is a supervised classification algorithm obtained by fine-tuning a map initially trained with SOM.

SOM Structure Neurons are spatially organized (indexed) on a 1-D or 2-D grid. Every input connects to every neuron.

Neighborhood Structure Based on the topological arrangement, a "neighborhood" can be defined for each neuron. Linear (1-D) and higher-dimensional neighborhoods are defined similarly.
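For concreteness, one common choice on a 2-D grid is a square neighborhood based on Chebyshev distance from the winning neuron. This is a minimal sketch; the square shape and the function name are illustrative assumptions, not part of the original slides.

```python
def grid_neighborhood(m_star, grid_shape, radius):
    """Return the indices of all neurons within `radius` (Chebyshev
    distance) of the winner m_star on a 2-D grid, clipped to the grid."""
    rows, cols = grid_shape
    r0, c0 = m_star
    neighbors = []
    for r in range(rows):
        for c in range(cols):
            if max(abs(r - r0), abs(c - c0)) <= radius:
                neighbors.append((r, c))
    return neighbors

# Winner at the center of a 5x5 grid with radius 1: a 3x3 block of 9 neurons.
print(len(grid_neighborhood((2, 2), (5, 5), 1)))  # 9
```

With radius 0 the neighborhood collapses to the winner alone, which matches the shrinking-neighborhood behavior described later in the lecture.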

Feature Map The neuron outputs form a low-dimensional map of the high-dimensional feature space. Neighboring features in the feature space are mapped to neighboring neurons in the feature map. Because training is a gradient search, the initial assignment of weights is important.

SOM Training Algorithm
Initialization: choose the weight vectors {wm(0); 1 ≤ m ≤ M} randomly. Set the iteration count t = 0.
While Not_Converged:
  Choose the next x and compute d(x, wm(t)), 1 ≤ m ≤ M.
  Select the winner m* = arg min_m d(x, wm(t)).
  Update node m* and its neighborhood nodes: wm(t+1) = wm(t) + η(t)(x − wm(t)) for m ∈ N(m*, t).
  If Not_Converged, then t = t + 1.
End % while loop
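The training loop above can be sketched in NumPy for a 1-D map. The linearly decaying learning-rate and radius schedules are illustrative assumptions; the slides leave the schedules unspecified.

```python
import numpy as np

def train_som(X, M, T=200, eta0=0.5, radius0=None, seed=0):
    """Sketch of the 1-D SOM training loop.
    X: (N, d) data matrix; M: number of neurons on the 1-D map.
    eta0 and the decay schedules are hypothetical choices."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), M)].astype(float)   # initialize code words from data
    if radius0 is None:
        radius0 = M / 2
    for t in range(T):
        x = X[rng.integers(len(X))]              # choose the next sample
        # Competition: the nearest code word wins.
        m_star = np.argmin(np.linalg.norm(W - x, axis=1))
        eta = eta0 * (1 - t / T)                 # learning rate decays with t
        radius = radius0 * (1 - t / T)           # neighborhood shrinks toward {m*}
        for m in range(M):
            if abs(m - m_star) <= radius:        # update winner and its neighbors
                W[m] += eta * (x - W[m])         # pull toward the data sample
    return W
```

Since each update is a convex combination of a code word and a data sample, the trained weights stay inside the convex hull of the data.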

Example Initially, the code words are not ordered, as shown by the tangled blue lines. At the end of training, the lines are stretched out and the wiring is untangled.

SOM Algorithm Analysis Competitive learning: neurons compete to represent the input data. Winner takes all! Neighborhood updating: the weights of the neurons that fall within the winning neighborhood are updated by pulling them toward the data sample. Let x(t) be the data vector presented at time t; each wm converges toward the mean of those x(t) for which neuron m fell within the winning neighborhood. The size of the neighborhood is reduced as t increases. Eventually, N(m*, t) = {m*}.

More SOM Algorithm Analysis A more elaborate update formulation: wm(t+1) = wm(t) + η(m*, t)(x − wm(t)), where η(m*, t) = 0 if m ∉ N(m*, t), and, for example, η(m*, t) = h0·exp(−|m − m*|²/σ²(t)) if m ∈ N(m*, t). Distance measure: d(x, wm(t)) = ||x − wm(t)||. Other definitions of distance may also be used.
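The Gaussian neighborhood learning rate can be written directly from this formulation. The width schedule σ(t) = σ0·exp(−t/τ), the cutoff radius, and the default constants are assumptions made here for illustration.

```python
import math

def eta(m, m_star, t, h0=0.5, sigma0=3.0, tau=50.0, radius=2):
    """Gaussian neighborhood learning rate:
    eta(m*, t) = h0 * exp(-|m - m*|^2 / sigma^2(t)) if m is in N(m*, t),
    and 0 otherwise.  sigma(t) = sigma0 * exp(-t / tau) is an assumed schedule."""
    if abs(m - m_star) > radius:            # m lies outside N(m*, t)
        return 0.0
    sigma_t = sigma0 * math.exp(-t / tau)
    return h0 * math.exp(-(m - m_star) ** 2 / sigma_t ** 2)
```

The winner itself always receives the full rate h0, neighbors receive exponentially less, and neurons outside the neighborhood are not updated at all.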

Sammon Mapping Visualization of high-dimensional data structure: the map developed by SOM can serve this purpose. In general, this is a multi-dimensional scaling problem: the distances δij between the low-dimensional points {yi} in the map should correspond to the dissimilarities dij between the points {xi} in the original space. Optimization problem: given {xi}, find {yi} to minimize the Sammon stress E = (1/c) Σ_{i<j} (dij − δij)²/dij, where c = Σ_{i<j} dij. Optimize with gradient search from some initial selection of {yi}. Other criteria may also be used.
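A minimal first-order gradient-descent sketch of Sammon mapping follows, assuming Euclidean dissimilarities dij and plain gradient steps (Sammon's original algorithm used a Newton-like second-order update); the step size and iteration count are illustrative, and coincident input points are not handled.

```python
import numpy as np

def sammon_stress(D, Y, eps=1e-9):
    """Sammon stress E = (1/c) * sum_{i<j} (d_ij - delta_ij)^2 / d_ij,
    where D holds the original dissimilarities d_ij and delta_ij are
    distances between the low-dimensional points Y."""
    delta = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
    iu = np.triu_indices(len(Y), k=1)
    d, dl = D[iu], delta[iu]
    return ((d - dl) ** 2 / (d + eps)).sum() / d.sum()

def sammon(X, dim=2, T=300, lr=0.05, seed=0):
    """Gradient-descent sketch: grad_i E = (-2/c) * sum_{j != i}
    ((d_ij - delta_ij) / (d_ij * delta_ij)) * (y_i - y_j)."""
    rng = np.random.default_rng(seed)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    Y = rng.standard_normal((len(X), dim)) * 0.1   # initial selection of {y_i}
    c = D[np.triu_indices(len(X), k=1)].sum()
    for _ in range(T):
        delta = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
        np.fill_diagonal(delta, 1.0)               # avoid division by zero on i = j
        Dsafe = D + np.eye(len(X))
        coef = (D - delta) / (Dsafe * delta)       # (d_ij - delta_ij)/(d_ij * delta_ij)
        np.fill_diagonal(coef, 0.0)
        grad = (-2 / c) * (coef[:, :, None] * (Y[:, None] - Y[None, :])).sum(axis=1)
        Y -= lr * grad
    return Y
```

Running this on a small data set should reduce the stress relative to the random initial placement of {yi}.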