TreeSOM: Cluster analysis in the self-organizing map. Neural Networks 19 (2006) 935-949, 2006 Special Issue. Reporter: 張欽隆 (D9515012).



What is SOM? A Self-Organizing Map (Kohonen, 1987) maps high-dimensional data onto a low-dimensional grid, so that similar data elements are placed close together.

SOM. A SOM consists of neurons organized on a regular low-dimensional grid. Each neuron has a weight vector. During training, neighboring neurons on the grid acquire similar weight vectors (they move closer together in data space), so that, in the end, similar data elements are placed close together.
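The training loop described above can be sketched as follows. This is an illustrative minimal SOM, not the paper's implementation; the function name, learning-rate schedule, and Gaussian neighborhood are assumptions chosen for clarity.

```python
import math
import random

def train_som(data, rows, cols, epochs=20, lr0=0.5, radius0=None):
    """Minimal SOM training sketch. data: list of equal-length vectors.
    Returns a grid of weight vectors indexed as weights[r][c]."""
    dim = len(data[0])
    random.seed(0)
    weights = [[[random.random() for _ in range(dim)] for _ in range(cols)]
               for _ in range(rows)]
    radius0 = radius0 or max(rows, cols) / 2.0
    steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in data:
            # learning rate and neighborhood radius both decay over time
            lr = lr0 * (1 - t / steps)
            radius = radius0 * (1 - t / steps) + 1e-9
            # best-matching unit: neuron whose weight vector is closest to x
            br, bc = min(((r, c) for r in range(rows) for c in range(cols)),
                         key=lambda rc: sum((a - b) ** 2 for a, b in
                                            zip(weights[rc[0]][rc[1]], x)))
            # pull the BMU and its grid neighbors toward x, so that
            # neighboring neurons end up with similar weight vectors
            for r in range(rows):
                for c in range(cols):
                    d2 = (r - br) ** 2 + (c - bc) ** 2
                    h = math.exp(-d2 / (2 * radius ** 2))
                    weights[r][c] = [w + lr * h * (xi - w)
                                     for w, xi in zip(weights[r][c], x)]
            t += 1
    return weights
```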

Problem of SOM. Different map initializations and different input orders of the data elements may result in different clusterings. For large data sets, analyzing the different clusterings to find the best one is a lengthy and laborious task.

This paper presents a new SOM visualization and an unsupervised method for cluster analysis with confidence testing, for finding a reliable clustering.

Outline: Visualization; Cluster discovery; Calibration; Best clustering; SOM as tree; Cluster confidence; The most representative SOM.

Visualization. Three variants of SOM visualization with the u-matrix (Kraaijveld, Mao & Jain, 1992): (a) each cell is shaded according to the average distance from the neuron to its neighbors; (b) the grid density is doubled, so some cells represent neurons and the others represent borders; (c) the original SOM grid density is kept, but the borders between the neurons are drawn as thick lines. Based on representation (c), we can define a cluster as a group of nodes surrounded by an uninterrupted border of a given shade.
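The shading of variant (a) can be computed directly from the weight grid. A minimal sketch, assuming a rectangular grid with 4-connectivity and Euclidean distance (the function name is hypothetical):

```python
def umatrix(weights):
    """Per-cell shade value: average Euclidean distance from each neuron's
    weight vector to those of its grid neighbors (u-matrix variant (a))."""
    rows, cols = len(weights), len(weights[0])

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    shades = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # 4-connected neighbors that fall inside the grid
            nbrs = [(r + dr, c + dc)
                    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                    if 0 <= r + dr < rows and 0 <= c + dc < cols]
            row.append(sum(dist(weights[r][c], weights[nr][nc])
                           for nr, nc in nbrs) / len(nbrs))
        shades.append(row)
    return shades
```

High shade values mark border regions between clusters; low values mark dense cluster interiors.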

Cluster discovery. A cluster is a group of nodes with short distances between them and long distances to the other nodes. Algorithm for finding clusters: each node within a cluster must be connected to at least one other node within the same cluster by an edge that is shorter than a distance threshold.
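This threshold rule amounts to finding connected components of the grid graph when edges longer than the threshold are cut. A sketch under that reading (flood fill over 4-connected grid neighbors; the function name is an assumption):

```python
def find_clusters(weights, threshold):
    """Group SOM nodes into clusters: two neighboring grid nodes share a
    cluster when the distance between their weight vectors is below the
    threshold. Returns a list of clusters, each a list of (row, col)."""
    rows, cols = len(weights), len(weights[0])

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    label = [[None] * cols for _ in range(rows)]
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if label[r][c] is not None:
                continue
            # flood fill a new cluster starting from this unlabeled node
            cluster, stack = [], [(r, c)]
            label[r][c] = len(clusters)
            while stack:
                cr, cc = stack.pop()
                cluster.append((cr, cc))
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = cr + dr, cc + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and label[nr][nc] is None
                            and dist(weights[cr][cc], weights[nr][nc]) < threshold):
                        label[nr][nc] = len(clusters)
                        stack.append((nr, nc))
            clusters.append(cluster)
    return clusters
```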

Cluster discovery: the clustering map. Cluster borders are always displayed in black; cluster areas are shaded according to the average distance between the nodes within the cluster (node density).

Calibration. Mapping data onto a trained SOM is called calibration. The clusters are then shaded according to the average distance between the data elements they contain (element density).
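Calibration itself is a best-matching-unit lookup for each data element. A minimal sketch (the function name and returned dict layout are assumptions):

```python
def calibrate(weights, data):
    """Map each data element to its best-matching unit (BMU) on a trained
    SOM. Returns {(row, col): [indices of data elements mapped there]}."""
    rows, cols = len(weights), len(weights[0])

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    hits = {}
    for i, x in enumerate(data):
        # BMU: grid position whose weight vector is closest to x
        bmu = min(((r, c) for r in range(rows) for c in range(cols)),
                  key=lambda rc: dist2(weights[rc[0]][rc[1]], x))
        hits.setdefault(bmu, []).append(i)
    return hits
```

Given these hit lists, element density per cluster is just the average pairwise distance among the elements mapped into that cluster's nodes.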

Best clustering. Problem: at high thresholds the data is grouped into a few large, loose clusters, so the clustering may be too coarse, grouping together unrelated items; at low thresholds the data is grouped into many small, tight clusters, so the clustering may be too fine, unnecessarily separating similar items. What is the best clustering? The one with the tightest clusters and the greatest distances between them (minimizing looseness / average inter-cluster distance).
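One plausible reading of the "looseness / average distance" criterion is sketched below: looseness as the average distance from a node's weight vector to its cluster centroid, separation as the average pairwise distance between centroids, with lower scores better. This is a hypothetical formulation for illustration, not the paper's exact formula:

```python
def clustering_score(clusters, weights):
    """Score = mean within-cluster looseness / mean inter-centroid distance.
    Lower is better: tight clusters that lie far apart."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def centroid(nodes):
        vs = [weights[r][c] for r, c in nodes]
        return [sum(col) / len(vs) for col in zip(*vs)]

    cents = [centroid(nodes) for nodes in clusters]
    # looseness: average distance from each node's weights to its centroid
    loose = sum(dist(weights[r][c], cents[i])
                for i, nodes in enumerate(clusters) for r, c in nodes)
    loose /= sum(len(nodes) for nodes in clusters)
    # separation: average pairwise distance between cluster centroids
    pairs = [(i, j) for i in range(len(cents)) for j in range(i + 1, len(cents))]
    sep = sum(dist(cents[i], cents[j]) for i, j in pairs) / max(len(pairs), 1)
    return loose / sep if sep else float("inf")
```

Scoring the clusterings produced at each threshold and keeping the minimum then selects the best clustering.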

SOM as tree. Why a tree? Traditional hierarchical clustering visualizations use trees, and comparison becomes easier if the SOM itself is also represented as a tree.

Cluster confidence. Problem: the SOM clustering may differ with different initializations and input orders. How do we determine the "true" clustering? Following Kohonen, only the clusters found in the majority of cases may be included in the final clustering. A consensus tree (Margush and McMorris, 1981) is used to tackle this problem.

Consensus tree: node equivalence. Two nodes are equivalent if their sets of leaves are identical.
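The equivalence test is a straightforward leaf-set comparison. A sketch, representing a tree as nested tuples with leaf labels at the tips (the representation and function names are assumptions):

```python
def leaf_set(node):
    """Set of leaf labels under a tree node; trees are nested tuples,
    leaves are plain labels (e.g. strings)."""
    if not isinstance(node, tuple):
        return frozenset([node])
    return frozenset().union(*(leaf_set(child) for child in node))

def equivalent(a, b):
    """Two tree nodes are equivalent if their leaf sets are identical."""
    return leaf_set(a) == leaf_set(b)
```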

Consensus tree: confidence values. Each node is assigned a confidence value (the fraction of trees in which it occurs). Based on these confidence values, the consensus tree is built, starting with the nodes whose confidence is above 50%.
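Confidence counting over a collection of trees can be sketched as below, using the same nested-tuple tree representation. Collecting the leaf set of every internal node (its "clade") and counting occurrences across trees gives each node's confidence; keeping those above 50% yields the majority-rule consensus (the function names are assumptions):

```python
from collections import Counter

def clades(tree):
    """Leaf set of every internal node of a nested-tuple tree."""
    out = set()

    def walk(node):
        if not isinstance(node, tuple):
            return frozenset([node])
        s = frozenset().union(*(walk(child) for child in node))
        out.add(s)
        return s

    walk(tree)
    return out

def consensus_clades(trees, threshold=0.5):
    """Clades whose confidence (fraction of trees containing them)
    exceeds the threshold; 0.5 gives the majority-rule consensus."""
    counts = Counter(c for t in trees for c in clades(t))
    return {c for c, n in counts.items() if n / len(trees) > threshold}
```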

The most representative SOM. The most representative SOM is the one whose tree is closest to the consensus tree. How to measure similarity? The more equivalent nodes two trees share, the more similar they are.
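Counting equivalent nodes between two trees can be sketched as follows, again with nested-tuple trees; the SOM whose tree scores highest against the consensus tree is then the most representative (the function name is an assumption):

```python
def tree_similarity(t1, t2):
    """Number of equivalent nodes (identical leaf sets) shared by two
    nested-tuple trees."""
    def clades(tree):
        out = set()

        def walk(node):
            if not isinstance(node, tuple):
                return frozenset([node])
            s = frozenset().union(*(walk(child) for child in node))
            out.add(s)
            return s

        walk(tree)
        return out

    return len(clades(t1) & clades(t2))
```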