
© sebis 1JASS 05 Information Visualization with SOMs Information Visualization with Self-Organizing Maps Software Engineering betrieblicher Informationssysteme (sebis) Ernst Denert-Stiftungslehrstuhl Lehrstuhl für Informatik 19 Institut für Informatik TU München wwwmatthes.in.tum.de Jing Li Mail: Next-Generation User-Centered Information Management

© sebis 2JASS 05 Information Visualization with SOMs Agenda  Motivation  Self-Organizing Maps Origins Algorithm Example  Scalable Vector Graphics  Information Visualization with Self-Organizing Maps in an Information Portal  Conclusion

© sebis 3JASS 05 Information Visualization with SOMs Motivation: The Problem Statement  The problem: how can semantic relationships among large amounts of information be discovered without manual labor? How do I know where to put my new data if I know nothing about the information's topology? When I have a topic, how can I find all the information about it if I don't know where to search?

© sebis 4JASS 05 Information Visualization with SOMs Motivation: The Idea Input Pattern 1, Input Pattern 2, Input Pattern 3  The computer should automatically classify information and group related items together

© sebis 5JASS 05 Information Visualization with SOMs Motivation: The Idea  Text objects must be arranged automatically according to their semantic relationships. Semantic Map: Topic 1, Topic 2, Topic 3

© sebis 6JASS 05 Information Visualization with SOMs Agenda  Motivation  Self-Organizing Maps Origins Algorithm Example  Scalable Vector Graphics  Information Visualization with Self-Organizing Maps in an Information Portal  Conclusion

© sebis 7JASS 05 Information Visualization with SOMs Self-Organizing Maps : Origins Self-Organizing Maps  Ideas first introduced by C. von der Malsburg (1973), developed and refined by T. Kohonen (1982)  Neural network algorithm using unsupervised competitive learning  Primarily used for organization and visualization of complex data  Biological basis: ‘brain maps’ Teuvo Kohonen

© sebis 8JASS 05 Information Visualization with SOMs Self-Organizing Maps SOM – Architecture  A lattice of neurons ('nodes') accepts and responds to a set of input signals  Responses are compared; the 'winning' neuron is selected from the lattice  The selected neuron is activated together with its 'neighbourhood' neurons  An adaptive process changes the weights to more closely resemble the inputs. Diagram: a 2D array of neurons; the set of input signals (x1, x2, x3, …, xn) is connected to all neurons in the lattice through weighted synapses (wj1, wj2, wj3, …, wjn for neuron j)

© sebis 9JASS 05 Information Visualization with SOMs Self-Organizing Maps SOM – Result Example ‘Poverty map’ based on 39 indicators from World Bank statistics (1992) Classifying World Poverty Helsinki University of Technology

© sebis 10JASS 05 Information Visualization with SOMs SOM – Result Example


© sebis 12JASS 05 Information Visualization with SOMs Self-Organizing Maps SOM – Algorithm Overview
1. Randomly initialise all weights
2. Select input vector x = [x1, x2, x3, …, xn]
3. Compare x with the weight vector wj of each neuron j to determine the winner
4. Update the winner so that it becomes more like x, together with the winner's neighbours
5. Adjust parameters: learning rate & 'neighbourhood function'
6. Repeat from (2) until the map has converged (i.e. no noticeable changes in the weights) or a pre-defined number of training cycles has passed
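The six steps above can be sketched compactly in NumPy. This is a minimal illustration, not the code behind the presentation: the grid size, the exponential decay schedules, and the Gaussian neighbourhood function are assumed choices.

```python
import numpy as np

def train_som(data, grid=(10, 10), n_cycles=100, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal SOM training loop following the six steps above."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    # 1. Randomly initialise all weights.
    weights = rng.random((rows, cols, data.shape[1]))
    # Grid position of each node, for neighbourhood distances on the lattice.
    pos = np.dstack(np.mgrid[0:rows, 0:cols]).astype(float)
    for t in range(n_cycles):
        # 5. Adjust parameters: learning rate and neighbourhood width decay.
        lr = lr0 * np.exp(-t / n_cycles)
        sigma = sigma0 * np.exp(-t / n_cycles)
        for x in data:  # 2. Select input vector x.
            # 3. Compare x with each weight vector; the node with the
            #    smallest Euclidean distance wins.
            dists = np.linalg.norm(weights - x, axis=2)
            winner = np.unravel_index(np.argmin(dists), dists.shape)
            # 4. Update the winner and its lattice neighbours towards x.
            grid_dist2 = np.sum((pos - pos[winner]) ** 2, axis=2)
            h = np.exp(-grid_dist2 / (2 * sigma ** 2))  # neighbourhood degree
            weights += lr * h[..., None] * (x - weights)
    return weights  # 6. (Here: stop after a fixed number of cycles.)
```

Calling `train_som` on a small data set returns the trained weight lattice; because neighbours are dragged along with each winner, nearby nodes end up with similar weight vectors.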

© sebis 13JASS 05 Information Visualization with SOMs Initialisation (i) Randomly initialise the weight vectors wj for all nodes j

© sebis 14JASS 05 Information Visualization with SOMs (ii) Choose an input vector x from the training set. In the computer, texts are represented as word-frequency distributions. A text example: "Self-organizing maps (SOMs) are a data visualization technique invented by Professor Teuvo Kohonen which reduces the dimensions of data through the use of self-organizing neural networks. The problem that data visualization attempts to solve is that humans simply cannot visualize high-dimensional data, so techniques are created to help us understand this high-dimensional data." Input vector for this text: self-organizing: 2, maps: 1, data: 4, visualization: 2, technique: 2, Professor: 1, invented: 1, Teuvo Kohonen: 1, dimensions: 1, …, zebra: 0
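A text-to-vector step like the one above can be sketched as follows; the vocabulary and the whitespace-based counting are illustrative assumptions, not the preprocessing used in the presentation:

```python
from collections import Counter

def text_to_vector(text, vocabulary):
    """Represent a text as a word-frequency vector over a fixed vocabulary."""
    counts = Counter(text.lower().split())
    # Words absent from the text get frequency 0 (like 'zebra' above).
    return [counts[word] for word in vocabulary]

vocab = ["data", "visualization", "maps", "zebra"]
sample = "data visualization reduces data dimensions maps data"
# "data" appears 3 times, "visualization" once, "maps" once, "zebra" never.
print(text_to_vector(sample, vocab))  # → [3, 1, 1, 0]
```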

© sebis 15JASS 05 Information Visualization with SOMs Finding a Winner (iii) Find the best-matching neuron c(x), usually the neuron whose weight vector has the smallest Euclidean distance to the input vector x. The winning node is the one that is, in some sense, 'closest' to the input vector. 'Euclidean distance' is the straight-line distance between the data points, as if they were plotted on a (multi-dimensional) graph. The Euclidean distance between two vectors a = (a1, a2, …, an) and b = (b1, b2, …, bn) is calculated as d(a, b) = √((a1 − b1)² + (a2 − b2)² + … + (an − bn)²)
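The winner-finding step can be written directly from the distance formula. A small sketch (the example weight vectors are made up):

```python
import math

def euclidean(a, b):
    # d(a, b) = sqrt(sum_i (a_i - b_i)^2)
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def best_matching_neuron(x, weight_vectors):
    """Return the index of the neuron whose weight vector is closest to x."""
    return min(range(len(weight_vectors)),
               key=lambda j: euclidean(x, weight_vectors[j]))

weights = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
print(best_matching_neuron((0.9, 0.1), weights))  # → 1
```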

© sebis 16JASS 05 Information Visualization with SOMs Weight Update SOM weight update equation: wj(t+1) = wj(t) + η(t) · hc(x)(j, t) · [x − wj(t)] "The weights of every node are updated at each cycle by adding (current learning rate) × (degree of neighbourhood with respect to the winner) × (difference between the current weights and the input vector) to the current weights." Example of η(t): the learning rate decays with the number of training cycles. Example of hc(x)(j, t): the x-axis shows the distance from the winning node, the y-axis shows the 'degree of neighbourhood' (max. 1).
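One application of the update equation can be traced numerically. The learning rate and neighbourhood values below are made-up illustrations, chosen only to show that the winner moves strongly towards the input while a distant neighbour barely moves:

```python
def update_weight(w, x, lr, h):
    """w_j(t+1) = w_j(t) + eta(t) * h_c(x)(j, t) * (x - w_j(t))."""
    return [wi + lr * h * (xi - wi) for wi, xi in zip(w, x)]

# Winner (degree of neighbourhood h = 1.0) moves strongly towards x ...
print(update_weight([0.0, 0.0], [1.0, 1.0], lr=0.5, h=1.0))  # → [0.5, 0.5]
# ... while a distant neighbour (h = 0.2) moves only slightly.
print(update_weight([0.0, 0.0], [1.0, 1.0], lr=0.5, h=0.2))  # → [0.1, 0.1]
```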

© sebis 17JASS 05 Information Visualization with SOMs Example: Self-Organizing Maps The animals are to be ordered by a neural network. Each animal is described by its attributes (size, living space), e.g. Mouse = (0/0). Size: small = 0, medium = 1, big = 2. Living space: land = 0, water = 1, air = 2.
Mouse: small, land → (0/0)
Lion: medium, land → (1/0)
Horse: big, land → (2/0)
Shark: big, water → (2/1)
Dove: small, air → (0/2)

© sebis 18JASS 05 Information Visualization with SOMs Example: Self-Organizing Maps After the fields of the map have been initialized with random values, each animal is assigned to the most similar field. If the mapping is ambiguous, any one of the candidate fields is selected. Field values and assigned animals: (0/0): Mouse (0/0), Lion (1/0); (0/2): Dove (0/2); (2/2): –; (2/1): Shark (2/1); (0/0): –; (2/0): Horse (2/0); (1/1): –; (0/0): –

© sebis 19JASS 05 Information Visualization with SOMs Example: Self-Organizing Maps Auxiliary calculation for the top-left field:
Old value in the field: (0/0)
Directly assigned animals: difference to Mouse (0/0): (0/0); difference to Lion (1/0): (1/0); sum of the differences: (1/0); 50% thereof: (0.5/0)
Influence of the allocations of the neighbouring fields: difference to Dove (0/2): (0/2); difference to Shark (2/1): (2/1); sum of the differences: (2/3); 25% thereof: (0.5/0.75)
New value in the field: (0/0) + (0.5/0) + (0.5/0.75) = (1/0.75)
Training: the field (0/0) with Lion (1/0) becomes the field (1/0.75) with Lion (1/0).
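The auxiliary calculation above can be replayed mechanically (a sketch that just repeats the slide's arithmetic; since the old field value is (0/0), the differences equal the animal vectors themselves):

```python
def vec_add(*vs):
    """Component-wise sum of vectors given as tuples."""
    return tuple(sum(comp) for comp in zip(*vs))

def vec_scale(v, s):
    """Multiply every component of v by the scalar s."""
    return tuple(s * c for c in v)

old = (0, 0)
# Directly assigned animals: Mouse (0/0) and Lion (1/0), weighted 50%.
direct = vec_scale(vec_add((0, 0), (1, 0)), 0.5)      # (0.5, 0.0)
# Neighbouring fields: Dove (0/2) and Shark (2/1), weighted 25%.
neighbour = vec_scale(vec_add((0, 2), (2, 1)), 0.25)  # (0.5, 0.75)
new = vec_add(old, direct, neighbour)
print(new)  # → (1.0, 0.75)
```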

© sebis 20JASS 05 Information Visualization with SOMs Example: Self-Organizing Maps This training step is carried out for every field. After the network has been trained, the animals are assigned to the most similar field again. Map after one training pass:
(1/0.75) Lion | (0.25/1) Dove | (1.5/1.5)
(1.25/0.5) | (1/0.75) | (2/0) Horse
(1.25/1) Shark | (1/1) | (0.5/0) Mouse

© sebis 21JASS 05 Information Visualization with SOMs Example: Self-Organizing Maps This training is repeated many times. In the best case, animals with similar attributes end up close to one another on the map. Map after repeated training — Horse, Lion, and Mouse (the land animals) cluster in one corner:
(0.75/0.6875) | (0.1875/1.25) Dove | (1.125/1.625)
(1.375/0.5) | (1/0.875) | (1.5/0) Horse
(1.625/1) Shark | (1/0.75) Lion | (0.75/0) Mouse

© sebis 22JASS 05 Information Visualization with SOMs Example: Self-Organizing Maps [Teuvo Kohonen (2001). Self-Organizing Maps. Springer] A grouping according to similarity has emerged. (Figure: a map of animal names trained on their attributes — 'is', 'has', 'likes to' — with clusters of birds, peaceful animals, and hunters.)

© sebis 23JASS 05 Information Visualization with SOMs Agenda  Motivation  Self-Organizing Maps Origins Algorithm Example  Scalable Vector Graphics  Information Visualization with Self-Organizing Maps in an Information Portal  Conclusion

© sebis 24JASS 05 Information Visualization with SOMs Technologie: Scalable Vector Graphics (SVG) Scalable Vector Graphics (SVG) is an XML markup language for describing two-dimensional vector graphics, both static and animated. It is an open standard created by the World Wide Web Consortium, which is also responsible for standards like HTML and XHTML.
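As an illustration of the format (not code from the presentation), a minimal SVG document for one labelled map cell might look as follows, here generated from Python; the element layout and colour are hypothetical choices:

```python
def som_cell_svg(x, y, size, label, fill="#cde"):
    """Return a minimal SVG document drawing one labelled map cell."""
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" width="200" height="200">'
        f'<rect x="{x}" y="{y}" width="{size}" height="{size}" fill="{fill}"/>'
        f'<text x="{x + 5}" y="{y + size / 2}">{label}</text>'
        f'</svg>'
    )

# One cell of a document map, labelled with its topic.
print(som_cell_svg(10, 10, 80, "Topic 1"))
```

Because SVG is plain XML text, output like this can be produced by the server-side SOM service and rendered directly in the browser.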

© sebis 25JASS 05 Information Visualization with SOMs Scalable Vector Graphics (SVG) It is desirable to separate the algorithm from the visualization as clearly as possible; in the anticipated system structure, SVG serves as the interface between the two.

© sebis 26JASS 05 Information Visualization with SOMs Agenda  Motivation  Self-Organizing Maps Origins Algorithm Example  Scalable Vector Graphics  Information Visualization with Self-Organizing Maps in an Information Portal  Conclusion

© sebis 27JASS 05 Information Visualization with SOMs Software model for Information Visualization of SOM Overall architecture (layers): Presentation – Interaction – Communication – Services; Persistence: Storage (Request, Container), Database, Other Services

© sebis 28JASS 05 Information Visualization with SOMs Software model for Information Visualization of SOM Sequence diagram of sample document map call

© sebis 29JASS 05 Information Visualization with SOMs Agenda  Motivation  Self-Organizing Maps Origins Algorithm Example  Scalable Vector Graphics  Information Visualization with Self-Organizing Maps in an Information Portal  Conclusion

© sebis 30JASS 05 Information Visualization with SOMs Conclusion  Advantages: The SOM is an algorithm that projects high-dimensional data onto a two-dimensional map. The projection preserves the topology of the data, so that similar data items are mapped to nearby locations on the map. SOMs have many practical applications in pattern recognition, speech analysis, industrial and medical diagnostics, and data mining.  Disadvantages: A large quantity of good-quality, representative training data is required. There is no generally accepted measure of the 'quality' of a SOM, e.g. the average quantization error (how well the data is classified).

© sebis 31JASS 05 Information Visualization with SOMs Thank you for listening

© sebis 32JASS 05 Information Visualization with SOMs Discussion topics  What is the main purpose of the SOM?  Do you know any example systems that use the SOM algorithm?

© sebis 33JASS 05 Information Visualization with SOMs References
[Witten and Frank (1999)] Witten, I.H. and Frank, E. Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations. Morgan Kaufmann Publishers, San Francisco, CA, USA.
[Kohonen (1982)] Teuvo Kohonen. Self-organized formation of topologically correct feature maps. Biol. Cybernetics, volume 43.
[Kohonen (1995)] Teuvo Kohonen. Self-Organizing Maps. Springer, Berlin, Germany.
[Vesanto (1999)] J. Vesanto. SOM-Based Data Visualization Methods. Intelligent Data Analysis, 3.
[Kohonen et al (1996)] T. Kohonen, J. Hynninen, J. Kangas, and J. Laaksonen. "SOM_PAK: The Self-Organizing Map program package." Report A31, Helsinki University of Technology, Laboratory of Computer and Information Science, Jan. 1996.
[Vesanto et al (1999)] J. Vesanto, J. Himberg, E. Alhoniemi, and J. Parhankangas. Self-Organizing Map in Matlab: the SOM Toolbox. In Proceedings of the Matlab DSP Conference 1999, Espoo, Finland.
[Wong and Bergeron (1997)] Pak Chung Wong and R. Daniel Bergeron. 30 Years of Multidimensional Multivariate Visualization. In Gregory M. Nielson, Hans Hagen, and Heinrich Müller, editors, Scientific Visualization – Overviews, Methodologies and Techniques, pages 3-33, Los Alamitos, CA. IEEE Computer Society Press.
[Honkela (1997)] T. Honkela. Self-Organizing Maps in Natural Language Processing. PhD thesis, Helsinki University of Technology, Espoo, Finland.
[SVG wiki]
[Jost Schatzmann (2003)] Using Self-Organizing Maps to Visualize Clusters and Trends in Multidimensional Datasets. Final Year Individual Project Report, Imperial College London, 19 June 2003.