1 CUNY Graduate Center, December 15. Erdal Kose

2 Outline Define SOMs, Application Areas, Structure of SOMs (Basic Algorithm), Learning Algorithm, Simulation and Results, Conclusion, References

3 SOMs The best-known and most popular model of self-organizing networks is the topology-preserving map proposed by Teuvo Kohonen, known as the Kohonen network. SOMs provide a way of representing multidimensional data in a much lower-dimensional space, such as one or two dimensions.

4 Applications Image compression, data mining, bibliographic classification, image browsing systems, medical diagnosis, speech recognition, clustering

5 Structure of SOM A self-organizing map consists of components called nodes. Associated with each node is a weight vector of the same dimension as the input data vectors, and a position in the map space. The usual arrangement of nodes is a regular spacing in a hexagonal or rectangular grid.
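As an illustrative sketch (not from the slides), this structure maps naturally onto a NumPy array: one weight vector per node, plus a fixed position for each node in map space. The grid shape and input dimension below are arbitrary example values.

import numpy as np

# Arbitrary example sizes: a 10x10 rectangular lattice, 3-dimensional inputs.
rows, cols, dim = 10, 10, 3

# One weight vector per node, of the same dimension as the input data,
# randomly initialized in [0, 1).
weights = np.random.rand(rows, cols, dim)

# Each node also has a fixed (row, col) position in map space, used to
# measure distances between nodes on the grid.
positions = np.indices((rows, cols)).transpose(1, 2, 0)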

7 Learning Algorithm
1. Each node's weights are initialized.
2. A vector is chosen at random from the set of training data and presented to the lattice.
3. Every node is examined to find the one whose weights are most like the input vector. The winning node is commonly known as the Best Matching Unit (BMU).
4. The radius of the neighborhood of the BMU is calculated. This value starts large, typically set to the 'radius' of the lattice, and diminishes each time-step. Any nodes found within this radius are deemed to be inside the BMU's neighborhood.
5. Each neighboring node's weights (for the nodes found in step 4) are adjusted to make them more like the input vector. The closer a node is to the BMU, the more its weights are altered.
6. Repeat from step 2 for N iterations.
(http://www.ai-junkie.com/ann/som/som2.html)
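A compact end-to-end sketch of these steps (an illustration, not the presenter's code): it assumes a rectangular lattice, exponential decay for the radius and learning rate, and a Gaussian neighborhood, none of which the slides pin down.

import numpy as np

def train_som(data, rows=10, cols=10, n_iter=1000, alpha0=0.1):
    # data: array of shape (n_samples, dim)
    dim = data.shape[1]
    weights = np.random.rand(rows, cols, dim)      # step 1: initialize weights
    sigma0 = max(rows, cols) / 2.0                 # initial 'radius' of lattice
    coords = np.indices((rows, cols)).transpose(1, 2, 0)  # node grid positions
    for t in range(n_iter):
        x = data[np.random.randint(len(data))]     # step 2: random sample
        # step 3: BMU = node whose weight vector is most like x
        dists = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # step 4: neighborhood radius (and learning rate) shrink over time
        sigma = sigma0 * np.exp(-t / n_iter)
        alpha = alpha0 * np.exp(-t / n_iter)
        # step 5: pull the BMU and its neighbors toward x;
        # nodes closer to the BMU on the grid move more
        d2 = np.sum((coords - np.array(bmu)) ** 2, axis=2)
        h = np.exp(-d2 / (2.0 * sigma ** 2))
        weights += alpha * h[:, :, None] * (x - weights)
    return weights

For example, train_som(np.random.rand(500, 3)) would organize random RGB-like colors so that similar colors end up at nearby grid nodes.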

8 The Learning Algorithm in Detail Random initialization means simply that random values are assigned to the weight vectors. This is appropriate when nothing or little is known about the input data at the time of initialization. In one training step, one sample vector is drawn randomly from the input data set; this vector is fed to all units in the network, and a similarity measure is calculated between the input sample and all the weight vectors.

9 Cont. The best-matching unit is the unit whose weight vector has the smallest Euclidean distance to the input sample: c = arg min_i ||x - w_i||, where x is the input sample and w_i is the weight vector of unit i. After finding the best-matching unit, units in the SOM are updated.
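In code, that search might look like this (a sketch, assuming weights is a (rows, cols, dim) array as in the earlier snippets):

import numpy as np

def find_bmu(weights, x):
    # Euclidean distance ||x - w_i|| from x to every node's weight vector.
    dists = np.linalg.norm(weights - x, axis=2)
    # Grid coordinates of the node with the smallest distance: the BMU.
    return np.unravel_index(np.argmin(dists), dists.shape)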

10 Cont. The neighborhood function combines the learning-rate function, which is a decreasing function of time, with a kernel that dictates the form of the neighborhood around the BMU.
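The slides do not name a specific kernel; a Gaussian centered on the BMU is a common choice, sketched here with exponentially decaying radius sigma(t) and learning rate alpha(t):

import numpy as np

def neighborhood(t, n_iter, d2_to_bmu, sigma0, alpha0=0.1):
    # alpha(t): learning rate, a decreasing function of time.
    alpha = alpha0 * np.exp(-t / n_iter)
    # sigma(t): neighborhood radius, also shrinking over time.
    sigma = sigma0 * np.exp(-t / n_iter)
    # Gaussian form: nodes far from the BMU (large d2_to_bmu) barely move.
    return alpha * np.exp(-d2_to_bmu / (2.0 * sigma ** 2))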

11 Adjusting the Weights Every node within the best matching unit's (BMU) neighborhood (including the BMU itself) has its weight vector adjusted according to the following equation: W(t+1) = W(t) + α(t)(V(t) - W(t)), where t represents the time-step and α is the learning rate, which decreases with time. In other words, the new weight for the node is the old weight W plus a fraction α of the difference between the input vector V and the old weight.
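The same rule as one line of code, with a small made-up numeric example:

def update_weight(w, v, alpha):
    # W(t+1) = W(t) + alpha(t) * (V(t) - W(t))
    return w + alpha * (v - w)

# update_weight(0.5, 1.0, 0.1) returns 0.55: the weight moves one tenth
# of the way from 0.5 toward the input value 1.0.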

12 Visualization: World Poverty Map A SOM has been used to classify statistical data describing various quality-of-life factors, such as state of health, nutrition, and educational services.

13–16 (image slides: World Poverty Map visualizations)

17 Conclusion The Kohonen Feature Map was first introduced by the Finnish professor Teuvo Kohonen (Helsinki University of Technology) in 1982. The "heart" of this type of network is the feature map, a neuron layer in which neurons organize themselves according to certain input values. Such networks learn without supervision.

18 References
A Growing Self-Organizing Algorithm for Dynamic Clustering. Ryuji Ohta, Toshimichi Saito. Hosei University, Japan (IEEE 2001).
A Self-Organization Model of Feature Columns and Face Responsive Neurons in the Temporal Cortex. Takashi Takahashi, Takio Kurita. National Institute of Advanced Industrial Science and Technology (IEEE 2001).
http://www.cis.hut.fi/projects/ica/cocktail/cocktail_en.cgi
http://www.cis.hut.fi/~jhollmen/dippa/node9.html
http://www.ai-junkie.com/ann/som/som1.html
http://www.borgelt.net/doc/somd/somd.html
http://www.nnwj.de/sample-applet.html
http://fbim.fh-regensburg.de/~saj39122/jfroehl/diplom/e-sample.html

