1 Intelligent Database Systems Lab, 國立雲林科技大學 National Yunlin University of Science and Technology
O(log₂M) Self-Organizing Map Algorithm Without Learning of Neighborhood Vectors
Hiroki Kusumoto and Yoshiyasu Takefuji, IEEE Transactions on Neural Networks, Vol. 17, No. 6, 2006, pp. 1656-1661.
Presenter: Wei-Shen Tai
Advisor: Professor Chung-Chian Hsu
2007/3/1

2 N.Y.U.S.T. I. M. Intelligent Database Systems Lab
Outline
Introduction
Algorithm
 Initialization
 Subdividing method
 Binary search
Simulation and results
Discussion
Conclusion
Comments

3 Motivation
BMU searching time
 Searching the winner vector for a single input vector by exhaustive search requires M² distance computations (an M*M matrix).
Similar inputs in different clusters
 Two similar inputs that should belong to the same cluster may be mapped onto distant weight vectors.
Neighborhood function
 Its parameter tuning is time-consuming.
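As a baseline, the exhaustive winner search that motivates the paper can be sketched as follows (a minimal sketch; the function name and NumPy array layout are illustrative assumptions, not from the paper):

```python
import numpy as np

def exhaustive_bmu(weights, x):
    """Exhaustive winner (BMU) search on an M x M map of k-dimensional
    weight vectors: compares x against every node, hence O(M^2)
    distance computations per input vector."""
    d = np.linalg.norm(weights - x, axis=2)        # (M, M) distance matrix
    return np.unravel_index(np.argmin(d), d.shape)
```

Every input vector touches all M*M nodes here, which is exactly the cost the proposed O(log₂M) search avoids.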

4 Objective
A new SOM algorithm with O(log₂M) computational cost
 Composed of the subdividing method and the binary search method.
 Reduces the computational cost and eliminates the time-consuming parameter tuning of the neighborhood function.
 Similar input vectors are clustered onto the same neuron.

5 Initialization
A feature map
 A 2-D layer of M*M nodes (M = 2^m + 1, m = 1, 2, 3, ...).
Four initial nodes
 The nodes at coordinates (1, 1), (1, M), (M, 1), and (M, M) have k-dimensional weight vectors W(x, y).
 They are trained by the basic SOM with O(1) total computation.
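The corner initialization can be sketched as training a plain 2x2 SOM on the data (a hypothetical sketch: the winner-take-all update stands in for the slide's "basic SOM", and the function name, learning rate, and epoch count are assumptions):

```python
import numpy as np

def init_corners(data, epochs=20, lr=0.5, seed=0):
    """Train the four corner weight vectors W(1,1), W(1,M), W(M,1), W(M,M)
    as a basic 2x2 SOM over the input data (winner-take-all update)."""
    rng = np.random.default_rng(seed)
    k = data.shape[1]
    corners = rng.random((2, 2, k))                   # 2x2 grid of k-dim weights
    for _ in range(epochs):
        for x in data:
            d = np.linalg.norm(corners - x, axis=2)   # distance to each corner
            i, j = np.unravel_index(np.argmin(d), (2, 2))
            corners[i, j] += lr * (x - corners[i, j]) # move the winner toward x
    return corners
```

Because only four nodes exist at this stage, the cost of this step is independent of the final map size M, matching the slide's O(1) claim.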

6 Subdividing Method
Subdividing
 Drawing center lines between all neighboring nodes subdivides an M′*M′ feature map into a (2M′-1)*(2M′-1) feature map.
Weight of the new gray nodes
 The average of the weight vectors of the existing nodes closest to the new node.
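The subdividing step can be sketched in NumPy as follows (a minimal sketch under the slide's averaging rule: edge midpoints average their two parent nodes, cell centers average their four parents; the function name and array layout are assumptions):

```python
import numpy as np

def subdivide(weights):
    """Subdivide an M' x M' map of k-dim weight vectors into (2M'-1) x (2M'-1).
    New nodes take the average of the weight vectors of the closest
    existing nodes."""
    Mp, _, k = weights.shape
    out = np.zeros((2 * Mp - 1, 2 * Mp - 1, k))
    out[::2, ::2] = weights                                   # keep old nodes
    out[1::2, ::2] = (weights[:-1] + weights[1:]) / 2         # vertical midpoints
    out[::2, 1::2] = (weights[:, :-1] + weights[:, 1:]) / 2   # horizontal midpoints
    out[1::2, 1::2] = (weights[:-1, :-1] + weights[:-1, 1:]
                       + weights[1:, :-1] + weights[1:, 1:]) / 4  # cell centers
    return out
```

Starting from the 2x2 corner map, repeated calls grow the map through 3, 5, 9, ... up to M = 2^m + 1 nodes per side.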

7 Binary Search and Learning
Step A: search space
 x1 ≤ x ≤ x2 and y1 ≤ y ≤ y2
 Find the vector closest to X(t).
Step B: dividing the search space
 The winner vector lies in the quarter of the search space where the closest vector W(xc, yc) exists.
Learning
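Steps A and B can be sketched as repeated halving of the search window until a single node remains (a sketch under assumptions: the candidate set per step and tie-breaking on the boundary are illustrative choices, and the function name is hypothetical):

```python
import numpy as np

def binary_search_bmu(weights, x):
    """Winner search on an M x M map (M = 2^m + 1) in O(log M) steps.
    Step A: pick the closest weight vector among the corner/midpoint
    nodes of the current window.  Step B: shrink the window to the
    quarter containing that temporary winner."""
    M = weights.shape[0]
    x1, y1, x2, y2 = 0, 0, M - 1, M - 1
    while x2 - x1 > 1 or y2 - y1 > 1:
        xc, yc = (x1 + x2) // 2, (y1 + y2) // 2
        cands = [(i, j) for i in {x1, xc, x2} for j in {y1, yc, y2}]
        bi, bj = min(cands, key=lambda p: np.linalg.norm(weights[p] - x))
        # keep the quarter that contains the temporary winner
        x1, x2 = (x1, xc) if bi <= xc else (xc, x2)
        y1, y2 = (y1, yc) if bj <= yc else (yc, y2)
    cands = [(i, j) for i in {x1, x2} for j in {y1, y2}]
    return min(cands, key=lambda p: np.linalg.norm(weights[p] - x))
```

Each iteration quarters the window, so the number of iterations is proportional to log₂M rather than the M² comparisons of exhaustive search.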

8 Simulations and Results
Computational cost
Codon frequencies of E. coli K12 genes

9 Discussion
Problem of basic SOMs with a large feature map
 The proposed algorithm does not search all weight vectors for the winner vector.
 It avoids mapping two similar input vectors that belong to the same cluster onto distant weight vectors.
Subdividing method
 The search space is reduced to the quarter that includes the temporary winner vector.
Binary search method
 Reduces the computational cost, and works only when combined with the subdividing method.
 Eliminates the time-consuming parameter tuning of the neighborhood function.

10 Transmission of a Learning Effect
Learning effect
 Each square denotes a weight vector, and L denotes the variation of W(M, M) caused by the training.
 Of the five new vectors, L/2 is transmitted to the two immediately adjacent vectors, L/4 to the center vector, and none to the two far vectors.
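These ratios follow directly from the averaging rule of the subdividing step: an edge midpoint averages two parent nodes, so a variation L at one parent contributes L/2; a center node averages four parents, contributing L/4; midpoints between unvaried parents receive nothing. A minimal numeric check (1-D weights and a 2x2 map are illustrative assumptions):

```python
# Variation L applied to the corner weight W(M, M) of a 2x2 map (1-D weights).
L = 8.0
w = [[0.0, 0.0],
     [0.0, L]]                                  # only the W(M, M) corner varied

# Edge midpoints adjacent to the varied corner average 2 parents -> L/2 each.
mid_right = (w[0][1] + w[1][1]) / 2
mid_bottom = (w[1][0] + w[1][1]) / 2
# The new center node averages all 4 parents -> L/4.
center = (w[0][0] + w[0][1] + w[1][0] + w[1][1]) / 4
# The far midpoint averages 2 unvaried parents -> no transmission.
mid_left = (w[0][0] + w[1][0]) / 2
```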

11 Conclusion
A new SOM algorithm with computational cost O(log₂M)
 Eliminates the time-consuming parameter tuning of the neighborhood function in SOM applications.

12 Comments
Advantage
 A novel idea for reducing the computational cost and the neighborhood-function parameter tuning in SOM.
 The subdividing method is applicable for reducing the search space.
Drawback
 If the initial weights of the neurons are arbitrary, subdividing problems may arise, such as the spectra of neighboring neurons being extremely different.
Application
 SOM-related applications.

