
1 -Artificial Neural Network- Chapter 9 Self-Organizing Map (SOM) 朝陽科技大學 資訊管理系 李麗華 教授

2 Introduction
SOM was proposed by Kohonen in the early 1980s. It is an unsupervised, two-layer network that can organize a topological map from a random starting point. SOM is also called Kohonen's self-organizing feature map. The resulting map shows the natural relationships among the patterns given to the network, which makes it well suited to clustering analysis.

3 Network Structure
– One input layer.
– One competitive layer, which is usually a 2-D grid.
Input layer: linear transfer, f(x) = x.
Output layer: the competitive layer, whose nodes carry a topological map relationship.
Weights: randomly assigned.
[Figure: a two-layer network with inputs X1, X2 connected by weights W_ijk to the output nodes Y_jk, each output node sitting at a grid coordinate (X_ij, Y_ij).]

4 Concept of Neighborhood
– Center: the winning node C is the center.
– Distance: N is an output node; r_j is the distance from N to C; R is the radius of the neighborhood.
– R factor (neighborhood coefficient): RF_j = f(r_j, R) = e^(-r_j/R)
    e^(-r_j/R) → 1 when r_j = 0
    e^(-r_j/R) → 0 when r_j → ∞
    e^(-r_j/R) = 0.368 when r_j = R
  The longer the distance, the smaller the R factor (i.e., the weaker the neighborhood influence).
– R factor adjustment: R_n = R-rate × R_(n-1), with R-rate < 1.0.
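The R factor above maps directly onto a few lines of code. A minimal sketch, assuming a NumPy-based implementation; the function name `neighborhood_factor` and the decay values are illustrative, not from the slides.

```python
import numpy as np

def neighborhood_factor(r, R):
    """RF = e^(-r/R): 1 at the winner (r = 0), about 0.368 at r = R, and -> 0 far away."""
    return np.exp(-r / R)

print(neighborhood_factor(0.0, 2.0))   # 1.0
print(neighborhood_factor(2.0, 2.0))   # ~0.368
print(neighborhood_factor(50.0, 2.0))  # ~0 for distant nodes

# R factor adjustment between epochs: R_n = R-rate * R_(n-1), with R-rate < 1.0
R, R_rate = 2.0, 0.9
R = R_rate * R
```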

5 Learning
1. Set up the network.
2. Randomly assign the weights W.
3. Set the coordinate values of the output-layer nodes N(x, y).
4. Input a training vector X.
5. Compute the winning node.
6. Update the weights ΔW using the R factor.
7. η_n = η-rate · η_(n-1); R_n = R-rate · R_(n-1).
8. Repeat steps 4-7 until the map converges (a code sketch follows).
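A compact sketch of steps 1-8, assuming NumPy, a 3×3 output grid (as in the later example), and made-up decay rates; `eta_rate`, `R_rate`, and the random training data are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, rows, cols = 2, 3, 3
W = rng.uniform(-1, 1, size=(n_in, rows, cols))          # step 2: random weights
coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                              indexing="ij"), axis=-1)   # step 3: node coordinates
eta, R = 1.0, 2.0
eta_rate, R_rate = 0.9, 0.9
X = rng.uniform(-1, 1, size=(10, n_in))                  # training vectors

for epoch in range(20):                                  # step 8: repeat until converged
    for x in X:                                          # step 4: input a training vector
        net = ((x[:, None, None] - W) ** 2).sum(axis=0)
        j, k = np.unravel_index(net.argmin(), net.shape)     # step 5: winning node
        r = np.linalg.norm(coords - coords[j, k], axis=-1)   # grid distance to the winner
        W += eta * (x[:, None, None] - W) * np.exp(-r / R)   # step 6: update with R factor
    eta *= eta_rate                                      # step 7: decay the learning rate
    R *= R_rate                                          #         and the neighborhood radius
```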

6 Reuse the network
1. Set up the network.
2. Read in the stored weight matrix.
3. Set the coordinate values of the output-layer nodes N(x, y).
4. Read an input vector.
5. Compute the winning node.
6. Output the clustering result Y.
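A sketch of this recall phase; it assumes a trained weight matrix `W` laid out as in the training sketch above (inputs × grid rows × grid columns).

```python
import numpy as np

def cluster(x, W):
    """Steps 4-6: compute the winning node for input x and return its grid coordinate."""
    net = ((x[:, None, None] - W) ** 2).sum(axis=0)
    return tuple(int(i) for i in np.unravel_index(net.argmin(), net.shape))

# Inputs mapped to the same (or a neighboring) node belong to the same cluster, e.g.:
# labels = [cluster(x, W) for x in X]
```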

7 Computation process
1. Set up the network: X_1 ~ X_n is the input vector; N_jk are the output nodes.
2. Compute the winning node: net_jk = Σ_i (X_i - W_ijk)²; the winner (j*, k*) is the node with the smallest net_jk.
3. Y_jk = 1 if j = j* and k = k*, and 0 otherwise.
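A sketch of steps 2-3, using the squared-distance net value that the worked example below relies on; the function name `winner_and_output` is ours.

```python
import numpy as np

def winner_and_output(x, W):
    net = ((x[:, None, None] - W) ** 2).sum(axis=0)       # step 2: net_jk for every node
    j_star, k_star = np.unravel_index(net.argmin(), net.shape)
    Y = np.zeros(net.shape)                               # step 3: Y_jk = 1 only at the winner
    Y[j_star, k_star] = 1.0
    return (j_star, k_star), Y
```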

8 Computation process (cont.)
4. Update the weights: ΔW_ijk = η (X_i - W_ijk) · RF_jk, where RF_jk = e^(-r_jk/R), r_jk = √((j - j*)² + (k - k*)²) is the grid distance from node (j, k) to the winner, and then W_ijk = W_ijk + ΔW_ijk.
5. Repeat steps 1-4 for all input patterns.
6. η_n = η-rate · η_(n-1); R_n = R-rate · R_(n-1).
7. Repeat until convergence.
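A sketch of the update in step 4, assuming the grid-distance definition of r_jk recovered from the worked example (Euclidean distance between node (j, k) and the winner in grid coordinates); the function name is ours.

```python
import numpy as np

def update_weights(W, x, winner, eta, R):
    """W_ijk += eta * (X_i - W_ijk) * RF_jk, with RF_jk = exp(-r_jk / R)."""
    rows, cols = W.shape[1], W.shape[2]
    jj, kk = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    r = np.sqrt((jj - winner[0]) ** 2 + (kk - winner[1]) ** 2)  # r_jk to the winner
    RF = np.exp(-r / R)
    return W + eta * (x[:, None, None] - W) * RF
```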

9 Example
Consider a 2-D clustering problem: the vector space contains 5 different clusters, each with 2 samples.

Sample   X1     X2
  1     -0.9   -0.8
  2             0.6
  3      0.9    0.6
  4      0.7   -0.4
  5     -0.2    0.2
  6     -0.7   -0.6
  7     -0.9    0.8
  8      0.7    0.6
  9      0.8   -0.8
 10      0.1   -0.2

[Figure: the ten samples plotted in the 2-D input space, showing the five clusters.]

10 Solution
– Set up a 2×9 network (2 inputs, 9 output nodes on a 3×3 grid).
– Randomly assign the weights.
– Let R = 2.0, η = 1.0.
– Apply the first pattern [-0.9, -0.8], with RF_jk = e^(-r_jk/R).

Initial weights:
Node     X1     X2
W_i00   -0.2   -0.8
W_i01    0.2   -0.4
W_i02    0.3    0.6
W_i10   -0.4    0.6
W_i11   -0.3    0.2
W_i12   -0.6   -0.2
W_i20    0.7    0.2
W_i21    0.8   -0.6
W_i22   -0.8   -0.6

[Figure: the nine weight vectors plotted in the input space.]

net_00 = (-0.9+0.2)² + (-0.8+0.8)² = 0.49
net_01 = (-0.9-0.2)² + (-0.8+0.4)² = 1.37
net_02 = (-0.9+0.3)² + (-0.8-0.6)² = 2.32
net_10 = (-0.9+0.4)² + (-0.8-0.6)² = 1.71
net_11 = (-0.9+0.3)² + (-0.8-0.2)² = 1.36
net_12 = (-0.9+0.6)² + (-0.8+0.2)² = 0.45
net_20 = (-0.9+0.7)² + (-0.8-0.2)² = 1.04
net_21 = (-0.9-0.8)² + (-0.8+0.6)² = 2.93
net_22 = (-0.9+0.8)² + (-0.8+0.6)² = 0.05   ← minimum: winning node
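A sketch that replays this step: with the initial weights as listed in the table and the first pattern [-0.9, -0.8], the winning node comes out as (2, 2) with net_22 = 0.05, as on the slide.

```python
import numpy as np

# Initial weights from the table: W[0, j, k] = W_1jk, W[1, j, k] = W_2jk.
W = np.array([[[-0.2,  0.2,  0.3],
               [-0.4, -0.3, -0.6],
               [ 0.7,  0.8, -0.8]],
              [[-0.8, -0.4,  0.6],
               [ 0.6,  0.2, -0.2],
               [ 0.2, -0.6, -0.6]]])
x = np.array([-0.9, -0.8])                        # first training pattern

net = ((x[:, None, None] - W) ** 2).sum(axis=0)   # net_jk for every output node
j_star, k_star = np.unravel_index(net.argmin(), net.shape)
print(j_star, k_star, round(float(net[j_star, k_star]), 2))   # winner (2, 2), net_22 = 0.05
```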

11 Solution (cont.)
– Update the weights, with winner j* = 2, k* = 2:
  r_00 = √((0-2)² + (0-2)²) ≈ 2.83,  RF_00 = e^(-2.83/2.0) ≈ 0.243
  r_01 = … ,  RF_01 = …
  r_02 = … ,  … ,  r_22 = 0,  RF_22 = 1
  ΔW_00 = η · (X_1 - W_00) · RF_00 = 1.0 × (-0.9 + 0.2) × 0.243 ≈ -0.17
[Figure: the weight vectors in the input space after updating toward the first pattern.]
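Continuing the sketch from the previous slide (it reuses `np`, `W`, `x`, `j_star`, and `k_star` defined there), this reproduces ΔW_00 = 1.0 × (-0.9 + 0.2) × 0.243 ≈ -0.17.

```python
eta, R = 1.0, 2.0
jj, kk = np.meshgrid(np.arange(3), np.arange(3), indexing="ij")
r = np.sqrt((jj - j_star) ** 2 + (kk - k_star) ** 2)  # grid distance to the winner (2, 2)
RF = np.exp(-r / R)                                   # RF_00 ~ 0.243, ..., RF_22 = 1
dW = eta * (x[:, None, None] - W) * RF                # delta-W for every weight
print(round(float(dW[0, 0, 0]), 2))                   # -0.17, as on the slide
W = W + dW
```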

