-Artificial Neural Network- Chapter 9 Self-Organizing Map (SOM). Chaoyang University of Technology, Department of Information Management, Prof. Li-Hua Lee (李麗華).

Introduction

The self-organizing map (SOM), proposed by Teuvo Kohonen in 1982, is an unsupervised two-layered network that can organize a topological map from a random starting point. SOM is also called Kohonen's self-organizing feature map. The resulting map shows the natural relationships among the patterns presented to the network, which makes it well suited to clustering analysis.

Network Structure

– One input layer
– One competitive layer, which is usually a 2-dimensional grid

Input layer: receives the input vector X = (X1, ..., Xn).
Output layer: the competitive layer, whose nodes are arranged with a topological map relationship; each output node has grid coordinates (j, k).
Weights: W_ijk connects input i to output node (j, k); the weights are randomly assigned at the start.

(Figure: inputs X1, X2 fully connected through weights W_ijk to a grid of output nodes Y_jk.)

Concept of Neighborhood

– Center: the winning node C is the center.
– Distance: r_j is the distance from an output node N to the center C; R is the radius of the neighborhood.
– R factor (鄰近係數, neighborhood coefficient):
    RF_j = f(r_j, R) = e^(-r_j/R)
    e^(-r_j/R) → 1 when r_j = 0
    e^(-r_j/R) → 0 when r_j = ∞
    e^(-r_j/R) = e^(-1) ≈ 0.368 when r_j = R
  The longer the distance, the smaller the neighborhood factor.
– R factor adjustment: R_n = R-rate × R_(n-1), with R-rate < 1.0.
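The R factor and its decay can be written as two small helper functions; a minimal sketch (the function names and the default R-rate of 0.9 are illustrative assumptions, not from the slides):

```python
import math

def neighborhood_factor(r, R):
    """R factor: RF = e^(-r/R); shrinks as the node-to-winner distance r grows."""
    return math.exp(-r / R)

def decay_radius(R_prev, R_rate=0.9):
    """Shrink the neighborhood radius each epoch: R_n = R_rate * R_(n-1)."""
    return R_rate * R_prev
```

At r = 0 the factor is 1 (the winner updates fully), at r = R it is e^(-1) ≈ 0.368, and it vanishes as r grows.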

Learning

1. Set up the network.
2. Randomly assign the weights W.
3. Set the coordinate value N(x, y) of each output-layer node.
4. Input a training vector X.
5. Compute the winning node.
6. Update the weights by △W, using the R factor.
7. Decay the learning rate and the radius: η_n = η-rate × η_(n-1), R_n = R-rate × R_(n-1).
8. Repeat steps 4 to 7 until the network converges.
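The learning steps above can be sketched as one training loop; the function name, grid size, epoch count, and decay rates below are illustrative assumptions, not values given in the chapter:

```python
import numpy as np

def train_som(patterns, grid=(3, 3), eta=1.0, R=2.0,
              eta_rate=0.9, R_rate=0.9, epochs=20, seed=0):
    """Sketch of the SOM learning procedure (steps 1-8 above)."""
    rng = np.random.default_rng(seed)
    n_in = patterns.shape[1]
    # Step 2: randomly assign weights; W[j, k, i] holds the weight vector of node (j, k)
    W = rng.uniform(-1, 1, size=(grid[0], grid[1], n_in))
    # Step 3: coordinate value of every output node
    coords = np.stack(np.meshgrid(np.arange(grid[0]), np.arange(grid[1]),
                                  indexing="ij"), axis=-1).astype(float)
    for _ in range(epochs):
        for x in patterns:                         # Step 4: input a training vector
            net = ((W - x) ** 2).sum(axis=-1)      # squared distance to each node
            winner = np.unravel_index(net.argmin(), net.shape)   # Step 5
            r = np.linalg.norm(coords - coords[winner], axis=-1)
            RF = np.exp(-r / R)                    # neighborhood (R) factor
            W += eta * RF[..., None] * (x - W)     # Step 6: update weights
        eta *= eta_rate                            # Step 7: decay eta and R
        R *= R_rate
    return W
```

After training, far-apart input patterns end up winning at different grid nodes, which is the clustering behavior the chapter describes.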

Reuse of the Network

1. Set up the network.
2. Read the weight matrix.
3. Set the coordinate value N(x, y) of each output-layer node.
4. Read the input vector.
5. Compute the winning node.
6. Output the clustering result Y.
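Reusing a trained map is just the winner computation with frozen weights; a minimal sketch, assuming the weights are stored as an array W[j, k, i]:

```python
import numpy as np

def som_recall(W, x):
    """Return the grid coordinates (j*, k*) of the winning node for input x."""
    net = ((W - np.asarray(x)) ** 2).sum(axis=-1)     # distance to every node
    return np.unravel_index(net.argmin(), net.shape)  # cluster label (j*, k*)
```

The returned coordinate pair serves as the cluster label for the input.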

Computation Process

1. Set up the network: X1~Xn (input vector), N_jk (output nodes).
2. Compute the winning node (j*, k*):
    net_jk = Σ_i (X_i − W_ijk)^2; the winner is the node with the minimum net_jk.
3. Output:
    Y_jk = 1 if j = j* and k = k*; Y_jk = 0 otherwise.
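The winner selection and the 0/1 output can be checked numerically; the weight values below are made up for illustration, not those of the chapter example:

```python
import numpy as np

# net_jk = sum_i (X_i - W_ijk)^2; the minimum-net node wins.
W = np.array([[[0.2, 0.1], [0.5, 0.4]],
              [[-0.8, -0.7], [0.9, 0.6]]])   # shape (j, k, i): 2x2 grid, 2 inputs
x = np.array([-0.9, -0.8])
net = ((W - x) ** 2).sum(axis=-1)
j_star, k_star = np.unravel_index(net.argmin(), net.shape)
Y = np.zeros(net.shape)
Y[j_star, k_star] = 1                        # Y_jk = 1 only at the winner
```

Here node (1, 0) holds the weight vector closest to x, so it wins and is the only node with output 1.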

Computation Process (cont.)

4. Update the weights:
    △W_ijk = η (X_i − W_ijk) · RF_jk, where RF_jk = e^(−r_jk/R) and r_jk is the distance from node (j, k) to the winning node (j*, k*);
    then W_ijk = W_ijk + △W_ijk.
5. Repeat steps 2–4 for all input vectors.
6. Decay the rates: η_n = η-rate · η_(n-1), R_n = R-rate · R_(n-1).
7. Repeat until the network converges.
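A single weight-update step under these formulas; the weight value 0.3 is an illustrative assumption (the node and winner coordinates match the chapter's later example):

```python
import math

# One update: dW = eta * (X_i - W_ijk) * RF_jk, with RF_jk = e^(-r_jk/R)
# and r_jk the grid distance from node (j, k) to the winner (j*, k*).
eta, R = 1.0, 2.0
winner = (2, 2)
node = (0, 0)
w, x = 0.3, -0.9                    # illustrative weight value and input component
r = math.dist(node, winner)         # sqrt(8), about 2.83
RF = math.exp(-r / R)               # about 0.243
dW = eta * (x - w) * RF
w_new = w + dW                      # weight moves toward x, scaled by RF
```

Because the node sits far from the winner, RF damps the move: the weight shifts only about a quarter of the way toward the input component.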

Example

Consider a 2-dimensional clustering problem. The vector space includes 5 different clusters, and each cluster has 2 sample points.

(Figure: a scatter plot of the sample points in the X1–X2 plane.)

Solution

– Set up a 2×9 network: 2 inputs, 9 output nodes arranged in a 3×3 grid, indexed (0,0) through (2,2).
– Randomly assign the weights (the initial weight values appear in the table on the original slide).
– Let R = 2.0, η = 1.0, and RF_jk = e^(−r_jk/R).
– Feed in the first pattern [−0.9, −0.8] and compute net_jk = (X1 − W_1jk)^2 + (X2 − W_2jk)^2 for every node:

    net_00 = 0.49    net_01 = 1.37    net_02 = 2.32
    net_10 = 1.71    net_11 = 1.36    net_12 = 0.45
    net_20 = 1.04    net_21 = 2.93    net_22 = 0.05

  net_22 is the minimum, so node (2,2) is the winning node.

Solution (cont.)

– Update the weights, with winner j* = 2, k* = 2:
    r_00 = √((0−2)^2 + (0−2)^2) = √8 ≈ 2.83, so RF_00 = e^(−2.83/2.0) ≈ 0.243
    r_01 = √((0−2)^2 + (1−2)^2) = √5 ≈ 2.24, so RF_01 = e^(−2.24/2.0) ≈ 0.327
    r_02 = √((0−2)^2 + (2−2)^2) = 2, so RF_02 = e^(−2/2.0) ≈ 0.368
    ...
    r_22 = 0, so RF_22 = e^0 = 1
– For example, for weight W_100:
    △W_100 = η · (X1 − W_100) · RF_00 = 1.0 × (−0.9 − W_100) × 0.243
  (the numeric result depends on the initial weight value shown on the original slide).
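The R-factor values in this example can be verified directly from the winner coordinates (2, 2) and R = 2.0:

```python
import math

R = 2.0
winner = (2, 2)
r00 = math.dist((0, 0), winner)   # sqrt(8), about 2.83
r22 = math.dist((2, 2), winner)   # 0: the winner is its own center
RF00 = math.exp(-r00 / R)         # matches the slide's 0.243
RF22 = math.exp(-r22 / R)         # matches the slide's 1
```

This confirms the two values quoted on the slide: RF_00 ≈ 0.243 and RF_22 = 1.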