
Unsupervised Learning Networks Lecturer: 虞台文

Content Introduction Important Unsupervised Learning NNs – Hamming Networks – Kohonen’s Self-Organizing Feature Maps – Grossberg’s ART Networks – Counterpropagation Networks – Adaptive BAM – Neocognitron Conclusion

Unsupervised Learning Networks Introduction

What is Unsupervised Learning? Learning without a teacher: no feedback indicates the desired outputs. The network must discover by itself the relationships of interest in the input data – e.g., patterns, features, regularities, correlations, or categories – and translate the discovered relationships into outputs.

A Strange World

Supervised Learning [scatter plot: Height vs. IQ with labeled classes A, B, C]

Supervised Learning [the same scatter plot] Try Classification

The Probabilities of Populations [plot: class-conditional distributions of classes A, B, C over Height and IQ]

The Centroids of Clusters [scatter plot: Height vs. IQ, classes A, B, C, with the cluster centroids marked]

The Centroids of Clusters [the same plot] Try Classification

Unsupervised Learning [scatter plot: Height vs. IQ, unlabeled data]

Clustering Analysis [scatter plot: Height vs. IQ, unlabeled data] Categorize the input patterns into several classes based on the similarity among patterns.

Clustering Analysis [the same plot] How many classes should we have?

Clustering Analysis [the same data grouped into 2 clusters]

Clustering Analysis [the same data grouped into 3 clusters]

Clustering Analysis [the same data grouped into 4 clusters]

Unsupervised Learning Networks The Hamming Networks

The Nearest Neighbor Classifier Suppose that we have p prototypes centered at x(1), x(2), …, x(p). A given pattern x is assigned to the class label of the i-th prototype if x(i) is the closest prototype, i.e., d(x, x(i)) ≤ d(x, x(j)) for all j = 1, …, p. Examples of distance measures include the Hamming distance and the Euclidean distance.

The Nearest Neighbor Classifier [diagram: four stored prototypes x(1), x(2), x(3), x(4) in the plane]

The Nearest Neighbor Classifier [diagram: the four prototypes and an unknown pattern; which class does it belong to?]
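
The classifier on these slides can be sketched in a few lines. This is a generic illustration (the function name, prototype coordinates, and query point are my own, not from the slides), assuming Euclidean distance:

```python
import numpy as np

def nearest_prototype(x, prototypes):
    """Return the index of the prototype closest to x (Euclidean distance)."""
    dists = [np.linalg.norm(x - p) for p in prototypes]
    return int(np.argmin(dists))

# Four stored prototypes x(1)..x(4), echoing the slide's 2-D illustration
prototypes = [np.array([0.0, 0.0]), np.array([4.0, 0.0]),
              np.array([0.0, 4.0]), np.array([4.0, 4.0])]
print(nearest_prototype(np.array([3.5, 0.5]), prototypes))  # -> 1
```

Swapping `np.linalg.norm` for a Hamming count turns the same skeleton into the Hamming-based classifier discussed next.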

The Hamming Networks Store a set of classes represented by a set of binary prototypes. Given an incomplete binary input, find the class to which it belongs. Use the Hamming distance as the distance measure. Distance vs. Similarity.

The Hamming Net [diagram: inputs x_1, x_2, …, x_n feed a similarity-measurement layer, whose outputs feed a MAXNET winner-take-all layer]

The Hamming Distance y = 1   x =  1   1 1 Hamming Distance = ?

y = 1   x =  1   1 1 The Hamming Distance Hamming Distance = 3

y = 1   The Hamming Distance   1  1 1 Sum=1 x =  1   1 1

The Hamming Distance
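
The Hamming distance of bipolar vectors can be computed from the inner product, HD = (m − xᵀy)/2, which is what the similarity layer exploits. A quick check (the vectors below are my own illustration, since the slide's originals did not survive extraction):

```python
import numpy as np

def hamming_distance(x, y):
    """Hamming distance between bipolar (+1/-1) vectors via the inner product."""
    m = len(x)
    return (m - int(np.dot(x, y))) // 2

x = np.array([ 1, -1,  1,  1, -1,  1])
y = np.array([ 1, -1, -1, -1, -1, -1])
print(hamming_distance(x, y))  # differs in positions 2, 3, 5 -> 3
```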

The Hamming Net [architecture diagram: m inputs x_1, …, x_m feed a similarity-measurement subnet with outputs y_1, …, y_n, which feed the MAXNET winner-take-all subnet]

The Hamming Net [the same diagram, asking for the two weight matrices: W_S (similarity subnet) = ? and W_M (MAXNET) = ?]

The Stored Patterns [the same diagram: the stored prototypes determine W_S]

The Stored Patterns [diagram: similarity node k receives x_1, …, x_m through weights derived from the k-th prototype, plus a bias of m/2]

Weights for Stored Patterns [diagram: W_S = ?]

Weights for Stored Patterns The k-th row of W_S is the k-th prototype scaled by 1/2, i.e., w_kj = x_j^(k)/2, with bias m/2. Node k then outputs m/2 + xᵀx^(k)/2 = m − HD(x, x^(k)), the number of bits of x that match prototype k.

The MAXNET [diagram: the MAXNET winner-take-all subnet sits on top of the similarity-measurement layer]

Weights of MAXNET [diagram: each MAXNET node with its self-connection and lateral connections]

Weights of MAXNET [the same diagram: each node has a self-excitation weight of 1 and lateral inhibition weights of −ε to every other node, where 0 < ε < 1/n]

Updating Rule [diagram: the similarity scores s_1, s_2, …, s_n initialize the MAXNET nodes]

Updating Rule y_k(0) = s_k; y_k(t+1) = max(0, y_k(t) − ε Σ_{j≠k} y_j(t)). Iterate until only one node, the winner, remains positive.
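
The MAXNET iteration can be sketched as follows. The initial scores and the value of ε below are my own illustrative choices (any 0 < ε < 1/n works), and the sketch assumes a unique maximum among the initial scores:

```python
import numpy as np

def maxnet(s, eps):
    """Iterate y_k <- max(0, y_k - eps * (sum of the other activations))
    until a single node (the winner) remains positive."""
    y = np.array(s, dtype=float)
    while np.count_nonzero(y > 0) > 1:
        total = y.sum()
        y = np.maximum(0.0, y - eps * (total - y))  # subtract lateral inhibition
    return int(np.argmax(y))

s = [0.5, 0.9, 0.7, 0.3]      # initial similarity scores
print(maxnet(s, eps=0.2))     # node 1 wins  (0 < eps < 1/4)
```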

Analysis of the Updating Rule [derivation showing that, with 0 < ε < 1/n, the iteration preserves the ordering of the activations and drives every node except the maximally activated one to zero]

Example
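
An end-to-end Hamming net along these lines might look like the sketch below, combining the similarity layer (weights x^(k)/2, bias m/2) with the MAXNET competition. The prototypes, the input, and ε are my own illustration, not taken from the slides:

```python
import numpy as np

def hamming_net(x, prototypes, eps=0.05):
    """Classify a bipolar vector x against stored bipolar prototypes.
    Lower subnet: y_k = m/2 + x . x(k)/2  (= number of matching bits).
    Upper subnet (MAXNET): lateral inhibition until one winner remains."""
    P = np.array(prototypes, dtype=float)
    m = P.shape[1]
    y = P @ x / 2.0 + m / 2.0             # matching-bit counts
    while np.count_nonzero(y > 0) > 1:    # winner-take-all competition
        y = np.maximum(0.0, y - eps * (y.sum() - y))
    return int(np.argmax(y))

protos = [np.array([ 1,  1,  1, -1, -1, -1]),
          np.array([-1, -1,  1,  1, -1,  1]),
          np.array([ 1, -1,  1, -1,  1, -1])]
x = np.array([1, 1, 1, -1, -1, 1])        # prototype 0 with one bit flipped
print(hamming_net(x, protos))             # -> 0
```

Note that the net recovers the correct class even though the input matches no stored prototype exactly, which is the point of the "incomplete binary input" scenario.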

Unsupervised Learning Networks The Self-organizing Feature Map

Feature Mapping Map high-dimensional input signals onto a lower-dimensional (usually 1-D or 2-D) structure, such that similarity relations present in the original data are still present after the mapping. Dimensionality Reduction. Topology-Preserving Map.

Somatotopic Map Illustration: The “Homunculus” The relationship between body surfaces and the regions of the brain that control them.

Another Depiction of the Homunculus

Phonotopic maps

humppila [trajectory of the spoken word on the phonotopic map]

Self-Organizing Feature Map Developed by Professor Teuvo Kohonen. One of the most popular neural network models. Unsupervised learning. A competitive learning network.

The Structure of SOM

Example

Local Excitation, Distal Inhibition

Topological Neighborhood [diagrams: square and hexagonal grid neighborhoods]

Neighborhood Size Shrinkage [diagram: the neighborhood radius shrinks over time]

Learning Rule Similarity Matching: find the winning node c = argmin_k ‖x − w_k‖. Updating: w_k(t+1) = w_k(t) + η(t) h_{c,k}(t) (x(t) − w_k(t)), where η(t) is the learning rate and h_{c,k}(t) is the neighborhood function centered on the winner c.
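
These two steps (similarity matching, then neighborhood updating) can be sketched as a minimal 1-D SOM. All parameters here (grid size, learning-rate and neighborhood schedules, the Gaussian neighborhood) are my own illustrative choices, not prescriptions from the slides:

```python
import numpy as np

def train_som(data, n_nodes=10, epochs=20, eta0=0.5, sigma0=3.0, seed=0):
    """Train a 1-D SOM: (1) similarity matching -> winner c;
    (2) update w_k += eta * h(c, k) * (x - w_k), Gaussian neighborhood h."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(data.min(), data.max(), size=(n_nodes, data.shape[1]))
    for t in range(epochs):
        eta = eta0 * (1 - t / epochs)             # learning-rate decay
        sigma = sigma0 * (1 - t / epochs) + 0.5   # neighborhood shrinkage
        for x in data:
            c = np.argmin(np.linalg.norm(W - x, axis=1))  # winning node
            h = np.exp(-((np.arange(n_nodes) - c) ** 2) / (2 * sigma ** 2))
            W += eta * h[:, None] * (x - W)       # pull winner & neighbors
    return W

# Two well-separated 2-D clusters; after training the map units spread
# to cover both, preserving neighborhood order along the 1-D grid.
data = np.vstack([np.random.default_rng(1).normal(0, 0.1, (20, 2)),
                  np.random.default_rng(2).normal(5, 0.1, (20, 2))])
W = train_som(data)
```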

Example