
1

2 Outcomes  Look at the theory of self-organisation  Look at other self-organising networks  Look at examples of neural network applications

3 Four requirements for SOM  The weights in each neuron must represent a class of patterns: one neuron, one class

4 Four requirements for SOM  The input pattern is presented to all neurons and each produces an output  The output is a measure of the match between the input pattern and the pattern stored by the neuron

5 Four requirements for SOM  A competitive learning strategy selects the neuron with the largest response

6 Four requirements for SOM  A method of reinforcing the largest response

7 Architecture  The Kohonen network (named after Teuvo Kohonen from Finland) is a self-organising network  Neurons are usually arranged on a 2-dimensional grid  Inputs are sent to all neurons  There are no connections between neurons

8 Architecture  [Figure: Kohonen network, with the input X connected to every neuron in a 2-dimensional grid]

9 Theory  For neuron j, the output is a weighted sum:  net_j = sum_i ( w_ij * x_i )  where x is the input vector, w the weights, and net_j the output of neuron j

10 Four requirements in Kohonen networks  One neuron, one class: true  Matching output: Euclidean distance or weighted sum  Selection: winner takes all  Reinforcement: the Kohonen learning rule

11 Output value  The output of each neuron is the weighted sum of its inputs  There is no threshold or bias  Input values and weights are normalised

12 “Winner takes all”  Initially the weights in each neuron are random  Input values are sent to all the neurons  The outputs of each neuron are compared  The “winner” is the neuron with the largest output value
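The winner-takes-all step above can be sketched as follows (the input pattern and weights here are illustrative, borrowed from the worked example later in the deck; the output is the plain weighted sum the slides describe, with no threshold or bias):

```python
# Winner-takes-all sketch for a Kohonen network.
x = [0.6, 0.6, 0.6]            # input pattern, sent to all neurons
weights = [
    [0.5, 0.3, 0.8],           # neuron 0
    [-0.6, -0.5, 0.6],         # neuron 1
]

def output(w, x):
    """Weighted sum of the inputs: the neuron's response."""
    return sum(wi * xi for wi, xi in zip(w, x))

# Each neuron produces an output; the largest response wins.
outputs = [output(w, x) for w in weights]
winner = outputs.index(max(outputs))
```

Here neuron 0 responds with 0.96 and neuron 1 with -0.30, so neuron 0 is the winner.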

13 Training  Having found the winner, the weights of the winning neuron are adjusted  Weights of neurons in a surrounding neighbourhood are also adjusted

14 Neighbourhood  [Figure: Kohonen network grid showing the winning neuron X and its surrounding neighbourhood]

15 Training  As training progresses the neighbourhood gets smaller  Weights are adjusted according to the following formula:  w_new = w_old + alpha * (x - w_old)  where alpha is the learning coefficient and x is the input pattern

16 Weight adjustment  The learning coefficient (alpha) starts with a value of 1 and gradually reduces to 0  This has the effect of making big changes to the weights initially, but no changes at the end  The weights are adjusted so that they more closely resemble the input patterns
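The decaying learning coefficient can be sketched as below. This is a minimal illustration assuming the standard Kohonen update w + alpha * (x - w); the linear decay schedule and the epoch count are illustrative assumptions, not taken from the slides:

```python
# Kohonen weight update with a learning coefficient that decays
# from 1 towards 0 over training.
def update(w, x, alpha):
    # Move the weights closer to the input pattern.
    return [wi + alpha * (xi - wi) for wi, xi in zip(w, x)]

w = [0.5, 0.3, 0.8]            # initial (random) weights
x = [0.6, 0.6, 0.6]            # input pattern
epochs = 10
for t in range(epochs):
    alpha = 1.0 - t / epochs   # big changes early, none at the end
    w = update(w, x, alpha)
# After training, the weights closely resemble the input pattern.
```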

17 Example  A Kohonen network receives the input pattern 0.6 0.6 0.6.  Two neurons in the network have weights 0.5 0.3 0.8 and -0.6 -0.5 0.6.  Which neuron will have its weights adjusted, and what will the new values of the weights be if the learning coefficient is 0.4?

18 Answer
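The answer slide is blank in the transcript, so here is the computation spelled out, assuming the weighted-sum matching and the standard Kohonen update w + alpha * (x - w) described on the earlier slides:

```python
# Worked answer: which neuron wins, and its adjusted weights.
x = [0.6, 0.6, 0.6]            # input pattern
w1 = [0.5, 0.3, 0.8]           # neuron 1 weights
w2 = [-0.6, -0.5, 0.6]         # neuron 2 weights

def dot(w):
    return sum(wi * xi for wi, xi in zip(w, x))

# Neuron 1 output is 0.96, neuron 2 output is -0.30: neuron 1 wins.
alpha = 0.4
new_w1 = [wi + alpha * (xi - wi) for wi, xi in zip(w1, x)]
# new_w1 is approximately [0.54, 0.42, 0.72]
```

So neuron 1 has its weights adjusted, and the new weights are 0.54, 0.42, 0.72.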

19 Summary  The Kohonen network is self-organising  It uses unsupervised training  All the neurons are connected to the input  A winner takes all mechanism determines which neuron gets its weights adjusted  Neurons in a neighbourhood also get adjusted

20 Demonstration  Demonstrations of a Kohonen network learning have been taken from the following websites:  http://www.patol.com/java/TSP/index.html  http://www.samhill.co.uk/kohonen/index.htm

21 Applications of Neural Networks

22 Example Applications  Analysis of data  Classification of EEG signals  Pattern recognition in ECG  Disease detection in EMG

23 Gueli N et al (2005) The influence of lifestyle on cardiovascular risk factors analysis using a neural network. Archives of Gerontology and Geriatrics 40, 157-172  Aim: to produce a model of risk factors in heart disease  An MLP was used  Accuracy was relatively good for cholesterolemia and triglyceridemia:  Training phase: around 99%  Testing phase: around 93%  Not so good for HDL

24

25 Subasi A (in press) Automatic recognition of alertness level from EEG by using neural network and wavelet coefficients. Expert Systems with Applications xx (2004) 1-11  Electroencephalography (EEG): recordings of electrical activity from the brain  Classification of alertness levels:  Awake  Drowsy  Sleep

26  MLP with a 15-23-3 architecture (15 inputs, 23 hidden neurons, 3 outputs)  Hidden layer: log-tanh function  Output layer: log-sigmoid function  Input is normalised to the range 0 to 1

27  Accuracy:  95% +/- 3% alert  93% +/- 4% drowsy  92% +/- 5% sleep  Features were extracted from wavelet coefficients and formed the input to the network

28 Karsten Sternickel (2002) Automatic pattern recognition in ECG time series. Computer Methods and Programs in Biomedicine 68, 109-115  ECG (electrocardiograph): electrical signals from the heart  Wavelets were again used for feature extraction  Classification of patterns  Patterns were spotted in the time series

29

30

31 Abel et al (1996) Neural network analysis of the EMG interference pattern. Med. Eng. Phys. 18(1), 12-17  EMG (electromyography): muscle activity  Interference patterns are signals produced from various parts of a muscle; their features are hard to see  A neural network was applied to EMG interference patterns

32  Classifying:  Nerve disease  Muscle disease  Controls  Various different ways of presenting the pattern to the ANN were tried  Good for less severe cases; severe cases can often be seen by the clinician

33 Example Applications  Wave prediction  Controlling a vehicle  Condition monitoring

34 Wave prediction  Rao S, Mandal S (2005) Hindcasting of storm waves using neural networks. Ocean Engineering 32, 667-684  An MLP was used to predict storm waves  2:2:2 network (2 inputs, 2 hidden neurons, 2 outputs)  Good correlation between the ANN model and another model

35

36 van de Ven P, Flanagan C, Toal D (in press) Neural network control of underwater vehicles. Engineering Applications of Artificial Intelligence  Semi-autonomous vehicle  Control using an ANN  The ANN replaces a mathematical model of the system

37

38

39 Silva et al (2000) The adaptability of a tool wear monitoring system under changing cutting conditions. Mechanical Systems and Signal Processing 14(2), 287-298  Modelling tool wear  Combines an ANN with other AI techniques (expert systems)  Self-organising maps (SOM) and ART2 were investigated  SOM was better at extracting the required information

40

41 Examples to try yourself  A.1 Number recognition (ONR)  http://www.generation5.org/jdk/demos.asp#neuralNetworks  Details: http://www.generation5.org/content/2004/simple_ocr.asp

42  B.1 Kohonen Self Organising Example 1  http://www.generation5.org/jdk/demos.asp#neuralNetworks  B.2 Kohonen 3D travelling salesman problem  http://fbim.fh-regensburg.de/~saj39122/jfroehl/diplom/e-index.html

