Associative Learning in Hierarchical Self Organizing Learning Arrays
Janusz A. Starzyk, Zhen Zhu, and Yue Li
School of Electrical Engineering and Computer Science, Ohio University, Athens, OH 45701, U.S.A.
Organization
– Introduction
– Network structure
– Associative learning
– Simulation results
– Conclusions and future work
Introduction - SOLAR
SOLAR – Self Organizing Learning Array
– A concept inspired by the structure of biological neural networks
– A regular, two- or three-dimensional array of identical processing cells, connected to programmable routing channels
– Self-organizing in both the individual cells and the network structure
Introduction - SOLAR
SOLAR vs. ANN
– Deep multi-layer structure with sparse connections
– Self-organized neuron functions
– Dynamic selection of interconnections
– Hardware efficiency
– Online learning
[Figure: a 15 x 7 SOLAR next to a 15 x 3 ANN]
Introduction - Associative Learning
Hetero-associative (HA)
– Associates different types of input signals, e.g. a verbal command with an image
Auto-associative (AA)
– Recalls a pattern from a fractional part, e.g. an image with a missing part
The proposed approach: an associative learning network in a hierarchical SOLAR structure, performing both HA and AA
(Source: www.idi.ntnu.no/~keithd/classes/advai/lectures/assocnet.ppt)
Network Structure
Two- or three-dimensional multi-layer regular structure
– 2-D networks: the input span runs along the rows and the network depth along the columns
– 3-D networks: better suited to image applications
“Small world” network connections
– Mostly local connections with short Euclidean distance (as in biological neural networks)
– A few distant connections (sketched below)
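To make the wiring scheme concrete, here is a minimal Python sketch of generating such “small world” connectivity, assuming Gaussian-distributed local links plus a small fraction of uniformly distributed distant ones. The function name, the std-dev default and the distant-link probability are illustrative choices, not values from the deck (the simulation slides quote their own STDs).

```python
# Hypothetical sketch of "small world" wiring between two layers:
# mostly local links drawn from a Gaussian around each neuron's row,
# plus an occasional distant link drawn uniformly.
import numpy as np

rng = np.random.default_rng(0)

def sample_connections(n_rows, n_inputs=2, local_std=2.0, p_distant=0.2):
    """For each neuron (row), pick source rows in the preceding layer."""
    sources = np.zeros((n_rows, n_inputs), dtype=int)
    for row in range(n_rows):
        for k in range(n_inputs):
            if rng.random() < p_distant:
                src = rng.integers(0, n_rows)                # rare distant link
            else:
                src = int(round(rng.normal(row, local_std))) % n_rows  # local link
            sources[row, k] = src
    return sources

print(sample_connections(15))
```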
Network Structure
Hierarchical network connection
– Each neuron connects only to the preceding layer
Neuron connections
– Redundant initial inputs that are refined during learning
– Two inputs (I1 / I2) and one output O
– Feed-forward and feedback links
Associative learning – feed-forward
Semi-logic inputs and internal signals:
– scaled from 0 to 1, with 0.5 = unknown
– 0 = determinate low, 1 = determinate high
– > 0.5 = weak high, < 0.5 = weak low
The I1/I2 relationship is found from:
– the marginal probabilities P(I1 is low), P(I1 is high), P(I2 is low) and P(I2 is high)
– the joint probabilities, e.g. P(I1 is low, I2 is low)
– the conditional probabilities, e.g. P(I2 is low | I1 is low)
Associative learning – feed-forward
Compare the conditional probabilities against a confidence interval CI, computed from the number of samples N.
If P(I2 | I1) – CI > threshold, then I2 can be implied from I1.
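Below is a minimal Python sketch of this implication test, assuming a standard binomial confidence half-width z·sqrt(p(1-p)/N) in place of the slide's CI formula (which was an image and is not reproduced here); the threshold and z values are illustrative.

```python
# Hypothetical sketch of the implication test on semi-logic samples.
import numpy as np

def implies(i1, i2, threshold=0.75, z=1.96):
    """True if 'I2 is low' can be implied from 'I1 is low'."""
    i1_low = i1 < 0.5                   # semi-logic: < 0.5 means (weak) low
    i2_low = i2 < 0.5
    n = np.count_nonzero(i1_low)        # samples where the condition holds
    if n == 0:
        return False
    p = np.count_nonzero(i1_low & i2_low) / n   # P(I2 low | I1 low)
    ci = z * np.sqrt(p * (1.0 - p) / n)         # assumed binomial CI form
    return p - ci > threshold

rng = np.random.default_rng(1)
i1 = rng.random(200)
i2 = np.where(i1 < 0.5, rng.random(200) * 0.4, rng.random(200))  # correlated
print(implies(i1, i2))
```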
Associative learning – feed-forward
A neuron is an associative neuron if I1 can be implied from I2 and I2 can be implied from I1; otherwise it is a transmitting neuron.
There are six possible I1/I2 distributions for associative neurons, and a semi-logic function is designed for each one.
Associative learning – feed-forward
In an associative neuron:
– Functions are designed for data transformation and feedback calculation
– f1 to f4: for data centered in one dominant quadrant
– f5 to f6: for data lying mainly in two quadrants
– The neuron output is 0.5 when an input is unknown
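The exact forms of f1 to f6 are given graphically in the slides; as a stand-in, here is a minimal sketch of what a semi-logic function can look like, assuming fuzzy AND/OR-style behavior combined with the 0.5-means-unknown rule from this slide.

```python
# Hypothetical stand-ins for the deck's semi-logic functions: min/max act
# as fuzzy AND/OR, and an unknown input (0.5) forces an unknown output.
def semi_and(i1, i2):
    """AND-like semi-logic: both high -> high, one determinate low -> low."""
    if i1 == 0.5 or i2 == 0.5:      # unknown input propagates as unknown
        return 0.5
    return min(i1, i2)

def semi_or(i1, i2):
    """OR-like semi-logic: either high -> high."""
    if i1 == 0.5 or i2 == 0.5:
        return 0.5
    return max(i1, i2)

print(semi_and(0.9, 0.8), semi_or(0.1, 0.5))
```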
Associative learning – feed-forward
In a transmitting neuron:
– The input with the higher entropy (the dominant input) is transmitted to O; the other is ignored
– By convention, the dominant input is assigned to I1
O may be an input to other neurons; it receives feedback Of from the connected neurons and in turn generates feedback to its own inputs.
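A minimal sketch of the dominant-input selection, assuming "entropy" means the entropy of each signal's low/high split; the helper names are illustrative.

```python
# Hypothetical sketch: pick the higher-entropy input of a transmitting neuron.
import numpy as np

def binary_entropy(signal):
    """Entropy of the low/high split of a semi-logic signal (>0.5 = high)."""
    p_high = np.count_nonzero(signal > 0.5) / len(signal)
    if p_high in (0.0, 1.0):
        return 0.0
    return -(p_high * np.log2(p_high) + (1 - p_high) * np.log2(1 - p_high))

def dominant_input(i1, i2):
    """Return the input carrying more information; the other is ignored."""
    return i1 if binary_entropy(i1) >= binary_entropy(i2) else i2
```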
Associative learning – feedback
The network generates feedback to the unknown inputs through association.
Associative learning – feedback
N1 – transmitting neurons: Of is passed back to the input.
N2 – associative neurons with determined inputs: feedback has no effect and information passes forward.
N3 – associative neurons with active feedback and inactive input(s): Of creates the feedbacks I1f and I2f through the function f; these neurons only pass information backwards.
N4 – actively associating neurons with inactive feedback: if one of their inputs is inactive, it is overwritten based on its association with the other input and the neuron's function f.
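The four cases amount to a dispatch over the neuron's state. A minimal sketch follows, where `associative`, `f` and `inverse_f` are hypothetical attributes standing in for the neuron type, its semi-logic function, and the mapping that recovers input feedback from Of; the slides define these graphically.

```python
# Hypothetical sketch of the feedback pass for one neuron.
UNKNOWN = 0.5

def backpropagate(neuron, o_f):
    i1, i2 = neuron.i1, neuron.i2
    if not neuron.associative:                  # N1: transmitting neuron
        return o_f, o_f                         # Of passed back to the input
    if i1 != UNKNOWN and i2 != UNKNOWN:         # N2: inputs determined
        return i1, i2                           # feedback has no effect
    if o_f != UNKNOWN:                          # N3: active feedback
        return neuron.inverse_f(o_f)            # Of -> (I1f, I2f) through f
    if i1 == UNKNOWN:                           # N4: inactive feedback;
        return neuron.f(i2), i2                 # overwrite the inactive input
    return i1, neuron.f(i1)                     # via its association partner
```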
Associative learning – feedback
Calculation of the feedback (using f5): with an active output feedback, I1f is determined from f5 and weighted by w5, where w5 measures the quality of learning.
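A minimal sketch of the weighted feedback step, assuming w5 blends the f5-derived value toward "unknown" (0.5) when learning quality is low; `inverse_f5` is a hypothetical stand-in for the slide's graphically defined mapping.

```python
# Hypothetical sketch: I1f from the output feedback Of via f5, scaled by
# the learning-quality weight w5 (poor learning pulls the value to 0.5).
def weighted_feedback(o_f, i2, inverse_f5, w5):
    i1f_raw = inverse_f5(o_f, i2)             # candidate I1f implied by f5
    return w5 * i1f_raw + (1.0 - w5) * 0.5    # w5 assumed in [0, 1]
```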
Simulation results
– The TAE database (from the University of Wisconsin-Madison): 151 instances, 5 features, 3 classes
– The Iris plants database: 150 instances, 4 features, 3 classes
– The glass identification database: 214 instances, 9 features, 6 classes
– Image recovery: two letters, B and J
Simulation results - TAE database
Not hierarchical; Gaussian connection distribution, vertical (STD = 30) and horizontal (STD = 5); correct rate = 62.33%
Features are coded into binary format with sliding bars and classified using orthogonal codes, as in the sketch below:
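A minimal sketch of the coding, assuming "sliding bars" means a thermometer-style bar of consecutive 1s whose position encodes the feature value, and "orthogonal codes" means one-hot class codes; the bit widths are illustrative.

```python
# Hypothetical sketch of sliding-bar feature coding and one-hot class codes.
import numpy as np

def sliding_bar(value, vmin, vmax, n_bits=8, bar=3):
    """Place a bar of consecutive 1s whose position encodes the value."""
    pos = int(round((value - vmin) / (vmax - vmin) * (n_bits - bar)))
    code = np.zeros(n_bits, dtype=int)
    code[pos:pos + bar] = 1
    return code

def orthogonal_code(class_id, n_classes):
    """One-hot code: one bit per class, mutually orthogonal."""
    code = np.zeros(n_classes, dtype=int)
    code[class_id] = 1
    return code

print(sliding_bar(5.1, 4.0, 8.0), orthogonal_code(1, 3))
```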
Simulation results - Iris database
Not hierarchical; Gaussian connection distribution, vertical (STD = 30) and horizontal (STD = 5); correct rate = 73.74%
Simulation results - Iris database
Hierarchical; vertical connections 80% Gaussian (STD = 2) and 20% uniform; correct rate improved to 75.33%
Simulation results - Iris database
The hierarchical structure appears advantageous.
Using an equal number of bits for features and class IDs gives a better rate.
Performance further improved to 86% with mixed feature/classification bits.
Simulation results - glass ID database
The depth of learning is related to the complexity of the target problem: with more classes, more actively associating neurons and more layers are needed.
[Figure: average number of actively associating neurons per layer, with 3 vs. 6 classes]
Simulation results - Image Recovery
A 2-D image recovery task: 200 patterns were generated by adding random noise to two black-and-white images of the letters B and J. The network was trained with 190 patterns and tested using 10 patterns. Mean correct classification rate: 100%.
[Figure: training patterns and an average image of the training patterns]
Simulation results - Image Recovery
[Figures: testing result and recovered images; testing result and recovered image using input redundancy]
Conclusion and Future Work
– SOLAR has a flexible, sparse interconnect structure designed to emulate the organization of the human cortex
– It handles a wide variety of machine learning tasks, including image recognition, classification and data recovery, and is suitable for online learning
– The associative learning SOLAR is an adaptive network with feedback and inhibitory links
– It discovers correlations between inputs and establishes associations inside the neurons
– It can perform both auto-associative and hetero-associative learning
– It can be modified to perform value-driven interaction with the environment