1
Neural Networks Architecture
Baktash Babadi
IPM, SCS
Fall 2004
2
The Neuron Model
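The slide's figure did not survive extraction. As a minimal sketch, the standard model is a unit that fires when the weighted sum of its inputs exceeds a threshold (all names and values below are illustrative, not from the slides):

```python
import numpy as np

def neuron(inputs, weights, threshold=0.0):
    """Binary threshold unit: output 1 if the weighted sum of the
    inputs exceeds the threshold, 0 otherwise."""
    activation = np.dot(weights, inputs)
    return 1 if activation > threshold else 0

# Example: two inputs, weights favoring the first input
print(neuron(np.array([1, 0]), np.array([0.7, -0.3])))  # -> 1
```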
3
Architectures (1): Feed-Forward Networks
- The neurons are arranged in separate layers.
- There are no connections between neurons in the same layer.
- The neurons in one layer receive inputs from the previous layer.
- The neurons in one layer deliver their outputs to the next layer.
- The connections are unidirectional (hierarchical).
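To make the one-way data flow concrete, here is a minimal sketch of a forward pass through such layers (layer sizes, weights, and names are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# A three-layer feed-forward net: each layer only feeds the next one.
layer_sizes = [4, 3, 2]
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x, weights):
    """Propagate an input unidirectionally through the layers."""
    for W in weights:
        x = np.heaviside(W @ x, 0.0)  # binary threshold at each layer
    return x

print(forward(np.array([1.0, 0.0, 1.0, 1.0]), weights))
```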
4
Architectures (2): Recurrent Networks
- Some connections run from a layer back to previous layers.
5
Architectures (3): Associative Networks
- There is no hierarchical arrangement.
- The connections can be bidirectional.
6
Why Feed Forward? Why Recurrent/Associative?
7
An Example of Associative Networks: The Hopfield Network
John Hopfield (1982)
- Associative memory via artificial neural networks
- Solutions to optimization problems
- Grounded in statistical mechanics
8
Neurons in the Hopfield Network
- The neurons are binary units: either active (1) or passive (0); alternatively + or -.
- The network contains N neurons.
- The state of the network is described as a vector of 0s and 1s: S = (s_1, s_2, ..., s_N), with each s_i in {0, 1}.
9
The Architecture of the Hopfield Network
- The network is fully interconnected: all the neurons are connected to each other.
- The connections are bidirectional and symmetric (w_ij = w_ji).
- The setting of the weights depends on the application.
10
Updating the Hopfield Network
The state of the network changes at each time step. There are four updating modes:
- Serial-Random: the state of a single, randomly chosen neuron is updated at each time step.
- Serial-Sequential: the state of a single neuron is updated at each time step, in a fixed sequence.
- Parallel-Synchronous: all the neurons are updated at each time step synchronously.
- Parallel-Asynchronous: all the neurons that are not refractory are updated at the same time.
11
The Updating Rule (1)
Here we assume that updating is serial-random; updating continues until a stable state is reached.
- Each neuron receives a weighted sum of the inputs from the other neurons: h_i = sum_j w_ij * s_j.
- If the input is positive the state of the neuron becomes 1, otherwise 0: s_i = 1 if h_i > 0, else s_i = 0.
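A minimal sketch of this serial-random rule in code (0/1 states as above; all variable names are mine, not from the slides):

```python
import numpy as np

def update_serial_random(s, W, steps=1000, rng=None):
    """Serial-random Hopfield updating: at each time step pick one
    neuron at random and set it to 1 if its weighted input from the
    other neurons is positive, 0 otherwise."""
    rng = rng or np.random.default_rng()
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        h = W[i] @ s          # weighted sum of inputs to neuron i
        s[i] = 1 if h > 0 else 0
    return s
```

A parallel-synchronous variant would instead recompute all inputs at once and threshold every neuron in one sweep, e.g. `s = np.where(W @ s > 0, 1, 0)`.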
12
The Updating Rule (2)
13
Convergence of the Hopfield Network (1)
Does the network eventually reach a stable state (convergence)? To evaluate this, an 'energy' value is associated with the network: E = -1/2 * sum_i sum_j w_ij * s_i * s_j. The system has converged when the energy is minimized.
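Assuming the standard zero-threshold form of the energy written above, a direct transcription into code:

```python
import numpy as np

def energy(s, W):
    """Hopfield energy E = -1/2 * sum_ij w_ij * s_i * s_j.
    Serial updating never increases this quantity."""
    return -0.5 * s @ W @ s
```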
14
Convergence of the Hopfield Network (2)
Why energy? An analogy with spin-glass models of ferromagnetism (the Ising model): the system is stable when the energy is minimized.
15
Convergence of the Hopfield Network (3) Why convergence?
16
Convergence of the Hopfield Network (4)
The change of E with updating: when a single neuron i changes by Δs_i, the symmetry of the weights gives ΔE = -Δs_i * h_i. The updating rule sets s_i to 1 only when h_i > 0 and to 0 only when h_i <= 0, so Δs_i and h_i always have the same sign. In each case the energy decreases or remains constant, thus the system tends to stabilize.
17
The Energy Function
The energy function is like a terrain over the N-dimensional state space, with a global minimum and possibly many local minima.
18
The Hopfield Network as a Model for Associative Memory
Associative memory:
- Associates different features with each other, e.g. Karen - green, George - red, Paul - blue.
- Recall with partial cues.
19
Neural Network Model of Associative Memory
Neurons are arranged in a grid.
20
Setting the Weights
- Each pattern u can be denoted by a vector x^u of -1s and 1s.
- If the number of patterns is m, then: w_ij = (1/N) * sum_{u=1..m} x^u_i * x^u_j.
- Hebbian learning: the neurons that fire together, wire together.
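A sketch of this Hebbian prescription and of recall from a partial cue (±1 states, matching the slide's pattern vectors; the 1/N scaling and zero diagonal are one common convention):

```python
import numpy as np

def hebbian_weights(patterns):
    """w_ij = (1/N) * sum over patterns of x_i * x_j, zero diagonal.
    `patterns` is an (m, N) array of -1/+1 vectors."""
    m, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)   # no self-connections
    return W

def recall(cue, W, sweeps=10, rng=None):
    """Recover a stored pattern from a noisy/partial -1/+1 cue
    by repeated serial-random updating."""
    rng = rng or np.random.default_rng()
    s = cue.copy()
    for _ in range(sweeps * len(s)):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s > 0 else -1
    return s
```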
21
Limitations of Hopfield Associative Memory
1) The evoked pattern is not necessarily the stored pattern most similar to the input.
2) Some patterns will be recalled more often than others.
3) Spurious states: stable states that correspond to none of the original patterns.
Capacity: about 0.15 * N patterns.
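A rough way to probe this capacity limit numerically is to store around 0.15 * N random patterns and test whether each one is a stable state (sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, m = 100, 15          # ~0.15 * N patterns, near the capacity limit
patterns = rng.choice([-1, 1], size=(m, N))

W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# A stored pattern is a stable state if one synchronous sweep leaves it unchanged.
stable = [np.array_equal(np.where(W @ p > 0, 1, -1), p) for p in patterns]
print(f"{sum(stable)}/{m} patterns are stable states")
```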
22
Hopfield Network and the Brain (1)
In a real neuron, synapses are distributed along the dendritic tree, and their distance from the soma changes their synaptic weight; in the Hopfield network there is no dendritic geometry. If the synapses are distributed uniformly, the geometry is not important.
23
Hopfield Network and the Brain (2)
In the brain, Dale's principle holds and the connections are not symmetric. A Hopfield network with asymmetric weights and Dale's principle still works properly.
24
Hopfield Network and the Brain (3)
The brain is insensitive to noise and local lesions. The Hopfield network can likewise tolerate noise in the input and partial loss of synapses.
25
Hopfield Network and the Brain (4)
In the brain, neurons are not binary devices; they generate continuous firing rates. A Hopfield network with a sigmoid transfer function is even more powerful than the binary version.
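A sketch of such a graded-response variant, replacing the hard threshold with a smooth sigmoid (tanh here; the gain parameter is illustrative):

```python
import numpy as np

def update_graded(s, W, gain=2.0, steps=1000, rng=None):
    """Continuous-valued Hopfield updating: each neuron's state is a
    smooth function of its input instead of a hard binary decision."""
    rng = rng or np.random.default_rng()
    s = s.astype(float)
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = np.tanh(gain * (W[i] @ s))  # firing rate in (-1, 1)
    return s
```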
26
Hopfield Network and the Brain (5)
In the brain most neurons are silent or fire at low rates, whereas in the Hopfield network many of the neurons are active. In a sparse Hopfield network the capacity is even higher.
27
Hopfield Network and the Brain (6)
In the Hopfield network updating is serial, which is far from biological reality. With parallel updating, the associative memories can still be recalled.
28
Hopfield Network and the Brain (7)
When the number of learned patterns overloads the Hopfield network, performance falls abruptly for all the stored patterns (catastrophic interference). In the real brain, an overload of memories affects only some memories and leaves the rest intact.
29
Hopfield Network and the Brain (8)
In the Hopfield network, useful information appears only when the system is in a stable state. The brain does not fall into stable states; it remains dynamic.
30
Hopfield Network and the Brain (9)
Connectivity in the brain is much sparser than in the Hopfield network. A diluted Hopfield network still works well.