3 Feedback Networks and Associative Memories: Introduction
Intelligent Multimedia Research Lab, Graduate Institute of Computer Science and Engineering, Tatung University
4 Feedforward/Feedback NNs
Feedforward NNs
- The connections between units do not form cycles.
- Usually produce a response to an input quickly.
- Most feedforward NNs can be trained using a wide variety of efficient algorithms.
Feedback or recurrent NNs
- There are cycles in the connections.
- In some feedback NNs, each time an input is presented, the network must iterate for a potentially long time before it produces a response.
- Usually more difficult to train than feedforward NNs.
5 Supervised-Learning NNs
Feedforward NNs
- Perceptron
- Adaline, Madaline
- Backpropagation (BP)
- Artmap
- Learning Vector Quantization (LVQ)
- Probabilistic Neural Network (PNN)
- General Regression Neural Network (GRNN)
Feedback or recurrent NNs
- Brain-State-in-a-Box (BSB)
- Fuzzy Cognitive Map (FCM)
- Boltzmann Machine (BM)
- Backpropagation Through Time (BPTT)
7 The Hopfield NNs
- In 1982, Hopfield, a Caltech physicist, mathematically tied together many of the ideas from previous research.
- A fully connected, symmetrically weighted network in which each node functions as both an input and an output node.
- Used for:
  - Associative memories
  - Combinatorial optimization
8 Associative Memories
- An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns.
- Two types of associative memory: autoassociative and heteroassociative.
- Auto-association: retrieves the previously stored pattern that most closely resembles the current input pattern.
- Hetero-association: the retrieved pattern is, in general, different from the input pattern, not only in content but possibly also in type and format.
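The two memory types above can be sketched in a few lines. This is an illustrative toy (nearest pattern by Hamming distance), not the Hopfield mechanism itself; the function names and sample patterns are made up for the example.

```python
# Patterns are bipolar (+1/-1) tuples; all names here are illustrative.

def hamming(p, q):
    """Number of positions where two patterns differ."""
    return sum(a != b for a, b in zip(p, q))

def auto_recall(stored, probe):
    """Auto-association: return the stored pattern closest to the probe."""
    return min(stored, key=lambda p: hamming(p, probe))

def hetero_recall(pairs, probe):
    """Hetero-association: map the closest stored key to its paired output."""
    key = min(pairs, key=lambda k: hamming(k, probe))
    return pairs[key]

stored = [(1, 1, 1, -1), (-1, -1, 1, 1)]
noisy = (1, -1, 1, -1)                 # first pattern with one bit flipped
print(auto_recall(stored, noisy))      # -> (1, 1, 1, -1)

# Hetero-association can retrieve output of a different type and format,
# e.g. a bit pattern standing for "Niagara" mapping to the word "waterfall".
pairs = {(1, 1, -1, -1): "waterfall"}
print(hetero_recall(pairs, (1, 1, -1, 1)))  # -> "waterfall"
```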
9 Associative Memories
[Figure: auto-association maps a distorted "A" back to the stored "A"; hetero-association maps the pattern "Niagara" to the stored pattern "Waterfall".]
10 Optimization Problems
- Associate costs with the energy function of a Hopfield network; the cost must be expressible in quadratic form.
- A Hopfield network finds local, satisfactory solutions; it does not choose solutions from a predefined set.
- It reaches local optima, not global ones.
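The quadratic form mentioned above is the standard Hopfield energy, E(v) = -1/2 Σᵢⱼ wᵢⱼ vᵢ vⱼ - Σᵢ Iᵢ vᵢ. A minimal sketch, assuming symmetric weights with zero diagonal and bipolar states (the matrix values are made up for illustration):

```python
def energy(W, I, v):
    """Hopfield energy E(v) = -1/2 * sum_ij w_ij v_i v_j - sum_i I_i v_i."""
    n = len(v)
    quad = sum(W[i][j] * v[i] * v[j] for i in range(n) for j in range(n))
    return -0.5 * quad - sum(Ii * vi for Ii, vi in zip(I, v))

# Symmetric weights, zero diagonal, no external input (illustrative values).
W = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]
I = [0, 0, 0]

# All-positive weights favor agreement among units, so the uniform state
# has lower energy than a state with one unit disagreeing.
print(energy(W, I, (1, 1, 1)))    # -> -3.0  (a local minimum)
print(energy(W, I, (1, 1, -1)))   # ->  1.0  (higher energy)
```

Mapping a combinatorial cost onto this form is what restricts the network to quadratic objectives, and gradient-like dynamics on E explain why only local optima are found.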
11 Feedback Networks and Associative Memories: Discrete Hopfield NNs
12 The Discrete Hopfield NNs
[Figure: a fully connected network of n units; each unit i receives an external input I_i, produces an output v_i, and is connected to every other unit j through a symmetric weight w_ij.]
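The diagram above can be turned into a short working sketch: Hebbian storage (wᵢⱼ = Σₚ xᵢxⱼ, zero diagonal) and asynchronous threshold updates. The patterns and the fixed random seed are illustrative choices, not from the slides.

```python
import random

def train(patterns):
    """Hebbian outer-product storage with zero self-connections."""
    n = len(patterns[0])
    W = [[0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += x[i] * x[j]
    return W

def recall(W, v, steps=100, seed=0):
    """Asynchronous updates: pick one unit at a time, apply a hard threshold."""
    v = list(v)
    rng = random.Random(seed)
    n = len(v)
    for _ in range(steps):
        i = rng.randrange(n)                       # one unit per step
        s = sum(W[i][j] * v[j] for j in range(n))  # net input to unit i
        v[i] = 1 if s >= 0 else -1
    return tuple(v)

patterns = [(1, 1, 1, -1, -1, 1), (-1, -1, 1, 1, -1, -1)]
W = train(patterns)
noisy = (1, -1, 1, -1, -1, 1)   # first pattern with its second bit flipped
print(recall(W, noisy))          # -> (1, 1, 1, -1, -1, 1)
```

The network acts as an autoassociative memory: the corrupted probe settles into the nearest stored pattern.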
48 Problems of Hopfield Memory
- Complement memories
- Spurious stable states
- Limited capacity
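The complement-memory problem is easy to demonstrate: with Hebbian weights, if a pattern x is a fixed point, its complement -x is one too, because negating every state component also negates every net input sᵢ = Σⱼ wᵢⱼ vⱼ. A small sketch with one illustrative stored pattern:

```python
def is_fixed_point(W, v):
    """True if every unit's thresholded net input agrees with its state."""
    n = len(v)
    for i in range(n):
        s = sum(W[i][j] * v[j] for j in range(n))
        if (1 if s >= 0 else -1) != v[i]:
            return False
    return True

x = (1, 1, -1, -1, 1)                       # illustrative stored pattern
n = len(x)
W = [[x[i] * x[j] if i != j else 0 for j in range(n)] for i in range(n)]

comp = tuple(-b for b in x)                 # bitwise complement of x
print(is_fixed_point(W, x), is_fixed_point(W, comp))  # -> True True
```

So every stored memory comes with an unwanted complementary attractor, one source of the spurious stable states listed above.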
49 Capacity of Hopfield Memory
- The number of storable patterns with respect to the size of the network.
- Study methods:
  - Experiments
  - Probability
  - Information theory
  - Radius of attraction
50 Capacity of Hopfield Memory
- The number of storable patterns with respect to the size of the network.
- Hopfield (1982) demonstrated that the maximum number of patterns that can be stored in a Hopfield model of n nodes, before errors in the retrieved patterns become severe, is around 0.15n.
- The memory capacity of the Hopfield model can be increased, as shown in later work by Andrecut.
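The 0.15n estimate can be probed with the experimental method listed on the previous slide. A rough sketch (assumptions: Hebbian storage, random bipolar patterns, "stored" meaning the pattern is an exact fixed point; network size and pattern counts are arbitrary choices):

```python
import random

def stable_fraction(n, m, seed=0):
    """Fraction of m random patterns that remain exact fixed points
    after Hebbian storage in an n-unit network."""
    rng = random.Random(seed)
    patterns = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(m)]
    W = [[0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += x[i] * x[j]
    stable = 0
    for x in patterns:
        stable += all(
            (1 if sum(W[i][j] * x[j] for j in range(n)) >= 0 else -1) == x[i]
            for i in range(n))
    return stable / m

n = 50
for m in (3, 8, 15):          # 0.15 * 50 ≈ 7 or 8 patterns
    print(m, stable_fraction(n, m))
```

As m grows past roughly 0.15n, crosstalk between patterns overwhelms the signal term and the stable fraction drops, which is the "severe error" regime the slide refers to.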