Feedback Networks and Associative Memories: Introduction
Intelligent Multimedia Lab, Graduate Institute of Computer Science and Engineering, Tatung University
Feedforward/Feedback NNs
- Feedforward NNs
  - The connections between units do not form cycles.
  - Usually produce a response to an input quickly.
  - Most feedforward NNs can be trained using a wide variety of efficient algorithms.
- Feedback (recurrent) NNs
  - The connections contain cycles.
  - In some feedback NNs, each time an input is presented, the network must iterate, potentially for a long time, before it produces a response.
  - Usually more difficult to train than feedforward NNs.
Supervised-Learning NNs
- Feedforward NNs
  - Perceptron
  - Adaline, Madaline
  - Backpropagation (BP)
  - ARTMAP
  - Learning Vector Quantization (LVQ)
  - Probabilistic Neural Network (PNN)
  - General Regression Neural Network (GRNN)
- Feedback (recurrent) NNs
  - Brain-State-in-a-Box (BSB)
  - Fuzzy Cognitive Map (FCM)
  - Boltzmann Machine (BM)
  - Backpropagation Through Time (BPTT)
The Hopfield NNs
- In 1982, Hopfield, a Caltech physicist, mathematically tied together many of the ideas from previous research.
- A fully connected, symmetrically weighted network in which each node functions as both an input and an output node.
- Used for:
  - Associative memories
  - Combinatorial optimization
Associative Memories
- An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns.
- Two types of associative memory: autoassociative and heteroassociative.
  - Auto-association: retrieves the previously stored pattern that most closely resembles the current input pattern.
  - Hetero-association: the retrieved pattern is, in general, different from the input pattern not only in content but possibly also in type and format.
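The auto-associative recall described above can be sketched as a discrete Hopfield memory with Hebbian (outer-product) storage. This is a minimal illustration, not code from the slides; the function and variable names (`train_hebbian`, `recall`, `probe`) are illustrative.

```python
import numpy as np

def train_hebbian(patterns):
    """Build the symmetric weight matrix by Hebbian (outer-product) learning."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W

def recall(W, x, max_steps=10):
    """Iterate synchronous sign updates until the state stops changing."""
    for _ in range(max_steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1  # break ties toward +1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Store one bipolar pattern and recall it from a corrupted probe.
stored = np.array([[1, -1, 1, -1, 1, -1]])
W = train_hebbian(stored)
probe = np.array([1, -1, 1, -1, -1, -1])  # one bit flipped
print(recall(W, probe))  # recovers the stored pattern
```

Because the memory is content-addressable, the corrupted probe itself (not an address) is enough to retrieve the stored pattern.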
[Figure: auto-association recalls a clean "A" from a degraded "A"; hetero-association recalls "Waterfall" from the cue "Niagara".]
Optimization Problems
- Associate the cost of the problem with the energy function of a Hopfield network.
  - The cost must be expressible in quadratic form.
- A Hopfield network settles into a local, satisfactory solution; it does not search over a set of candidate solutions.
  - It finds local optima, not global ones.
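The quadratic energy function can be made concrete with a small sketch. Assuming zero thresholds, the energy is E(s) = -1/2 sᵀWs, and each asynchronous update (setting a unit to the sign of its local field) can only keep E the same or lower it; the random weight matrix below is illustrative, not from the slides.

```python
import numpy as np

def energy(W, s):
    """Hopfield energy E(s) = -1/2 * s^T W s (zero thresholds assumed)."""
    return -0.5 * s @ W @ s

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
W = (A + A.T) / 2        # symmetric weights...
np.fill_diagonal(W, 0)   # ...with zero diagonal, as in the Hopfield model

s = rng.choice([-1, 1], size=n)
e_start = energy(W, s)
for i in range(n):       # one asynchronous update sweep
    e_before = energy(W, s)
    s[i] = 1 if W[i] @ s >= 0 else -1        # unit i follows its local field
    assert energy(W, s) <= e_before + 1e-12  # each update can only lower E
print(e_start, "->", energy(W, s))
```

This monotone descent is why the network converges to a local minimum of E, which is also why it returns local rather than global optima when E encodes a cost function.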
Feedback Networks and Associative Memories: Discrete Hopfield NNs
Problems of Hopfield Memory
- Complement memories
- Spurious stable states
- Capacity
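The complement-memory problem can be seen directly: with Hebbian weights, sign(W(−x)) = −sign(Wx), so the complement of every stored pattern is itself a fixed point. A small sketch (illustrative names, assuming bipolar patterns):

```python
import numpy as np

def hebbian(patterns):
    """Hebbian (outer-product) weights with zero diagonal."""
    W = sum(np.outer(p, p) for p in patterns) * 1.0
    np.fill_diagonal(W, 0)
    return W

def is_stable(W, s):
    """A state is a fixed point if every unit already matches its local field."""
    return np.array_equal(np.sign(W @ s), s)

p = np.array([1, 1, -1, 1, -1, -1, 1, -1])
W = hebbian([p])

print(is_stable(W, p))   # True: the stored pattern is a fixed point
print(is_stable(W, -p))  # True: so is its complement
```

The complement is never explicitly stored, so from the user's point of view it is a false memory that the recall dynamics may fall into.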
Capacity of Hopfield Memory
- The number of storable patterns w.r.t. the size of the network.
- Study methods:
  - Experiments
  - Probability theory
  - Information theory
  - Radius of attraction (ρ)
Capacity of Hopfield Memory
- The number of storable patterns w.r.t. the size of the network.
- Hopfield (1982) demonstrated that the maximum number of patterns that can be stored in an n-node Hopfield network, before errors in the retrieved patterns become severe, is around 0.15n.
- The memory capacity of the Hopfield model can be increased, as shown by Andrecut.
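The 0.15n rule of thumb can be checked by one of the study methods the slides list: experiment. The sketch below (illustrative, not from the slides) stores m random bipolar patterns in an n = 100 network with Hebbian weights and reports the fraction of them that remain exact fixed points; stability degrades sharply as m grows past roughly 0.15n.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100  # network size

def fraction_stable(m):
    """Store m random bipolar patterns; return the fraction that are fixed points."""
    P = rng.choice([-1, 1], size=(m, n))
    W = P.T @ P / n          # Hebbian (outer-product) weights
    np.fill_diagonal(W, 0)   # no self-connections
    return sum(np.array_equal(np.sign(W @ p), p) for p in P) / m

for m in (5, 15, 30, 50):
    print(f"m = {m:2d}: {fraction_stable(m):.2f} of patterns stable")
```

Exact fractions vary with the random seed, but the trend is robust: well below 0.15n nearly all patterns are stable, and well above it almost none are.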
Radius of Attraction (0 ≤ ρ < 1/2)
- False memories
Feedback Networks and Associative Memories: Hopfield Memory, Bidirectional Memory