This chapter introduces a variety of associative
neural memories and characterizes their capacity and their error
correction capability. In particular, attention is given to recurrent
associative nets with dynamic recollection of stored information.
The simplest associative memory is the linear
associative memory (LAM) with correlation recording of real-valued
memory patterns. Perfect storage in the LAM requires associations
whose key patterns (input patterns) are orthonormal. This requirement
relaxes to mere linear independence of the key patterns if the
projection recording technique is used, which yields the optimal
linear associative memory (OLAM) with noise-suppression capabilities.
If the stored associations are binary patterns and a clipping
nonlinearity is used at the output of the LAM, then the orthonormality
requirement on the key patterns may be relaxed to a pseudo-orthogonality
requirement. In this case, the associative memory is nonlinear.
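The contrast between correlation and projection recording can be sketched in a few lines of NumPy. This is a minimal illustration under assumed shapes and names (not the chapter's notation): associations {x_k -> y_k} are stored as columns of X and Y.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 16, 4
Y = rng.standard_normal((n, m))                   # stored (recalled) patterns

# Correlation recording: perfect recall requires orthonormal key patterns.
X = np.linalg.qr(rng.standard_normal((n, m)))[0]  # orthonormal columns
W_corr = Y @ X.T                                  # correlation recording
assert np.allclose(W_corr @ X[:, 0], Y[:, 0])     # perfect recall

# Projection recording (OLAM): linearly independent keys suffice.
X2 = rng.standard_normal((n, m))                  # independent, not orthonormal
W_proj = Y @ np.linalg.pinv(X2)                   # W = Y X^+ (pseudoinverse)
assert np.allclose(W_proj @ X2[:, 1], Y[:, 1])    # still perfect recall
```

With the non-orthonormal keys X2, correlation recording alone would produce crosstalk between the stored associations; the pseudoinverse removes it.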
Methods for improving the performance of LAMs, such
as multiple training and adding specialized associations to the
training set, are also discussed. The remainder of the chapter
deals with dynamic associative memories (DAMs), mainly single-layer
autoassociative DAMs, which have recurrent architectures.
The stability, capacity, and associative retrieval
properties of DAMs are characterized. Among the DAM models discussed
are the continuous-time continuous-state model (the analog Hopfield
net), the discrete-time continuous-state model, and the discrete-time
discrete-state model (Hopfield's discrete net). The stability
of these DAMs is shown by defining appropriate Liapunov (energy)
functions. A serious shortcoming of the correlation-recorded
versions of these DAMs is their inefficient memory storage capacity,
especially when error correction is required. Another disadvantage
of these DAMs is the presence of many spurious attractors
(or false memories), whose number grows exponentially with the size
(number of units) of the DAM.
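The discrete-time discrete-state case can be sketched as follows: a correlation-recorded Hopfield-style DAM with asynchronous updates, whose energy (Liapunov) function is verified never to increase along the retrieval trajectory. Sizes and the corruption level are illustrative choices, not the chapter's.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 32, 3
M = rng.choice([-1.0, 1.0], size=(n, m))       # bipolar memory patterns

W = (M @ M.T) / n                              # correlation recording
np.fill_diagonal(W, 0.0)                       # no self-coupling

def energy(x):                                 # Liapunov (energy) function
    return -0.5 * x @ W @ x

x = M[:, 0].copy()
x[:4] *= -1                                    # corrupt 4 bits of a memory
E = energy(x)
changed = True
while changed:                                 # asynchronous updates
    changed = False
    for i in range(n):
        s = 1.0 if W[i] @ x >= 0 else -1.0
        if s != x[i]:
            x[i] = s
            changed = True
            assert energy(x) <= E + 1e-12      # energy never increases
            E = energy(x)
# The loop exits at a fixed point (an attractor); with light corruption
# this is typically, though not always, the stored memory M[:, 0].
```

Because the energy is bounded below and cannot increase, the asynchronous dynamics must settle at a fixed point, which is the stability argument the chapter formalizes.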
Improved capacity and error correction can be achieved
in DAMs which employ projection recording. Several projection
DAMs are discussed which differ in their state update dynamics
and/or the nature of their state: continuous versus discrete.
It is found that these DAMs are capable of storing a number of
memories that can approach the number of units in the DAM, while
retaining good error correction capabilities. Here, the
presence of self-coupling (nonzero diagonal weights) is generally found
to have a negative effect on DAM performance; substantial improvements
in capacity and error correction capability are achieved when
self-coupling is eliminated.
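A projection-recorded autoassociative DAM can be sketched directly. The sizes are illustrative, and the sketch assumes the random bipolar memories are linearly independent (almost surely true for m < n, and checked below); note that the stored memories remain fixed points even after the self-coupling (diagonal) weights are removed.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 16, 12                                  # m close to n
M = rng.choice([-1.0, 1.0], size=(n, m))
assert np.linalg.matrix_rank(M) == m           # keys are linearly independent

W = M @ np.linalg.pinv(M)                      # projection recording
np.fill_diagonal(W, 0.0)                       # eliminate self-coupling

# Every stored memory remains a fixed point of the retrieval dynamics.
for k in range(m):
    assert np.array_equal(np.sign(W @ M[:, k]), M[:, k])
```

A correlation-recorded DAM of the same size would not reliably hold 12 correlated random patterns as fixed points; the projection recipe is what buys the near-n capacity.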
In addition to the above DAMs, the following models
are discussed: Brain-state-in-a-box (BSB) model, non-monotonic
activations model, hysteretic activations model, exponential capacity
model, sequence generator model, and heteroassociative model.
Some of these models still employ simple correlation recording
for memory storage, yet their retrieval dynamics result
in substantially improved DAM performance; this is indeed
the case when non-monotonic or hysteretic activations are used.
A generalization of the basic correlation DAM into a model with
higher nonlinearities allows for storage of an exponential (in
memory size) number of associations with "good" error
correction. It is also shown how temporal associations (sequences)
and heteroassociations can be handled by a simple variation of the
recording recipe and an intuitive architectural extension, respectively.
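The recording-recipe variation for temporal associations can be sketched as follows. This illustration replaces the symmetric outer products x_k x_k^T with cross-correlation terms x_{k+1} x_k^T, so each state maps to its successor; orthogonal bipolar states (Hadamard columns) are an assumed convenience that makes the recall exact.

```python
import numpy as np

# Build a 16x16 Hadamard matrix (Sylvester construction): its columns
# are mutually orthogonal bipolar vectors.
H = np.array([[1.0]])
for _ in range(4):
    H = np.block([[H, H], [H, -H]])
n = 16
seq = [H[:, k] for k in range(4)]              # a sequence of 4 states

# Cross-correlation recording: W maps each state to the next one.
W = sum(np.outer(seq[(k + 1) % 4], seq[k]) for k in range(4)) / n

x = seq[0]
for k in range(8):                             # dynamics cycle through the sequence
    x = np.sign(W @ x)
    assert np.array_equal(x, seq[(k + 1) % 4])
```

Starting the net at any stored state and iterating replays the whole sequence periodically, which is the sequence-generator behavior the chapter describes.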
The chapter concludes by showing how a single-layer
continuous-time continuous-state DAM can be viewed as a gradient
net and applied to the search for solutions to combinatorial optimization
problems.