Preface

Associative Neural Memories: Theory and Implementation
by Mohamad H. Hassoun

(Oxford University Press, 1993)


This edited volume brings together significant works on associative neural memory theory (architecture, learning, analysis, and design) and hardware implementation (VLSI and optoelectronic) by leading international researchers. The purpose of this book is to integrate recent fundamental research results on associative neural memories into a single volume and to present the material in a clear, organized format that makes it accessible to researchers and students.

Associative neural memories are a class of artificial neural networks (connectionist nets) which have gained substantial attention relative to other neural net paradigms. Associative memories have been the subject of research since the early seventies. Recent interest in these memories has been spurred by the seminal work of Hopfield in the early eighties, who showed how a simple discrete nonlinear dynamical system can exhibit associative recall of stored binary patterns through collective computing. Since then, a number of important contributions have appeared in conference proceedings and technical journals addressing various issues of associative neural memories, including multiple-layer architectures, recording/storage algorithms, capacity, retrieval dynamics, fault tolerance, and hardware implementation. Currently, associative neural memories are among the most extensively studied and best understood neural paradigms. They have been studied as possible models of biological associative phenomena, as models of cognition and categorical perception, as high-dimensional nonlinear dynamical systems, as collective computing nets, as error-correcting nets, and as fault-tolerant content-addressable computer memories.
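For readers encountering these memories for the first time, the following minimal sketch in Python (an editorial illustration with made-up patterns, not code from any chapter) shows the kind of collective computation Hopfield described: bipolar patterns are recorded in a symmetric weight matrix by outer-product (Hebbian) correlation, and a corrupted probe is driven back to a stored pattern by repeated threshold updates.

    import numpy as np

    def record(patterns):
        # Correlation (outer-product) recording of bipolar (+1/-1) patterns;
        # the diagonal is zeroed so no neuron feeds back on itself.
        n = patterns.shape[1]
        W = patterns.T @ patterns / n
        np.fill_diagonal(W, 0.0)
        return W

    def retrieve(W, x, max_sweeps=100, seed=0):
        # Asynchronous retrieval: update one randomly chosen neuron at a
        # time until the state stops changing (a fixed point of the dynamics).
        rng = np.random.default_rng(seed)
        x = x.copy()
        for _ in range(max_sweeps):
            changed = False
            for i in rng.permutation(len(x)):
                s = np.sign(W[i] @ x)
                if s != 0 and s != x[i]:
                    x[i] = s
                    changed = True
            if not changed:
                break
        return x

    # Two orthogonal 8-bit memories and a probe with one flipped component.
    X = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                  [ 1,  1,  1,  1, -1, -1, -1, -1]])
    W = record(X)
    probe = X[0].copy()
    probe[2] = -probe[2]          # corrupt one component
    print(retrieve(W, probe))     # converges back to X[0]

Because the two memories above are orthogonal, the corrupted bit is corrected in a single sweep; the harder questions of capacity and error correction with many, possibly correlated, memories are exactly the issues taken up in the chapters that follow.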

This book is organized into an introductory chapter and four parts: Biological and Psychological Connections, Artificial Associative Neural Memory Models, Analysis of Memory Dynamics and Capacity, and Implementation. The group of chapters in the first part deals with associative neural models which have close connections to biological and/or psychological aspects of memory. This group consists of three chapters by D. Alkon et al., P. Kanerva, and J. Anderson. The second part of this book consists of three chapters by Y.-F. Wang et al., A. Dembo, and B. Baird and F. Eeckman. These chapters present more complex extensions of the simple associative memory models covered in the introductory chapter, and study their recall capabilities. The analysis of artificial associative neural memory dynamics, capacity, and error-correction capabilities is addressed in part three, which comprises the seven chapters by S.-I. Amari and H.-F. Yanai, R. Paturi, F. Waugh et al., S. Hui et al., G. Pancha and S. Venkatesh, S. Yoshizawa et al., and P.-C. Chung and T. Krile. The last part of the book deals with hardware implementation of associative neural memories (some of these memories and/or their associated recording algorithms constitute variations and/or extensions of those discussed in earlier chapters). Here, three chapters by A. G. Andreou and K. A. Boahen, M. Verleysen et al., and T.-D. Chiueh and R. Goodman address electronic VLSI implementations. Two additional chapters, one by F. T. S. Yu and the other by K. Kyuma et al., present optoelectronic implementations.

Chapter 1 is an introduction to basic dynamic associative memory (DAM) architectures and their associated learning/recording algorithms. DAMs are treated as collective nonlinear dynamical systems in which information retrieval is accomplished through an evolution of the system's state in a high-dimensional (binary) state space. This chapter reviews some basic supervised learning/recording algorithms and derives the necessary conditions for perfect storage and retrieval of memories in a simple DAM. The characteristics of high-performance DAMs are outlined, and such general issues as stability, capacity, and retrieval dynamics are discussed. References to other chapters of the book point the reader to architecture extensions, additional learning algorithms, formal analyses, and other associative neural memory issues that are only briefly described in this introductory chapter.
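To give a flavor of this analysis, the perfect-recall requirement for the simplest single-layer DAM with correlation recording can be stated as follows (a standard textbook formulation, not a quotation from the chapter). With bipolar memories x^1, ..., x^m in {-1, +1}^n, the recording rule and retrieval dynamics are

    W = \frac{1}{n}\sum_{\mu=1}^{m}\mathbf{x}^{\mu}(\mathbf{x}^{\mu})^{\mathsf T},
    \qquad
    \mathbf{x}(t+1) = \operatorname{sgn}\big(W\,\mathbf{x}(t)\big),

and a memory x^k is perfectly stored if and only if it is a fixed point of these dynamics:

    \operatorname{sgn}\big(W\mathbf{x}^{k}\big)=\mathbf{x}^{k}
    \quad\Longleftrightarrow\quad
    x_i^{k}\Big(x_i^{k}+\frac{1}{n}\sum_{\mu\neq k}\big((\mathbf{x}^{\mu})^{\mathsf T}\mathbf{x}^{k}\big)\,x_i^{\mu}\Big)>0
    \quad\text{for all } i,

that is, the unit signal term must dominate the cross-talk contributed by the other stored memories at every component.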

Chapter 2 demonstrates the important contributions that neurobiology can make to the design of artificial neural networks in general and associative learning nets in particular. It describes some results of biochemical and biophysical experiments that elucidate the properties of the Hermissenda visual-vestibular network, followed by descriptions of two computer models, each representing a different level of aggregation of the essential features of learning and memory. A biologically based computer model called Dystal is presented. The model demonstrates efficient associative learning and is applied to face recognition and optical character recognition problems.

Chapter 3 describes and analyzes a class of associative memories known as "sparse distributed memory" and relates it to associative memory models of the cerebellum, digital random-access memory, and other sparse memory models reported in the literature. The chapter presents a unified formulation of a broad class of two-layer feedforward associative memory architectures, and advances the concept of "pattern computing" as a new computing model, in contrast to numeric computing and symbolic computing.

Chapter 4 focuses on the brain-state-in-a-box (BSB) associative memory as a low-order approximation to a broad range of human cognitive operations. The chapter presents the theory of the BSB model and informally describes some of its mathematical properties. Simulations are presented that show how BSB can model psychological response time.
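For orientation, the core BSB update is compact enough to state here (the basic form of the model; the chapter's generalizations are not reproduced):

    \mathbf{x}(t+1) = S\big(\mathbf{x}(t) + \alpha\,W\,\mathbf{x}(t)\big),

where W is the weight matrix, alpha > 0 is a feedback gain, and S clips each component to the interval [-1, +1], confining the state to the hypercube (the "box") whose corners typically serve as the stored memories.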

Chapters 5-7 present more complex models of artificial associative neural memories as compared to those reviewed in chapter 1. Chapter 5 addresses the bidirectional associative memory (BAM), originally proposed independently by B. Kosko and Okajima et al. in 1987, and proposes several alternative recording schemes for improved recall. In chapter 6, a class of high-density associative memory models with such desirable properties as high capacity, controllable basins of attraction, and fast convergence speed is proposed and analyzed. In chapter 7, the "projection algorithm"-based network and its extensions are proposed for the guaranteed associative memory storage of analog patterns, continuous sequences, and chaotic attractors in the same network, with no spurious attractors. In this chapter, the authors concentrate on mathematical analysis and engineering-oriented applications of the projection algorithm-based memory. The chapter also attempts to relate the emergent dynamical behavior of interconnected modules of the proposed network to cortical computations, taking the view that oscillatory and possibly chaotic network modules form the actual cortical substrate of diverse sensory, motor, and cognitive operations.
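As a concrete baseline for chapter 5, here is the standard Kosko-style correlation recording and bidirectional recall in Python (a minimal sketch of the textbook scheme with illustrative data; the improved recording rules proposed in the chapter are not reproduced here):

    import numpy as np

    def _step(h, prev):
        # Threshold the net input; keep the previous state where it is zero.
        s = np.sign(h)
        return np.where(s == 0, prev, s)

    def bam_record(X, Y):
        # X: m x n bipolar patterns, Y: m x p bipolar partners.
        return Y.T @ X                      # W has shape p x n

    def bam_recall(W, x, iters=20):
        # Bounce the state between the two layers until it stabilizes.
        y = _step(W @ x, np.ones(W.shape[0]))
        for _ in range(iters):
            x_new = _step(W.T @ y, x)
            y_new = _step(W @ x_new, y)
            if np.array_equal(x_new, x) and np.array_equal(y_new, y):
                break
            x, y = x_new, y_new
        return x, y

    X = np.array([[1, -1, 1, -1, 1, -1], [1, 1, 1, -1, -1, -1]])
    Y = np.array([[1, -1, -1, 1], [1, 1, -1, -1]])
    W = bam_record(X, Y)
    print(bam_recall(W, X[0])[1])           # recovers the partner Y[0]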

The next group of seven chapters, chapters 8-14, deals with the analysis of various aspects of associative memory, such as capacity, convergence dynamics, the shaping of basins of attraction, the effects of non-monotonic activation functions on retrieval dynamics, and fault tolerance. Chapter 8 formulates and presents a unified approach to the analysis of various architectures of associative memories based on a statistical neurodynamical method which allows for the analysis of associative recall dynamics and storage capacity. The method is applied to cross-correlation, cascaded, cyclic, autocorrelation, and associative sequence generator associative memories. Chapter 9 presents a detailed, rigorous mathematical analysis of convergence in the synchronously and asynchronously updated Hopfield associative memory. Theorems are presented, along with their proofs, on the amount of error correction and the rate of convergence as a function of the number of fundamental memories. The stability and dynamics of analog parallel-updated associative memories are studied in chapter 10. The operation of these dynamic analog nets as associative memories is explained in terms of phase diagrams, relating memory loading to neuron activation function slopes, for correlation and generalized inverse recording. Chapter 11 examines the stability of the generalized BSB associative memory. It characterizes the stability and location of all fixed points of the BSB model for different weight matrices. In chapter 12, a family of algorithms for DAM recording is discussed in terms of memory capacity and algorithm complexity. The chapter also emphasizes recording schemes for controlling the basins of attraction of selected memories and/or of selected memory vector/pattern components. Chapter 13 adopts a piecewise-linear non-monotonic neuron activation function in a dynamic autocorrelation associative memory and investigates the existence and stability of equilibrium states. This chapter also gives theoretical estimates of the capacity of such memories. Chapter 14 analyzes how faults characteristic of optical and electronic implementation technologies affect the retrieval characteristics of implemented associative memories.
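For readers who want the two recording rules compared in chapter 10 in concrete terms, the weight matrices are typically formed as follows (standard definitions, with numpy's pseudoinverse standing in for the generalized inverse):

    import numpy as np

    def correlation_recording(X):
        # X: m x n matrix of bipolar memories, one per row.
        return X.T @ X / X.shape[1]

    def generalized_inverse_recording(X):
        # Projection recording: W is the orthogonal projector onto the
        # span of the memories, so W @ X[k] == X[k] holds exactly even
        # for correlated memories, at the cost of a pseudoinverse.
        return X.T @ np.linalg.pinv(X.T)

Correlation recording is local and incremental but suffers cross-talk between correlated memories; generalized inverse recording removes that cross-talk by construction, and chapter 10's phase diagrams show how each rule trades memory loading against the slope of the neuron activation function.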

Chapters 15-19 cover hardware implementations of associative neural memories. In chapter 15, some basic building blocks for VLSI implementation of neural circuitry are described. The basic principles of analog VLSI architectures are discussed in connection with precision limitations encountered with such technology. The chapter describes how to overcome some of these limitations by appropriate designs of artificial neurons and adaptation of basic associative learning algorithms. Chapter 16 describes a hybrid analog/digital CMOS chip implementation of a high-capacity exponential correlation associative memory (ECAM), along with simulations and experimental validation. The chapter also analyzes the storage capacity and error-correction characteristics of ECAMs, and the effect of hardware-limited exponentiation dynamic range on capacity. In chapter 17, a scalable, efficient, and fault-tolerant chip design of a novel bidirectional associative memory architecture, based on subthreshold current-mode MOS transistors, is described and validated through simulations. The design techniques employed in this chapter allow for a compact implementation which can potentially lead to associative memory chips with densities approaching that of static random-access memory (SRAM). An optical implementation of a dynamic single-layer associative memory based on a liquid crystal television (LCTV) spatial light modulator (SLM) is described in chapter 18. Robust associative retrieval is demonstrated in this optical memory for several recording algorithms. This chapter also describes how a high-dimensional set of memories can be handled by employing a space-time sharing architecture. Finally, chapter 19 considers the implementation of neural network architectures employing 3-D optical neurochips with on-chip analog memory capabilities based on integrated LED and variable-sensitivity photodetector (VSPD) arrays. Experimental results are reported for a two-layer winner-takes-all-based associative memory for stamp classification and a two-layer perceptron net employing error back-propagation learning.
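The ECAM recall rule of chapter 16 is simple to state in software terms: each stored memory votes with a weight that grows exponentially in its correlation with the current state, so the best-matching memory quickly dominates the sum. The following Python sketch assumes the standard exponential-correlation formulation; the base a and the bipolar data are illustrative, and real hardware caps the exponentiation dynamic range, as the chapter analyzes.

    import numpy as np

    def ecam_step(X, x, a=2.0):
        # X: m x n bipolar memories; x: current bipolar state vector.
        c = (X @ x).astype(float)       # correlations <x_mu, x> in [-n, n]
        return np.sign(X.T @ (a ** c))  # exponentially weighted vote

    def ecam_recall(X, x, a=2.0, iters=20):
        # Iterate the map until the state reaches a fixed point.
        for _ in range(iters):
            x_new = ecam_step(X, x, a)
            if np.array_equal(x_new, x):
                break
            x = x_new
        return x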

In putting this book together, an effort was made to include those researchers who are in many cases the originators of significant ideas on associative neural memories. However, as with any book of this kind, it would be impossible to include all significant work on the topic. Therefore, my strategy was to invite contributions by leading researchers who are able to relate their ideas to others in the literature, and who, in some cases, present a unifying framework for the study of associative neural memories.

I would like to take this opportunity to thank those who have contributed in various ways to the completion of this book. This project would not have been successful without the enthusiasm and professional cooperation of the contributing authors and their high-quality chapter contributions. I would like to thank the National Science Foundation for support of my work on associative neural memories through a Presidential Young Investigator Award (Grant ECE-9057896). In particular, my thanks go to Dr. Paul Werbos of NSF for his support of my research ideas since 1988. Also, thanks go to Ford Motor Company, Sun Microsystems, Unisys Corporation, and Zenith Data Systems for their valuable support, which has contributed directly or indirectly to the success of this project. Special thanks to Donald C. Jackson of Oxford University Press for his interest and help in publishing this book. I also take this opportunity to thank my wife Amal for her understanding and support, and thank my daughter Lamees for her patience.

Mohamad H. Hassoun
Detroit, June 1992


