Tim@CIS.UPENN.EDU (Tim Finin) (11/14/86)
CIS Colloquium
University of Pennsylvania
3pm Tuesday November 18
216 Moore School
THE CAPACITY OF NEURAL NETWORKS
Santosh S. Venkatesh
University of Pennsylvania
Analogies with biological models of brain functioning have led to fruitful
mathematical models of neural networks for information processing. Models of
learning and associative recall based on such networks illustrate how
powerful distributed computational properties emerge as a collective
consequence of the interaction of a large number of simple processing
elements (the neurons). A particularly simple neural network model,
composed of densely interconnected McCulloch-Pitts neurons, is used in
this presentation to illustrate the capabilities of such structures. It is
demonstrated that while these simple constructs form a complete basis for
Boolean functions, the most cost-efficient use of these networks
lies in applying them to a class of problems of high algorithmic
complexity. Specializing to the particular case of associative memory,
efficient algorithms are demonstrated for the storage of memories as stable
entities, or gestalts, and their retrieval from any significant subpart.
Formal estimates of the essential capacities of these schemes are shown. The
ultimate capability of such structures, independent of algorithmic
approaches, is characterized in a rigourous result. Extensions to more
powerful computational neural network structures are indicated.
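The two central claims above can be sketched concretely. The following is a minimal illustration, not the speaker's own constructions: a single McCulloch-Pitts threshold unit realizing NAND (which suffices for Boolean completeness), and a densely interconnected network of such units acting as an associative memory under the standard Hebbian outer-product storage rule, recalling a stored pattern (gestalt) from a corrupted subpart. The function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def mp_nand(x):
    """One McCulloch-Pitts unit computing NAND: weights -1, threshold -1.5.

    NAND alone generates all Boolean functions, so threshold units form
    a complete basis for Boolean logic.
    """
    w, theta = np.array([-1.0, -1.0]), -1.5
    return 1 if w @ x >= theta else 0

def store(patterns):
    """Hebbian outer-product rule: build a symmetric weight matrix
    from a list of +/-1 memory vectors (illustrative storage scheme)."""
    n = len(patterns[0])
    W = np.zeros((n, n))
    for p in patterns:
        p = np.asarray(p, dtype=float)
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W / n

def recall(W, probe, steps=10):
    """Iterate the threshold dynamics from a probe until a fixed point;
    stored memories are stable entities of this update."""
    s = np.asarray(probe, dtype=float)
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Usage: store two 8-neuron memories, then retrieve one from a
# significant subpart (two of eight bits corrupted).
mems = [np.array([1, 1, 1, 1, -1, -1, -1, -1]),
        np.array([1, -1, 1, -1, 1, -1, 1, -1])]
W = store(mems)
probe = mems[0].copy()
probe[:2] *= -1            # flip two bits of the first memory
restored = recall(W, probe)
```

The capacity questions in the talk concern how many such memories a network of n neurons can hold before retrieval of this kind fails; the sketch stores only two orthogonal patterns, well inside any such bound.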