[mod.ai] Seminar - BoltzCONS: Recursive Objects in a Neural Network

Masaru.Tomita@A.CS.CMU.EDU (11/12/86)

Time: 3:30pm
Place: WeH 5409
Date: 11/18, Tuesday
 
	BoltzCONS:  Representing and Transforming Recursive
		  Objects in a Neural Network
 
		  David S. Touretzky, CMU CSD
 
BoltzCONS is a neural network in which stacks and trees are implemented as
distributed activity patterns.  The name reflects the system's mixed
representational levels: it is a Boltzmann Machine in which Lisp cons-cell-like
structures appear as an emergent property of a massively parallel distributed
representation.  The architecture employs three ideas from connectionist symbol
processing -- coarse-coded distributed memories, pullout networks, and variable
binding spaces -- that first appeared together in Touretzky and Hinton's neural
network production system interpreter.  The distributed memory is used to store
triples of symbols that encode cons cells, the building blocks of linked lists.
Stacks and trees can then be represented as list structures, and they can be
manipulated via associative retrieval.  BoltzCONS' ability to recognize shallow
energy minima as failed retrievals makes it possible to traverse binary trees
of unbounded depth nondestructively without using a control stack.  Its two
most significant features as a connectionist model are its ability to represent
structured objects, and its generative capacity, which allows it to create new
symbol structures on the fly.
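
(A minimal illustration, not part of the talk: the Python sketch below shows
the symbolic structure the abstract describes, with cons cells stored as
(tag, car, cdr) triples in a content-addressable memory, retrieved by partial
match, and an empty result playing the role of a failed retrieval.  The names
and the set-based memory are assumptions made for illustration; BoltzCONS
itself realizes this with coarse-coded distributed activity patterns.)

    # Illustrative sketch only: cons cells as (tag, car, cdr) triples
    # in a content-addressable store, with partial-match retrieval
    # standing in for the network's associative recall.

    def cons(tag, car, cdr, memory):
        """Store one cons cell as a triple; the tag symbol names the cell."""
        memory.add((tag, car, cdr))
        return tag

    def fetch(memory, tag=None, car=None, cdr=None):
        """Associative retrieval: return all triples matching the given
        fields, or the empty set (a 'failed retrieval') if none match."""
        return {t for t in memory
                if (tag is None or t[0] == tag)
                and (car is None or t[1] == car)
                and (cdr is None or t[2] == cdr)}

    # Build the list (A B C) out of three cells tagged T1, T2, T3.
    mem = set()
    cons('T3', 'C', 'NIL', mem)
    cons('T2', 'B', 'T3', mem)
    cons('T1', 'A', 'T2', mem)

    print(fetch(mem, tag='T1'))   # {('T1', 'A', 'T2')}
    print(fetch(mem, car='B'))    # {('T2', 'B', 'T3')}
    print(fetch(mem, tag='T9'))   # set(): a failed retrieval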
 
A toy application for BoltzCONS is the transformation of parse trees from
active to passive voice.  An attached neural network production system contains
a set of rules for performing the transformation by issuing control signals to
BoltzCONS and exchanging symbols with it.  Working together, the two networks
transform ``John kissed Mary'' into ``Mary was kissed by John.''
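
(For illustration only, the sketch below shows the purely symbolic effect of
that transformation on a nested-list parse tree.  The tree shapes and the
Python representation are assumptions; the example says nothing about how the
two networks actually carry out the rewrite.)

    # Illustrative sketch only: active-to-passive rewriting on ordinary
    # nested tuples rather than on distributed cons-cell memories.

    def passivize(tree):
        """Rewrite (S (NP agent) (VP (V verb) (NP patient))) into
        (S (NP patient) (VP (AUX was) (V verb) (PP (P by) (NP agent))))."""
        _, (_, agent), (_, (_, verb), (_, patient)) = tree
        return ('S', ('NP', patient),
                     ('VP', ('AUX', 'was'), ('V', verb),
                            ('PP', ('P', 'by'), ('NP', agent))))

    active = ('S', ('NP', 'John'), ('VP', ('V', 'kissed'), ('NP', 'Mary')))
    print(passivize(active))
    # ('S', ('NP', 'Mary'), ('VP', ('AUX', 'was'), ('V', 'kissed'),
    #                              ('PP', ('P', 'by'), ('NP', 'John'))))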
-------