[comp.doc.techreports] tr-input/CSLI

leff@smu.UUCP (Laurence Leff) (06/07/89)

Below is a complete up-to-date list of CSLI Reports, 
which may be obtained by writing to 

	Publications Dept. TR
	CSLI / Ventura Hall 
	Stanford University 
	Stanford, CA 94305.  

All orders for the reports must be prepaid, or, if you
wish to order the reports by email, you can charge them 
to your VISA or MasterCard (include your name, card number,
and expiration date).  Please add $1.50 for shipping and
handling.  Minimum order is $5.00.  Our email address is

	Pubs@csli.stanford.edu.

CSLI was founded in 1983 by researchers from 
Stanford University, SRI International, and 
Xerox PARC to further research  and development 
of integrated theories of language, information, 
and computation.


ABSTRACTS OF CSLI REPORTS

2. The Situation in Logic--I
Jon Barwise $2.00

This paper argues for a broader conception of what logic is all about
than prevails among logicians.  In particular, it claims that ordinary
usage of the words `logic,' `inference,' `information,' and `meaning'
defines a natural subject matter that is broader than logic as
presently studied.  More specifically, I argue that logic should seek
to understand meaning and inference within a general theory of
information, one that takes us outside the realm of sentences and
relations between sentences of any language, natural or formal.  I
also want to suggest that the theory of situations and situation types
developed with John Perry provides a tool with which one can begin to
study some of the neglected aspects of logic.

3. Coordination and How to Distinguish Categories
Ivan A. Sag, Gerald Gazdar, Thomas Wasow, and Steven Weisler $3.50

This paper presents a comprehensive grammar of coordination in English
which provides a principled account of why exact identity of the
syntactic category of the conjuncts is not required in examples like:

	(1)  Pat is either stupid or a liar.

	(2)  Pat has become a banker and very conservative.

Our basic proposal is summarized in (3):

	(3)  If a phrase structure rule introduces a category $\alpha,$
then any conjunct of $\alpha$ is a superset of $\alpha.$

This principle, taken together with the assumption that verbs like
BE are introduced by rules like (4), which introduces the
archicategory XP,

	(4)  VP --> V  XP

predicts the grammaticality of (1)--(2) and the deviance of:

	(5)  Chris sang beautifully and a carol.

The interaction of our theory of syntactic features and feature
instantiation principles allows us to deduce Ross's "Coordinate
Structure Constraint" and "Across-the-Board Convention"
[in full generality] as theorems within the framework of Generalized
Phrase Structure Grammar.

4. Belief and Incompleteness
Kurt Konolige $4.50

The most successful current Artificial Intelligence (AI)
models of knowledge and belief originated with Hintikka, and have a
formal basis in the possible-worlds semantics of Kripke.  However,
such models assume consequential closure: an agent knows all the
consequences of her knowledge (Hintikka calls this "logical
omniscience").  Obviously this is not a realistic assumption; actual
agents have computational resource limitations on the derivations they
can perform, given existing knowledge.  In this paper we characterize
several types of consequential incompleteness that are important in
representing real-world domains.  We introduce a model of knowledge
and belief that, in contrast to the possible-worlds model, is
explicitly computational in nature.  This model is called the
Deduction Model of Belief, because it uses a set of (possibly
incomplete) deduction rules to model the computational process of
belief derivation.  We show how the various types of consequential
incompleteness can be modeled by an appropriate choice of the
deduction rules.
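
As a rough illustration of the idea (a toy encoding, not Konolige's
formalism), the following sketch closes a belief base under a
deliberately incomplete rule set---here only modus ponens---so the
agent believes exactly what those rules can reach and nothing more:

    # Toy sketch of a deduction model of belief (not Konolige's formalism).
    # Atoms are strings; an implication p -> q is the tuple ('->', p, q).
    # The agent believes exactly what its (incomplete) rules derive, so it
    # is not logically omniscient: no rule, no belief.

    def believes(base, query):
        beliefs = set(base)
        changed = True
        while changed:
            changed = False
            for b in list(beliefs):
                # the single rule: modus ponens
                if isinstance(b, tuple) and b[0] == '->' \
                        and b[1] in beliefs and b[2] not in beliefs:
                    beliefs.add(b[2])
                    changed = True
        return query in beliefs

    base = {'p', ('->', 'p', 'q'), ('->', 'r', 's')}
    print(believes(base, 'q'))   # True: reachable by modus ponens
    print(believes(base, 's'))   # False: r itself is not believed
    # A classical tautology like 'p or not p' is also *not* believed,
    # since no rule in the set produces it.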

5. Equality, Types, Modules and Generics for Logic Programming
Joseph A. Goguen and Jose Meseguer $2.50

The original vision of Logic Programming called for using predicate
logic as a programming language.  However, PROLOG has many
features with no corresponding feature in first-order predicate logic,
and it also fails to realize some features of predicate logic.  From
the logical point of view, the system suggested in this paper,
hereafter called EQLOG, is based upon first-order Horn clause
logic with equality.  From the implementation point of view, it
combines the technology of PROLOG (its efficient implementation
with unification and backtracking) with functional programming (in an
efficient first-order conditional rewrite rule form) to yield more
than just their sum: logical variables can appear in equations, and
can be solved over user defined abstract data types; in fact,
combining unification with rewriting yields the technique called
"narrowing," which provides this extra power.  In addition, 
EQLOG provides generic (i.e., parameterized) modules, in a fully
rigorous way; and EQLOG also has a subsort facility that greatly
increases its expressive power.  Combining many-sorted logic with
modules permits a convenient treatment of data abstraction, inspired
by our experience with the rewrite rule based language OBJ. In
fact, both pure PROLOG and OBJ are "sublanguages" of
EQLOG; that is, both Horn clause programming and modular first-order
functional programming are provided.  In EQLOG, functions and
predicates are sharply distinguished, and functional notation,
including composition, is available for functions.
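
A full narrowing engine is too long to sketch here, but the following
toy (an illustration, not EQLOG syntax or technique) conveys the
specification: given a user-defined data type (Peano numerals) and its
defining equations for addition, solve for a logical variable occurring
in an equation:

    # Toy illustration of solving a logical variable in an equation over a
    # user-defined data type (Peano numerals).  EQLOG does this by
    # narrowing; this sketch just searches candidate values, conveying the
    # problem being solved rather than the technique.

    def num(n):
        return 'z' if n == 0 else ('s', num(n - 1))

    def add(a, b):                      # the defining equations of add
        return b if a == 'z' else ('s', add(a[1], b))

    def solve(known, rhs):
        """Find X such that add(X, known) == rhs, if any."""
        x = 'z'
        while True:
            if add(x, known) == rhs:
                return x
            if x == rhs:                # X cannot be larger than rhs
                return None
            x = ('s', x)

    print(solve(num(2), num(5)))        # ('s', ('s', ('s', 'z'))), i.e., 3
    print(solve(num(5), num(2)))        # None: no solution exists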

6. Lessons from Bolzano
Johan van Benthem $1.50

Bernard Bolzano's contributions to logic, largely unnoticed in the
19th century, have been receiving ever more attention from modern
logicians.  As a result, it has already become something of a
commonplace to credit Bolzano with the discovery of the notion of
logical consequence in the semantic sense.  Now, this particular
attribution, whether justified or not, would at best establish a
historical link between modern logical concerns and Bolzano's work.
The purpose of the present note, however, is to bring out three
important aspects of that work that are still of contemporary
systematic interest.  No detailed textual study of Bolzano is needed
to substantiate our suggestions.  We shall refer to well-documented
`public' aspects of the `Wissenschaftslehre' (translated as `Theory of
Science,' R. George (translator), Berkeley: University of California
Press, 1972), pointing out their more general logical significance.

7. Self-propagating Search: A Unified Theory of Memory
Pentti Kanerva $9.00

Human memory has been compared to a film library that is indexed by
the contents of the film strips stored in it.  How might one construct
a computer memory that would allow the computer (a robot) to recognize
patterns and to recall sequences the way humans do?  The model
presented is a generalization of the conventional random-access memory
of a computer.  However, it differs from it in that (1) the address
space is very large (e.g., 1,000-bit addresses), (2) only a small
number of physical locations are needed to realize the memory (the
memory is sparse), (3) a pattern is stored by adding it into a `set'
of locations, and (4) a pattern is retrieved by `pooling' the contents
of a set of locations (the memory is distributed).  Patterns (e.g., of
1,000 bits) are stored in the memory (the memory locations are 1,000
bits wide) and they are also used to address the memory.  From such a
memory it is possible to retrieve previously stored patterns by
approximate retrieval cues---thus, the memory is sensitive to
similarities.  By storing a sequence of patterns as a linked list, it
is possible to index into any part of any "film strip" and to follow
the strip from that point on (recalling a sequence).
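
The mechanics are easy to prototype.  Below is a toy-scale sketch of
such a sparse distributed memory (much smaller than the 1,000-bit
design described above; the parameter values are invented for the
illustration):

    # Toy sparse distributed memory, scaled down from the 1,000-bit design
    # sketched above.  M random "hard locations" hold counters; an address
    # activates every location within Hamming radius R; writing adds the
    # pattern (as +1/-1) into all active locations; reading sums the
    # active counters and thresholds at zero.
    import numpy as np

    rng = np.random.default_rng(0)
    N, M, R = 256, 2000, 112            # width, locations, radius (toy values)
    hard = rng.integers(0, 2, (M, N))   # random hard-location addresses
    counters = np.zeros((M, N), dtype=int)

    def active(addr):
        return np.count_nonzero(hard != addr, axis=1) <= R

    def write(addr, data):
        counters[active(addr)] += 2 * data - 1

    def read(addr):
        return (counters[active(addr)].sum(axis=0) > 0).astype(int)

    pattern = rng.integers(0, 2, N)
    write(pattern, pattern)                        # store autoassociatively
    noisy = pattern.copy()
    noisy[rng.choice(N, 20, replace=False)] ^= 1   # 20-bit-noisy cue
    print((read(noisy) == pattern).mean())         # close to 1.0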

8. Reflection and Semantics in LISP
Brian Cantwell Smith $2.50

A general architecture called "procedural reflection" is presented,
designed to support self-directed reasoning in a serial programming
language.  The architecture, illustrated in a
revamped dialect of LISP called 3-LISP, involves three
steps: (i) reconstructing the semantics of a language so as to deal
with both declarative and procedural aspects of meaning; (ii)
embedding a theory of the language within a language; and (iii)
defining an infinite tower of procedural self-models in terms of this
embedded theory, very much like a tower of metacircular interpreters,
except connected to each other in a simple but crucial way.  In a
procedurally reflective architecture, any aspect of a process's state
that can be described in terms of the theory can be rendered explicit,
in structures accessible for program examination and manipulation.
Procedural reflection enables a user to define complex programming
constructs by writing, within the programming language, direct
analogues of those metalinguistic semantical expressions that would
normally be used to describe them.  It is argued that the concept of
procedural reflection should be added to any language designer's tool
kit.

9. The Implementation of Procedurally Reflective Languages
Jim des Rivieres and Brian Cantwell Smith $3.00

In a procedurally reflective programming language, all programs are
executed not through the agency of a primitive and inaccessible
interpreter, but rather by the explicit running of a program that
represents that interpreter.  In the corresponding virtual machine,
therefore, there is an infinite number of levels at which programs
are processed, all simultaneously active.  It is thus a substantial
question whether, and why, a reflective language is computationally
tractable.  We answer this question by showing how
to produce an efficient implementation of a procedurally reflective
language, based on the notion of a `level-shifting' processor.
A series of general techniques, which should be applicable to
reflective variants of any standard applicative or imperative
programming languages, are illustrated in a complete implementation
for a particular reflective LISP dialect called 3-LISP.

10. Parameterized Programming
Joseph A. Goguen $3.50

Parameterized programming is a powerful technique for the reliable
`reuse' of software.  In this technique, modules are
parameterized over very general interfaces that describe what
properties of an environment are required for the module to work
correctly.  Reusability is enhanced by the flexibility of the
parameterization mechanism proposed here.  Reliability is further
enhanced by permitting interface requirements to include more than
purely syntactic information.  This paper introduces three new ideas
that seem especially useful in supporting parameterized programming:
(1) `theories', which declare global properties of program
modules and interfaces; (2) `views', which connect theories with
program modules in an elegant way; and (3) `module expressions',
a kind of general structured program transformation which produces new
modules by modifying and combining existing modules.  Although these
ideas are illustrated with some simple examples in the OBJ
programming language, they should also be taken as proposals for an
ADA library system, for adding modules to PROLOG, and as
considerations for future language design efforts.  OBJ is an
ultra-high level programming language, based upon rewrite rules, that
incorporates these three ideas, as well as many others from modern
programming methodology.
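
The flavor of the three ideas can be suggested in a modern typed
language (a loose analogy, not OBJ): a theory as an interface the
parameter must satisfy, a view as a concrete binding of that interface,
and a module expression as code that builds a new module from the
parameter:

    # Loose Python analogy (not OBJ): 'theory' ~ Protocol, 'view' ~ a class
    # satisfying it, 'module expression' ~ building a module from a
    # parameter.  Note the semantic part of a theory (e.g., associativity
    # of op) cannot be expressed in Python's types; OBJ theories can state
    # such laws.
    from typing import Generic, List, Protocol, TypeVar

    T = TypeVar('T')

    class Monoid(Protocol[T]):          # the 'theory': required interface
        def op(self, a: T, b: T) -> T: ...
        def unit(self) -> T: ...

    class Fold(Generic[T]):             # a module parameterized by the theory
        def __init__(self, m: Monoid[T]) -> None:
            self.m = m
        def fold(self, xs: List[T]) -> T:
            acc = self.m.unit()
            for x in xs:
                acc = self.m.op(acc, x)
            return acc

    class IntAdd:                       # a 'view': integers as a monoid
        def op(self, a: int, b: int) -> int: return a + b
        def unit(self) -> int: return 0

    print(Fold(IntAdd()).fold([1, 2, 3]))   # 6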

11. Morphological Constraints on Scandinavian Tone Accent
Meg Withgott and Per-Kristian Halvorsen $2.50

The renewed study of linguistic representations and their reflection
in the physical manifestations of language has found its most fruitful
and controlled testbed in the analysis of the interplay of morphology,
phonology, and phonetics. The alignment and coordination of various
disparate aspects of the phonological and phonetic subsystems have
become more accessible to detailed analysis by the emergence of two
foundational concepts.  One is the idea of a skeletal base which
serves to sequentially order and align autonomous units of sound along
the temporal axis. The second is the idea of level-ordered
word-formation, which again systematically relates phonological and
morphological processes. These ideas constitute the basis of
autosegmental theory and lexical phonology/morphology, respectively.
Norwegian tone accent provides a particularly fecund example of the
leverage which the combination of such approaches can yield.

12. Partiality and Nonmonotonicity in Classical Logic
Johan van Benthem $2.00

Recent developments in semantics have broken with what are generally
perceived to be two major presuppositions in classical logic:
`complete' information, and `cumulative' inference.  In this report,
we want to show that the matter is more complex. Lack of completeness
and failure of cumulation do occur in classical modal logic and, in
the last analysis, even in ordinary classical logic itself. Although
the locus of these phenomena becomes less definite in this way, the
classical analogy may also have some positive heuristic virtues.

The following discussion has been restricted to propositional
languages, for reasons of expedience rather than principle. 
Three results obtained appear to be new: a semantic tableau analysis of 
"strong consequence," a modal reduction of "data logic,"
and an axiomatization of nonmonotonic classical logic.

13. Shifting Situations and Shaken Attitudes
Jon Barwise and John Perry $4.50

In this paper, Barwise and Perry reply to a series of commentaries on
their book `Situations and Attitudes.' The paper, written in the
form of an interview with Barwise and Perry, explains the main ideas
of the book, discusses some misunderstandings of the project and the
book on the part of some of the commentators, as well as some changes
that need to be made in the theory to handle real problems uncovered
by some of the commentators.  The commentaries and interview have
appeared in a special issue of `Linguistics and Philosophy'
8:105--61 (1985). 

14C. Aspectual Classes in Situation Semantics
Robin Cooper $4.00

In this report we will explore some of the ways in which the tools
of situation semantics can be applied to the analysis of aspectual
classes, in particular in terms of what has come to be known as the
Vendler classification of verbs. A convenient summary of the vast
literature on this topic and very useful discussion of the issues
involved is to be found in David Dowty's `Word Meaning and
Montague Grammar: The Semantics of Verbs and Times in Generative
Semantics and in Montague's PTQ,' Chapter 2.  This paper is not meant
to provide a general treatment of aspect but will concentrate on the
analysis of certain phenomena which we hope will point to the
usefulness of situation semantics for a general treatment of natural
language tense and aspect.

15. Completeness of Many-sorted Equational Logic
Joseph A. Goguen and Jose Meseguer $2.50

Assuming that many-sorted equational logic "goes just as for the
one-sorted case" has led to incorrect statements of results in
many-sorted universal algebra; in fact, the one-sorted rules are not
sound for many-sorted deduction.  This paper gives sound and complete
rules, and characterizes when the one-sorted rules can still be used
safely; it also characterizes the related question of when many-sorted
algebras can be represented as one-sorted algebras.  The paper
contains a detailed introduction to Hall's theory of clones (later
developed into "algebraic theories" by Lawvere and Benabou); this
allows a full algebraization of many-sorted equational deduction that
is not possible with the usual fully invariant congruences on the free
algebra on countably many generators.
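
To see why the one-sorted rules fail, consider the standard empty-sort
counterexample from this line of work (reconstructed here; the paper's
own presentation may differ).  Take sorts $s$ and $t$, constants $a, b$
of sort $t$, an operation $f: s \to t$, and the two equations, with $x$
a variable of sort $s$:

	$f(x) = a$	$f(x) = b$

Unsorted chaining derives $a = b$; but in a model where $s$ is empty
and $a \neq b$, both equations hold vacuously while $a = b$ fails.
Sound many-sorted deduction must therefore keep track of the (possibly
empty) sorts of quantified variables.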

17. Moving the Semantic Fulcrum
Terry Winograd $1.50

The emergence of situation semantics has been a notable
cross-disciplinary event.  The theory was developed by a philosopher
and a mathematician in the tradition of the analytic philosophy of
language.  Many of its most enthusiastic advocates have been natural
language researchers in artificial intelligence.  This commentary
explores the reasons for this enthusiasm, the limitations of the
theory as seen from an AI perspective, and the significance it is
likely to have for computational theory and practice.

18. On the Mathematical Properties of Linguistic Theories
C. Raymond Perrault $3.00

Metatheoretical findings regarding the decidability, generative
capacity, and recognition complexity of several syntactic theories are
surveyed.  These include context-free, transformational,
lexical-functional, generalized phrase structure, tree adjunct,
and stratificational grammars. The paper concludes with a
discussion of the implications of these results with respect to
linguistic theory.

19. A Simple and Efficient Implementation of Higher-order Functions
in LISP
Michael P. Georgeff and Stephen F. Bodnar $4.50

A relatively simple method for handling higher-order functions
`(funargs)' in LISP is described. It is also shown how this scheme
allows extensions of the LISP language to include partial application
of functions.  The basis of this approach is to defer evaluation of
function-valued expressions until sufficient arguments have been
accumulated to reduce the expression to a nonfunctional value. This
results in stacklike environment structures rather than the treelike
structures produced by standard evaluation schemes.  Consequently, the
evaluator can be implemented on a standard runtime stack without
requiring the complex storage management schemes usually employed for
handling higher-order functions.  A full version of LISP has been
implemented by modifying the FRANZ LISP interpreter to incorporate the
new scheme. These modifications prove to be both simple and efficient.

20. On the Axiomatization of "if-then-else"
Irene Guessarian and Jose Meseguer $3.00

The equationally complete proof system for "if-then-else" of Bloom and
Tindell (Varieties of IF...THEN..., `SIAM J. Comput' 12:677--707
(1983)) is extended to a complete proof system for continuous algebras
and program schemes (infinite trees) by the methods of algebraic
semantics.  Additional operations in the algebras and additional
equations which do not clash with those for if--then--else can also be
accommodated into the proof system.
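
For orientation, the core identities any such axiomatization must
capture are the familiar ones below (just the basic fragment,
reconstructed here; the complete Bloom--Tindell system contains further
axioms):

	if(true, x, y) = x
	if(false, x, y) = y
	if(b, x, x) = x

The completeness problem is to find finitely many equations of this
kind from which all valid identities follow.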

21. The Situation in Logic--II: Conditionals and Conditional
Information
Jon Barwise $3.00

This paper discusses the semantics of conditionals from the
perspective of information and situation theory.  The paper is built
around a number of traditional puzzles involving conditionals and
their logic.  First it is argued that we need a unified account of
conditionals, since all the problematic issues in the logic of natural
language conditionals are problematic in mathematics for just the same
sort of reasons. Secondly, it is proposed that conditional statements
be interpreted as describing conditional constraints, that is, certain
relations between types of situations.  The constraint is obtained in
a straightforward manner from the meaning of the conditional sentence,
but the conditions under which the constraint is asserted to hold are
fixed by context.  The paper closes by spelling out the claims this
proposal makes with regard to the traditional puzzles.

22. Principles of OBJ2
Kokichi Futatsugi, Joseph A. Goguen, Jean-Pierre Jouannaud, and Jose
Meseguer $2.00

OBJ2 is a functional programming language with an underlying
formal semantics that is based upon equational logic, and an
operational semantics that is based upon rewrite rules.  Four classes
of design principles for OBJ2 are discussed: (1) modularization
and parameterization; (2) subsorts; (3) implementation techniques; and
(4) interaction and flexibility.  We also trace OBJ history,
current status, and future plans, and give a fairly complete OBJ
bibliography.

23. Querying Logical Databases
Moshe Vardi $1.50

We study here the complexity of evaluating queries in logical
databases. We focus on Reiter's model of closed-world databases with
unknown values. We show that in this setting query evaluation is
harder than query evaluation for physical databases.  For example,
while first-order queries over physical databases can be evaluated in
logarithmic space, evaluation of first-order queries in the studied
model is co-NP-complete.  We describe an approximation algorithm
for query evaluation that enables one to implement a logical database
on top of a standard database management system.

24. Computationally Relevant Properties of Natural Languages and
Their Grammars
Gerald Gazdar and Geoffrey K. Pullum $3.50

This paper, which is intended for computer scientists rather
than linguists, surveys what is currently known about natural language
morphology and syntax from the perspective of formal language theory.
First, the position of natural language word-sets and sentence-sets on
the formal language hierarchy is discussed.  Secondly, the
contemporary use by linguists of a range of formal grammars (from
finite state transducers to indexed grammars) in both word-syntax
(i.e., morphology) and sentence-syntax is sketched.  Finally, recent
developments such as feature-theory, the use of extension and
unification, default mechanisms, and metagrammatical techniques, are
outlined.

25. An Internal Semantics for Modal Logic: Preliminary Report
Ronald Fagin and Moshe Vardi $2.00

In Kripke semantics for modal logic, "possible worlds" and the
possibility relation are both primitive notions. This has both
technical and conceptual shortcomings. From a technical point of view,
the mathematics associated with Kripke semantics is often quite
complicated. From a conceptual point of view, it is not clear how to
use Kripke structures to model knowledge and belief, where one wants a
clearer understanding of the notions that are primitive in Kripke
semantics. We introduce "modal structures" as models for modal logic.
We retain the idea of possible worlds, but directly describe the
"internal semantics" of each possible world. It is much easier to
study the standard logical questions, such as completeness,
decidability, and compactness, using modal structures. Furthermore,
modal structures offer a much more intuitive approach to modeling
knowledge and belief.

26. The Situation in Logic--III: Situations, Sets and the Axiom of
Foundation
Jon Barwise $2.50

In this paper the rudiments of a theory of structured situations,
construed as comprehensible parts of reality, are outlined.  Some
relations between situations and sets are discussed.  It is argued
that situations are not necessarily wellfounded under the
constituent-of relation.  It is then suggested that this gives an
alternative conception of set (which we dub "hyperset") under which
hypersets are not necessarily wellfounded under the membership
relation $\in$.  Connections with the axiom AFA of
antifoundation from Aczel are briefly discussed.

27. Semantic Automata
Johan van Benthem $2.50

An attractive, but never very central idea in modern semantics has
been to regard linguistic expressions as denoting certain `procedures'
performed within models of the language. This paper applies that idea
to determiners, or more particularly, quantifier expressions. Such an
expression denotes a generalized quantifier, in our case a functor
$Q_E XY$, assigning to each (finite) universe $E$ a binary relation
among its subsets, and satisfying certain basic constraints. Viewed
procedurally, the quantifier has to decide which truth value to give
when presented with an enumeration of the individuals in $E$ marked
for their (non)membership in $X$ and $Y$, i.e., it corresponds to a
`language' of admissible sequences in an alphabet coding the distinct
types of possible $X,Y$-behavior.
perspective of mathematical linguistics and automata theory. It turns
out, surprisingly, that the Chomsky hierarchy makes eminent semantic
sense, both in its coarse and its fine structure. For instance, the
regular/context-free border-line corresponds to that between
first-order and higher-order definability. More precisely, two main
results are (1) that the first-order definable quantifiers are
precisely those which can be recognized by permutation-invariant
acyclic finite-state machines, and (2) that the quantifiers recognized
by (nondeterministic) push-down automata are precisely those definable
in additive arithmetic. Furthermore, within these broad classes,
machine fine structure is correlated with a significant semantic
hierarchy. It is also suggested how the present perspective on
quantifiers can be extended to other linguistic categories, opening
the way for a new approach to procedural semantics.
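
The procedural reading is easy to make concrete.  In the sketch below
(one possible encoding of the idea, not the paper's notation), each
individual of $E$ is coded by its $X,Y$-membership; `every' and `some'
are recognized by two-state, permutation-invariant, acyclic automata,
in line with their first-order definability, while a parity quantifier
needs a cycle:

    # Each individual is coded by its (non)membership in X and Y; a
    # quantifier is then a language over these codes, recognized by a
    # finite automaton.

    def code(universe, X, Y):
        return [(e in X, e in Y) for e in universe]

    def every(seq):                 # two-state acyclic DFA: fail on X-not-Y
        ok = True
        for in_x, in_y in seq:
            if in_x and not in_y:
                ok = False
        return ok

    def some(seq):                  # two-state acyclic DFA: succeed on X-and-Y
        found = False
        for in_x, in_y in seq:
            if in_x and in_y:
                found = True
        return found

    def an_even_number_of(seq):     # needs a cycle: parity is not acyclic
        parity = 0
        for in_x, in_y in seq:
            if in_x and in_y:
                parity ^= 1
        return parity == 0

    E, X, Y = range(8), {1, 2, 3}, {1, 2, 3, 5}
    print(every(code(E, X, Y)), some(code(E, X, Y)),
          an_even_number_of(code(E, X, Y)))   # True True False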

28. Restrictive and Nonrestrictive Modification
Peter Sells $3.00

It is commonly assumed that the interpretation of nonrestrictive
relative clauses relies on some sort of coreference between the
head and the `wh'-phrase in the relative clause.  I will show
that this assumption is not correct, and give an account of the
interpretation of nonrestrictive relative clauses that incorporates
some of the insights of the Discourse Representation Theory of Hans
Kamp (A Theory of Truth and Semantic Representation.  In J.
Groenendijk, T. Janssen, and M. Stokhof (eds.), `Truth,
Interpretation and Information,' Dordrecht: Foris, 1--41).  I also
discuss the similarities in interpretation that we find with regular
pronominal anaphora, including the phenomena that I call `modal
subordination' and `temporal subordination.'  I believe that this
shows that the right level to express the relation between the head
and nonrestrictive adjunct is neither by indexing in the syntax nor by
actual reference in a world or model, but at an intermediate level of
discourse structure of the sort proposed by Kamp.  This provides
further support for the idea that there is a level of linguistic
representation larger than the sentence at which such relations of
discourse structure are characterized.

30. Institutions: Abstract Model Theory for Computer Science
Joseph A. Goguen and R. M. Burstall $4.50

There has been a population explosion among the logical systems
being used in computer science.  Examples include first-order logic
(with and without equality), equational logic, Horn clause logic,
second-order logic, higher-order logic, infinitary logic, dynamic
logic, process logic, temporal logic, and modal logic; moreover, there
is a tendency for each theorem prover to have its own idiosyncratic
logical system.  Yet it is usual to give many of the same results and
applications for each logical system; of course, this is natural in so
far as there are basic results in computer science that are
independent of the logical system in which they happen to be
expressed.  But we should not have to do the same things over and over
again; instead, we should generalize, and do the essential things once
and for all!  Also, we should ask what the relationships are among all
these different logical systems.  This paper shows how some parts of
computer science can be done in any suitable logical system, by
introducing the notion of an `institution' as a precise generalization
of the informal notion of a "logical system."  A first main result
shows that if an institution is such that interface declarations
expressed in it can be glued together, then `theories' (which are just
collections of sentences) in that institution can also be glued
together.  Some smaller results explore relationships between
collections of models and collections of sentences, starting from a
basic Galois connection.  A second main result gives conditions under
which a theorem prover for one institution can be validly used on
theories from another; this uses the notion of an `institution
morphism.' A third main result considers when theory structuring is
preserved by institution morphisms.  A fourth main result shows how to
extend institutions so that their theories may include, in addition to
the original sentences, various kinds of constraints upon
interpretations; such constraints are useful for defining abstract
data types, and include so-called "data" and "hierarchy"
constraints.  Further results show how to define institutions that mix
sentences from one institution with constraints from another, and even
mix sentences and (various kinds of) constraints from several
different institutions.  It is noted that general results about
institutions apply to such "multiplex" institutions, including the
result mentioned above about gluing together theories.  Finally, this
paper discusses the application of these results to specification and
logic programming languages, showing that large parts of these
subjects, and in particular many aspects of so-called
programming-in-the-large, are in fact independent of the institution
used.

31. A Formal Theory of Knowledge and Action
Robert C. Moore $5.50

Most work on planning and problem solving within the field of
artificial intelligence assumes that the agent has complete knowledge
of all relevant aspects of the problem domain and problem situation.
In the real world, however, planning and acting must frequently be
performed without complete knowledge.  This imposes two additional
burdens on an intelligent agent trying to act effectively.  First,
when the agent entertains a plan for achieving some goal, he must
consider not only whether the physical prerequisites of the plan have
been satisfied, but also whether he has all the information necessary
to carry out the plan.  Second, he must be able to reason about what
he can do to obtain necessary information that he lacks.  In this
paper, we present a theory of action in which these problems are taken
into account, showing how to formalize both the knowledge
prerequisites of action and effects of action on knowledge.

32. Finite State Morphology: A Review of Koskenniemi (1983)
Gerald Gazdar $1.50

A review of Kimmo Koskenniemi's paper `Two-level Morphology: A
General Computational Model for Word-form Recognition and Production.'

33. The Role of Logic in Artificial Intelligence
Robert C. Moore $2.00

Formal logic has played an important part in artificial intelligence
(AI) research for almost thirty years, but its role has always
been controversial.  This paper surveys three possible applications of
logic in AI: (1) as an analytical tool, (2) as a knowledge
representation formalism and method of reasoning, and (3) as a
programming language.  The paper examines each of these in turn,
exploring both the problems and the prospects for the successful
application of logic.

34. Applicability of Indexed Grammars to Natural Languages
Gerald Gazdar $2.00

Indexed grammars have been alluded to in a number of recent works on
syntax and semantics.  This paper provides a tutorial introduction to
them which is oriented to the concerns of linguists working in those
areas.  An intuitive stack-oriented notation is introduced, along with
a more liberal range of rule-types, and example grammars are listed.
The implications of phenomena including Scandinavian unbounded
dependencies, Dutch verb phrases, and the English comparative
construction are then discussed.  The paper concludes with some
consideration of phenomena that lie outside the power of indexed
grammars, and of ways in which indexed grammars might be constrained
in a language-theoretically interesting manner.  An appendix contains
notational equivalence proofs and other technicalities.
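
For readers new to the formalism, here is one standard textbook indexed
grammar for the non-context-free language a^n b^n c^n, written in a
stack-oriented notation of the sort the paper introduces (the paper's
own notation and examples may differ; `..' stands for the inherited
stack, which is copied to every nonterminal daughter, and e is the
empty string):

	S[..]  --> a S[f..]	(push one f per a)
	S[..]  --> B[..] C[..]	(the stack is copied to both daughters)
	B[f..] --> b B[..]	(pop one f per b)
	B[]    --> e
	C[f..] --> c C[..]	(pop one f per c)
	C[]    --> e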

35. Commonsense Summer: Final Report
Jerry R. Hobbs, Tom Blenko, Bill Croft, Greg Hager, Henry A. Kautz,
Paul Kube, and Yoav Shoham $12.00

Commonsense Summer, a workshop held at SRI International during
the summer of 1984, was an attempt to encode significant portions of
commonsense knowledge in first-order predicate calculus in a number of
domains.  This report describes some of the work done during the
summer.  The first chapter describes the aims and methodology of the
project.  The second chapter contains Henry Kautz's axiomatization of
spatial knowledge linking up the way we talk about space in natural
language and the kind of Cartesian geometry a robot would need to find
its way around a building.  The third chapter reports on Greg Hager's
treatment of materials and their properties, focusing on the question
of how the shape of various materials is affected by different
processes.  The converse question is the focus of the fourth chapter,
by Yoav Shoham---how are different processes affected by the shapes of
the objects they involve?  The fifth chapter describes Paul Kube's
characterization of various modes of belief and belief formation that
are reflected in natural language.  The sixth chapter describes Tom
Blenko's formalization of the speech acts of offering and suggesting.
The final chapter is of a more linguistic flavor, describing Bill
Croft's treatment of English determiners as relations between textual
entities, i.e., the descriptions provided by the rest of the noun
phrase, and entities in the world.  Each of these efforts were only
reconnaissance missions into the problem areas, but they constitute a
start toward the kind of large-scale knowledge base of commonsense
knowledge that artificial intelligence is in need of.  

36. Limits of Correctness in Computers
Brian Cantwell Smith $2.50

Program verification is a technique in computer science that is used,
in its own terms, to "prove programs correct."  From its name,
someone might easily conclude that a program that had been proven
correct would never make any mistakes, or that it would always follow
its designer's intentions.  In fact, however, what are called proofs of
correctness are really proofs of the relative consistency between two
formal specifications: of the program, and of the model in terms of
which the program is formulated.  Part of assessing the correctness of
a computer system, however, involves assessing the appropriateness of
this model.  Whereas standard semantical techniques are relevant to
the program-model relationship, we do not currently have any theories
of the further relationship between the model and the world in which
the program is embedded.

In this paper I sketch the role of models in computer systems,
comment on various properties of the model-world relationship, and
suggest that the term `correctness' should be
changed to `consistency.'  In addition, I argue that, since models
cannot in general capture all the infinite richness of real-world
domains, complete correctness is inherently unattainable, for people
or for computers.

37. On the Coherence and Structure of Discourse
Jerry R. Hobbs $3.00

Discourse has structure that can be described in terms of coherence
relations between successive segments of text.  In this paper a theory
of coherence relations is embedded within the larger context of a
knowledge-based theory of discourse interpretation.  First, this
larger framework is described.  Then an account of the coherence
relations is given, in which their intimate connection with the
knowledge of the speaker and the hearer is explored.  Next, it is
shown how larger-scale structures in discourse are composed out of the
coherence relations.  This helps elucidate such elusive notions as
"topic" and "genre," and also allows us to examine some of the
ways in which ordinary discourse is often incoherent.  Finally, a
method for analyzing discourse is suggested, which allows the
structure of discourse and the knowledge base that underlies it to
illuminate each other.

38. The Coherence of Incoherent Discourse
Jerry R. Hobbs and Michael H. Agar $2.50

Some of the more ill-behaved vagaries of free-flowing conversation may
seem to call into question the possibility of formal treatments of
coherence in conversation.  However, in this paper we show that the
notions of planning and local coherence from artificial intelligence
work in discourse interpretation make such treatments possible.  Four
fragments of an ethnographic life history interview are examined; they
illustrate a negotiation of topic, an associative slide, discontinuous
structure, and the emergence of a new conversational goal.  In each
case we show that the notions of planning and local coherence make
possible an intricate analysis of how local incoherencies can disguise
a larger, global coherence or of how global coherence can arise from
the piecing together of locally coherent segments.  Finally, we give
an overview of the production of conversation based on these notions
that accommodates these vagaries.

39. The Structures of Discourse Structure
Barbara Grosz and Candace L. Sidner $4.50

This paper presents the basic elements of a computational theory of
discourse structure that simplifies and expands upon previous work.
It is concerned with answers to two rather simple questions: What is
discourse?  What is discourse structure? As we develop it, the theory
of discourse will be seen to be intimately connected with two
nonlinguistic notions, namely intention and attention.  Attention is
an essential factor in explicating the processing of utterances in
discourse.  Intentions play a primary role in explaining discourse
structure, defining discourse coherence, and providing a coherent
conceptualization of the term "discourse" itself.

40. A Complete, Type-free "Second-order" Logic and its Philosophical
Foundations
Christopher Menzel $4.50

Since Frege, the (arguably) dominant conception of properties,
relations, and propositions (PRPs) has been typed.  Recent
years, however, have seen a growing dissatisfaction with logics based
upon the typed conception due primarily to their inability to
represent many intuitively valid patterns of inference.  In this
paper, a logic and a corresponding algebraic semantics are developed
on the basis of a type-free conception of PRPs.  Part I of the
paper consists of intuitive motivation and exposition of the system,
followed by a discussion of its philosophical implications for
Russell's paradox.  In Part II, the language, syntax, semantics,
and logic of the system are introduced formally.  The logic is then
proved to be sound and complete relative to its semantics, and
consistent.

41. Possible-World Semantics for Autoepistemic Logic
Robert C. Moore $2.00

In a previous paper (Moore, 1983), we presented a nonmonotonic logic for
modeling the beliefs of ideally rational agents who reflect on their own
beliefs, which we called "autoepistemic logic."  We defined a simple and
intuitive semantics for autoepistemic logic and proved the logic
sound and complete with respect to that semantics.  However, the
nonconstructive character of both the logic and its semantics made it
difficult to prove the existence of sets of beliefs satisfying all the
constraints of autoepistemic logic.  This note presents an alternative,
possible-world semantics for autoepistemic logic that enables us to
construct finite models for autoepistemic theories, as well as to
demonstrate the existence of sound and complete autoepistemic theories
based on given sets of premises.

42. Deduction with Many-sorted Rewrite Rules
Jose Meseguer and Joseph A. Goguen $1.50

Abstract data types can be axiomatized equationally; and if they are
computable (as they should be), they can be finitely axiomatized by
confluent and terminating rewrite rules.  This note shows (for
arbitrary many-sorted signatures), that if the equations form a
confluent set of rewrite rules, then the ordinary one-sorted way of
proving equality by rewriting is sound and complete in the many-sorted
case; we then extend this result to rewrite rules that are confluent
modulo a set E of equations, and sketch the further extension to
conditional rewrite rules.  These results are basic to the semantics
of OBJ2, since they support its efficient implementation by
one-sorted term reduction, while still allowing its simple and
powerful parameterization mechanism a la CLEAR, which would be
compromised if sorts were required to be nonempty.

43. On Some Formal Properties of Metarules
Hans Uszkoreit and Stanley Peters $1.50

This paper defines and studies metarules which derive context-free
phrase structure (CF) rules from other CF rules, as in
Generalized Phrase Structure Grammars.  It proves that every
recursively enumerable language can be generated by a set of CF
rules obtainable as the closure of a finite set of basic CF
rules under finitely many metarules, even when each metarule has only
one variable (thus settling a conjecture about GPSGs in the
negative).  Further results concern the case where the set of derived
CF rules contains no `phantom' (or `useless') symbols.  In this
case, recursively undecidable languages are still generated, but only
languages whose strings are linearly dense are generable; the class of
languages generated is incomparable with the indexed languages.
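
To fix intuitions, here is a toy metarule closure in the GPSG spirit
(an invented encoding and rule; the paper studies the formal versions
of this idea):

    # Toy GPSG-style metarule: a map from CF rules to CF rules; the
    # grammar is the closure of the basic rules under it.  The encoding
    # and the 'passive' rule are illustrative inventions.

    def passive(rule):
        lhs, rhs = rule
        if lhs == 'VP' and 'NP' in rhs:
            return ('VP[pas]',
                    tuple(x for x in rhs if x != 'NP') + ('PP[by]',))
        return None

    basic = {('VP', ('V', 'NP')), ('VP', ('V', 'NP', 'PP'))}
    rules, frontier = set(basic), set(basic)
    while frontier:                 # iterate the metarule to a fixed point
        new = {r for r in map(passive, frontier) if r and r not in rules}
        rules |= new
        frontier = new
    for r in sorted(rules):
        print(r)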

44. Language, Mind, and Information
John Perry $2.00

There are three points I wish to make.

First, the pick-up of information and the flow and transformation
of information (and misinformation) across physical, psychological,
linguistic, computational, and other sorts of information-carrying
events and processes form a unified subject matter which intersects a
number of disciplines, including linguistics, psychology, computer
science, artificial intelligence, philosophy, and logic.  We call this
field "Information, Computation, and Cognition."  I shall sometimes
shorten this to "information and intelligence."

Second, mathematical theories of informational content, which are
a crucial unifying tool for research in information and intelligence,
are neither trivial nor impossible; such theories are emerging from a
renaissance in logic that is already well underway.

Third, research into information and intelligence is underway in a
number of interdisciplinary settings around the nation, and these
activities are a valuable national resource.  They have been aided by
the National Science Foundation and other agencies, and in major ways by
two private foundations, the Sloan and System Development Foundations;
but funding that is more generally available and available on a
continuing basis is needed.

46. Constraints on Order
Hans Uszkoreit $3.00

Partially free word order as it occurs in German and probably to some
extent in all natural languages arises through the interaction of
potentially conflicting ordering principles.  A modified linear
precedence (LP) component of Generalized Phrase Structure
Grammar (GPSG) is proposed that accommodates partially free word
order.  In the revised framework, LP rules are sets of LP
clauses.  In a case in which these clauses make conflicting ordering
predictions, more than one order is grammatical.  LP clauses may
refer to different types of categorial information such as category
features, morphological case, thematic roles, discourse role, and
phonological information.  The modified framework is applied to
examples from German.  It is demonstrated how the new LP
component constrains the linear ordering of major nominal categories.

47. Linear Precedence in Discontinuous Constituents: Complex Fronting
in German
Hans Uszkoreit $2.50

Syntactic processes that have been identified as sources of
discontinuous constituents exhibit radically different properties.
They seem to fall into several classes: leftward "extractions,"
rightward "movements," "scrambling" phenomena, and parenthetical
insertions.  Current linguistic theories differ as to the formal tools
they employ both for describing the participating syntactic phenomena
and for encoding the resulting representations.

In this paper, the general problem of determining the linear order in
the discontinuous parts of a constituent is discussed.  The focus lies
on frameworks that use their feature mechanisms for connecting the
noncontiguous elements.  It is then shown that the current framework
of Generalized Phrase Structure Grammar (GPSG) is not suited for
describing the interaction of leftward extractions, scrambling, and
constraints on linear order. The relevant data come from German
fronting.  Previous analyses (Johnson, 1983; Nerbonne, 1984;
Uszkoreit, 1982, 1984) have neglected certain types of
fronting or failed to integrate their accounts of fronting properly
with an analysis of linear precedence.  The critical constructions
involve the fronting of main verbs together with some complements and
adjuncts as in the following example:

Unbemerkt seine Brieftasche stehlen kannst du  ihm nur  in der Oper.
Unnoticed his   wallet      steal   can    you him only at the Opera.

`Only at the opera can you steal his wallet unnoticed.'

If a widely accepted condition that restricts fronting to a single
constituent is not sacrificed, a binary branching clause structure is
required.  The branching structure, however, prevents LP rules
from specifying the linear order of the arguments and adjuncts of the
verb.

The proposed modification of the framework redefines the
relationship between syntax and lexicon.  A subcategorization or
valency feature is associated with an uninstantiated entry for a
lexical head.  Its value is a set of possible types of obligatory and
optional complements including constituents that have traditionally
been analyzed as adjuncts.  When a lexical entry is instantiated,
complex LP rules impose a linear order on a selected subset of
complements.  The order depends on syntactic, thematic, pragmatic, and
phonological information.  In the syntax, the head picks up its
complements one by one, just as in categorial grammars, thus creating
the branching structure suggested in previous proposals. 

The modified framework admits grammars that preserve the feature
analysis for unbounded dependencies, that abolish the metarule
analysis for the scrambling of adjuncts and complements, and that
treat certain stylistic reorderings such as
heavy-NP-shift as simple results of LP rule applications.

48. A Compilation of Papers on Unification-based Grammar Formalisms,
Parts I and II
Stuart M. Shieber, Fernando C. N. Pereira, Lauri Karttunen, and
Martin Kay $4.00

This report is a compilation of papers by members of the PATR
group at SRI International and collaborators reporting ongoing
research on both practical and theoretical issues concerning
unification-based grammar formalisms, that is, formalisms based on
unification of directed-graph structures.  The papers presented in
this compilation provide an overview of the design of PATR-II,
the current formalism being simultaneously designed, implemented, and
used by the SRI group; a discussion of the use of disjunction
and negation in unification-based feature systems; a mathematical
semantics for unification-based formalisms; techniques for efficient
implementation of unification over graph structures; and a method for
extension of parsing techniques for unification-based formalisms.
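
The core operation is easy to illustrate.  Below is a minimal unifier
over feature structures encoded as nested dictionaries (a
simplification: real PATR-II unifies directed graphs and so also
handles re-entrant, shared structure):

    # Minimal feature-structure unification over nested dicts (atoms as
    # strings).  Absent features unify with anything; atoms must match.
    FAIL = object()

    def unify(a, b):
        if a is None: return b          # absent feature: take the other side
        if b is None: return a
        if isinstance(a, dict) and isinstance(b, dict):
            out = dict(a)
            for k, v in b.items():
                r = unify(out.get(k), v)
                if r is FAIL:
                    return FAIL
                out[k] = r
            return out
        return a if a == b else FAIL    # atoms must match exactly

    np_ = {'cat': 'NP', 'agr': {'num': 'sg'}}
    vp = {'cat': 'VP', 'agr': {'num': 'sg', 'per': '3'}}
    print(unify(np_['agr'], vp['agr']))  # {'num': 'sg', 'per': '3'}
    print(unify(np_, vp) is FAIL)        # True: cat NP clashes with VP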

49. An Algorithm for Generating Quantifier Scopings
Jerry R. Hobbs and Stuart M. Shieber $2.50

The syntactic structure of a sentence often manifests quite clearly
the predicate-argument structure and relations of grammatical
subordination.  But scope dependencies are not so transparent.  As a
result, many systems for representing the semantics of sentences have
ignored scoping or generated scopings with mechanisms that have often
been inexplicit as to the range of scopings they choose among or
profligate in the scopings they allow.

In this paper, we present an algorithm, along with proofs of some of
its important properties, that generates scoped semantic forms from
unscoped expressions encoding predicate-argument structure.  The
algorithm is not profligate as are those based on permutation of
quantifiers, and it can provide a solid foundation for computational
solutions where completeness is sacrificed for efficiency and
heuristic efficacy.
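
For contrast, here is the naive permutation-based enumeration the
abstract alludes to (in an invented toy notation); the Hobbs-Shieber
algorithm instead generates only the valid scopings of the full
predicate-argument structure, including embedded NPs, without this
profligacy:

    # The naive approach: every permutation of the quantifiers is a
    # scoping.  ('&' is used uniformly where a real system would use '->'
    # for universals.)
    from itertools import permutations

    quants = [('every', 'x', 'man(x)'), ('some', 'y', 'woman(y)')]
    body = 'loves(x,y)'
    for order in permutations(quants):
        form = body
        for q, v, restr in reversed(order):   # wrap from innermost outward
            form = f'{q} {v} ({restr} & {form})'
        print(form)
    # every x (man(x) & some y (woman(y) & loves(x,y)))
    # some y (woman(y) & every x (man(x) & loves(x,y)))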

50. Verbs of Change, Causation, and Time
Dorit Abusch $2.00

D. Dowty in `Word Meaning and Montague Grammar' (Dordrecht: Reidel,
1979) claimed that the difference between classes of aspectual verbs
may be captured by a class of stative predicates and sentential
operators.  The appearance of sentential operators like BECOME and
CAUSE in the logical structure of verbs should predict their
aspectual properties.  He presents a lexical decompositional analysis
of word meaning in generative semantics.  His decompositional analysis
is presented as a fragment of "natural logic," for which an explicit
model theoretic interpretation is given.

The first claim, noted in part by Dowty, is that while achievements are
analyzed with BECOME combined with a stative predicate, and
accomplishments in terms of CAUSE, the morphological
categories of inchoatives and causatives are not of uniform aspectual type.
There are some inchoatives which meet tests for process verbs.  The
semantic analysis of inchoative and causative process verbs and their
interaction with time adverbials is provided.

The second claim is that verb classification itself breaks down
for the above causatives and perhaps for the inchoatives, since
according to the tests they are neither pure processes nor pure
accomplishments/achievements.  We believe that this result gives a
novel perspective on Dowty's theory.  Accomplishments are not
identical with causatives, and the role of the verb classification is
further weakened, in that the theory makes predictions
about verbs which fall into the cracks of the classification.

51. Noun-Phrase Interpretation
Mats Rooth $2.00

File change semantics, discourse representation theory, and situation
semantics (as delineated by Heim, Kamp, and Barwise and Perry) propose
nonquantificational analyses of indefinite noun phrases such as `a
man'.  To compare these analyses to the standard quantificational one,
a minimal syntactic fragment is given two meaning assignments: an
extensional Montague-style one, and one embodying a version of the
analysis of indefinites and anaphora of Heim, Kamp, and Barwise and Perry.
Because it duplicates Heim's and Kamp's solutions to problems of
discourse anaphora and so-called donkey anaphora, the second meaning
assignment has superior empirical coverage.  However, it is shown that
the two are similar outside the domain of anaphora, in that the
denotations for sentences not involving anaphora provided by the
second meaning assignment can be mapped homomorphically to the
denotations provided by the first.  This gives us confidence that
certain results, such as Barwise and Cooper's characterization of the
noun phrases permissible in `there'-insertion sentences or
Montague's analysis of intensional transitive verbs, can be
preserved in a theory embodying the improved analysis of anaphora.

52. Noun Phrases, Generalized Quantifiers and Anaphora
Jon Barwise $2.50

In this paper, ideas from situation semantics are used to
improve the generalized quantifier model of NP interpretation,
especially as regards anaphora.  The main ideas are to replace (i)
sets of sets with sets of sets of parametric sets, (ii) total variable
assignments with partial variable assignments, and (iii) the
three-place satisfaction relation $\mathfrak{M} \models \varphi\,[f]$ with a
four-place "dynamic interpretation" relation.

53. Circumstantial Attitudes and Benevolent Cognition
John Perry $1.50

Our cognitions have two aspects, their causal powers and their
contents, which must be coordinated if psychology is to make any sense
at all.  I call this the principle of efficient and benevolent
cognition.  "Efficient" means that the same psychological laws hold
for everyone; "benevolent" means that when a certain desire and
belief cause an action, then that action should promote the
satisfaction of the desire, given the truth of the belief.  It is not
easy to see, however, how this can be, given the circumstantial nature
of reference.  If the reference of a thought or idea depends not
merely on the cognitive state of the agent, but also on the
circumstances outside the agent, then we seem to have a dilemma.  If
the circumstantially determined content somehow controls action, we
have to abandon efficiency; the connection between cognitive states
and actions will not be matters of psychological law, the same for
everyone, but vary with individual circumstances.  If only the
cognitive states determine the action, benevolence becomes a bit of a
mystery, for how can we be sure that the caused action will be
appropriate to the contents of the desire and belief, as determined by
the combination of cognitive state and circumstances?  In this essay,
I claim that upon fairly careful reflection, the problem disappears.
Indeed, given the fact that action is circumstantial, the
circumstantiality of the content of belief and desire is required to
make the principle work out.

54. A Study in the Foundations of Programming Methodology:
Specifications, Institutions, Charters and Parchments
Joseph A. Goguen and R. M. Burstall $2.50

The theory of institutions formalizes the intuitive notion of a
"logical system."  Institutions were introduced (1) to support as much
computer science as possible `independently' of the underlying logical
system, (2) to facilitate the transfer of results (and artifacts such
as theorem provers) from one logical system to another, and (3) to
permit combining a number of different logical systems.  In
particular, programming-in-the-large (in the style of the CLEAR
specification language) is available for any specification or
"logical" programming language based upon a suitable institution.
Available features include generic modules, module hierarchies, "data
constraints" (for data abstraction), and "multiplex" institutions (for
combining multiple logical systems).  The basic components of an
`institution' are: a category of `signatures' (which generally provide
symbols for constructing sentences); a set (or category) of
$\Sigma$-`sentences' for each signature $\Sigma$; a category (or set)
of $\Sigma$-`models' for each $\Sigma$; and a $\Sigma$-`satisfaction'
relation, between $\Sigma$-sentences and $\Sigma$-models, for each
$\Sigma$.  The intuition of the basic axiom for institutions is that
`truth (i.e., satisfaction) is invariant under change of notation'.
This paper enriches institutions with sentence morphisms to model
proofs, and uses this to explicate the notion of a logical programming
language.
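
Spelled out in the standard notation (reconstructed here, writing
$\sigma(e)$ for the translated sentence and $\sigma(M')$ for the reduct
of $M'$ along $\sigma$), the axiom says that for each signature
morphism $\sigma: \Sigma \to \Sigma'$, each $\Sigma$-sentence $e$, and
each $\Sigma'$-model $M'$,

	$M' \models_{\Sigma'} \sigma(e)$   iff   $\sigma(M') \models_{\Sigma} e$.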

To ease constructing institutions, and to clarify various notions,
this paper introduces two further concepts.  A `charter' consists of
an adjunction, a "base" functor, and a "ground" object; we show that
"chartering" is a convenient way to "found" institutions.
`Parchments' provide a notion of sentential syntax, and a simple way
to "write" charters and thus get institutions.  Parchments capture the
insight that `the syntax of logic is an initial algebra'.  Everything
is illustrated with the many-sorted equational institution.
Parchments also explicate the sense of finitude that is appropriate
for specifications.  Finally, we introduce `generalized institutions',
which generalize both institutions and Mayoh's "galleries," and we
introduce corresponding generalized charters and parchments.

55. Quantifiers in Formal and Natural Languages
Dag Westerstahl $7.50

This paper surveys recent developments in the theory of generalized
quantifiers as interpretations of natural language determiners. It
focuses on the semantic constraints obeyed by such quantifiers, and on
their logical properties. The paper has four main sections. Section 1
provides background: a selective history of quantifiers from Aristotle
via Frege to modern generalized quantifiers, and an introduction to
generalized quantifiers in mathematical logic and their expressive
power. Section 2 presents basic ideas of the present approach to
natural language quantification, as initiated by Montague and
developed further by Barwise and Cooper, and Keenan and Stavi. In
particular, the conservativity universal is introduced, and the
interpretation of several types of quantifier expressions, related to,
e.g., numericals, comparatives, possessives, definites, partitives,
and Boolean combinations, is discussed. Section 3 formulates a number
of quantifier constraints, dealing with universe-restriction, monadic
quantifiers, closure under isomorphism and under Boolean operations,
non-triviality, monotonicity, partiality, finiteness, and evaluates
their logical effects. Section 4 surveys the theory of quantifiers
satisfying such constraints as developed by van Benthem and others.
The results concern relational properties of quantifiers, definability
and classification, logical constants, and inferential behavior. The
paper ends with a further outlook and two appendices, one on branching
quantification and one on quantifiers as variables.
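
For reference, the conservativity universal mentioned above is the
constraint that a determiner denotation lives on its first argument:
for all $A, B \subseteq E$,

	$Q_E\,AB$   iff   $Q_E\,A\,(A \cap B)$.

Thus `every man runs' depends only on the set of men and the set of
men who run.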

56. Intentionality, Information, and Matter
Ivan Blair $3.00

This article is concerned with the sources of intentionality, to which
I adopt a realist attitude.  I consider both the intentionality of
language and that of mind, and the relationship between them.  After
an examination of Hartry Field's proposal to explicate the
intentionality of mind in terms of a language of thought, and of Fred
Dretske's proposal based on an already intentional conception of
information grounded in natural law, I discuss Howard Pattee's thought
on the "symbol-matter problem," i.e., how to understand the relation
between symbolic and dynamic processes.  I describe Pattee's two
attempts to articulate a criterion for distinguishing between systems
which involve symbolic operations and merely dynamical systems,
success in which would provide us with both a better understanding of
symbolic systems and an enlarged vision of their potential
applications.

57. Graphs and Grammars
William Marsh $2.00

This paper looks at the role graphs other than trees can play in
several kinds of grammars: ordinary context-free grammars, Pereira's
extraposition grammars, the phrase-linking grammars of Peters and
Ritchie, and what we call mother-and-daughter grammars.  For each kind
of grammar we will specify the following: the set of graphs used by
that type of grammar, the way these graphs are to be made into trees,
the class of grammars of the type under consideration, and the way
these grammars accept and reject graphs.  After definitions and
examples, we present some theorems and conjectures concerning weak
generative capacity.  We also consider uses of these graphs and
grammars in linguistic theory.

58. Computer Aids for Comparative Dictionaries
Mark Johnson $2.00

Much of the collating and indexing work associated with constructing a
dictionary can be automated. This paper describes how we use a
computer to "undo" regular sound changes to produce a set of
reconstructed forms which are then used to index entries in a
comparative dictionary, resulting in a substantial timesaving over
manual compilation of the dictionary.  Moreover, after completion of
the dictionary a data base of lexical entries is available to aid
further research.
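
The central trick, running the sound laws in reverse so that cognates
from different languages index to a common reconstructed form, might be
sketched as follows; the languages, rules, and words are invented for
illustration.

    # Sketch: "undoing" regular sound changes to obtain index keys.

    INVERTED = {                          # per-language inverted laws
        'LgA': [('f', 'p'), ('h', 'k')],  # undo p > f and k > h
        'LgB': [],                        # LgB is conservative
    }

    def reconstruct(lang, form):
        for surface, proto in INVERTED[lang]:
            form = form.replace(surface, proto)
        return form

    index = {}
    for lang, word in [('LgA', 'fata'), ('LgB', 'pata')]:
        index.setdefault(reconstruct(lang, word), []).append((lang, word))

    assert index == {'pata': [('LgA', 'fata'), ('LgB', 'pata')]}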

59. The Relevance of Computational Linguistics
Lauri Karttunen $2.50

Computational Linguistics (CL) has moved in a few years from
obscurity to a center of activity.  Although it is often thought of as
being an applied field, the significance of CL for linguistics
as a whole is due to the theoretical work that is currently being
done, especially in the area of finite state transducers and
unification-based grammar formalisms.  By way of a concrete example,
the paper includes a categorially based unification grammar for a
fragment of Finnish.  It was presented at the North-American
Conference on Finnish Linguistics and Literature at the University of
Wisconsin, Madison, April 1986.

60. Grammatical Hierarchy and Linear Precedence
Ivan A. Sag $3.50

Considerable research has been directed toward the problem of
explaining word order variation across languages and the nature of
word order generalizations within individual languages. One
particularly influential approach is that of Gazdar and Pullum (1981), who
propose to replace familiar phrase structure rules by two kinds of
rules that must both be satisfied by well-formed linguistic
structures: Immediate Dominance (ID) rules, specifying the category
contents of particular constituent structures, and Linear Precedence
(LP) rules, which state general constraints on the possible orderings
within constituent structures. Their proposal, elaborated in Gazdar et
al. (1985), is couched in the framework of Generalized Phrase-Structure
Grammar (GPSG). 

In this paper, I examine a set of problems in English that challenge
Gazdar and Pullum's theory. I present a solution to these problems
within the framework of Head-driven Phrase-Structure Grammar (HPSG),
as expounded in Sag and Pollard (1986) and Pollard and Sag (1987). The
fundamental idea is that the hierarchical theory of grammatical
relations developed by Dowty (1982a,b) and incorporated into HPSG (but
not GPSG) provides information that must be accessed in the statement
of LP rules. Thus LP rules may function so as to require elements of
type A to precede elements of type B just in case the former are less
oblique than the latter, i.e., just in case the former are higher on
the hierarchy of grammatical relations. Indeed one such LP rule of
English is shown to solve three fundamental problems for Gazdar and
Pullum's nonhierarchical LP theory.

The paper also includes an overview of HPSG theory, as well as a
sketch of a new theory of adjuncts, extraposed phrases, and the
verb-particle construction.

61. D-PATR: A Development Environment for Unification-based Grammars
Lauri Karttunen $4.00

D-PATR is a development environment for
unification-based grammars on Xerox 1100 series work
stations.  It is built on the PATR formalism developed at SRI
International.  This formalism is suitable for encoding a wide variety
of grammars.  At one end of the range are simple phrase structure
grammars with no feature augmentations.  D-PATR can also be used
to write grammars that are based on a number of current linguistic
theories, such as lexical-functional grammar (Bresnan and Kaplan),
head-driven phrase structure grammar (Pollard and Sag), and functional
unification grammar (Kay).  At the other end of the range covered by
D-PATR are unification-based categorial grammars (Klein,
Steedman, Uszkoreit, Wittenburg) in which all the syntactic
information is incorporated in the lexicon and the remaining few
combinatorial rules that build phrases are function application and
composition.  Definite-clause grammars (Pereira and Warren) can also
be encoded in the PATR formalism.
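
The operation all of these formalisms share is unification of feature
structures.  A much-simplified Python sketch of that operation follows
(real PATR-style systems also handle reentrancy, templates, and more;
the grammar fragments are invented):

    # Feature structures as nested dicts, atoms as strings;
    # unification fails on an atomic clash.

    def unify(x, y):
        if x == y:
            return x
        if isinstance(x, dict) and isinstance(y, dict):
            out = dict(x)
            for feat, val in y.items():
                out[feat] = unify(out[feat], val) if feat in out else val
            return out
        raise ValueError('unification failure: %r vs %r' % (x, y))

    np  = {'cat': 'NP', 'agr': {'num': 'sg'}}
    det = {'cat': 'NP', 'agr': {'num': 'sg', 'per': '3'}}
    print(unify(np, det))  # {'cat': 'NP', 'agr': {'num': 'sg', 'per': '3'}}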

D-PATR consists of four basic packages.

These packages are written in simple INTERLISP-D for
transportability to other dialects of LISP. They do not depend
on the features of any particular machine. The only part of 
D-PATR that is specific to Xerox 1100 series work stations is its
USER INTERFACE. This last set of routines takes full
advantage of the graphic capabilities of D-machines.  It provides good
facilities for writing and editing grammars as well as many debugging
tools for the grammar writer.

D-PATR is not a commercial product. It is made available to
users outside CSLI who might wish to develop unification-based
grammars.  D-PATR is currently being used for grammar
development and teaching purposes at CSLI and a number of
foreign and American universities.

62. A Sheaf-theoretic Model of Concurrency
Luis F. Monteiro and Fernando C. N. Pereira $3.00

We develop here a structural theory of concurrency in which the
locality of interaction between subsystems is described with the
mathematical tools of the theory of sheaves.  This theory allows us to
model precisely processes that interact through common behavior at
shared locations.  In contrast to behavioral models, ours keeps track
of the individual contributions of subsystems to overall system
behavior, allowing a finer-grained analysis of subsystem interactions.

From event signatures that specify relations of independence and
exclusivity between events, we construct spaces of locations where
activity may occur.  Behaviors are then modeled as elements of sheaves
of monoids over those spaces and processes as certain sets of
behaviors.  The construction of the model, and in particular its
avoidance of interleaving, gives it very convenient mathematical
properties---sheaves of behavior monoids are to event signatures what
free monoids are to alphabets.  The theory also allows us to identify
on purely structural grounds event signatures with a potential for
deadlock.  

We conclude with a discussion of the solution of process equations in
our model, with an example from CSP.

63. Discourse, Anaphora, and Parsing
Mark Johnson and Ewan Klein $2.00

Discourse Representation Theory, as formulated by Hans Kamp and
others, provides a model of inter- and intra-sentential anaphoric
dependencies in natural language. In this paper, we present a
reformulation of the model which, unlike Kamp's, is specified
declaratively.  Moreover, it uses the same rule formalism for building
both syntactic and semantic structures.  The model has been
implemented in an extension of PROLOG, and runs on a VAX
11/750 computer.

64. Tarski on Truth and Logical Consequence
John Etchemendy $3.50

Tarski's writings on truth and logical consequence are among the most
influential works in both logic and philosophy of the twentieth
century.  But they are continually misconstrued in a variety of ways.
For example, Tarski's work on truth gave rise to the important field
of formal semantics (both as pursued by "Davidsonians" and by those in
the model-theoretic tradition).  However, Tarski's own project, that of
providing an eliminable definition of truth, actually conflicts with
the aims of formal semantics.  In this paper I try to straighten out
various of these misunderstandings, and to explain the genuine
significance and influence of Tarski's work on truth and logical
consequence.

65. The LFG Treatment of Discontinuity and the Double Infinitive
Construction in Dutch
Mark Johnson $2.50

This paper presents an analysis of the Double Infinitive Construction
(DIC) in Dutch within the LFG framework.  The analysis
presented is a direct extension of the analysis of the Dutch cross
serial dependencies presented in Bresnan, Kaplan, Peters and Zaenen
(1982); in effect, cross serial dependencies are analyzed here as
instances of the DIC.

However, under the DIC analysis proposed here, certain
grammatical sentences give rise to structures that violate the off-line
parsability requirement (Bresnan and Kaplan, 1982; Pereira
and Warren, 1983), which ensures that LFG
is decidable.  I show that this problem can be avoided by a
modification of the LFG framework, incorporating the device of
functional uncertainty, which has been proposed on independent grounds
by Kaplan and Zaenen (forthcoming) and Saiki (1985).

66. Categorial Unification Grammars
Hans Uszkoreit $2.50

A type of grammar formalism is proposed that merges strategies from
two successful approaches in formal linguistics.  Categorial
unification grammars (CUGs) embody the essential properties of both
unification and categorial grammar formalisms.  Their efficient and
uniform way of encoding linguistic knowledge in well-understood and
widely used representations makes them attractive for computational
applications and for linguistic research.

In this paper, the basic concepts of CUGs and simple examples of
their application will be presented.  It will be argued that the
strategies and potentials of CUGs justify their further exploration
in the wider context of research on unification grammars.  Approaches
to selected linguistic phenomena are discussed.

67. Generalized Quantifiers and Plurals
Godehard Link $2.00

In earlier work, the author developed a logic of plurals in a model
theoretic framework, giving the domain of his models the structure of
an atomic join semi-lattice; a plural NP denoted an individual,
the join of a number of atomic elements in the lattice, and not a set.
Here, this logic of plurals is lifted into the generalized quantifier
framework for NP interpretation, and a number of extensions of
the resulting theory are proposed.  It is claimed that quantifiers may
range over plural individuals, or i-sums, as well as over atomic
individuals in the domain, and a number of supporting examples are
discussed.  Also, it is argued that in NPs of the form `numeral
CN', the numeral is not a determiner, but an adjective.
Finally, the treatment of the English floated quantifiers `each'
and `all' as adverbial operators, as proposed by Dowty and Brodie
(1984) is adapted to the present theory by the introduction of an
adverbial distributivity operator, and the relation of English
reciprocals, of NPs containing `different', and of the
German particle `je' to adverbial distributivity is explored.
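
A toy computational rendering of the lattice machinery (the encoding is
invented here, not the author's): i-sums as nonempty sets of atoms,
join as union, and a distributivity operator taking a predicate of
atoms to a predicate of i-sums.

    # Sketch: plural individuals (i-sums) in an atomic join
    # semi-lattice, with an adverbial distributivity operator D.

    def join(*individuals):
        return frozenset().union(*individuals)

    def distribute(P):
        """An i-sum satisfies D(P) iff each of its atoms satisfies P."""
        return lambda isum: all(P(frozenset([a])) for a in isum)

    john, mary = frozenset(['j']), frozenset(['m'])

    def sleeps(x):                     # the atoms that sleep
        return x in {john, mary}

    # "John and Mary each sleep"
    assert distribute(sleeps)(join(john, mary))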

68. Radical Lexicalism
Lauri Karttunen $2.50

Abstract not available

70. Understanding Computers and Cognition: Four Reviews and a
Response
Mark Stefik, Editor $3.50

Every now and then a book about computers and AI sweeps
through the community and divides opinion.  Some people praise it
while others deeply criticize it.  The new book by Winograd and Flores
is such a book.

In this report there are four reviews of `Understanding
Computers and Cognition.' The reviewers bring different backgrounds
and concerns to their assessments of the book.
At the end of this report Winograd and Flores respond.

71. The Correspondence Continuum
Brian Cantwell Smith $4.00

It is argued that current semantical techniques for analysing
knowledge representation systems (clear use/mention distinctions,
strict metalanguage hierarchies, distinct "syntactic" and
"semantic" accounts, even model-theory itself) are too rigid to
account for the complexities of representational practice, and
ill-suited to explain intricate relations among representation,
specification, implementation, communication, and computation.  By
way of alternative, the paper advocates the prior development of a
general theory of correspondence, able to support an indefinite
continuum of circumstantially dependent representation relations,
ranging from fine-grained syntactic distinctions at the level of
language and implementation, through functional data types, abstract
models, and indirect classification, all the way to the represented
situation in the real world.  The overall structure and some general
properties of such a correspondence theory are described, and its
consequences for semantic analysis surveyed.

72. The Role of Propositional Objects of Belief in Action
David J. Israel $2.50

Abstract not available

73. From Worlds to Situations
John Perry $2.00

In his recent book, `Inquiry,' Robert Stalnaker has developed a
conception of "possible worlds" as "ways the world might be."  In
this paper it is argued that it is reasonable and useful for one who
has such a conception of possible worlds to extend it in various ways
until it becomes a version of situation theory.  Such "ways" seem
best conceived or modeled as functions from issues to answers, where
an issue is determined by an `n'-ary relation and `n' appropriate
objects.  But then partial functions from issues to answers seem a
harmless but useful addition to the theory.  The theory can then
avail itself of the situation-theoretic treatment of the
problem of necessary equivalence.  Stalnaker's claim that the
identity of necessarily equivalent propositions is an inevitable
consequence of an informational/pragmatic conception of
intentionality is also criticized.  The uses of parts of the world,
as well as of partial ways the world might be, are also discussed.

74. Two Replies
Jon Barwise $3.00

Surely one measure of a theory is the range of people who find it
significant enough to take exception to all or part of it.  By that
measure, Situation Semantics must be considered wildly successful.
What follows are two replies to criticisms coming from opposite
directions, from Max Cresswell and Jerry Fodor.

The first reply, "Situations and Small Worlds," was solicited by
Arnim von Stechow for inclusion in a projected `Handbook of
Semantics'.  In it I offer a comparison of situation semantics with
possible worlds semantics, and try to refute some claims Cresswell
has made about situation semantics.

The second reply, "Unburdening the Language of
Thought," was solicited by Samuel Guttenplan
for the journal `Mind and Language'.  In this paper I
introduce two principles, the Situatedness of Content Principle,
and the Situatedness of Causal Role Principle, and try to show why
they (i) are plausible, (ii) go together, and
(iii) are at odds with much of what Fodor believes about the
Language of Thought.

75. Semantics of Clocks
Brian Cantwell Smith $2.50

Clocks participate in their subject matter.  Temporal by nature, they
also represent time.  And yet, like other representational systems,
clocks have been hard to build, and can be wrong.  For these and
other reasons clocks are a good foil with which to explore issues in
AI and cognitive science about computation, mind, and the relation
between semantics and mechanism.

An analysis is presented of clock face content, the function of
clockworks, and various notions of chronological correctness.

77. The Parts of Perception
Alexander Pentland $4.00

To support our reasoning abilities, perception must recover
environmental regularities---e.g., rigidity,
"objectness," axes of symmetry---for later use by
cognition.  Unfortunately, the representations that are currently
available were originally developed for other purposes (e.g.,
physics, engineering) and have so far proven unsuitable for the
task of perception.  In answer to this problem we present a
representation that has proven competent to accurately describe an
extensive variety of natural forms (e.g., people, mountains,
clouds, trees), as well as man-made forms, in a succinct and
natural manner.  The approach taken in this representational
system is to describe scene structure at a scale that is similar
to our naive perceptual notion of "a part," by use of
descriptions that reflect a possible formative history of the
object, e.g., how the object might have been constructed from
lumps of clay.  One absolute constraint on any theory of shape
representation is that it must be possible to recover accurate
descriptions from image data.  We therefore present several
examples of recovering such a "part" description from natural
imagery, and show that this recovery process is overconstrained
and therefore potentially quite reliable.  Finally, we show that
by using this shape representation we can improve man-machine
communication in several contexts; this provides further evidence
of the "naturalness" of the representation.

78. Topic, Pronoun, and Agreement in Chichewa
Joan Bresnan and Sam Mchombo $5.00

Typologists have maintained that grammatical agreement systems evolve
historically from the morphological incorporation of pronouns into
verbs or nominal heads, and it has also been claimed that there is no
clear dividing line between grammatical agreement, such as
subject-verb agreement, and incorporated pronominal anaphora to a
topic.  Current theories of formal grammatical structure provide
little insight into the nature of grammatical and anaphoric
agreement, why they are so closely related, and what significant
differences there are between them.  As we will show in this study,
there is substantial synchronic evidence of the close relation
between grammatical and anaphoric agreement even within the
grammatical structures of a single language.  But, as we will also
show, it is possible to predict clear syntactic differences between a
grammatical agreement marker and a morphologically incorporated
anaphoric pronoun.  What is required is a theory of grammatical
functions that integrates the properties of argument functions, such
as subject and object, and discourse functions such as topic and
focus.  This study is a step toward developing such a theory within
the overall framework of the lexical-functional theory of grammar.

79. HPSG: An Informal Synopsis
Ivan Sag and Carl Pollard $2.50

Head-driven phrase structure grammar (HPSG) is an information-based
theory of natural language syntax and semantics which has its roots in
a number of different research programs within linguistics and
neighboring disciplines such as philosophy and computer science. Thus
it has drawn upon and attempted to synthesize insights and
perspectives from several families of contemporary syntactic theories
such as categorial grammar (Dowty, 1982a, 1982b; Bach, 1983; Steedman,
1985), lexical-functional grammar (LFG) (Bresnan, ed., 1982),
generalized phrase structure grammar (GPSG) (Gazdar et al., 1985), and
government-binding theory (GB) (Chomsky, 1981); but many of the key
ideas arise from semantic theories such as situation semantics
(Barwise and Perry, 1983) and discourse representation theory (Kamp,
1981; Heim, 1982), and from computational work in such areas as
knowledge representation (Roberts and Goldstein, 1977; Ait-Kaci,
1984), data type theory (Moshier and Rounds, 1986; Kasper and Rounds,
1986; Rounds and Kasper, 1986), and unification-based formalisms (Kay,
1978, 1985; Shieber, 1984; Shieber et al., 1984).  HPSG incorporates a
number of design properties that bear on its relation to theories of
(human or computer) language processing. The grammars sketched herein,
and developed in greater detail in Pollard and Sag (forthcoming), are
purely monotonic, declarative, reversible, and stated in terms of
partial information structures represented by a precisely-defined
mathematical formalism.

80. The Situated Processing of Situated Languages
Susan Stucky $1.50

We take as a plausible starting point the hypothesis that the
successful processing of a natural-language expression results in the
agent's (whether human or machine) being in a state that has the same
interpretation, i.e., is about the same state-of-affairs, as the input
natural language expression.  Such states I call successful states, or
s-states, for short.  When the consequences of the situatedness (i.e.,
of the dependence on context for interpretation) of both language and
computation are properly assessed, we see that s-states are best
viewed as intentional states in their own right, with all the
complexity that that entails: indexicality, intensionality, and
perhaps even ambiguity.  From the premise that both computation and
language are situated, it follows that inference, if it is to be
compatible with them in a natural way, will be situated too.  All of this
leads to a commitment, I will be arguing, not simply to the processing of
situated language, but to situated natural-language processing.
Viewing computation, language, and inference through this perspective,
I will maintain, suggests a conception of natural-language processing
that is both more powerful and more realistic than that underlying
much of current practice.  In the end it may be more complicated in
the requirements it places on our theories of meaning, but it opens
the door for more simplicity, too: by taking the situatedness to
heart, we will find that less needs to be represented explicitly, both
in our accounts of language and in our processing of it.

81. Muir: A Tool for Language Design
Terry Winograd $2.50

Muir is a `language design environment,' intended for use in
creating and experimenting with languages such as programming
languages, specification languages, grammar formalisms, and logical
notations.  It provides facilities for a language designer to create
a language specification, which controls the behavior of generic
language manipulating tools typically found in a language-specific
environment, such as structure editors, interactive interfaces,
storage management and attribute analysis.  It is oriented towards
use with evolving languages, providing for mixed structures
(combining different languages or different versions), semi-automated
updating of structures from one language version to another, and
incremental language specification.  A new hierarchical grammar
formalism serves as the framework for language specification, with
multiple presentation formalisms and a unified interactive
environment based on an extended notion of edit operations.  A
prototype version is operating and has been tested on a small number
of languages.

82. Final Algebras, Cosemicomputable Algebras, and Degrees of
Unsolvability
Lawrence S. Moss, Jose Meseguer, and Joseph A. Goguen $3.00

This paper studies some computability notions for abstract data
types, and in particular compares cosemicomputable many-sorted
algebras with a notion of finality to model minimal-state
realizations of abstract (software) machines.  Given a finite
many-sorted signature $\Sigma$ and a set V of visible sorts, for
every $\Sigma$-algebra A with co-r.e. behavior and nontrivial,
computable V-behavior, there is a finite signature extension
$\Sigma'$ of $\Sigma$ (without new sorts) and a finite set E of
$\Sigma'$-equations such that A is isomorphic to a reduct of the
final $(\Sigma', E)$-algebra relative to V.  This uses a theorem
due to Bergstra and Tucker (1983).  If A is computable, then A is
also isomorphic to the reduct of the initial $(\Sigma', E)$-algebra.
We also prove some results on congruences of finitely generated free
algebras.  We show that for every finite signature $\Sigma$, there are
either countably many $\Sigma$-congruences on the free $\Sigma$-algebra
or else there is a continuum of such congruences.  There are several
necessary and sufficient conditions which separate these two cases.
We introduce the notion of the Turing degree of a minimal algebra.
Using the results above, we prove that there is a fixed one-sorted
signature $\Sigma$ such that for every r.e. degree d, there is a finite
set E of $\Sigma$-equations such that the initial $(\Sigma, E)$-algebra
has degree d.  There is a two-sorted signature $\Sigma_0$ and a single
visible sort V such that for every r.e. degree d there is a finite set
E of $\Sigma_0$-equations such that the initial $(\Sigma_0,E,V)$-algebra
is computable and the final $(\Sigma_0,E,V)$-algebra is cosemicomputable
and has degree d.

83. The Synthesis of Digital Machines with Provable Epistemic
Properties
Stanley J. Rosenschein and Leslie Pack Kaelbling $3.50

Researchers using epistemic logic as a formal framework for studying
knowledge properties of artificial-intelligence (AI) systems often
interpret the knowledge formula $K(x,\varphi)$ to mean that machine x
encodes $\varphi$ in its state as a syntactic formula or can derive it
inferentially.  If $K(x,\varphi)$ is defined instead in terms of the
correlation between the state of the machine and that of its
environment, the formal properties of modal system S5 can be satisfied
without having to store representations of formulas as data
structures.  In this paper, we apply the correlational definition of
knowledge to machines with composite structure and describe the
semantics of knowledge representations in terms of correlation-based
denotation functions.  In particular, we describe how epistemic
properties of synchronous digital machines can be analyzed, starting
at the level of gates and delays, by modeling the machine's components
as agents in a multiagent system and reasoning about the flow of
information among them.  We also introduce Rex, a language for
computing machine descriptions recursively, and explain how it can be
used to construct machines with provable informational properties.

84. Formal Theories of Knowledge in AI and Robotics
Stanley J. Rosenschein $1.50

Although the concept of `knowledge' plays a central role in
artificial intelligence, the theoretical foundations of knowledge
representation currently rest on a very limited conception of what it
means for a machine to know a proposition.  In the current view, the
machine is regarded as knowing a fact if its state either explicitly
encodes the fact as a sentence of an interpreted formal language or if
such a sentence can be derived from other encoded sentences according
to the rules of an appropriate logical system.  We contrast this
conception, the interpreted-symbolic-structure approach, with another,
the situated-automata approach, which seeks to analyze knowledge in
terms of relations between the state of a machine and the state of its
environment over time, using logic as a metalanguage in which the
analysis is carried out.

85. An Architecture for Intelligent Reactive Systems
Leslie Pack Kaelbling $2.00

Any intelligent system that operates in a moderately complex or
unpredictable environment must be `reactive'---that is, it must
respond dynamically to changes in its environment.  A robot that
blindly follows a program or plan without verifying that its
operations are having their intended effects is not reactive.  For
simple tasks in carefully engineered domains, non-reactive behavior is
acceptable; for more intelligent agents in unconstrained domains, it
is not.

This paper presents the outline of an architecture for intelligent
reactive systems.  Much of the discussion will relate to the problem
of designing an autonomous mobile robot, but the ideas are independent
of the particular system.  The architecture is motivated by the
desires for modularity, awareness, and robustness.

86. Order-Sorted Unification.
Jose Meseguer, Joseph A. Goguen, and Gert Smolka $2.50

Order-sorted logic is the logic of multiple inheritance and
overloading polymorphism. It provides a rich type theory that permits
easy and natural expression for many problems in knowledge
representation, natural language processing, theorem proving, etc.
Order-sorted logic is also the basis of the logical languages OBJ3
and EQLOG.  In spite of its considerable expressiveness, all the usual
results of equational and first-order logic generalize to order-sorted
logic.  The present work develops a general theory of order-sorted
E-unification and characterizes the cases when there is a minimal
family of unifiers, a finite family of unifiers, and a unique most
general unifier.  The latter case has a simple syntactic
characterization and also a quasi-linear unification algorithm a la
Martelli-Montanari that is in fact more efficient than ordinary
unification thanks to the type-checking.
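
The step that distinguishes the order-sorted case can be suggested with
a toy Python sketch in which unifying two variables succeeds exactly
where their sorts have a greatest lower bound; the three-sort chain
below is invented for illustration.

    # Toy sort hierarchy Nat < Int < Rat.  leq is the reflexive
    # subsort relation; glb picks the greatest of the common lower
    # bounds, when any exist.

    SORTS = ('Nat', 'Int', 'Rat')
    SUBSORT = {('Nat', 'Int'), ('Int', 'Rat'), ('Nat', 'Rat')}

    def leq(s, t):
        return s == t or (s, t) in SUBSORT

    def glb(s, t):
        lower = [u for u in SORTS if leq(u, s) and leq(u, t)]
        if not lower:
            return None
        return max(lower, key=lambda u: sum(leq(v, u) for v in lower))

    def unify_vars(x, y):
        """Unify variables (name, sort); the result lives at the glb."""
        s = glb(x[1], y[1])
        if s is None:
            raise ValueError('no common subsort of %s and %s' % (x[1], y[1]))
        return ('z', s)

    print(unify_vars(('x', 'Int'), ('y', 'Rat')))   # ('z', 'Int')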

87. Modular Algebraic Specification of Some Basic Geometrical
Constructions
Joseph A. Goguen $2.50

This paper presents a modular algebraic specification of some basic
constructions in plane geometry, including the line through two
points, the intersection of two lines, the circle through three
points, and the tangent to a circle through a point.  These
constructions are specified over any field having square roots of
non-negative elements.  This approach to geometry requires that we
also specify some basic algebra, including rings, fields, and
determinants.  Several aspects of current algebraic specification
technology are illustrated, including modularity, parameterization,
hierarchical organization, information hiding, and exceptions.  The
latter is supported by a theory of subsorts, called order-sorted
algebra.  This paper also introduces novel approaches to block
structured specification, and to degenerate and multiple
representations; the latter issues seem to be of some interest in
computational geometry, and again use order-sorted algebra.  A
powerful approach to coercions is also introduced.

88. Persistence, Intention and Commitment
Phil Cohen and Hector Levesque $3.50

This paper explores principles governing the rational balance among an
agent's beliefs, goals, actions, and intentions.  Such principles
provide specifications for artificial agents, and approximate a theory
of human action (as philosophers use the term).  By making explicit
the conditions under which an agent can drop his goals, i.e., by
specifying how the agent is `committed' to his goals, the formalism
captures a number of important properties of intention.  Specifically,
the formalism provides analyses for Bratman's three characteristic
functional roles played by intentions (Bratman 1983a, 1986), and shows
how agents can avoid intending all the foreseen side-effects of what
they actually intend.  Finally, the analysis shows how intentions can
be adopted relative to a background of relevant beliefs and other
intentions or goals.  By relativizing one agent's intentions in terms
of beliefs about another agent's intentions (or beliefs), we derive a
preliminary account of interpersonal commitments.

89. Rational Interaction as the Basis for Communication
Phil Cohen and Hector Levesque $3.50

Abstract not available

90. An Application of Default Logic to Speech Act Theory
C. Raymond Perrault $2.50

One of the central issues to be addressed in basing a theory of speech
acts on independently motivated accounts of propositional attitudes
(belief, knowledge, intentions, ...) and action is the specification
of the effects of communicative acts.  The very fact that speech acts
are largely conventional means that specifying, for example, the
effects of the utterance of a declarative sentence, or the performance
of an assertion, requires taking into consideration many possible
exceptions to the conventional use of the utterances (e.g., the
speaker may be lying, the hearer may not believe him, etc.).  Previous
approaches to the problem have paid insufficient attention to the
dependence of the participants' mental state following the utterance on
their mental state before it. We present a limited solution to the
revision of beliefs within Reiter's non-monotonic Default Logic and
show how to formulate the consequences of many uses of declarative
sentences.  Default rules are used to embody simple theories of belief
adoption, of action observation, and of the relation between the form
of a sentence and the attitudes it is used to convey.

91. Models and Equality for Logical Programming
Joseph A. Goguen and Jose Meseguer $3.00

We argue that some standard tools from model theory provide a better
semantic foundation than the more syntactic and operational approaches
usually used in logic programming.  In particular, we show how initial
models capture the intended semantics of both functional and logic
programming, as well as their combination, with existential queries
having logical variables (for both functions and relations) in the
presence of arbitrary user-defined abstract data types, and with the
full power of constraint languages, having any desired built-in
(computable) relations and functions, including disequality (the
negation of the equality relation) as well as the usual ordering
relations on the usual built-in types, such as numbers and strings.
These results are based on a new completeness theorem for order-sorted
Horn clause logic with equality, plus the use of standard
interpretations for fixed sorts, functions and relations.  Finally, we
define "logical programming," based on the concept of institution, and
show how it yields a general framework for discussions of this kind.
For example, this viewpoint suggests that the natural way to combine
functional and logic programming is simply to combine their logics,
getting Horn clause logic with equality.

92. Order-Sorted Algebra Solves the Constructor-Selector, Multiple
Representation and Coercion Problems
Joseph A. Goguen and Jose Meseguer $2.00

Structured data are generally composed from constituent parts by
`constructors' and decomposed by `selectors'.  We prove that the usual
many-sorted algebra approach to abstract data types `cannot' capture
this simple intuition in a satisfactory way.  We also show that
order-sorted algebra `does' solve this problem, and many others
concerning ill-defined and erroneous expressions, in a simple and
natural way.  In particular, we show how order-sorted algebra supports
an elegant solution to the problems of multiple representations and
coercions.  The essence of order-sorted algebra is that sorts have
`subsorts', whose semantic interpretation is the `subset' relation on
the carriers of algebras.
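
A toy rendering of the subsort idea (the encoding is invented, and
Python does not itself enforce the sorting): the selectors become total
on the subsort of nonempty lists, so that taking the head of the empty
list is ill-sorted rather than a runtime error.

    # Sketch: NeList (nonempty lists) as a subsort of List.  In
    # order-sorted algebra head(nil) is ill-sorted; here the type
    # annotation merely documents that intent.

    class List:                    # sort List
        pass

    class Nil(List):               # constructor  nil : -> List
        pass

    class NeList(List):            # subsort NeList < List
        def __init__(self, head, tail):   # cons : Elt List -> NeList
            self.head, self.tail = head, tail

    def head(l: NeList):           # selector, total on NeList
        return l.head

    print(head(NeList(1, Nil())))  # prints 1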

93. Extensions and Foundations for Object-Oriented Programming
Joseph A. Goguen and Jose Meseguer $3.50

This paper presents some novel design ideas, multiparadigm extensions,
and logical (declarative) semantics for object-oriented programming
(OOP).  The simple rigorous semantic foundations that we give for OOP
seem to be the first available, and have the practical advantages of
supporting clean language design and many features that are new to
OOP, including: a functional level (providing abstract data types for
attribute values) that is distinct from the object level (providing
objects, classes, and methods); subsorts (from order-sorted algebra),
which greatly increase the expressiveness of abstract data types and
also provide a simple semantics for multiple inheritance;
parameterization at both the functional and object levels; strong but
flexible typing, with overloaded mixfix operations; and a "wide
spectrum" integration of coding, rapid prototyping, and specification.
These features together constitute FOOPS, which thus combines OOP with
functional programming.  We also unify FOOPS with relational (i.e.,
"logic") programming to get FOOPlog, which adds logical variables and
backtracking and thus combines all three major emerging programming
paradigms.  Regarding semantics, we present: an abstract operational
semantics based on "reflection," in the sense of using an abstract
data type for programs; a corresponding logical basis in a
"reflective" logic; a corresponding more efficient operational
semantics; another (slightly less general) mathematical semantics based
on an "abstract machine" generalization of abstract data types to
include hidden sorts for states; and a definition of "logical
programming" that explicates "declarative programming" in a sense that
includes functional, relational and object-oriented programming.  Our
approach also clarifies the relationships among the various
programming paradigms, and with databases.  In particular,
relationships among some characteristic styles of code we use are
explored, including parameterization, module hierarchies (importing
and exporting), and multiple inheritance.

94. L3 Reference Manual
William Poser $2.50

This document describes L3, a flexible, highly interactive multiple
time series display, analysis, and editing system designed for use in
phonetics research. L3 provides the ability to display one or more
time-locked time series simultaneously, to interrogate the display
about the original data points or any of a number of functions of the
data, and to edit the displayed data for use in speech resynthesis or
interactive modeling.

L3 was designed with two principal goals in mind: (1) to facilitate
the study of large amounts of data; and (2) to be as generally usable
as possible. It accomplishes the first goal by automating many common
measurements and by providing an elaborate system for automatic
logging of measurements. It accomplishes the second goal by providing
for an unusual degree of customization.  Each time-series is displayed
in a separate window. Windows may be created or removed interactively.
Each window's location and height may be set by the user, as may be
nearly all of its graphical parameters. Other user-specifiable
window-specific properties include the logging of the results of
interrogations of the display and the functions used to compute the
values returned by interrogation of the display. Each window is
independently associated with a data file and track within the data
file, so minimal constraints are imposed on the format of the data
displayed. The configuration of the program may be changed
interactively by typing commands or it may be initialized by reading a
command file.

95. Change, Process and Events
Carol E. Cleland 

We commonly think of change as something which is inherently dynamic:
the birth of a child, the shattering of a window, the flying of a
bird, the exploding of the space shuttle Challenger.  That is to say,
we think of change as involving some kind of physical medium for the
alteration of conditions associated with the change.

In this light, it is surprising how few of our modes of representing
change provide for any notion of process or activity.  In
contemporary analytic philosophy, for instance, change is almost
invariably represented in terms of a mere difference in the
properties or relations exemplified by an object at different times.
Similarly, change is often represented in theoretical computer
science in terms of time-ordered sequences of discrete (Turing)
machine "configurations."

In the first part of this paper I argue for the (re)introduction into
both philosophy and computation theory of a dynamic notion of change.
Accordingly, in the second part, I develop an account of change which
draws a fundamental distinction between an actual process of
`changing' and the differences (in property or machine configuration)
over time associated with it: on my account, it is solely in virtue of
the `inherently' dynamic nature of an actual process of changing that
there are any differences in time in the first place.  In part three,
I use the account of change developed in part two to solve a number of
traditional philosophical puzzles about events.  Finally, in part
four, I speculate about the implications of such an account of change
for computer science, suggesting that the notion that computational
procedures can be fully understood independently of their actual
embodiment in a machine is fundamentally mistaken.

96. One, None, a Hundred Thousand Specification Languages
Joseph A. Goguen $2.00

Many different languages have been proposed for specification,
verification, and design in computer science; moreover, these
languages are based upon many different logical systems.  In an
attempt to comprehend this diversity, the theory of `institutions'
formalizes the intuitive notion of a "logical system." A number of
general linguistic features have been defined "institutionally" and
are thus available for any language based upon a suitable institution.
These features include generic modules, module hierarchies, "data
constraints" (for data abstraction), and multiplex institutions (for
combining multiple logical systems).  In addition, institution
morphisms support the transfer of results (as well as associated
artifacts, such as theorem provers) from one language to another.
More generally, institutions are intended to support as much computer
science as possible independently of the underlying logical system.

97. Constituent Coordination in HPSG
Derek Proudian and David Goddeau $1.50

The analysis of coordinate constructions has long been a topic of
interest to theoretical and computational linguists alike. The ability
to handle coordination is essential in any practical natural language
system because of the ubiquity of the construction. The problem is
made more interesting by the fact that the range of coordination
phenomena is very wide yet surprisingly subtle. Many dissimilar types
of constituents, and non-constituents, may be coordinated, yet not all
types may be. This raises the interesting theoretical problem of
accounting for exactly the range of permissible constructions. For the
computational linguist there is the additional challenge of
discovering an efficient algorithm for parsing the constructions, and
of fitting the analysis into a research vehicle. 

This paper discusses constituent coordination in an HPSG
framework, in particular in the natural language system under
development at Hewlett-Packard Laboratories. The analysis described in
the paper was developed in September of 1985 and has been in active
use at HP Labs since that time. The analysis concerns itself only with
constituent coordination, which although not the whole story on
coordination by any means, nonetheless represents a significant and
useful fragment of the coordination puzzle.

98. A Language/Action Perspective on the Design of Cooperative Work
Terry Winograd $2.50

In creating computer-based systems, we work within a perspective that
shapes the design questions that will be asked and the kinds of
solutions that are sought. This paper introduces a perspective based
on language as action, and explores its consequences for system
design. We describe a communication tool called The
Coordinator, which was designed from a language/action
perspective, and we suggest how further aspects of coordinated work
might be addressed in a similar style. The language/action perspective
is illustrated with an example based on studies of nursing work in a
hospital ward and is contrasted to other currently prominent
perspectives.

99. Implicature and Definite Reference
Jerry R. Hobbs $1.50

An account is given of the appropriateness conditions for definite
reference in terms of the operations of inference and implicature. It
is shown how a number of problematic cases noticed by Hawkins can be
explained in this framework. In addition, the use of unresolvable
definite noun phrases as a literary device and definite noun phrases
with nonrestrictive material can be explained within the same framework.

100. Thinking Machines: Can there be? Are we?
Terry Winograd $2.50

Artificial intelligence researchers predict that "thinking machines"
will take over our mental work, just as their mechanical predecessors
were intended to eliminate physical drudgery. Critics have argued with
equal fervor that "thinking machine" is a contradiction in terms.
Computers, with their foundations of cold logic, can never be creative
or insightful or possess real judgement. Although my own understanding
developed through active participation in artificial intelligence
research, I have now come to recognize a larger grain of truth in the
criticisms than in the enthusiastic predictions. The source of the
difficulties will not be found in the details of silicon micro-circuits
or Boolean logic, but in a basic philosophy of `patchwork
rationalism' that has guided the research. In this paper I review the
guiding principles of artificial intelligence and argue that as now
conceived it is limited to a very particular kind of intelligence: one
that can usefully be likened to bureaucracy. In conclusion I will
briefly introduce an orientation I call `hermeneutic
constructivism' and illustrate how it can lead to an alternative path
of design.

101. Situation Semantics and Semantic Interpretation in
Constraint-based Grammars
Per-Kristian Halvorsen $1.50

This paper analyses the problem of compositional semantic
interpretation in constraint-based approaches to linguistic analysis
(LFG, FUG, PATR).  We show how semantic interpretations can be arrived
at by means of constraints which give a declarative specification of
the relationship between the form of an utterance and its semantic
content.  Traditionally, and specifically in Montague Grammar,
semantic interpretations have been derived by means of semantic rules
specifying semantic operations on semantic objects in a semantic
algebra.  We examine previous proposals for semantic interpretation
strategies for unification grammars, and we find that the
misinterpretation of the semantic constraints as specifying operations
in the semantic algebra has prevented the emergence of simple and
powerful methods for semantic interpretation in constraint-based
environments.  We introduce a notation for semantic constraints as an
extension to the rule language of lexical-functional grammar.  The
notation is illustrated by examples of lexical items and annotated
phrase-structure rules from a fragment of English analyzed by our
parser.

102. Category Structures
Gerald Gazdar, Geoffrey K. Pullum, Robert Carpenter, Ewan Klein,
Thomas E. Hukari, Robert D. Levine $3.00

This paper outlines a simple and general notion of syntactic category
on a metatheoretical level, independent of the notations and
substantive claims of any particular grammatical framework.  We define
a class of formal objects called "category structures" where each such
object provides a constructive definition for a space of syntactic
categories.  A unification operation and subsumption and identity
relations are defined for arbitrary syntactic categories.  In
addition, a formal language for the statement of constraints on
categories is provided. By combining a category structure with a set
of constraints, we show that one can define the category systems of
several well-known grammatical frameworks: phrase structure grammar,
tagmemics, augmented phrase structure grammar, relational grammar,
transformational grammar, generalized phrase structure grammar,
systemic grammar, categorial grammar, and indexed grammar. The problem
of checking a category for conformity to constraints is shown to be
solvable in linear time.  This work provides in effect a unitary class
of data structures for the representation of syntactic categories in a
range of diverse grammatical frameworks.  Using such data structures
should make it possible for various pseudo-issues in natural language
processing research to be avoided.  We conclude by examining the
questions posed by set-valued features and sharing of values between
distinct feature specifications, both of which fall outside the scope
of the formal system developed in this paper.
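
For the special case of flat feature bundles, the subsumption relation
can be pictured with a few lines of Python (a sketch only; the category
structures of the paper also cover category-valued and set-valued
features):

    # Category x subsumes category y iff every feature specification
    # carried by x is also carried by y.

    def subsumes(x, y):
        return all(f in y and y[f] == v for f, v in x.items())

    np    = {'N': '+', 'V': '-'}
    np_sg = {'N': '+', 'V': '-', 'NUM': 'sg'}
    assert subsumes(np, np_sg) and not subsumes(np_sg, np)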

103. Cognitive Theories of Emotion
Ronald Alan Nash $2.50

A distinguished philosophical tradition holds that emotions are
necessary to rational action, particularly moral action. Cognitive
scientists have mostly ignored the emotions, suggesting a rejection of
this tradition. This paper presents an outline of a theory of emotion,
intended as a first step towards deciding whether intelligent machines
will need emotional states.

I begin with two widely held assumptions: that a theory of emotion
should be a cognitive theory, and that it should account for the
`passivity' of emotion. Two sorts of cognitive theories are considered:
Pure Theories and Hybrid Theories. Hybrid Theories analyze the
passivity of emotion in terms of noncognitive (or nonintentional)
states; the most popular version holds that being emotionally upset or
disturbed is a matter of undergoing certain peripheral physiological
changes and bodily sensations, where these are effects of evaluative
beliefs. Pure Theories, in contrast, analyze emotion solely in terms
of intentional states like evaluations and desires. It is often
assumed that only a Hybrid Theory can account for the passivity of
emotion. I argue that this assumption is doubly mistaken. First, the
Hybrid Theory itself lacks explanatory power: the bodily disturbances
it identifies have important effects on behavior but not on
intentional action. Yet emotional action, I argue, frequently has
distinctive features not shared by dispassionate action. Second, the
version of the Pure Theory I propose has a better account of
passivity: one in terms of parameters I call attentional focus and
overvaluation. This account explains why emotional action is sometimes
unreasonable, ill-advised or impetuous; and it casts considerable
(though not conclusive) doubt on the philosophical tradition that
promotes emotion.

104. Toward an Architecture for Resource-bounded Agents
Martha E. Pollack, David J. Israel, and Michael E. Bratman $2.00

Autonomous, rational agents must both perform means-end reasoning and
weigh alternative courses of action. Hence, it would be
desirable to combine existing AI techniques for automating the former
with a computational instantiation of decision-theoretic techniques
for the latter. However, such a synthesis will fail to be useful
unless the problem of `resource boundedness' is addressed: agents
cannot do arbitrarily large computations in constant time.  We
describe an architecture, under development, for producing rational
behavior in resource-bounded agents. It is based upon an account of
the functional role of an agent's plans in constraining the amount of
further reasoning she must do. An agent designed with this
architecture will sometimes perform `locally irrational' behavior,
in service of `global rationality'.

105. On the Relation Between Default and Autoepistemic Logic
Kurt Konolige $3.00

Default logic is a formal means of reasoning about defaults: what
`normally' is the case, in the absence of contradicting information.
Autoepistemic logic, on the other hand, is meant to describe the
consequences of reasoning about ignorance: what must be true if a
certain fact is `not' known. Although the motivation and formal
character of these two systems are different, a closer analysis shows
that they share a common trait, which is the indexical nature of
certain elements in the theory. In this paper we compare the
expressive power of the two systems. First, we give an effective
translation of default logic into autoepistemic logic; default
theories can thus be embedded into autoepistemic logic. We also
present a more surprising result: the reverse translation is also
possible, so that every set of sentences in autoepistemic logic can be
effectively rewritten as a default theory. The formal equivalence of
these two differing systems is thus established. This analysis gives
an interpretive semantics to default logic, and yields insight into
the nature of defaults in autoepistemic reasoning.
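
To give the flavor of the embedding: under the translation standardly
associated with this result, a default rule with prerequisite $\alpha$,
justification $\beta$, and consequent $\gamma$ corresponds, roughly, to
the autoepistemic formula

	$L\alpha \wedge \neg L\neg\beta \supset \gamma$

where $L\varphi$ is read `$\varphi$ is believed'.  The care needed to
make this precise, and especially to obtain the reverse direction, is
what the paper supplies.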

106. Three Responses to Situation Theory
Terry Winograd $2.50

During the past few years, Jon Barwise, John Perry, and their
colleagues at CSLI have been developing a comprehensive theory of
language and meaning, under the general labels of "situation theory"
and "situation semantics." Over these same years, Fernando Flores
and I have been developing theories of language, computation and
cognition that address many of the same issues from a very different
perspective. This report collects three of my contributions to the
ongoing dialog between these approaches, dealing with questions
concerning the relation among language, cognition and reality.

107. Subjects and Complements in HPSG
Robert Borsley $2.50

At the heart of Head-driven Phrase Structure Grammar (HPSG) is
the idea that grammars can be simplified quite radically if heads
carry explicit information about the categories with which they
combine.  This information is encoded in a feature SUBCAT, which
takes as its value a list of categories. This list indicates both what
complements an item takes and what kind of subject it requires. A
number of considerations suggest that these two kinds of information
should be separated. More precisely, they suggest that SUBCAT
should be restricted to complements and that subjects should be
identified by a separate SUBJ feature.
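
Schematically (the notation below is invented for illustration and is
not HPSG's attribute-value matrix notation), the proposal replaces one
valence list with two:

    # Sketch: a ditransitive verb entry before and after the split.
    # In the single-SUBCAT encoding the subject is simply the last
    # (least oblique) element of the list.

    gives_single = {
        'PHON':   'gives',
        'SUBCAT': ['NP[acc]', 'NP[acc]', 'NP[nom]'],  # subject last
    }

    gives_split = {
        'PHON':   'gives',
        'SUBJ':   ['NP[nom]'],             # subject selected separately
        'SUBCAT': ['NP[acc]', 'NP[acc]'],  # complements only
    }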

108. Tools for Morphological Analysis
Mary Dalrymple, Ronald M. Kaplan, Lauri Karttunen, Kimmo Koskenniemi,
Sami Shaio, Michael Wescoat $10.00

This report consists of two self-contained but related parts. The
first article, "A Compiler for Two-level Phonological Rules" by
Karttunen, Koskenniemi, and Kaplan, describes a system, the TWOL
compiler, that converts a set of phonological or orthographic rules to
finite-state transducers; it also discusses some general issues
concerning the rule formalism and describes the compilation algorithm.
The second part of the report, "A Morphological Analyzer using
Two-Level Rules," by Shaio, Dalrymple, Karttunen, and Wescoat, is a
user's manual for a system called DKIMMO. The purpose of
DKIMMO is to aid the user in developing a computationally implemented
morphological description of a language. It takes as input a set of
transducers produced by the TWOL compiler and a set of lexical
entries with encoded morphotactic principles. DKIMMO allows the
user to test the description by generating surface forms from lexical
input and by producing analyses for a given surface form. It contains
facilities for tracing the analysis and for editing and maintaining
morphological grammars.
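
The kind of lexical-to-surface mapping that such rules describe can be
suggested with a toy Python generator; note that a real two-level rule
is a constraint on lexical:surface symbol pairs compiled into a
finite-state transducer, not a string rewrite as below, and the rule
itself is invented.

    # Toy sketch of the lexical-to-surface mapping described by
    # two-level rules.  One invented orthographic rule: y -> ie
    # before the suffix boundary '+s'.

    def generate(lexical):
        return lexical.replace('y+s', 'ie+s').replace('+', '')

    assert generate('try+s') == 'tries'
    assert generate('walk+s') == 'walks'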

109. Cognitive Significance and the New Theories of Reference
John Perry $2.00

Howard Wettstein argues in his "Has Semantics Rested on a Mistake,"
(`Journal of Philosophy', April 1986) that semantical theories
which take demonstratives and names to contribute individuals to the
propositions the statements containing them express, cannot resolve
Frege's problems about cognitive significance and identity. Wettstein
himself is a "new theorist of reference," and advocates this type of
treatment of names and demonstratives. He concludes that it was a
mistake for semantics to try to resolve Frege's problems. 

I argue that semanticists should worry about cognitive significance,
and that semantical theories of the sort in question can resolve
Frege's problems. I argue that as soon as one accepts that reference
depends on circumstances of utterance and not just the meaning of the
words used, one should accept the consequence that the cognitive
significance of an utterance---what one believes when one believes the
utterance to be true---cannot be identified with the proposition
expressed by the utterance. Once one accepts this, it is possible to
resolve Frege's problems, without abandoning the new theory of
reference.

111. Fourth Year Report of the Situated Language Research Program
CSLI Fourth Year Annual Report free

112. Bare Plurals, Naked Relatives, and Their Kin
Dietmar Zaefferer $2.50

Free relative clauses, as they are referred to in this paper, are
members of a family of related, but distinct constructions, which are
often referred to by the name wh-constructions, because they are
marked in English by the presence of wh-words like `who' or `where'.
The term can be carried over to any language with interrogative words
and can be conceived of as denoting the family of construction types
that have the interrogative words and their homonyms as common
denominator. Therefore, if there are no homonyms, the wh-constructions
are just the constituent interrogatives. Most languages, however, do
have homonyms and so, e.g., both in English and in German, the
headless or free relative constructions are also in the family of
wh-constructions. This paper analyses the relationship of six members
of this family: the weak indefinite, naked (or headless) relatives,
pseudo-clefts, exclamatories, interrogatives, and wh-antecedents of
no-matter-conditionals.

113. Events and "Logical Form"
Stephen Neale $2.00

By combining Davidson's analysis of the logical form of action
sentences with some of the ideas from GB theory, James
Higginbotham has proposed the first serious alternative to Jon
Barwise's `situation semantics' treatment of naked-infinitive
perceptual reports.  The present paper argues that Higginbotham's
theory makes no desirable empirical predictions over and above those
made by Barwise's original proposal, and that it does not succeed in
simultaneously fulfilling its syntactic and semantic obligations.

114. Backward Anaphora and Discourse Structure: Some Considerations
Peter Sells $2.50

This paper examines a class of cases that are counterexamples to most
syntactic accounts of backwards anaphora, such as Principle C of
Chomsky's Government-Binding theory. It is proposed that certain
aspects of discourse information (referred to as SOURCE, 
SELF, and PIVOT) and the overall informational structure of the
utterance affect acceptability in these cases. Some consequences of
the proposed analysis are considered, and some questions remain open.

115. Towards a Linking Theory of Relation Changing Rules in LFG
Lori Levin $4.00

This paper describes a portion of a new theory of relation changing
rules in Lexical Functional Grammar. This theory consists almost
entirely of linking rules which associate grammatical functions with
thematic roles. An innovative aspect of this system is that it also
allows thematic roles to be linked to partially specified grammatical
functions, the full specification of which is determined by
wellformedness conditions of LFG. The use of partially specified
grammatical functions provides a lexical representation of
unaccusative verbs. In this paper, the theory is applied to
passivization in English and Dutch and the presentational-`there'
construction in English.
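
As a rough sketch of linking with partially specified grammatical
functions (the candidate sets and conditions below are invented for
illustration, not Levin's actual rules), each thematic role is mapped
to a set of possible functions, and well-formedness conditions select
the full specification:

    from itertools import product

    # Partially specified linkings: the theme may surface as SUBJ or
    # OBJ; which one is determined by well-formedness conditions.
    PARTIAL = {"agent": {"SUBJ"},
               "theme": {"SUBJ", "OBJ"}}

    def link(roles):
        """Assignments obeying uniqueness and the Subject Condition."""
        out = []
        for combo in product(*(PARTIAL[r] for r in roles)):
            if len(set(combo)) == len(combo) and "SUBJ" in combo:
                out.append(dict(zip(roles, combo)))
        return out

    print(link(["agent", "theme"]))  # theme forced to OBJ
    print(link(["theme"]))           # unaccusative: theme is SUBJ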

116. Fuzzy Logic
L. A. Zadeh $2.50

Fuzzy Logic differs from traditional logical systems in four basic
respects: it allows the use of 

(1) fuzzy predicates exemplified by `small', `soon', `quite expensive', etc.;
(2) fuzzy truth values exemplified by `quite true', `not very true', etc.;
(3) fuzzy quantifiers exemplified by `few', `several', `most',
`usually', etc.; and 
(4) fuzzy predicate modifiers exemplified by `very', `much more',
`somewhat', `quite', etc.  

In fuzzy logic, fuzzy quantifiers are treated as fuzzy numbers which
may be manipulated through the use of fuzzy arithmetic. Furthermore,
the concept of cardinality of a fuzzy set provides a basis for
interpreting a proposition of the form `QA's are B's', in which `A'
and `B' are fuzzy predicates and `Q' is a fuzzy quantifier, as an
imprecise characterization of the proportion of elements which satisfy
`B' among those that satisfy `A', with the understanding that
satisfying is a matter of degree.

The concept of a fuzzy quantifier makes it possible to generalize
syllogistic reasoning to premises of the form `QA's are B's'. This,
in turn, provides a foundation for reasoning with dispositions, that
is, with propositions which are preponderantly, but not necessarily
always, true.

In fuzzy logic, a proposition plays the role of an elastic constraint
on a variable. In this perspective, inference in fuzzy logic may be
viewed as a process of propagation of elastic constraints. In general,
this process reduces to the solution of a nonlinear program.

In sum, fuzzy logic differs from traditional logical systems in that
its major aim is to provide a model for modes of reasoning which are
approximate rather than exact.
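
The relative sigma-count interpretation can be made concrete in a few
lines of Python (the membership degrees and the shape of `most' below
are assumptions, not Zadeh's worked examples):

    # Degrees to which four individuals satisfy predicates A and B.
    A = {"a1": 1.0, "a2": 0.8, "a3": 0.6, "a4": 0.3}
    B = {"a1": 0.9, "a2": 0.7, "a3": 0.2, "a4": 0.1}

    def proportion(A, B):
        """Relative sigma-count of B among A, with min as `and'."""
        inter = sum(min(A[x], B.get(x, 0.0)) for x in A)
        return inter / sum(A.values())

    def most(p):
        # An assumed piecewise-linear fuzzy number for `most'.
        return max(0.0, min(1.0, (p - 0.5) / 0.3))

    p = proportion(A, B)
    print(round(p, 2))        # 0.7: proportion of B's among A's
    print(round(most(p), 2))  # 0.68: degree of `most A's are B's'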

117. Dispositional Logic and Commonsense Reasoning
L. A. Zadeh $2.00

Dispositional logic, or DL for short, is a branch of fuzzy logic which
is concerned with inference from dispositions, that is, propositions
which are preponderantly, but not necessarily always, true. Simple
examples of dispositions are `birds can fly', `snow is white',
and `Swedes are blond'. The importance of the concept of a
disposition derives from the fact that much of commonsense knowledge
may be viewed as a collection of dispositions.

Dispositional logic provides an alternative approach to the theories
of default reasoning, nonmonotonic reasoning, circumscription, and
other widely-used approaches to commonsense reasoning. It is simple
conceptually and is computational rather than purely symbolic in
nature.

In DL, the premises are assumed to be of the form `usually (X is
A)' or `usually (Y is B if X is A)', where `A' and `B' are
fuzzy predicates which play the role of elastic constraints on the
variables `X' and `Y'. Inference from such premises reduces, in
general, to the solution of a nonlinear program. In many cases, an
inference rule in DL has the form of a fuzzy syllogism.

The importance of dispositional logic transcends its role as a basis
for formalization of commonsense reasoning. Viewed in a broader
perspective, it underlies the remarkable human ability to make
rational decisions in an environment of uncertainty and imprecision.
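
A minimal numerical sketch of one such fuzzy syllogism (the
multiplicative chaining rule, with `usually' modeled as an assumed
lower bound on a proportion rather than as a full fuzzy number):

    def chain(usually1, usually2):
        """From `usually (X is A)' and `usually (Y is B if X is A)',
        infer `usually**2 (Y is B)' by the product rule."""
        return usually1 * usually2

    USUALLY = 0.9                   # assumed lower bound
    print(chain(USUALLY, USUALLY))  # 0.81: a weaker disposition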

118. Intention and Personal Policies
Michael Bratman $2.00

I am about to go running; I hear on the radio that the pollen count is
high; I recall my general policy of not running when the pollen count
is high; and so I decide not to run. Gregory asks me for help with his
homework; I recall my general policy of helping him with his homework
when he asks; so I decide to help him. Such general policies shape my
practical reasoning and action. I call these `personal policies'
to indicate that I am confining my attention to the general policies
of individual agents, rather than those of complex organizations like
corporations or governments. Personal policies pervade our lives. A
plausible model of the intelligent activity of beings like us should
make room for and shed light upon the complex roles of such policies
in our lives. 

In previous publications I have argued for a `planning theory' of
intention (see `Intention, Plans, and Practical Reason',
Cambridge: Harvard University Press, 1987). According to that theory,
prior intentions are typically elements in larger but partial plans
for future action.  And these partial plans shape further planning and
action in characteristic ways, thereby helping us to coordinate our
activities over time and with each other, and thereby helping us to
extend the influence of present rational reflection to future action.
Once our model of agency takes such prior intentions seriously, we
acquire a natural way of thinking about personal policies.

In a typical case a future-directed intention concerns some particular
occasion, an occasion that is more or less precisely specified by that
intention. I may intend, for example, to go to Boston `tomorrow'
or `some time next month', or to finish this abstract `today'.
But we may also have intentions that are `general with respect to
their occasions of execution'. We can have an intention to act in a
certain way `whenever' a certain type of occasion is present, or
an intention to act in a certain way on a `regular' basis. For
example, I can intend to help Gregory `whenever' he asks, not to
run `whenever' the pollen count is too high, etc. Thus our
planning theory of intention can be extended to include and thus
characterize personal policies by seeing such policies as intentions
that are general with respect to their occasions of execution. This
paper provides the details of this strategy.

119. Propositional Attitudes and Russellian Propositions
Robert C. Moore $2.50

An adequate theory of propositions needs to conform to two sets of
intuitions that pull in quite different directions. One set of
intuitions concerning entailments (or, more specifically, the `lack'
thereof) among reports of propositional attitudes such as belief,
knowledge, or desire points toward a very fine-grained notion of
proposition. To be the objects of attitudes, propositions must
seemingly be individuated almost as narrowly as sentences of a natural
language. On the other hand, other intuitions seem to require that
propositions not be specifically linguistic entities---rather that
they be proper "semantic" objects, whatever that really amounts to.
Over the last few years, a number of approaches have been proposed in
the attempt to reconcile these two types of intuitions. Perhaps the
simplest approach with any hope of success is the recent revival of
the "Russellian" view of propositions, based on the semantic ideas
expressed in `Principles of Mathematics'. This paper explores the
Russellian view of propositions and its adequacy as a basis for the
semantics of propositional attitude reports. We review some of the
familiar problems of attitude reports and suggest that a number of
other approaches to their solution fall short of the mark. We then
sketch how these problems can be handled by the Russellian approach,
pointing out that it in fact offers a more complete treatment of the
problems than is sometimes realized, and we present a formal treatment
of a logic based on the Russellian conception of a proposition.
Finally we discuss a number of remaining issues, including the need to
distinguish propositional functions from properties and the problem of
proper names in attitude reports.
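
One way to picture the fine-grainedness at issue (a Python toy, not
the paper's formal treatment) is to model Russellian propositions as
structured tuples of relations and individuals, so that coreferential
names yield one proposition while logically equivalent sentences need
not:

    cicero = tully = object()         # one individual, two names

    p1 = ("pred", "orator", cicero)   # `Cicero is an orator'
    p2 = ("pred", "orator", tully)    # `Tully is an orator'
    p3 = ("and", p1, p1)              # equivalent, different structure

    print(p1 == p2)  # True: same constituents, same proposition
    print(p1 == p3)  # False: finer-grained than logical equivalence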

120. Unification and Agreement
Michael Barlow $2.50

In `Unification and Agreement', I argue that many present-day
theories  of agreement, which I call feature-copying
accounts,  are inadequate because they are based on two
false premises: (1)  the source of agreement is fully specified
with respect to agreement features, and (2)  the agreement
relation involves a transfer (in some sense) of agreement features
from the source to the target. Adopting these premises means that some
special provisions must be made to deal with both the over- and
under-specification of agreement targets. 

For example, in Classical Arabic the predicate adjective can be
described as over-specified with respect to pronominal sources.  In
Classical Arabic, the first person pronoun only distinguishes singular
and plural (`ana' and `nah'nu); predicates, however,
exhibit both dual and plural number.  Furthermore, although the first
person plural pronoun does not distinguish morphologically between
masculine and feminine gender, predicate adjectives invariably exhibit
these gender distinctions.  Therefore, in order to maintain a
feature-copying theory of agreement it would be necessary to posit
four homophonous forms for `nah'nu.

Accounts of agreement in unification-based grammars rest on two
premises which avoid such problems: (1) the source of agreement may be
partially specified with respect to agreement features, and (2) the
agreement relation ensures that the agreement features of the source
are compatible with the agreement features of the targets.  Thus,
rather than introducing homophonous forms simply to ensure that the
appropriate features are available to be copied to an agreement
target, an alternative and less problematic assumption is that
information about a nominal is distributed throughout a sentence in
the form of agreement markers.  This assumption fits naturally with an
account of agreement based on merging of information.
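
A small Python sketch of the merging idea (the feature inventories
are invented for illustration): a single underspecified entry for
`nah'nu unifies with any compatible target, so no homophonous forms
need be posited:

    ALL = {"PERS": {1, 2, 3},
           "NUM": {"sg", "dual", "pl"},
           "GEND": {"masc", "fem"}}

    def unify(fs1, fs2):
        """Intersect feature-value sets; None marks incompatibility."""
        out = {}
        for feat in set(fs1) | set(fs2):
            vals = fs1.get(feat, ALL[feat]) & fs2.get(feat, ALL[feat])
            if not vals:
                return None
            out[feat] = vals
        return out

    # `nah'nu: first person, non-singular, gender unspecified.
    pronoun = {"PERS": {1}, "NUM": {"dual", "pl"}}
    adj_fem_dual = {"NUM": {"dual"}, "GEND": {"fem"}}
    adj_sg = {"NUM": {"sg"}}

    print(unify(pronoun, adj_fem_dual))  # compatible: features merge
    print(unify(pronoun, adj_sg))        # None: number clash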

121. Extended Categorial Grammar
Suson Yoo and Kiyong Lee

This work aims at constructing a simple and yet descriptively adequate
grammar that suits a semantic program based on situation theory. For
this we propose an extended version of categorial grammar, called ecg,
consisting of feature-based categories and only two types of
operations: category cancellation based on `subsumption' and category
concatenation constrained by LP statements. ecg may thus be considered
a version of hpsg, but, instead of the notion `head', ecg relies
heavily on quotient (functor) categories, which are treated as
providing in the lexicon the fullest possible information about local
trees. ecg also uses indices to represent interactions between syntax
and semantics, which makes it possible to adopt an equational-solving
approach to the representation and unification of information contents
and thus to obtain the content of a complex expression by solving
equations that represent the contents of its constituent expressions.
In this paper we show that with only the operation of cancellation ecg
successfully accounts for such syntactic phenomena as agreement, case,
and unbounded dependency in English, leaving its semantics to another
work (Yoo and Lee, Situation Semantics for Extended Categorial
Grammar). We understand ecg as an amalgamation of current linguistic
models such as gpsg, hpsg, and some versions of categorial grammar.
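
A toy illustration of cancellation checked by subsumption (invented
categories, not the authors' ecg itself):

    def subsumes(general, specific):
        """Every feature demanded by `general' is matched."""
        return all(specific.get(f) == v for f, v in general.items())

    def cancel(functor, argument):
        """Apply a functor category (result, wanted) to an argument."""
        result, wanted = functor
        return result if subsumes(wanted, argument) else None

    # `walks': a functor looking for a third-singular NP.
    walks = ("S", {"cat": "NP", "num": "sg", "pers": 3})
    she = {"cat": "NP", "num": "sg", "pers": 3}
    they = {"cat": "NP", "num": "pl", "pers": 3}

    print(cancel(walks, she))   # S
    print(cancel(walks, they))  # None: agreement blocks cancellation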

122. The Situation in Logic IV: On the Model Theory of Common
Knowledge
Jon Barwise $2.00

This paper presents a model-theoretic investigation of the
relationships between three different views of common knowledge: the
iterate approach, the fixed-point approach, and the shared-environment
approach.  A novel feature of the approach, from the model-theoretic
point of view, is the use of a metatheory which ensures the existence
of non-wellfounded models, e.g., models that can contain themselves as
objects. This is central to modeling the notions we are attempting to
understand, and suggests new directions in higher-order model theory.
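
The iterate approach, at least on finite well-founded models, is easy
to make concrete (the two-agent model below is an invented example;
the paper's non-wellfounded models are precisely what this sketch
cannot capture):

    # Each agent's knowledge is a partition of the states.
    PARTITIONS = {"a": [{1, 2}, {3}],   # a cannot tell 1 from 2
                  "b": [{1}, {2, 3}]}   # b cannot tell 2 from 3

    def knows(agent, prop):
        """States whose whole information cell lies inside prop."""
        return {s for cell in PARTITIONS[agent]
                for s in cell if cell <= prop}

    def everyone_knows(prop):
        result = prop
        for agent in PARTITIONS:
            result &= knows(agent, prop)
        return result

    def common_knowledge(prop):
        cur = prop
        while True:
            nxt = everyone_knows(cur)
            if nxt == cur:          # fixed point reached
                return cur
            cur = nxt

    print(common_knowledge({1, 2, 3}))  # {1, 2, 3}
    print(common_knowledge({1, 2}))     # set(): iteration empties out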

123. Unaccusative Verbs in Dutch and the Syntax-Semantics Interface
Annie Zaenen $3.00

It has come to be assumed that in Dutch both impersonal passives and
the selection of the auxiliary zijn (to be) indicate that a verb is
unaccusative. This paper establishes that the two tests do not pick
out the same subset of verbs. It also gives a semantic
characterization of both subsets; verbs conjugated with zijn are telic
whereas those conjugated with hebben (to have) are atelic; verbs that
allow an impersonal passive are both atelic and volitional. It is also
shown that auxiliary selection should be considered to be in part
determined by the Aktionsart of the verb, whereas the availability of
the impersonal passive is in part determined by properties of the
sentence as a whole. The discussion is limited to clearly intransitive
verbs.
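
A toy encoding of the semantic characterization (the verb
classifications are illustrative):

    # verb: (telic, volitional)
    VERBS = {"arriveren": (True, True),
             "werken": (False, True),
             "stinken": (False, False)}

    def auxiliary(verb):
        telic, _ = VERBS[verb]
        return "zijn" if telic else "hebben"

    def impersonal_passive_ok(verb):
        telic, volitional = VERBS[verb]
        return (not telic) and volitional

    for v in VERBS:
        print(v, auxiliary(v), impersonal_passive_ok(v))
    # `arriveren' takes zijn but rejects the impersonal passive;
    # `werken' takes hebben yet allows it: the two tests pick out
    # different subsets of verbs.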

124. What is Unification? A Categorical View of Substitution, Equation
and Solution
Joseph A. Goguen $3.50

From a general perspective, a `substitution' is a transformation from
one space to another, an `equation' is a pair of such substitutions,
and a `solution' to an equation is a substitution that yields the same
value when composed with (i.e. when substituted into) the
substitutions that constitute the given equation. In some special
cases, solutions are called `unifiers'. Other examples include Scott
domain equations, unification grammars, type inference, and
differential equations. The intuition that the composition of
substitutions should be associative when defined, and should have
identities, motivates a general concept of `substitution system' based
on category theory. Notions of morphism, congruence, and quotient are
given for substitution systems, each with the expected properties, and
some general cardinality bounds are proved for most general solution
sets (which are minimal sets of solutions with the property that any
other solution is a substitution instance of one in the set). The
notions of equation and solution are also generalized to systems of
equations, i.e. to constraint solving. This paper is self-contained as
regards category theory, and indeed, could be used as an introductory
tutorial on that subject.
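
The opening intuition can be sketched in a few lines (first-order
term syntax assumed, rather than the paper's categorical setting):

    def apply(sub, term):
        if isinstance(term, str):              # a variable
            return sub.get(term, term)
        op, args = term                        # a compound term
        return (op, [apply(sub, a) for a in args])

    def compose(s1, s2):
        """Apply s2 first, then s1; associative, with {} as identity."""
        out = {v: apply(s1, t) for v, t in s2.items()}
        out.update({v: t for v, t in s1.items() if v not in out})
        return out

    lhs = ("f", ["x", ("g", ["y"])])
    rhs = ("f", [("g", ["z"]), ("g", ["z"])])
    unifier = {"x": ("g", ["z"]), "y": "z"}

    # The unifier solves the equation lhs = rhs:
    print(apply(unifier, lhs) == apply(unifier, rhs))  # True
    print(compose(unifier, {}) == unifier)             # identity law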

125. Types and Tokens in Linguistics
Sylvain Bromberger $3.00

This paper takes as its point of departure three widely---and
rightly---accepted truisms: (1) that linguistic theorizing rests on
information about types, that is, word types, phrase types, sentence
types, and the like; (2) that linguistics is an empirical science and
that the information about types on which it rests is empirical
information, that is, information obtained by attending with one's
senses to something---normally tokens (utterances); (3) that the facts
that linguistics seeks to uncover---for instance, facts about the
lexicon or the range of sentences in a given language---follow, in
part at least, from facts about people's mental makeup. The paper
seeks to reconcile (1) with (2)
and with (3). Few, if any, practitioners feel the need for such a
reconciliation, but wide-eyed philosophers like me who think (rightly)
that types are abstract entities, that is, nonspatial, nontemporal,
unobservable, causally impotent entities, have trouble seeing how
information about such entities can be obtained by attending to
spatial, temporal, observable entities, though they cannot deny that
it can; and they have even greater trouble seeing how features of
minds (some misguided souls would say brains) can have repercussions
in the realm of abstract entities.

The reconciliation proposed in the paper is based on two conjectures.
First, that tokens of a type (for instance all the utterances of
`cat,' or all the utterances of `Mary sank a ship') form what I call a
"quasi-natural kind," a grouping like that formed by all the samples
of a chemical substance (for instance, all the samples of mercury).
Second, that tokens of different types form what I call "categories,"
a grouping like that formed by the samples of different chemical
substances (mercury, water, gold, sulfuric acid, etc.). The
possibility of inferring facts about types from facts about tokens
follows from these two conjectures like day follows night. And so does
the conclusion that linguistics is grounded on mental realities.  The
conjectures, if true, also reveal that the subject matter of
linguistics, like the subject matter of any natural science, is
defined by configurations of questions as well as by the makeup of the
world.

The paper is an essay in the philosophy of science as it applies
to linguistics.

126. Determination, Uniformity, and Relevance: Normative Criteria for
Generalization and Reasoning by Analogy
Todd Davies $4.50

127. Modal Subordination and Pronominal Anaphora in Discourse
Craige Roberts $4.50

Modal subordination involves the apparent extension of the scope of
modal operators intersententially across segments of a discourse.
Besides appearing to contradict otherwise well-supported
generalizations about the scope of such operators, this phenomenon
presents problems both for the analysis of the logical entailments of
individual sentences in such contexts, and for theories of anaphora in
discourse.  An account of modal subordination is proposed which
involves extending Discourse Representation Theory to include modal
operators.