[sci.lang] English grammar

goldberg@su-russell.ARPA (Jeffrey Goldberg) (05/30/87)

In article <2112@husc6.UUCP> hughes@endor.UUCP (Brian Hughes) writes:
>In article <1116@houdi.UUCP> marty1@houdi.UUCP (M.BRILLIANT) writes:
>	(summarized)
>>In article <13263@watmath.UUCP>, erhoogerbeet@watmath.UUCP writes:
>>> ...
>>> Is there a Backus-Naur Form for the English language itself or is this too
>>> complicated? ... Basically, what I am asking is, is it possible to do syntactic
>>> checking as if "compiling" a sentence with rules set down in some BNF?

>	Natural language is not context free (though some people disagree
>on this). BNF formalisms cannot deal with context sensitive languages

I don't think that there is any serious disagreement here.  The
work done by Culy on Bambara reduplication and by Shieber on Swiss
German cross-serial dependencies has convinced the last holdouts
for context-freeness (Geoff Pullum, Gerald Gazdar, and their
students: me, Culy, etc.).
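
To see the formal point: Bambara reduplication and the Swiss German
constructions both instantiate the "copy language" { ww : w a string },
which is provably not context-free, whereas the nested (mirror-image)
dependencies that CFGs do handle correspond to palindrome-like strings.
A quick sketch in Python (my own toy illustration, not Culy's or
Shieber's data):

def is_copy(s):
    # The copy language { ww }: trivial to recognize procedurally,
    # but provably not generable by any CFG, hence by any BNF.
    n = len(s)
    return n % 2 == 0 and s[:n // 2] == s[n // 2:]

def is_mirror(s):
    # Nested dependencies ARE context-free: a CFG such as
    #   S -> a S a | b S b | (empty)
    # generates exactly these mirror-image strings.
    n = len(s)
    return n % 2 == 0 and s[:n // 2] == s[n // 2:][::-1]

assert is_copy("abab")        # cross-serial: a b ... a b
assert not is_copy("abba")
assert is_mirror("abba")      # nested: a b ... b a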

>>About 30 years ago when I was at MIT doing graduate study in EE, my
>>wife was talking with a guy named Chomsky who wanted to do machine
>>translation.  The effort resulted in new approaches to English grammar,
>>but not in machine translation.

>	While in a strict sense this is true, Chomsky's transformational
>grammar seems to be almost universally accepted as the basis upon which to
>build models that deal with the syntax of natural languages. This is true
>for computerized models as well as pure abstract models.

This is hardly true at all.  It is true that "generative grammar" is
nearly universally accepted, and that does come from Chomsky.  While
the most popular current generative theory is transformational
(Government and Binding theory), the role of transformations has
been reduced radically, and much more emphasis is now placed on
interacting well-formedness conditions on different levels of
representation.

Two substantial minority theories, Generalized Phrase Structure
Grammar and Lexical-Functional Grammar, do not employ
transformations.

A summary of these three theories can be found in "Lectures
on Contemporary Syntactic Theories:  An Introduction to
Government-Binding Theory, Generalized Phrase Structure Grammar,
and Lexical-Functional Grammar" by Peter Sells, published by the
Center for the Study of Language and Information and distributed
by the University of Chicago Press.
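
To give the flavor of such interacting well-formedness conditions
(a minimal sketch of my own in Python, not code from any GPSG or LFG
implementation): agreement can be enforced by unifying feature
structures, with no movement or transformation anywhere.

def unify(f1, f2):
    # Unify two flat feature structures (dicts); return the merged
    # structure, or None on a feature clash.
    result = dict(f1)
    for feat, val in f2.items():
        if feat in result and result[feat] != val:
            return None      # e.g. NUM=sg against NUM=pl
        result[feat] = val
    return result

# Subject-verb agreement as a well-formedness condition:
# S -> NP VP is licensed only if the NP and VP features unify.
def agree(np, vp):
    return unify({"NUM": np["NUM"]}, {"NUM": vp["NUM"]}) is not None

np_dogs  = {"CAT": "NP", "NUM": "pl"}   # "dogs"
vp_bark  = {"CAT": "VP", "NUM": "pl"}   # "bark"
vp_barks = {"CAT": "VP", "NUM": "sg"}   # "barks"

assert agree(np_dogs, vp_bark)          # "dogs bark":   licensed
assert not agree(np_dogs, vp_barks)     # "*dogs barks": feature clash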

I have seen implementations based on LFG and GPSG (and an
offshoot of GPSG), as well as on some other nontransformational
models.  I have seen only one GB-based parser; it was very
clever, but it parsed only four sentences.

None of these theories was constructed with computer processing in
mind, but it does turn out that it is often easier to build a
parser based on nontransformational representations.  None of the
authors of these theories would claim that their theory is a
better linguistic theory because of this property.
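
To connect this back to the original question about "compiling" a
sentence against a BNF: a phrase-structure fragment maps directly
onto a parser.  Here is a toy of my own (not anyone's published
grammar), with the BNF in comments:

# <S>  ::= <NP> <VP>
# <NP> ::= "the" <N> | <N>
# <VP> ::= <V> | <V> <NP>
NOUNS = {"dog", "cat", "dogs"}
VERBS = {"sees", "sleeps", "sleep"}

def parse_np(words, i):
    # Return the position after an NP starting at i, or None.
    if i < len(words) and words[i] == "the":
        i += 1
    return i + 1 if i < len(words) and words[i] in NOUNS else None

def parse_vp(words, i):
    # Return the position after a VP starting at i, or None.
    if i < len(words) and words[i] in VERBS:
        i += 1
        j = parse_np(words, i)       # optional object NP
        return j if j is not None else i
    return None

def grammatical(sentence):
    words = sentence.split()
    j = parse_np(words, 0)
    return j is not None and parse_vp(words, j) == len(words)

assert grammatical("the dog sees the cat")
assert grammatical("dogs sleeps")   # accepted!  a bare CFG checks no
                                    # agreement; that is what feature
                                    # conditions (above) are for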

>>> As I understand it so far, natural language processing would have at least
>>> two levels (syntactic, semantic) and that syntactic checking level would
>>> be the basis of the other.

I have seen parsers that build up semantic representations along
with the syntax, in which there is no sense that the syntax is
prior.
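
Here is the kind of thing I mean, as a minimal sketch of my own
(toy lexicon and logical forms, not from any particular system):
each rule pairs a syntactic check with a semantic action, and the
logical form is assembled in the same pass as the parse.

LEXICON = {
    "fido":  ("NP", "fido"),
    "felix": ("NP", "felix"),
    "sees":  ("V",  "see"),
    "barks": ("V",  "bark"),
}

def parse(words):
    # Parse "NP V" or "NP V NP"; build syntax and semantics together.
    entries = [LEXICON.get(w) for w in words]
    if None in entries:
        return None
    cats = [c for c, _ in entries]
    sems = [s for _, s in entries]
    if cats == ["NP", "V"]:
        return ("S", "%s(%s)" % (sems[1], sems[0]))            # V'(NP')
    if cats == ["NP", "V", "NP"]:
        return ("S", "%s(%s,%s)" % (sems[1], sems[0], sems[2]))
    return None

assert parse("fido barks".split()) == ("S", "bark(fido)")
assert parse("fido sees felix".split()) == ("S", "see(fido,felix)")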

Again, I am directing follow-ups to my follow-up to sci.lang.

-- 
Jeff Goldberg 
ARPA   goldberg@russell.stanford.edu
UUCP   ...!ucbvax!russell.stanford.edu!goldberg

tfra@ur-tut.UUCP (Tom Frauenhofer) (06/01/87)

[Et tu, line-eater?]

Actually, there is a (one-paragraph) discussion comparing BNF versus Transition
Networks in the latest issue of AI Magazine (Volume 8, Number 1).  It is part
of the article "YANLI: A Powerful Natural Language Front-End Tool" by
John C. Glasgow II.  It even includes an example of a BNF and a Transition
Network representation of the same grammar fragment.
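
For anyone without the magazine handy, the comparison looks roughly
like this (my own toy fragment in Python, not reproduced from
Glasgow's article): the same NP rule written as BNF and as a
transition network driven by a small interpreter.

# BNF:  <NP> ::= <DET> <ADJ>* <NOUN>
CATEGORY = {"the": "DET", "a": "DET",
            "big": "ADJ", "old": "ADJ",
            "dog": "NOUN", "cat": "NOUN"}

# The same rule as a transition network:
# state -> list of (arc label, next state); "final" accepts.
NP_NET = {
    "start": [("DET", "q1")],
    "q1":    [("ADJ", "q1"), ("NOUN", "final")],   # ADJ loop = ADJ*
}

def traverse(net, words):
    # Accept iff the words drive the net from "start" to "final".
    state = "start"
    for w in words:
        for label, nxt in net.get(state, []):
            if CATEGORY.get(w) == label:
                state = nxt
                break
        else:
            return False
    return state == "final"

assert traverse(NP_NET, "the big old dog".split())
assert not traverse(NP_NET, "dog the".split())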