[net.nlang] American Languages

asa@rayssd.UUCP (11/09/83)

Newsgroups: net.nlang
Subject: Re: American languages
References: <1013@uvacs.UUCP>

The reason Hopi was cited by Whorf in his book was that it lacked
separate verb forms for distinguishing between time and growth. The
Hopi verbs only express whether or not the thing referred to is visible
above the ground, has reached a certain height, or some other aspect
of growth, as I remember it. This is a very interesting contrast with
English. What if you had to say "It will become visible tomorrow", instead
of "It will happen tomorrow"? Facts such as these, and the observed 
corollaries in behavior, as in the case of the "empty" gasoline drums, led
to the Sapir-Whorf "Relativity Hypothesis" for languages in general. I must
caution that I may have the Hopi language confused with Navaho, which
Whorf also cited.

jeffy@bnl.UUCP (Jeff Mattson) (11/11/83)

How can we discuss American languages without mentioning the third most
used language in America; namely, American Sign Language (ASL).  Despite
anything you may have heard, it's more than random gestures and mime. 
It's a legitimate language.  Some interesting facts about it:

* It has no verb "to be."  

* Verbs don't express tense.  Instead, adverbs for time are usually
  placed in the front of the sentence.  

* It's a great deal like Latin or German in that the order of the subject
  and object in a sentence doesn't matter.  In place of nasty declensions
  or endings, in ASL there are certain facial expressions.  To show 
  some word is the object, for example, you raise your eyebrows like
  you just saw something interesting.

* There's no gender at all.  For pronouns, you point.  If the person is 
  nearby, you point toward him/her/it.  Otherwise, you establish a point
  in space for the person, and every time you refer to him or her, you point in that
  direction.

Well, the computer is acting awfully temperamental, so I'll be going.

-Jeff Mattson

asa@rayssd.UUCP (11/17/83)

Subject: American Languages
Newsgroups: net.nlang
   Reply to rene's question: what is the case of the empty drums?

  Sorry I didn't get back to you before this, but I have not had access.

  The case of the empty drums describes a sequence of events which
contributed to the generation of the relativity hypothesis for natural
languages. I will summarize it for you, and those who spent time using
Whorf's book as a night-time sedative can skip it.


  First, let's set the context. Whorf was an industrial engineer, and
pioneered the field of safety engineering as a result of his work in risk
assessment for insurance companies.

   Now the story: One of his inspection trips took him to a company which
used gasoline as a fuel. They purchased the gasoline in 55-gallon drums,
and stored it in outdoor open bins in which coal had previously been
stored. One bin was used for the unused drums, one bin was used for the
drums from which the fuel had been poured. Being safety conscious, the
company had posted three signs: Danger, Full, and Empty. The signs were
located as follows: the signs Danger and Full were over the bin with the
unused fuel. The sign Empty was over the bin where the barrels with no
gasoline liquid in them were stored. 

   So what? Well, there was a group of workmen who regularly gathered
here on their break, to smoke and have a cup of coffee. Out of respect
for the semantics of the signs, they did their smoking in the bin
labelled Empty. This bizarre behavior, given the extreme danger of
igniting gasoline fumes, started Whorf thinking about how "meanings"
attached to words in the language can impact behavior. In this case, the
word "full" carries such strong implications of the presence of a solid
or liquid that the speaker can only relate to it in those terms. This
association is so strong that one cannot conceive of a natural extension
to cover the presence of invisible, colorless, maybe odorless substances
of the type 'gaseous'. Trying to do this would violate the whole concept,
since nothing would ever be 'empty' in the sense in which that word is
intended. Whorf noted that this was a case in which a brand-new single
word would be useful, provided that it conveyed the emotion of danger
with its use. He could think of no simple way to do this, unless he could
create one from roots familiar to users of the language, and even then he
would have trouble with the emotional neutrality of the invented word,
until social usage had created a strong association with the concept
of danger. 

   This incident, in combination with the limitations of expression he
found in other languages, led him to articulate, in association with
Sapir, the Sapir-Whorf Relativity Hypothesis for language. This states
that the process of formalizing a language results in a restriction on
what can be thought, and beyond that, on the behavior patterns observed
because of the conceptual constraints imposed by agreed 'definitions'.
All those who believe in formalizing language in some mathematical
sense may thus be proposing that we give up more than we gain.

   Now I'd like to spend a minute on a subject which puzzles me very much,
and with which you might help me. Modern Grammarians have skipped the
first step in the description of a sentence, as it was described in the
old days. This is "A sentence is composed of a subject and a predicate".
The concept of "subject" seems to have gotten lost. To me that is sad,
because it helps so much in explaining how sentences get to be the way
they are, and how they could be easily analyzed, from a semantics point
of view. In this view, a subject is a unique cognate (my word for
cognitive entity), which may be described (or denoted) by a word, phrase,
or clause, in the same way a noun can be defined. The form used depends,
in part, on how abstract the cognate is, and how much information must be
added to a base-word (such as "man") to assure that the hearer (reader)
selects the correct member of a set of possible subjects of that type.
Thus, in the sentence "The man wearing the green hat won the jackpot",
the semantic functions of 'the' and 'wearing the green hat' are quite
clear. They point one to the specific person being discussed. Similarly,
in the sentence "Wearing a green hat makes him look ridiculous", the
function of "wearing a green hat" is quite clear. It points to the attribute
which is the subject of discussion.

   If one were not preoccupied with
linear progression through a sequence of words in the fashion of a
sequential machine, it seems as though the process of parsing sentences
could be simplified enormously. All one needs is an allowance for a
temporary store which holds enough data to allow basic divisions to be
determined, with successive passes to simplify the significant units.
The fact that significant units exist whose functions are clear, such as
the pointer function, might be used to allow some simplification 'on the
fly', in a way similar to the way humans seem to operate, in the sense
that once the cognate pointed to is unambiguously identified, a simple
marker (representing the focus of attention) could be substituted for it
before the remainder of the input is processed. This would still be a
step-wise, group-processing approach, however, and might still require a
multi-pass operation. Even so, it seems to allow for analysis techniques
which are conceptually much cleaner, and which capture more of our
intuitions about language in a more natural way. For that reason, I
cannot understand why no one except Winograd has used an approach
resembling it. Anyone out there have any insights?
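
To make the marker-substitution idea a little more concrete, here is a
minimal sketch in Python. It is purely illustrative: the category tags,
the rule set, and the example sentence are all invented for the purpose,
and it is not meant as anything Whorf or Winograd actually did. The idea
is simply a temporary store of tagged units, repeated passes that replace
any unit whose function is clear with a single marker, and only then a
division of the result into the old-fashioned subject and predicate.

# Sketch of a multi-pass "marker substitution" parse.  The rules and tags
# are invented for illustration; a real system would need a far richer
# grammar and a lexicon.

PATTERNS = [
    # (marker produced, sequence of categories it replaces)
    ("NP",  ["DET", "N"]),         # "the man", "the jackpot"
    ("NP",  ["DET", "ADJ", "N"]),  # "the green hat"
    ("PTR", ["V_ING", "NP"]),      # "wearing [NP]" -- a pointer phrase
    ("NP",  ["NP", "PTR"]),        # "[the man] [wearing the green hat]"
]

def one_pass(store):
    """Scan the temporary store once; replace the first matching span
    with a single marker.  Returns (new_store, changed)."""
    cats = [cat for cat, _ in store]
    for marker, pattern in PATTERNS:
        n = len(pattern)
        for i in range(len(cats) - n + 1):
            if cats[i:i + n] == pattern:
                text = " ".join(word for _, word in store[i:i + n])
                return store[:i] + [(marker, text)] + store[i + n:], True
    return store, False

def parse(store):
    """Run passes until nothing more simplifies, then split at the
    finite verb into subject and predicate."""
    changed = True
    while changed:
        store, changed = one_pass(store)
    for i, (cat, _) in enumerate(store):
        if cat == "V":
            return store[:i], store[i:]
    return store, []

sentence = [("DET", "the"), ("N", "man"), ("V_ING", "wearing"),
            ("DET", "the"), ("ADJ", "green"), ("N", "hat"),
            ("V", "won"), ("DET", "the"), ("N", "jackpot")]

subject, predicate = parse(sentence)
print("subject:  ", subject)    # [('NP', 'the man wearing the green hat')]
print("predicate:", predicate)  # [('V', 'won'), ('NP', 'the jackpot')]

Note that the pointer phrase "wearing the green hat" collapses into the
marker for "the man" before the predicate is ever examined, which is the
sort of simplification 'on the fly' described above.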

morgan@uicsl.UUCP (12/01/83)

#R:rayssd:-27400:uicsl:8600032:000:1448
uicsl!morgan    Nov 30 18:06:00 1983

Your definition and discussion conflate things that need to be
kept separate, including at least these:

	grammatical subject	(the traditional notion)
	discourse topic
	referring expression

You begin by talking about subjects, then give a definition that
would apply to any noun phrase that has reference (i.e.
referring expression), then you mention 'focus of attention',
which might be an appropriate way to look at discourse topics.
The three are quite distinct, at least in English.  What
would count as a grammatical subject need not be the discourse
topic, as when one says in a discussion of John,

	Everybody likes John

where grammarians new and old would agree that 'everybody' is
the subject, though John (NB not 'John') is the focus of
attention.  In fact the grammatical subject need not even
refer to anything; many languages have "dummy" subjects, as
in

	It's raining
	It's a long way to Chicago
	It's illegal to sleep in subways

where, most grammarians would agree, 'it' is the subject,
but certainly not the topic, since as a matter of fact it
doesn't refer to anything, but is merely an empty morpheme
to satisfy the English requirement that every declarative
clause have a grammatical subject.

In fact, it appears that we need all three of these notions
to make sense of the syntax of natural languages.  But
giving satisfactory explicit definitions of each is
extremely difficult, and worth pursuing, if you're interested.
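
To make the three-way distinction concrete, here is a tiny Python sketch.
The representation and field names are invented only to show that the
three notions can come apart; they are not any standard piece of
linguistic machinery.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ClauseAnalysis:
    text: str
    grammatical_subject: str          # the traditional syntactic notion
    discourse_topic: Optional[str]    # what the surrounding discourse is about
    referring_expressions: List[str]  # phrases that actually pick out entities

# In a discussion of John: 'everybody' is the subject, John is the topic.
ex1 = ClauseAnalysis(
    text="Everybody likes John",
    grammatical_subject="everybody",
    discourse_topic="John (the person under discussion)",
    referring_expressions=["everybody", "John"],
)

# Dummy subject: 'it' satisfies the subject requirement but refers to
# nothing and is not the topic of anything.
ex2 = ClauseAnalysis(
    text="It's raining",
    grammatical_subject="it",
    discourse_topic=None,
    referring_expressions=[],
)

print(ex1)
print(ex2)

Collapsing those three fields into a single "subject" slot is exactly the
conflation pointed out above.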