[net.ai] Linguistic Structuring of Concepts

PENTLAND@SRI-AI.ARPA (04/05/84)

           [Forwarded from the CSLI bboard by Laws@SRI-AI.]

Issues In Language, Perception and Cognition
WHO: Len Talmy, Cognitive Science Program and German Dept., UC Berkeley
WHEN: Monday April 9, 12:00 noon
WHERE: Room 100, Psychology

                How Language Structures its Concepts

Languages have two kinds of elements: open-class, comprising the roots of
nouns, verbs, and adjectives, and closed-class, comprising all inflections,
particle words, grammatical categories, and the like.  Examination of a
range of languages reveals that closed-class elements refer exclusively to
certain concepts, and seemingly never to concepts outside those (e.g.,
inflection on nouns may indicate number, but never color).  My idea is that
all closed-class elements taken together constitute a very special group:
they code for a fundamental set of notions that serve to structure the
conceptual material expressed by language.  More particularly, their
references constitute a basic notional framework, or scaffolding, around
which is organized the more contentful conceptual material represented by
open-class (i.e., lexical) elements.  The questions to be addressed are:

  a) Exactly which notions are specified by closed-class elements, and
     which notions are excluded?
  b) What properties are shared by the included notions and absent from
     the excluded ones?
  c) What functions are served by this design feature of language, i.e.,
     the existence in the first place of a division into open- and
     closed-class subsystems, and then the particular character that
     these have?
  d) How does this structuring system specific to language compare with
     those in other cognitive subsystems, e.g., in visual perception or
     memory?

With question (d), this linguistic investigation opens out into the issue
of structuring within cognitive contents in general, across cognitive
domains.