[mod.ai] For posting on mod.ai 3rd of 4

harnad@mind.UUCP (Stevan Harnad) (10/26/86)

In mod.ai, Message-ID: <8610190504.AA08083@ucbvax.Berkeley.EDU>,
17 Oct 86 17:29:00 GMT, KRULWICH@C.CS.CMU.EDU (Bruce Krulwich) writes:

>	i disagree...that symbols, and in general any entity that a computer
>	will process, can only be dealt with in terms of syntax. for example,
>	when i add two integers, the bits that the integers are encoded in are
>	interpreted semantically to combine to form an integer. the same
>	could be said about a symbol that i pass to a routine in an
>	object-oriented system such as CLU, where what is done with
>	the symbol depends on its type (which i claim is its semantics)

Syntax is ordinarily defined as formal rules for manipulating physical
symbol tokens in virtue of their (arbitrary) SHAPES. The syntactic goings-on
are semantically interpretable; that is, the symbols are also
manipulable in virtue of their MEANINGS, not just their shapes.
Meaning is a complex and ill-understood phenomenon, but it includes
(1) the relation of the symbols to the real objects they "stand for" and
(2) a subjective sense of understanding that relation (i.e., what
Searle has for English and lacks for Chinese, despite correctly
manipulating its symbols). So far the only ones who seem to
do (1) and (2) are ourselves. Redefining semantics as manipulating symbols
in virtue of their "type" doesn't seem to solve the problem...
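To make the point about shape versus meaning concrete, here is a
minimal sketch of binary addition carried out purely as formal symbol
manipulation, in the spirit of the quoted passage. (Python is used
only as a convenient notation; the language and all the names below
are my illustrative choices, not anything from Krulwich's system.)
The rules mention nothing but the token shapes '0' and '1'; reading
the output as the sum of two integers is something we, the readers,
supply.

# Hypothetical illustration: addition as pure shape-based token
# manipulation.  Each rule maps the shapes of two bit tokens and a
# carry token to the shapes of a sum token and a new carry token.
RULES = {
    ('0', '0', '0'): ('0', '0'),
    ('0', '1', '0'): ('1', '0'),
    ('1', '0', '0'): ('1', '0'),
    ('1', '1', '0'): ('0', '1'),
    ('0', '0', '1'): ('1', '0'),
    ('0', '1', '1'): ('0', '1'),
    ('1', '0', '1'): ('0', '1'),
    ('1', '1', '1'): ('1', '1'),
}

def add_bitstrings(a: str, b: str) -> str:
    """Rewrite two token strings into a third by table lookup alone."""
    width = max(len(a), len(b))
    a, b = a.rjust(width, '0'), b.rjust(width, '0')
    carry, out = '0', []
    # Work right to left, applying the shape rules at each position.
    for x, y in zip(reversed(a), reversed(b)):
        s, carry = RULES[(x, y, carry)]
        out.append(s)
    if carry == '1':
        out.append('1')
    return ''.join(reversed(out))

print(add_bitstrings('101', '011'))  # prints '1000'; "5 + 3 = 8" only under OUR reading

Nothing in the table or the loop relates the tokens to the quantities
they "stand for", and nothing in the program understands that
relation; that is just (1) and (2) above, and it is exactly what is
at issue.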

>	i think that the reason that computers are so far behind the
>	human brain in semantic interpretation and in general "thinking"
>	is that the brain contains a hell of a lot more information
>	than most computer systems, and also the brain makes associations
>	much faster, so an object (i.e., a thought) is associated with
>	its semantics almost instantly.

I'd say you're pinning a lot of hopes on "more" and "faster." The
problem just might be somewhat deeper than that...

Stevan Harnad
princeton!mind!harnad
harnad%mind@princeton.csnet
(609)-921-7771