[comp.ai] What IS a symbol?

dan-hankins@cup.portal.com (Daniel B Hankins) (04/22/89)

In article <11313@bcsaic.UUCP> ray@bcsaic.UUCP (Ray Allis) writes:

>>>A digital computer is the archetypical physical symbol system; it
>>>manipulates symbols according to specified relationships among them,
>>>with absolute disregard for whatever they symbolize. 
>> 
>>     This is essentially the same error.  If they don't symbolize
>>anything, then they aren't symbols.  No representation, no symbol.
>
>They DO symbolize something; *human experience*, but the computer never
>gets to "see" that!  Most AI and "simulation of cognition" people insist
>that symbols are enough!

     They don't symbolize human experience to the computer.  They symbolize
human experience to other humans.  To the computer they are merely more
signals to move around and combine.  But I begin to see where you are
coming from, and will address it shortly.


>The nature of a symbol is that there is *no necessary* relationship
>between it and whatever it symbolizes.  The symbol "red" doesn't *cause*
>redness, or always occur in conjunction with redness, or in fact have any
>other relationship at all to the subjective experience of redness, except
>that which we as symbol-users *assigned*.

     It begins to seem to me that you are approaching AI from the viewpoint
of one who sees only the LISP and Prolog programs.  My viewpoint leans
(considerably) towards the biological connectionist camp.  A connectionist
system differs from the others in that the values of the various
quantities (I suppose some might call them symbols) are _not_ assigned by
the programmer.  Rather, they are built up from continuing input and the
state of other nodes in the network.  They 'represent' external reality,
in the sense of the word 'represent' that you are using.

     This is important.  Traditional AI systems deal in English (or
whatever language) words as their base symbol system.  The system is taught
to manipulate these symbols on the basis of what the programmer has in mind
as the meaning of the symbols.  Therefore, in some important sense, any
meaning in the computer program was put there by the programmer.

     However, Neural Networks are qualitatively different.  They use
physical quantities (like synapse conductivities and neuron activation
levels) as their base symbol system.  These 'symbols' then really do end up
representing (in the sense you defined) external reality, because any
'meaning' they acquire is a result of external experience rather than
implicit assignment of meaning by a programmer.
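     The point can be sketched in a few lines of modern code.  This is a
minimal illustration, not anything from the discussion above: a single unit
with a toy Hebbian update, where the weights start uniform and uncommitted,
and any structure they acquire comes only from correlations in the input
stream, not from the programmer assigning meanings.  All names and the
specific update rule here are my own illustrative choices.

```python
import random

random.seed(0)

N = 4                    # number of input lines ("sensory" signals)
rate = 0.1               # learning rate
weights = [0.01] * N     # tiny uniform start: no meaning assigned

def hebbian_step(inputs, weights, rate):
    """Classic Hebbian rule: strengthen each weight in proportion to
    the unit's output times the activity on that input line."""
    output = sum(w * x for w, x in zip(weights, inputs))
    return [w + rate * output * x for w, x in zip(weights, inputs)]

# The 'environment': lines 0 and 1 always fire together; line 2 fires
# alone; line 3 never fires.
for _ in range(50):
    pattern = random.choice([[1.0, 1.0, 0.0, 0.0],
                             [0.0, 0.0, 1.0, 0.0]])
    weights = hebbian_step(pattern, weights, rate)

# The correlated pair ends up with the strongest weights, and the silent
# line stays at its starting value: a 'representation' of the input's
# statistical structure, built from experience rather than assigned.
print(weights)
```

Nothing in this sketch "knows" that lines 0 and 1 belong together; that
regularity ends up encoded in the weights purely because it was present in
the experience the network was given, which is exactly the sense of
'representation' at issue.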


Dan Hankins

mgresham@artsnet.UUCP (Mark Gresham) (04/22/89)

In article <11313@bcsaic.UUCP> ray@bcsaic.UUCP (Ray Allis) writes:
>
>In article <16876@cup.portal.com> dan-hankins@cup.portal.com
>(Daniel B Hankins) writes:

...and back and forth.

My $.02 worth:

A 'symbol' is 'chunked' information with loose ends.

That's all I have to say about it.  Use your intuition and imagination.

Cheers,

--Mark

=======================================
Mark Gresham  ARTSNET  Atlanta, GA, USA
E-mail:      ...gatech!artsnet!mgresham
or:         artsnet!mgresham@gatech.edu
=======================================