[comp.ai] Perception and Logic

kp@uts.amdahl.com (Ken Presting) (03/23/90)

In article <12532@venera.UUCP> smoliar@vaxa.isi.edu.UUCP (Stephen Smoliar) writes:
>In article <a9Mo024=94yf01@amdahl.uts.amdahl.com> kp@amdahl.uts.amdahl.com (Ken
>Presting) writes:
>>
>>The most important aspect of logic for present purposes is the immense
>>distance between the *practice* of logic and the *process* of perception.
>>
>So we DO have some agreement!  The whole reason I wanted to get into this
>discussion in the first place was because of my interest in perception AS
>A PROCESS. ...

I do entirely agree that perception is a process, and that concept
formation is a process, and (perhaps most significantly) that selection
models are the best approach to understanding concept formation.
Furthermore, I am attracted to selection models of perception itself,
more than to perceptrons and their offspring.
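
For concreteness, the perceptron family I have in mind reduces
perception to an error-driven weight-adjustment procedure, roughly like
the following.  (The sketch is only illustrative, in rough pseudocode;
the function name and learning rate are my own inventions, not
anybody's actual system.)

    # Minimal perceptron learning step (illustrative sketch only).
    # w and x are lists of numbers; t is the target class, 0 or 1.
    def train_step(w, x, t, rate=0.1):
        activation = sum(wi * xi for wi, xi in zip(w, x))
        output = 1 if activation > 0 else 0   # threshold unit
        error = t - output                    # error-driven correction
        return [wi + rate * error * xi for wi, xi in zip(w, x)]

On such a model, everything interesting about perception happens inside
that update rule.  A selection model, very roughly, would instead
generate candidate structures and retain those that survive some test,
rather than gradually correcting a single set of weights.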

>it seems that you have now admitted that the practice of logic is not
>appropriate for this concern.

That "immense distance" exists, in my view, between the practice of logic
and *every* process.  The only way to brigde the gap is by implementation
(from the bottom up) and by interpretation (from the top down).  It is
essential to the success of AI that the gap be bridged.  For example,
perceptual observations are important in almost every argument, and if
AI is to build machines that get on the world the way literate people
do, the machines must have the ability to make arguemnts based on
observations, and criticize arguments based on observations which disagree
with its own.

> . . .                         In a similar vein, I would argue that
>solving "the problem of understanding literature and poetry" is also
>not appropriate, since I am prepared to argue that what we "understand"
>about any artistic experience depends on what we PERCEIVE.

I agree with the premise, but not with the conclusion.  In my view,
understanding even the most rigorous algebraic derivation requires a
mental process which is more like interpreting a poem than observing an
event.  Understanding literal-minded speech is significantly simpler, but
it still involves activities far more complex than perception.

(Sooner or later, AI must face up to the experience of art, since almost
every human being is affected to some degree by some art.  I believe that
we *could* start with music and, from an understanding of the experience
of music, work out how to make an AI.  But that would impose a very high
start-up cost.)


It may be easier to follow my position if I own up to a heretical belief.
While I am a strong supporter of Strong AI, I do not accept the
computational theory of mind, and I do not accept the Language of Thought
hypothesis.  These theories seem to me to deny the "immense gap", which
is nothing more than the distinction between normative and descriptive
properties.  Or, equivalently, the distinction between abstractions and
reality.

(Of course, I am no "connectionist" or "eliminativist" either.)

Ken Presting