[net.philosophy] Black Box

tmoody@sjuvax.UUCP (T. Moody) (11/04/85)

I want to re-emphasize some points about what the "Turing machine" theory
of the mind does and does not imply.  The idea of the Turing Test was
to replace the question "Can machines think?" with a more verifiable
-- and, hence, more scientific -- one: "Can machines pass the Turing
Test?"  According to this view, if a machine can pass the Turing
Test, then we know all we need to know to justify ascribing mental
states to it.  We do *not* need to know the details of *how* the
machine does it.  A machine, of course, is a system whose behavior can
be described and predicted by a finite algorithm.

This is often described as a "black box" approach to the mind.

The Turing Test is supposed to provide a sufficient condition for the
ascription of mental states, but it does not purport to specify the
*necessary* conditions.  As far as I know, neither Turing nor anybody
else has claimed that *only* a Turing machine could conceivably pass
the Turing Test.

Some people dispute the adequacy of the Turing Test as a logically
sufficient condition for mental states.  John Searle, of course, is
one such person.  Another is Ned Block.  The point of these disputes
is that mental terms carry certain commitments, and the Turing Test
does not capture or do justice to these commitments.  Block's argument
is quite different from Searle's.  Here is a summary:

A valid Turing Test has a time limit.  Let's say, a half hour.  There
are only finitely many responses that a system could produce in a half
hour.  Make a list of them.  There are only finitely many questions
that an interrogator could present in a half hour.  Make a list of
them.  Write a program of "if-thens" that links questions to
appropriate answers.  Make the program as complicated as you like, as
long as the system, in the end, only plays back "canned" sentences.
It is, in effect, an enormous sentence-playing jukebox.  This system
would pass the Turing Test.  In Block's opinion, which I share,
ascribing mental predicates to it would be unjustified.

One of the commitments of genuine "understanding" (a mental predicate)
is the ability to analyze sentences and *generate* new ones.  I would
hold this to be a *necessary* condition of genuine intelligence.  The
jukebox does not possess this property of generativity, but it can
pass the Turing Test.  It follows that the Turing Test is not a
logically sufficient condition for the ascription of at least one
mental predicate.
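For contrast, even a trivial program can exhibit generativity in the
narrow sense of composing sentences it was never given.  Again a toy
sketch, with hypothetical vocabulary of my own choosing:

```python
# A minimally "generative" system: it composes novel sentences from
# stored parts rather than replaying stored sentences.
import itertools

SUBJECTS = ["the machine", "the critic", "the jukebox"]
VERBS = ["imitates", "analyzes", "generates"]
OBJECTS = ["a sentence", "a question", "an answer"]

def generate():
    # Every subject-verb-object combination, none stored as a whole.
    for s, v, o in itertools.product(SUBJECTS, VERBS, OBJECTS):
        yield f"{s.capitalize()} {v} {o}."

sentences = list(generate())
# 3 subjects x 3 verbs x 3 objects = 27 sentences from 9 stored phrases
```

Nine stored phrases yield twenty-seven sentences; the jukebox, by
construction, can never produce more sentences than it stores.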

The point to discuss, it seems to me, is whether generativity is
indeed a necessary condition.


Todd Moody                 |  {allegra|astrovax|bpa|burdvax}!sjuvax!tmoody
Philosophy Department      |
St. Joseph's U.            |         "I couldn't fail to
Philadelphia, PA   19131   |          disagree with you less."