[net.ai] The Turing Test

ags@pucc-i (Seaman) (07/05/84)

>  Is it coincidence that the computer declines
>  to write a sonnet and accepts the other challenges?  A real human, trying
>  to prove that he is not a computer program, would probably welcome the
>  opportunity to offer a poem.

Yes, I believe it is a coincidence.  Another conversation from the Turing
article demonstrates that he did not mean to exclude the possibility of
a sonnet-writing machine:

	Interrogator:  In the first line of your sonnet which reads
	'Shall I compare thee to a summer's day,' would not 'a spring
	day' do as well or better?

	Witness:  It wouldn't scan.

	Interrogator:  How about 'a winter's day.' That would scan all
	right.

	Witness:  Yes, but nobody wants to be compared to a winter's
	day.

	Interrogator:  Yet Christmas is a winter's day, and I do not
	think Mr. Pickwick would mind the comparison.

	Witness:  I don't think you're serious.  By a winter's day one
	means a typical winter's day, rather than a special one like
	Christmas.

And so on [Turing continues].  What would Professor Jefferson say if
the sonnet-writing machine was able to answer like this in the viva voce?

-------------------------------------------------------------------------

>  My attack was not against the details of the conversation (for that 
>  matter, the third problem is ambiguous), but the premise of the Test.  

Yes, the third problem was ambiguous.  I thought it was also rather
clever:

	Q:  I have K at my K1, and no other pieces.  You have only K
	    at K6 and R at R1.  It is your move.  What do you play?
	A:  (After a pause of 15 seconds) R-R8 mate.

A machine might be expected to ask whether the rook is at QR1 or KR1,
not realizing that it is irrelevant.  The answer "R-R8 mate" is
correct in either case.  Was this a trap laid by the questioner?
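
For anyone who wants to check that claim mechanically, here is a small
sketch of my own (not part of the original exchange) that uses the
python-chess library, with the position translated from descriptive into
algebraic notation taking the interrogator as White: White K on e1,
Black K on e3, Black to move, and the Black rook on either a8 (QR1) or
h8 (KR1).

import chess

# Same position with the rook on QR1 (a8) and on KR1 (h8); in either
# case R-R8 sends the rook to White's back rank.
for fen, move in [
    ("r7/8/8/8/8/4k3/8/4K3 b - - 0 1", "Ra1"),   # rook on QR1
    ("7r/8/8/8/8/4k3/8/4K3 b - - 0 1", "Rh1"),   # rook on KR1
]:
    board = chess.Board(fen)
    board.push_san(move)                          # play R-R8
    print(move, "is mate:", board.is_checkmate()) # True in both cases

Either way the rook checks along the first rank, and the Black king on
e3 covers the escape squares d2, e2 and f2, so the question of which
rook's file it stands on never matters.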

You say you object to the premise of the test.  The reason for that
becomes apparent in your next comment:

>  You may remember that Turing called it a "Game" rather than a "Test."  This
>  sort of situation arises _only_ as a game; if you really want to know
>  whether somebody is a person or a computer, you just look at him/it.

Where does Turing say or imply that being able to tell a person from a 
computer is of any importance?  The question is merely, "Can a machine 
think?"  Unless you believe that "having a human form" is a prerequisite for 
thinking, physical appearance means nothing.  Is your objection of the form:

  1.  The Turing "imitation game" is not an adequate test of a machine's
      ability to think?  [If not, why not?]

  2.  It is of no importance to decide whether machines can think, and
      therefore the Turing "imitation game" has no value?  [If this is
      your position, then I think we have nothing more to discuss.]

>  I should think that ELIZA has laid to rest the myth that a program's
>  "humanity" has anything to do with its intelligence.  ELIZA's intel-
>  ligence was low, but she was a very human source of comfort to many
>  people who talked with her.

I don't think the imitation game is (or was intended to be) a test of
"humanity."  Since ELIZA cannot come close to performing well in the 
imitation game, she has no relevance to the validity of the test.
Yes, I am aware that ELIZA has fooled people, but this happened under 
circumstances that are very different from the imitation game.
-- 

Dave Seaman			"My hovercraft is full of eels."
..!pur-ee!pucc-i:ags

mikevp@proper.UUCP (07/10/84)

It seems to me that a good place to try a Turing test would
be on just such a network as this one.  Could one or more
of the people who write these messages be AI programs?
Probably not, but it would certainly be a good place to 
try out such a program if someone has one that looks convincing.