[comp.ai] PARRY, THE DOCTOR, and the Turing Test

rapp@sunGk.tellabs.com (Chuck Rapp) (07/13/90)

In article <5841@tellab5.tellabs.com>, nolan@tssi.UUCP (Michael Nolan) writes:
> I remember a famous and funny article in DATAMATION, a few
> lifetimes ago, around 1970, in which PARRY was hooked up with
> THE DOCTOR, an 'eliza' variant.  This pair came fairly close
> to passing the 'turing test', which I suspect has fallen somewhat
> by the wayside among current A.I. practitioners.

Firstly, the article referred to is:

	Cerf, Vinton. "PARRY meets THE DOCTOR".  DATAMATION, July, 1973,
		pp. 62-64.

A sample of this dialogue is:

	PARRY:  I don't understand your motives.
	DOCTOR: Why don't you understand my motives?
	PARRY:  Shouldn't I be?
	DOCTOR: Why do you say that just now?
	PARRY:  That's just the way it is.
	DOCTOR: Suppose that's just the way it were not?
	PARRY:  So?
	DOCTOR: I am not sure I understand you fully.
	PARRY:  You harp on the same questions too much.
	DOCTOR: What does that similarity suggest to you?
	PARRY:  I don't get you.
	DOCTOR: Do you wish you did get I?
	PARRY:  It's about time you showed some interest in my
		feelings.
	DOCTOR: Oh?  I showed some interest in your feelings?

This transcript is entertaining to read, but it hardly simulates
human dialogue accurately.  Note DOCTOR's "Do you wish you did get
I?", a telltale artifact of mechanically swapping first- and
second-person words before echoing a phrase back.
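
For the curious, here is a minimal sketch of that keyword-and-reflection
trick.  The handful of rules below are my own illustration, not
Weizenbaum's actual DOCTOR script, but they reproduce the "get I"
blunder:

	REFLECTIONS = {
	    "i": "you", "me": "you", "my": "your", "am": "are",
	    "you": "I", "your": "my", "yours": "mine",
	}

	def reflect(text):
	    # Lowercase, strip trailing punctuation, swap person word-by-word.
	    # A naive swap maps "you" -> "I" even in object position.
	    words = text.lower().rstrip(".!?").split()
	    return " ".join(REFLECTIONS.get(w, w) for w in words)

	def respond(text):
	    if text.lower().startswith("i don't "):
	        # "I don't get you." -> "Do you wish you did get I?"
	        return "Do you wish you did %s?" % reflect(text[8:])
	    return "I am not sure I understand you fully."

	print(respond("I don't get you."))   # -> Do you wish you did get I?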

Secondly, I don't believe THE DOCTOR or ELIZA was ever put before a
formal Turing Test, and if they had been, the interrogator would
probably have become frustrated with these programs quickly.  There
is an anecdote
that ELIZA was left running on a terminal one weekend by a system
administrator.  It happened that a not-too-technical VP brought in a
potential customer to view the company's software product.  The VP
came upon the terminal running ELIZA and, thinking the system
administrator was at the other end, started to "talk".  The
conversation went something like this:

	VP:    Please start up the demo.
	ELIZA: Why do you want me to start up the demo?
	VP:    Because I want to show it to a customer.
	ELIZA: Is it important for you to show it to a customer?
	VP:    Yes!
	ELIZA: Are you saying yes just to agree?
	VP:    I am trying to make a sale.
	ELIZA: Tell me more about this need.
	VP:    <expletive deleted>

The system administrator was in a fog when the VP chewed him out 
the next Monday.

PARRY, by contrast, went through rigorous testing based on the Turing Test.
Dr. Colby describes this testing in his article:

	Colby, Kenneth Mark. "Modeling a paranoid mind".  The Behavioral
		and Brain Sciences, No. 4, 1981, pp. 515-560.

Ten psychiatrists took part in the judging.  Each conducted a
diagnostic psychiatric interview with a real patient and with PARRY.
Both interviews were conducted via a terminal, lasted anywhere from
30 to 60 minutes, and were separated by a few days.  After each
exchange, a judge recorded whether he believed he was talking with a
human or with PARRY, along with his degree of confidence in that
decision.  Five of the psychiatrists judged both "patients" to be
paranoid (Dr. Colby doesn't say what the other five diagnosed).  As
for identifying which patient was PARRY, 5 out of 10 chose correctly,
which is no better than chance.
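
To see why 5 out of 10 is exactly what chance predicts: had each judge
simply flipped a coin, 5 correct identifications would be the single
most likely result.  A quick check of the arithmetic (my own
illustration; nothing here comes from Colby's paper beyond the numbers
5 and 10):

	from math import comb

	# P(exactly k of n coin-flipping judges guess right) = C(n, k) / 2**n
	n = 10
	for k in (4, 5, 6):
	    print("%d correct: %.3f" % (k, comb(n, k) / 2**n))
	# 5 correct: 0.246 -- the modal outcome under pure guessing,
	# so a 5-of-10 hit rate carries no evidential weight.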

Dr. Colby then goes on to make an honest assessment of the
significance of these results and of whether a Turing Test is a valid
methodology for theory validation.  These points are discussed
further in the "Open Peer Commentary" which follows the article
proper.  For anyone curious about PARRY and Turing Tests, I recommend
reading it; the discussions of whether AI is a "cargo cult" science
are of particular interest.

In a larger sense, neither ELIZA nor PARRY could pass the Turing Test
because of their limited domains.  ELIZA just keeps reflecting what
the human says back as a question (as in the sketch above), while
PARRY is focused on telling its story about horse-racing, gambling,
and gangsters.  Any prolonged interview would soon run past the edge
of their limited knowledge.  Consider the following sample of a
dialogue with PARRY (a toy caricature of its evasions follows the
transcript):

	PERSON: Would you rather be outside the hospital?
	PARRY:  Yes I am.  This place is not for me.
	PERSON: Where would you live?
	PARRY:  I live in San Jose.
	PERSON: How would you support yourself?  When you are out,
		will you be able to pay rent and buy food?
	PARRY:  It's about eight o'clock.
	PERSON: What does that have to do with my question?
	PARRY:  It's not helping me to be here.
	PERSON: What would help you?
	PARRY:  I don't really need help like they think.
	PERSON: What do they think?
	PARRY:  Do you know anything about bookies?
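
PARRY's evasions in this exchange can be caricatured in a few lines.
The sketch below is strictly a toy of my own devising, not Colby's
model (the real PARRY maintained internal fear, anger, and mistrust
variables); the canned lines are lifted from the transcript above:

	import random

	# Toy caricature: input touching a "flare" topic gets the canned
	# story line; anything else gets a deflection or a lurch back
	# toward the obsession with bookies.
	FLARES = {
	    "hospital": "This place is not for me.",
	    "help": "I don't really need help like they think.",
	}
	DEFLECTIONS = [
	    "It's about eight o'clock.",
	    "Do you know anything about bookies?",
	]

	def parry_reply(text):
	    for topic, line in FLARES.items():
	        if topic in text.lower():
	            return line
	    return random.choice(DEFLECTIONS)

	print(parry_reply("How would you support yourself?"))
	# -> a non sequitur, exactly the failure mode a long interview exposes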

What makes PARRY interesting is the thorough testing and analysis
the model went through.  The use of a Turing-like test for
validation is notable in itself.
--

| Chuck Rapp       | Internet: rapp@balr.com   | If you're Bell Labs employees,     |
| BALR Corporation | UUCP: uunet!balr.com!rapp | then where are your badges?        |
| Oakbrook, IL     |                           | Badges? Badges?! We have no badges.|
|                  |                           | We don't need no stinking badges!  |

smoliar@vaxa.isi.edu (Stephen Smoliar) (07/14/90)

In article <3004@tellab5.tellabs.com> rapp@sunGk.tellabs.com (Chuck Rapp)
writes:
>
>Secondly, I don't believe THE DOCTOR or ELIZA was ever put before a
>formal Turing Test, and if they had been, the interrogator would
>probably have become frustrated with these programs quickly.  There
>is an anecdote
>that ELIZA was left running on a terminal one weekend by a system
>administrator.  It happened that a not-too-technical VP brought in a
>potential customer to view the company's software product.  The VP
>came upon the terminal running ELIZA and, thinking the system
>administrator was at the other end, started to "talk".  The
>conversation went something like this:
>
>	VP:    Please start up the demo.
>	ELIZA: Why do you want me to start up the demo?
>	VP:    Because I want to show it to a customer.
>	ELIZA: Is it important for you to show it to a customer?
>	VP:    Yes!
>	ELIZA: Are you saying yes just to agree?
>	VP:    I am trying to make a sale.
>	ELIZA: Tell me more about this need.
>	VP:    <expletive deleted>
>
>The system administrator was in a fog when the VP chewed him out 
>the next Monday.
>
This is not quite what happened, but it is close enough that the
altered circumstances should be left as they are to protect the
parties involved!  Nevertheless, one must still recognize all those
individuals who chose to behave as if ELIZA really WERE an intelligent
(or at least understanding) individual.  This is one of the
circumstances which prompted Weizenbaum to go over the brink with his
book.  He could not understand why a secretary might want to be left
alone while talking to ELIZA!  (Of course, she would be just as likely
to want to be left alone while talking to her cat . . . as would any
of us.)  I happened to show
ELIZA to a member of the Merce Cunningham Dance Company;  and her assessment
was that it was "just like talking to Merce."  The point is that you can do
all sorts of things when you exchange sentences, and some of those things which
involve "intelligence" need not also involve any rigorous semantic analysis of
those sentences.  Perhaps the best evidence for this was the "withdrawal" which
set in among many programmers when the ELIZA code went away at one point.

=========================================================================

USPS:	Stephen Smoliar
	USC Information Sciences Institute
	4676 Admiralty Way  Suite 1001
	Marina del Rey, California  90292-6695

Internet:  smoliar@vaxa.isi.edu

"It's only words . . . unless they're true."--David Mamet