[net.misc] The fly and the Turing test

lew (12/04/82)

Michael Wagner's retort to my comment that "If you put a fly in a box
it will try to get out" contains the same reasoning that reactionary
critics use to dismiss the Turing test as a criterion for intelligence.
And all this after he stated that he wasn't sure what it had to do with
the Turing test.

I don't assert that a fly has intelligence. My point is that it is an
independent agent which has a consistent and meaningful set of goals,
and a set of resources and abilities, including some very sophisticated
information processing, to carry them out. I would say that it is
proto-intelligent. I think comparing it to a gas molecule is much
more ridiculous than comparing a fly to a human.

Flies don't bump into flowers. They have eyes and wings and split-second
coordination. They might fly around a flower, or they might land on it.
It is certainly meaningful to talk about how a fly decides which to do.

Well, I'm starting to heavily overlap my "birds vs. AI" submission, but
let me make one more remark. Rick (watmath!pcmcgeer) commented that
Berkeley Unix on a VAX 11/780 gives a much better indication of
intelligence than most human conversation. I place the fly well above
any computer in sophistication of information processing. Before you
scoff, go look up "Atlas of the Fly Brain".

Lew Mammel, Jr. ihuxr!lew

wagner (12/06/82)

Lew Mammel correctly pointed out that the analogy between the
fly and a gas molecule was flawed.  I don't seem to have made
my point properly (as an aside, what is it about this form of
communication that interferes with getting one's point across
accurately?  I don't generally have this much trouble in
person.  We'd better find out what it is before we offer networks
(worldnet) to the masses!).  I agree that the fly has much
more image-processing, flight control, goal processing, etc.,
than a VAX or a gas molecule.  I shouldn't have said that it
bumps into flowers.  I know it actually is aware of the
flower and lands properly on it.  My point was that the goal
processing is extremely limited (compared to us, not a VAX),
and the fly has only a small set of relatively easy goals
(generally instinctive rather than intellectually arrived at).
It can "execute" a landing procedure in response to seeing an
object up close; I assume it can also deduce whether this is
a "nice" flower (its criteria, not mine), and decide to stay or
go on.  In particular, and this is what I was trying to get at,
I don't think that the fly has enough data gathering and
storing ability to recognize the fact that it is inside a
*closed* container.  It is just going about its business as
normal, looking for flowers (sounds more like a bee, actually!)
and probably not even aware that it is suffering from a particularly
low hit ratio inside the box.  It is not
"trying" to get out.  The fact that it spends a lot of time
on the surface of the box is an artifact of executing a random
walk in a volume that is small relative to the average length of
the walks.  And that is where the gas molecule came in.  It also
executes random walks until it bumps into the walls, at which
point it goes elsewhere (actually, the behaviour of a molecule
in a vacuum is not even a good model of the sort of random 
walks I mean, but this REALLY isn't supposed to be a 
discussion of gas behaviour, so let's leave the gases alone).
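
(As a concrete illustration of that random-walk point, here is a
minimal sketch in Python; it is my own toy, not anything from the
posts above, and the box size, step sizes, and step count are made
up.  A walker that bounces around a closed box with no notion of
"getting out" still spends most of its time pressed against the
walls once its typical step is large compared to the box.)

import random

# Toy model: a walker takes uniform random steps inside a closed
# 1-D box; the walls simply stop it.  We count how often it ends a
# step pinned against a wall.  All numbers are illustrative.
def fraction_at_walls(box_size=10.0, step_size=4.0, steps=100_000):
    x = box_size / 2.0                    # start in the middle
    at_wall = 0
    for _ in range(steps):
        x += random.uniform(-step_size, step_size)
        x = max(0.0, min(box_size, x))    # the wall stops the walker
        if x in (0.0, box_size):
            at_wall += 1
    return at_wall / steps

if __name__ == "__main__":
    # The bigger the step relative to the box, the more time at a wall.
    for step in (0.5, 2.0, 4.0):
        print(f"step {step}: fraction of time at a wall = "
              f"{fraction_at_walls(step_size=step):.2f}")
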
	Now, I still don't know how to characterize what the
fly has, and decide if it is non/pseudo/fully intelligent.
I doubt that it could even apply to take the Turing test
(archy and mehitabel notwithstanding).  It is an
impressive and comprehensive design for flying (of course),
landing, and other fly business.  I guess the biologists
would have to pass judgement on how much of it was reflex
and instinct, and how much of it was higher level function.
I expect very little of the latter.
	This is getting long. Enough for now.

Sorry if I have, by oversight, insulted the intelligence
of any fly who happened to be reading this discussion.

Michael Wagner, UTCS (decvax!utzoo!utcsrgv!utcsstat!wagner)

cjp (12/07/82)

I second (third?) the suggestion to start a net.ai.  Does anyone want
to volunteer to receive a vote count? (I'm not doing so, just trying to
get things moving.)

I'd be interested to hear more people's opinions on what "intelligence"
is.  There are many, many ways of looking at it: goal seeking, good
memory, common sense, wisdom, puzzle solving, learning, teaching,
invention, bumping against the sides of a box (for a fly)?!?, and god
knows what else.

We've got to realize what we are talking about before we can hope to
have meaningful discussions concerning AI.

Let's hear it, what is *your* definition of intelligence?

	Charles J. Poirier (decvax!mcnc!cjp)

mark@sri-unix (12/07/82)

Funny you should mention bees.  Go look up bees and ants in your
encyclopedia sometime.  The sophisticated communication among them
is truly amazing.  Bees find food, then return to the hive and do a
dance in the shape of a circle with a wiggly line up the middle.
The wiggly line points in the direction of the food, and the frequency
of wiggles says how far it is.  I think ants have something along
these lines too.  They aren't exactly huge brained intelligent
creatures, but they obviously have a good deal of processing power.
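
(To make that dance encoding concrete, here is a toy decoder in
Python.  It follows the description above: the angle of the wiggly
line gives the bearing to the food, and the amount of waggling gives
the distance.  The meters-per-waggle constant is made up for
illustration, and real bees express the dance angle relative to
gravity and the sun's position, which this sketch glosses over.)

import math

# Illustrative constant only; real distance coding is not this tidy.
METERS_PER_WAGGLE = 50.0

def decode_dance(bearing_degrees, waggle_count):
    """Turn a dance (bearing of the wiggly line, number of wiggles)
    into an (east, north) offset to the food source, in meters."""
    distance = waggle_count * METERS_PER_WAGGLE
    angle = math.radians(bearing_degrees)
    east = distance * math.sin(angle)
    north = distance * math.cos(angle)
    return east, north

if __name__ == "__main__":
    # A dance pointing 30 degrees east of north with 8 waggles:
    print(decode_dance(30.0, 8))    # roughly (200.0, 346.4)
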

jcz (12/13/82)

References: mcnc.1404

Another vote for net.ai!

About a year and a half ago I suggested this but the
time was not right.

The idea for the Turing test came from Turing's Imitation
Game.   Has anyone on the net actually played the game?

Would you care to try it via electronic mail, posting results to
net.ai?

--jcz
(John Carl Zeigler, North Carolina State University)