[mod.ai] how we decide whether it has a mind

colonel@buffalo.CSNET ("Col. G. L. Sicherman") (10/30/86)

In article <8610271726.AA12550@ucbvax.Berkeley.EDU>, harnad@mind.UUCP writes:

> >       One (rationally) believes other people are conscious BOTH because
> >       of their performance and because their internal stuff is a lot like
> >       one's own.
> 
>                                         ...  I am not denying that
> there exist some objective data that correlate with having a mind
> (consciousness) over and above performance data. In particular,
> there's (1) the way we look and (2) the fact that we have brains. What
> I am denying is that this is relevant to our intuitions about who has a
> mind and why. I claim that our intuitive sense of who has a mind is
> COMPLETELY based on performance, and our reason can do no better. ...

There's a complication here:  Our intuitions about things in our environment
change with the environment.  The first time you use a telephone, you hear
an electronic reproduction of somebody's voice; you KNOW that you're talking
to a machine, not to the other person.  Soon this knowledge evaporates, and
you come to think, "I talked with Smith today on the phone." You may even
have seen his face before you!

It's the same with thinking.  When only living things could perceive and
adapt accordingly, people did not think of artifacts as having minds.
This wasn't stubborn of them, just honest intuition.  When ELIZA came
along, it became useful for her users to think of her as having a mind.
Just like thinking you talked with Smith ...

I'd like to see less treatment of "X has a mind" as a formal proposition,
and more discussion of how we use our intuition about it.  After all,
is having a mind the most important thing about somebody to you?  Is
it even important at all?