[comp.ai] Language-related capabilities

kww@amethyst.ma.arizona.edu (K Watkins) (06/01/88)

In article <238@proxftl.UUCP> tomh@proxftl.UUCP (Tom Holroyd) writes:
>Name one thing that isn't expressible with language! :-)

>A dog might "know" something and not be able to describe it, but this is
>a shortcoming of the dog.  Humans have languages, natural and artificial,
>that let us manipulate and transmit knowledge.
>
>Does somebody out there want to discuss the difference between the dog's
>way of knowing (no language) and the human's way of knowing (using language)?

A dog's way of knowing leaves no room that I can see for distinguishing 
between the model of reality that the dog contemplates and the reality 
itself.  A human's way of knowing--once the human is a competent user of
language--definitely allows this distinction, thus enabling lies, fiction,
deliberate invention, and a host of other consequences, useful and hampering
alike, of recognizing that the model and the reality can diverge.

One aspect of this, probably among the most important, is that it becomes
easy to recognize that in any given situation there is much unknown but
possibly relevant data...and to cope with that recognition without freaking
out.

It is also possible to use language to _refer_ to things which language cannot
adequately describe, since language users are aware of reality beyond the
linguistic model.  Some would say (pursue this in talk.philosophy, if at all) 
language cannot adequately describe _anything_; but in more ordinary terms, it
is fairly common to hold the opinion that certain emotional states cannot be
adequately described in language...whence the common nonlinguistic 
"expression" of those states, as through a right hook or a tender kiss.

Question:  Is the difficulty of accurate linguistic expression of emotion at
all related to the idea that emotional beings and computers/computer programs
are mutually exclusive categories?

If so, why does the possibility of sensory input to computers make so much
more sense to the AI community than the possibility of emotional output?  Or
does that community see little value in such output?  In any case, I don't see
much evidence that anyone is trying to make it more possible.  Why not?

ok@quintus.UUCP (Richard A. O'Keefe) (06/01/88)

In article <700@amethyst.ma.arizona.edu>, kww@amethyst.ma.arizona.edu (K Watkins) writes:
> If so, why does the possibility of sensory input to computers make so much
> more sense to the AI community than the possibility of emotional output?  Or
> does that community see little value in such output?  In any case, I don't see
> much evidence that anyone is trying to make it more possible.  Why not?

Aaron Sloman had a paper "You don't need a soft skin to have a warm heart".
I don't know whether that has been followed up.

bwk@mitre-bedford.ARPA (Barry W. Kort) (06/02/88)

In his rejoinder to Tom Holroyd's posting, K. Watkins writes:

>Question:  Is the difficulty of accurate linguistic expression of emotion at
>all related to the idea that emotional beings and computers/computer programs
>are mutually exclusive categories?
>
>If so, why does the possibility of sensory input to computers make so much
>more sense to the AI community than the possibility of emotional output?  Or
>does that community see little value in such output?  In any case, I don't see
>much evidence that anyone is trying to make it more possible.  Why not?

These are interesting questions, and I hope we can mine some gold along
this vein.

I don't think that it is an accident that emotional states are difficult
to capture in conventional language.  My emotions run high when I find
myself in a situation where words fail me.  If I can name my emotional
state, I can avoid the necessity of acting it out nonverbally.  Trouble
is, I don't know the names of all possible emotional states, least of
all the ones I have not visited before.

Nevertheless, I think it is useful for computer programs to express
emotions.  A diagnostic message is a form of emotional expression.
The computer is saying, "Something's wrong.  I'm stuck and I don't
know what to do."  And sure enough, the computer doesn't do what
you had in mind.  (By the way, my favorite diagnostic message is
the one that says, "Your program bombed and I'm not telling you
why.  It's your problem, not mine.")

So, as I see it, there is a possibility of emotional output.  It is
the behavior exhibited under abnormal circumstances.  It is what the
computer does when it doesn't know what to do or how to do what you asked.
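
To make that concrete, here is a toy sketch in C.  The "mood" vocabulary
and the report_mood routine are my own inventions for illustration, not
anybody's actual diagnostic interface; the point is only that the failure
branch names its state instead of just bombing:

    #include <stdio.h>

    /* "Emotional states" a program might report when it doesn't know
       what to do.  The names are invented for illustration. */
    enum mood { CONTENT, PUZZLED, STUCK };

    static const char *mood_name[] = { "content", "puzzled", "stuck" };

    /* Name the state and what provoked it, instead of the classic
       "Your program bombed and I'm not telling you why." */
    static void report_mood(enum mood m, const char *cause)
    {
        fprintf(stderr, "I am %s: %s\n", mood_name[m], cause);
    }

    /* Look up key in table; the failure branch is the emotional output. */
    static int lookup(const int *table, int n, int key)
    {
        int i;
        for (i = 0; i < n; i++)
            if (table[i] == key)
                return i;
        report_mood(STUCK, "key not in table, and I don't know what to do");
        return -1;
    }

    int main(void)
    {
        int primes[] = { 2, 3, 5, 7 };
        lookup(primes, 4, 11);  /* prints: I am stuck: key not in table... */
        return 0;
    }

The hard design question is how rich the vocabulary of states should be;
as I said above, we don't know the names of all the states a program (or
a person) might visit.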

--Barry Kort