janw@inmet.UUCP (09/10/86)
[colonel@sunybcs.UUCP]

>> >Besides, ideas have nothing to do with it; even an amoeba can love.

>> I confess I can't really empathize with an amoeba; but
>> it seems no harder to ascribe ideas to it than love.

>This sounds like an echo of Jennings's: "If Amoeba were the size of a
>dog, instead of being microscopic, no one would deny to its actions the
>name of intelligence."
>But ideas are abstractions, and I doubt that amoebas need to abstract.
>Intelligence isn't necessarily abstract or symbolic.

Ideas are abstract in differing degrees. The rudiments of intelligence
existing in an amoeba may not deserve the name of ideas; but do its
rudiments of emotion deserve names such as love?

>> Can ideas and emotions be fully separated?
>> If AI ever succeeds in passing the Turing test, won't it
>> necessarily include AE as well?

>One of the classical objections to the Turing test is that it limits
>communication to a tty. I foresee difficulties in expressing emotion
>at 1200 baud ...

Ever read the late net.flame? Or, for that matter, the extant groups?

>Besides, computer hugs just don't feel right. What would we use AE
>_for?_

They could hug each other, and we could watch and laugh... People try
to make computers compose music, poetry and graphic art - all three
can be emotional.

I am not, BTW, predicting success - I am just saying that IF a program
passed the Turing test, it would have to pass it emotionally as well
as rationally. That is, its answers to emotion-provoking remarks would
have to imitate, successfully, human emotion - or else it would be
found out.

You might say it doesn't *really* feel the emotion - but the same
objection is made about its thinking. Your implied objection is that
emotion is not expressed in words alone - but neither is intelligence.
Remember Charles, "our noble king,
	Whose word no man relies on.
	He never said a foolish thing,
	And never did a wise one"?

The Imitation Game is crooked, but it's the only game in town...

		Jan Wasilewsky