hayes.pa@XEROX.COM (11/03/87)
Gilbert Cockton makes a serious mistake in lumping AI models together with all other `mechanical' or `scientific' models of mind on the wrong side of C. P. Snow's cultural fence:

>In short, mechanical concepts of mind and the values of a civilised
>society are at odds with each other. It is for this reason that modes
>of representation such as the novel, poetry, sculpture and fine art
>will continue to dominate the most comprehensive accounts of the
>human condition.

The most exciting thing about computational models of the mind is exactly that they, alone among the models of the mind we have, ARE consistent with humanist values while being firmly in contact with the results of the hardest of sciences.

Cockton is right to be depressed by many of the scientific views of man that have appeared recently. We have fallen from being the privileged bearers of divine knowledge to the lowly status of naked apes, driven by primitive urges; or even to mere vehicles used by selfish genes to reproduce themselves. Superficial analogies between brains and machines make people into blind bundles of mechanical links between inputs and outputs, suitable inhabitants for Skinner's new Walden, of whose minds - if they have any - we are not permitted to speak. Physicists often assume that people, like everything else, are physical machines governed by physical laws, whose behavior must therefore be describable in physical terms: more, that this is a scientific truth, beyond rational dispute. None of these pictures of human nature has any place for thought, for language, culture, mutual awareness and human relationships. Many scientists have given up and decided that the most uniquely human attributes have no place in the world given us by biology, physics and engineering.

But the computational approach to modelling mind gives a central place to symbolic structures, to languages and representations. While firmly rooted in the hard sciences, this model of the mind naturally encompasses views of perception and thought which assume that they involve metaphors, analogies, inferences and images. It deals right at its center with questions of communication and miscommunication. I can certainly imagine my mind (and Gilbert's) working this way: I consist of symbols, interacting with one another in a rich dynamic web of inference, perceptual encoding and linguistic inputs (and other interactions, such as with emotional states). This is a view of man which does NOT reduce us to a meaningless machine, one which places us naturally in a society of peers with whom we communicate.

Evolutionary biology can account for the formation of early human societies in very general terms, but it has no explanation for human culture and art. Computer modellers, however, are not surprised by the Lascaux cave paintings, or the universal use of music, ritual and language. People are organic machines; but if we also say that they are machines which work by storing and using symbolic structures, then we expect them to create representations and attribute meaning to objects in their world.

I feel strongly about this because I believe that we have here, at last, a way - in principle - to bridge the gap between science and humanity. Of course, we haven't done it yet, and to call a simple program `intelligent' doesn't help to keep things clear, but Cockton's pessimism should not be allowed to cloud our vision.

Pat Hayes