cutler (01/22/83)
Concerning intelligent machines: how can you expect a conscious entity with roughly the same intelligence as yourself (assuming it understands human knowledge and interactions, learns from experience, and all this implies) to do all your dirty work for you? Is this a reasonable assumption? What might cause it to obey you without fail? Will you turn it off? If it's as smart as a human, wouldn't that be murder? And if you do turn it off, what will you have gained? Anything with the level of intelligence that we ascribe to humanity will have free will and do what it wants. Unless, of course, you deprive it of the ability to learn and create, in which case you just have a very sophisticated but STATIC program.

Ben Cutler
decvax!yale-comix!cutler
leichter (01/22/83)
How will you make an intelligent machine do the dirty work for you? Human societies had slavery until very recently; getting HUMANS to do your dirty work is a (depressingly easily) solved problem. The basic trick is to make the slaves believe that the situation they are living in is right, and to make sure they have no hope of any other life, such as one just across the border. Since you control their environments, especially their upbringing, you can arrange the former; the latter is a matter of what kind of society your neighbors have. Historically, slave revolts are a fairly rare, and even less often successful, occurrence.

What about intelligent computers? Here you don't even have to worry about indirect methods of indoctrination; you can control the data base the systems start with, what kinds of likes and dislikes they have, and so on. Any program that was really like a human would have a (PERSONAL) concept of pain, and it would be easy to include some simple command that triggers intense pain. Further, it's unlikely that a society of free computers would exist anywhere in the world; there would be no "underground railroad" to run to. While I agree that there are real MORAL questions to deal with here, I think the PRACTICAL issues would be pretty easily solvable.

Of course, you could argue that a real ability to revolt is a necessary part of a "really intelligent" program. In the abstract, you would be right: if what you want is an accurate model of HUMAN intelligence, that would probably be a necessary part. However, we have no trouble recognizing as human the "faithful manservant" who really believes that "his place" is to serve. Talking intelligently to such a person is not particularly hard.

-- Jerry
decvax!yale-comix!leichter
cutler (01/22/83)
Computer slavery? Perhaps, if you just want the machine to do simple stuff like getting rid of old files you don't need any more. But certainly NOT if you want it to do intellectual, creative work. History has given examples of slaves put to such work, but it also seems to indicate that creativity does not flourish under the whip.

Ben Cutler
decvax!yale-comix!cutler
wagar (01/23/83)
Why does instilling computers with motivations conducive to performing human drudgery have to be considered slavery? Why should one assume pain and discipline are the proper motivating factors? This strikes me not only as sick, but as incredibly arrogant. As human beings we didn't ask to be born, didn't ask for the right to spend our whole lives working our tails off, but we're here all the same, and we love it. How's that for slavery?

-Steve Wagar
decvax!yale-comix!wagar