taylor@hplabsc.UUCP (08/01/86)
This article is from tektronix!videovax.TEK.COM!stever (Steven E. Rice) and was received on Fri Aug 1 11:08:56 1986

I don't normally read _mod.comp-soc_, but happened to be leafing through it a week or so ago.  One of the discussions that caught my eye was whether technology was or was not morally neutral.  The writers seemed to be in agreement that it was not.

However, technology is just one more tool in the hands of an individual or individuals.  No technology is self-creating or self-directing -- all of this must come from the hands and minds of human beings.  If a technology is used for good purposes, or if it is used for evil purposes, the responsibility is the users'.

An example of this is a recent tragedy in Seattle, Washington.  A man forced his way into the home of a family of four, tied up the father, mother, and two sons, chloroformed them, beat them with an iron (the kind you iron clothes with), and stabbed them.  The mother died at the scene, and the father and two sons died of their wounds over the next several weeks.

Consider the technologies involved for a moment -- wrapping and tying (the rope), anesthesia (the chloroform), clothes maintenance (the iron), and food preparation (the knife).  In each case, the normal usage of the particular technology is for positive ends.  But in each case, the murderer misused a product of that technology for an evil end.

This is not an isolated example -- it applies across the board.  Even the science-fiction "killing machines" (robotic battle tanks, and so on) are not signs of evil technology.  A machine which has no good purpose, having been built for evil ends, is a product of the minds and the hands of men.  And the responsibility for moral judgements lies upon them.

				Steve Rice
taylor@hplabsc.UUCP (08/02/86)
Steve Rice writes:

> However, technology is just one more tool in the hands of an individual
> or individuals.  No technology is self-creating or self-directing -- all
> of this must come from the hands and minds of human beings.  If a
> technology is used for good purposes, or if it is used for evil purposes,
> the responsibility is the users'.

Agreed, but just as we cannot consider photographs independent of their subjects (more in a 'sec), we cannot consider technology independent of its use.

Considering anything as an 'instantiation of the perfect <x>' (originally from some pretty famous philosophical types) is a poor answer to the problem of how we justify to OURSELVES the creation of objects that are to be used in `evil' ways.  To say "well, sure I'm working in a germ warfare lab, but what *I'm* working on is more akin to the study of disease in man, so it's a good thing, not a bad one" is a poor rationalization.

The comment above about photographs versus their subjects isn't as far from reality as you're no doubt thinking -- a photograph can be judged for its aesthetics (e.g. surface appearance, size, etc.), but for anything that approaches the FUNCTION of a photograph, the subject matter that was photographed must be taken into account.  In the same way, a chair may be considered a fine example of woodworking, but it cannot really be evaluated as good or bad unless its function is taken into account.  Consider a chair that cannot support more than 7 pounds -- it may be a fine chair from a MATERIALS point of view, but from a FUNCTIONAL point of view it's a flop.

I believe that this is the attitude we need to take with technology -- we can consider creating new viruses for germ warfare as `interesting experiments and ways to get new and better equipment', but that simply isn't enough relative to our society.  As scientists and engineers, we have a responsibility to consider ALL the aspects of what we create, and one of those is FUNCTIONALITY.  We cannot divorce an atomic bomb from the destruction it wreaks.  We cannot treat offensive weapon creation as something amoral on the grounds that the act of creation is independent of any moral or ethical considerations.  I mean, c'mon, who's fooling whom?  The FUNCTION and its place in our ENVIRONMENT is a critical factor.

> An example of this is a recent tragedy in Seattle, Washington.

The tragedy is indeed tragic, and there are, as you say, all too many examples of this sort of behaviour.  On the other hand, it is a poor example of what we're talking about because it is somewhat overtrivialized -- if, for example, they had been killed with a small thermonuclear device (!) or, better yet, a hand grenade lobbed into their living room, then we certainly couldn't argue that the technology was neutral.

It's a matter of the FUNCTION of the technology, as I said above.  An iron isn't designed to kill people, but it can be used for that.  A grenade IS designed to kill people.  That is one *HELL* of a difference!  Again, though, the fact that they are different shows that we place a moral 'value' on technology -- I'm sure there are very, very few people reading this who are thinking `they're both interesting technologies'.  No.  You're thinking `yeah!  Grenades ARE bad'  (nice how I can tell what you're thinking, isn't it?  *chuckle*)

I think that this is why parents get concerned when their children play with guns and such -- those are essentially destructive rather than constructive toys... (but that's a step into a different subject altogether)

> This is not an isolated example -- it applies across the board.  Even
> the science-fiction "killing machine" (robotic battle tanks, and so on)
> are not signs of evil technology.  A machine which has no good purpose,
> having been built for evil ends, is a product of the minds and the hands
> of men.  And the responsibility for moral judgements lies upon them.

Agreed.  But the object itself has an inherited moral and societal value by virtue of this.  Again, to circle back to my original comment, technology cannot be considered independent of the environment it exists in.

				-- Dave
taylor@hplabsc.UUCP (Dave Taylor) (08/07/86)
This article is from rti-sel!dg_rtp!throopw%mcnc.csnet@csnet-relay.ARPA and was received on Wed Aug 6 16:50:03 1986

> hplabs!taylor (Dave Taylor)

I'm not sure I understand Dave's position.  I can't seem to make reasonable sense of it.  For example, the bit about photographs:

> but, just as we cannot consider photographs independent of the
> subjects (more in a 'sec), we cannot consider technology independent of
> its use.  [...] the subject matter that was photographed must be taken
> into account.

Are you saying that a photograph of something evil is, in itself, evil?  This makes no sense.  After all, your posting <518@hplabsc.UUCP> discussed evil.  Was your posting therefore evil?  If not, what separates this case from that of the photograph?  In each case, a representation of evil is present.

( If you are saying that one cannot even think about a photograph independently of its subject matter, I disagree.  I take your point to be that a photograph cannot be evaluated for its virtue without considering its subject matter. )

Let's look at it in general terms.  You say that "one cannot evaluate technology without looking at its use".  I agree wholeheartedly.  How then can one possibly conclude that technology is either good or evil?  Surely either conclusion implies an evaluation of technology.  And surely we have just agreed that this is impossible without considering more than simply the technology.  How can one possibly avoid the conclusion, then, that technology cannot be either good or evil?

Let's take a concrete example.  You say

> [An] object itself has an inherited moral and societal value
> by virtue of [its designer's intent].

Ok.  A Mad Scientist designs and builds a nuclear weapon and intends to use it to obliterate, oh, say, New York City.  Millions of dead urbanites.  Untold suffering.  Really eeee-vil.  Is the bomb itself, or nuclear technology itself, eeee-vil?  "Yes", you say!  "EEEE-VIL", you say.  "Hogwash", I say.

Further hypothesize that a joint FBI/NASA team stops the MS and captures the bomb.  And then uses it to divert an asteroid that would (if left alone) have obliterated Chicago.  Is the bomb now good?  The same bomb that was eeee-vil before?  Is nuclear technology now good?  You mean they WERE eeee-vil, and NOW they are good?  If this is really your position, I can't make heads or tails of it.  It makes no sense at all to me.

On the other hand, if you mean that the bomb in the context of the MS's intent is evil, and in the context of the joint FBI/NASA asteroid-bashing project is good, then I disagree on other grounds.  Consider: would the MS's intent be evil without the bomb?  Would the FBI/NASA team's intent be good without the bomb?  I think so, in both cases.  Thus surely the intent is the evil thing, and the bomb has little or nothing to do with the morality of the situation.

--
"Is that how you get your kicks... by planning the deaths of innocent people?"
"No... by *causing* the deaths of innocent people!"
			--- Superman and Lex Luthor
(question: Is Luthor eeee-vil because he so plans, or because he so causes?)
taylor@hplabsc.UUCP (Dave Taylor) (08/07/86)
This article is from ihnp4!whuts!6243tes (Terry Sterkel) and was received on Wed Aug 6 17:50:34 1986

> However, technology is just one more tool in the hands of an individual
> or individuals.  No technology is self-creating or self-directing --
>
> ...a description of a low-tech murder
>
> the science-fiction "killing machine" (robotic battle tanks, and so on)
> are not signs of evil technology.  A machine which has no good purpose,
> having been built for evil ends, is a product of the minds and the hands
> of men.  And the responsibility for moral judgements lies upon them.
>
>	Steve Rice

Either I am missing the point or others are.  The "medium is the massage" argument I have been pressing is not one of ethics, morality, or even evil.  My contention of the last few weeks is that:

1. technology, especially that which forces changes in basic functions (such as word processing software), results in subtle changes in the way we are, and

2. we can objectively, and subjectively, measure this impact even on recognized masters of a trade, such as the authors Clarke and Anthony, and

3. there is an increase in hackerism in "programming" vs. specification-driven, system-engineered software, and

4. I have expressed deep reservations about the rest of us (mere journeymen, novices, and *grade school* children) if the masters can be so readily driven to machine mediocrity.

Thank you for your patience in this argument.

		Terry Sterkel, AT&T Bell Labs, Whippany, NJ

[opinions are obviously only my own; not necessarily those of my employer or those of my associates.]
taylor@hplabsc.UUCP (Dave Taylor) (08/09/86)
This article is from caip!uw-beaver!uw-vlsi!tony (Tony Marriott) and was received on Fri Aug 8 17:49:44 1986

> ...But the object itself has an inherited moral and societal value
> by virtue of this.  Again, to circle back to my original comment,
> technology cannot be considered independent of the environment it exists
> in.

Leonardo da Vinci designed war machines of optimum technology, yet these were not produced in any functional capacity for centuries.  One should not confuse the ethics of thought with the moral questions they involve.

It seems to be the most common assumption in current discussions that "technology cannot be considered independent of the environment it exists in."  We are humankind, a creature capable of molding our environment to our needs.  Why don't we begin?

It often appears that technology is being used in lieu of more basic forms of communication.  Simplicity of form or function does not readily lend itself to simplicity of psychology or moral principle.  Even within this, a hostile environment by any historical comparison (immediate and distant, local and global), there is in your neighbor a judge to assure the success of your enterprise.

I hope you know where I guest in your burden.  Good eating!
taylor@hplabsc.UUCP (Dave Taylor) (08/12/86)
This article is from rti-sel!dg_rtp!throopw%mcnc.csnet@csnet-relay.ARPA and was received on Mon Aug 11 20:52:27 1986

> ihnp4!whuts!6243tes (Terry Sterkel)
> I have expressed deep reservations about the rest of us (mere journeymen,
> novices, and *grade school* children) if the masters can be so readily
> driven to machine mediocrity.

I agree in general but disagree in particular.  I agree that sloppy thinking is bad, and I agree that electronic composition and dissemination may spread more of it around.  But I don't think it is unambiguously true that word processing is bad for prose, nor that automated software development is bad for software development, nor that machine assistance in general leads to mediocrity.

The examples posed to support this contention aren't convincing to me because, first, much more than just degree-of-machine-assistance varied between the "good" instances and the "bad", and moreover I don't think it is definitive that the "good" instances are really good and the "bad" are really bad.

In fact, I'll drag out the old, tired point one more time.  Each and every time the technology and technique of thought and dissemination of ideas has advanced, the new technology or technique has been greeted with howls of "It'll lead to sloppy/lazy/scatterbrained/insert-your-pejorative-phrase-here thinking!"  After having seen historical evidence of this in the case of the printing press, logarithms, slide rules, calculators, and no doubt zillions of others, I am somewhat skeptical of this latest denouncement.

--
... the fact that it is possible to push a pea up a mountain with your
nose does not mean that this is a sensible way of getting it there.
			--- Christopher Strachey
--
Wayne Throop	<the-known-world>!mcnc!rti-sel!dg_rtp!throopw