[mod.ai] Computer Ethics

bruces@dg_rama.UUCP (05/27/86)

          [Forwarded from the Risks Digest by Laws@SRI-AI.]


The following is a copy of a review I wrote for a recent newsletter of the
Boston chapter of Computer Professionals for Social Responsibility (CPSR).
Readers of RISKS may be interested, as well.

METAPHILOSOPHY is a British journal published three times yearly which is
dedicated to considerations about particular schools, fields, and methods of
philosophy.  The October 1985 issue, Computers & Ethics (Vol. 16, No. 4), is
recommended reading [...].

This issue's articles attempt to define and delimit the scope of Computer
Ethics, and examine several emerging and current concerns within the field.

One current concern is responsibility for computer-based errors.  In his
article on the subject, John W. Snapper asks:  "...whether it is advisable to
...write the law so that a machine is held legally liable for harm." The author
invokes Aristotle's "Nicomachean Ethics" (!) in an analysis of how computers
make decisions, and what is meant by "decision" in this context.

On the same subject, William Bechtel goes one step further, considering the
possibility that computers could one day bear not only legal, but moral
responsibility for decision-making:  "When we have computer systems that ...can
be embedded in an environment and adapt their responses to that environment,
then it would seem that we have captured all those features of human beings
that we take into account when we hold them responsible."

Deborah G. Johnson discusses another concern:  ownership of computer programs.
In "Should Computer Programs Be Owned?," Ms. Johnson criticizes utilitarian
arguments for ownership, as well as arguments based upon Locke's labor theory
of property. The proper limits to extant legal protections, including
copyrights, patents, and trade secrecy laws, are called into question.

Other emerging concerns include the need to educate the public on the dangers
and abuses of computers, and the role of computers in education.  To this end,
Philip A. Pecorino and Walter Maner present a proposal for a college level
course in Computer Ethics, and Marvin J. Croy addresses the ethics of
computer-assisted instruction.

Dan Lloyd, in his provocative but highly speculative article, "Frankenstein's
Children," envisions a world where cognitive simulation AI succeeds in
producing machine consciousness, resulting in a possible ethical clash of the
rights of artificial minds with human values.

The introductory article, James H. Moor's "What Is Computer Ethics?," is an
ambitious attempt to define Computer Ethics, and to explain its importance.
According to Moor, the development and proliferation of computers can rightly
be termed "revolutionary":  "The revolutionary feature of computers is their
logical malleability.  Logical malleability assures the enormous application of
computer technology." Moor goes on to assert that the Computer Revolution, like
the Industrial Revolution, will transform "many of our human activities and
social institutions," and will "leave us with policy and conceptual vacuums
about how to use computer technology."

An important danger inherent in computers is what Moor calls "the invisibility
factor." In his own words:  "One may be quite knowledgeable about the inputs
and outputs of a computer and only dimly aware of the internal processing."
These hidden internal operations can be intentionally employed for unethical
purposes (what Moor calls "Invisible abuse"), or can contain "Invisible
programming values":  value judgments of the programmer that reside, insidious
and unseen, in the program.
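Moor's "Invisible programming values" can be made concrete with a small,
purely hypothetical sketch (not drawn from the article): two interest
routines that look identical from the outside, yet embed different value
judgments about who keeps the fractional cents.

```python
# Hypothetical illustration of an "invisible programming value".
# Both routines compute one month's interest on a balance in cents,
# and both look equally reasonable to anyone who sees only inputs
# and outputs.  But the choice between truncating and rounding is a
# value judgment -- it decides whether leftover fractions of a cent
# favor the institution or the customer -- and it is invisible in
# the program's observable behaviour on most inputs.

def interest_truncate(balance_cents, annual_rate):
    # Drops fractional cents: the remainder silently favors the bank.
    return int(balance_cents * annual_rate / 12)

def interest_round(balance_cents, annual_rate):
    # Rounds to the nearest cent instead.
    return round(balance_cents * annual_rate / 12)

print(interest_truncate(100_000, 0.05))  # 416
print(interest_round(100_000, 0.05))     # 417
```

The one-cent disagreement is exactly the kind of buried decision Moor is
pointing at: neither routine announces its policy, yet each enforces one.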

Finally, in the appendix, "Artificial Intelligence, Biology, and Intentional
States," editor Terrell Ward Bynum argues against the concept that "intentional
states" (i.e. belief, desire, expectation) are causally dependent upon
biochemistry, and thus cannot exist within a machine.

If you're at all like me, you probably find reading philosophy "tough
going," and METAPHILOSOPHY is no exception.  References to unfamiliar works
and the use of unfamiliar terms occasionally necessitated my reading
passages several times before extracting any meaning from them.  The topics,
however, are quite relevant and their treatment is, for the most part,
lively and interesting.  With its well-written introductory article, diverse
survey of current concerns, and fairly extensive bibliography, this issue of
METAPHILOSOPHY is an excellent first source for those new to the field of
Computer Ethics.

[METAPHILOSOPHY, c/o Expediters of the Printed Word Ltd., 515 Madison Avenue,
Suite 1217, New York, NY  10022]

Bruce A. Sesnovich         mcnc!rti-sel!dg_rtp!sesnovich
Data General Corp.         rti-sel!dg_rtp!sesnovich%mcnc@csnet-relay.arpa
Westboro, MA               "Problems worthy of attack
                            prove their worth by hitting back"

colonel@buffalo.CSNET ("Col. G. L. Sicherman") (06/16/86)

I have a few comments on _Metaphilosophy,_ as summarized by Bruce Sesnovich:

> The introductory article, James H. Moor's "What Is Computer Ethics?," is
> an ambitious attempt to define Computer Ethics, and to explain its
> importance.  According to Moor, the development and proliferation of
> computers can rightly be termed "revolutionary":  "The revolutionary
> feature of computers is their logical malleability.  Logical
> malleability assures the enormous application of computer technology."
> Moor goes on to assert that the Computer Revolution, like the
> Industrial Revolution, will transform "many of our human activities and
> social institutions," and will "leave us with policy and conceptual
> vacuums about how to use computer technology."

"Logical malleability" sounds vague to me.  If it's just an abstract phrase
for programmability, then I think Moor neglects the real significance of
computers: that (unlike machines) they accept differing input, and produce
differing output.

I agree fully that computers will cause revolutions.  But this talk of
"conceptual vacuums" is born of unavoidable myopia.  None of our present-day
prognosticators has shown any serious understanding of the future,
except a few science-fiction writers whom nobody takes seriously.  I
suggest that posterity will regard _us_ as the "vacuum" generation,
of an age "when nobody knew how to use computer technology."

> An important danger inherent in computers is what Moor calls "the
> invisibility factor." In his own words:  "One may be quite
> knowledgeable about the inputs and outputs of a computer and only dimly
> aware of the internal processing." These hidden internal operations can
> be intentionally employed for unethical purposes (what Moor calls
> "Invisible abuse"), or can contain "Invisible programming values":
> value judgments of the programmer that reside, insidious and unseen,
> in the program.

Here Moor appears to be about 30 years behind McLuhan.  Try this: "One may
be quite knowledgeable about reading and writing and only dimly aware of
the details of book production and distribution." Or this: "One may be
quite knowledgeable about watching TV and only dimly aware of the physics
of broadcasting." Isn't it rather naive to think that the hidden values
of the computer medium lie in if-tests and do-loops?

To quote one of McLuhan's defocussed analogies: "You must talk to the
medium, not to the programmer.  To talk to the programmer is like
complaining to the hot-dog vendor about how badly your team is playing."

Col. G. L. Sicherman
UU: ...{rocksvax|decvax}!sunybcs!colonel
CS: colonel@buffalo-cs
BI: csdsicher@sunyabva

jc@cdx39.UUCP (07/23/86)

> To quote one of McLuhan's defocussed analogies: "You must talk to the
> medium, not to the programmer.  To talk to the programmer is like
> complaining to the hot-dog vendor about how badly your team is playing."

Whether he was talking about the broadcast or the computer industry, he
got the analogy wrong.

If the subject is broadcasting, the sports analogy to a "programmer"
is the guy that makes the play schedules.  True, that person is not
responsible for program content, much less quality.  But still, the
analogous position is not the hot-dog vendor.

If the subject is computers, the sports equivalent to a programmer 
is the guy that designs the plays, i.e., the coach.  He is indeed
responsible for how badly the team/computer plays.  True, there may be
others that share the responsibility (the players and the equipment vendor
on the sports side; the CPU and the I/O devices on the computing side).
But still, in computing, a programmer bears at least partial responsibility
for the computer's (mis)behaviour.