[comp.ai] AI and the Arms Race

blenko@burdvax.UUCP (Tom Blenko) (11/21/86)

In article <8611181719.AA00510@watdcsu.uucp> "B. Lindsay Patten" <shen5%watdcsu.waterloo.edu@RELAY.CS.NET> writes:
[... stuff ...]

|The real point Dr. Weizenbaum was trying to make (in my
|opinion) was that we should weigh the good and bad applications of
|our work and decide which outweighs the other.

If Weizenbaum or anyone else thinks he or she can succeed in weighing
possible good and bad applications, I think he or she is mistaken.
Wildly mistaken.

Why does Weizenbaum think technologists are, even within the bounds of
conventional wisdom, competent to make such judgements in the first
place?  Everywhere I turn there is a technologist telling me why SDI
cannot succeed -- which tells me that technologists fail to comprehend
consequences of their work from any perspective except their own.  Is
it not possible that the principal consequences of SDI will be
something other than an operational defense system?

Why doesn't Weizenbaum do some research and talk about it?  Why is
Waterloo inviting him to talk on anything other than his research
results? No reply necessary, but doesn't the fact that technically-
oriented audiences are willing to spend their time listening to this
sort of amateur preaching itself suggest what their limitations are
with regard to difficult ethical questions?

	Tom

anderson@uwmacc.UUCP (11/22/86)

In article <2862@burdvax.UUCP>, blenko@burdvax.UUCP (Tom Blenko) writes:
| Why doesn't Weizenbaum do some research and talk about it?  Why is
| Waterloo inviting him to talk on anything other than his research
| results? No reply necessary, but doesn't the fact that technically-
| oriented audiences are willing to spend their time listening to this
| sort of amateur preaching itself suggest what their limitations are
| with regard to difficult ethical questions?
Even as a preacher, Weizenbaum is hardly an amateur! Do be fair. On
your last point, I would claim the evidence shows just the opposite
of what you claim: that technically-oriented audiences are willing
to spend their time listening to intelligent opinions shows that
they are more qualified than some people think to consider
difficult ethical questions. Of course I am an amateur, too -- of
life (remember what the word means!).
-- 
==ARPA:====================anderson@unix.macc.wisc.edu===Jess Anderson======
| UUCP: {harvard,seismo,topaz,                           1210 W. Dayton    | 
|    akgua,allegra,ihnp4,usbvax}!uwvax!uwmacc!anderson   Madison, WI 53706 |
==BITNET:============================anderson@wiscmacc===608/263-6988=======

willc@tekchips.UUCP (Will Clinger) (11/24/86)

In article <2862@burdvax.UUCP> blenko@burdvax.UUCP (Tom Blenko) writes:
>If Weizenbaum or anyone else thinks he or she can succeed in weighing
>possible good and bad applications, I think he or she is mistaken.
>Wildly mistaken.
>
>Why does Weizenbaum think technologists are, even within the bounds of
>conventional wisdom, competent to make such judgements in the first
>place?

Is this supposed to mean that professors of moral philosophy are the only
people who should make moral judgments?  Or is it supposed to mean that
we should trust the theologians to choose for us?  Or that we should leave
all such matters to the politicians?

Representative democracy imposes upon citizens a responsibility for
judging moral choices made by the leaders they elect.  It seems to me
that anyone presumed to be capable of judging others' moral choices
should be presumed capable of making their own.

It also seems to me that responsibility for judging the likely outcome
of one's actions is not a thing that humans can evade, and I applaud
Weizenbaum for pointing out that scientists and engineers bear this
responsibility as much as anyone else.

By saying this I neither applaud nor deplore the particular moral choices
that Weizenbaum advocates.

William Clinger

eugene@nike.uucp (Eugene Miya N.) (11/26/86)

>Will Clinger writes:
>In article <2862@burdvax.UUCP> blenko@burdvax.UUCP (Tom Blenko) writes:
>>If Weizenbaum or anyone else thinks he or she can succeed in weighing
>>possible good and bad applications, I think he or she is mistaken.
>>
>>Why does Weizenbaum think technologists are, even within the bounds of
>>conventional wisdom, competent to make such judgements in the first
>>place?
>
>Is this supposed to mean that professors of moral philosophy are the only
>people who should make moral judgments?  Or is it supposed to mean that
>we should trust the theologians to choose for us?  Or that we should leave
>all such matters to the politicians?
>
>Representative democracy imposes upon citizens a responsibility for
>judging moral choices made by the leaders they elect.  It seems to me
>that anyone presumed to be capable of judging others' moral choices
>should be presumed capable of making their own.
>
>It also seems to me that responsibility for judging the likely outcome
>of one's actions is not a thing that humans can evade, and I applaud
>Weizenbaum for pointing out that scientists and engineers bear this
>responsibility as much as anyone else.
>
>William Clinger

The problem here began in 1939.  It's science's relationship to
the rest of democracy and society.  Before that time science was
a minor player.  This is when the physics community (on the part of
Leo Szilard and Eugene Wigner) went to Albert Einstein and said:
look at these developments in nuclear energy and look where Nazi
Germany is going.  Einstein, in turn, as a public figure (like Carl
Sagan in a way), went to Roosevelt.  Science has never been the
same. [Note we also get more money for science from government than
ever: note the discussion on funding math where Halmos was quoted.]

What Tom did not point out is whether or not scientists and engineers
have "more" responsibility.  Some people say that since they are in the
know, they have MORE responsibility; others say no, this is a democracy,
they have EQUAL responsibility, but judgments MUST be made by its
citizens.  In the "natural world," many things are not democratic
(is gravity autocratic?)... well, these are not the right words, but
they illustrate the point that man's ideas are sometimes feeble.

While Weizenbaum may or may not weigh moral values, he is in a
unique position to understand some of the technical issues, and he
should properly steer the understanding of those weighing moral
decisions (as opposed to letting them stray): in other words, yes,
to a degree, he DOES weigh them and yes he DOES color his moral values
into the argument. [The moral equivalent to making moral judgments.]

An earlier posting pointed out the molecular biologists restricting
specific types of work at the Asilomar meeting years ago.  In the
journal Science, it was noted that much of the community felt it shot
its foot off, looking back, and that current research is being held back.
I would hope that the AI community would learn from the biologists'
experience and either not restrict research (perhaps too ideal)
or not end up gagging themselves.  Tricky issue, why doesn't someone
write an AI program to decide what to do?  Good luck.

From the Rock of Ages Home for Retired Hackers:

--eugene miya
  NASA Ames Research Center
  eugene@ames-aurora.ARPA
  "You trust the `reply' command with all those different mailers out there?"
  {hplabs,hao,nike,ihnp4,decwrl,allegra,tektronix,menlo70}!ames!aurora!eugene

blenko@burdvax.UUCP (Tom Blenko) (12/04/86)

In article <863@tekchips.UUCP> willc@tekchips.UUCP (Will Clinger) writes:
|In article <2862@burdvax.UUCP> blenko@burdvax.UUCP (Tom Blenko) writes:
|>If Weizenbaum or anyone else thinks he or she can succeed in weighing
|>possible good and bad applications, I think he or she is mistaken.
|>Wildly mistaken.
|>
|>Why does Weizenbaum think technologists are, even within the bounds of
|>conventional wisdom, competent to make such judgements in the first
|>place?
|
|Is this supposed to mean that professors of moral philosophy are the only
|people who should make moral judgments?  Or is it supposed to mean that
|we should trust the theologians to choose for us?  Or that we should leave
|all such matters to the politicians?

Not at all. You and I apparently agree that everyone does, willingly or
not, decide what they will do (not everyone would agree with even
that). I claim that they are simply unable to decide on the basis of
knowing what the good and bad consequences of introducing a technology
will be. And I am claiming that technologists, by and large, are less
competent than they might be by virtue of their ignorance of the
criteria professors of moral philosophy, theologians, nuclear plant
designers, and politicians bring to bear on such decisions.

I propose that most technologists decide, explicitly or implicitly,
that they will ride with the status quo, believing that

	1) there are processes by which errant behavior on the part of
	   political or military leaders is corrected;
	2) they may subsequently have the option of taking a
	   different role in deciding how the technology will be used;
	3) the status quo is what they are most knowledgeable about,
	   and other options are difficult to evaluate;
	4) there is always a finite likelihood that a decision may,
	   in retrospect, prove wrong, even though it was the best
	   choice available to them as decision-maker.

Such a decision is not that some set of consequences is, on balance,
good or bad, but that there is a process by which one may hope to
minimize catastrophic consequences of an imperfect, forced-choice
decision-making process.

|Representative democracy imposes upon citizens a responsibility for
|judging moral choices made by the leaders they elect.  It seems to me
|that anyone presumed to be capable of judging others' moral choices
|should be presumed capable of making their own.
|
|It also seems to me that responsibility for judging the likely outcome
|of one's actions is not a thing that humans can evade, and I applaud
|Weizenbaum for pointing out that scientists and engineers bear this
|responsibility as much as anyone else.

I think the exhortations attributed to Weizenbaum are shallow and
simplistic. If one persuades oneself that one is doing what Weizenbaum
proposes, one simply defers the more difficult task of modifying one's
decision-making as further information/experience becomes available
(e.g., by revising a belief set such as that above).

	Tom

csrdi@its63b.ed.ac.uk (ECTU68 R Innis CS) (12/06/86)

In article <2888@burdvax.UUCP> blenko@burdvax.UUCP (Tom Blenko) writes:
>
>....I am claiming that technologists, by and large, are less
>competent than they might be by virtue of their ignorance of the
>criteria professors of moral philosophy, theologians, nuclear plant
>designers, and politicians bring to bear on such decisions.
>
I think this reflects a fundamental flaw in society - 'technologists'
aren't trained to consider such problems, and most take no interest in
them anyway. By the way, your wording suggests that 'nuclear plant
designers' aren't 'technologists'....whatever they may be.

By a parallel argument, why are professors of moral philosophy, theologians, 
politicians etc fit to make decisions regarding the moral issues of 
technologies which they may know nothing about? Surely they are as ignorant of
the criteria 'technologists' may use to judge the ethical implications of
something?
>
>I propose that most technologists decide, explicitly or implicitly,
>that they will ride with the status quo
>
Most *people* go with the status quo - it makes life easier. Many don't even
think to question it, more's the pity. What was it Shaw said about progress
and dissatisfaction?

(from <863@tekchips.UUCP> in <2888@burdvax.UUCP> - sorry, can't remember who
the original sender was)
>
>|It also seems to me that responsibility for judging the likely outcome
>|of one's actions is not a thing that humans can evade, and I applaud
>|Weizenbaum for pointing out that scientists and engineers bear this
>|responsibility as much as anyone else.
>
I agree. Unfortunately, the vast majority do *not* think through the
likely outcomes of their actions. 
>
>	....If one persuades oneself that one is doing what Weizenbaum
>proposes, one simply defers the more difficult task of modifying one's
>decision-making as further information/experience becomes available
>(e.g., by revising a belief set such as that above).
>
Sorry, I don't see the connection. Can you explain further?

	--Rick