[comp.ai] Indictment

jwi@lzfme.att.com (Jim Winer @ AT&T, Middletown, NJ) (06/20/89)

Jim Winer writes:

> > Apparently, you think that a system that learns unnecessary (and
> > possibly incorrect or un-useful) things is intelligent? What about
> > systems (like people on the radical left, right or center) who
> > "never stop learning" *incorrect* things? A better definition of an
> > intelligent system might be one that can cope with unanticipated
> > (or even random) situations.

Gordon@ucl-cs.UUCP replies:

> I agree with the last statement. The politics of left and right has
> appeared exactly as above, be it Thatcher & Kinnock or Stallman & Apple.
> Should this posting be to talk.politics.theory?
> 
> Maybe. The left and the right are both intelligent people. One says
> freedom/wealth/choice is growing, the other says that it is diminishing.
> They both believe (much of) what they say is the truth. And it is the
> truth from where they sit, with statistics to prove it.

Jim Winer continues:

My point is that human beings, as a group, are not intelligent.
There does not seem to be any clear correlation between their
behavior and the presumed objective of happiness. (Wealth and power
are not happiness, but only states which are presumed by some to be
necessary or sufficient conditions to happiness. People who make
these presumptions are invariably proved wrong in that they don't
accheive a state of contentment.) There *may* be individual
exceptions. (Based on *my* life, and the life of everyone I have
ever met, that seems unlikely even though there are some rare
moments of rationality.)

We are attempting then, to create rational machines that model
irrational human behavior -- machines that pursue false goals for
dubious (or devious) reasons. Okay, so it's the best we can do. But
let's not lie to ourselves and call it intelligence. Let's call a
spade a spade and a pseudo-intelligent bomb-controller a weapon.

Jim Winer ..!lzfme!jwi 

May you live in interesting times.
        Pax Probiscus!
        Sturgeon's Law (Revised again): 98.89%
        of everything is peanut butter.
        Rarely able to send an email reply successfully.
        The opinions expressed here are not necessarily  
Those persons who advocate censorship offend my religion.

cam@edai.ed.ac.uk (Chris Malcolm cam@uk.ac.ed.edai 031 667 1011 x2550) (06/24/89)

In article <1421@lzfme.att.com> jwi@lzfme.att.com (Jim Winer @ AT&T, Middletown, NJ) writes:

>My point is that human beings, as a group, are not intelligent.
>There does not seem to be any clear correlation between their
>behavior and the presumed objective of happiness.

I can think of many possible explanations of the observed lack of
correlation: the objective is not happiness; the objective is happiness
but it's usually so hard to find that most people, although highly
intelligent, make mistakes; there is a correlation, but the correlation
is hard to see. All of these seem to me to be much more plausible than
the presumption that most people are not intelligent.

>Based on *my* life, and the life of everyone I have
>ever met, ... there are some rare
>moments of rationality.

In general you cannot judge whether or not behaviour is rational without
knowing the goals. The annals of psychotherapy are full of anecdotes
about apparently irrational behaviour proving to be rational once the
hidden agenda was revealed. If you take into account the built-in goals
of the biological machines we inhabit (to beg a few swift questions),
such as the procreation of the race, it becomes rather hard to assert
that human behaviour is, in this larger perspective, anything other than
rational. Our commonplace use of the term "irrational" then means no
more than "I can't think of a reason", which is a fairly trivial
observation, i.e., an understandable failure of the imagination in a
situation requiring considerable and perhaps superhuman knowledge to
understand.

After all, it may well be the case that in order to be able to
understand ourselves we would have to be so complicated that we
couldn't. We could still be rational though; it would just be impossible
to know it.

>We are attempting then, to create rational machines that model
>irrational human behavior

Seems a long way to go about it. Why not use irrational machines if it
makes it simpler?

> -- machines that pursue false goals for
>dubious (or devious) reasons.

Sounds like either you're a misanthropist or you've been very unfortunate
in your choice of friends.

>Okay, so it's the best we can do. But
>let's not lie to ourselves and call it intelligence. Let's call a
>spade a spade and a pseudo-intelligent bomb-controller a weapon.

Ah, now I see! You've been associating with military people! No wonder
you think people are irrational, unintelligent, dubious, etc.!
-- 
Chris Malcolm    cam@uk.ac.ed.edai   031 667 1011 x2550
Department of Artificial Intelligence, Edinburgh University
5 Forrest Hill, Edinburgh, EH1 2QL, UK		

d88-cwe@nada.kth.se (Christian Wettergren) (07/07/89)

In article <1421@lzfme.att.com> jwi@lzfme.att.com (Jim Winer @ AT&T, Middletown, NJ) writes:
>My point is that human beings, as a group, are not intelligent.
>There does not seem to be any clear correlation between their
>behavior and the presumed objective of happiness.

I think that human society, as a group, is indeed intelligent, but at another
level. A neuron won't feel the intelligence of a living creature. But you 
can't deny that an animal, or a human, in some sense is intelligent.

The problem of the intelligence of a whole population is more complex than
it may seem. There are a lot of factors one has to look at.

For example, doesn't this medium in some sense raise the IQ :-) of our society,
because it raises the level of interconnection? The resources spread out
geographically over the globe are used much more efficiently because of this.

Another factor is that you have to be able to understand and interpret the
information that is given to you. And to be able to do that, you must have
education. The importance of education must also be understood, and isn't that
also information that you must understand? (Of course you can read, you
just did!)

----

I've read a very interesting book recently, 'The Evolution of Cooperation'
by R. Axelrod. Axelrod has studied the concept of 'strategies': he pitted
a lot of strategies against each other in a computer simulation.

Each strategy has access to the history of its interaction with its
'opponent' and has to answer either 'Cooperate' or 'Defect' on each round. This
is an Iterated Prisoner's Dilemma. If both answer 'Cooperate', they both
get 3 points. If one says 'Cooperate' and one says 'Defect', the one who
didn't cooperate gets 5 points and the other gets the Sucker's payoff, 0 points.
If both defect, they each get 1 point. (I'm not too sure about the exact
scores, but I think it's correct this way.)

The result was that the strategy Tit-for-Tat won. It cooperates on the first
round and then does whatever the opponent did on the previous round. The
conclusion, after some analysis, was that you should be provocable, forgiving,
nice (don't start a battle), and robust.

I know that I'm not doing this great book justice by trying to summarize it
in just three paragraphs, so go and read it.
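
To make the setup concrete, here is a minimal sketch, in Python, of that kind
of round-robin. It assumes the standard payoffs of 5, 3, 1 and 0 points; the
particular strategies, number of rounds, and names are just illustrative, not
Axelrod's actual tournament entries.

import random

# (my move, opponent's move) -> my score, standard Axelrod payoffs
PAYOFF = {
    ("C", "C"): 3,  # mutual cooperation: reward
    ("C", "D"): 0,  # I cooperate, opponent defects: sucker's payoff
    ("D", "C"): 5,  # I defect, opponent cooperates: temptation
    ("D", "D"): 1,  # mutual defection: punishment
}

def tit_for_tat(history):
    """Cooperate first, then copy the opponent's previous move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    return "D"

def random_player(history):
    return random.choice("CD")

def play_match(strat_a, strat_b, rounds=200):
    """Return the total scores of the two strategies over an iterated game."""
    hist_a, hist_b = [], []   # each entry: (my move, opponent's move)
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strat_a(hist_a), strat_b(hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append((move_a, move_b))
        hist_b.append((move_b, move_a))
    return score_a, score_b

if __name__ == "__main__":
    players = [tit_for_tat, always_defect, random_player]
    totals = {p.__name__: 0 for p in players}
    # Round-robin: every strategy meets every strategy (including its twin);
    # only the first player's score is accumulated, since each ordered pair
    # is visited once.
    for a in players:
        for b in players:
            score_a, _ = play_match(a, b)
            totals[a.__name__] += score_a
    for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{name:15s} {total}")

With these illustrative opponents, Tit-for-Tat typically ends up at or near the
top of the totals, which is the effect the book analyses at length.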

----

Excuse me, I just wanted to mention the book. What I was going to say was that
I think that the Theory of Cooperation gives us a clue to how an organism 
evolves. (I think that a society is an organism too, in some sense...)

The society needs as many interconnections as possible to become 'intelligent'.
There is a self-organizing function in nature as soon as the parts are
self-aware and strive for 'profit'. Therefore I think it is a mistake to say
that a group of people isn't intelligent. Just think of ALL the layers this
article has passed on its way.

I'm writing this at home. Each letter goes from the microchip in the keyboard,
up through the operating system, out into the modem & the screen, through all
telephone wires, switchboards, into my minicomputer...

Each stage is modulated by a lot of factors. Do you have electricity, raw
material, industry, money?, ...., ...., ....

I think you've got the idea. (You read it!)

By the way, I do agree with you that we should call a pseudo-intelligent
bomb-controller a weapon. For us 'neurons', it is essential to live at our
level :-). That means we should absolutely engage ourselves in our world, and
not lose ourselves up in the blue with threat scenarios, etc., etc.

Thanks for listening to my chattering.

/Christian Wettergren, d88-cwe@nada.kth.se

P.S. How did the conventions in this medium evolve (don't flame, don't correct
spelling, etc.)?