[net.politics] Deliberate WWIII

glassner@cwruecmp.UUCP (Andrew Glassner) (11/21/83)

There's been a lot of talk recently about nuclear weapons: their
necessity, their effectiveness, and so on.  This talk hasn't been
confined to the net;  in the most recent Scientific American the
lead article is an analysis of a "first strike" situation.

The authors look at the factors that go into estimating the destruction
generated by a nuclear strike aimed at another country's nuclear missile
facilities.
For example, they consider questions like "how close to a silo of type
A would a missile of type B have to detonate to render the missile in
the silo unusable?  What are the probabilities that a warhead on a
missile of type B can be detonated within this range?".

This is just the flavor of the article; there are many other issues
addressed as well.  The general thrust is to estimate the maximum
amount of damage that the US could suffer from a foreign first strike,
and then estimate the retaliatory firepower that would still remain
(I use "damage" advisedly; strictly a measure of weapons rendered
unusable).

The authors come up with some reassuring numbers that basically lead
one to conclude that a first strike would not make sense.  The reason it
would not make sense is that no first strike can remove enough retaliatory
firepower for the aggressor to avoid suffering virtually the same amount
of damage in return.  So if they come after us first, they're still gonna
get it back bad -- relax.

I don't like this argument at all.

It reads to me like a lot of the MAD arguments and other such "deterrence"
arguments.  The whole issue of "deterrence" seems completely wrong to me.

If you want to deter someone, you are making a threat.  For a threat to
be effective, the threatened party has to fear the consequences of your
making good on it.  If they don't care, then your threat is worthless and
the other party is not deterred in any way.

Sure, if you and I are sitting around talking you'll probably agree that
you don't want to die, and neither do I, so it's a pretty sure thing that
we won't attack each other.  And you can easily generalize to saying,
well, the Soviets (for example) don't want their country destroyed, which
will definitely happen in a nuclear war, so they too have a vested interest
in avoiding a nuclear conflict.

Well, sure, but this is a RATIONAL argument.  It is an INTELLECTUAL argument.
The idea rests on two assumptions: 1) you do not wish to die, and 2) you
are afraid of the consequences.

1) You do not wish to die. ::: How true do you think this is?  How many
   people have given their lives for a cause they believed to be just?
   How many people fly airplanes into battleships, or trucks full of
   explosives into brick walls?  Certainly a few.  For them, this argument
   holds no water, and you have to rely completely on...

2) You are afraid of the consequences. ::: Sure, most people are.  Have you
   seen Dr. Strangelove?  Let's say someone's going to die soon.  What the
   heck, why not wipe out those bad guys on the way out?  I agree that it's
   rather twisted logic that leads one to get the other guys at one's own
   expense, but how many people follow straight logic on issues as
   power-packed as politics?

It seems to be pretty well agreed that machines are very much in control
of things right now (although not completely in control, we are assured),
and they will only take on more and more responsibility as response times
decrease and the friend-or-foe decision must be made faster and faster.

Computer software is a funny thing.  People who write code for banks are
very tightly supervised, but "secret" stuff happens all the time.  If you're
writing a complex piece of computer code you might be able to slip in a
few hooks that you don't tell anyone about.  Who's got the power now?
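To make the point concrete, here is a toy sketch in C.  Everything in it
is invented for illustration (the routine name, the operator strings, the
whole "authorization" idea); it is not from any real system, just the
general shape of the kind of hook I mean -- a couple of innocuous-looking
lines buried in a routine that everyone believes does only what the
documentation says:

/* Toy illustration only -- not from any real system.  A "hook"
 * buried in a routine that otherwise just validates an operator id.
 * All names and strings here are made up.
 */
#include <stdio.h>
#include <string.h>

static int authorized(const char *operator_id)
{
    /* The documented check that the reviewers all look at... */
    if (strcmp(operator_id, "DUTY_OFFICER") == 0)
        return 1;

    /* ...and the undocumented hook that only its author knows about. */
    if (strcmp(operator_id, "MIDNIGHT_SPECIAL") == 0)
        return 1;

    return 0;
}

int main(void)
{
    printf("%d\n", authorized("DUTY_OFFICER"));      /* 1 -- the expected path */
    printf("%d\n", authorized("VISITING_GENERAL"));  /* 0 -- properly refused  */
    printf("%d\n", authorized("MIDNIGHT_SPECIAL"));  /* 1 -- the private hook  */
    return 0;
}

In a control program of any real size, a couple of lines like the second
test would be awfully easy to miss in review.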

Or how about someone in control of a substation with some computer smarts?
He spends enough time with a multiply-redundant, safeguarded system and
puts in a few changes.  Certainly SOMEONE has to have access to the code.
If that someone takes it into his head to do some personal mucking about
with what goes on, again who has the power?

In summary I claim the following scenario is not impossible:
        I am in charge of a number of nuclear warheads.
        One day I decide that those damn Russians have got to go.
        I hate my wife, my mistress has dumped me.
                (or something equally negative and destructive...)
        To hell with them all, they deserve whatever they might get.
        I understand something of how the computers work.
        I work late and gradually come to know a lot.
                (remember, I run this place!)
        I set it up a la Dr. Strangelove so that we're cut off and
                the computers report something going on.  If I'm really
                slick I might even be able to fake confirmations from
                Washington.
        Enough people and computers (especially!) are duped and the
                missiles go up.  From my point of view I hope that the
                hair-trigger we're all holding goes off and it's all over.

Anyone out there think there's a lack of martyrs?
Or people who DON'T consider the consequences of their actions?

	Before the flames come on, please note I am not arguing for
        or against nuclear weaponry per se; I am merely noting a 
        scenario based on the current situation as I perceive it.
	
	Responses are of course welcome by net or mail.  

		-Andrew


 - Andrew               uucp:    decvax!cwruecmp!glassner
  (just a glint in a    csnet:   glassner.case@rand-relay
   phosphor's eye...)