[mod.politics.arms-d] Arms-Discussion Digest V7 #72

ARMS-D-Request@XX.LCS.MIT.EDU (Moderator) (11/27/86)

Arms-Discussion Digest             Wednesday, November 26, 1986 7:14PM
Volume 7, Issue 72

Today's Topics:

                       Re: Defending Kwajalein
                    Re: acetylene anti-tank weapon
                            Research Mode
                 Why defense can look like an attack
                  attacks on land-based hard targets
                New book, *The Automated Battlefield*
                New book, *The Automated Battlefield*
               Twas The Night Before Christmas (humor?)

----------------------------------------------------------------------

Date: Tue, 25 Nov 86 21:14:03 pst
From: Eugene Miya N. <eugene@ames-pioneer.arpa>
Subject: Re: Defending Kwajalein

Hurray Henry!  You beat me to posting the conditions of a lower-level
SDI test.  But you forgot to mention one important thing: the tendency
to modify the outcomes of such tests, experiments, and war games
into "successful" conclusions.  There have been notorious naval
war games on carrier defense in which submarine attacks successfully
sank the big carriers and escaped, but the torpedo kills were disavowed.
This is akin to security Tiger teams breaking into computer systems
by means over which the defenders later cry foul.  A non-biased (?)
third party must be able to determine whether or not a successful
defense has taken place (such a specification must be written
beforehand, placed in a sealed envelope, etc.).

Remember: Star Wars is a trademark of Lucasfilm, Ltd.

--eugene miya
  NASA Ames Research Center

------------------------------

Subject: Re: acetylene anti-tank weapon
Date: Tue, 25 Nov 86 18:28:50 PST
From: Jef Poskanzer <unisoft!charming!jef@ucbvax.Berkeley.EDU>

Very interesting.  Did the article say anything about what effect
acetylene would have on gas turbine engines?
---
Jef

------------------------------

From: hplabs!pyramid!utzoo!henry@ucbvax.Berkeley.EDU
Date: Tue, 25 Nov 86 22:46:09 pst
Subject: Research Mode

Hank Walker writes:

> I haven't heard any response yet to my proposal to start the Tau Ceti
> Initiative (TCI), a $30B research project to send a manned mission to Tau
> Ceti.  Sure we don't have hardly any idea of how to do it.  But it is
> obviously theoretically possible.

Hank, you should be more careful about choosing your examples.  Given $30B
funding and patience, a Tau Ceti mission might well be possible.  If you
look at some of the work Robert Forward has done in recent years, it becomes
clear that we are probably much closer to antimatter rockets than most people
think.  The USAF has study contracts out already on antimatter for in-space
propulsion; even at many billions of dollars a gram, it is cost-competitive
with lifting H2/O2 from Earth.  Interstellar propulsion would need much more
antimatter, and much lower production costs... but it's no longer a joke.
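
A rough back-of-envelope sketch shows why that comparison is not absurd.
The figures below (roughly 13 MJ/kg for LOX/LH2 and an assumed launch
cost of $3000 per kilogram to low orbit) are illustrative round numbers
only, not anything taken from the USAF studies:

    /* Illustrative comparison: annihilation energy of one gram of
     * antimatter versus the cost of lifting the chemically equivalent
     * mass of H2/O2 propellant from Earth.  Launch cost and LOX/LH2
     * energy density are assumed round numbers.
     */
    #include <stdio.h>

    int main(void)
    {
        double c = 3.0e8;            /* speed of light, m/s */
        double m = 2.0e-3;           /* 1 g antimatter + 1 g matter, kg */
        double e_annihilation = m * c * c;            /* about 1.8e14 J */

        double e_h2o2 = 1.3e7;       /* LOX/LH2 chemical energy, J/kg */
        double equiv_mass = e_annihilation / e_h2o2;  /* kg of propellant */

        double launch_cost = 3000.0; /* assumed dollars per kg to orbit */
        double lift_cost = equiv_mass * launch_cost;

        printf("annihilation energy of 1 g:    %.1e J\n", e_annihilation);
        printf("equivalent H2/O2 propellant:   %.1e kg\n", equiv_mass);
        printf("cost to lift that propellant:  $%.1e\n", lift_cost);
        return 0;
    }

With those numbers the lift cost comes out around $4e10, i.e. tens of
billions of dollars, which is the same ballpark as "many billions of
dollars a gram."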

				Henry Spencer @ U of Toronto Zoology
				{allegra,ihnp4,decvax,pyramid}!utzoo!henry

------------------------------

From: hplabs!pyramid!utzoo!henry@ucbvax.Berkeley.EDU
Date: Tue, 25 Nov 86 22:46:47 pst
Subject: Why defense can look like an attack

dm@bfly-vax.bbn.com writes:

>     1) Space based systems are vulnerable to pre-emptive ASAT attack, such
>     ASATs can be too cheap to overwhelm. [thus boost-phase systems must be
>     pop-up systems, which do look a lot like an attack]

Spend a fraction of an SDI deployment budget on better in-space transport,
and it becomes possible to get materials relatively cheaply from the Moon
or the asteroids.  This makes it feasible to armor space-based systems,
making effective ASAT attacks much more costly.

Another way around this is active mid-course discrimination, which could
greatly reduce the pressure for boost-phase interception.

There are probably other solutions as well.

> I don't think this is an engineered-to-fail system, since this
> scenario is roughly the one originally proposed by Lowell Wood's
> O-group, and to which they keep returning ...

"Never overlook the possibility of sheer stupidity as an explanation."
Ill-conceived and unwise BMD systems seem to be a specialty of Wood's group.

				Henry Spencer @ U of Toronto Zoology
				{allegra,ihnp4,decvax,pyramid}!utzoo!henry

------------------------------

Date: 26 Nov 1986 08:43-EST
From: Hank.Walker@gauss.ECE.CMU.EDU
Subject: attacks on land-based hard targets

Everything I have read indicates that any significant attacks on hard
land-based targets will result in tens of millions of deaths due to
fallout.  I doubt that the US public is going to settle for anything so
minor as a few East European countries in return.  Maybe Soviet
disarmament, maybe a few million dead Soviets, but not East Germany.
I'm assuming the dust won't cause significant changes to the weather or
agriculture.

------------------------------

Date: Tuesday, 25 November 1986  17:54-EST
From: Gary Chapman <chapman at russell.stanford.edu>
Re:   New book, *The Automated Battlefield*

I have finally picked up a copy of Frank Barnaby's new book, *The
Automated Battlefield* (The Free Press, $18.95 hardcover).  I am
having a hard time believing what I find in it.

Here is the back cover blurb about the book:

          Here is a fascinating glimpse of future war--fought
          *without* human confrontation.  Respected physicist Frank
          Barnaby argues that advances in surveillance and target
          acquisition systems, the development of increasingly
          intelligent missiles and highly destructive conventional
          warheads, and the automation of command, control and
          communications operations foreshadow the advent of war
          fought entirely by machines.

          In The Automated Battlefield, Barnaby reveals an
          astonishing new perspective on the modern battleground
          where advancing forces are not men but mechanized drones,
          monitored on video screens by commanders who base their
          strategies on reconnaissance from satellites, unmanned
          aircraft and underground sensors.

Barnaby, for those of you who don't know him, is the former director
of the Stockholm International Peace Research Institute, or SIPRI, one
of the most respected "think tanks" on security policy in the world.
He has taught at the Free University of Amsterdam, the University of
Minnesota and other places.  He is currently the director of a new
organization called Just Defence, based in Oxford, England, which
lobbies for a more robust conventional defense of Western Europe as a
means to get out from under the nuclear umbrella.

Barnaby is straightforward: "The Pentagon's goal is to give machine
intelligence the job of waging war without human intervention."  "As
we have stressed in this book, there is no technological reason why
the Pentagon should not achieve its goal of the automated
battlefield."  "Automated weapons are needed to replace soldiers."

          ET (emerging technology) weapons will react very rapidly.
          Surveillance systems will detect enemy forces on the move,
          at great distances, and guide missiles to attack them so
          quickly that there will simply be no time for humans to
          intervene.  Human judgment will be irrelevant.  Commanders
          of ET forces will be mere onlookers from a great distance,
          watching their computerized attacks in comfort on
          television.

Barnaby does quote some of my objections to "autonomous" weapons:

          As Gary Chapman points out, one problem is that military
          computer programmers will, to say the least, find it
          extremely difficult to distinguish combatants from
          non-combatants, a distinction that is "at the heart of all
          rules of the conduct of war."  To cross the threshold of
          "allowing machines to kill humans with nonchalance and
          without regret" is to move into an age of "new barbarism."

However, Barnaby then goes on to say that it is only a matter of the
*range* of the weapons, and the tactics under which they will be
employed, that will separate us from this "new barbarism."  By
stressing defense instead of offense, and by making the weapons
short-range instead of long-range, argues Barnaby, we can have a
secure defense of Western Europe leading to the elimination of the
current reliance on nuclear weapons.

I find that I agree with much of what Barnaby has to say about a
strategic realignment of military forces that abandons the deep-strike,
"follow-on forces attack" strategy of AirLand Battle doctrine.  But
his optimism regarding the roles of artificial intelligence and robot
weapons is very disturbing.  Moreover, there is no discussion at all
of the prospects for the Mutual and Balanced Force Reductions talks
(MBFR), which have been going on for the last 15 years in an attempt
to get some kind of arms control on conventional weapons in the
European theater.

Barnaby only rhetorically asks what it would take to produce a victory
in a war of robots against robots.  I find it incredible that we've
reached a stage where such a question is on the public agenda.  A
Deputy Director of DARPA recently asked me what I could have against
war involving only robots.  I was, for a while, speechless.  I can't
understand the world view that would produce a question like that.  It
appears there's a need for another book to answer that question, an
answer that seems to me obvious.  Perhaps the truly necessary books
are those that state the obvious.

------------------------------

Date: Wed, 26 Nov 1986  17:14 EST
From: LIN@XX.LCS.MIT.EDU
Subject: New book, *The Automated Battlefield*

A summary (written by Barnaby) of the book is in the October 1986
issue of Technology Review.

    Barnaby says that it is only a matter of the *range* of the
    weapons ... By stressing defense instead of offense, and by making
    the weapons short-range instead of long-range, argues Barnaby, we
    can have a secure defense of Western Europe...

Short-range???  At least at long range you can plausibly identify a
tank as an enemy tank.  At short range, you can't even do that!  So we
have a conflict: at short range, you have to worry about killing your
own forces.  At long range, you have to worry about civilians.  

------------------------------

Date: Wednesday, 26 November 1986  12:07-EST
From: Hauptman.PA at Xerox.COM
To:   ARMS-D
Re:   Twas The Night Before Christmas

I thought the readers of ARMS-D might find this interesting. 

Steve

From: Barnett.WBST
Subject: 'Twas the night before Christmas


      'Twas the Night before Christmas -- The Very Last One
	-----------------------------------------------------
  	                (Anon. 1986)

     'Twas the night before Christmas -- the very last one --
     When the blazing of lasers destroyed all our fun.

     Just as Santa had lifted off, driving his sleigh,
     A satellite spotted him making his way.

     The Star Wars Defense System -- Reagan's desire
     Was ready for action, and started to fire!

     The laser beams criss-crossed and lit up the sky
     Like a fireworks show on the Fourth of July.

     I'd just finished wrapping the last of the toys
     When out of my chimney there came a great noise.

     I looked to the fireplace, hoping to see
     St. Nick bringing presents for missus and me.

     But what I saw next was disturbing and shocking:
     A flaming red jacket setting fire to my stocking!

     Charred reindeer remains and a melted sleigh-bell;
     Outside burning toys like confetti they fell.

     So now you know, children, why Christmas is gone:
     The Star Wars computer had got something wrong.

     Only programmed for battle, it hadn't a heart;
     'Twas hardly a chance it would work from the start.

     It couldn't be tested, and no one could tell,
     If the crazy contraption would work very well.

     So after a trillion or two had been spent
     The system thought Santa a Red missile sent.

     So kids dry your tears now, and get off to bed,
     There won't be a Christmas -- since Santa is dead.

------------------------------

End of Arms-Discussion Digest
*****************************