[mod.risks] RISKS-3.14

RISKS@SRI-CSL.ARPA (RISKS FORUM, Peter G. Neumann, Coordinator) (06/28/86)

RISKS-LIST: RISKS-FORUM Digest,  Friday, 27 June 1986  Volume 3 : Issue 14

           FORUM ON RISKS TO THE PUBLIC IN COMPUTER SYSTEMS 
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  A Personal View on SDI (Harlan Mills)
  The Risky Gap Between Two Design Cultures (Jack Goldberg)
  Privacy legislation (RISKS-3.10) (Jerome H. Saltzer)
  Risks in burning wood (Mike McLaughlin)
  Mailer explosion (Sean Malloy)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in good
taste, objective, coherent, concise, nonrepetitious.  Diversity is welcome. 
(Contributions to RISKS@SRI-CSL.ARPA, Requests to RISKS-Request@SRI-CSL.ARPA.)
  (Back issues Vol i Issue j available in SRI-CSL:<RISKS>RISKS-i.j.
  Summary Contents in MAXj for each i; Vol 1: RISKS-1.46; Vol 2: RISKS-2.57.)

----------------------------------------------------------------------

Date: Fri 27 Jun 86 13:35:07-PDT
From: Peter G. Neumann <Neumann@SRI-CSL.ARPA>
Really-From: Harlan Mills (IBM Federal Systems, no net address)
Subject: A Personal View on SDI from Harlan Mills
To: RISKS@SRI-CSL.ARPA

   [The following note has been circulated privately by Harlan Mills,
    noted practitioner of structured programming and other software
    engineering techniques, and is included here with his permission.  PGN]

	Two of my friends, whose intelligence and integrity I respect and
admire greatly, namely David Parnas and James Horning, have stated their
belief that the SDI concept is impractical.  At the same time, other groups
of scientists and engineers, from dozens to hundreds to thousands, are
declaring their opposition to SDI on various grounds, from infeasibility to
conscience.  Yet, we do not seem to find comparable groups of scientists
and engineers on the pro side of SDI in public forums.  Is it because there
is no pro side?  Or is there some other reason?  I think there is another
reason.

	First, there are many scientists and engineers actively working on
SDI research.  Does that mean they are for SDI, or are they simply
hypocrites?  I think for most of them that neither is the case.  There is
another possible reason.  I believe it is the case with me.

	I personally do not know enough to be for or against SDI.
But I do know enough to want our country to be strong in technology.
As a citizen, I depend on our system of government, and particularly
our Congress, to decide about SDI.

	I regard SDI as a political question that will ultimately be
settled in our political system by the 535 members of our Congress.
I trust them to make the wisest disposition possible of this question.
It seems too complex a question to settle on a simple up or down vote.
It will take time, experience, and reflection to deal with it progressively.
Much of that experience and reflection will be political and diplomatic;
some of it will be military and technical in nature.  I believe the intent
of most scientists and engineers working on SDI is to explore the technical
side intelligently enough to provide the widest range of options possible
for the political and diplomatic side.

	In order to pursue the SDI question, the administration,
particularly the military, must organize a substantial and serious effort
that itself involves a narrower form of political effort.  It must
advocate a position and lobby Congress for the opportunity to pursue SDI
military and technical research in a responsible way.  But I do, indeed,
believe that members of Congress, with the facts, the checks and balances
of our political system, and constitutional guarantees (e.g., a free
press) will resolve the question of SDI intelligently in due course and
process.

	So I regard the positions of my friends Parnas and Horning, and of
many other scientists and engineers, as thoughtful and courageous acts of
technical or political conviction.  In particular, Parnas and Horning are
expert witnesses in computer science and software engineering.  People in
the administration and members of Congress should and do listen to them.
In matters of theory in computer science or software engineering, I have
never had an occasion to differ or disagree with either of them.  But I do
not always agree with their extrapolations into engineering expectations
in large systems such as those required by SDI.

	In the first place, I believe it is somewhat misleading to convert
the problem of SDI feasibility into the question of software perfection.
The problem is deeper than software.  The recent shuttle tragedy reminds
us that any man-made system can fail for many reasons besides software,
so the problem is broader than software alone.  The best man can do in
any physical system is to reduce the probability of failure to low levels,
not to zero.  If the hardware fails more often than the software, it is
wiser to improve the hardware even though the software is not perfect.
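
	[As a purely illustrative aside, with made-up numbers that are not
reliability figures for any real system: when independent hardware and
software failure modes are combined, the overall failure probability is
dominated by the larger term, so improvement effort pays off most where
failures are most likely.

	    # Hypothetical per-mission failure probabilities, chosen only
	    # to illustrate the argument; not data about SDI or any system.
	    p_hw, p_sw = 0.010, 0.001
	    # The mission is lost if either the hardware or the software fails.
	    p_total = 1 - (1 - p_hw) * (1 - p_sw)
	    print(p_total)                          # ~0.011, hardware-dominated
	    # Halving the software failure rate barely moves the total ...
	    print(1 - (1 - p_hw) * (1 - p_sw / 2))  # ~0.0105
	    # ... while halving the hardware failure rate nearly halves it.
	    print(1 - (1 - p_hw / 2) * (1 - p_sw))  # ~0.006
	]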

	In the second place, I believe that engineering expectations and
achievements in large systems depend as much on the checks and balances
of good management processes as on engineering theory.  We never get away
from the fallibility of people, but we can reduce the fallibility of
organizations of people below the fallibility of the individuals within them.
And with sound engineering theory, there is no real limit to that reduction
in fallibility of organizations.  For me, the key is the combination of
sound engineering theory and good management process -- both are necessary
and neither is sufficient.

	So my extrapolations into what is possible for SDI software are
more open ended than those of Parnas or Horning.  But, as Parnas and
Horning both suggest, we surely will not get there doing business as usual
in the DoD software acquisition process.  Thus, as with the Congress, I
expect DoD to rise to the occasion as the needs arise.  After all, it's
our DoD, as well as our Congress.

	In another era, in the late 40's, I was involved in a losing cause
on the issue of "One World or None."  As a student, I was convinced by the
arguments of my elders that atomic theory should be declassified and that
the U.S. should lead the way with an open science policy throughout the
world.  The science world was split then -- Niels Bohr on one side,
Edward Teller on the other (and Robert Oppenheimer, I think, caught in
the middle).  But, of course, the cold war and Korea settled things
irreversibly.  In spite of the excesses of a few individuals, I believe
our Congress and administration came through that period as well
as possible in steering a science policy course.  I was personally
disappointed in a dream of open science and abundant peace, but I do not
see how it could have been pulled off if our government could not see a way.

	That is how I look at SDI.  I would like to help my country be
strong in science and engineering.  The administration and the military
are agents of the country in that endeavor.  But I depend on the Congress
to make the final, collective decisions on how best to reflect that
strength for peace in political, diplomatic, and military matters.

	However, as events unfold and we all learn more, both about SDI
needs and engineering theory, if I come to the same belief as Parnas and
Horning, you can be sure that I will join them, and try to bring my
opinions to the administration and Congress, too.  I want to be on the
right side, whether it loses or not!
                                             Harlan Mills

------------------------------

Date: Wed 25 Jun 86 12:01:12-PDT
From: Jack Goldberg <JGOLDBERG@SRI-CSL.ARPA>
Subject: The Risky Gap Between Two Design Cultures
To: Neumann@SRI-CSL.ARPA

Over the centuries of experience in dealing with hazards, mechanical and
civil engineers developed a culture of safe design, with principles and
practices appropriate to the various kinds of products.  This culture was
expressed in the design of mechanisms that implemented various safety
functions, such as barriers to undesired motion, redundancy in the event of
local failures, self-adjustment to losses of tolerance, and so on.  For each
kind of product, particular mechanisms were developed to accomplish these
functions, e.g., pawls, detents, rails, ratchets, fuses.

The advent of computers and inexpensive sensors and motors made possible
tremendous economies in manufacture by eliminating all those particular
mechanisms and their often costly assembly (consider the dramatic comparison
in complexity of mechanism between the original teletype machine and a
modern typewriter/printer).  Mechanical design of the new systems has been
dramatically simplified, and the complex functions, including safety
functions, have been relegated to a control program.  In a sense, the design
is created on a blank slate.

Who creates that design?  Generally a professional programmer, often a
novice, who has inherited the culture of that
profession.  There are many aspects to that culture, but it rarely includes
the lore and practices of safe design (and the exposure to the machinery of
legal liability) that are the inheritance of mechanical and civil engineers.
It is often based on a partitioning of responsibility between the
hypothetical (and often anonymous) "customer" and the programmer-supplier, a
partitioning that hides the ultimate users from the designer.  Also, too
often, the programmer's education in matters of the physical world has been
compromised by the demands of training for his profession.

Often, the practitioners in the new culture see themselves as generalists,
able to solve any new problem, and they move frequently from one application
area to another.  Consequently, they seldom have the time to study and
understand the things that users or designers in a particular field know or
assume to be obvious, and so they must imagine and re-invent them.
Tragically, those imperfectly mastered things sometimes seriously affect
safety.

In short, the culture of safety that traditional engineers have expressed in
particular mechanisms has been tossed out along with those mechanisms, and
is being re-discovered, painfully, by a new generation of designers that has
no connection with the traditional culture.  In this light, risks arising in
contemporary computer-based system design may be seen as a consequence of a
gap between two design cultures.  The gap is both generational and
professional; there are many safety engineers in industry, but they and
programmers speak different languages.

In a different context, awareness of the loss of knowledge held by experts
in various practices, as they leave the work force without replacement, has
stimulated some computerists to try to capture that knowledge.  How well
they are doing that is another matter, but it may be that some conscious
gap-bridging between the cultures would save the world some amount of
misfortune and misery.

------------------------------

Date: Fri, 27 Jun 86 15:16:13 EDT
To: RISKS FORUM    (Peter G. Neumann, Coordinator) <RISKS@SRI-CSL.ARPA>
Subject: Privacy legislation (RISKS-3.10)
From: Jerome H. Saltzer <Saltzer@ATHENA.MIT.EDU>

The reported privacy legislation proposal for radio-based telephone
conversations is quite analogous to some of the proposals that circulated
for several years around the cable and satellite TV industry.  In that case
as well as this, technology bluffing is dominating the conversation.  The
overall scenario is that economic interests are claiming that technology
can't supply privacy economically, so draconian laws are the best way to
proceed.  Responsible engineers should object to this line of reasoning
whenever they notice it being misused.

Since in-the-clear radio communications are trivially, even accidentally
interceptable, the public interest requires that the first avenue to explore
in protecting them be narrowly technological (scrambling) rather than
broadly targeted legal approaches that can have surprising side effects on
the Bill of Rights.  But commercial interests that don't want to think about
extra costs or delay in getting to market use technological intimidation to
promote the public position that scrambling is too expensive.

The cable and satellite broadcast communities have come to realize that laws
don't help as much as they had hoped and that they have to scramble anyway.
It would be nice if we could somehow get that fact across to the legislators
who are being bamboozled by the cellular telephone business.

The worst part about passing a law to cover for temporarily missing
technology is that when the technology to solve the problem does arrive, the
laws don't magically disappear; they stick around, forgotten, to cause
trouble and surprises later when an enterprising District Attorney discovers
they have undreamed-of possibilities.

A related comment on banning listening said. . .

  > Not true.  States routinely ban the use of radar detectors, and that
  > is nothing more than "listening to a frequency."

States often legislate things that wouldn't pass constitutional muster; this
is an example that at least some legal specialists identify as unlikely to
stand up.  The word around here is that the real challenge to radar detector
bans is awaiting the first time the state of Connecticut tickets F. Lee
Bailey.
					Jerry Saltzer

------------------------------

Date: Fri, 27 Jun 86 11:21:19 edt
From: mikemcl@nrl-csr (Mike McLaughlin)
To: Risks@sri-csl.ARPA
Subject: Risks in burning wood

Risks has carried a lot lately regarding the risks associated with nuclear
energy.  Some discussion has compared nuclear with coal and hydro.  The 
emphasis has been on "disasters," such as Chernobyl or dams breaking. 

May I respectfully submit that not all disasters are sudden. 

Wood smoke is a pollutant.  It may smell nice (except for poplar and a
few others), but if you burn enough of it, nasty things happen.  

Coal smoke is a pollutant.  It never smells nice, and it makes for acid rain
and other nasty things.  These nasty things are slow, but some of us 
recognize the long term effects of generating power through coal as an
ecological disaster.  

Most natural hydrocarbon combustion byproducts (excuse me, "smoke") also
contain carcinogens.  They are as effective at producing cancer as alpha,
beta, gamma, and all those other funny names.  Just different cancers.
I see no value in having any cancer, different or not.

In an attempt to tie this to computers somehow, so that PGN will not toss 
this in his bit bucket:  

Will some reader please gather a creel of Crays and compare the long-term
hazards to the populace, Sialis sialis, and Cornus florida of nuclear
pollutants (sudden or slow) vs. hydrocarbon pollutants (sudden or slow)
while holding Terra's total energy demand constant?

Thank you.

------------------------------

Date: Thu, 26 Jun 86 06:50:03 pdt
From: malloy@nprdc.arpa (Sean Malloy)
To: RISKS-Request@SRI-CSL.ARPA
Subject: Mailer explosion

     I'm sorry about the explosion of the mailer demons here. At NPRDC, we
have a network consisting of two VAXen, eight or nine Sun workstations, and
a couple of PCs and ATs, all EtherNeted together.  The mail program was
recently brought up on the Suns, and it was suggested that people wishing to
receive their mail on the Suns rather than on PACIFIC (the VAX our code has
primary accounts on) should put .forward files in their home directories on
PACIFIC, which would cause mail sent to <username>@pacific to be forwarded
to a system specified in the .forward file.

     So I made a .forward file, and expected my mail to be forwarded from
malloy@pacific to malloy@hull. But I hadn't expected that a network mail
alias simplification would blow my mail all over creation. To simplify
maintaining the mail alias file on the Suns, the file /usr/lib/aliases on
PACIFIC gets copied to the Suns whenever it is changed. This means that the
Suns think my mail address is malloy@pacific.

     As a result, any mail coming in between Friday (6/20) morning, when I
set up the .forward file, and Monday morning, when I deleted it because it
wasn't working right (one of my coworkers mentioned losing mail to me), was
received by pacific, where the mailer-demon read the .forward file and sent
it on to malloy@hull.  Hull received the mail, checked the /usr/lib/aliases
file, and sent it back to malloy@pacific.  Twenty-nine loops later, the
mailer-demon exploded, and my mail got thrown back at whoever sent it.
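
     The loop can be sketched in a few lines.  This is only an illustrative
model, with a made-up hop limit and hand-written tables standing in for the
real .forward and /usr/lib/aliases files; it is not the actual mailer code:

        # Hypothetical sketch of the forwarding loop; not real mailer logic.
        MAX_HOPS = 30                                     # assumed hop limit
        FORWARD = {"pacific": {"malloy": "malloy@hull"}}  # ~/.forward on pacific
        ALIASES = {"hull": {"malloy": "malloy@pacific"}}  # copied aliases file

        def deliver(address, hops=0):
            """Follow forwarding rules until mail lands or the hop limit trips."""
            user, host = address.split("@")
            if hops >= MAX_HOPS:
                return "bounced to sender after %d hops" % hops
            # A .forward entry is consulted first, then the aliases file.
            target = (FORWARD.get(host, {}).get(user)
                      or ALIASES.get(host, {}).get(user))
            if target is None or target == address:
                return "delivered to %s after %d hops" % (address, hops)
            return deliver(target, hops + 1)

        print(deliver("malloy@pacific"))  # -> bounced to sender after 30 hops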

	Sean Malloy 	(malloy@pacific)

------------------------------

End of RISKS-FORUM Digest
************************
-------