[mod.risks] RISKS DIGEST 4.26

RISKS@CSL.SRI.COM.UUCP (12/11/86)

RISKS-LIST: RISKS-FORUM Digest, Wednesday, 10 December 1986 Volume 4 : Issue 26

           FORUM ON RISKS TO THE PUBLIC IN COMPUTER SYSTEMS 
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Computer Error Endangers Hardware (Nancy I. Garman)
  "One of the Worst Days Ever for Muni Metro, BART" (PGN)
  Korean Air Lines Flight 007 (Steve Jong)
  Plug Compatible Modules; Criminal Encryption (David Fetrow)
  More on skyscraper control (Mike Ekberg)
  Satellite interference (James D. Carlson)
  (Il)legal Encryption (Richard Outerbridge)
  Software article in _Computer Design_ (Walt Thode)
  Heavy metal and light algorithms (PGN)
  Suit against Lotus dropped (Bill Sommerfeld)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in good
taste, objective, coherent, concise, nonrepetitious.  Diversity is welcome. 
(Contributions to RISKS@CSL.SRI.COM, Requests to RISKS-Request@CSL.SRI.COM)
  (Back issues Vol i Issue j available in CSL.SRI.COM:<RISKS>RISKS-i.j.  MAXj:
  Summary Contents Vol 1: RISKS-1.46; Vol 2: RISKS-2.57; Vol 3: RISKS-3.92.)

----------------------------------------------------------------------

To: RISKS@csl.sri.com
Subject: Computer Error Endangers Hardware  
Date: Wed, 10 Dec 86 08:44:43 PST
From: Nancy I. Garman <ngarman@venera.isi.edu>

I work for a group that also manages an offsite computer center.  There has
been so much difficulty with the contractor who is supposed to keep the
floor clean that our hardware folks were worried about disk drive
contamination from the dirty floor.

I spoke with the Director of Sales for the cleaning company.  He blamed their
computer for the dirty computer room floor that was risking damage to our disk
drives.  Apparently, their computer had us erroneously scheduled for fewer
cleanings than our contract called for.  

Of course, it was likely just a data entry error.  Still, it makes me wonder -
what does their computer have against our computers?!

- Nancy Garman
  NGarman@VENERA.ISI.EDU

------------------------------

Date: Wed 10 Dec 86 17:38:57-PST
From: Peter G. Neumann <Neumann@CSL.SRI.COM>
Subject: "One of the Worst Days Ever for Muni Metro, BART"
To: RISKS@CSL.SRI.COM

(From an article by Harry W. Demoro and Carl Nolte in the San Francisco
Chronicle, 10 December 1986, p. 2.)

"... doors, signals, switches, brakes, and even a speedometer broke."

The worst mess began at 6 a.m. when an electrical short circuit caused
a "ghost train" to appear on the signaling equipment that guides Muni
Metro streetcars in and out of the Embarcadero station.  [This prevented
the switches from working automatically.]  Muni troubleshooters did not
eliminate the "ghost train" until 8:14 a.m...

By that time BART was a mess...  [for the second morning in a row]  At 5:10
a.m., a train broke down at the Richmond Yard.  Then at 7:10 a.m., the
switches that route trains through the MacArthur station in Oakland stuck,
creating a bottleneck because two lines converge there.  [Workers used hand
cranks.] The problem was fixed at 9:07 a.m.  Also at 9:07, switches stuck at
the Daly City station (18 miles away).

At 7:14 a.m. a door stuck open at MacArthur.  At 7:25 a.m. a train was taken
out of service because brakes locked for no apparent reason.  At 7:33 a.m. a
train stalled when the door stuck.  At 8:04 a.m. another train broke down in
the repair yard.  At 8:38 a.m., a train refused to budge because of a stuck
door.  At 9 a.m. the speedometer stuck on a train, which had to be sidetracked.
At 10:28 a.m. another train was stalled by a stuck door.  The problem
"finally cleared itself up at noon," said a spokesman.  [Bad Car-ma resolved?]

Things have been fairly smooth for BART and Muni Metro for some time.  I
don't recall BART having had such a disastrous day in the past six years.
(See Software Engineering Notes 6 1, January 1981.)  The "ghost train"
problem had plagued Muni Metro in its early days, but I had not heard about
it recently.  (See Software Engineering Notes 8 3, July 1983.)

Although the computer systems were not implicated, this "bad day" serves
to remind us that when we plan for things not going well, we need to plan
for things going REALLY BADLY.  

------------------------------

Date: Tuesday,  9 Dec 1986 11:07:53-PST
From: jong%derep.DEC@decwrl.DEC.COM  (Steve Jong/NaC Pubs)
To: risks@csl.sri.com, risks%derep.DEC@decwrl.DEC.COM
Subject: Korean Air Lines Flight 007

In his book "The Target Is Destroyed" (1986), Pulitzer Prize-winning writer
Seymour M. Hersh strives to explain why Korean Air Lines Flight 007 flew
serenely over the Soviet Union to its doom on September 1, 1983.  Since none
of the crew survived and the flight recorders were never recovered, any
explanation is highly conjectural, but he presents the arguments of a
veteran pilot, one who has flown that route many times.  After exhaustive
study, drawing on his knowledge of how flight crews work with their
equipment, the pilot concluded that a combination of human errors caused the
navigational snafu.  One postulated contributing factor was the crews'
well-known blind faith in the plane's inertial navigation system (INS).

This triply-redundant, highly accurate system flies the plane automatically
once coordinates are entered.  The crew enters starting, ending, and
"waystation" coordinates into each of the three components.  If there is an
entry error, or if the plane seems off course, an alarm sounds.

The full scenario is too complex to cover here, but the gist of it
is that a crew member fat-fingered the "you are here" coordinates.

How is it a RISK?  Consider the anecdotal evidence of other flights:

o	Crews place complete faith in INS.  They don't have to fly the
	plane, and sometimes have been known to nap in the cockpit.

o	Crews trust INS more than their radar.  The pilot who
	developed this scenario said if the KAL crew looked at
	their radar and saw the Kamchatka Peninsula where there
	should have been open ocean, they probably shut off the
	radar, because the INS was functioning normally.

o	The INS is so sensitive that if the plane strays down the
	wrong taxiway, it sounds off.  Crews will shut off the alarm.

o	Entry errors are common on long flights, because crews must
	enter three sets of ten coordinates (over a hundred numbers).

o	Though it is strictly against airline policy to do so, at
	the touch of a button the crew can "autoload" coordinates
	from one INS to another.

If you accept the scenario, 269 people died at least partly because of blind
faith in computers and a tedious interface that was too easily circumvented.
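
To make the cross-checking idea concrete, here is a minimal sketch of how
independently keyed waypoints in three INS units might be compared before
engaging the autopilot, instead of "autoloading" one unit into the others.
The function names, data layout, and tolerance are illustrative assumptions,
not a description of any real INS interface.

  # Hypothetical sketch: cross-check waypoints keyed independently into three
  # INS units; flag disagreements instead of trusting any single entry.
  TOLERANCE_DEG = 0.01   # illustrative limit on allowed disagreement

  def cross_check(ins_a, ins_b, ins_c):
      """Compare three independently entered waypoint lists; report mismatches."""
      problems = []
      for i, (a, b, c) in enumerate(zip(ins_a, ins_b, ins_c)):
          for pair, (lat1, lon1), (lat2, lon2) in (("A/B", a, b),
                                                   ("A/C", a, c),
                                                   ("B/C", b, c)):
              if abs(lat1 - lat2) > TOLERANCE_DEG or abs(lon1 - lon2) > TOLERANCE_DEG:
                  problems.append((i, pair))
      return problems

  # Example: the second waypoint was fat-fingered into unit B (139 E vs. 149 E).
  unit_a = [(37.0, 127.0), (45.0, 149.0)]
  unit_b = [(37.0, 127.0), (45.0, 139.0)]
  unit_c = [(37.0, 127.0), (45.0, 149.0)]

  for index, pair in cross_check(unit_a, unit_b, unit_c):
      print("Waypoint %d: units %s disagree -- re-enter before engaging autopilot"
            % (index, pair))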

    [Reminder:  There are quite a few books on this subject.  Each tries to
     justify its own theory, but all seem to come to somewhat different
     conclusions.  PGN]

------------------------------

Date: Mon, 8 Dec 86 01:45:09 PST
From: fetrow@entropy.ms.washington.edu (David Fetrow)
To: risks@CSL.SRI.COM
Subject: Plug Compatible Modules; Criminal Encryption

 Item 1: Plug Compatible Modules

 Concerning the nurse who accidentally electrocuted a little girl by plugging
AC power into heart-monitor electrodes at Seattle's Children's Orthopedic.

 AP gave the impression that the heart-monitor plug was like a wall plug.  This
was not the case: the heart-monitor plug consisted of three simple metal probes
(like those on an ohmmeter).  They were accidentally plugged into the slots of
the female end of an AC extension cord, which resembled the unit the probes
should have been attached to.  The solution doesn't change: make unique plugs
for everything around an ICU patient.  (Source: KING TV on-camera interview
with a hospital administrator.)

 Item 2: Criminal Encryption

 I remember reading in the Seattle Times a couple of years ago about a computer
expert who encrypted his kid-porn information on a disk.  The police had a
warrant for his files but couldn't crack the encryption.  They turned to a
hacker who tried the "decrypt" command without a key.  It worked; the
evidence was admissible.  [No documentation for this one, though.]

------------------------------

Date: Fri, 5 Dec 86 17:11:15 pst
From: weitek!mae@decwrl.DEC.COM (Mike Ekberg)
To: RISKS@csl.sri.com
Subject: More on skyscraper control

One of the buildings in Boston is indeed balanced by computer. I think it
is the John Hancock Building.

At any rate, the building was designed by I.M. Pei, a rather famous architect.
It was one of the first buildings ever built as a parallelepiped with
non-90-degree angles.  The skin of the building is almost solid glass.  Soon
after the building was finished, glass sheets began falling off the building
onto the plaza below.  (I don't know if anybody was squashed.)  This only
occurred when the wind blew.

An aeronautical engineer at nearby MIT found out why the glass fell.  He
modelled the building as a vertical aircraft wing fixed at the bottom end.
When the wind blows, the wing (building) generates lift on one side.  The
upper part of the building twists, altering the window dimensions and causing
the glass to fall.

The solution was to install, on an upper floor, a large weight controlled by
computer.  When the computer detects the building twisting, it counters the
torque by moving this weight.
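
For readers unfamiliar with such active counterweights, the control idea can
be sketched as a toy proportional loop.  The gain, travel limit, and units
below are invented for illustration; they are assumptions, not figures from
the actual building.

  # Toy sketch of an active counterweight controller (illustrative only;
  # the gain, travel limit, and sensor readings are invented assumptions).
  GAIN = 0.5          # metres of weight travel commanded per degree of twist
  MAX_TRAVEL = 2.0    # mechanical limit on weight travel, in metres

  def counterweight_position(measured_twist_deg):
      """Return a weight offset that opposes the measured twist."""
      command = -GAIN * measured_twist_deg               # move opposite the twist
      return max(-MAX_TRAVEL, min(MAX_TRAVEL, command))  # respect travel limits

  # Simulated gust: twist readings sampled over time (degrees)
  for twist in (0.0, 0.4, 1.1, 0.7, -0.3):
      print("twist %+.1f deg -> move weight %+.2f m"
            % (twist, counterweight_position(twist)))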

In addition, all the glass in the building was replaced with a type more 
resistant to the effects of the building being twisted. The new glass has the
property that its optical characteristics are significantly modified 
when the panes are twisted.  In periods of high wind, spotters near the
building can monitor its status by looking at sections of glass through
binoculars.

mike   {turtlevax,cae780,pyramid}!weitek!mae

PS Most of this was related to me by a structural engineer living in Boston.

------------------------------

Date: Mon,  8 Dec 86 20:06:02 est
From: jc37@andrew.cmu.edu (James D. Carlson)
To: risks@csl.sri.com
Subject: Satellite interference

  From Lauren Weinstein:

  > While misaimings and such are fairly common, they rarely result in total
  > capture of a satellite transponder, since it takes considerable power to
  > completely override the main signal.  In the case of the described 
  > problem with CNN Headline News, the explanation is almost certainly very 
  > simple and has nothing whatever to do with interfering signals.

Unfortunately, uplink signals are usually fairly weak, about one watt,
since they use very narrow beams.  The uplink is also frequency
modulated, which means that another signal only 1 dB stronger, aimed in
the same direction, will take over the satellite's receiver.
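
The arithmetic behind that figure is just the standard decibel conversion; a
short sketch follows.  The 1 dB capture margin is the figure quoted above,
and the example power levels are illustrative assumptions.

  # Decibel arithmetic behind the "1 dB stronger" claim (illustrative sketch).
  import math

  def db_ratio(p_interferer_watts, p_legitimate_watts):
      """Power ratio of interferer to legitimate uplink, in decibels."""
      return 10 * math.log10(p_interferer_watts / p_legitimate_watts)

  CAPTURE_MARGIN_DB = 1.0   # margin quoted in the posting
  legit = 1.0               # ~1 W legitimate uplink, as stated above

  for interferer in (1.0, 1.26, 2.0):
      margin = db_ratio(interferer, legit)
      print("%.2f W interferer: %+.2f dB -> captures receiver: %s"
            % (interferer, margin, margin > CAPTURE_MARGIN_DB))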

------------------------------

Date: Mon, 8 Dec 86 20:41:18 est
From: Richard Outerbridge <outer%csri.toronto.edu@RELAY.CS.NET>
To: risks@CSL.SRI.COM
Subject: (Il)legal Encryption

In >The Codebreakers< David Kahn tells of several cases involving crooks,
codes and evidence, but none with Fifth Amendment implications.  A related
issue is high-order homophonic and "subliminal channel" coding, which are
capable of conveying two (or more) legitimate messages depending on the
key employed: using Key A out pops Grandma's secret recipe for marzipan;
use Key B and out pops the chemistry of the latest designer drug.  Even
were I legally compelled to divulge my keys, if the analyst can't find
'Key B' how can he prove that I haven't complied by revealing 'Key A'?
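
The two-keys, two-messages property is easy to demonstrate with the simplest
possible construction, a one-time pad.  The toy sketch below is not a real
homophonic or subliminal-channel scheme, just an illustration of the
deniability such schemes afford.

  # Toy deniable encryption: one ciphertext, two keys, two plaintexts.
  import os

  def xor(a, b):
      return bytes(x ^ y for x, y in zip(a, b))

  innocent = b"Grandma's marzipan recipe...."
  secret   = b"Designer drug chemistry notes"   # padded to the same length

  key_a = os.urandom(len(innocent))    # Key A is random
  ciphertext = xor(innocent, key_a)    # under Key A, the innocent text emerges
  key_b = xor(ciphertext, secret)      # Key B is chosen so the SAME ciphertext
                                       # yields the second message

  print(xor(ciphertext, key_a))        # Grandma's marzipan recipe....
  print(xor(ciphertext, key_b))        # Designer drug chemistry notes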

------------------------------

Date: 10 December 1986 0824-PST (Wednesday)
From: thode@nprdc.arpa
To: RISKS@CSL.SRI.COM
Subject: Software article in _Computer Design_

There is an article in a recent issue of _Computer Design_ magazine (the
November 15 issue) titled "Approaches to Software Testing Embroiled in
Debate."  Its author is William E. Suydam.  It covers a lot of the same
ground as some of the contributions to this list.  Quotations from David
Parnas, Nancy Leveson, and others are included.  It seems, from my inexpert
perspective, to be a decent summary of the problems in software reliability.

--Walt Thode (thode@NPRDC)

------------------------------

Date: Wed 10 Dec 86 17:31:32-PST
From: Peter G. Neumann <Neumann@CSL.SRI.COM>
Subject: Heavy metal and light algorithms
To: RISKS@CSL.SRI.COM

Dave Parnas called my attention to an oversimplification in which I indulged
when I noted in RISKS-4.20 that "an algorithm is an algorithm is an algorithm."
(This was in connection with Steve Gutfreund's note on encoding algorithms in
"smart metals".)

Indeed, Dave is right in suggesting that "the metal algorithm would be, to a
very useful approximation, a continuous function or at least piecewise
continuous with very few points of discontinuity.  As such it could be much
more easily analyzed and studied than its counterpart as a digital computer
program."

This raises interesting questions about the relative precision, accuracy,
and soundness of "metal algorithms" and comparable analog devices in
general.  The situation is somewhat akin to that of higher-level programming
languages.  Perhaps one is less likely to make low-level design and program
errors in the directly-implemented analog case, but it is of course still
possible to choose the wrong model.

------------------------------

To: risks@csl.sri.com
Subject: Suit against Lotus dropped
Date: Wed, 10 Dec 86 13:15:08 EST
From: Bill Sommerfeld <wesommer@ATHENA.MIT.EDU>

The following article may be of interest to Risks readers; it is from page
81 (the first page of the business section) of the Boston Globe of 10 Dec 86.

	Lawsuit charging errors in Lotus software dropped
               By Ronald Rosenberg, Globe Staff

  A case seen as a test for settling responsibility when computer software
  fails was dropped yesterday, a victory for an industry that had stood by
  nervously as the issue made its way to court.

  James A. Cummings, Inc., a Florida construction firm, yesterday ended
  its suit against Lotus Development Corp. of Cambridge [Massachusetts],
  a lawsuit that the industry feared could open the door to a host of
  liability claims against software developers.

  In its suit, Cummings had charged that errors in the Lotus software caused
  it to underbid on, and lose, a contract.  The company had sought $254,000 in
  damages from Lotus, a leading maker of personal computer software.

  If the case had gone to trial, it would have been the first to question
  whether a supplier of software tools, such as Lotus, is liable for wrong
  information produced by users of its programs.

  Neither Cummings nor its attorney, John R. Squitero of Miami, will receive
  anything from Lotus.  Squitero, who talked openly about the case last
  summer, refused to comment yesterday.

  Lotus said that under the termination agreement, Squitero and James A.
  Cummings, president of the Fort Lauderdale contracting company, agreed not
  to discuss [the] case.

  ``Lotus is pleased that this attack upon the integrity of Symphony, one
  [of] our leading products, has ended with the complete vindication of both
  Symphony and Lotus,'' said Jim P. Manzi, Lotus chairman and president.

  Squitero had expected to fly to Boston yesterday to take depositions from
  Lotus employees. Late Monday evening, Squitero decided to throw in the towel.

  ``I think they (Cummings and Squitero) hoped that there would be a financial
  settlement by now, and we persuaded them that we would never settle -- not a
  penny,'' said Hank Gutman, an attorney with O'Sullivan, Graev and Karabell
  of New York, which represented Lotus.

  Peter Marx, general counsel to the Information Industry Assn., a trade
  organization for 500 software and computer companies including Lotus,
  applauded the dismissal:

  ``Our fear was that as long as the case was hanging out, it might have
  encouraged creative lawyers to file suits that have no merit.'' 

------------------------------

End of RISKS-FORUM Digest
************************
-------