[comp.risks] RISKS DIGEST 6.58

RISKS@KL.SRI.COM (RISKS FORUM, Peter G. Neumann -- Coordinator) (04/12/88)

RISKS-LIST: RISKS-FORUM Digest   Monday 11 April 1988   Volume 6 : Issue 58

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS 
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Computers are a drain on police cruisers (Mark Brader)
  What happened to personal responsibility? (George Michaelson)
  Re: Intolerant Fault-Tolerance (Tom Lane)
  Another Security Clearance Story (Ronald J Wanttaja)
  A new VMS security hole? (Jonathan Corbet)
  Re: Notifying users of security problems (John O. Rutemiller, William Smith)
  April Fool's Warning (Piet Beertema)
  Viruses (Fred Cohen)
  Virus Distribution (Peter G. Rose)
  Re: The "(c) Brain" virus is not a new virus.  (Rob Elkins)
  There is a VT220 with block mode available from DEC.  (David E A Wilson)
  Enfranchising the disenfranchised: our responsibility? (Tom Betz)
  Discrimination and careless arguments (David Thomasson)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in good
taste, objective, coherent, concise, nonrepetitious.  Diversity is welcome. 
Contributions to RISKS@CSL.SRI.COM, Requests to RISKS-Request@CSL.SRI.COM.
  For Vol i issue j, FTP SRI.COM, CD STRIPE:<RISKS>, GET RISKS-i.j.
  Volume summaries in (i, max j) = (1,46),(2,57),(3,92),(4,97),(5,85).

----------------------------------------------------------------------

Date: Sun, 10 Apr 88 16:26:24 EDT
From: msb@sq.com (Mark Brader)
Subject: Computers are a drain on police cruisers

Abridged from an article by Jack Lakey in the Toronto Star, April 9, 1988.

Booster cables are fast becoming the Metro[politan Toronto] police officer's
best friend.

Deputy Chief Bill McCormack says the force is having problems with batteries
dying in some police cruisers equipped with both computers and radios.
However, McCormack denied that the dead batteries are compromising the ability
of the force ... to respond quickly to emergencies.  "We always have cars
available on an emergency basis...  But I quite agree that it's a problem and
some of the equipment we have in place is the cause of it."

The computer and radio draw power from the battery when the cruiser isn't
running. ...  At the scene of a murder ... last Sunday, a Star reporter watched
as two cruisers -- both Plymouth Caravelles -- needed boosts from other police
vehicles to get started.  An officer driving the second vehicle said many
cruisers equipped with radios and computers were having similar problems. ...
the force uses heavy-duty batteries in the cars.  Two suspects in the murder
were arrested ... another Star reporter watched police boosting a third cruiser
that wouldn't start.

Police have noticed the problem is most common to a particular year and model,
but McCormack didn't want to identify the manufacturer.  "We're going to be
consulting with the manufacturer on finding a way to upgrade the power to the
battery ... But we are finding that it happens with the older cars or spares."
Allan Gibb, fleet administrator for the ... Metro police, said the power drain
was an "operational problem" common to any vehicle heavily equipped with
electronics.  However, police departments in Winnipeg and Calgary, where most
cruisers are equipped with computers, said they haven't had any problems
maintaining battery power.

Abridged and posted by Mark Brader

------------------------------

Date: 09 Apr 88 12:20:35 +1000 (Sat)
From: munnari!ditmela.oz.au!george@uunet.UU.NET
Subject: What happened to personal responsibility?

I've just finished re-reading L.T.C. Rolt's classic 'Red For Danger', in an
updated edition with new text by G. Kichenside (alas, Rolt died in 1974; this
edition was published by PAN in 1986).

This book should be required reading for all comp.risks readers.
Technology *always* brings risks, and how we dealt with those we faced
yesterday lends weight to how we could/should approach similar problems
today.

Reading the book I was struck by how often Rolt said (obvious BNF paraphrase):

     "...at the board of enquiry the {<driver>|<guard>|<signalman>|<other>}
      was rightly found innocent of the charge of manslaughter due to
      {<extenuating circumstances>|<genuine act-of-god>} ..."

Rolt also highlights the problems the Inspectorate faced in persuading
Railway Companies to adopt basic working practices, or to upgrade equipment
to be in line with current practice. Then as now there was an ongoing design
cycle: existing systems were being kludged to work under increasing loads
and beyond design parameters.

	(1) even though "hardware" failed in many cases (a bolt sheared,
	    a retaining wall collapsed, a signal was not visible), SOMEONE
	    stood up and said "yes, it was my responsibility". They could
	    be "let off", but it wasn't possible to say "the machine was
	    to blame".

	(2) *when* hardware failed, "rules" were changed to reflect
	    experience, designs were re-worked, and fail-safe factors were
	    increased. The problem was getting $-obsessed people to become
	    accountable for the consequences of $-based decisions.

	(3) "software" (I think the rules governing single-track usage
	    and point-settings, and even how to behave when a train stops
	    in an emergency, can be described as programs, can't they?) was
	    very often to blame, but usually the problem lay with human
	    "interpretation" of the rules, or conflict in their application.

Railways are among the most public users of computer-based systems we have.
To sit in a 30mph steam train was not only a joy; you also placed your life
in the hands of engineers who were ultimately accountable. To sit in a
125mph bullet train or a high-speed local subway is no longer quite so
joyful. You *still* place your life in the hands of the company, but is it
the engineers, software or otherwise, who carry the can?

	-George Michaelson

ACSnet:	G.Michaelson@ditmela.oz.au	
Postal:	CSIRO, 55 Barry St, Carlton, Vic 3053
Phone:	(03) 347 8644 				Fax:	(03) 347 8987

------------------------------

Date: Sat, 09 Apr 88 19:26:19 EDT
From: Tom.Lane@ZOG.CS.CMU.EDU
Subject: Re: Intolerant Fault-Tolerance

Here's another "amen" to Tim Mann's and Jerome Saltzer's comments about
reporting faults masked by fault-tolerant systems (RISKS 6.53, 6.54).
I have another example to add to the list.

  For the past several weeks, considerable net bandwidth in the Usenet newsgroup
comp.sys.hp has been devoted to discussion of a posted set of benchmark
numbers, which allegedly demonstrated that a certain new HP machine was not 3x
faster than its predecessor (as claimed by HP), but actually more like 15x
slower.  Other people were unable to duplicate the original poster's results.
It eventually emerged that the machine he tested had a bad floating-point
processor.  The operating system detected this at bootup and *silently*
installed software emulation traps for all the floating-point instructions...
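
The general pattern is easy to state: masking a hardware fault keeps the
machine usable, but masking it *silently* hides exactly the evidence that
would explain later anomalies.  Below is a minimal sketch of that pattern in
Python, not anything resembling the real HP or VMS boot code; fpu_self_test,
hardware_multiply, and emulated_multiply are illustrative stand-ins.

    import logging

    def fpu_self_test():
        # Stand-in for the power-on check that found the bad FP processor.
        return False

    def hardware_multiply(x, y):
        return x * y            # fast path, unusable on the faulty machine

    def emulated_multiply(x, y):
        return x * y            # correct results, but far slower in real life

    def select_multiply():
        if fpu_self_test():
            return hardware_multiply
        # The fallback is reasonable fault-tolerance; the risk lies in doing
        # it without telling anyone.  One log line at bootup would have
        # explained the "15x slower" benchmark numbers immediately.
        logging.warning("FPU self-test failed; using software FP emulation")
        return emulated_multiply

    multiply = select_multiply()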

                 				tom lane
BITNET: tgl%zog.cs.cmu.edu@cmuccvma
UUCP: <your favorite internet/arpanet gateway>!zog.cs.cmu.edu!tgl

------------------------------

Date: Thu, 7 Apr 88 23:10:30 pst
From: ames!uw-beaver!ssc-vax!wanttaja@ucbvax.Berkeley.EDU (Ronald J Wanttaja)
Subject: Another Security Clearance Story 

It's one thing to be truthful on a background questionnaire for a security
clearance.  It's entirely *another* thing when nothing in the chain...
human or computer... actually LOOKS at the data input!

This, I'm afraid, is one of those "friend of a friend" stories you hear in
the high-security world.  But it certainly seems possible.

A person was filling out the background investigation form for a clearance,
when he came upon the question,

"Have you or any member of your family ever attempted to
overthrow the government of the United States?"

The guy thought about it for a while.  Then answered 'yes.'

Months went by.  His questionnaire routinely traveled up the chain.
Finally, someone noticed his response to that question.  Down came the
FBI, hauling our hero off to one of those high security interview rooms:

"Why did you answer yes to that question?"
"Because it's true."
The Feds leaned closer and invited him to explain.

"It's true.  My great-great-grandpappy fought for the South during the War
Between the States."  (Civil War to you Yankees.)

They changed the question to read "Have you or any member of your IMMEDIATE
family..."

                        	  Ron Wanttaja

------------------------------

Date: 9 Apr 88 17:25:40 MST (Sat)
From: gaia!jon@husc6.harvard.edu (Jonathan Corbet)
Subject: A new VMS security hole?

It was interesting to read Andy Goldstein's remarks on DEC's new policy on
security patches.  Such a policy was certainly needed after the delays
associated with the SECURESHR problem last fall.

What made it more interesting, though, was the arrival, via Federal Express, of
another one of those urgent-install-it-right-now patches from DEC yesterday
morning.  Yes, it is another security patch, but this time, nobody seems to
have heard about the hole yet.  It looks to me like DEC is living up to its
word on this one.  Good news.

But, now that the cat is out of the bag, does anybody out there know what the
situation is?  This patch contains about 1200 blocks of stuff -- lots of fixes!

Jonathan Corbet
National Center for Atmospheric Research, Field Observing Facility

------------------------------

Date:  Fri, 8 Apr 88 10:43 EDT
From: "John O. Rutemiller" <Rutemiller@DOCKMASTER.ARPA>
Subject:  Re: Notifying users of security problems

I believe the procedure outlined by Andy Goldstein in RISKS 6.54 is a sound way
to manage the problem, with acceptable compromises.  Eric Postpischil's
workarounds in RISKS 6.56 fail to take a couple of points into consideration.

> Almost every site has a big red button (or equivalent) that will make
> any computer system secure, and a very few sites might have information
> sensitive enough to warrant the button's use.

This big red button may do wonders in keeping intruders out of the system, but
it also prevents users who NEED to do work from accessing the system.  Also,
your reasoning seems to indicate that the system should stay in this state
until the security flaw is fixed.  How long will that be?  Can your site stand
to be down for one month or maybe longer?

> For another few sites, the VMS software might be only a portion of
> their computing resources, a portion they can do without or with
> limited use for a period.

Emphasize the word FEW.

> Many sites can restrict accounts, temporarily removing general
> accounts.

Removing general accounts (the existence of which has its own problems)
does not prevent attacks from those with valid accounts.

> And other sites may be satisfied with establishing some sort of
> auditing procedures so they can tell who is being naughty and stop it
> if not prevent it beforehand.

If you simply announce that a flaw exists without a fix, the most that
auditing will tell you is that you have just been had. :-)

John Rutemiller

------------------------------

Date: Fri, 8 Apr 88 15:24:23 cdt
From: wsmith@m.cs.uiuc.edu (William Smith)
Subject: Re: notifying users of security bugs 

  >From: goldstein%star.DEC@decwrl.dec.com (Andy Goldstein)
  >Subject: Re: Notifying users of security problems

  >So we get to the critical distinction between security bugs and
  >others: Because invocation of a security bug requires a deliberate,
  >unusual action, a security bug is only harmful to an installation when
  >malicious users gain knowledge of the bug. 

This is patently false.  If my Unix kernel has a security bug that lets anyone
delete a file owned by root, and I *accidentally* (not maliciously) type rm *
while I happen to be in /, I will have invoked the security bug and forced the
sysadmin to reload the system.  Or, if I misspell a command and execute a
different command that causes the system to crash, the security bug is still
harmful, but I am not malicious.

You need to protect the system from inadvertent misuse by normal users as much
as you need to protect it from malicious users.  Each system administrator
should have the right to decide which set of users is more prevalent at his or
her site and act accordingly.  Some sites require their administrators to be as
paranoid as you are suggesting.  Other sites can fire or remove the accounts of
malicious users and do not need to be paranoid.  A simplistic model of the risk
of system security bugs says that (bug + malicious-user) => danger.  A more
accurate analysis would also say that (bug + hapless-user) => danger.  How
likely a user is to stumble over the bug is also a factor.  To fix an obscure
bug may not be worth the risk of breaking the operating system when the fix is
installed.

Bill Smith     pur-ee!uiucdcs!wsmith    wsmith@a.cs.uiuc.edu

------------------------------

Date: 10 Apr 88 18:47:13 EDT (Sun)
Subject: Viruses
From: fc@ucqais.uc.edu (Fred Cohen)

For details on theory of computer viruses, call Fred Cohen (513)475-6575

We can detect all viruses, but we cannot decide whether or not a given program
is infected. That is, if we flag all files as suspected of containing viruses,
we catch all viruses. Whether or not a program contains a virus is undecidable
(i.e., we cannot write a program that determines, correctly and in finite time
in all cases, whether or not another program contains a virus). I suspect that
the Israeli defense is useless against most of the viruses we have done
experiments on - I wish I were on the attacker's side of that bet!!! - FC
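
For readers wondering why the detection problem is undecidable, here is a
minimal sketch of the standard diagonal argument, in Python rather than
Cohen's own notation; is_virus and spread are hypothetical names introduced
only for this illustration.

    def is_virus(program):
        # Suppose this were a perfect decider: always correct, always halts.
        raise NotImplementedError("the construction below shows it cannot exist")

    def spread():
        # Placeholder for "infect other programs".
        pass

    def contrary_program():
        if is_virus(contrary_program):  # decider calls this program a virus...
            return                      # ...so it does nothing harmful
        spread()                        # decider calls it clean... so it spreads

    # Whichever answer is_virus gives about contrary_program is wrong, so no
    # such decider can exist.  The best a defender can do is over-approximate:
    # flag every file as a suspect and accept false positives, as noted above.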

------------------------------

Date: Mon, 11 Apr 88 16:28:13 +0200
From: mcvax!cwi.nl!piet@uunet.UU.NET (Piet Beertema)
Subject: April Fool's Warning (Re: RISKS-6.55)   [The last word was the first!]

	>Subject: April Fool's warning from Usenet
	>From: spaf@cs.purdue.EDU (Gene Spafford)
	>      ==================================
	>Date: 1 Apr 88 00:00:00 GMT
	       ^^^^^^^^^^^^^^^^^^^^^

          [Piet points out that the key line that I inadvertently deleted
          -- and already noted so doing -- was the path:]

... which contained ...!kremvax!perdue!spaf
(kremvax was one of the sites warned about!).

    [Piet of course is famous as the perpetrator of the Chernenko hoax four
    years ago.  That was the Ur-hoax and deserves many kudos.  RISKS has
    received quite a few queries from neophytes who were not around on 1 April
    1984.  They may find the message "from" mcvax!moskvax!kremvax!Chernenko and
    the delightfully annotated ensuing responses in their entirety -- including
    all of the header stuff! -- in ACM SIGSOFT Software Engineering Notes vol 9
    no 4, July 1984, pp. 6-8.  Or ask Piet if he still has it on line.  PGN]

------------------------------

Date:    Mon, 11 Apr 88 19:27 EDT
From: EAE114%URIMVS.BITNET@MITVMA.MIT.EDU
Subject: Virus Distribution

Will Martin's fears about a possible virus in a 'computer interview' seem a
little overblown to me.  In the first place, putting your address on your virus
sounds like a good way to get yourself in serious trouble. (How hard/easy is
it to trace someone through a mailing address?  Does the Postal Service have
ANY verification?)  Anyway, if you're really concerned about a diskette, just
park the head on your hard disk, or pull its cable, or whatever, and run the
diskette in isolation.  Then just be sure to power down before you do anything
else.

I've heard rumors that the Macintosh OFF switch only pretends to power down, so
maybe this won't work.  Is this true?  If so, why does Apple do that?

Peter G. Rose

------------------------------

Date: 11 Apr 88 14:42:47 GMT
From: Rob Elkins <relkins@vax1.acs.udel.edu>
Subject: Re: The "(c) Brain" virus is not a new virus.  (RISKS-6.55)

  >It is a basically harmless virus which first emerged ...

That may not be exactly true.  From reading RISKS extensively, it seems to me
that the command.com virus may not be harmless.  It may have "evolved" since
its discovery into something more harmful, and I remember reading that it had a
sort of date trap set for Friday the 13th.  It is still in your best interest
to copy the data on any infected disks onto fresh disks and reformat the
infected disks.

Rob Elkins

BITNET: FFO04688@UDACSVM   UUCP: ...!sun!vax1.acs.udel.edu

------------------------------

Date: 11 Apr 88 05:44:39 GMT
From: munnari!uowcsa.oz.au!david%uowcsa.cs.uow.oz.OZ@uunet.UU.NET (David E A Wilson)
Subject: There is a VT220 with block mode available from DEC. (Re: RISKS-6.52)
Organization: Uni of Wollongong, NSW, Australia

	Jerry Leichter is not quite correct in saying that NO VT220 made
by DEC has BLOCK mode. In Australia, DEC modified the standard VT220 to
create a VT220-Z (VT220 + VT131/2 block mode) as a special for the
New South Wales Department of Health. They then also made it available
to anyone else who wanted to buy it. Whether or not this has the security
hazard described in RISKS 6.51 I cannot tell as I no longer work for the
NSW Dept of Health.

David E.A. Wilson		ACSnet:	david@uowcsa.oz
Dept. of Computing Science	UUCP:	...!munnari!uowcsa.oz!david
Uni. of Wollongong		ARPA:	david%uowcsa.oz@uunet.UU.NET

------------------------------

Date: 2 Apr 88 10:47:02 GMT
From: cmcl2!phri!dasys1!tbetz@rutgers.edu (Tom Betz)
Subject: Enfranchising the disenfranchised: our responsibility?

Kim Greer writes:

  > ... if someone does not like the state they are in they should do
  > something to change it.  

I would go so far as to say that there is nobody who can use a VCR who cannot
use a computer for >something<.

  > ... If someone is "disenfranchised" from using computers because they 
  > can't read, let them learn how to read.

Unfortunately, much easier said than done.  Computers can, however, be a
useful tool for aiding the teaching of reading/writing.
     
  >... But people are generally able to do anything they really want to...

     I know, through a skills training program for welfare mothers living in
motels because they have no other home, of one woman who has managed to buy
her kids an Adam, a C=64, and a TI-99A, all on a welfare budget.  This
woman, though a part of the most disenfranchised classes in America today,
has obtained (leaving aside for the moment the question of whether or not
she uses them for this purpose) the tools to join this peculiar Republic we
here are a part of, using a very powerful lobbying tool to guide our elected
officials.  Recent proof of the power of this medium is the defeat of the
FCC's connect charge for computer systems.

A question I would find most interesting to discuss here would be the 
question of this Republic within the Republic.  How are the lives of those 
who are too ill-educated to use these tools effectively going to be affected 
by the increased power of those of us who >do< use them?

Do we have a responsibility to do whatever we can to spread the power around
to these people? How can we do this?  How can our computers help us help them?

     Serious questions....

Tom Betz                        {allegra,philabs,cmcl2}!phri\
Big Electric Cat Public Unix           {bellcore,cmcl2}!cucard!dasys1!tbetz
New York, NY, USA                               {sun}!hoptoad/         

------------------------------

Date:         Mon, 11 Apr 88 15:09:18 EDT
From:         David Thomasson <ST401405%BROWNVM.BITNET@MITVMA.MIT.EDU>
Subject:      Discrimination and careless arguments

In an earlier note I pointed out what I take to be weak points in some recent
RISKS items about discrimination. Literary critic John Lavagnino replied that
my complaints are "off the mark." As irony would have it, Lavagnino's reply
further substantiates my cautions about shoddy arguments.

  >In all the instances I recall, and particularly Les Earnest's,
  >nobody was talking about the question of what the ideal Motor
  >Vehicle Bureau should ask you on their application.

Nor was I. Explaining why he refused to reveal his race on a license
application, Earnest argued as follows (I paraphrase):  (1) Race has nothing
to do with driving a car. Therefore, (2) asking for an applicant's race isn't
justifiable. My point was not about ideal motor vehicle bureaus; it was about
logic: (2) doesn't follow from (1). The suppressed premise is: (1A) If X has
nothing to do with driving a car, then X cannot justifiably be put on a
license application. *If* one accepts that premise, then most of the
information on drivers licenses is unjustified:  name, address, color of eyes,
color of hair, etc. And this, of course, is patent silliness.

  >Les Earnest's was a *story* that told of becoming uneasy about certain 
  >classifications in the light of substantial evidence of their misuse.

No, Earnest did not give any evidence at all that racial information on
drivers license applications had been misused. He simply lumped this anecdote
in with others that did suggest such misuse.

  >Thomasson would have us ignore how information is used in society, and once
  >you do that then of course a discrimination's bad effects will often 
  >disappear from view; you can pretend that a Southern state, in the early 
  >60s, would ask about your race merely because it's useful for 
  >identification. Just because they can cite good reasons doesn't mean their 
  >real reasons aren't bad.

Rather than ignore such information, I would suggest that writers *present*
it. Here, Lavagnino confuses two separate actions:  gathering information, and
misusing information. Asking for race on a driver's license is, I suggest,
justified because it is useful in identifying the licensee. If the state then
uses that information for other (discriminatory) purposes, *that* action is
not justified and should be stopped. But one must not confuse the reasons that
justify including such information with its subsequent misuse. The state could
just as easily misuse information about one's address or age. It is this
*misuse* of information, and not the gathering of it, that is wrong. Careful
argument requires that such distinctions be made, especially on the overheated
topic of discrimination.

    [We are drifting from RISKS relevance, but this reply is still useful.  PGN]

------------------------------

End of RISKS-FORUM Digest
************************
-------