[comp.protocols.tcp-ip] Implications of recent virus

sean@cadre.dsl.PITTSBURGH.EDU (Sean McLinden) (11/06/88)

Now that the crime of the century has been solved and all of the
bows have been taken it is, perhaps, time to reflect a little more
on the implications of what has happened.

First of all, to the nature of the problem. It has been suggested that
this was little more than a prank let loose without sufficient restraint.
I have not seen the latest in press releases but there seems to be
a hint of "I didn't want anything like this to happen!" Perhaps not.
In fact, if the thing had not run wild and had not bogged down a number
of systems it might have gone undetected for a long time and might
have done much worse damage than our estimates suggest was done. I can
accept that the author did not anticipate the virulence of his creation,
but not that his regret is out of some benevolent concern for the users of
other systems. Rather, it is that the virulence allowed him to be caught.

In fact, with function names such as "des", "checkother", and
"cracksome", I am less likely to believe that the intent of this
program was one of simple impishness.

Let's look, for a moment, at the effects of this program (whether
intended or otherwise). First, it satisfied a public desire for news
and, one might argue, served as a reassurance to the many technophobes
out there that our systems are as vulnerable and error prone as they,
all along, have been arguing. If you don't think that this might have
social consequences, you need only look at how community bans
on genetic research have resulted from social policy implemented as
a result of public distrust. When I was interviewed by a local news
agency, the questions asked were on the order of "Does this mean that
someone could fix a Presidential Election?" (sure, Daley did it in
Chicago but he didn't use computers!), and "What implications does
this have for the nation's defense?" (In spite of reassurances from
here and CMU, the local media still insisted on the headline "Defense
Computers invaded by virus.")

Second, there is an economic consequence. Since we were unable to
determine the extent of the program's activities, we were forced to
commit programmers' time to installing kernel fixes, rebuilding systems,
checking user data files, and checking for other damage. That was
the direct cost. The indirect cost comes from the delay in other
tasks that was incurred by the diversion of people's time to solving
this one. If you multiply that by the effort that is going on at a number
of other sites, I suspect that in salary time alone you are looking
at costs into the hundreds of thousands of dollars.

Perhaps most importantly, there are the academic costs. I would argue
that the popularity of Unix today is due in great part to the
development of the Berkeley Software Distribution, which was made available
in source form to thousands of research and academic organizations starting
in the '70s. In a sense, it is a community-designed system and although
Berkeley deserves the lion's share of the credit, it was the contribution
of hundreds of users with access to source code that allowed the system
to evolve in the way that it did.

There is a cost to providing an academic environment and there are
responsibilities that are imposed by it. One advantage of academia is
access to information which would not be tolerated in an industrial
domain. This access requires our users to observe some code of behavior
in order to guarantee that everyone will have the same access to the
same information. The person who rips out the pages of an article from
a library journal is abusing this privilege of free access to information
and depriving others of the same. By convention, we agree not to do
that, and so we protect the system that has benefited us so that others
may derive the same benefit.

A great part of the Internet was funded by DARPA because some forward
thinking individuals recognized the tremendous technological and academic
benefits that would be derived from this open network. This has resulted,
I believe, in significant economic benefits to American industry and
continues to support our leadership role in software development. It is
an infrastructure that supports a gigantic technological community,
and there are very few, if any, computer interests in this country that
were not influenced by DARPA's experiment.

Within a week or two, members of the organizations responsible for this
network are going to be meeting to discuss the implications of the recent
virus(es), and mechanisms by which they can be dealt with. One possible outcome
would be increased restrictions on access to the network (the Defense
Research Network is already moving along these lines). It would not
be unreasonable to consider whether a venture such as this should be
supported, at all. To restrict access to a network such as this, or
to remove the network altogether, would be the economic equivalent
of tearing up the Interstate highway system. The effect on academic
and technological advancement would be quite serious.

The bottom line is that to suggest that a program such as this
"virus" (which is really more of a Trojan Horse) was little more
than a harmless prank is to overlook the long-term effects that
both the technology, and the PUBLICATION of that technology, will
have on continued academic freedom and technological growth.

But what of the nature of the act? Is there something to be said of
that? First, there is the personal tragedy, here. There is public
humiliation for the (supposed) perpetrator's father who is, himself,
a computer security expert (his employers must be questioning whether
the son had access to specialized information, though most of us realize
that the holes that were exploited were well known). There is the
jeopardy to the programmer's academic career. But there is more
than that.

There seems to be a real lack of consideration of the ethical
dimensions of this action. Consider, for a moment, that you are
walking down the street and the person in front of you drops a 10 dollar
bill. You have three options: 1) You can pick it up and hand it to them;
2) You can pick it up and keep it; 3) You can leave it and continue walking.
It should be obvious that these choices are not morally equivalent. To
have known about the holes in the system which allowed the virus in
(and even to have known how to exploit these), is NOT the same as actually
doing it (any more than leaving the bill on the sidewalk is the same
as pocketing it). Somewhere along the line, we fail ourselves and our
students if we don't impress upon them the need to regard the network
as a society with rights, responsibilities, and a code of professional
ethics which must be observed in order to preserve that society. There
are probably a few hundred people who could have written the code to
do what this virus did; most of those people didn't do it. Most, if
not all, of us have had the opportunity to pocket a candy bar from
the local convenience store, but most of us don't. We don't, not
because we will be punished or because there are laws against it,
but because we have a social consciousness which tells us that
such an action would, in the end, substantially degrade the
society in which we live.

What happened in this situation reflects not only a moderately
high level of programming sophistication but also a disturbingly
low level of ethical maturity.

If we tolerate those who view the network as a playground where
anything goes, we are going to be faced with serious consequences. But
the answer is not to change the character of the network (by increasing
restrictions and decreasing freedom of access), but to promote a sense
of character among the members of the community who work and experiment
in this network. This puts the burden on us to remember that we need
to encourage, teach, and provide examples of the kinds of behavior
that must be maintained in order to preserve the network.

Sean McLinden
Decision Systems Laboratory
University of Pittsburgh

peter@ficc.uu.net (Peter da Silva) (11/06/88)

One side effect that I don't like is that UNIX is taking the blame for
a combination of (1) a security hole in an application (sendmail), and
(2) deliberate loosening of security to trusted sites (rhosts, etc...).

Non-academic UNIX in general is a lot less open to techniques like this.
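
The trust loosening Peter refers to is the Berkeley r-command mechanism: hosts (and optionally users) listed in a per-account ~/.rhosts file, or system-wide in /etc/hosts.equiv, may rlogin or rsh in without supplying a password. A minimal sketch of the file format (the host and user names here are hypothetical placeholders):

```
# ~/.rhosts: one trusted entry per line. A bare hostname trusts the
# same-named account on that host; a hostname plus a username trusts
# that specific remote user. Both names below are made up.
trusted.example.edu
other.example.edu  alice
```

The worm took advantage of exactly this convention: having captured one account, it could often rsh into machines that account's trust files pointed at, with no further authentication.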
-- 
Peter da Silva  `-_-'  Ferranti International Controls Corporation
"Have you hugged  U  your wolf today?"     uunet.uu.net!ficc!peter
Disclaimer: My typos are my own damn business.   peter@ficc.uu.net

chend@beasley.CS.ORST.EDU (Donald Chen - Microbiology) (11/07/88)

In article <1698@cadre.dsl.PITTSBURGH.EDU> sean@cadre.dsl.PITTSBURGH.EDU (Sean McLinden) writes:
>Now that the crime of the century has been solved and all of the
>bows have been taken it is, perhaps, time to reflect a little more
>on the implications of what has happened.
>
	text deleted
>
>Let's look, for a moment, at the effects of this system (whether
>intended or otherwise). First, it satisfied a public desire for news
>and, one might argue, served as a reassurance to the many technophobes
>out there that our systems are as vulnerable and error prone as they,
>all along, have been arguing. If you don't think that this might have
>social consequences, you need only look at how community bans
>on genetic research have resulted from social policy implemented as
>a result of public distrust. When I was interviewed by a local news

Are you suggesting that the "public" does not have an interest and
responsibility to ask for suitable safeguards from what "they"
consider to be either dangerous or incompletely thought out?
Although people like Jeremy Rifkin have been nuisances to the
practical application of bio-engineered tools, they have also
caused investigators to more completely think out their studies,
AND have forced scientists to explain and defend their approaches
and tools to the people who ultimately fund their research.

>Second, there is an economic consequence. Since we were unable to
>determine the extent of the program's activities, we were forced to
>commit programmers' time to installing kernel fixes, rebuilding systems,
>checking user data files, and checking for other damage. That was
>the direct cost. The indirect cost comes from the delay in other

Perhaps I am foolish, but I feel some of the responsibility goes to
whoever left the debug option in sendmail, and to those who allow
promiscuous permissions in their systems.
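
As one concrete example of auditing for the "promiscuous permissions" mentioned above, a minimal sketch using the standard find(1) predicates (the /usr/local root is only an illustration; audit whichever trees matter on your site):

```shell
# List regular files under /usr/local that ANY user may write to
# (world-writable: the mode includes the o+w bit, octal 0002).
find /usr/local -type f -perm -0002 -print
```

The same idea extends to world-writable directories (-type d) and to setuid files (-perm -4000), two other common ways a system invites trouble.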

>
>If we tolerate those who view the network as a playground where
>anything goes, we are going to be faced with serious consequences. But
>the answer is not to change the character of the network (by increasing
>restrictions and decreasing freedom of access), but to promote a sense
>of character among the members of the community who work and experiment
>in this network. This puts the burden on us to remember that there
>is a need for us to encourage, teach, and provide examples of the
>kind of behaviors that we need to preserve in order to preserve the
>network.
>

You talk of personal responsibility -to oneself, to one's colleagues,
to one's community - and I heartily agree; however, you also talk of the 
burden we all have to somehow teach and instill in others that sense of
rightness which makes the net possible. This does not ensure that those
whom we teach will listen, and even if they do, that they will do it
right away. Perhaps there is an analogy to children who, though they
have been told to "do right", test the limits of their freedom, test the
extent of their personal strengths. We hope that through time and 
experience these children will grow to become an integral part of their
communities - but it takes time.
I do not wish to condone the actions of anyone who disrupts the net or
rips out pages from library books or trashes the environment in which we
all live. Although our site has not seen evidence for this particular
virus, it will, no doubt, be the victim of others. In that vein, we need 
to protect our site from the thrashings of either childish behaviour 
or cynical attacks. This means we treat our sites more protectively -
viz. the family heirloom - yet not so much that the growth and evolution
of the system are stifled.
I suspect that part of the openness and collegiality which we would like
pays its price in these attacks. We can only mute the number and
intensity of them.

Don Chen
Dept of Microbiology
Oregon State University

everett@hpcvlx.HP.COM (Everett Kaser) (11/09/88)

I would propose that there is a place (in our computer-network-society) for
persons attempting to write (non-destructive!) viruses.  There is no better
means of protecting ourselves from destructive viruses than to be constantly
testing ourselves with non-destructive ones.  Of course, there are two small
holes in this logic:  1) there may be a bug in your non-destructive virus
which turns it destructive, accidentally; and 2) non-destructive viruses may
not find all of the possible holes in the system, i.e., a destructive virus
may get into the system in a destructive way, which a non-destructive virus
would never find.

I feel that the risk of hole number 1 is worth the benefits.  If a few hundred
people KNEW about these holes in the system that were exploited by the 
recent virus, WHY WEREN'T THEY FIXED?  Making a "game" out of non-destructive
viruses would be analogous to the military's "war games":  try testing
your strategies and tactics in a non-destructive way BEFORE getting into
a destructive situation, and hopefully, in that way, cut your losses.

Perhaps a university or some other organization could be set up as a 
"clearing house" for virus tests.  Something along the line of:
   1) John Doe thinks he sees a hole in the security system.
   2) John creates a program to exploit that hole (in a non-destructive way).
   3) John takes that program (along with appropriate documentation) to the
      "clearing house".
   4) The "clearing house" would review it for possible destructive behaviour.
      (This would not be 100% proof that destruction wouldn't occur, but
       would make the likelihood of it much lower, and provides a means of
       "licensing" the virus author to do the test without alerting the
       defenders (sys-admins) that the test is going to be run.)
   5) The test is run, and if successful, all systems will be tightened to
      avoid future use of the hole.
Remember, appealing to people's sense of "morality" doesn't work.  There are
always terrorists and anti-social people who will behave amorally.  Either
we can strengthen our own defense, or wait for the terrorists to force us
to do it.

Everett Kaser
!hplabs!hp-pcd!everett

karl@triceratops.cis.ohio-state.edu (Karl Kleinpaste) (11/09/88)

If you want to suggest ongoing testing of small viruses in order to be
prepared for the big, dangerous ones, that's fine.  One might even
call it noble.  But such testing should be done on ISOLATED networks
in order to preclude the little jewels from going where they're not
intended.  I don't yet accept the idea that the latest worm was
released `by accident'; but even if so, it was grossly careless.

Once you have a physically isolated network, there is of course no
reason to make the test viruses and worms non-destructive, aside from
the time you'll cost yourself in restoring your system following each
local simulated attack.  But knowledge of that restoral cost would be
useful data to have, perhaps as a function of the virulence of the attack.

I would be very angry with anyone deciding to arrogate to themselves
the position of "official network virus tester," and thereby give
themselves permission to abuse my systems from Far Away.

kent@tfd.UUCP (Kent Hauser) (11/16/88)

In article <101070001@hpcvlx.HP.COM>, everett@hpcvlx.HP.COM (Everett Kaser) writes:
> I would propose that there is a place (in our computer-network-society) for
> persons attempting to write (non-destructive!) viruses.  There is no better
> means of protecting ourselves from destructive viruses than to be constantly
> testing ourselves with non-destructive ones. 

I propose that the ``worm'' be tested on the manufacturer's network.
For instance, the recent Internet virus should have been sent
directly to Sun and DEC (and maybe Berkeley) to *blow away* their 
networks.  I think that if we users were diligent in harassing
the manufacturers, maybe they would expend a little more effort in 
beefing up security!



-- 
Kent Hauser			UUCP: sun!sundc!tfd!kent
Twenty-First Designs		INET: kent%tfd.uucp@sundc.sun.com