[comp.org.eff.talk] "Computers at Risk"

faustus@gargoyle.uchicago.edu (Kurt Ackermann) (12/12/90)

This appeared in the December 6 New York Times, 
section C, p.1, col. 1 and continues on p. C4, col. 1
It is reprinted here without permission.

---------------------------------------------------------

ACADEMY OF SCIENCES URGES GREATER COMPUTER SECURITY
By John Markoff


	Warning that the United States' computers are 
not secure, the National Academy of Sciences urged the
nation yesterday to revamp computer security procedures,
institute new emergency response teams and create a
special non-Government organization to take charge of
computer security planning.
	The report's authors noted the increasing 
dependence on computers to control almost every aspect 
of America's business and Government activity as well
as critical systems like airline traffic control and 
vital hospital equipment.  They said the nation had 
been fortunate not to have suffered a catastrophic
failure.
	The report, "Computers at Risk: Safe Computing
in the Information Age," was prepared by 16 university
and Government computer security specialists at the 
request of the Pentagon's Defense Advanced Research
Projects Agency.  The authors of the report, prepared by the
academy's National Research Council, included David D.
Clark, a computer scientist at the Massachusetts
Institute of Technology; M. Douglas McIlroy, a computer
scientist at the American Telephone and Telegraph 
Company's Bell Laboratories; Peter G. Neumann, a 
computer security researcher at SRI International, and
Willis H. Ware, a computer privacy researcher at the 
Rand Corporation.

'Luck Will Soon Run Out'

	"As far as we can tell, there has been no 
successful systematic attempt to subvert any of our
critical computing systems," the report said.
"Unfortunately, there is reason to believe that our luck
will soon run out.  Thus far, we have relied on the absence of
malicious people who are both capable and motivated.  We
can no longer do so.  We must instead attempt to build 
computer systems that are secure and trustworthy."
	The panel began work in the fall of 1988 after a
nationwide computer was jammed by a Cornell University
student who released a rogue program that had been written 
as a prank.  Because of several flaws in the program, it
went out of control and stalled thousands of computers
connected to a network of business, Government and 
military computers known as the Internet.
	"While we haven't had the equivalent of a Chernobyl-
like disaster for computers, do we want to wait for 
something like that to happen?" asked Marjory S. Blumenthal,
the academy's staff director for the study.  "There are 
enough signs of potential vulnerability that we should act
now."
	The report cited threats to individual privacy, the
danger of increased trust placed in computers used in 
safety-critical applications like medical instruments and
air traffic control systems, corporate espionage and the 
increasing vulnerability of international computer networks
to political attacks.

Dual Responsibility

	At present, responsibility for developing computer
security systems and standards is split between the National
Security Agency, an intelligence gathering and military
computer security organization, and the National Institute 
of Standards and Technology, which has responsibility for
nonclassified computer security.  Ms. Blumenthal said, 
however, that the institute has not had sufficient funding
to develop secure computer designs.
	"The current system isn't working, and people know
it," said Marc Rotenberg, Washington director for the 
Computer Professionals for Social Responsibility, a national 
public interest organization.
	The academy panel proposed that a private, not-for-
profit organization-- to be called the Information Security
Foundation-- be set up with private financing to develop 
security standards and to research security technologies.
	The proposed group would be supported by computer
emergency response teams, known as CERT's.  The Defense
Advanced Research Projects Agency set up such an organization after the
Internet incident.  Based at the Software Engineering 
Institute in Pittsburgh, the unit is made up of a small group
of specialists.  It has a 24-hour hot line and can respond 
quickly to attacks on computer security.
	The National Security Agency has refused to permit
the export of advanced computer equipment that protects the
security of voice and data communications, but the report
called for a change in that policy.
	"If the United States does not allow vendors of
commercial systems to export security features," the report
said, "large multinational firms, as well as foreign 
customers will simply purchase equivalent systems from 
foreign manufacturers."

----------------------------------------------------------------

My questions:

1. Why are references to the US government written "Government"?
2. Who are the people on this panel, and are they a responsible
   and competent representation of the computer-using community?
3. Should the EFF get involved in the development of the
   soon-to-be-created Information Security Foundation?
4. Has anyone read the report itself, or (even better) know 
   someone who was involved in the preparation of the report?
5. Why is the "rogue program" that "stalled thousands of computers"
   referred to as the "Internet incident"?
6. Why is it the Defense Department that's doing all this stuff????
7. What is and has been the National Academy of Sciences' role in
   the development of computer networks?
8. What exactly is a "nationwide computer" that was jammed by the
   "rogue program" of the "Internet incident" fame?
9. What does journalist John Markoff know about computers?
10. What are your opinions????

-----------------------------------------------------------------

This has been a Public Service breach of copyright

-----------------------------------------------------------------

barmar@think.com (Barry Margolin) (12/12/90)

In article <faustus.660935323@gargoyle.uchicago.edu> faustus@gargoyle.uchicago.edu (Kurt Ackermann) writes:
>1. Why are references to the US government written "Government"?

By using it as a proper noun they imply the US Government, rather than any
government or the concept of government in general.

>5. Why is the "rogue program" that "stalled thousands of computers"
>   referred to as the "Internet incident"?

Well, it was an incident that occurred on the Internet.

>6. Why is it the Defense Department that's doing all this stuff????

The Defense Department has long been the leader in the development of
computer communications and security facilities.  One of the primary
justifications for the development of packet-switched networks was to
support communications that could be used during warfare (packet switching
supports rapid bypassing of portions of a network that have been bombed to
oblivion).  And advanced computer security was developed so that computers
could be used to hold military secrets.
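
(To make the bypassing point concrete, here is a toy sketch -- an
invented five-node topology, not a real routing protocol.  A
breadth-first search finds the two-hop route while every node is up,
then finds the longer detour once the node in the middle is knocked
out.)

    #include <stdio.h>
    #include <string.h>

    #define N 5  /* toy network: nodes 0..4 */

    /* Hop count from src to dst over the links whose endpoints are
       still up, or -1 if dst has become unreachable. */
    static int hops(int adj[N][N], int up[N], int src, int dst)
    {
        int dist[N], queue[N], head = 0, tail = 0;
        memset(dist, -1, sizeof dist);
        dist[src] = 0;
        queue[tail++] = src;
        while (head < tail) {
            int u = queue[head++];
            for (int v = 0; v < N; v++)
                if (adj[u][v] && up[v] && dist[v] < 0) {
                    dist[v] = dist[u] + 1;
                    queue[tail++] = v;
                }
        }
        return dist[dst];
    }

    int main(void)
    {
        /* Links: 0-1, 1-4 (short path) and 0-2, 2-3, 3-4 (detour). */
        int adj[N][N] = {
            {0,1,1,0,0},
            {1,0,0,0,1},
            {1,0,0,1,0},
            {0,0,1,0,1},
            {0,1,0,1,0},
        };
        int up[N] = {1,1,1,1,1};

        printf("all nodes up: %d hops\n", hops(adj, up, 0, 4));
        up[1] = 0;  /* node 1 "bombed"; traffic reroutes via 2 and 3 */
        printf("node 1 down:  %d hops\n", hops(adj, up, 0, 4));
        return 0;
    }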

>7. What is and has been the National Academy of Science's role in
>   the development of computer networks?

How is the National Science Foundation related to the NAS?  Over the last
few years the NSF has been sponsoring the national educational and research
network that replaced the Arpanet.

>8. What exactly is a "nationwide computer" that was jammed by the
>   "rogue program" of the "Internet incident" fame?

"Nationwide computer" is probably a confusion of "nationwide computer
network".  The Internet Worm certainly was a rogue program.

>10. What are your opinions????

The computer industry has been extremely lax about dealing with all the
issues of computer vulnerability.  However, they are not totally to blame,
because computer purchasers frequently don't ascribe much value to
protection from vulnerability, so the vendors are simply responding to the
market.  All parties need to be educated to the need, and the people making
purchasing decisions must be willing to spend a little extra to get less
vulnerable systems.  If there were money in computer security, more vendors
would invest more research resources into it, and things would get better.
--
Barry Margolin, Thinking Machines Corp.

barmar@think.com
{uunet,harvard}!think!barmar

wayner@cello.cs.cornell.edu (Peter Wayner) (12/12/90)

faustus@gargoyle.uchicago.edu (Kurt Ackermann) writes:

>This appeared in the December 6 New York Times, 
>section C, p.1, col. 1 and continues on p. C4, col. 1
>It is reprinted here without permission.

> ... article not repeated...

>-----------------------------------------------------------------

>This has been a Public Service breach of copyright

>-----------------------------------------------------------------


I suppose the Secret Service was doing a "Public Service" when they 
breached those portals?
Peter Wayner   Department of Computer Science Cornell Univ. Ithaca, NY 14850
EMail:wayner@cs.cornell.edu    Office: 607-255-9202 or 255-1008
Home: 116 Oak Ave, Ithaca, NY 14850  Phone: 607-277-6678

dsc3rjs@nmrdc1.nmrdc.nnmc.navy.mil (Bob Stratton) (12/15/90)

In article <faustus.660935323@gargoyle.uchicago.edu> faustus@gargoyle.uchicago.edu (Kurt Ackermann) writes:
>
>My questions:
>
>1. Why are references to the US government written "Government"?

Probably because they referred specifically to one particular gov't (?)

>2. Who are the people on this panel, and are they a responsible
>   and competent representation of the computer-using community?

The list seemed to me to be partially representative, if a bit
"stratospheric".. NAS/NRC primarily exists as a mechanism to bring
together groups of experts in a field, for purposes of info. exchange,
or the creation of guidelines for research in a given field, as I
understand it. I would certainly like to see the academic computing
community better represented (perhaps even a little more on the
"techie" (other peoples' term) or "hacker" (my term) level).

>3. Should the EFF get involved in the development of the soon-to
>   be-created Information Security Foundation?

Probably, unless EFF members question the need/propriety of its existence
(as I do in the form delineated above). 

>4. Has anyone read the report itself, or (even better) know 
>   someone who was involved in the preparation of the report?

I'm getting a copy. I used to do contract computer support there, and I
very much want to talk to some of the people involved.

>5. Why is the "rogue program" that "stalled thousands of computers"
>   referred to as the "Internet incident"?

Good question - probably because it affected machines on the Internet,
as opposed to PC's on LANs, etc.

>6. Why is it the Defense Department that's doing all this stuff????

DoD was instrumental in providing the backbone for research in the areas
of protocols, etc. that were the genesis of TCP/IP and the Internet as we
currently know it. 

>7. What is and has been the National Academy of Science's role in
>   the development of computer networks?

As an institution, not very significant as I see it. The NSF, on the other
hand, has had a big influence, both from a funding and research standpoint,
or is that redundant? :-)

>8. What exactly is a "nationwide computer" that was jammed by the
>   "rogue program" of the "Internet incident" fame?

Again, that refers to computers on the Internet, running TCP/IP / sendmail / 
fingerd, that were "infected" by the RTM worm.
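
(For readers wondering how a worm gets into fingerd at all: the 1988
worm exploited, among other things, a fixed-size input buffer filled
with gets(), which performs no bounds check.  The fragment below is a
minimal sketch of that unsafe pattern and the obvious repair -- it is
not the actual fingerd source.)

    #include <stdio.h>

    #define LINESIZE 512

    int main(void)
    {
        char line[LINESIZE];

        /* Unsafe: gets() copies until end of line, however long the
           input is, so a request longer than LINESIZE overruns the
           buffer -- the kind of hole the worm used. */
        /* gets(line); */

        /* Safer: fgets() stops after LINESIZE - 1 characters. */
        if (fgets(line, sizeof line, stdin) == NULL)
            return 1;

        printf("request: %s", line);
        return 0;
    }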

>9. What does journalist John Markoff know about computers?

Good question. Couldn't tell 'ya.

>10. What are your opinions????

I'm basically holding mine until I read the report in full, and talk to 
its progenitors. I will say that I tend to distrust _any_ institution that
claims to be the "be all and end all" of a given aspect of information
technology, be it "standards bodies", "industry consortia", John McAfee's
Computer Virus Industry Association, or anyone else. I especially tend to
worry when people start deciding on access controls for the rest of the 
community. I have witnessed all too often the problems when some  
managerial type decides that a development environment "needs more
security". [note: this is not a slur on management types in general, just
those who are technically incompetent -- RJS]


-- 
Bob Stratton		| dsc3rjs@nmdsc{20 | 10}.nmdsc.nnmc.navy.mil [Internet]
Stratton Systems Design	| dsc3rjs@vmnmdsc.BITNET [BITNET only, please!]
			| +1 703 823 MIND [PSTNet]

SPRAGGEJ@QUCDN.QueensU.CA (John G. Spragge) (12/15/90)

In article <1990Dec11.213718.13211@Think.COM>, barmar@think.com (Barry Margolin)
says:

>The computer industry has been extremely lax about dealing with all the
>issues of computer vulnerability.  However, they are not totally to blame,
>because computer purchasers frequently don't ascribe much value to
>protection from vulnerability, so the vendors are simply responding to the
>market.

Perhaps purchasers are more concerned about protection from mistakes
than about protection from malice. I have certainly restored my share
of files through the years, and I have never yet seen a file wiped
out by a virus or a trojan. The majority of failures were due
(depending on your perspective) either to user error, or to systems
that were too hard to learn or understand. If a buyer asked me
whether they should buy a hard-to-understand system with good
security, or a user-friendly system with little security, I would
always recommend the latter.

Losing data to someone else's malice is undoubtedly more
traumatic: it's clearly a violation. However, on the scale
of dangers, the person who is clearly most dangerous to any
data is the primary user. Second in line is the person who
wrote the editor/data processing system being used. Between
those two, innocent mistakes account for the vast majority
of failures I have ever seen or (shudder) lived through.

>        All parties need to be educated to the need, and the people making
>purchasing decisions must be willing to spend a little extra to get less
>vulnerable systems.  If there were money in computer security, more vendors
>would invest more research resources into it, and things would get better.

The use of the word "educated" assumes that there is a consensus on
priorities. I don't share Mr. Margolin's priorities. From the point
of view of efficiency only, I believe the top priority is to
establish efficient, error-free systems (not, unfortunately, the
case now). It is equally important to make them user friendly, so
that user errors will not result in data loss. Then, and only
then, comes security.
>--
>Barry Margolin, Thinking Machines Corp.
>
>barmar@think.com
>{uunet,harvard}!think!barmar

disclaimer: Queen's University merely supplies me with computer services, and
            they are responsible for neither my opinions nor my ignorance.

John G. Spragge

barmar@think.com (Barry Margolin) (12/16/90)

In article <90349.014530SPRAGGEJ@QUCDN.QueensU.CA> SPRAGGEJ@QUCDN.QueensU.CA (John G. Spragge) writes:
>Perhaps purchasers are more concerned about protection from mistakes
>than they are from malice. I have certainly restored my share of
>files through the years, and I have never yet seen a file wiped
>out by a virus or a trojan. The majority of failures were due
>(depending on your perspective) either to user error, or to systems
>that were too hard to learn or understand. If a buyer asked me
>whether they should buy a hard to understand system with good
>security, or a user friendly system with little security, I would
>always recommend the latter.

The recommendation wouldn't depend at all on the planned use of the
computer?  I admit that there are environments where security is not
paramount, but for applications such as banking and the Defense
Department, security can be very important.

As a former developer of a secure system (Multics), I can tell you that the
design processes that go into developing a secure system naturally result
in a more correct system.  Thinking about security forces developers to
consider all the ways that different pieces of the system may interact.
They must pay more attention to details, because that's where most security
flaws are.  Better design and development methodologies are used.  Good
system structuring is necessary so that it is possible to understand the
system and believe that it is secure.  Once you are in this mindset, it is
normal to think this way for all the software on the system, not just the
security-related parts.  For instance, in Multics development, all changes
to the system are subject to peer review of the proposed change, and the
code must be reviewed and approved by at least one other developer familiar
with that area of the system.

The problem with many "secure" systems today is that the security was
grafted on as an afterthought.  Thus, the systems were not designed in the
careful ways that security promotes.  Adding security to a
poorly-structured system is like patching leaks in a roof made of inferior
raw materials: you can be pretty sure that new leaks will form.

The same design processes that result in more correct systems can also
produce easier to use systems.  For instance, the peer design review was
good at preventing user interface inconsistencies from being implemented.
Good system structuring generally means encapsulating common facilities
into libraries, and when this is done for user interface routines you get a
more uniform environment (example: on Multics, the same command-line
parsing subroutine is used by the shell and by most commands with internal
command lines, such as the mail and conferencing user interfaces).
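
(A sketch of that structuring idea, with invented names -- this is C,
not Multics PL/I, and parse_flag is hypothetical: the point is just
that two commands call the one shared parsing routine, so flags behave
identically everywhere.)

    #include <stdio.h>
    #include <string.h>

    /* The one parsing routine every command shares: returns 1 if the
       given flag appears anywhere on the command line. */
    static int parse_flag(int argc, char **argv, const char *flag)
    {
        for (int i = 1; i < argc; i++)
            if (strcmp(argv[i], flag) == 0)
                return 1;
        return 0;
    }

    /* Two "commands"; both accept -brief the same way because both
       defer to the shared routine. */
    static void mail_cmd(int argc, char **argv)
    {
        printf("mail: brief=%d\n", parse_flag(argc, argv, "-brief"));
    }

    static void confer_cmd(int argc, char **argv)
    {
        printf("confer: brief=%d\n", parse_flag(argc, argv, "-brief"));
    }

    int main(int argc, char **argv)
    {
        mail_cmd(argc, argv);
        confer_cmd(argc, argv);
        return 0;
    }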

Yes, Multics is harder to use than a Macintosh, but that's true of most
systems.  It's no harder to use than MS-DOS, Unix, or VMS.  And if your
personal computer hasn't been hit by a virus, you've just been lucky so
far; I think part of the problem is an "it won't happen to me" attitude by
many people.
--
Barry Margolin, Thinking Machines Corp.

barmar@think.com
{uunet,harvard}!think!barmar

SPRAGGEJ@QUCDN.QueensU.CA (John G. Spragge) (12/16/90)

In article <1990Dec15.174416.6338@Think.COM>, barmar@think.com (Barry Margolin)
says:

>References: <faustus.660935323@gargoyle.uchicago.edu>
><1990Dec11.213718.13211@Think.COM> <90349.014530SPRAGGEJ@QUCDN.QueensU.CA>

>The recommendation wouldn't depend at all on the planned use of the
>computer?  I admit that there are environments where security is not
>tantamount, but for applications such as banking and the Defense
>Department, security can be very important.

I'm not interested in providing software for military purposes. As
for financial transactions: yes, I agree they ought to be secure.
However, in banking, as in every other field, it is equally
important that the system work to avoid errors, which can have
consequences that are just as severe, if not more so, than
security breaches. Discussing this with other business people,
I heard of one large commercial organisation that made an error
in the supplier's favour in 20% of their transactions. If that
had been caused by sabotage, it would have been reported as a nice
haul.

>They must pay more attention to details, because that's where most security
>flaws are.  Better design and development methodologies are used.  Good
>system structuring is necessary so that it is possible to understand the
>system and believe that it is secure. Once you are in this mindset, it is
>normal to think this way for all the software on the system, not just the
>security-related parts.  For instance, in Multics development, all changes
>to the system are subject to peer review of the proposed change, and the
>code must be reviewed and approved by at least one other developer familiar
>with that area of the system.

This is a somewhat circular argument. The desire for a secure system
leads to the development of "better" design techniques, which are by
definition "better" because they allow you to verify system security.
I understand what you are saying: any determined pursuit of a goal
will focus your efforts, and in software design that usually produces
clearer, "better" thinking. But I don't think you have established that
the quest for security will lead to better results than a quest
for "user friendliness", or simply error free code.

>The same design processes that result in more correct systems can also
>produce easier to use systems.  For instance, the peer design review was
>good at preventing user interface inconsistencies from being implemented.

Peer review is only useful for determining user friendliness if the
"peers" in question are also the end-users of the systems. If this
is not the case, you also need "user review". The internals of the
system should be designed as consistently as possible, but
with human interactions, logic doesn't work the same way. In some
cases, what is consistent to the user may seem very inconsistent to
the software designers. In such a case, if the user's view does
not agree with the programmer's, I would argue the program is not
"user friendly", and likely to be part of a disaster down the line.

>                                                            And if your
>personal computer hasn't been hit by a virus, you've just been lucky so
>far; I think part of the problem is an "it won't happen to me" attitude by
>many people.

I haven't been hit by a virus (that I know of) because I religiously
crank up the virus scanner for every disk that goes into or out of
my computer that could be carrying viruses. I have still had to deal
with massive data losses caused by "user error" (read: systems that
weren't easy enough to understand or use). And before you tell me
that I'm blaming the system for my mistakes, they were someone else's
mistakes: I just had to clean them up. And I see no reason we should
expect to "educate" naive users to accept things that we as a programmer
community think are "user friendly enough". I do take security seriously,
in its place. I merely insist that it ought not to be top priority
for 90%+ of applications.

>--
>Barry Margolin, Thinking Machines Corp.
>
>barmar@think.com
>{uunet,harvard}!think!barmar

disclaimer: Queen's University merely supplies me with computer services, and
            they are responsible for neither my opinions nor my ignorance.

John G. Spragge