[alt.society.cu-digest] CU Digest #3.00

TK0JUT2%NIU.BITNET@UICVM.uic.edu (01/06/91)

  ****************************************************************************
                  >C O M P U T E R   U N D E R G R O U N D<
                                >D I G E S T<
              ***  Volume 3, Issue #3.00 (January 6, 1991)   **
  ****************************************************************************

MODERATORS:   Jim Thomas / Gordon Meyer  (TK0JUT2@NIU.bitnet)
ARCHIVISTS:   Bob Krause / Alex Smith / Bob Kusumoto
BYTEMASTER:  Brendan Kehoe

USENET readers can currently receive CuD as alt.society.cu-digest.
Anonymous ftp sites: (1) ftp.cs.widener.edu (2) cudarch@chsun1.uchicago.edu
E-mail server: archive-server@chsun1.uchicago.edu.

COMPUTER UNDERGROUND DIGEST is an open forum dedicated to sharing
information among computerists and to the presentation and debate of
diverse views.  CuD material may be reprinted as long as the source is
cited.  Some authors, however, do copyright their material, and those
authors should be contacted for reprint permission.
It is assumed that non-personal mail to the moderators may be reprinted
unless otherwise specified. Readers are encouraged to submit reasoned
articles relating to the Computer Underground.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
DISCLAIMER: The views represented herein do not necessarily represent the
            views of the moderators. Contributors assume all responsibility
            for assuring that articles submitted do not violate copyright
            protections.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

CONTENTS:
File 1: Moderators' Corner
File 2: From the Mailbag
File 3: Gender-Neutral Language
File 4: Sexism and the CU
File 5: Security on the Net
File 6: The CU in the News

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

----------------------------------------------------------------------

********************************************************************
***  CuD #3.00: File 1 of 6: Moderators' Corner                  ***
********************************************************************

From: Moderators
Subject: Moderators' Corner
Date: January 6, 1991

++++++++++
In this file:
1. VOLUME 3 BEGINS WITH THIS ISSUE
2. SEXISM AND CuD
++++++++++

+++++++++++
Volume 3 Starts Here
+++++++++++

Volume 1, with issues #1.00 thru 1.29 and Volume 2, issues 2.00 thru 2.19,
complete the first year of CuD. With the new year we start a new volume,
and it will remain Volume #3 thru 1991.  We'll spare readers self-indulgent
reflections on the first year, but we're amazed that what began as a
temporary outlet with Pat Townson's support and help back in March seems to
have become at least semi-permanent. Following Craig Neidorf's victory, we
thought there would be little else to write about, but the articles,
comments, and responses keep coming, so we'll keep publishing as long as
they do. The ftp sites have expanded and contain a variety of papers and
documents related to the CU. We *STRONGLY ENCOURAGE* researchers, attorneys
and law students to send quality papers over to us for the archives. We
also thank all those who send in news blurbs--keep them coming.

+++++++++++
CuD and Sexism
++++++++++++

In a file below, the writer takes the moderators to task for not taking a
stand on sexist language. We agree that writing should be as gender-free as
possible, but we don't change articles (except for formatting, spelling,
and deleting long sigs).  Authors have their own style, and while we object
to sexist language (or any other action that reinforces the cultural power
of one group over another), we cannot edit it out. An author's style is a
valid index of cultural influences, and therefore it remains an open
archive to be decoded as a window into the world of, in this case, the CU. We
*STRONGLY* encourage articles on the isms (ageism, sexism, racism) and the
CU.

********************************************************************
                           >> END OF THIS FILE <<
***************************************************************************

------------------------------

From: Various
Subject: From the Mailbag
Date: January 6, 1991

********************************************************************
***  CuD #3.00: File 2 of 6: From the Mailbag                    ***
********************************************************************

From: wayner@SVAX.CS.CORNELL.EDU(Peter Wayner)
Subject: Re: Cu Digest, #2.19
Date: Thu, 3 Jan 91 14:27:26 -0500

This is in reply to John Debert's note in CuDigest #2.19:

He writes:
"Now, suppose that someone has used this method to encrypt files on his/her
system and then suppose that Big Brother comes waltzing in with a seizure
warrant, taking the system along with all the files but does not take the
code keys with them. Knowing Big Brother, he will really be determined to
find evidence of a crime and is not necessarily beneath (or above) fudging
just a bit to get that evidence. What's to keep him from fabricating such
evidence by creating code keys that produce precisely the results that they
want-evidence of a crime? Would it not be a relatively simple procedure to
create false evidence by creating a new key using the encrypted files and a
plaintext file that says what they want it to? Using that new key, they
could, in court, decrypt the files and produce the desired result, however
false it may be. How can one defend oneself against such a thing? By
producing the original keys? Whom do you think a court would believe in
such a case?

One should have little trouble seeing the risks posed by encryption."

This is really unlikely, because in practice most people only use one-time
pads for communication. They are not in any way practical for on-site
encryption. Imagine you have 40 megabytes of data. If you want to encrypt
it with a one-time pad, you need 40 megabytes of key. If you did this,
it would be very secure because there exists a perfectly plausible 40 Meg key
for each possible 40 meg message.

But, if you were going to keep the 40 megs of encrypted data handy, you
would need to keep the 40 megs of key just as handy. When the government
came to call, they would get the key as well. That is why it is only
practical to use systems like DES and easy to remember, relatively short
keys to do the encryption. That way there is nothing to seize but your
brain.
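
[Editorial illustration, not part of Wayner's letter: a minimal sketch, in
Python, of the point above, assuming a simple XOR one-time pad.  The message
and pad below are illustrative only.]

import os

def otp_encrypt(plaintext, key):
    # XOR one-time pad: the key must be exactly as long as the message.
    assert len(key) == len(plaintext), "pad must be as long as the message"
    return bytes(p ^ k for p, k in zip(plaintext, key))

def otp_decrypt(ciphertext, key):
    # Decryption is the same XOR operation applied again.
    return otp_encrypt(ciphertext, key)

message = b"forty megabytes of files, in principle"
pad = os.urandom(len(message))        # key material the same size as the data
ciphertext = otp_encrypt(message, pad)
assert otp_decrypt(ciphertext, pad) == message

# Whoever seizes the disk holding 'ciphertext' will almost certainly find
# 'pad' sitting right next to it -- which is why short, memorized keys fed
# to a cipher like DES are what people actually use for on-site encryption.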

---Peter Wayner
Dept. of Computer Science, Cornell Univ.
(wayner@cs.cornell.edu)

++++++++++++++++++++++++++

From: CuD Dump Account <works!cud@UUNET.UU.NET>
Subject:   BBSs as Business Phones?
Date: Thu, 03 Jan 91 15:57:49 EDT

Ok this is just a quick question.

How can it be legal to make BBS operators shell out extra money for a hobby?
Answering machines aren't something people have to pay extra for, and in some
cases that's what BBSs are used for. If it's a public BBS, it receives no true
income from its users unless they pay for standard, billable time (i.e., a
commercial BBS). What gives them the right to charge us now? They don't force
you to pay for special business-class lines or fiber-optic lines to call long
distance, do they? No, it's by choice. Most SysOps buy the cheapest line
available, which is usually local only, no dial-out, etc. SysOps in the long
run absorb most, if not all, of the costs of running a BBS -- that means
power, servicing, and the phone. The phone line at minimum is going to cost
at least a hundred or so per year; then there's power -- it's absurd. In my
case, I run a BBS to share information, and I allow everyone on for free.
I've seen the old FCC proposals to have people using modems pay more, but I
don't rightly see why. If I am not mistaken, this is bordering on their greed
to make more money off the growing modem populace.

Do they have a right to charge us? Are they providing any special service
because we have a modem on the line instead of an answering machine, FAX, or
regular phone? We are private citizens; it should be up to us how we use the
phones. The TelCo is still a monopoly.

There are a lot of rumours about this type of thing, only I've never seen
it actually put into action.

+++++++++++++++++++++++++

From: Paul Cook <0003288544@MCIMAIL.COM>
Subject: Response to "Hackers as a software development tool"
Date: Fri, 4 Jan 91 06:44 GMT

{Andy Jacobson <IZZYAS1@UCLAMVS.BITNET> writes:}
>
>I received one of those packs of postcards you get with comp.  subscription
>magazines (Communications Week) that had an unbelievable claim in one of
>the ads. I quote from the advertisement, but I in no way promote,
>recommend, or endorse this.
>
>"GET DEFENSIVE!
>YOU CAN'T SEE THEM BUT YOU KNOW THEY'RE THERE.
>Hackers pose an invisible but serious threat to your information system.
>Let LeeMah DataCom protect your data with the only data security system
>proven impenetrable by over 10,000 hackers in LeeMah Hacker Challenges I
>and II. For more information on how to secure your dial-up networks send
>this card or call, today!" (Phone number and address deleted.)
>
>So it seems they're claiming that 10,000 hackers (assuming there are that
>many!) have hacked their system and failed. Somehow I doubt it. Maybe they
>got 10,000 attempts by a team of dedicated hackers, (perhaps employees?)
>but has anyone out there heard of the LeeMah Hacker Challenges I and II?

Yes, Lee Mah is for real.  They make some nice computer security
equipment to stop folks from trying to gain access to your dialup modems.

The "Hacker Challenge" is for real too.  They publicized it for a long
time, and I recall reading about it in PC Week, Byte, and possibly
InfoWorld.  I don't know how accurate the "10,000 hackers" figure is (maybe
it was 10,000 call attempts?) but they ran a couple of contests where they gave a
phone number of one of their devices, and offered some kind of a prize to
anyone who could figure out how to get in.  I have seen the Lee Mah
catalog, and I don't recall how they provide security, but I think some of
their gear uses dialback modems that call pre-programmed user numbers when
the right code is entered.

++++++++++++++++++++++

From: stanley@PHOENIX.COM(John Stanley)
Subject: Re: a.k.a. freedom of expression
Date: Fri, 04 Jan 91 23:45:31 EST

In CuD 2.19, balkan!dogface!bei@CS.UTEXAS.EDU(Bob Izenberg) writes:

> I read this in issue 2.16 of the Computer Underground Digest:
>
>          [ quoted text follows ]
>
>           ADAM E. GRANT, a/k/a The             :
>           Urvile, and a/k/a Necron 99,         :
>           FRANKLIN E. DARDEN, JR., a/k/a       :
>           The Leftist, and                     :
>           ROBERT J. RIGGS, a/k/a               :
>           The Prophet                          :
>           [ quoted text ends ]
>
> The assumption here, that an alias employed in computer communications is
> the same as an alias used to avoid identification or prosecution, doesn't
> reflect an awareness of the context within which such communications
> exist.

The only reason "The Prophet" was used was to avoid identification.
But, that doesn't really matter. The reason it was included in the
Government doohicky was to tie the one legal name to the alternate names
chosen by the defendant and used by him as his sole identification at
specific times.

> The very nature of some computer operating systems demands some
> form of alias from their users. Management policy also affects how you
> can identify yourself to a computer, and to anyone who interacts with you
> through that computer.

How you identify yourself in communications is entirely up to you.  You
do not need to use your computer User ID as your sole identity. Note that
the From: line of your original post identified you, as does mine.  If I
add a .sig that identifies me as "Draken, Lord of Trysdil", and remove the
From: comment name, then you know me as Draken, and bingo, I have an a.k.a.
Am I doing it to commit a crime? Probably not. It doesn't really matter.

> If we strip the implication from those three letters
> that the party of the leftmost part is calling themselves the party of the
> rightmost part to avoid getting nabbed with the goods, what's left?

You are left with the fact that they are also known as ..., which is
just what the a.k.a. stands for. It does NOT stand for Alias for Kriminal
Activity, as you seem to think it does. The "implication" you speak of is
an incorrect inference on your part. Guilty conscience?

> In using a computer communications medium, particularly an informal one
> like a BBS, the name you choose can set the tone for the aspect of your
> personality that you're going to present (or exaggerate.)

You mean, like, the name you chose is how you will be known? Like, you
will be known to some as "Bob Izenberg", but on the BBS you will be also
known as "Krupkin the Gatherer"? Like a.k.a.?

> Are radio
> announcers using their "air names" to avoid the law?  How about people with
> CB handles?  Movie actors and crew members?  Fashion designers?  Society
> contains enough instances of people who, for creative reasons, choose
> another name by which they're known to the public.

And if any of them go to court, they will have an a.k.a., too. There will
be their legal name, followed by the a.k.a. There is no implication of
criminal activity from just having an a/k/a, just the indication that the
prosecution wants to make sure the defendants are identified. "Him.  That
one, right there. His legal name is X, but he is also known as Y and Z. All
the evidence that says that Y did something is referring to him, X, because
the witness knows him by that."

> Whenever somebody uses a.k.a., correct them!

Ok, consider this a correction, at your own demand.

+++++++++++++++++++++++

From: 6600mld@UCSBUXA.BITNET
Subject: Response to Encryption dangers in seizures
Date: Sat, 5 Jan 91 14:19:07 PST

>Subject: Encryption dangers in Seizures
>Date: Sat, 29 Dec 90 11:20 PST

[misc background on encryption and its use to thwart Big Brother deleted.]

>Now, suppose that someone has used this method to encrypt files on his/her
>system and then suppose that Big Brother comes waltzing in with a seizure
>warrant, taking the system along with all the files but does not take the
>code keys with them. Knowing Big Brother, he will really be determined to
>find evidence of a crime and is not necessarily beneath (or above) fudging
>just a bit to get that evidence. What's to keep him from fabricating such
>evidence by creating code keys that produce precisely the results that they
>want-evidence of a crime? Would it not be a relatively simple procedure to
>create false evidence by creating a new key using the encrypted files and a
>plaintext file that says what they want it to? Using that new key, they
>could, in court, decrypt the files and produce the desired result, however
>false it may be. How can one defend oneself against such a thing? By
>producing the original keys? Whom do you think a court would believe in
>such a case?
>
>One should have little trouble seeing the risks posed by encryption.

I think it unlikely that, if the Feds wanted to frame you or fabricate
evidence, they would bother to use the encrypted data found at your
site.  Instead, I think, they would fabricate the whole wad -- plaintext,
key, and ciphertext.  For this reason, it is not only one-time key
encryption that is threatened, but iterative algorithms as well.
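
[Editorial illustration, not part of this post: a minimal sketch of the
forgery Debert describes, assuming an XOR one-time pad.  Anyone holding only
the ciphertext can manufacture a "key" that decrypts it to whatever plaintext
they like; the strings below are illustrative only.]

import os

def xor_bytes(a, b):
    # Byte-for-byte XOR of two equal-length byte strings.
    return bytes(x ^ y for x, y in zip(a, b))

real_plaintext = b"nothing of interest here"
real_key       = os.urandom(len(real_plaintext))
ciphertext     = xor_bytes(real_plaintext, real_key)   # what gets seized

# The seizing party never needs the real key: pick the "evidence" wanted
# (same length as the original plaintext)...
desired    = b"incriminating admissions"
forged_key = xor_bytes(ciphertext, desired)

# ...and the forged key "decrypts" the seized ciphertext to exactly that text.
assert xor_bytes(ciphertext, forged_key) == desired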

So, if I have data encrypted, and the feds are going to "fix" it, why is
this any more dangerous than having NO DATA?  If they want to frame me,
they're going to (try), regardless of whether they found encrypted data or
not!  Thus, I see encryption as preventing the feds from really KNOWING
what you do and do not have.  This is very valuable.  I think that even in
our mostly corrupt government it would be difficult to fabricate
evidence to the tune of possession of AT&T source code.

Similar tactics can be applied JUST AS EASILY to physical crimes.  The
crime lab finds a dead guy with a .44 slug in him.  The suspect owns a .44,
but not the one used in the shooting.  What is to prevent the (now seized)
.44 of the suspect from being fired and the slug swapped for the slug
discovered in the body?  This is trivial to accomplish, assuming the people
involved are sufficiently crooked.

Now, I'm not saying that the Feds don't fabricate evidence.  But I do not
think that encrypting one's data makes one a more vulnerable target to such
injustice.

>jd / onymouse@netcom.UUCP     netcom!onymouse@apple.com

********************************************************************
                           >> END OF THIS FILE <<
***************************************************************************

------------------------------

From: "Brenda J. Allen (303) 492-0273" <ALLEN_B@CUBLDR.COLORADO.EDU>
Subject: Gender-Neutral Language
Date: Wed, 2 Jan 1991 14:03 MST

********************************************************************
***  CuD #3.00: File 3 of 6: Gender-Neutral Language             ***
********************************************************************

The Dark Adept's article (CuD #2.10, File 9) on In-House Security Problems
was informative and insightful.  However, I was appalled by the author's
consistent and flagrant use of masculine pronouns and sex-linked nouns to
refer to persons (hackers, system operators, employees) who could be either
male or female.  Although hackers and system operators traditionally have
been men, women also are assuming those roles.  Moreover, employees who use
computers certainly comprise both genders.  Therefore, references to users
as males (e.g., "employees often choose passwords such as their wife's
maiden name") are particularly inappropriate and sexist.

I am not accusing the author of intentional discrimination against females.
Rather, I believe that he or she may not be aware of the implications and
ramifications of gender-biased language.  Language has the power to shape
thought, reinforce biases, and perpetuate stereotypes.  Consequently,
omitting mention of females in a discussion about computer-related
activities may help to sustain the impression of male domination of that
area of our lives.  Moreover, such oversights may send the covert message
that some persons wish to maintain such an image, to discount contributions
by women, and/or to discourage female participation.

Therefore, I encourage everyone to become more thoughtful of their choice
of words and more sensitive to issues regarding gender.  This seems
particularly crucial in the contemporary forum of electronic discourse.  As
we pave new paths, we must assume responsibility for changing old language
habits.  Also, we should strive to avoid sending implicit and explicit
messages regarding females and their roles in computer science and related
fields.

On a positive note, I've observed such awareness in other CuD files.  For
instance, job announcements usually cite both genders, and Alan Wexelblat
recently qualified a reference to philosophers as males by noting that
women had been systematically excluded from that area of study.

Guidelines for avoiding the use of male-only pronouns include the
following: reword sentences to eliminate unnecessary gender pronouns;
alternate the use of female and male pronouns and nouns; recase sentences
into plural forms (e.g., "they" or "we"); use neutral terms like "one,"
"you," "an individual," etc. instead of "he" or "she."  Another way to
avoid subtle sexism is to substitute asexual words and phrases for
man-words (e.g., "spouse's name" instead of "wife's maiden name").

Although applying these and other guidelines may be challenging and
somewhat time-consuming, it is imperative that we make the effort to
acknowledge the changing shape of our society as women continue to occupy
positions previously reserved for men.

********************************************************************
                           >> END OF THIS FILE <<
***************************************************************************

------------------------------

From: Liz E. Borden
Subject: Sexism and the CU
Date: Mon, 31 Dec 90 12:52 PST

********************************************************************
***  CuD #3.00: File 4 of 6: Sexism and the CU                   ***
********************************************************************

Why, you ask, do I think the CU is sexist?  Carol Gilligan wrote that women
speak in "a different voice" from men, one grounded more in nurturing,
dialogue, negotiation and control-free language.  The voice of the computer
world reflects a male voice and recreates the subtle patriarchy of the
broader society through the so-called neutrality of "objective" science and
the ways of speaking and behaving that, when translated into the
two-dimensional world of electronic communications, tend to silence women.

Computer underground Digest, like the CU in general, is a male bastion.
Sexist language, male metaphors, and if I'm counting correctly, not a
single self-announced female contributor (although it is possible that some
of the pseudonyms and anonymous writers were women).  In fairness, I judge
that the editors of CuD attempt to be sensitive to the concerns of
feminists, and have noticed that articles under their name do not contain
sexist language and tend toward what's been called "androgynous discourse."
But, they have not used their position to translate concerns for
social justice into practice by removing sexist language (or even posting a
policy preference), by encouraging women, or by soliciting articles on
minorities, women, and other groups that are invisible and silent.

Let's look at just a few areas where cybersexism creeps in.  First, The CU
is made up mostly of males. I'm told by friends, and the facts are
consistent with those given to me by one CuD moderator, that at a maximum,
less than five percent of pirates are female, and probably less than one
percent are phreaks or hackers. This skewed participation transports the
male culture of values, language, concerns, and actions, into a new world
and creates models that women must conform to or be excluded from full
membership. Like the Europeans, CUites move into a new territory and stake
out their cultural claim, committing a form of cultural genocide against
those with different cultural backgrounds.  Isn't it ironic that in a new
world where "a million flowers bloom" and a variety of subcultures emerge,
they are for all practical purposes male?

Second, BBSs, especially those catering to adolescents and college
students, are frightening in their misogyny. I have commonly seen, in
general posts on large boards in college towns, discussion of women in the
basest of terms (but never comparable discussions of men), use of such
terms as broads, bitches, cunts, and others as synonymous with the term
"woman" in general conversation, and generalized hostile and angry
responses against women as a class. These are not isolated incidents, but even if we
were to concede that they are not typical of all users on a board, such
language use is rarely challenged and the issues the language implies are
not addressed.

Third, sexism is rampant on the nets. The alt.sex groups (bondage, gifs,
what-have-you) appeal to male fantasies of a type that degrades women. No,
I don't believe in censorship, but I do believe we can raise the gender
implications of these news groups just as we would if a controversial
speaker came to a campus.  Most posts that refer to a generic category tend
to use male specific pronouns that presume masculinity (the generic "he")
or terms such as "policeman" or "chairman" instead of "chair" or "police
officer."

At the two universities I attended, both with excellent computer science
departments, women comprised about half of the undergraduate majors. This
shifted dramatically in grad school, and the male professors were generally
well-meaning, but most were not sensitive to the difficulties of women in a
male-dominated career. Yes, of course it's possible for women to succeed
and be taken seriously in the computer world, to advance, to earn high
salaries. But this isn't the point. The peripheral treatment, in which we
are still treated like second-class citizens, persists.  The jokes, the
language, the subtle behaviors that remind us that we are women first and
professionals second, and all the other problems of sexism are carried over
into the computer world.  Why don't we think about and discuss some of
this, and why isn't CuD taking the lead?!

********************************************************************
                           >> END OF THIS FILE <<
***************************************************************************

------------------------------

From: Name withheld
Subject: Security on the Net
Date: Sun, 23 Dec 90 17:04:49 -0500

********************************************************************
***  CuD #3.00: File 5 of 6: Security on the Net                 ***
********************************************************************

COPS is a unix security package that runs through a checklist of sorts
looking for common flaws in system security.

I polled a security mailing list and got about 40 responses to a selected
number of questions dealing with security; the results might be a useful
indication of how the net (at least its security-minded members) views
security.
The answers to these questions shaped some of the philosophies of COPS and
might be indicative of the type of security tools to be developed in the
future.  My questions start with a number and a ")".

   1)  What kinds of problems should a software security system (SSS)
   such as COPS check for? (Mention specific examples, if you can.)

Just about everyone agreed that the more things checked, the better.  Some
specific wants were for items I didn't mention, listed more or less in order
of number of requests:

Some kind of _secure_ checksum method for checking up on binary files.

Checking binaries for known security problems - sendmail, fingerd, ftpd,
etc.

Checking the validity of the _format_ of key files rather than merely
checking if they are writable.

Checking for potential trojan horses; files such as "ls" in a user's
account.

Finding things hidden under mount points.

Keeping track of accounts in a file separate from /etc/passwd and running
periodic checks to see if any accounts have been added by an unauthorized
user (a small sketch of this kind of check appears after this list).

Report unusual system activity, such as burning lots of CPU time.

Record unsuccessful login attempts and su's to root, when and by whom if
possible.
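
[Editorial illustration, not part of the survey responses: a minimal Python
sketch of the kinds of checks asked for above.  It is not COPS itself; the
file paths, binary list, and baseline location are illustrative assumptions.]

import hashlib, os, stat

KEY_FILES = ["/etc/passwd", "/etc/group", "/etc/inetd.conf"]   # example paths
BINARIES  = ["/bin/ls", "/bin/login", "/usr/lib/sendmail"]     # example paths
BASELINE  = "/var/adm/passwd.baseline"  # saved copy of /etc/passwd (assumed)

def world_writable(path):
    # Flag key files that any user on the system could rewrite.
    try:
        return bool(os.stat(path).st_mode & stat.S_IWOTH)
    except OSError:
        return False

def checksum(path):
    # A checksum for spotting binaries that have been replaced or modified.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def new_accounts(passwd="/etc/passwd", baseline=BASELINE):
    # Accounts present in /etc/passwd but missing from the saved baseline.
    def users(p):
        with open(p) as f:
            return set(line.split(":")[0] for line in f if ":" in line)
    return users(passwd) - users(baseline)

if __name__ == "__main__":
    for path in KEY_FILES:
        if world_writable(path):
            print("WARNING: %s is world-writable" % path)
    for path in BINARIES:
        if os.path.exists(path):
            print("checksum %s  %s" % (checksum(path), path))
    try:
        for user in sorted(new_accounts()):
            print("WARNING: account not in baseline: %s" % user)
    except OSError:
        print("note: no baseline found at %s" % BASELINE)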

   2)  Are there any security problems too sensitive to be checked
   by a SSS?  That is, what things should *not* be built into a SSS?

Boy, this was a landslide.  Over 90% said NO, and not only no, but
basically "Hell No".  The only concerns I got were against password
cracking and problems that could not be easily fixed.  There was also a
small amount of concern about limiting access to root, but most realized
that no matter what, the benefits would outweigh any losses if the programs
were put out.

   3) What should the primary goal of a SSS be -- discovering as many
   security holes as possible in a given system (including bugs or
   design flaws that may not be easily fixed -- especially without
   source code), or merely uncovering correctable errors (due to
   ignorance, carelessness, etc)?

Another landslide.  Of all the responses, only one person objected to
finding all holes, although a few did say that finding the fixable holes
was top priority.

One view:

My use for an SSS is as a system monitor, not as a diagnostic tool.  I
suppose the diagnostic version also has its uses, but writing and
distributing such a program is asking for trouble.  I don't see anything
wrong with writing it and distributing only the binaries.

   4)  Do you feel that SSS are a security threat themselves?

Some dissent begins to show.... It was almost even here, with the no's
beating out the yes's by a single vote.  However, 2/3 of the yes votes
qualified their answer by stating something like "a tool can be misused"
and whatnot.  Here are some typical responses:

Of course.  They point the way for bad guys.  Such is life.  They are a
tool.  They have the potential for anything.  The security threat lies in
how they are used....

No, as long as they don't breed complacency. Just running an SSS each
night should not make you think your systems are secure.

Fire is also dangerous but VERY useful.


   5) Do you think that the SSS should be restricted to be used only
   by system administrators (or other people in charge), or should
   they be accessible to all?

Here's where the problems start :-)  Everyone wants as many features as
possible, but quite a few of you don't want anyone else to have it.  Hmm...
Out of 35 responses on this question:

  12 - Yes, only SA's.
  10 - No.
   6 - It would be nice to have it restricted, but... How?
   5 - Have two versions; one restricted, one not.  Needless to say,
        the dangerous stuff should go in the first.
   1 - Restrict only parts that detect bugs/whatever that cannot be
        repaired.
   1 - Argh!  Help!

     Some quotable quotes:

I don't see how it could be restricted.

Admins, etc only. (possibly said because I'm an admin. From an intellectual
standpoint, I would want to know about this stuff even if I was just a
user)

I think the SSS should be restricted to system administrators with the
realisation that others can probably get their hands on the code if they
want to.

Definitely available to all, SA's can be as lazy as anyone and should not
be allowed to hide behind a veil of secrecy if, in doing so, they expose
the systems they administer.

It seems to me that only an "administrator type" will have sufficient
privilege levels to make _effective_ use of such a tool.  Ordinary users
may be able to garner _some_ benefit though, if run on their own files.  If
possible, can there be an "administrator" mode and a (restricted/limited)
"user" mode?

(and finally, my personal favorite...)

I think that a check for a hole that can't be closed shouldn't be a part of
the check, if that hole is widespread.  I have no examples of any such
hole, but a weak spot that can't be closed and has no workaround is one of
the few candidates for the security by secrecy concept.  I have mixed
feelings about this, but if I can't fix the hole, I'd rather not have its
existence be "public" knowledge.  A freely available routine to locate the
hole would spread its existence far and wide.....(?) But, if I didn't know
about it beforehand then it would be good to have a tool to tell me it
existed.  Gads, I hate moral conflicts!

   6) When a SSS finds a security flaw in a system, do you want it to
   indicate how the flaw could be used to compromise your system, or
   would you just accept the conclusion and apply a fix?

This question was ill-worded and grammatically incorrect, but still managed
to conjure up a lot of comments.  Some thought it was asking if the system
should apply a fix.  In any case, almost 3/4 said Yes, indicate exactly how
to exploit any potential hole.  As usual, there were a few with
reservations about the info getting out, but....

Here are some of the more interesting comments:

                (Think about this one!)
*I* would like to know to further my knowledge of Unix, but more importantly
to make sure that the version I have was not modified by a cracker to put
security holes *into* a system.  (That'd be sneaky :-)

Security by obfuscation doesn't work.

By definition, a SSS is a software system, and therefore has bugs in it.
If it reported a problem which would cause quite a bit of inconvenience if
fixed, or would be difficult to fix, then I would be much more apt to make
the fix if I knew how the problem could be exploited.  This is important,
because many, if not most, sites require only a moderate level of security,
and many security holes are fiendishly difficult to exploit.

We cannot assume that end-purchasers of a system can be as aware of the
internal workings of a system as the designers of the system (or SSS) are.
If a security flaw is discovered, the administrators need to be informed
about what changes are necessary to remove that flaw, and what
repercussions they may have.

Imagine a SSS that knew sendmail(8) was a security flaw allowing a worm to
enter systems.  It would report that sendmail is a security flaw, please
disable it like....  If the vendor had released a patch, and the SSS didn't
know about it, the administrator (in blind faith to this SSS program) might
disable a *very* useful program unnecessarily.

   7)  Do you think that there is too much, not enough, or just about
   the right amount of concern over computer security?  How about at
   your computer site?  At other sites?

The "not enough"s won, but not by much.  I thought that given the paranoia
of a security group, this would be a larger victory.  Lots of people said
it depends -- on the type of facility, the size, etc. Large sites seem to
have a healthier view of security (paranoia :-)) than
smaller/non-governmental.  Only 4 or 5 said there was enough concern.  A
couple of people mentioned _The Cuckoo's Egg_ as suggested reading (I
heartily agree.)

More quotes:

(I don't know if the next answer is true, but I like it anyway!)

This is really a deep philosophical question---something to talk about over
a few beers at the bar, but not here.

I think it's a site dependent problem, and all the above are true: too
much, too little, and just right. Computer security is not a "one size fits all"
situation. Having offered that opinion, I think an assessment of my site or
other sites is extraneous, and I will reserve that opinion.

... more attention to unauthorized use of the networks.

   8)  Do you think that there should be a ruling body that governs
   and enforces rules and regulations of the net -- sort of a net.police?

Some of you wondered what this had to do with software security, but just
about everyone answered anyway.  This one scared me!  The "No's" only beat
out the "yes's" by one vote.  Yikes!  Maybe I'm from the old school of
thought, but....  Several people said that it couldn't be done anyway; a
couple mentioned they would like a CERT-like agency to help out, but not
control, and finally two said that the laws and government were already
there to do this.

It's there, defacto.  The free market is working pretty well.

Absolutely. I quarrel with the "net.police" designation, per se, of course,
as do many others. But perhaps something more like a recognized trade
association, and providing similar services. Also, it is time that the
basic duties which must be reasonably performed by a site in order for it
to remain on the net should become a requirement rather than a matter of
individual whim.

Yuck!  This is very distasteful to me.  It will probably be necessary
though as more and more people participate in the net.  Enforcement will
have to be judicious until secure networking is developed and implemented
generally.

No.  Aside from the fact that it'd never work, I like Usenet as an anarchy.
It has some rough edges, but for the most part it works.  What does this
question have to do with SSS-type programs?

Enforcement will be tough and may hold back legitimate users.  But we have
to start somewhere.  So I suppose that I agree with having net.police, as
long as they don't turn things into a police.state.net.

   9)  Do you believe that breaking into other people's systems should
   continue to be against the law?

Only one said "no", and s/he had a smiley following the answer.  But there
were some of you who voiced concern that it wasn't really against the law
to begin with.  In _The Cuckoo's Nest_, Cliff Stoll talked about a
(Canadian, I think) case that the only reason the cracker was prosecuted
was for stealing electricity!  Less than a watt or something.  A few of you
mentioned denial of services as being a just reason, but what if they break
in only at night, when no one else is on, and they really don't take
anything at all?  Should that be less punishable than someone who sucks
away user CPU/disk/whatever?

Breakins should be encouraged and rewarded (1/2 :-).

Yes.  Unquestionably.  However, those laws should not attempt to regulate
inter-system traffic to cause these things to happen.

Yes - and as a felony in all cases, without exception.

Yes but murder, rape, robbery... are more important and laws and sentencing
should reflect this. There are some around who want to treat cracking as a
capital crime!

Yes, from the denial of services standpoint.  If I pay $XXX,XXX.XX for a
system, and Joe Blow slides in and sucks away at those resources, there
should be a nontrivial penalty for getting caught.  Don't behead the guy,
but monetary fines or community service would be just fine.

I don't know.  I'm not a philosopher.  Certainly causing damage to others
is wrong, including denial of service, compromising sensitive info, or
whatever.  I'm concerned though that clamping down on young kids will
discourage them from becoming computer geeks.  I think we need to encourage
our young people to become technically literate.  If we don't become a more
expert society we can kiss it goodbye; all we'll have left is our military
solutions, like some brainless jock bully...

I'm not sure that it is everywhere - but: Yes.  Should attempting to break
in be against the law: No.  Is this vague: Yes.

I did not know that it was. The laws about it have not been tested and are
vague and unclear. You need to be very clear about what the laws are going
to do.

**HELL FUCKING YES** Those of us who started in UNIX years ago have for the
most part *always* respected others!! This I can't stress strongly enough.

  10)  Is your site academic, government, or commercial in nature?

Just over 1/2 of those that answered claimed university ties, with about
1/4 being commercial, 1/6 government, a few research sites, and a couple
that were a mixture.  Sites included Sun, AT&T, SCO (Xenix), the DoD, and
the Army, among others.

(Guess where this one came from :-)

Research.  We invented Unix.

Academic with commercial applications.

Primarily academic, but we are part of the government.

Academic, except when collecting student fees *) *)

********************************************************************
                           >> END OF THIS FILE <<
***************************************************************************

------------------------------

From: Various
Subject: The CU in the News
Date: January 6, 1991

********************************************************************
***  CuD #3.00: File 6 of 6: The CU in the News                  ***
********************************************************************

From: portal!cup.portal.com!ZEL@SUN.COM
Subject: Kevin Mitnick ejected from DEC Meeting
Date: Wed,  2 Jan 91 19:30:48 PST

The December 24 edition of COMMUNICATIONS WEEK has an interesting article
on page 18 by Anne Knowles.  Quickly . . . DEC caught a fellow by the name
of Kevin Mitnick trying to register to attend their DECUS user group
meeting in Las Vegas.  According to the article he (Mitnick) is a well
known hacker who is currently on probation after having been found guilty
of breaking into Easynet.  Apparently someone recognized him while he was
registering.  They apparently barred him from the meeting and DEC is now
figuring out how to address any future attempts by "hackers" to get into
their meetings.  The article said they threw someone out of a meeting a
couple of years ago for hacking during the meeting.   One wonders exactly
what was being hacked during a training meeting!  The article says DEC
supplies networked terminals for use by attendees.

+++++++++++++++++++++++++++++++++

From: Rambo Pacifist
Subject: Another Naperville Story
Date: Sat, 5 Jan 91 05:09:22 CST

                     "Naperville man pleads innocent"
             From: CHICAGO TRIBUNE, Jan. 4, 1991, sect II p. 3
                            By Joseph Sjostrom

A former employee of Spiegel Inc. pleaded innocent Thursday to computer
fraud and other charges in connection with the alleged theft of thousands of
dollars worth of cash and credits from the company.

Michael H. Ferrell, 34, of Naperville, entered the plea before Du Page
County Associate Judge Brian F. Telander, who set the next hearing for Jan.
31.

Ferrell was indicted on Dec. 10 by the Du Page County grand jury on four
counts of computer fraud, three counts of theft and three counts of
forgery. The computer fraud indictments charge him with using computerized
cash registers in Spiegel stores on four occasions between November 1989
and September 1990 to issue $5,451.41 in credits to his Mastercard,
American Express and Spiegel's charge cards.

The theft and forgery indictments charge that he took $22,673 in cash from
the company. He allegedly generated vouchers and other forms, some of them
at the Downers Grove and Villa Park stores, that described services
performed for Spiegel by equipment renters and printers. However, those
services had never actually been performed, and Ferrell pocketed the money
that Spiegel paid for the services, according to the indictments.

Ferrell worked for Spiegel from 1981 until he was fired last Oct.  24, said
a company spokesman. Ferrell was a support services manager for the
company's catalog and outlet store operations, the spokesman said.

Ferrell was the second person charged in December by the Du Page County
state's attorney's office with the illegal use of a computer. The other
defendant was charged with computer tampering for allegedly gaining access
to computer programs in a Naperville software firm where he worked,
although he was not charged with profiting financially from the alleged
intrusion.

                               (end article)

********************************************************************

------------------------------

                           **END OF CuD #3.00**
********************************************************************