[comp.org.eff.talk] Should we let students run COPS to get each other's passwords?

mcovingt@athena.cs.uga.edu (Michael A. Covington) (06/12/91)

A few people here have been advocating the strange idea that UNIX users
have a moral right to obtain each other's passwords using COPS. I have a few
responses...

(1) Why is this any different from obtaining passwords by other forms of
snooping?  

(2) Are you saying "People with easy-to-guess passwords deserve to have their
accounts broken into"?  Blame the victim, of course, folks!  Do you say
the same thing about rape victims?

(3) Do users of our computer have a basic civil right to run any software
they want to?  Like maybe a program that writes to the disk until the disk
is full, deliberately crashing the machine?  Or does the administration have
some right to control what the computer is used for?

Come back to earth, folks.  Obtaining other users' passwords is an obvious
breach of security, regardless of how you do it.
-- 
-------------------------------------------------------
Michael A. Covington | Artificial Intelligence Programs
The University of Georgia  |  Athens, GA 30602   U.S.A.
-------------------------------------------------------

df@sei.cmu.edu (Dan Farmer) (06/12/91)

In article <foo>, mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
> A few people here have been advocating the strange idea that UNIX users
> have a moral right to obtain each other's passwords using COPS. I have a few
> responses...

  Out of curiosity, do you mind users running COPS at all, or do you
think it's too dangerous?

> (1) Why is this any different from obtaining passwords by other forms of
> snooping?  

(then later, you say:)

> Come back to earth, folks.  Obtaining other users' passwords is an obvious
> breach of security, regardless of how you do it.

  There is a simple way of stopping a password cracker from working --
don't let it see the password file (shadow passwords, for instance).  Plus,
running COPS on a system doesn't mean that you want to break into it;
I run it on every new system I get on -- it gives me at least some idea
of what I'm up against, and whether I can feel at least a little safe
storing any "interesting" information on the system.  If a student
decides to run it, just to have some idea if their senior project is
easily stolen or whatever, then I'm all for it, personally.  Of course
it's up to the individual's site policy, which might preclude this....
Snooping?  Well, consider other forms of snooping.  Like writing a trojan horse
to grab passwords?  Like looking over someone's shoulder while they type
a password?  Like packet snarfing to grab a password?  These are all
"active" attacks that usually entail some kind of mischief or malice 
behind them, although even with these it's not clear, depending on the
situation.  I wouldn't equate these with running a password cracker,
although depending on motive, the distinction can be non-existent.

  Finally, someone having a password does not mean that your system
will be broken into.  Use proactive password checking, education, and
other methods to ensure that the passwords on your system are secure.
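  Roughly, "proactive checking" means refusing a weak password before it
is ever set.  A minimal sketch of the idea in C follows -- an illustration
only, not any particular package, and the word-list path is just a guess
at a typical setup:

/*
 * pwcheck.c -- sketch of a proactive password check: reject a proposed
 * password if it is too short or is a straight dictionary word.
 * The word-list location is an assumption; use whatever your site keeps.
 */
#include <stdio.h>
#include <string.h>
#include <ctype.h>

#define WORDLIST "/usr/dict/words"

static int is_weak(const char *proposed)
{
    FILE *fp;
    char word[128], lowered[128];
    size_t i, len = strlen(proposed);

    if (len < 6 || len >= sizeof lowered)
        return 1;                       /* too short (or absurdly long) */

    for (i = 0; i <= len; i++)          /* copy, lower-cased, incl. NUL */
        lowered[i] = tolower((unsigned char)proposed[i]);

    if ((fp = fopen(WORDLIST, "r")) == NULL)
        return 1;                       /* can't check it, so refuse it */

    while (fgets(word, sizeof word, fp) != NULL) {
        word[strcspn(word, "\n")] = '\0';
        for (i = 0; word[i] != '\0'; i++)
            word[i] = tolower((unsigned char)word[i]);
        if (strcmp(word, lowered) == 0) {
            fclose(fp);
            return 1;                   /* straight dictionary word */
        }
    }
    fclose(fp);
    return 0;
}

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s proposed-password\n", argv[0]);
        return 2;
    }
    puts(is_weak(argv[1]) ? "REJECT: too easily guessed" : "ok");
    return 0;
}

A real checker would also try the reversals and single-character
variations that crackers try; the point is only that the test happens
when the password is chosen, not after the damage is done.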

> (2) Are you saying "People with easy-to-guess passwords deserve to have their
> accounts broken into"?  Blame the victim, of course, folks!  Do you say
> the same thing about rape victims?

  Getting the password is not the same as breaking into the account.
In addition, as was said in another post (muffy@remarque.berkeley.edu),
there is no defense against rape (short of killing yourself.)  There
certainly is a defense against having your account broken into because
you had a poor password.  Like changing it.

> (3) Do users of our computer have a basic civil right to run any software
> they want to?  Like maybe a program that writes to the disk until the disk
> is full, deliberately crashing the machine?  Or does the administration have
> some right to control what the computer is used for?

  How do you propose to stop people from running such software?  Are you
saying that running COPS should be against policy?  It does nothing
very special, just looks at the system involved; it certainly doesn't
break anything, it doesn't write to anyone else's files, or misuse
the information.  It's a tool that attempts to show you what's going
on on your system -- security through obscurity does not work; and
COPS merely tries to part the veil.

 -- dan

muffy@remarque.berkeley.edu (Muffy Barkocy) (06/12/91)

In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
   (2) Are you saying "People with easy-to-guess passwords deserve to have their
   accounts broken into"?  Blame the victim, of course, folks!  Do you say
   the same thing about rape victims?

No, and no.  However, I've heard this comparison to rape before, and it
is not a very good one.  A better one would be something like not
locking the front door of your house.  In the case of rape, the victim
can't take preventative measures against the attack (I don't believe
that "provocative dress" is truly a cause - if someone wants to rape
someone, they will do it however the other person is dressed).  I am not
saying to "blame the victim," but I am saying that if you leave all your
doors and windows unlocked, you should recognize that your chances of
getting broken into will probably go up, and if you do not want this to
happen, you *can* do something about it. I don't want to be raped, but
there is nothing I can do (if there is something, please let me know!)
to make it less likely.

Muffy

mcovingt@athena.cs.uga.edu (Michael A. Covington) (06/13/91)

In article <MUFFY.91Jun12081841@remarque.berkeley.edu> muffy@remarque.berkeley.edu (Muffy Barkocy) writes:
>In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>   (2) Are you saying "People with easy-to-guess passwords deserve to have their
>   accounts broken into"?  Blame the victim, of course, folks!  Do you say
>   the same thing about rape victims?
>
>No, and no.  However, I've heard this comparison to rape before, and it
>is not a very good one.  A better one would be something like not
>locking the front door of your house.  

Does that mean I have no right to prosecute a burglar who happens to
get in through an unlocked door?  Does the unlocked door justify burglary???
-- 
-------------------------------------------------------
Michael A. Covington | Artificial Intelligence Programs
The University of Georgia  |  Athens, GA 30602   U.S.A.
-------------------------------------------------------

df@sei.cmu.edu (Dan Farmer) (06/13/91)

In article <foo>, mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
> In article <bar> muffy@remarque.berkeley.edu (Muffy Barkocy) writes:
> >In article <foo2> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
> >(2)Are you saying "People with easy-to-guess passwords deserve to have their
> >   accounts broken into"?  Blame the victim, of course, folks!  Do you say
> >   the same thing about rape victims?
> >No, and no.  However, I've heard this comparison to rape before, and it
> >is not a very good one.  A better one would be something like not
> >locking the front door of your house.  
> Does that mean I have no right to prosecute a burglar who happens to
> get in through an unlocked door?  Does the unlocked door justify burglary???

  Nope.  She only said that your analogy wasn't any good (I agree), and
said nothing about breakins (computer or house-type) being justified by
leaving your door (virtual or physical) open.  And in some countries,
merely cracking a few passwords, browsing around, etc., etc., is not 
illegal, so you wouldn't have a right to prosecute, unless they were
so foolish as to come to the USA, come up to you and say to your face
that they did it.  Strange issues, all this security stuff.

 -- d

skymaste@brahms.udel.edu (Paul S Masters) (06/13/91)

		What is COPS?



-- 
      "Let there be light"   -- Bomb #20 -- Starship Dark Star

      Paul Masters N3IRU (The ham license arrived 12/04/90) 

muffy@remarque.berkeley.edu (Muffy Barkocy) (06/13/91)

In article <1991Jun12.170651.4239@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
   In article <MUFFY.91Jun12081841@remarque.berkeley.edu> muffy@remarque.berkeley.edu (Muffy Barkocy) writes:
   >In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
   >   (2) Are you saying "People with easy-to-guess passwords deserve to have their
   >   accounts broken into"?  Blame the victim, of course, folks!  Do you say
   >   the same thing about rape victims?
   >
   >No, and no.  However, I've heard this comparison to rape before, and it
   >is not a very good one.  A better one would be something like not
   >locking the front door of your house.  

   Does that mean I have no right to prosecute a burglar who happens to
   get in through an unlocked door?  Does the unlocked door justify burglary???

No, nor is that what I said.  What it does mean, though, is that if you
complain about someone walking in through your unlocked door, people
would be perfectly justified in telling you that you were stupid to
leave it unlocked.  And the analogy to rape is still wrong, which *is*
what I said.

Muffy

jet@karazm.math.uh.edu (J Eric Townsend) (06/13/91)

In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>A few people here have been advocating the strange idea that UNIX users
>have a moral right to obtain each other's passwords using COPS. I have a few
>responses...

I would mention that during breakins at UH, crackers have often copied
COPS into place and cranked it up, hoping to get pointers to all sorts
of holes.

--
J. Eric Townsend - jet@uh.edu - bitnet: jet@UHOU - vox: (713) 749-2126
Skate UNIX! (curb fault: skater dumped)

   --  If you're hacking PowerGloves and Amigas, drop me a line. --

db@argon.Eng.Sun.COM (David Brownell) (06/13/91)

In article <1991Jun12.141657.29238@athena.cs.uga.edu>
	mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:

> A few people here have been advocating the strange idea that UNIX users
> have a moral right to obtain each other's passwords using COPS.

Actually, I didn't read their comments that way.  I read them as
criticisms of a prior restraint policy, which prejudged those
users as "guilty" of some unspecified (but dire) crime.

The subject line is curious ... it addresses "students" as if they
are different from the "users" discussed in the rest of the note.
A University official might be able to claim that she was acting
in loco parentis for SOME students; not for all, and at many sites
not even for enough of them to support this as a general rationale.

If your site wants to make password guessing attacks difficult,
it should use shadow (adjunct) password files, and require users
to change their passwords relatively frequently.
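Checking whether a given machine has actually done that takes about a
dozen lines.  Here is a rough sketch in C -- an illustration only, not
lifted from COPS -- that flags any account whose password field still
carries a readable hash rather than a shadow placeholder:

/*
 * shadowcheck.c -- sketch: walk the passwd database and flag accounts
 * whose password field still holds a real hash (i.e. the system is
 * not using shadow/adjunct password files).
 */
#include <stdio.h>
#include <string.h>
#include <pwd.h>

int main(void)
{
    struct passwd *pw;

    while ((pw = getpwent()) != NULL) {
        const char *p = pw->pw_passwd;

        if (strcmp(p, "x") == 0 || strcmp(p, "*") == 0 || p[0] == '!')
            continue;                   /* shadowed or locked account */
        if (p[0] == '\0')
            printf("%s: no password at all\n", pw->pw_name);
        else
            printf("%s: world-readable hash, fair game for a guesser\n",
                   pw->pw_name);
    }
    endpwent();
    return 0;
}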


> (3) Do users of our computer have a basic civil right to run any software
> they want to?

Depends.  If they entered a contract with your organization in which
they explicitly gave up such a right, they may not have one.  The only
laws I know about pertain to actually causing damage.  Your example
is prior restraint of behaviour that in itself is not damaging.

In educational environments I'm familiar with, there are multiple
kinds of accounts on timesharing systems ... some have restrictions
like "class work only", some have no restrictions, and some are
for "educational" use.  Only that "class work only" restriction
precludes students running cryptanalysis programs; and if they were
taking a cryptography course, not even then.

I think it's quite appropriate that students explore social issues like
"how do the social values and social costs of providing various kinds
of security differ in this computer system?".  They learn a lot when
they notice that "security" means different things to different people,
what the hot buttons are for different personality types, and that
there's often an asymmetric cost/benefit matrix.  If not in school,
when are they allowed to start finding these things out?

Also, it strikes me as counterproductive to claim (one side of the
mouth) that your computer is secure, and also (other side of mouth)
that your user community should not be able to evaluate those claims
for itself.  "Trust me, I'm from the government."  No thanks.

- Dave

#include <std/disclaimer.h>

gl8f@astsun7.astro.Virginia.EDU (Greg Lindahl) (06/13/91)

In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:

>A few people here have been advocating the strange idea that UNIX users
>have a moral right to obtain each other's passwords using COPS. I have a few
>responses...

I'd like to point out that this isn't my point at all; rather, I've
been trying to say that the illegal act here is breaking into a
system. Mr. Covington seems to have lost sight of this.

I've also been saying that a responsible sysadmin should close obvious
holes. Mister Covington seems to think this is a blame-the-victim
mentality. I think it's good professional practice. Sysadmins should
expect that users need to be educated about proper security
procedures; any sysadmin that doesn't should be fired no matter
whether a break-in is detected or not.

I never claimed that lax procedures justify a break-in. I don't think
they do. End of story.

woodcock@mentor.cc.purdue.edu (Bruce Sterling Woodcock) (06/13/91)

In article <15013@exodus.Eng.Sun.COM> db@argon.Eng.Sun.COM (David Brownell) writes:
>In article <1991Jun12.141657.29238@athena.cs.uga.edu>
>	mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>
>> A few people here have been advocating the strange idea that UNIX users
>> have a moral right to obtain each other's passwords using COPS.
>
>[various deleted]
>
>Also, it strikes me as counterproductive to claim (one side of the
>mouth) that your computer is secure, and also (other side of mouth)
>that your user community should not be able to evaluate those claims
>for itself.  "Trust me, I'm from the government."  No thanks.
>
>- Dave

I just thought I'd throw in my thoughts on this.  Being very security
conscious lately, especially after the incidents from the FSF machines,
I've begun running cops almost immediately after I get a new account
someplace.  I want to know just *how* much I can trust the machine, and
whether or not there are any really major holes that I think the staff there
should be warned about.

Recently, I got my FSF account reinstated, and so what was one of the first
things I did?  I ran cops.  About halfway through the process I received a
talk request from one official, asking why I was running it.  I assured them
I was not a cracker and simply wanted to see how secure the systems were
now... he told me (politely) that security was something that was their
concern, not mine, and that they were also wary of users who felt they had
some sort of obligation to enforce security for other users, as if doing
the staff members a favor (or their job for them).

At this point, I had a choice.  And while I still feel I had every right to
determine the security of that system for myself (without exploiting any
breaches, mind you, simply looking for them), I also realized that they
were understandably very paranoid and cautious about the whole thing and
that I was simply a guest on their system.  So I killed the process, and
deleted the long report (without even reading it) and even removed all the
programs.  I also received email about a half-hour later from another admin
there, noting that I had called crypt a suspiciously high number of times and
that he hoped I wasn't a cracker.  I referred him to the talk I had before
and he replied understandingly, noting that security was something they still
couldn't guarantee very well.

So naturally I don't trust their own evaluation, especially when it is by
their own admission not good.  So I simply don't trust what I keep there to
any great extent.

Bruce

-- 
|    woodcock@mentor.cc.purdue.edu    | "If I can sell explosives to IH, then |
|       sirbruce@gnu.ai.mit.edu       | there's no reason you can't sell me a |
| sterling@maxwell.physics.purdue.edu | box of condoms."  - Jasper, in RL     |
|   Bruce@Asylum/CaveMUCK/FurryMUCK   | "I can't believe I'm doing this." - me|

bhv@areaplg2.corp.mot.com (Bronis Vidugiris) (06/13/91)

In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
)A few people here have been advocating the strange idea that UNIX users
)have a moral right to obtain each other's passwords using COPS. I have a few
)responses...

I imagine you can justify not allowing use of certain programs, such as COPS,
just as you can attempt to disallow people from running 'game' programs on
the school machines.  And probably, with about equal success....

I think, however, that the real problem is not in knowing other students'
passwords, but using their accounts without permission or authorization.

Anyway, if just 'knowing' other people's passwords is a crime, I'm guilty.  I
know several - mainly because people don't want the hassle of learning how to
ftp files around our heterogeneous internal computer network.

It seems to me that a shadow password file, as other posters have suggested,
is a much more realistic way to attempt to keep a lid on things.

-- 
* Disclaimer *

This posting (probably) represents what the NNTP socket was told, but it
isn't representative of Company Policy or Opinion.

bhv@mot.com

campbell@redsox.bsw.com (Larry Campbell) (06/13/91)

In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
-A few people here have been advocating the strange idea that UNIX users
-have a moral right to obtain each other's passwords using COPS. I have a few
-responses...
-
-(1) Why is this any different from obtaining passwords by other forms of
-snooping?  

Snooping without intrusion is not, and should not be, illegal.  Would you
outlaw telescopes?  Police scanners?

-(2) Are you saying "People with easy-to-guess passwords deserve to have their
-accounts broken into"?  Blame the victim, of course, folks!  Do you say
-the same thing about rape victims?

Having my password obtained (not used, just obtained) *hardly* equates to
being raped.  And I don't think *anyone* here has advocated breaking into
accounts -- simply guessing the password is *not* the same as breaking in!

-(3) Do users of our computer have a basic civil right to run any software
-they want to?  Like maybe a program that writes to the disk until the disk
-is full, deliberately crashing the machine?  Or does the administration have
-some right to control what the computer is used for?

No, and no one said they did.  Assuming the user has a right to a reasonable
amount of CPU cycles and disk space, what they do with this -- in an
academic environment -- should be pretty much up to them.  If they want to
try to crack passwords, fine!

-Come back to earth, folks.  Obtaining other users' passwords is an obvious
-breach of security, regardless of how you do it.

Wrong again.  It is a *potential* breach of security.

Look at it this way.  Suppose the intelligent teenager next door uses his
telescope to make close-up photographs of my house key while I'm using it.
Suppose further that he digitizes the photographs, and using a desktop
manufacturing setup (okay, he's a *rich* teenager) manages to produce a
workable duplicate of my house key.  

I think Michael Covington would have him clapped in irons immediately.  I,
however, claim that he has done nothing illegal, or even immoral.  The line
would be crossed, though, if he tried to *use* the key (without my
permission), or if he gave the key to someone else.

There's entirely too much hysteria surrounding hacking -- it's beginning to
sound a bit like the entirely specious War on Drugs -- and it sounds like
Michael Covington, and some of the other readers of this group, have
succumbed.
-- 
Larry Campbell             The Boston Software Works, Inc., 120 Fulton Street
campbell@redsox.bsw.com    Boston, Massachusetts 02109 (USA)

mcovingt@athena.cs.uga.edu (Michael A. Covington) (06/13/91)

Whoa there, everybody.

(1) I never claimed my computer was secure. I claim, very loudly, that
no computer on the network is perfectly secure.

(2) I stick to my guns. Running a password guesser is inappropriate
behavior because it involves access to other people's confidential
information. The encrypted password is world readable; the password
itself is not; that's why it's encrypted!

Stealing another person's confidential data (even by trial-and-error
guessing) is against the conduct regulations of any university you
care to name. I see no reason why this ordinarily culpable activity
should become innocent merely because it is performed with software.

For those who tuned in late: we HAVE NOT penalized any student for
running COPS. This is a hypothetical case only. We probably would not penalize
a student for running COPS unless he subsequently used or divulged
the passwords that he obtained, or there was clear evidence that he intended
to do so. (He would probably get nothing; we
run COPS ourselves, too!)
-- 
-------------------------------------------------------
Michael A. Covington | Artificial Intelligence Programs
The University of Georgia  |  Athens, GA 30602   U.S.A.
-------------------------------------------------------

mcovingt@athena.cs.uga.edu (Michael A. Covington) (06/13/91)

In article <1991Jun12.211143.18803@murdoch.acc.Virginia.EDU> gl8f@astsun7.astro.Virginia.EDU (Greg Lindahl) writes:
>In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>
>>A few people here have been advocating the strange idea that UNIX users
>>have a moral right to obtain each other's passwords using COPS. I have a few
>>responses...
>
>I'd like to point out that this isn't my point at all; rather, I've
>been trying to say that the illegal act here is breaking into a
>system. Mr. Covington seems to have lost sight of this.

  -- Or facilitating a break-in by others.

>
>I've also been saying that a responsible sysadmin should close obvious
>holes. 

  -- I agree.

  -- What YOU have lost sight of is that no computer will ever
     be perfectly free of security holes.  

>Mister Covington seems to think this is a blame-the-victim
>mentality. 

  -- Only when people take it to the extreme of saying that if
     a system has holes, people shouldn't be punished for
     exploiting those holes. And this is a very common attitude.

  -- My point is extremely simple: honest people don't even TRY to
     break into other people's accounts or obtain passwords without
     authorization.  Security holes or not!





-- 
-------------------------------------------------------
Michael A. Covington | Artificial Intelligence Programs
The University of Georgia  |  Athens, GA 30602   U.S.A.
-------------------------------------------------------

awessels@ccwf.cc.utexas.edu (Allen Wessels) (06/13/91)

In article <1991Jun13.042534.16952@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:

>  -- My point is extremely simple: honest people don't even TRY to
>     break into other people's accounts or obtain passwords without
>     authorization.  Security holes or not!

Your point is very simple.  And absurd.  I've watched several people attempt
to crack systems.  In most cases, their intent was to see if it could be done.
That's it.  Sys admins were considered members of a "club" that threw up
security as a challenge.  The entry test was the system security.  (Not
literally of course.)

Honest people do break into systems.  I've witnessed them do the crack and then
report it to the systems people.  Sounds pretty honest to me.

mcovingt@athena.cs.uga.edu (Michael A. Covington) (06/13/91)

In article <1991Jun13.033953.16881@redsox.bsw.com> campbell@redsox.bsw.com (Larry Campbell) writes:
>In
>I think Michael Covington would have him clapped in irons immediately.  I,
>however, claim that he has done nothing illegal, or even immoral.  The line
>would be crossed, though, if he tried to *use* the key (without my
>permission), or if he gave the key to someone else.
>
>There's entirely too much hysteria surrounding hacking -- it's beginning to
>sound a bit like the entirely specious War on Drugs -- and it sounds like
>Michael Covington, and some of the other readers of this group, have
>succumbed.


Let's discuss ideas here, not people.

Please do not attach my name to your untested hypotheses.



-- 
-------------------------------------------------------
Michael A. Covington | Artificial Intelligence Programs
The University of Georgia  |  Athens, GA 30602   U.S.A.
-------------------------------------------------------

mcovingt@athena.cs.uga.edu (Michael A. Covington) (06/13/91)

In article <50445@ut-emx.uucp> awessels@ccwf.cc.utexas.edu (Allen Wessels) writes:
>In article <1991Jun13.042534.16952@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>
>>  -- My point is extremely simple: honest people don't even TRY to
>>     break into other people's accounts or obtain passwords without
>>     authorization.  Security holes or not!
>
>Your point is very simple.  And absurd.  I've watched several people attempt
>to crack systems.  In most cases, their intent was to see if it could be done.
>That's it.  

By "don't" I meant "ideally don't".  

That's precisely the point on which I want to raise the ethical
consciousness of the user community.

Honest people do not go around picking the locks on people's houses or
cars, not even "to test security." I see no reason why the ethics of
computers should be any different.


-- 
-------------------------------------------------------
Michael A. Covington | Artificial Intelligence Programs
The University of Georgia  |  Athens, GA 30602   U.S.A.
-------------------------------------------------------

brian@ucsd.Edu (Brian Kantor) (06/13/91)

mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>Honest people do not go around picking the locks on people's houses or
>cars, not even "to test security." I see no reason why the ethics of
>computers should be any different.

Actually, some of the most honest people in the world DO just that -
locksmiths.  They do it for money.  There are also amateur locksmiths 
who do it for the intellectual challenge.

Many computer geeks consider computer system security barriers in
exactly the same way as an amateur locksmith considers a lock - it's a
challenge to his intellect.

Where the difference comes in is that most people learning to pick
locks do it on locks that don't protect something - typically, the lock
is mounted in a little piece of wood.  They don't practice on bank
vaults, the neighbor's front door, or in the car park.  Computer security
systems rarely exist in isolation - they're usually attached to a
computer that's got lots of other stuff on it.

But in my experience, the people who want to pick their way through
locks and computer security systems are insatiably curious - and once
one of them is inside somewhere that has a lot of OTHER interesting
stuff to examine, the temptation is overwhelming.  I don't believe that
any significant number of those people, students or not, who pick their
way into a computer system are going to just leave without also looking
around and prying into data stored there - data that is quite possibly
personal and private.

And I can't believe that some of them won't do some unintended damage
in the process of getting in or looking around - they are, after all,
learning, and one learns by making mistakes.

So overall, I prefer to keep the uninvited out of my systems.  I'd like
to provide some place for people to play, but I haven't the time to
do the kind of monitoring that's required to make such a system work.

I wish I had an answer to this.  I don't.  If only we had some magic
touchstone....
	- Brian

bernie@metapro.DIALix.oz.au (Bernd Felsche) (06/14/91)

In <1991Jun13.042115.16845@athena.cs.uga.edu>
   mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:

>Whoa there, everybody.

>(2) I stick to my guns. Running a password guesser is inappropriate
>behavior because it involves access to other people's confidential
>information. The encrypted password is world readable; the password
>itself is not; that's why it's encrypted!

Running a guesser is not breaking confidentiality. If I guessed that
you had red hair, never having seen you, and found out that you did
indeed have red hair, then I would not be breaking confidentiality,
even if you do wear a hat all the time.

All I gain, upon verification, is that you have red hair, or don't.
You can go and change the colour, that very day.

You are assuming that somebody guessing passwords intends to break
confidentiality, yet they may be seeking to protect their own password
by ensuring that nobody else's is guessable. You are punishing them
for checking the level of security in their environment.

You allow students to run COPS. Do you _encourage_ them to do so?
Security only works if it is enforced at all levels. 
-- 
Bernd Felsche,                 _--_|\   #include <std/disclaimer.h>
Metapro Systems,              / sold \  Fax:   +61 9 472 3337
328 Albany Highway,           \_.--._/  Phone: +61 9 362 9355
Victoria Park,  Western Australia   v   Email: bernie@metapro.DIALix.oz.au

mcovingt@athena.cs.uga.edu (Michael A. Covington) (06/15/91)

In article <1991Jun14.053131.753@metapro.DIALix.oz.au> bernie@metapro.DIALix.oz.au (Bernd Felsche) writes:
>In <1991Jun13.042115.16845@athena.cs.uga.edu>
>   mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>
>>(2) I stick to my guns. Running a password guesser is inappropriate
>>behavior because it involves access to other people's confidential
>>information. The encrypted password is world readable; the password
>>itself is not; that's why it's encrypted!
>
>Running a guesser is not breaking confidentiality. If I guessed that
>you had red hair, never having seen you, and found out that you did
>indeed have red hair, then I would not be breaking confidentiality,
>even if you do wear a hat all the time.

Balderdash.  Information obtained by trial-and-error is still information!


-- 
-------------------------------------------------------
Michael A. Covington | Artificial Intelligence Programs
The University of Georgia  |  Athens, GA 30602   U.S.A.
-------------------------------------------------------

gordon@sneaky.lonestar.org (Gordon Burditt) (06/15/91)

>> (2) Are you saying "People with easy-to-guess passwords deserve to have their
>> accounts broken into"?  Blame the victim, of course, folks!  Do you say
>> the same thing about rape victims?
>
>  Getting the password is not the same as breaking into the account.
>In addition, as was said in another post (muffy@remarque.berkeley.edu),
>there is no defense against rape (short of killing yourself.)  There

There are defenses against rape, even though they aren't perfect. (And
killing yourself isn't a perfect defense, either).  Placing yourself on 
an otherwise uninhabited and unknown desert island is one.  Knowing martial 
arts and having an Uzi built into your artificial arm is another.  Some of 
the people on alt.sex.bondage have sources for equipment which can act as 
protection.

The existence of a defense is not a justification for blaming the victim
because they don't use it.

					Gordon L. Burditt
					sneaky.lonestar.org!gordon

campbell@redsox.bsw.com (Larry Campbell) (06/15/91)

In article <1991Jun14.193545.24869@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
->Running a guesser is not breaking confidentiality. If I guessed that
->you had red hair, never having seen you, and found out that you did
->indeed have red hair, then I would not be breaking confidentiality,
->even if you do wear a hat all the time.
-
-Balderdash.  Information obtained by trial-and-error is still information!

Excuse me, but are we all speaking *English* here, or some new language with
which I am not familiar?  "Obtaining information" is not a breach of
confidentiality.  To violate a confidence, there must first *be* a confidence
to be violated.  A confidence exists when person A gives person B some
information, person B having agreed -- either implicitly or explicitly --
to keep the information to himself.

Posting your "confidential information" in encrypted form in a public place
hardly constitutes a confidence.  *You* may regard the information as
confidential, but there is no second party -- no person B -- who has agreed
not to violate the confidence.  Any confidentiality is entirely a figment of
your imagination.

If you don't like it, then either don't post your encrypted secrets (i.e.,
use a shadow password file), or get a better encryption algorithm.  But
don't go persecuting the curious and clever students who find the puzzle
challenging!!!
-- 
Larry Campbell             The Boston Software Works, Inc., 120 Fulton Street
campbell@redsox.bsw.com    Boston, Massachusetts 02109 (USA)

mrs@netcom.COM (Morgan Schweers) (06/15/91)

Some time ago mcovingt@athena.cs.uga.edu (Michael A. Covington) happily mumbled: 
>In article <1991Jun12.211143.18803@murdoch.acc.Virginia.EDU> gl8f@astsun7.astro.Virginia.EDU (Greg Lindahl) writes:
>>In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>>
>>>A few people here have been advocating the strange idea that UNIX users
>>>have a moral right to obtain each other's passwords using COPS. I have a few
>>>responses...
>>
>>I'd like to point out that this isn't my point at all; rather, I've
>>been trying to say that the illegal act here is breaking into a
>>system. Mr. Covington seems to have lost sight of this.
>
>  -- Or facilitating a break-in by others.
>
    True, however COPS is a TOOL, and does not signify a break-in.  It
signifies a user concerned (whether for good or for evil is unknown)
about security on the system in question.
>>
>>I've also been saying that a responsible sysadmin should close obvious
>>holes. 
>
>  -- I agree.
>
>  -- What YOU have lost sight of is that no computer will ever
>     be perfectly free of security holes.  
>
    Very good, sir.  However, if the system manager uses COPS and removes
the holes listed, then the cracker *AND* the user will not find anything.

>>Mister Covington seems to think this is a blame-the-victim
>>mentality. 
>
>  -- Only when people take it to the extreme of saying that if
>     a system has holes, people shouldn't be punished for
>     exploiting those holes. And this is a very common attitude.

     No one (as far as I can tell) is saying that.  They are saying
that people should not be punished for DETERMINING that there are
holes.
>
>  -- My point is extremely simple: honest people don't even TRY to
>     break into other people's accounts or obtain passwords without
>     authorization.  Security holes or not!

     This is bull$#!t, excuse my language.  I was a student at an
East Coast college, and I developed a small package to test the
security of the local VMS system.  I did it because I wanted to
learn how to use the library functions, as well as to evaluate
how strong the security under VMS was.

     I handed the data I learned over to the system operators,
and *THEY* didn't know what to do with the information.  I
proceeded to go up the chain of managers, until I managed to
convince a Very Highly Placed Personage that with one command
I could crash VMS V4.0.
     I also convinced them that the algorithm they were using
to generate passwords was a VERY VERY VERY bad idea, and
proceeded to demonstrate IN FRONT OF THEM that one could enter
any one of 3000 accounts knowing *NOTHING* about the student
in question.  I had confirmed my knowledge with a fellow student,
and *NOT* with anyone else's accounts before this.
    They quickly revamped security, and regenerated the entire set of
passwords with random ones.  Sadly, this didn't work out too
well either, but that's another story.

     In any case, they were grateful.

     I learned a great deal from this, and the knowledge gained
STILL has application after I've long since migrated from VMS
systems.

     Am I an honest person?  Obviously I'll say yes.  If you
think that I'm lying, then you would say no.  I dispute your
claim that no user who is 'honest' is interested in obtaining
passwords without authorization, in any case.
    I enjoy knowledge for knowledge's sake.  If I can help
someone out through it, I do so.  (I was called by a user
once who had forgotten his password.  I wasn't the official
person to call, but I was a friend of his and the campus
was closed.  (Yes, there *WERE* no system operators there
at night.)  I cracked his password and told it to him.  Was
this honest?  Probably.  Was this the 'right' thing to do?
That's an ethical decision that *I* made, and that *YOU*
would have to make too.)

>-------------------------------------------------------
>Michael A. Covington | Artificial Intelligence Programs
>The University of Georgia  |  Athens, GA 30602   U.S.A.
>-------------------------------------------------------

                                       --  Morgan Schweers

-- 
mrs@netcom.com   |   Morgan Schweers   |  Good code, good food, good sex.  Is
ms@gnu.ai.mit.edu|   These messages    |  anything else important?  --  Freela
Kilroy Balore    |   are not the       +--------------------------------------
Freela           |   opinion of anyone.|  I *AM* an AI.  I'm not real...

mcovingt@athena.cs.uga.edu (Michael A. Covington) (06/15/91)

In article <1991Jun15.024453.17639@redsox.bsw.com> campbell@redsox.bsw.com (Larry Campbell) writes:

>"Obtaining information" is not a breach of
>confidentiality.  To violate a confidence, there must first *be* a confidence
>to be violated.  A confidence exists when person A gives person B some
>information, person B having agreed -- either implicitly or explicitly --
>to keep the information to himself.
>

You've deliberately misunderstood what I meant by "confidential."
Passwords are secret. Period. You have no right to decrypt other
people's passwords. Period. Regardless of the technical difficulty or ease
of doing so.

OK then, if passwords aren't secret, give me yours!!!


-- 
-------------------------------------------------------
Michael A. Covington | Artificial Intelligence Programs
The University of Georgia  |  Athens, GA 30602   U.S.A.
-------------------------------------------------------

mbrown@testsys.austin.ibm.com (Mark Brown) (06/15/91)

campbell@redsox.bsw.com (Larry Campbell) writes:
| mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
| ->Running a guesser is not breaking confidentiality. If I guessed that
| -Balderdash.  Information obtained by trial-and-error is still information!
| 
| Excuse me, but are we all speaking *English* here, or some new language with
| which I am not familiar?  "Obtaining information" is not a breach of
| confidentiality.  To violate a confidence, there must first *be* a confidence
| to be violated.  A confidence exists when person A gives person B some
| information, person B having agreed -- either implicitly or explicitly --
| to keep the information to himself.

Yup. And I'll maintain that "B", being a user on the system, agreed implicitly
to maintain confidentiality.

| Posting your "confidential information" in encrypted form in a public place
| hardly constitutes a confidence.  *You* may regard the information as
| confidential, but there is no second party -- no person B -- who has agreed
| not to violate the confidence.  Any confidentiality is entirely a figment of
| your imagination.

Nope. "B", a user on the system, has a responsibility to that system.

| If you don't like it, then either don't post your encrypted secrets (i.e.,
| use a shadow password file), or get a better encryption algorithm.  But
| don't go persecuting the curious and clever students who find the puzzle
| challenging!!!

I'll prosecute any of the "curious and clever" who find the locks to my
front door (locked or unlocked) a "challenging puzzle".

I'll also "persecute" users of my systems who try to subvert the security of
that system without my permission. That security isn't there as a "puzzle",
fool.


DISCLAIMER: My views may be, and often are, independent of IBM official policy.
Mark Brown       IBM PSP Austin, TX. |     Crazed Philosophy Student
(512) 823-3741   VNET: MBROWN@AUSVMQ |   Kills 15 In Existential Rage!
MAIL: mbrown@testsys.austin.ibm.com  |                      --tabloid headline

mbrown@testsys.austin.ibm.com (Mark Brown) (06/15/91)

Several Different people write Many Things:
| >>I'd like to point out that this isn't my point at all; rather, I've
| >>been trying to say that the illegal act here is breaking into a
| >>system. Mr. Covington seems to have lost sight of this.
| >
| >  -- Or facilitating a break-in by others.
| >
|     True, however COPS is a TOOL, and does not signify a break-in.  It
| signifies a user concerned (whether for good or for evil is unknown)
| about security on the system in question.

Yup. Thus, if I find out someone's using COPS on my system, I'm sure as hell
going to be concerned, 'cause I don't know the intent.

|     Very good, sir.  However, if the system manager uses COPS and removes
| the holes listed, then the cracker *AND* the user will not find anything.

If there are holes, though, exploiting them still isn't something I'm going
to condone or allow. If the intent is to harm or even to *explore*, I'm 
going to shut them down *hard*. "explore?" you say? 

Yes. Information is valuable.

|      No one (as far as I can tell) is saying that.  They are saying
| that people should not be punished for DETERMINING that there are
| holes.

Ah. Here's where intent (and determining intent) come in.
If *I* were the admin for a large system with 100s of users I didn't know
very well (any major university) *I'd* be viewing "probing for security
holes" with hostility, too.

I'm not about to waste my time trying to determine "truth of intent" for all
these students. I'm going to prohibit that behavior unless permission is 
granted in *advance*.

You are free to "test" my system all you want, provided I know what/why you
are doing it. If I don't know, I *have* to assume hostile intent to protect
my other users.

If you view this as "persecution", get your own system, or find out what
it's like to be responsible for one.

DISCLAIMER: My views may be, and often are, independent of IBM official policy.
Mark Brown       IBM PSP Austin, TX. |     Crazed Philosophy Student
(512) 823-3741   VNET: MBROWN@AUSVMQ |   Kills 15 In Existential Rage!
MAIL: mbrown@testsys.austin.ibm.com  |                      --tabloid headline

bernie@metapro.DIALix.oz.au (Bernd Felsche) (06/16/91)

In <1991Jun15.152057.7681@athena.cs.uga.edu>
   mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:

>You've deliberately misunderstood what I meant by "confidential."
>Passwords are secret. Period. You have no right to decrypt other
>people's passwords. Period. Regardless of the technical difficulty or ease
>of doing so.

>OK then, if passwords aren't secret, give me yours!!!

bernie:VINZ3VL2Gz8A6,2.RF:402:100:Bernd Felsche:/u1/us/bernie:/bin/ksh

Good luck! You'll note that it expires shortly.
-- 
Bernd Felsche,                 _--_|\   #include <std/disclaimer.h>
Metapro Systems,              / sold \  Fax:   +61 9 472 3337
328 Albany Highway,           \_.--._/  Phone: +61 9 362 9355
Victoria Park,  Western Australia   v   Email: bernie@metapro.DIALix.oz.au

df@sei.cmu.edu (Dan Farmer) (06/16/91)

In article <foo>, mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
[...]
> You've deliberately misunderstood what I meant by "confidential."
> Passwords are secret. Period. You have no right to decrypt other
> people's passwords. Period. Regardless of the technical difficulty or ease
> of doing so.

  Why?  We don't have *explicit* rights to do much of anything in this
world -- mostly people are constrained by laws against certain actions.
You have no authority or "right" to tell me what my rights are -- living
in the US of A, supposedly we have free speech, etc., but even those are
extremely tenuous.  What is a "right", anyway?  I'm not sure what you're
even talking about here -- the word is abused, misconstrued, and generally
misused.  Certainly there are no judicial laws that I'm aware of against
cracking passwords.  There are some laws, in some countries, against
breaking into (whatever that means) certain systems.  If you're talking
about some *moral* law or right, you haven't made a convincing argument --
all you are saying, as far as I can see, is "because that's the way I
think it should be."  I suppose some sites can make a policy that, for anyone
on their system, cracking passwords is a punishable (by whatever) offence,
but I'm not sure how well that would hold up in court, if someone
decided to challenge it.

> OK then, if passwords aren't secret, give me yours!!!

  Sure:

df:T8oOksRWnnA8Y:3271:20:Dan:/usr/users/df:/usr/local/bin/tcsh

  Break it if you can.

 -- d

mcovingt@athena.cs.uga.edu (Michael A. Covington) (06/17/91)

Stealing passwords is a violation of Georgia law and of the
policies that students promise, in writing, to obey when they 
receive computer accounts.

All this casuistry about what is meant by "confidentiality" is pointless.
Passwords are clearly confidential information.

I see no reason to keep repeating this.
-- 
-------------------------------------------------------
Michael A. Covington | Artificial Intelligence Programs
The University of Georgia  |  Athens, GA 30602   U.S.A.
-------------------------------------------------------

emv@msen.com (Ed Vielmetti) (06/17/91)

In article <1991Jun16.214835.26892@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:

   Stealing passwords is a violation of Georgia law and of the
   policies that students promise, in writing, to obey when they 
   receive computer accounts.

Do you have the text of the relevant Georgia law, and of the written
policy statements which students are required to sign?  That'd be a
good addition to the archive of policy statements kept at
ftp.cs.widener.edu:/pub/cud/schools/ .  

Note that there are some schools which have an "honor code" of some
sort that is a blanket prohibition of all kinds of unethical behavior;
for instance, if you were to go about cracking passwords at the U of
Michigan engineering school chances are that as a student the first
thing that'll be thrown against you is an honor code violation.

   All this casuistry about what is meant by "confidentiality" is pointless.

Nice word, casuistry.  American Heritage just says "determination of
right and wrong in questions of conduct or conscience by the
application of general principles of ethics".  Webster goes a bit
farther and says "sophistical, equivocal, or specious reasoning."
Apparently you have a low opinion of people who are critical of your
policies.  

That said, there are any number of people who make decisions about
where to pursue further schooling or employment based on the
reasonableness of the computing environment provided to them; to be
blunt, the U of Georgia doesn't appear to have a particularly
appealing setup.  A giant password file out in the open, and the only
effective means of ensuring security of any sort is via litigation
and kicking people off of the system and out of school.  

--Ed

ear@wpi.WPI.EDU (Eric A Rasmussen) (06/17/91)

In article <27111@as0c.sei.cmu.edu> df@sei.cmu.edu (Dan Farmer) writes:
>In article <foo>, mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>> OK then, if passwords aren't secret, give me yours!!!
>  Sure:
>df:T8oOksRWnnA8Y:3271:20:Dan:/usr/users/df:/usr/local/bin/tcsh
>  Break it if you can.

Ok, now here's an interesting (IMHO) question...

By distributing your password in encrypted form and encouraging others to
crack it, are you guilty of the same 'crime' as that student who distributed
his system's /etc/passwd file to a known cracker?  Should any action be taken
against you for possibly compromising the security of your system, and if so
what?

(Note that I am just raising a point and am not advocating actually getting you
in trouble.)

+---------< Eric A. Rasmussen - Mr. Neat-O (tm) >---------+ +< Email Address >+
|   A real engineer never reads the instructions first.   | | ear@wpi.wpi.edu |
|   (They figure out how it works by playing with it.)    | | ear%wpi@wpi.edu |
+---------------------------------------------------------+ +-----------------+
                     ((( In Stereo Where Available )))

learn@ddsw1.MCS.COM (William Vajk) (06/17/91)

In article <7681@athena.cs.uga.edu> Michael A. Covington writes:

>OK then, if passwords aren't secret, give me yours!!!

Of course the entire purpose of passwords is security through secrecy.

One that I used as a root password, 'welcom'

Misspellings are great. Classical words and terms not found in dictionaries
are just as good.

Bill Vajk

df@sei.cmu.edu (Dan Farmer) (06/17/91)

In article <foo2>, ear@wpi.WPI.EDU (Eric A Rasmussen) writes:
> In article <bar> df@sei.cmu.edu (Dan Farmer) writes:
> >In article <foo>, mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
> >> OK then, if passwords aren't secret, give me yours!!!
> >  Sure:
> >df:T8oOksRWnnA8Y:3271:20:Dan:/usr/users/df:/usr/local/bin/tcsh
> >  Break it if you can.
> By distributing your password in encrypted form and encouraging others to
> crack it, are you guilty of the same 'crime' as that student who distributed
> his system's /etc/passwd file to a known cracker?  Should any action be taken
> against you for possibly compromising the security of your system, and if so
> what?

   Well, since the password is for a machine that is not on the internet,
I don't think there's much of a problem (yeah, I know, I cheated).  But it
*is* an interesting point.  I'm not sure if it matters, but I make
my living on computer security and how secure passwords are, and I know
that that password is pretty much uncrackable by "normal" means --
unless you did an exhaustive search, you'd be out of luck unless you got
"lucky" with a random statistical guess.  I suppose you could certainly
make a case for it, I don't know -- perhaps, since I'm an administrator for
the machine the password is from, I'm exempt, since I would handle
any breakin problems?  I suppose if my machine was somehow broken into
because of my post, I might get into trouble, although they would need
physical access to the machine, and then passwords would be the last of
my problems.

  One last point -- I haven't heard of anyone getting prosecuted (in a
legal/judicial sense) for the act of distributing password files, *unless*
some misfortune happened as a result of this, although if you were from
a classified site, you might get into serious trouble, and you could
probably get fired or suspended for distributing passwords or password
files, depending on what your site policy was.  And even though Michael
Covington says that "stealing passwords is a violation of Georgia law",
I'd be surprised if this were actually true; more likely, if you use
stolen passwords to break into a system, you can get in trouble.

 -- dan

tighe@convex.com (Mike Tighe) (06/17/91)

In article <27137@as0c.sei.cmu.edu> df@sei.cmu.edu (Dan Farmer) writes:
>In article <foo2>, ear@wpi.WPI.EDU (Eric A Rasmussen) writes:
>> In article <bar> df@sei.cmu.edu (Dan Farmer) writes:
>>> In article <foo>, mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:

>>>> OK then, if passwords aren't secret, give me yours!!!

>>> Sure:
>>> df:T8oOksRWnnA8Y:3271:20:Dan:/usr/users/df:/usr/local/bin/tcsh
>>> Break it if you can.

>> By distributing your password in encrypted form and encouraging others to
>> crack it, are you guilty of the same 'crime' as that student who
>> distributed his system's /etc/passwd file to a known cracker?  Should any
>> action be taken against you for possibly compromising the security of your
>> system, and if so what?

> I suppose if my machine was somehow broken into because of my post, I might
> get into trouble, although they would need physical access to the
> machine, and then passwords would be the last of my problems.

Well, if Dan can get in trouble for giving away his encrypted password,
what about the person who solicited it? Is Michael Covington now guilty of
something, since he tried to get -- and succeeded in getting -- several folks
to post their encrypted passwords to the net?

What if some bad guy sees these encrypted postings, and breaks into these
systems?  Is Michael at fault?  After all, if he had not solicited these
passwords, they would not have been posted. Right?

My view is no: neither Dan Farmer nor Michael Covington is guilty of
anything. Only the person who broke into the systems is guilty. One could
argue that Dan and Michael et al. did not exercise good judgement, but they
didn't break into anything, and more importantly, they had no intent to.

Other views?
--
-------------------------------------------------------------
Mike Tighe, Internet: tighe@convex.com, Voice: (214) 497-4206  
-------------------------------------------------------------

df@sei.cmu.edu (Dan Farmer) (06/17/91)

In article <foo>, learn@ddsw1.MCS.COM (William Vajk) writes:
> Of course the entire purpose of passwords is security through secrecy.
> One that I used as a root password, 'welcom'
> Misspellings are great. Classical words and terms not found in dictionaries
> are just as good.

  Not if you don't want to get broken into.  Larger, more comprehensive
on line dictionaries are becoming easier to get access to.  *Any* word
that is found in a dictionary can be easily guessed, whether it is an
English or foreign-language word/term.  I also wouldn't advise such things
as chopping off a character from the back or front of a word (thanks,
BTW, for the idea -- just added it to COPS), replacing an "o" (the letter
"oh") by a 0 ("zero"), adding a single digit, capitalizing a single word,
etc., etc., etc.  Dan Klein wrote up a nifty paper in the Summer '90
workshop, talking more about this, if you're into that kind of thing.
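To make that concrete, here's a rough sketch in C of the kind of guessing
loop I mean -- not the COPS code itself, and the hash in it is made up
for illustration, not one of the hashes posted in this thread.  It runs
each word, plus a few of the trivial variations above, through crypt(3)
with the hash's salt:

/*
 * guess.c -- sketch of dictionary guessing against one passwd-style
 * hash, including a few trivial variations of each word.
 * crypt(3) is declared directly below; link with -lcrypt if your
 * system needs it.
 */
#include <stdio.h>
#include <string.h>

char *crypt(const char *key, const char *salt);

static int try(const char *cand, const char *hash)
{
    char salt[3];
    char *enc;

    salt[0] = hash[0];
    salt[1] = hash[1];
    salt[2] = '\0';
    enc = crypt(cand, salt);
    return enc != NULL && strcmp(enc, hash) == 0;
}

static const char *crackable(const char *word, const char *hash)
{
    static char v[64];
    size_t len = strlen(word), i;

    if (len == 0 || len + 2 > sizeof v)
        return NULL;

    if (try(word, hash)) return word;           /* the word itself       */

    sprintf(v, "%s1", word);                    /* single appended digit */
    if (try(v, hash)) return v;

    sprintf(v, "%s", word + 1);                 /* chop leading char     */
    if (try(v, hash)) return v;

    sprintf(v, "%.*s", (int)(len - 1), word);   /* chop trailing char    */
    if (try(v, hash)) return v;

    sprintf(v, "%s", word);                     /* letter "oh" -> zero   */
    for (i = 0; v[i] != '\0'; i++)
        if (v[i] == 'o') v[i] = '0';
    if (try(v, hash)) return v;

    return NULL;
}

int main(void)
{
    /* made-up hash, for illustration only */
    const char *hash = "ab01FAX.bQRSU";
    const char *words[] = { "welcome", "secret", "wizard", NULL };
    int i;

    for (i = 0; words[i] != NULL; i++) {
        const char *hit = crackable(words[i], hash);
        if (hit != NULL) {
            printf("guessed it: %s\n", hit);
            return 0;
        }
    }
    printf("no luck with this word list\n");
    return 0;
}

The lesson is that any variation a person can think up in a moment can
be generated mechanically in microseconds, which is why these tricks buy
so little.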

 -- dan

jp@tygra.Michigan.COM (John Palmer) (06/18/91)

In article <1991Jun15.152057.7681@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
"
"OK then, if passwords aren't secret, give me yours!!!
"

root:zVC/kylCNiQAs:0:0:Charlie Root:/:/bin/sh

NO ONE will be able to crack this one.

-- 
CAT-TALK Conferencing System   |  "Buster Bunny is an abused | E-MAIL:
+1 313 343 0800 (USR HST)      |   child. Trust me - I'm a   | jp@Michigan.COM
+1 313 343 2925 (TELEBIT PEP)  |   professional..."          | 
********EIGHT NODES*********** |   -- Roger Rabbit           | 

rogue@cellar.UUCP (Rache McGregor) (06/18/91)

emv@msen.com (Ed Vielmetti) writes:

> In article <1991Jun16.214835.26892@athena.cs.uga.edu> mcovingt@athena.cs.uga.
> 
>    Stealing passwords is a violation of Georgia law and of the
>    policies that students promise, in writing, to obey when they 
>    receive computer accounts.
> 
> Do you have the text of the relevant Georgia law, and of the written
> policy statements which students are required to sign?  That'd be a
> good addition to the archive of policy statements kept at
> ftp.cs.widener.edu:/pub/cud/schools/ .  
> 
> Note that there are some schools which have an "honor code" of some
> sort that is a blanket prohibition of all kinds of unethical behavior;
> for instance, if you were to go about cracking passwords at the U of
> Michigan engineering school chances are that as a student the first
> thing that'll be thrown against you is an honor code violation.
> 
>    All this casuistry about what is meant by "confidentiality" is pointless.
> 
> Nice word, casuistry.  American Heritage just says "determination of
> right and wrong in questions of conduct or conscience by the
> application of general principles of ethics".  Webster goes a bit
> farther and says "sophistical, equivocal, or specious reasoning."
> Apparently you have a low opinion of people who are critical of your
> policies.  

   Now that we're talking about ethics and morality, I'm wondering where the 
"Community Standards" application may come in.  Should we use the community 
standards of Athens, Georgia, or the community standards of cyberspace?

   For example, this newsgroup is sponsored by the Electronic Frontier 
Foundation, and is intended as a breeding ground for developing the mores of 
cyberspace - the world of computer networks.  Many of those debating the 
topic, including myself, feel that the student should not have been 
disciplined as harshly as he was.  Mind you, we are not in full possession of
the facts, and for as long as the investigation may take, it seems we will 
not be for some time.


Rachel K. McGregor            : Let the fire be your friend : Call the
a/k/a Rogue Winter            : And the sea rock you gently : Cellar at
rogue@cellar.uucp             : Let the moon light your way : 215/336-9503
{tredysvr,uunet}!cellar!rogue : 'Til the wind sets you free : BBS & Usenet

woolf@isi.edu (Suzanne Woolf) (06/18/91)

In article <1991Jun12.211143.18803@murdoch.acc.Virginia.EDU> gl8f@astsun7.astro.Virginia.EDU (Greg Lindahl) writes:
>In article <1991Jun12.141657.29238@athena.cs.uga.edu> mcovingt@athena.cs.uga.edu (Michael A. Covington) writes:
>
>>A few people here have been advocating the strange idea that UNIX users
>>have a moral right to obtain each other's passwords using COPS. I have a few
>>responses...
>
>I'd like to point out that this isn't my point at all; rather, I've
>been trying to say that the illegal act here is breaking into a
>system. Mr. Covington seems to have lost sight of this.
>
>I've also been saying that a responsible sysadmin should close obvious
>holes. Mister Covington seems to think this is a blame-the-victim
>mentality. I think it's good professional practice. Sysadmins should
>expect that users need to be educated about proper security
>procedures; any sysadmin that doesn't should be fired no matter
>whether a break-in is detected or not.

This discussion of the responsibility of sysadmins for system security
does bring up something I was wondering about:

An acquaintance who recently served on a jury in a personal injury case
tells me that the jury was instructed that, as a matter of law (this
was in California, for any serious legal scholars out there), you
cannot hold someone negligent for not foreseeing that someone else
would commit an illegal act.  In the particular case, the jury agreed
that a motorcyclist hadn't done everything possible to avoid an
accident (although he *had* done everything reasonable and prudent)
but that he had no contributory negligence because the motorist who'd
hit him had made an illegal U-turn to do it.

Does this principle extend to system administrators and/or users?
Should it??  Since breaking into other people's computers is already
an illegal act, should the users of a system be able to hold
administrators legally responsible for damage due to not preventing a
break-in?

Or maybe system administrators, as professionals with specific
responsibilities, can commit malpractice?!

Obviously we all have a certain degree of professional responsibility;
if our employers think we screwed up, we can (and arguably should)
lose our jobs.

But legal responsibility?  Are we negligent if we don't prevent users
from using dictionary-based passwords, or for leaving well-known
security holes unpatched?

Hmmmm....

				--Suzanne
				woolf@isi.edu

gsh7w@astsun7.astro.Virginia.EDU (Greg Hennessy) (06/18/91)

Michael A. Covington writes:
#Honest people do not go around picking the locks on people's houses or
#cars, not even "to test security." I see no reason why the ethics of
#computers should be any different.

Richard Feynman did.

--
-Greg Hennessy, University of Virginia
 USPS Mail:     Astronomy Department, Charlottesville, VA 22903-2475 USA
 Internet:      gsh7w@virginia.edu  
 UUCP:		...!uunet!virginia!gsh7w

karn@epic.bellcore.com (Phil R. Karn) (06/18/91)

In article <1991Jun18.050402.19338@murdoch.acc.Virginia.EDU>, gsh7w@astsun7.astro.Virginia.EDU (Greg Hennessy) writes:
|> Michael A. Covington writes:
|> #Honest people do not go around picking the locks on people's houses or
|> #cars, not even "to test security." I see no reason why the ethics of
|> #computers should be any different.
|> 
|> Richard Feynman did.

He certainly did -- except that he went after office safes at Los
Alamos during the Manhattan Project. He relates quite clearly in his
book "Surely You're Joking, Mr. Feynman!" what happened when he tried
to report his safecracking successes to the powers that be so that
something could be done to tighten security.

Phil

dhesi@cirrus.com (Rahul Dhesi) (06/19/91)

In <27141@as0c.sei.cmu.edu> df@sei.cmu.edu (Dan Farmer) writes:

     *Any* word that is found in a dictionary can be easily guessed...

The other day, while I was being shaved by my barber (who is,
incidentally, clean-shaven himself), I happened to think about word
lists used for screening out guessable passwords.

It occurred to me that, as storage costs get lower, online word lists
are getting bigger and bigger.  My own personal CD-ROM collection of
meaningful words (in 17 languages) is huge, and includes most possible
character sequences that you might want to use as passwords.  As a
result, the number of 8-character passwords that are not guessable is
becoming smaller and smaller.

Checking to make sure that a password used is not in any online word
lists can be very time-consuming.  It is more efficient to generate in
advance what I call a LOWNIAL (list of words not in any list).  Ideally
you would use a modified /bin/passwd program that would accept a
password only if it was found in the online LOWNIAL, and reject all
others.

Would you like to see the LOWNIAL database as a commercial product?
How much would you pay for it?  Should it be accompanied by the source
code for a /bin/passwd program?

Also, an interesting philosophical question occurs to me:  If there were
more than one vendor selling a LOWNIAL, should each vendor's LOWNIAL
exclude all words occurring in all other vendors' LOWNIALs?
-- 
Rahul Dhesi <dhesi@cirrus.COM>
UUCP:  oliveb!cirrusl!dhesi

ear@wpi.WPI.EDU (Eric A Rasmussen) (06/19/91)

In article <27141@as0c.sei.cmu.edu> df@sei.cmu.edu (Dan Farmer) writes:
>In article <foo>, learn@ddsw1.MCS.COM (William Vajk) writes:
>> Misspellings are great. Classical words and terms not found in dictionaries
>> are just as good.
>
>  Not if you don't want to get broken into.  Larger, more comprehensive
>on line dictionaries are becoming easier to get access to.  *Any* word
>that is found in a dictionary can be easily guessed, whether it is an
>english or foreign language word/term.

I personally like to use words that are brand names or product names, but
which are not likely to be found in a dictionary.  For example, I used
'Mopar' as a password for a long time and never had any problems.  Of course,
I might be just as likely to use 'GlassPlus' the next time because there just
happens to be a bottle of it sitting near me.  The advantage is that these
types of passwords are easy to memorize, but hard for any automated password
checker to guess.

+---------< Eric A. Rasmussen - Mr. Neat-O (tm) >---------+ +< Email Address >+
|   A real engineer never reads the instructions first.   | | ear@wpi.wpi.edu |
|   (They figure out how it works by playing with it.)    | | ear%wpi@wpi.edu |
+---------------------------------------------------------+ +-----------------+
                     ((( In Stereo Where Available )))

dpassage@soda.berkeley.edu (David G. Paschich) (06/19/91)

In article <1991Jun18.205258.25918@cirrus.com>,
	dhesi@cirrus.com (Rahul Dhesi) writes:

   Checking to make sure that a password used is not in any online word
   lists can be very time-consuming.  It is more efficient to generate in
   advance what I call a LOWNIAL (list of words not in any list).  Ideally
   you would use a modified /bin/passwd program that would accept a
   password only if it was found in the online LOWNIAL, and reject all
   others.

So you create a plaintext list of "good passwords".  Either a) this
list is much too large to be useful, or b) this list is small enough
that a cracker can get at it and use it for a dictionary attack on
your system.  If you make the file root-readable only, then the
problem of reading it reduces to that of reading the passwords in an
/etc/shadow file.  If you encrypt the file, its size increases at
least 2,000 times because you have to encrypt each plaintext password
once for each possible salt.  And /etc/shadow is still more secure because
there's no absolute list of usable passwords anywhere.  

Restricting your users to a certain list of passwords small enough to
be usable is bad.  Using something like the replacement passwd program
in the perl book is a much better idea.
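
To sketch the difference (illustrative Python only -- not the perl
book's program, and the particular rules are just assumed examples): a
proactive check rejects bad choices at password-change time, so no
master list of acceptable passwords ever exists on disk for anyone to
steal, and you never pay for pre-encrypting a list under each of the
4,096 possible DES salts.

def acceptable(candidate, username, dictionary):
    # 'dictionary' is a set of lower-case words from your on-line word lists.
    lowered = candidate.lower()
    if len(candidate) < 6:
        return False                 # too short to resist guessing
    if lowered in dictionary:
        return False                 # straight dictionary word
    if username.lower() in lowered:
        return False                 # derived from the account name
    undisguised = lowered.replace("0", "o").rstrip("0123456789")
    if undisguised in dictionary:
        return False                 # word plus the usual disguises
    return True

The check runs once per password change rather than once per guess, so
it can afford to consult word lists as large as you like.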

--
David G. Paschich	Open Computing Facility		UC Berkeley
dpassage@ocf.berkeley.edu
"But I'd rather be a fish, 'cause a fish is an animal" -- Gener Fox

ear@wpi.WPI.EDU (Eric A Rasmussen) (06/19/91)

In article <1991Jun18.205258.25918@cirrus.com> dhesi@cirrus.com (Rahul Dhesi) writes:
>Checking to make sure that a password used is not in any online word
>lists can be very time-consuming.  It is more efficient to generate in
>advance what I call a LOWNIAL (list of words not in any list).  Ideally
>you would use a modified /bin/passwd program that would accept a
>password only if it was found in the online LOWNIAL, and reject all
>others.

Umm, I hope that was a joke...

Assuming you weren't joking:
----------------------------
1) That's the dumbest idea I've heard all week... If one person gets ahold
of the list, you're screwed.
2) By definition, your list would have to be empty.  As soon as you added a
word to it, it would be in a list and then it would have to be excluded.
3) This would result in no passwords at all.
4) The poor Websters would be up all night coming up with definitions for
'words' like 9xjh;-)6$sH!.

+---------< Eric A. Rasmussen - Mr. Neat-O (tm) >---------+ +< Email Address >+
|   A real engineer never reads the instructions first.   | | ear@wpi.wpi.edu |
|   (They figure out how it works by playing with it.)    | | ear%wpi@wpi.edu |
+---------------------------------------------------------+ +-----------------+
                     ((( In Stereo Where Available )))

merlin@presto.UUCP (Jeff W. Hyche) (06/20/91)

In article <1991Jun17.161159.4438@convex.com> tighe@convex.com (Mike Tighe) writes:
>
>>>> Sure:
>>>> df:T8oOksRWnnA8Y:3271:20:Dan:ntdHHHOgetsA8EDH999 <1> n
 I I @ex! 91 
R> tnnEDUCPed.4.0.0.:
:
:D P <21Juerereti:ululu8e: riteteteerereD.4LLLD7.ED:ta1@tigsgI.ED rouZZZi.4 IED(JewsLLd9 I"et7.17.17PiI.74niCP"1@n"eh1.ED:d
.
t n
 IEDrliex.ELW.MEDRta



SurSH 11OkencnveEdn
SSSCP'mSTe$
YZZrim>f.dedHSTe$OgetsA8EDH9

hychejw@infonode.ingr.com (Jeff W. Hyche) (06/21/91)

merlin@presto.UUCP (Jeff W. Hyche) writes:

>In article <1991Jun17.161159.4438@convex.com> tighe@convex.com (Mike Tighe) writes:
>>
>>>>> Sure:
>>>>> df:T8oOksRWnnA8Y:3271:20:Dan:ntdHHH>OgetsA8EDH999 <1> n
> I I @ex! 91 
>R> tnnEDUCPed.4.0.0.:
>:
>:D P <21Juerereti:ululu8e: riteteteerere>D.4LLLD7.ED:ta1@tigsgI.ED rouZZZi.4 IED(JewsLLd9 I"et7.17.17PiI.74niCP"1@n"eh1.E>D:d
>.
>t n
> IEDrliex.ELW.MED>Rta



>SurSH 11OkencnveEdn
>SSSCP'mSTe$
>YZZrim>f.dedHSTe$>OgetsA8EDH9

that's not good.
-- 
                                  // Jeff Hyche           
    There can be only one!    \\ //  Usenet: hychejw@infonode.ingr.com
                               \X/   Freenet: ap255@po.CWRU.Edu