[comp.unix.wizards] Request for human interface design anecdotes

caf@omen.UUCP (Chuck Forsberg WA7KGX) (10/27/87)

In article <1325@desint.UUCP> geoff@desint.UUCP (Geoff Kuenning) writes:
:> just create a file called "-i" in your directory that you want protected!
:> then "rm *" expands to "rm -i file1 file2 file3 ..."
:> (unless you have other files beginning with weird characters)
:
:What a typically Unix solution.  Even to the flaws:  you have to put up
:with an ugly file in your directory, and it doesn't work if you
:type "rm test *".

One other flaw that can be circumvented: it takes up an inode.
So my "-i" has many links to it: each protected directory costs only
a directory slot, for one inode total.
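The many-links trick can be sketched as a small Bourne-shell helper (the helper name and master-file location are illustrative assumptions, not from the post):

```shell
# guard_dirs: hard-link one master "-i" file into several directories,
# so every protected directory costs only a directory entry while the
# whole set consumes a single inode.
guard_dirs() {
    master=$1; shift
    : > "$master"                  # create (or truncate) the master copy
    for d in "$@"; do
        ln "$master" "$d/-i"       # "./-i" soaks up "rm *" as an option
    done
}
```

With `guard_dirs $HOME/.rm-guard $HOME/src $HOME/doc`, `ls -l` on the master then shows one link per protected directory plus one.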

hunt@spar.SPAR.SLB.COM (Neil Hunt) (10/28/87)

In article <1325@desint.UUCP> geoff@desint.UUCP (Geoff Kuenning) writes:
> just create a file called "-i" in your directory that you want protected!
> then "rm *" expands to "rm -i file1 file2 file3 ..."
> (unless you have other files beginning with weird characters)

What about when you have a file called '-a' in your directory as well!
Seems to me that appropriate use of write protections is a better solution.
Failing that, how about an alias

% alias rm 'rm -i'

Neil/.

djones@megatest.UUCP (Dave Jones) (11/06/87)

An old version of emacs we used to use created backup files tagged with
".BAK".   One day I quickly typed "rm *.BAK", or so I thought.  To
my horror, I looked at the command line and saw, "% rm *>BAK".  The
greater-than is a capital period, and I depressed the shift key just
a fraction of a second early.  The system was industriously deleting all
my files and piping the (empty) listing to a new file called BAK.

chip@ateng.UUCP (Chip Salzenberg) (11/11/87)

In article <1621@megatest.UUCP> djones@megatest.UUCP (Dave Jones) writes:
>An old version of emacs we used to use created backup files tagged with
>".BAK".   One day I quickly typed "rm *.BAK", or so I thought.  To
>my horror, I looked at the command line and saw, "% rm *>BAK".

I had a similar disaster with an editor that creates backups of the form
",filename".  I missed the comma and typed "rm *".  I now use a (safe!)
alias to do this deletion:

	alias b 'rm -f ,*'
-- 
Chip Salzenberg         "chip@ateng.UUCP"  or  "{codas,uunet}!ateng!chip"
A T Engineering         My employer's opinions are not mine, but these are.
   "Gentlemen, your work today has been outstanding.  I intend to recommend
   you all for promotion -- in whatever fleet we end up serving."   - JTK

chris@mimsy.UUCP (Chris Torek) (11/13/87)

In article <1621@megatest.UUCP> djones@megatest.UUCP (Dave Jones) writes:
>... One day I quickly typed "rm *.BAK", or so I thought.  To
>my horror, I looked at the command line and saw, "% rm *>BAK". ...
>The system was industriously deleting all my files and piping the
>(empty) listing to a new file called BAK.

Which, by the way, was also removed by rm.  The shells (csh, sh;
I have not tried ksh) perform `<' and `>' redirection before `*'
expansion.

	% cat * > together

will often fill up a file system, since `*' might expand to `ch1
ch2 ch3 ch4 index together'.  cat eventually starts copying from
the beginning of `together', appending to its end, which provides
more text for cat to read, which writes more, which provides more,
which . . . .
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7690)
Domain:	chris@mimsy.umd.edu	Path:	uunet!mimsy!chris

chris@mimsy.UUCP (Chris Torek) (11/13/87)

In article <9332@mimsy.UUCP> I wrote
>... The shells (csh, sh; I have not tried ksh) perform `<' and `>'
>redirection before `*' expansion.

Correction: only `csh' does this.

>	% cat * > together

This is also a bad example, as `cat' explicitly checks each input
file against cat's standard output, to prevent loops.  Using
something like `soelim' that does not have such checks will cause
such a loop.
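A quick way to see the sh ordering (a throwaway demo, not part of the original posts): in an otherwise-empty directory, the glob is expanded before the redirection creates its file.

```shell
# In sh, word expansion happens before the output redirection, so
# "out" does not yet exist when "*" is expanded; under csh the
# redirection comes first, and "out" would appear in the expansion.
demo=`mktemp -d`
cd "$demo"
touch a b
echo * > out
cat out        # prints: a b
```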
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7690)
Domain:	chris@mimsy.umd.edu	Path:	uunet!mimsy!chris

dlm@cuuxb.ATT.COM (Dennis L. Mumaugh) (11/13/87)

This  request  has  spawned  many  stories  involving  rm  *  (or
variants)  that were not intentional.  When we first started with
unix four people managed to destroy things within  the  same  day
that  way.  In my case, I had worked all night to build a new and
wondrous piece of software.  It all worked, etc.  So I was  doing
the  final  clean  up:  I  typed rm *.c instead of rm *.o and the
whole project went down the drain!

Shortly thereafter we made a change to the  shell:  in  the  code
that did global expansions (*,?) we set a flag and if the command
name was "rm* or del*" we  said  confirm:  and  printed  out  the
expanded list of files.

Even with this aid we still had people screw up but not nearly as
often.

Then there was the time we did rm -rf ../*

The moral of this is that the command  interpreters  need  to  be
modified to request confirmation of potentially fatal things such
as rm * and it must be in the command interpreter as the  command
itself  can't  know  whether  the  list  is an expanded list or an
individually entered list.

After that is fixed we can talk  about  Jim  Gillogly's  spelling
corrector shell.
-- 
=Dennis L. Mumaugh
 Lisle, IL       ...!{ihnp4,cbosgd,lll-crg}!cuuxb!dlm

dhb@rayssd.RAY.COM (David H. Brierley) (11/16/87)

If users removing all of their files by inadvertently typing "rm *" is
a habitual problem at your site, why not make the command default to
interactive mode?  If you have source this is a trivial task and if you
don't have source it's not much harder.  Simply move the real rm
command to some new secret place, for example: /bin/.hidden/rm, and
then make /bin/rm be a shell script which invokes the real rm with the
"-i" flag.  If you wanted to be real fancy you could add a new option,
say "-I", which would disable interactive mode.  Another possibility
would be to have the shell script enable interactive mode if you try to
remove more than some pre-determined number of files.  That way you
could still type "rm foo" without having to use interactive mode but
"rm foo *" would put you into interactive.
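The threshold variant might look something like this (a sketch; the function form, the stand-in path for the hidden real rm, and the 3-file threshold are all assumptions):

```shell
# saferm: "-I" explicitly skips interactive mode; otherwise fall
# through to the real rm, adding "-i" whenever more names than the
# threshold are given (e.g. when a glob expanded unexpectedly).
SAFE_REAL_RM=${SAFE_REAL_RM:-/bin/rm}   # stands in for /bin/.hidden/rm
saferm() {
    if [ "$1" = -I ]; then
        shift
        "$SAFE_REAL_RM" "$@"            # -I: non-interactive on request
    elif [ $# -gt 3 ]; then
        "$SAFE_REAL_RM" -i "$@"         # big expansion: confirm each file
    else
        "$SAFE_REAL_RM" "$@"
    fi
}
```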

Of course, nothing you can do will ever solve the problem completely
since even the most expert user will occasionally make mistakes.  Just
the other day I wiped out a week's worth of work by typing "cc -o pgm.c"
on my AT&T unix-pc.  I had meant to use -O to invoke the optimizer.
Instead, it optimized away all of my code by giving me a message to the
effect of "no source file" and happily creating a zero length output
file called "pgm.c".  I was not at all amused.  I now have a shell
script in place of cc which checks all its arguments for consistency
(i.e. you can't say "-o pgm.c").
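The consistency check can be as simple as scanning the arguments for a "-o" whose target looks like a source file (a sketch of the idea; `cc_checked` and the exact test are assumptions, not the poster's script):

```shell
# cc_checked: refuse to run cc when "-o" names a .c file, which would
# clobber the source; otherwise hand everything to the real compiler.
cc_checked() {
    prev=
    for a in "$@"; do
        case $prev in
        -o) case $a in
            *.c) echo "cc: -o $a would overwrite source" >&2
                 return 1 ;;
            esac ;;
        esac
        prev=$a
    done
    cc "$@"
}
```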
-- 
	David H. Brierley
	Raytheon Submarine Signal Division
	1847 West Main Road
	Portsmouth, RI 02871

Phone:		(401)-847-8000 x4073
Internet:	dhb@rayssd.ray.com
Uucp:		{cbosgd, gatech, linus, mirror, necntc, uiucdcs} !rayssd!dhb

andy@rocky.STANFORD.EDU (Andy Freeman) (11/17/87)

In article <1402@cuuxb.ATT.COM> dlm@cuuxb.UUCP (Dennis L. Mumaugh) writes:
[We're talking about "rm *".  Guess why I have a csh alias for rm that
 always asks about every file.  When I'm absolutely sure that I want to
 delete a number of files and I don't want to answer questions, I pipe
 the whole command off to sh.]

>The moral of this is that the command  interpreters  need  to  be
>modified to request confirmation of potentially fatal things such
>as rm * and it must be in the command interpreter as the  command
>itself  can't  know  whether  the  list  is an expanded list or a
>individually entered list.

There are far more general solutions.  Most people have trash cans.
One can recover their contents for some time, but they go away at
well defined times.  Too bad unix doesn't handle generations in
the file system (rcs and friends are clever archivers - they are
still useful in this context).  Obsolete versions can be marked
"deleted" so they aren't normally visible, but they can be retrieved.
Then it makes sense to have a file deleter that tells you what it
has done, just to reduce the chance of surprise.  (Yes, there should
be bozo mode for people who don't want to know or for programs that
think they know what they are doing.)

One should never simplify more than necessary.

-andy
-- 
Andy Freeman
UUCP:  {arpa gateways, decwrl, sun, hplabs, rutgers}!sushi.stanford.edu!andy
ARPA:  andy@sushi.stanford.edu
(415) 329-1718/723-3088 home/cubicle

wcw@psuhcx (William C Ward) (11/17/87)

In article <1689@rayssd.RAY.COM> dhb@rayssd.RAY.COM (David H. Brierley) writes:
>If users removing all of their files by inadvertently typing "rm *" is
>a habitual problem at your site, why not make the command default to
>interactive mode? 

The rm * disaster catches not only the absent-minded, but also the hasty
and uncoordinated.  I once mistyped a command like:
	rm *&foo
instead of rm *foo (*& is a double-strike of adjacent keys!) and the
machine obediently and hastily removed all files in the directory via a 
background process.  My screams were audible many doors down the hall as
I looked helplessly at the screen.

What I have done to lessen future disasters of this kind is to insert the
following crontab entry for other users:
# Keep second copies of recent source files (*.c, *.f, *.h) in /tmp
30 * * * * nice -10 find /usr/usr -mtime -1 -name '*.[cfh]' -exec cp {} /tmp \;
# Get rid of old /tmp files
0 2 * * * find /tmp -atime +4 -exec rm -f {} \;

If incremental dumps are done at least every 4 days, this means that
most source development work that can be lost is one hour's worth, if
your disk doesn't crash entirely.  The extra load on a small system 
with a little extra space and 10 or 20 users is pretty negligible, since
only files which have been modified in the last day are copied.  If
security is a concern, the backup files (owned by root) can be set to
600 mode.  Moreover, it protects against `generic' disasters (rm, cp,
cc -o, or foolish edits).

This has saved me more than once now!
Bill Ward			Bitnet:	WCW@PSUECL
Noise Control Laboratory	UUCP:	{gatech,rutgers,..etc.}!psuvax1!ncl!wcw
The Penn. State University	USnail:	157 Hammond Bldg.;
Fone:		(814)865-7262	University Park, PA 16802

gwyn@brl-smoke.UUCP (11/17/87)

In article <1689@rayssd.RAY.COM> dhb@rayssd.RAY.COM (David H. Brierley) writes:
>If users removing all of their files by inadvertently typing "rm *" is
>a habitual problem at your site, why not make the command default to
>interactive mode?

Please don't fuck with the standard commands.  If you're going to
change the semantics, give it a new name and retain the old one
for applications that expect the documented semantics.

roy@phri.UUCP (Roy Smith) (11/17/87)

In article <1689@rayssd.RAY.COM> dhb@rayssd.RAY.COM (David H. Brierley) writes:
> Just the other day I wiped out a weeks worth of work by typing
> "cc -o pgm.c" on my AT&T unix-pc.

	This may sound harsh, but I really have little sympathy in this
case.  That zapping a source file wipes out a week's worth of work implies
that you don't make daily backups.  Even on a PC, doing backups should be
routine every day; there really is little excuse for not doing so.

	Things like emacs's "~" backup files (I'm not familiar with other
editors; I assume this feature is available in vi, etc, as well) mitigate
the damage from "rm *.c" instead of "rm *.o", and similar disasters (at the
cost of some wasted disk space), but daily backups are really the bottom
line.  In fact, I have given serious thought to running incremental
disk-to-disk dumps several times a day here to narrow the window of
vulnerability from a whole day to a few hours.  Yes, I know dumps on live
file systems don't always work, but it's better than not doing it at all.
-- 
Roy Smith, {allegra,cmcl2,philabs}!phri!roy
System Administrator, Public Health Research Institute
455 First Avenue, New York, NY 10016

bak@csd_v.UUCP (11/18/87)

In article <763@rocky.STANFORD.EDU>, andy@rocky.STANFORD.EDU (Andy Freeman) writes:
> There are far more general solutions.  Most people have trash cans.
> One can recover their contents for some time, but they go away at
> well defined times....

I use a version of rm adapted from Wizard's Grabbag in UNIX/XENIX world.
It simply prepends '#' to the file name if no switches are listed in
the command line.  Thus

		$ rm foo      

creates a file #foo, while

		$ rm -[i|f|r] foo

all work normally.  Since # is the shell comment character it is very
hard to unintentionally delete files with names beginning with it.
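A minimal version of that rm replacement might read as follows (a sketch; the function name is an assumption, and paths with directory components are handled crudely):

```shell
# softrm: with no switches, rename each file to "#name" in place of
# deleting it; with any switch, behave like the ordinary rm.
softrm() {
    case $1 in
    -*) /bin/rm "$@" ;;
    *)  for f in "$@"; do
            d=`dirname "$f"`
            b=`basename "$f"`
            mv "$f" "$d/#$b"       # "#foo": hard to match by accident
        done ;;
    esac
}
```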

My crontab contains the line

10 3 * * 0,2,4,6 find / \( -name '#*' -o -name 'tmp.*' -o -name '*.tmp' -o -name 'temp.*' -o  -name '*.temp' \) -mtime +3 -exec rm -f {} \;

which deletes files beginning with '#' which have been unmodified for 3 days.

If disk space is a problem you can cut down the -mtime value.  This script 
has saved me grief on more than one occasion.
-- 
Bruce Kern -- Computer Systems Design, 29 High Rock Rd., Sandy Hook, Ct. 06482
uunet!swlabs!csd_v!bak

ustel@well.UUCP (Mark Hargrove) (11/18/87)

> In article <1402@cuuxb.ATT.COM> dlm@cuuxb.UUCP (Dennis L. Mumaugh) writes:
> [We're talking about "rm *".  Guess why I have a csh alias for rm that
>  always asks about every file.  When I'm absolutely sure that I want to
>  delete a number of files and I don't want to answer questions, I pipe
>  the whole command off to sh.]

In a similar vein, we have a shell function defined in /etc/profile
for our Bourne Shell users:

rm(){
	if [ ! -d /usr/tmp/$LOGNAME ] ; then
		mkdir /usr/tmp/$LOGNAME
	fi
	mv "$@" /usr/tmp/$LOGNAME
}

Then we have /usr/lbin/reallyrm linked to /bin/rm for when
you really mean it.

A once a week cron script cleans out /usr/tmp right AFTER a backup.

This DOESN'T fix the ol' slip of the fingers that results in
reallyrm * .o  <---you only see the space AFTER you hit return ;-)
but it HAS saved the day enough times to make it worth the 10 minutes
it took to implement.

Mark Hargrove
U.S. TeleCenters
{backbones}!hplabs!well!ustel

clif@chinet.UUCP (11/18/87)

I dunno if this is quite a human interface tale of woe, but...
  I recently lost my hd0 to a power supply problem.  
  No problem, says I, once I fixed the power supply, I have a dump level
0 backup from one month back, and a crontab entry that does a dump level
2 of hd0 to a file hd1 each morning at 06:00.
  After restoring things from floppy, it was somewhat after midnight, and
I decided to complete the task the following day.
  At 06:00, right on schedule, the machine did a level 2 dump, over the
file of good data on hd1.  Pow, in one swell foop my clever method of
making sure I couldn't lose any data had lost me a month's worth.

  Moral:  I dunno.  Maybe don't leave the machine running unattended
until you've completely fixed things up.  

-- 
------------------------------------------------------------------------
My Opinions are my own. I can't imagine why anyone else would want them.
Clif Flynt	ihnp4!chinet!clif
------------------------------------------------------------------------

rapaport@sunybcs.uucp (William J. Rapaport) (11/18/87)

After wiping out one too many directories, I aliased rm to:

'mv \!:1 #\!:1'

Now, it is impossible for me to execute:  rm *

It slows me down a bit when I do want to rm lots of stuff, but the price
is well worth the insurance.

jec@nesac2.UUCP (John Carter ATLN SADM) (11/18/87)

In article <3032@phri.UUCP>, roy@phri.UUCP (Roy Smith) writes:
> In article <1689@rayssd.RAY.COM> dhb@rayssd.RAY.COM (David H. Brierley) writes:
> > Just the other day I wiped out a weeks worth of work by typing
> > "cc -o pgm.c" on my AT&T unix-pc.
> 
> 	This may sound harsh, but I really have little sympathy in this
> case.  That zapping a source file wipes out a week's worth of work implies
> that you don't make daily backups.  Even on a PC, doing backups should be
> routine every day; there really is little excuse for not doing so.

My multi-user systems get daily backups - my PC gets infrequent
backups, except for some critical items (my LAN database).

However, in the original case, it appears that the  unix-pc has an
old and rather braindead compiler - the ones I use (DEC 11/70, AT&T
3B2, 3B5) respond to 'cc -o file.c' with 'would overwrite source'
and then abort.  Getting 'cc -o' instead of 'cc -O' is very easy.
-- 
USnail: John Carter, AT&T, Atlanta RWC, 3001 Cobb Parkway, Atlanta GA 30339
Video:	...ihnp4!cuea2!ltuxa!ll1!nesac2!jec    Voice: 404+951-4642
(The above views are my very own. How dare you question them? :-)

djones@megatest.UUCP (Dave Jones) (11/21/87)

in article <1689@rayssd.RAY.COM>, dhb@rayssd.RAY.COM (David H. Brierley) says:
> If users removing all of their files by inadvertently typing "rm *" is

 ...

> Of course, nothing you can do will ever solve the problem completely
> since even the most expert user will occasionally make mistakes.  Just
> the other day I wiped out a weeks worth of work by typing "cc -o pgm.c"
> on my AT&T unix-pc.  I had meant to use -O to invoke the optimizer.
> Instead, it optimized away all of my code by giving me a message to the
> effect of "no source file" and happily creating a zero length output
> file called "pgm.c".  I was not at all amused.  I now have a shell
> script in place of cc which checks all its arguments for consistency
> (i.e. you can't say "-o pgm.c").


I guess I had been programming about two months when it occurred to me
that a program should always open all the input-files that it can
before it opens ANY output-files.  Somebody forgot to tell the writer of
your cc.  Sigh.  (When output is going to a disc-file, a program
should write it first to a temporary, then if there is no error, move it
to the real place.)
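The parenthetical rule, write to a temporary and then move it into place on success, is easy to sketch in shell (illustrative only; `safe_output` is not a real utility):

```shell
# safe_output: collect stdin into a temporary file and only rename it
# onto the target if every write succeeded, so a failure part way
# through never leaves a truncated real file behind.
safe_output() {
    out=$1
    tmp=$out.tmp.$$
    if cat > "$tmp"; then
        mv "$tmp" "$out"           # rename is atomic within a filesystem
    else
        /bin/rm -f "$tmp"
        return 1
    fi
}
```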

It is interesting that the same principle can apply to microprocessor
hardware:  instructions which read all their inputs, and then write one
output can be restarted from the beginning after a page-fault at any
step in the instruction.  The T.I. 990 microprocessor line had some
instructions which were not like that.  They made it hard to upgrade
to virtual memory when all the competitors did.  So far as I know, the
990 is pretty much a dinosaur now.

msb@sq.UUCP (11/21/87)

> The rm * disaster catches not only the absent-minded ...

I thought it was about time someone expressed the opposite point of view.

If I type "rm *", it is because I want to remove all the files.  No, not
all *my* files.  All *the* files that I still have write permission on,
that are in the current directory.  Usually no more than about 20 of them.
In short, the proper UNIX* flavored method for protecting important files
from "rm" is to turn off the write permission bit.

Now, if you want to talk about human interface disasters and "rm" ...
Tell me how come "rm ... &" causes the -f flag to be assumed, and thus
removes the write-protected files after all?  Write-protecting the directory
stops it, but this is often not feasible.  I think the gods nodded on that one.

Mark Brader, utzoo!sq!msb, msb@sq.com		C unions never strike!

*"UNIX is a trademark of Bell Laboratories" is a religious incantation.
  That it no longer reflects reality is a bug in reality.

jc@minya.UUCP (John Chambers) (11/21/87)

> This  request  has  spawned  many  stories  involving  rm  *  (or
> variants)  that were not intentional.  
> 
> Then there was the time we did rm -rf ../*
> 
> The moral of this is that the command  interpreters  need  to  be
> modified 

Well, I hate to be a wet blanket (sure, sure; ya love it; admit it :-),
but I've never typed anything like this.  What I have typed a *lot* of
is 'y' to silly suggestions that I don't know what I'm doing.  This has
wasted a lot of my time.

One of the first things that impressed me when I first snuck onto a
Unix system 'way back when was that, unlike all the other systems I'd
ever used, Unix actually did what I told it to do.  It didn't harass
me, question my intelligence, or anything like that.  When I typed
"rm *", it did it.

Nowadays, it has gotten recalcitrant, and I have to say, in effect,
"Of course, I want that file rm'ed; that's what I said, wasn't it?"
While I've grown in experience, Unix has taken to treating me as
a child who needs his hand held.

When I was in High School, most of the power tools in the shops had
various shields and safety features.  The instructors sometimes pointed
out that professionals would routinely remove the shields.  But that
was after they had learned a certain respect for the tools, and also
after they had learned work techniques that made the tools safe for
them.  On the other hand, such professionals didn't allow children
(or unqualified adults) into their workshops.

This could be well applied to computer systems.  Unix, like any good
toolbox, requires power tools like rm.  Professionals want such tools
without safeguards.  But you don't hand them to novices.  Anyone who
hands a "bare" Unix to a novice, and teaches him/her to type "rm *.o",
is doing the equivalent of teaching someone to use a circular saw with
no shield or goggles.  Novices should not be taught about rm, but about
other tools (say, a simple script or alias called 'del') that asks the
right questions.  After they express frustration with such safeguards,
or when they need to write a script that doesn't harass its users, they
can be told "Well, there's this other library command 'rm'...".

I guess this is a complaint about the fact that we have a nicely layered
system; its designers carefully explained to us all about this; we don't
listen.  The Unix community gets criticised for giving novices a command
language (sh) that was intended for system developers.  But Unix has a
perfectly good method of giving non-hackers their own user-friendly shells;
why do we teach them to use sh?  It is trivial to add a library script
that 'removes' files by renaming them; why don't we do it?  It is trivial
to say "ln /bin/cat /bin/type"; why don't we do it?

My claim is that Unix in fact has a good design for user-friendliness;
the problem is vendors and system developers that violate the design by
using low-level tools like rm at a higher (user) level.

'Nuf preaching for now; this oughta get me lots of flames to while away
the hours reading...

-- 
John Chambers <{adelie,ima,maynard,mit-eddie}!minya!{jc,root}> (617/484-6393)

allbery@ncoast.UUCP (11/21/87)

As quoted from <3032@phri.UUCP> by roy@phri.UUCP (Roy Smith):
+---------------
| In article <1689@rayssd.RAY.COM> dhb@rayssd.RAY.COM (David H. Brierley) writes:
| > Just the other day I wiped out a weeks worth of work by typing
| > "cc -o pgm.c" on my AT&T unix-pc.
| 
| 	This may sound harsh, but I really have little sympathy in this
| case.  That zapping a source file wipes out a week's worth of work implies
| that you don't make daily backups.  Even on a PC, doing backups should be
| routine every day; there really is little excuse for not doing so.
+---------------

NO REASON?!  When it takes 50 floppies to back up the HD, there is VERY MUCH
a reason.  (Tape?  You got $1500 free to give me for a tape drive?)  I back
up my home directory, and if the disk crashes I just do a full reinstall.
This is no slower than reloading lots of disks....

For the "rm" problem, I think I've got a solution.  The idea comes from a
cross between existing "rm/unrm" programs and fsck, and deals with links
as well.

(1) For every mounted filesystem PLUS the root, create a directory called
"wastebasket" or some such.

(2) The program "del" (NOT "rm" -- you'll screw up programs which invoke
rm via system(), such as the System V spooler) links a file into the
wastebasket directory for a filesystem by its inode number, and writes a
line into an index file consisting of inum, path, and date and time.  Maybe
also the user who did it.

(3) The program "undel" links the file back out of the wastebasket to its
original path, via the index.

(4) A program "expdel" (expunge deleted files) uses the index to choose
files del'ed more than some specified or default time ago and unlinks them.

By using rename() under BSD or SVR3, or using root privs under SVR2 or older,
this can be generalized to directories as well, giving a safe rmdir as well.

Note that this retains all links (except symbolic ones, but that's part and
parcel of the problems with a symlink -- not to start THAT war again, but
there isn't a whole lot to be done about it), and the expunge process does
not have to search every user's home directory either.  The result is a
reversible rm which doesn't have any of the drawbacks of current ones.
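Steps (2) and (3) can be roughed out in shell (a sketch only: the wastebasket location, the index format, and lookup-by-basename are all assumptions, and hard links require the file and the wastebasket to share one filesystem, which is why the scheme is per-filesystem):

```shell
# del: hard-link the file into the wastebasket under its inode number,
# record "inum<TAB>full-path" in an index file, then unlink the original.
del() {
    WASTE=${WASTE:-/wastebasket}; INDEX=$WASTE/index
    mkdir -p "$WASTE"
    for f in "$@"; do
        inum=$(ls -i "$f" | awk '{print $1}')
        path=$(cd "$(dirname "$f")" && pwd)/$(basename "$f")
        ln "$f" "$WASTE/$inum" &&
            printf '%s\t%s\n' "$inum" "$path" >> "$INDEX" &&
            /bin/rm -f "$f"
    done
}

# undel: take the newest index entry matching the basename, link the
# saved inode back to its recorded path, and drop the wastebasket link.
undel() {
    WASTE=${WASTE:-/wastebasket}; INDEX=$WASTE/index
    line=$(grep "/$1\$" "$INDEX" 2>/dev/null | tail -1)
    [ -n "$line" ] || return 1
    inum=$(printf '%s\n' "$line" | awk -F'\t' '{print $1}')
    path=$(printf '%s\n' "$line" | awk -F'\t' '{print $2}')
    ln "$WASTE/$inum" "$path" && /bin/rm -f "$WASTE/$inum"
}
```

An "expdel" would then be a periodic pass over the index, much like the cleanup cron lines quoted elsewhere in this thread.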
-- 
Brandon S. Allbery		      necntc!ncoast!allbery@harvard.harvard.edu
{hoptoad,harvard!necntc,{sun,cbosgd}!mandrill!hal,uunet!hnsurg3}!ncoast!allbery
			Moderator of comp.sources.misc

dave@onfcanim.UUCP (11/22/87)

In article <3032@phri.UUCP> roy@phri.UUCP (Roy Smith) writes:
>
>In fact, I have given serious thought to running incremental
>disk-to-disk dumps several times a day here to narrow the window of
>vulnerability from a whole day to a few hours.  Yes, I know dumps on live
>file systems don't always work, but it's better than not doing it at all.

There is an even better way.

We run a "backup daemon", originally written by Ciaran o'Donnell at Waterloo,
and still in use there, which is called from the crontab (every hour in
our case) to scan a list of filesystems looking for files that were
changed since it was last run.  When it finds one, and it isn't too large
and its name doesn't pattern-match a list of "not worthwhile" names like
"*.out", it copies it into a backup filesystem.

If the original filename was /u/dave/film.c, the copy will be named
/backup/u/dave/film.c/Nov20-19:01.  If I change the file again, it will
be backed up again an hour later, with a filename that reflects the changed
time or date.  Then, when I trash a file through carelessness, I have
a whole "history" of backup copies to go back through, so even if I introduced
a bug 5 hours ago, I can generally get back the code before that.
And I don't have to run "restore" to look for it; I just chdir to
/backup/u/dave/film.c and look around.

The /backup filesystem must be dedicated to the use of the backup program,
since it keeps it from filling up by deleting the oldest files as necessary
to make room for the new ones.  We use a 30-Mb partition, which seems to
keep stuff around for about a month on a system with 4 people writing code.

The only way I still lose files is if I clobber them within the first hour
of working on them (often it's within the first 2 seconds when it happens!)
and the file hadn't been touched for 2 months before that, so all old
copies have been deleted.  So then I have to get out the tapes.
But it works most of the time, gives me a backup every hour of a file that
I am changing frequently, and requires no work on my part at all.
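One scan pass can be approximated like this (not o'Donnell's code: `-mmin` is a later find extension, and the size limit and name filters are simplified):

```shell
# scan: copy every file changed in the last hour (skipping "*.out")
# into $BACKUP_ROOT/<original-path>/<timestamp>, building the
# /backup/u/dave/film.c/Nov20-19:01 style history described above.
scan() {
    BACKUP=${BACKUP_ROOT:-/backup}
    find "$@" -type f -mmin -60 ! -name '*.out' |
    while read -r f; do
        dest=$BACKUP$f                 # e.g. /backup/u/dave/film.c
        mkdir -p "$dest"
        cp "$f" "$dest/`date +%b%d-%H:%M`"
    done
}
```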

gwyn@brl-smoke.ARPA (Doug Gwyn ) (11/22/87)

In article <407@minya.UUCP> jc@minya.UUCP (John Chambers) writes:
>My claim is that Unix in fact has a good design for user-friendliness;
>the problem is vendors and system developers that violate the design by
>using low-level tools like rm at a higher (user) level.

More precisely, it's making these part of a naive-user interface
that is at fault.

Just linking /bin/cat to /bin/type isn't the right kind of solution;
for one thing, it adds commands into the system name space (incorrectly
in this case; "type" is already a command with a different meaning).
Non-expert users need an interface that doesn't require them to type
> or * or | or other mystical symbols.  There are examples of better
interfaces for the non-expert in widespread use, for example on the
Macintosh; there is no excuse for vendors neglecting this aspect of
their systems if they see what they provide as being something for
the end user.  On the other hand, it is not that hard for someone with
the proper training to learn how to exploit the usual UNIX tools
effectively, but too often users are just dumped into a UNIX shell
environment with inadequate guidance and reference and tutorial
material.  If they expect (by analogy with more end-user oriented
interfaces) to simply "learn by doing", they get into trouble or at
the very least never learn how to use the tools in an efficient way.

It is really fairly easy to whip up a naive-user interface to UNIX;
one evening I did one as a shell script (the "adventure shell") and
was amazed to find that some users actually preferred it to a normal
UNIX (Bourne or C) shell environment.  By the way, it arranged for
destroyed objects (files) to be reincarnatable.

Of course, for my own use I prefer flexible power to safety, so I
would object to the removal of an expert-user interface; but I do
think a naive-user interface is also needed.

cameron@elecvax.eecs.unsw.oz (Cameron Simpson "Life? Don't talk to me about life.") (11/23/87)

In article <1402@cuuxb.ATT.COM>, dlm@cuuxb.ATT.COM (Dennis L. Mumaugh) says:
| After that is fixed we can talk  about  Jim  Gillogly's  spelling
| corrector shell.
| -- 
| =Dennis L. Mumaugh
|  Lisle, IL       ...!{ihnp4,cbosgd,lll-crg}!cuuxb!dlm

I once used something calling itself `nsh' on a System V machine, and typed
	$ cd thnig
and thought "bother, I meant `thing'" and was then disconcerted when it said
	path/thing
	$
back at me. It had fixed the transposed characters and dropped me in the right
spot! Hopefully it only happened in interactive mode, but it was very
disconcerting.
	- Cameron Simpson

drears@ARDEC.arpa (FSAC) (11/23/87)

   It seems to me the real problem is the "magic characters" and not
the novice user.  I, like everyone else, have made the mistake of typing
stuff like "rm -rf mail *" instead of "rm -rf mail*".  Rather than do
aliases or write programs that mimic rm and don't actually remove the
file until later, rewrite the sh to add a "-m" option.  The "-m" option
would substitute names for magic characters and then echo out the command
line with the actual arguments.  If the user wants the command executed
he types return; otherwise he hits the quit or interrupt key.

Example:

unix> rm ty*
rm ty1 ty2 type3 tye ty4 (OK?)    [CR] rm completed
unix> rm ty*                           
rm ty1 ty2 type3 tye ty4 (OK?)    [quit] rm not completed
rm cmd canceled
unix>

     This would be done for all sh commands.  This saves lots of
headaches.  If anyone is interested I can send the code.
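Short of rewriting sh itself, the same effect can be had from a wrapper function, since the shell has already expanded the glob by the time the function sees its arguments (an approximation of the proposal above, not the poster's code):

```shell
# confirm: echo the fully expanded command line, then run it only if
# the user answers with a bare RETURN or "y"; anything else cancels.
confirm() {
    printf '%s ' "$@"
    printf '(OK?) '
    read ans
    case $ans in
    ''|y|Y) "$@" ;;
    *)      echo "$1 cmd canceled" ;;
    esac
}
```

Typed as `confirm rm ty*`, it prints the expanded list before anything is removed.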

Dennis
------------------------------------------------------------
ARPA:	drears@ardec.ac4
AT&T:	201-724-6639
Snailmail:	Box 210, Wharton, NJ 07885
------------------------------------------------------------

mjr@osiris.UUCP (Marcus J. Ranum) (11/23/87)

In article <407@minya.UUCP>, jc@minya.UUCP (John Chambers) writes:
> > This  request  has  spawned  many  stories  involving  rm  *  (or
> > variants)  that were not intentional.  
> > 
> > Then there was the time we did rm -rf ../*
> > 
> > The moral of this is that the command  interpreters  need  to  be
> > modified 

	That's the thing I really like about UNIX. See, if you don't like your
command interepreter, you can always trot out and write your own. If you like
an interface like, say, MSDOS or JCL, you can (after the lobotomy) probably
write one to look enough like your beastie to make you happy.

	If you're like me, and you're getting increasingly frustrated with all
the expansion, macroizing, aliasing, noclobbering, etc, that is getting built
into shells, you can go back to the Bourne shell, or write an even simpler one
that does even less. Actually, I have a "minimal" shell I put together - does
nothing but support an environment, parse a PATH, and do execs. It's amazingly
small and fast when you leave all the crap out. That's what UNIX was all about.

	My only concern is that UNIX, in its rush (by some people) to become
a "mainstream business OS", will get so pink-cadillac'ed up that it is no
longer useful.  Of course, there are always going to be minimalist OS' out
there, so I'm not worried.
	
	There are some great paragraphs at the beginning of the original
"UNIX Time-Sharing System" paper, about the "soapbox" UNIX stands (used to
stand) on. There are some nifty ideas there.

--mjr();
-- 
"We're fantastically incredibly sorry for all these extremely unreasonable
things we did. I can only plead that my simple, barely-sentient friend and
myself are underprivileged, deprived and also college students." 
					- Waldo "D.R." Dobbs.

ed@mtxinu.UUCP (Ed Gould) (11/23/87)

>Of course, for my own use I prefer flexible power to safety, so I
>would object to the removal of an expert-user interface; but I do
>think a naive-user interface is also needed.

I, too, want an expert-user interface while recognizing the need for
a naive-user interface.  The problem I've seen with most of the naive-user
systems is that there's no reasonable migration path from that
interface to the experts' one.  That is, there's no way for someone
to move on from being a beginner without learning a completely different
mechanism for interacting with the system.

Some attempts at solving this sort of problem have been clumsy (e.g.,
the "edit" and "ex" interfaces for editing text), while others have
just been bad (no example springs to mind:  I try to forget them).

People who design interfaces for novices should remember that a very
large fraction of users are complete novices for only a short time.
They soon move on to become more and more sophisticated with time.
As they progress, they should have easy access to more and more of the
power available in whatever system they're using, culminating in the
"expert" interface.  The only users who do not tend to progress
are those who use a system only very infrequently and essentially re-
learn each time.  All others need an evolutionary path from beginner
to expert.

-- 
Ed Gould                    mt Xinu, 2560 Ninth St., Berkeley, CA  94710  USA
{ucbvax,uunet}!mtxinu!ed    +1 415 644 0146

"`She's smart, for a woman, wonder how she got that way'..."

nortond@mosys.UUCP (Daniel A. Norton) (11/23/87)

On the version of Unix V.3 here (CTIX), when a new user enters a
password he/she will invariably choose a password of less than
six characters, to which the system replies:

	Password is too short - must be at least 6 digits

Fortunately, they do not usually notice the word "digits" (as
opposed to characters).  Unfortunately, when they attempt to
satisfy the program, it usually replies:

	Password must contain at least two alphabetic characters and
	at least one numeric or special character.

In other words, the first "help" message was not specific enough
about the password requirements.  I would not expect a BNF description
of what to type in here; we must assume that the user has _some_
intuition.  But seriously, folks.
-- 
Daniel A. Norton				nortond@mosys.UUCP
c/o Momentum Systems Corporation	     ...uunet!mosys!nortond
2 Keystone Avenue
Cherry Hill, NJ   08003 			609/424-0734

dsill@NSWC-OAS.arpa (Dave Sill) (11/24/87)

>It is trivial to say "ln /bin/cat /bin/type"; why don't we do it?

One reason is that linking `type' to `cat' doesn't create a man page
for `type'.

meissner@xyzzy.UUCP (Michael Meissner) (11/24/87)

In article <407@minya.UUCP> jc@minya.UUCP (John Chambers) writes:
>        ....                      It is trivial to add a library script
> that 'removes' files by renaming them; why don't we do it?  It is trivial
> to say "ln /bin/cat /bin/type"; why don't we do it?

I dunno, but with the System V.[23] Bourne shell, you can't even do that
because "type" is builtin (it tells where on your path a command is, or
if it's a shell function).  My other "favorite" poor design choice is
"dump" in the System V.3 sgs (software generation system, ie, C compiler)
which dumps out the object file into readable form, not what the operator
does at night.  Sigh.....
-- 
Michael Meissner, Data General.		Uucp: ...!mcnc!rti!xyzzy!meissner
					Arpa/Csnet:  meissner@dg-rtp.DG.COM

jfh@killer.UUCP (11/24/87)

In article <1987Nov21.014754.19660@sq.uucp>, msb@sq.UUCP writes:
> > The rm * disaster catches not only the absent-minded ...
> 
> I thought it was about time someone expressed the opposite point of view.
> 
> If I type "rm *", it is because I want to remove all the files.  No, not
> 
> Mark Brader, utzoo!sq!msb, msb@sq.com		C unions never strike!

I cast my vote for doing the remove.  I'd also like rm to consider asking
me to confirm the decision if I should happen to delete, say, more than
10 or 15 files.  Having the first few lines in main() be something like,

	fflg = (argc > 15) || fflg;

might be nice, or having a prompt, ala' MessyDos (yick) might be nice.

Thoughts?

- John.
-- 
John F. Haugh II                  SNAIL:  HECI Exploration Co. Inc.
UUCP: ...!ihnp4!killer!jfh                11910 Greenville Ave, Suite 600
      ...!ihnp4!killer!rpp386!jfh         Dallas, TX. 75243
"Don't Have an Oil Well?  Then Buy One!"  (214) 231-0993

reggie@pdn.UUCP (George W. Leach) (11/24/87)

In article <6713@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>In article <407@minya.UUCP> jc@minya.UUCP (John Chambers) writes:
>>My claim is that Unix in fact has a good design for user-friendliness;
>>the problem is vendors and system developers that violate the design by
>>using low-level tools like rm at a higher (user) level.

       It all goes back to the intentions of the development of UNIX.  It was
developed by programmers, for programmers as a programming environment.  

>More precisely, it's making these part of a naive-user interface
>that is at fault.

>Non-expert users need an interface that doesn't require them to type
>> or * or | or other mystical symbols.  There are examples of better
>interfaces for the non-expert in widespread use, for example on the
>Macintosh; 

       But the Macintosh user interface and the UNIX user interface were
designed for two different universes of users!  One pet peeve of mine
over the past several years has been the complaints voiced by people about
the cryptic UNIX UI.  Well, is it any more cryptic than any other real-work
Operating System?  How about MVS?  Or VMS?  I really don't see why the
business world likes IBM PC/XT/AT/MS-DOS UI over that offered by UNIX.  They
are basically the same (in terms of specifying commands that is), except that 
UNIX has many command names that are two character quantities rather than a 
command that reads like a word, e.g. cp vs. copy, pr vs. print, etc.  Naturally,
a full name is easier to remember than a two-character command, but come on, it
just is not that hard.  Furthermore, one could take a version of UNIX and rename
the commands to more mnemonic names, but it would no longer be UNIX.

>               On the other hand, it is not that hard for someone with
>the proper training to learn how to exploit the usual UNIX tools
>effectively, but too often users are just dumped into a UNIX shell
>environment with inadequate guidance and reference and tutorial
>material.  

>It is really fairly easy to whip up a naive-user interface to UNIX;

>Of course, for my own use I prefer flexible power to safety, so I
>would object to the removal of an expert-user interface; but I do
>think a naive-user interface is also needed.

In article <530@mtxinu.UUCP> Ed Gould writes:

>I, too, want an expert-user interface while recognizing the need for
>a naive-user interface.  The problem I've seen with most of the naive-user
>systems is that there's no reasonable migration path from that
>interface to the experts' one.  That is, there's no way for someone
>to move on from being a beginner without learning a completely different
>mechanism for interacting with the system.

      Ed stole my thunder with that one!  I knew I should have replied
yesterday and not waited until today!!!!!

       If the novice user is a programmer who is new to UNIX, then
the goal should be to keep the novice from pain while learning the
system.  However, this must be done transparently.  To do otherwise
would result in the programmer learning some variant of the UNIX UI.  The
transition from the novice level to more experienced level should be as
simple as removing training wheels from your child's bicycle! (Ha, tell that
one to my 5 year old :-) )



      As far as I am concerned (and don't get me wrong, I believe very much
in the kind of work that people like Ben Shneiderman at U. of MD are doing),
I like the UNIX UI because it allows *ME* to do my job effectively.  Now if
I were a businessman who was not at all interested in learning how to use
the leverage of a computer, but only in how it can help me with my spreadsheet,
that is a different matter and a different requirement of a UI!!!!

      UNIX is not for everyone!  It *CAN* form the platform for some very
useful applications and is a wonderful programming environment.  MS-DOS is
no easier to use for a non computer person and it offers less in terms of
capability in return for the difficulties.  The only difference between it
and UNIX from a UI point of view is that there is a larger base of end-user
oriented application software out there that makes life simpler for JOE USER.
Most UNIX shops are engineering/scientific oriented where we don't really
need that kind of UI, but it might be helpful so that I'm not spending lots
of time figuring out how to make the machine do what I want.  That is where
the lack of applications for UNIX hurts.

     Perhaps what we need is a couple more interfaces, like the nsh (Novice
Shell) and bsh (Business Shell).  Ah well, what do you want?  A powerful and
flexible general purpose OS that has a User Interface that is all things to
all people?

-- 
George W. Leach					Paradyne Corporation
{gatech,rutgers,attmail}!codas!pdn!reggie	Mail stop LF-207
Phone: (813) 530-2376				P.O. Box 2826
						Largo, FL  34649-2826

goudreau@xyzzy.UUCP (11/25/87)

In article <1987Nov21.014754.19660@sq.uucp> msb@sq.UUCP (Mark Brader) writes:
>> The rm * disaster catches not only the absent-minded ...
>
>I thought it was about time someone expressed the opposite point of view.
>
>If I type "rm *", it is because I want to remove all the files.  No, not
>all *my* files.  All *the* files that I still have write permission on,
>that are in the current directory.  Usually no more than about 20 of them.
>In short, the proper UNIX* flavored method for protecting important files
>from "rm" is to turn off the write permission bit.

I'm sorry if that's what you want, because that's not what your system
is going to do.  I quote from the rm(1) entry in the 7th Edition
Programmer's Manual:

	"Removal of a file requires write permission in its directory,
	 but neither read nor write permission on the file itself."

Protecting your files in this way is thus an all-or-nothing method,
per directory.

A better way to understand this is to think about what's really
going on at the directory level.  When you remove (or move) a file
within a directory, you never need to read or write the file itself.
You need to rewrite the directory because you wish to change the
contents of the directory file (its dir entries), and so write permission
in the directory is what is required.
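The rule quoted from rm(1) is easy to check at a shell (the path below is
hypothetical): a read-only file in a writable directory still goes away,
because only the directory entry is being rewritten.

```shell
# Demonstrating the rule quoted from rm(1): file permissions do not
# protect against removal; the directory's write bit is what counts.
# (Paths are hypothetical; -f suppresses the interactive prompt.)
dir=/tmp/permdemo
mkdir -p $dir && cd $dir
touch precious
chmod 444 precious          # the file itself is read-only
rm -f precious              # succeeds anyway: the directory is writable
ls precious 2>/dev/null || echo "precious is gone"
```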

barnett@steinmetz.ge.com (Bruce G Barnett) (11/25/87)

In article <2205@killer.UUCP> jfh@killer.UUCP (The Beach Bum) writes:

[ rm could ask for confirmation if more than 15 files were to be deleted ]

| Having the first few lines in main() be something like,
|
|	fflg = (argc > 15) || fflg;
|
|might be nice, or having a prompt, ala' MessyDos (yick) might be nice.
|
|Thoughts?

Don't change rm when you could have a shell script do the same thing!

Sheesh!
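For instance, something along these lines gets jfh's ">15 files means ask
first" behavior without touching /bin/rm (a sketch only; the name "del" and
the details are my own choices, not Barnett's script):

```shell
# Sketch of the ">15 files means confirm" idea as a shell function
# wrapped around the real rm (the name "del" is hypothetical):
del() {
        if [ $# -gt 15 ]; then
                printf 'del: really remove all %d files? ' "$#"
                read ans
                case "$ans" in
                y*|Y*) ;;
                *)     echo 'del: canceled'; return 1 ;;
                esac
        fi
        rm "$@"
}
```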

trb@ima.ISC.COM (Andrew Tannenbaum) (11/25/87)

In article <390@xyzzy.UUCP> meissner@xyzzy.UUCP (Michael Meissner) writes:
> I dunno, but with the System V.[23] bourne shell, you can't even do that
> because "type" is builtin (it tells where on your path a command is, or
> if it's a shell function).  My other "favorite" poor design choice is
> "dump" in the System V.3 sgs (software generation system, ie, C compiler)
> which dumps out the object file into readable form, not what the operator
> does at night.  Sigh.....

These naming arguments are silly.  Why is "type" better than "cat?"
To a novice, is "type:"

	the act of entering data through a keyboard?
	a synonym for "kind?"
	a synonym for "typeface?"
	
Ask a group of folks who never used computers what "type" means, and none
will describe what the "cat" command does.  They might call what "cat"
does "printing" but not "typing."

"Print" is no better than "type."  "Cat" is no worse.  Is "dump" what the
operator does at night?  I know "cats" that "dump" day and night,
usually after meals.  "Poor design choice?"  Fooey.

	Andrew Tannenbaum   Interactive   Boston, MA   +1 617 247 1155

msb@sq.UUCP (11/27/87)

Having had my knowledge of UNIX* insulted in public, I feel obliged to
reply in public.  This is positively my last posting on the topic.
[And if you see it twice, it's not MY fault, I canceled the first one.]

> >In short, the proper UNIX flavored method for protecting important files
> >from "rm" is to turn off the write permission bit.

> I'm sorry if that's what you want, because that's not what your system
> is going to do.

And then he quotes the V7 manual at me, and explains why permissions work
as they do.  Well, he should have read one more paragraph:

#   If a file has no write permission and the standard input is a
#   terminal, its permissions are printed and a line is read from the
#   standard input.  If that line begins with `y' the file is deleted,
#   otherwise the file remains...

This is precisely the kind of interactive prompting that one school of
"rm is too powerful" users like.  But you only get it when you want it.
Sure, write protecting the file doesn't affect what rm has *permission*
to do ... it affects what it *will* do.

As I said in my original posting, I do consider it a misfeature that
if stdin is NOT a terminal then rm proceeds regardless of the file's
permissions.  I think the -f flag should be required in that mode also.
(I also think that having said that should have been sufficient
prevention from having UNIX basics explained to me on the net.)

While I'm posting, I'll add the bit I left out the first time.  I have
made it a habit *not* to hit Return instantly upon typing a line that
has both "rm" and "*" in it.  I pause and reread it.  It's an easy habit
to establish, and it's all the protection I think I need against "rm * .o".

Mark Brader		"Male got pregnant -- on the first try."
utzoo!sq!msb			Newsweek article on high-tech conception
msb@sq.com			November 30, 1987

*"UNIX is a trademark of Bell Laboratories" is a religious incantation.

rbj@icst-cmr.arpa (Root Boy Jim) (11/27/87)

   From: Ed Gould <ed@mtxinu.uucp>

   I, too, want an expert-user interface while recognizing the need for
   a naive-user interface.  The problem I've seen with most of the naive-user
   systems is that there's no reasonable migration path from that
   interface to the experts' one.  That is, there's no way for someone
   to move on from being a beginner without learning a completely different
   mechanism for interacting with the system.

   Some attempts at solving this sort of problem have been clumsy (e.g.,
   the "edit" and "ex" interfaces for editing text), while others have
   just been bad (no example springs to mind:  I try to forget them).

I really don't think the edit/ex distinction is clumsy. All it seems to
do is turn off the `magic' variable, as novices are usually unaware of
the intricacies of regular expressions.

In fact, this might be an excellent solution for the shell; implement
a `magic' variable which is set depending on what name the shell
was invoked under.
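A sketch of how that dispatch might look (the name "nsh" is hypothetical,
and `set -f` is used here only to illustrate turning expansion off):

```shell
# Sketch: a shell could check the name it was invoked under and
# disable `magic' (filename expansion) when run as a novice shell.
# The name "nsh" is hypothetical; set -f/+f toggle globbing.
case "${0##*/}" in
nsh) set -f ;;   # novice shell: no filename expansion
*)   set +f ;;   # expert shell: globbing as usual
esac
```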


   Ed Gould              mt Xinu, 2560 Ninth St., Berkeley, CA  94710  USA
   {ucbvax,uunet}!mtxinu!ed    +1 415 644 0146

   "`She's smart, for a woman, wonder how she got that way'..."

A far cry from your quality/equality statement.

	(Root Boy) Jim Cottrell	<rbj@icst-cmr.arpa>
	National Bureau of Standards
	Flamer's Hotline: (301) 975-5688
	Edwin Meese made me wear CORDOVANS!!

stpeters@dawn.steinmetz (Dick St.Peters) (11/27/87)

In article <530@mtxinu.UUCP> ed@mtxinu.UUCP (Ed Gould) writes:
>People who design interfaces for novices should remember that a very
>large fraction of users are complete novices for only a short time.

Sigh ... if only that were really true.

>The only users who do not tend to progress
>are those who use a system only very infrequently and essentially re-
>learn each time.

... and those who use it a lot but "don't want to learn any more about
the system than they *have* to"  ("I'm judged by how much work I get
done, not by how much I know about the computer.")

... and those who are petrified by keyboards (or mice, or ...)

... and those who somehow just don't learn  (I know one very bright
engineer, eminent in his field, who for years has spent the major part
of every day at a terminal writing and running programs, yet who still
does not comprehend the concept of a "process" and how it differs from
a "program".)

... and those who are stuck in their ways  (We had a user on our old
mainframe who insisted on programming stacks of Hollerith cards for
years after interactive time-sharing offered the alternatives of
either interactive computation or batch jobs submitted as files of
"card images".)

The world will likely always be full of novices who remain novices,
and they will always require hand-holding.

As Gould argues, there should be an evolutionary path from novice
interface to expert interface.  However, the overall interface should
*encourage* the novice to take that path, not just passively allow the
possibility.
--
Dick St.Peters                        
GE Corporate R&D, Schenectady, NY
stpeters@ge-crd.arpa              
uunet!steinmetz!stpeters

stpeters@dawn.steinmetz (Dick St.Peters) (11/27/87)

In article <1819@pdn.UUCP> reggie@pdn.UUCP (George Leach) writes:
>One pet peeve of mine
>over the past several years has been the complaints voiced by people about
>the cryptic UNIX UI.  Well, is it any more cryptic than any other real-work
>Operating System?  How about MVS?  Or VMS?

It's a pet peeve of mine too, but one I understand.  I can't comment
on MVS, but UNIX vs. VMS is an everyday issue here, and no, UNIX is
not more cryptic than VMS *as_used_by_experts*.  However, for the
novice user, things are different.  For a novice wanting to see the
contents of file junk.txt, it is comforting to say "type junk.txt",
which resembles English.  Watch an expert, though, and s/he will use
"t/p junk", as cryptic as any UNIX command.

(The VMS interface is not always so friendly to novices: name the file
"junk" instead of "junk.txt", and a novice may never figure out how to
read it.  As for expert interfaces, rename the expert's .emacs file to
sav.emacs and watch him/her try to recover.)
--
Dick St.Peters                        
GE Corporate R&D, Schenectady, NY
stpeters@ge-crd.arpa              
uunet!steinmetz!stpeters

mkhaw@teknowledge-vaxc.ARPA (Mike Khaw) (11/28/87)

> (The VMS interface is not always so friendly to novices: name the file
> "junk" instead of "junk.txt", and a novice may never figure out how to
> read it.  As for expert interfaces, rename the expert's .emacs file to
> sav.emacs and watch him/her try to recover.)

Yeah, I really hate it when VMS outsmarts itself.  I had this exact problem
when I inserted a bug into my .emacs file and decided "No problem.  I'll
just rename it to emacs.init, run emacs, edit the file, and rename it back".
Boy was I ticked off when "rename emacs.init .emacs" gave me "emacs.emacs".
Of course, I didn't discover it until the next time I ran emacs and I didn't
get my usual environment.

Mike Khaw
-- 
internet:  mkhaw@teknowledge-vaxc.arpa
usenet:	   {uunet|sun|ucbvax|decwrl|uw-beaver}!mkhaw%teknowledge-vaxc.arpa
USnail:	   Teknowledge Inc, 1850 Embarcadero Rd, POB 10119, Palo Alto, CA 94303

blarson@skat.usc.edu (Bob Larson) (11/28/87)

In article <7995@steinmetz.steinmetz.UUCP> dawn!stpeters@steinmetz.UUCP (Dick St.Peters) writes:
>(The VMS interface is not always so friendly to novices: name the file
>"junk" instead of "junk.txt", and a novice may never figure out how to
>read it.  As for expert interfaces, rename the expert's .emacs file to
>sav.emacs and watch him/her try to recover.)

I'm no VMS expert, but I know a way to recover.  Use a gun to put a few
bullets in the appropriate disk drive.  (When it is replaced and the
backups restored, my .emacs reappears. :-)
--
Bob Larson		Arpa: Blarson@Ecla.Usc.Edu
Uucp: {sdcrdcf,cit-vax}!oberon!skat!blarson		blarson@skat.usc.edu
Prime mailing list (requests):	info-prime-request%fns1@ecla.usc.edu

bzs@bu-cs.bu.EDU (Barry Shein) (11/30/87)

Gak, this discussion comes up every few months doesn't it? And,
predictably, not one poster ever offers anything beyond the thinnest
anecdotal evidence. No research papers or even informal, controlled
studies, nothing. Just introspective, armchair psychology.

I do know that AT&T has made MegaSagans of US$'s with a user interface
that requires people to type in long strings of digits to contact
their friends and business associates.

I suppose we tend to remember the user who whines about learning the
system (and their rationalizations) more than the silent masses who
just seem to be able to remember that "cat" is short for "caterwaul",
as in "caterwaul that file for me" (and, of course, the -v means
caterwaul less vigoroso, it's all very clear if you grok the aural
traditions involved.)

I'm not even completely convinced that the goal of a computer
interface design is to make it easy for any idiot to use the system
with no effort.  Unless perhaps the hidden agenda is to turn everybody
in sight into a data entry clerk.  Perhaps.

	-Barry Shein, Boston University

hubcap@hubcap.UUCP (Mike Marshall) (12/01/87)

In article <1987Nov27.011955.10801@sq.uucp>, msb@sq.uucp (Mark Brader) writes:
> While I'm posting, I'll add the bit I left out the first time.  I have
> made it a habit *not* to hit Return instantly upon typing a line that
> has both "rm" and "*" in it.  I pause and reread it.  It's an easy habit
> to establish, and it's all the protection I think I need against "rm * .o".

I agree.  I can be as scatterbrained as they come, but I have cultivated the
above habit, and I don't think I have EVER lost any files with "rm * .o"
(or whatever).  I always automatically reread whatever I've typed when
using rm; it's not a hassle, because I do it without thinking.

Another habit that I have established is "rm -i" whenever I am su'ed to root.

You can take your good habits with you to a new environment... but maybe not
your aliases :-).

-Mike Marshall       hubcap@hubcap.clemson.edu        ...!hubcap!hubcap

allbery@ncoast.UUCP (12/01/87)

As quoted from <392@xyzzy.UUCP> by goudreau@xyzzy.UUCP (Bob Goudreau):
+---------------
| In article <1987Nov21.014754.19660@sq.uucp> msb@sq.UUCP (Mark Brader) writes:
| >If I type "rm *", it is because I want to remove all the files.  No, not
| >all *my* files.  All *the* files that I still have write permission on,
| 
| I'm sorry if that's what you want, because that's not what your system
| is going to do.  I quote from the rm(1) entry in the 7th Edition
| Programmer's Manual:
| 
| 	"Removal of a file requires write permission in its directory,
| 	 but neither read nor write permission on the file itself."
+---------------

True enough -- at the level of unlink().  But if you'll unalias (or un-
function, if you're a System V type) rm for a moment and try to "rm" a file
which is write-protected without using the "-f" flag, you'll see:

bsd% rm foo
foo 444 mode _

$ rm foo	#system V
foo: 444 mode ? _

The biggest problem with this is that it's rather difficult to edit a C
program that's been "rm"-proofed in this manner....
-- 
Brandon S. Allbery		      necntc!ncoast!allbery@harvard.harvard.edu
 {hoptoad,harvard!necntc,cbosgd,sun!mandrill!hal,uunet!hnsurg3}!ncoast!allbery
			Moderator of comp.sources.misc

jc@minya.UUCP (12/01/87)

In article <10579@brl-adm.ARPA>, bzs@bu-cs.bu.EDU (Barry Shein) writes:
> 
> Gak, this discussion comes up every few months doesn't it? And,
> predictably, not one poster ever offers anything beyond the thinnest
> anecdotal evidence. No research papers or even informal, controlled
> studies, nothing. Just introspective, armchair psychology.

Yeah, and have you noticed that most of the postings have casually ignored
the original question, and just gone on to a trivial discussion of novices
who can't handle rm?  This is a unix.wizards discussion?  I'm disappointed
with y'all!  Here I was expecting some really juicy examples of bad system 
design.  All that's appeared is a hacker's version of Trivial Pursuit.

> I do know that AT&T has made MegaSagans of US$'s with a user interface
> that requires people to type in long strings of digits to contact
> their friends and business associates.

And IBM makes similar income from JCL.  Perhaps good user interfaces are
a bad marketing idea.  I mean, if you want to be the size of Apple, maybe
you can sell a good user interface.  But if you want to be the size of IBM
or AT&T, you should sell incomprehensible interfaces.  Which has the market 
rewarded best?

[OK, so I've confused cause and effect; let's see the evidence that I'm
wrong. :-]

-- 
John Chambers <{adelie,ima,maynard,mit-eddie}!minya!{jc,root}> (617/484-6393)

paul@umix.cc.umich.edu ('da Kingfish) (12/01/87)

In article <7994@steinmetz.steinmetz.UUCP> dawn!stpeters@steinmetz.UUCP (Dick St.Peters) writes:
>As [Ed] Gould argues, there should be an evolutionary path from novice
>interface to expert interface.  However, the overall interface should
>*encourage* the novice to take that path, not just passively allow the
>possibility.
>--

Yes, I think the key idea here is the *overall* interface.  For
example, three people were working on a software project under my
direction.  One rm-ed an entire directory of source, and stammered
something about rm star, spaces, backslashes, and something else.  He
was sweating profusely, and had something to say about the
inappropriate user interface that /bin/csh had, etc.

Well, this has probably happened to some of you, and it's always "well,
we lost a day's worth of work, but we had really good backups, etc."

Well, we hadn't done backups in about two months.

So, I fired him.

The other two saw the "big picture" (or what I believe
dawn!stpeters refers to as "the overall interface") and got right on
that evolutionary path!

--paul
-- 
Trying everything that whiskey cures in Ann Arbor, Michigan.
Over one billion messages read.

david@elroy.Jpl.Nasa.Gov (David Robinson) (12/01/87)

In article <2975@umix.cc.umich.edu>, paul@umix.cc.umich.edu ('da Kingfish) writes:
< For
< example, three people were working on a software project under my
< direction.  One rm-ed an entire directory of source, and stammered
< something about rm star, spaces, backslashes, and something else.  He
< was sweating profusely, and had something to say about the
< inappropriate user interface that /bin/csh had, etc.
 
< Well, this has probably happened to some of you, and it's always "well,
< we lost a day's worth of work, but we had really good backups, etc."
 
< Well, we hadn't done backups in about two months.
 
< So, I fired him.
 

Mistake, you should have fired the person who did not have the brains to
do more frequent backups if you had such critical code!

That wasn't your decision was it? ;-)

-- 
	David Robinson		elroy!david@csvax.caltech.edu     ARPA
				david@elroy.jpl.nasa.gov	  ARPA
				{cit-vax,ames}!elroy!david	  UUCP
Disclaimer: No one listens to me anyway!

tim@amdcad.AMD.COM (Tim Olson) (12/01/87)

In article <421@minya.UUCP> jc@minya.UUCP (John Chambers) writes:
| Yeah, and have you noticed that most of the postings have casually ignored
| the original question, and just gone on to a trivial discussion of novices
| who can't handle rm?  This is a unix.wizards discussion?  I'm disappointed
| with y'all!  Here I was expecting some really juicy examples of bad system 
| design.  All that's appeared is a hacker's version of Trivial Pursuit.

Back to the original discussion, here is an example Alan Kay gave in a
talk at Stanford about 2 years ago (paraphrased by me and my potentially
faulty memory!):

To test out new user interfaces, Xerox would videotape novice users
working with the system.  In one particular instance, one person was to
perform a task that required a DoIt command at the end (from a pull-down
menu).  He kept repeating the cycle of performing everything up to the
DoIt, pulling down the menu, going to the DoIt entry in the menu,
muttering something under his breath, then quitting out of the menu.

Upon review of the tape, the researchers discovered that the person was
muttering "DOLT!..  I'm not a dolt".  They then realized that DoIt (with
an uppercase I) *did* look like the word "dolt" in the sans-serif font
they had for the system.  They later changed it to "doit" (lowercase
'i'). 

	-- Tim Olson
	Advanced Micro Devices
	(tim@amdcad.amd.com)

franka@mmintl.UUCP (12/01/87)

[I have directed follow-ups to comp.cog-eng only.]

In article <1987Nov27.011955.10801@sq.uucp> msb@sq.UUCP (Mark Brader) writes:
>While I'm posting, I'll add the bit I left out the first time.  I have
>made it a habit *not* to hit Return instantly upon typing a line that
>has both "rm" and "*" in it.  I pause and reread it.  It's an easy habit
>to establish, and it's all the protection I think I need against "rm * .o".

I agree.  Without having particularly thought about it, I do the same thing.
I suspect that most experienced programmers do, too.

This, of course, makes it no less a human interface problem.  The only
people who can fix the problem are the people who don't need to.
-- 

Frank Adams                           ihnp4!philabs!pwa-b!mmintl!franka
Ashton-Tate          52 Oakland Ave North         E. Hartford, CT 06108

allbery@ncoast.UUCP (12/02/87)

As quoted from <10515@brl-adm.ARPA> by dsill@NSWC-OAS.arpa (Dave Sill):
+---------------
| >It is trivial to say "ln /bin/cat /bin/type"; why don't we do it?
| 
| One reason is that linking `type' to `cat' doesn't create a man page
| for `type'.
+---------------

....but once you bring man pages into it, we've got far worse problems than
cryptic commands and typos in * commands!
-- 
Brandon S. Allbery		      necntc!ncoast!allbery@harvard.harvard.edu
 {hoptoad,harvard!necntc,cbosgd,sun!mandrill!hal,uunet!hnsurg3}!ncoast!allbery
			Moderator of comp.sources.misc

brianc@cognos.uucp (Brian Campbell) (12/02/87)

In article <1987Nov27.011955.10801@sq.uucp> msb@sq.UUCP (Mark Brader) writes:
> In short, the proper UNIX flavored method for protecting important files
> from "rm" is to turn off the write permission bit.

Marking selected files as read-only is often useful for protecting files
in the singular sense.  However, it is also possible to protect an
entire directory from accidental erasure with:

   touch \!
   chmod -w \!

Now, when the careless (?) user enters "rm * .o" or any variation
thereof (excluding the addition of -f), the first file rm will encounter
will be !  (unless someone has filenames starting with spaces or other
unprintables).  rm will ask if the user really wants to delete this
file.  At this point, an INTR will stop rm from deleting any files at
all; answering n will simply tell rm not to delete that single file.

> As I said in my original posting, I do consider it a misfeature that
> if stdin is NOT a terminal then rm proceeds regardless of the file's
> permissions.  I think the -f flag should be required in that mode also.
> (I also think that having said that should have been sufficient
> prevention from having UNIX basics explained to me on the net.)

I do not think this is a "misfeature".  With shell scripts and system() calls
I have a chance after I have typed the command to verify that it is indeed
what I wanted.  When interactive, it's too late after I've pressed return.
-- 
Brian Campbell        uucp: decvax!utzoo!dciem!nrcaer!cognos!brianc
Cognos Incorporated   mail: POB 9707, 3755 Riverside Drive, Ottawa, K1G 3Z4
(613) 738-1440        fido: (613) 731-2945 300/1200, sysop@1:163/8

dave@lsuc.uucp (David Sherman) (12/03/87)

cameron@elecvax.eecs.unsw.oz (Cameron Simpson) writes:
>I once used something calling itself `nsh' on a System V machine, and typed
>	$ cd thnig
>and thought "bother, I meant `thing'" and was then disconcerted when it said
>	path/thing
>	$
>back at me. It had fixed the transposed characters and dropped me in the right
>spot! Hopefully it only happened in interactive mode, but it was very
>disconcerting.

We have that in our Bourne shell here.  You get used to it very
quickly, and it's VERY handy.  Yes, it only works in interactive mode.

As far as I remember, the origins of spelling-correction for chdir
in sh go back to Tom Duff adding it to the v6 shell at U of Toronto
around 1976 or so.  I then pulled out td's spname() routine and
began plugging it into other utilities on our v6 11/45, when
used interactively (p, cmp, and a few others).  The routine
accompanied Rob Pike on his travels when he left U of T, and
it shows up in Kernighan & Pike (with credit to Duff, I believe;
don't have a K&P handy).

In the original version, it would ask you:
	$ cd /ibn
	cd /bin? y
	$
The version of sh currently on our system (I got this part
from sickkids!mark) doesn't bother asking, which I think is right
because you often type ahead and don't want some command swallowed
as an answer to "cd foo?".  It just does it:
	$ cd /ibn
	cd /bin
	$

If anyone with a source license wants the code in sh to
implement this, let me know.  It's pretty trivial once you have
spname(3), which we use all over the place now (more(1), for example).

David Sherman
The Law Society of Upper Canada
-- 
{ uunet!mnetor  pyramid!utai  decvax!utcsri  ihnp4!utzoo } !lsuc!dave
Pronounce it ell-ess-you-see, please...

sys1@anl-mcs.arpa (12/04/87)

     In an article in UNIX-WIZARDS-DIGEST (V4#088) paul@umix.cc.umich.EDU 
writes:
	
	> Yes, I think the key idea here is the *overall* interface.  For
	> example, three people were working on a software project under my
	> direction.  One rm-ed an entire directory of source, and stammered
	> something about rm star, spaces, backslashes, and something else.  He
	> was sweating profusely, and had something to say about the
	> inappropriate user interface that /bin/csh had, etc.
	
	> Well, this has probably happened to some of you, and it's always "well,
	> we lost a day's worth of work, but we had really good backups, etc."
	
	> Well, we hadn't done backups in about two months.
	
	> So, I fired him.
	
	> The other two saw the "big picture (or what I believe
	> dawn!stpeters refers to as "the overall interface") and got right on
	> that evolutionary path!
	
I submit that in that case perhaps the wrong person was fired.  Perhaps the
manager of a system that is not backed up in two months or more has already
demonstrated a considerably more significant level of incompetence and danger
to his installation than someone who makes a typing error.  Even secretaries,
who are professional typists, are expected to make an occasional typing error.

                               Scott Bennett
                               Systems Programming
                               Northern Illinois University
                               DeKalb, Illinois 60115

                      UUCP:  ...!anlams!niuvax!sys1
                      BITNET:  A01SJB1@NIU

stpeters@dawn.UUCP (12/05/87)

In article <10579@brl-adm.ARPA> bzs@bu-cs.bu.EDU (Barry Shein) writes:
>Gak, this discussion comes up every few months doesn't it?

Yeah.  Too bad it has to keep coming up.  It should never stop.

>predictably, not one poster ever offers anything beyond the thinnest
>anecdotal evidence. No research papers or even informal, controlled
>studies, nothing. Just introspective, armchair psychology.

Papers and studies may someday make for a wonderful user interface,
but in the meantime we have to live with - and help others live with -
the interface(s) we've got.  In a few weeks, I will have been helping
people make the VMS-->UNIX transition for five years, and other
peoples' warnings, hints, suggestions, etc. have been of great help.
Even the tiresome rm * discussions occasionally bring up something of
interest.

>I do know that AT&T has made MegaSagans of US$'s with a user interface
>that requires people to type in long strings of digits to contact
>their friends and business associates.

Thank you for your research report. :-)

>I'm not even completely convinced that the goal of a computer
>interface design is to make it easy for any idiot to use the system
>with no effort.

There are people who do not adapt well to using computers but are far
from being idiots, including at least one member of my group who could
design a computer from scratch (the chips, the boards, and the bus)
but will never be a very comfortable user of one.  However, he does
have to use one routinely, and it's a part of my job to make that as
painless as possible.
--
Dick St.Peters                        
GE Corporate R&D, Schenectady, NY
stpeters@ge-crd.arpa              
uunet!steinmetz!stpeters

wcs@ho95e.ATT.COM (Bill.Stewart) (12/07/87)

In article <771@hubcap.UUCP> hubcap@hubcap.UUCP (Mike Marshall) writes:
:I agree. I can be as scatter brained as they come, but I have cultivated the
:above habit, and I don't think I have EVER lost any files with "rm * .o" 

Must be nice.  I had a spurious file once, called `*', and removed it.
I realized what I'd done about the time the $ came back; this was when
I learned about nightly backups (the administrators did them), and rm -i.

At Purdue, the local version of 4.*BSD had modified rm to move things
to /tmp/graveyard instead of really deleting them; they'd stick around
48 hours or so.  You could use the real rm if you wanted to.  Of
course, this doesn't prevent other ways of trashing files, though
noclobber helps.  One of the few things I appreciate about VMS is the
file versioning; every time you modify a file, it creates a new copy of
it (I assume at open-file-for-writing time?).  Even a one-deep automatic
backup would be helpful; emacs does this but vi and ed don't.
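The graveyard idea is easy to approximate without touching rm's source.
Below is a minimal sketch only, not Purdue's actual modification; the
function name, graveyard path, and purge policy are all assumptions:

```shell
# safe_rm: move files to a graveyard instead of unlinking them.
# A sketch in the spirit of the Purdue change described above --
# name, path, and expiry policy are assumptions, not their code.
safe_rm() {
    grave=${GRAVE:-/tmp/graveyard.$LOGNAME}
    mkdir -p "$grave" || return 1
    mv -- "$@" "$grave"/
}

# A nightly cron job could then purge entries older than ~two days:
#   find /tmp/graveyard.* -mtime +2 -exec rm -rf {} +
```

Name collisions (re-removing a file of the same name) and directory
arguments are left as an exercise; the real rm is still there for
deliberate deletion.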
-- 
#				Thanks;
# Bill Stewart, AT&T Bell Labs 2G218, Holmdel NJ 1-201-949-0705 ihnp4!ho95c!wcs

ejp@ausmelb.oz.au (Esmond Pitt) (12/09/87)

In article <10659@brl-adm.ARPA> niuvax!sys1@anl-mcs.arpa (Systems Programmer) writes:
<     In an article in UNIX-WIZARDS-DIGEST (V4#088) paul@umix.cc.umich.EDU 
<writes:
<	> One rm-ed an entire directory of source ...
<	> Well, we hadn't done backups in about two months.
<	> So, I fired him.
<	
<I submit that in that case perhaps the wrong person was fired.  Perhaps the
<manager of a system that is not backed up in two months or more has already
<demonstrated a considerably more significant level of incompetence and danger
<to his installation than someone who makes a typing error.

Hear hear. Once upon a time a manager fired a payroll programmer for updating
only one of the two occurrences of the tax rate in a program.
He should have fired himself, for not manifesting the constant or putting
it outside in a file, when he wrote the program some years before.

Managers have occupational hazards too.

-- 
Esmond Pitt, Austec International Ltd
...!seismo!munnari!ausmelb!ejp,ejp@ausmelb.oz.au

matt@ncr-sd.SanDiego.NCR.COM (Matt Costello) (12/12/87)

The real problems in interface design generally occur because of
unstated assumptions.  We had a hilarious incident occur here
recently...

There is a plan here to put a PC on every desk, including those of
secretaries and managers.  So that each individual would not have
to waste time acquiring a decent editor and the other tools, a
package was put together containing 10 disks of software that was
PD or that we had a license for.  At the part of the installation
process where the files were to be copied on the hard disk, the
instructions said to insert each of the 10 disks into the floppy
disk drive.  Imagine our surprise when a worried secretary called
to say that she had been able to fit only 5 of the disks into the
disk drive.  Fortunately no damage occurred and the instructions
were quickly changed.

Disclaimer: I don't have a PC, nor do I want one.
-- 
Matt Costello	<matt.costello@SanDiego.NCR.COM>
+1 619 485 2926	<matt.costello%SanDiego.NCR.COM@Relay.CS.NET>
		{sdcsvax,cbosgd,pyramid,nosc.ARPA}!ncr-sd!matt

gwyn@brl-smoke.ARPA (Doug Gwyn ) (12/13/87)

In article <1943@ncr-sd.SanDiego.NCR.COM> matt@ncr-sd.SanDiego.NCR.COM (Matt Costello) writes:
>instructions said to insert each of the 10 disks into the floppy
>disk drive.  Imagine our surprise when a worried secretary called
>to say that she had been able to fit only 5 of the disks into the

Reminds me about the time when someone took the instructions
	"Remove the floppy disk from its protective sleeve
	and insert the disk into the drive."
too literally.  Oh, that black plastic square isn't the protective sleeve?

clif@chinet.UUCP (Clif Flynt) (12/15/87)

In article <1943@ncr-sd.SanDiego.NCR.COM> matt@ncr-sd.SanDiego.NCR.COM (Matt Costello) writes:
>The real problems in interface design generally occur because of
>unstated assumptions.  We had a hilarious incident occur here
>recently...
>
>  Imagine our suprise when a worried secretary called
>to say that she had been able to fit only 5 of the disks into the
>disk drive.  
>

  A similar incident happened to a friend, diagnosing a floppy disk read 
problem over the phone.  
  "Have you cleaned the disk?" he inquired, thinking that the heads might
be dirty.
  "I'll try it and call you back," said the person at the other end, and
about 10 minutes later called back to inform my friend: "I took the disk
out of that black wrapper, and you were right, it was covered with brown
dusty stuff.  I cleaned that all off, but it still doesn't work."


  There is also the tale of the DP manager who wanted to make sure that
nobody would overwrite the data on his tapes.  He filled the slots where
the write-enable rings would go with epoxy, so that no-one could put
a write enable ring in.  He didn't realize that ANYTHING in that slot will 
enable the tape for writing.

  Another friend of mine tells the tale of a system where people 
could log in OK as long as they sat in front of the terminal.
If they stood in front, then their password was rejected.
  It finally turned out that two key-caps on the keyboard had been swapped.
When people sat, they put their fingers on the 'home row' and typed,
but standing, they typed with two fingers, and looked at the key-caps to 
see which keys to press.
-- 
------------------------------------------------------------------------
My Opinions are my own. I can't imagine why anyone else would want them.
Clif Flynt	ihnp4!chinet!clif
------------------------------------------------------------------------

fairchil@ARDEC.arpa (GUEST-EAI) (12/22/87)

Remembering from a Wall Street Journal article last week:

     A novice being monitored while using a package for the first time
     hit 'CANCEL' when 'DOIT' was the obvious intended choice, and
     commented: "I'm no dolt."