[comp.unix.questions] the file that got away

root@thestepchild.sgi.com (Super-User) (04/12/91)

In his message, msc%SUN2.NCKU.EDU.TW@VM.TCS.Tulane.EDU said :
> 
> Dear Netters :
>       Is there any tool which can be used to undelete UNIX files ?
> 

*** on soapbox! ***

A while back someone posted a review of a Norton Utilities package that
hacked over the appropriate system calls to allow deleted files to be
resurrected.  A number of UNIX purists blasted this package, because,
a) it isn't UNIX, and b) NU isn't a UNIX shop.  These may be valid
reasons for mistrusting a vendor's add-on package, but then, how many
window-based interfaces have been developed with the explicit goal of
papering over UNIX?  Yet, those interfaces are still useful and many of
these same UNIX purists use them.

Personally, I felt that as long as the NU package worked as advertised
and was otherwise transparent to both the C and user interfaces (which
it apparently was), it was just fine.  Nobody complained about
inserting vnodes in the file system to support NFS.  Anyway, if you
really want to buck UNIX's wild and wooly tradition, you could look
into purchasing that.

Another more common approach is to use an alias for rm that moves the
defunct file to a directory that gets purged when you log out, or at
intervals by cron.  This solution is free, but doesn't cover you in the
case of files that are clobbered by a redirection (>).  However, if you
use csh, you can set the noclobber variable to prevent that from biting
you.  Between these two admitted hacks, you're covered.  I'm not sure
if ksh allows you an out for redirections.  I know that sh doesn't.
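
In csh terms, a minimal sketch of the arrangement, assuming a ~/.trash
directory you've created yourself (the names ~/.trash and realrm are
arbitrary, and this naive alias will trip over rm's own options like
-r or -i), looks something like:

    # in ~/.cshrc -- "rm" now stashes files instead of destroying them
    alias rm      'mv \!* ~/.trash'
    alias realrm  '/bin/rm'        # escape hatch for when you really mean it
    set noclobber                  # > now refuses to overwrite an existing
                                   # file; use >! when you do mean to clobber

    # in ~/.logout (or a crontab entry), empty the stash
    /bin/rm -rf ~/.trash/*

(ksh, for what it's worth, does have a noclobber option of its own,
enabled with set -o noclobber.)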

As much as I admire UNIX for its overall design philosophy--the toolkit
idea is a very powerful one!-- I said then that the inability to
retrieve "deleted" files is a flaw in the design of the interface, and
I still believe that.  Why?  Because it allows users to lose data by
accident.  In any other context, that would be considered a serious
bug, perhaps even a showstopper.  Also, any such hole that can't be
plugged by a user (the redirection problem can't be aliased away)
reflects badly on an otherwise admirable design.

Maybe retrieval of defunct files isn't the way to go, but if not, then
allowing confirmation prior to truncations or deletions certainly is.

Removal of data should be treated as another form of file access, with
(yet) another permissions mode bit.  This would allow users to identify
files containing data that shouldn't just vanish.  They could set up
their umasks to enable or disable the confirm-before-clobber bit by
default.  The system calls could check this bit along with the access
checks they already make, and if set, prompt for confirmation before
deleting or truncating.  The -f option to rm could still get 'round
this.  If you had to, you could add an envar to have the shell disregard
this bit as well (such as for use with make).
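
In the meantime, the mode bits we already have give a weak
approximation of that.  Deleting a file needs only write permission on
the directory, not on the file itself, but a file you've made
unwritable can't be truncated by > at all, and rm (run without -f from
a terminal) will stop and ask before unlinking it.  Roughly, with
thesis.draft standing in for anything you care about (the prompt's
exact wording varies from one rm to the next):

    chmod a-w thesis.draft   # mark the file: this data shouldn't just vanish
    date > thesis.draft      # fails: the shell can't open the file for
                             # writing, so no accidental truncation
    rm thesis.draft          # rm sees an unwritable file and asks for
                             # confirmation before unlinking it
    rm -f thesis.draft       # -f still deletes it silently, just as in the
                             # scheme sketched above

It's no substitute for the real fix, since nothing reminds you to do
the chmod in the first place, but it does cover the files you already
know you care about.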

This all seems straightforward, and I don't understand why so many
brilliant people have wasted so many man-years defending such a nasty
misfeature, one that has hurt so many users and done so much damage to the
reputation of UNIX, when it would be so easy to fix.

Personally, if I were in the business of marketing yet another flavor
of UNIX to industry and government customers, I'd find a big commercial
advantage in having that particular hole plugged.

Flame away, but please address the technical issues when you do.  The
argument from history will carry no weight with me, nor with the next
new user who loses a day's work.  It will be no comfort at all to know
that it's happened to so many others.  He won't think it's an endearing
feature.  If he comes to like the rest of UNIX well enough, he'll learn
to tolerate this bug, as we all have had to.  But let's stop pretending
that it isn't a bug.  It is.  There, I said it.

*** off soapbox! ***

-r

Disclaimer: no one would ever allow me to represent them in any capacity
after saying what I just said.