davidson (06/11/82)
To remove a link named - (dash), try
    rm ./-
or any other path name for it which does not begin with a dash (a leading dash would make it a flag). Such links are often created when a typo messes up the typing of a flag and the initial dash gets separated. I've answered this question often enough that I hope the answer is of general interest.
Greg Davidson
mkg (06/12/82)
What I use to remove files that have "interesting" characters is
    rm -i *
What you get is an individual yes/no question about removing each file in the directory. The best thing is that it works on ANYTHING. Someone I work with managed to create a file name that was a space!
Marsh Gosnell  BTL Piscataway  (201) 981-2758  npois!pyuxbb!mkg
nrf (06/12/82)
I would like to point out that 'rm -i *' doesn't work on files that have names with non-ASCII (> 127) characters. The only way I have found to remove such files is to use the following program statement, coding the bizarre characters in octal:
    unlink("abc\341\342\343.c");
N. R. Fildes  BTL - Whippany
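Spelled out as a complete program it would look like the sketch below (the octal bytes are whatever od shows for the real name; abc\341\342\343.c is just the example name from above):

    #include <stdio.h>

    main()
    {
        /* \341 \342 \343 are the non-ASCII bytes, written in octal */
        if (unlink("abc\341\342\343.c") < 0)
            perror("unlink");
    }

Compile it with cc and run it in the directory that holds the file.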
mark (06/12/82)
Removing a file whose name contains a space is not hard, if you know exactly what the file name is. Just type something like
    rm "foo bar"
which quotes the space. Globbing can also be used, e.g.
    rm foo*
if there are no other foo*'s, or
    rm foo?bar
rm -i is a useful tool if you're not sure of the file name.

However, in some versions of UNIX (4.1BSD in particular) a buggy program can create a file whose name contains a meta character (e.g. 0200-0377). The kernel will create such a file, but the unlink operation strips the parity bit, making it impossible to unlink, even with shell globbing. (It might be the shell that strips the parity; I'm not sure.) In any case, a sledge hammer approach that seems to work even in this case is to:
    (1) cd to the directory, say "dir"
    (2) create a directory, say ../foo
    (3) mv * ../foo
        This will fail on the funny file. If you have subdirectories this
        is harder - you have to do some work by hand. But eventually
        you'll get everything moved except the bad file.
    (4) cd ..
    (5) rm -rf dir
This gets rid of it. I'm not sure if an unreachable directory entry gets created which will be cleaned up on the next fsck, but it seems not to even leave that.
Mark
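One way to narrow down where the parity bit gets stripped is a small test program along the lines of the sketch below (the name foo\240bar, a space with the high bit set, is just a stand-in for whatever a buggy program might produce): it creates such a file and then unlinks it using the identical bytes, going straight through the system calls. If that succeeds while rm and shell globbing cannot touch the same sort of file, the stripping is happening on the shell/rm side rather than in the kernel's unlink.

    #include <stdio.h>

    main()
    {
        char *name = "foo\240bar";   /* \240 is a space with the 0200 bit set */

        if (creat(name, 0644) < 0) {  /* make a file with a meta character in its name */
            perror("creat");
            exit(1);
        }
        if (unlink(name) < 0)         /* remove it with exactly the same bytes */
            perror("unlink");
        else
            printf("unlink with the raw bytes worked\n");
    }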
lepreau (06/13/82)
This is another task that "dired" is perfect for-- no pain, no strain. (Posted on net.sources about a week ago.) -jay
clives (06/13/82)
Seems like the easiest way I've found to rm problem filenames (Murphy's law
says these things get generated in the top of trees) is the following:
(No comments on explanatory style, please, you experts already know how).
1. cd to the directory, then od -c . > temp.c
2. In temp.c, find the line(s) which contain the monster(s), and delete
the rest. See hints below. Each line will look like this:
0000360 303 " j u n k 222 \t n a m e \0 \0 \0 \0
3. Delete the first 3 elements (3dw in vi), which are the dump byte label
and the file system pointer, leaving the 14-char file name with non-ASCII
bytes in backslash-escape or octal notation:
j u n k 222 \t n a m e \0 \0 \0 \0
4. Then put backslash in front of any numbers which don't already have one:
j u n k \222 \t n a m e \0 \0 \0 \0
5. Remove all blanks from the string(s) (:%s/ //g does this in vi), leaving:
junk\222\tname\0\0\0\0
6. Write a C program like the following around the name(s):
main ()
{
    /* one line like this for each name; the string ends at the first \0,
       so the trailing padding bytes could just as well be left out */
    unlink("junk\222\tname\0\0\0\0");
}
7. Leave the editor, cc temp.c, run a.out, and your junk should be gone.
Hints:
It may seem a little difficult the first time to find really weird junk files
in the od output (the kind which ls shows as cr?u??d???). The ?'s usually
represent non-ASCII bytes, shown by their octal or backslash-escape values,
as 222 and \t are above.
There will probably be some extra names shown in the dump besides the presently
active ones. In these inactive entries, the pointer (the first and second bytes,
which are the second and third elements of the dump lines) will be \0 \0:
0000420 \0 \0 n o t _ u s e d _ n o w \0 \0
Removing them from the dump file will leave you with lines which should match
one for one with the output of ls -a.
Incidentally, some of the filenames I've had to wipe around here..........
Clive Steward
Tektronix, Beaverton.
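The od-and-edit procedure above can also be rolled into one small program. The sketch below is only a sketch: it assumes the old 16-byte directory entries (a 2-byte inode number followed by a 14-character name, which is exactly what the od lines above show) and a system such as V7 or 4.1BSD where a directory can be opened and read like an ordinary file; the struct and variable names are made up for the example. It scans the current directory, prints any active entry that contains control or non-ASCII bytes (odd bytes shown in octal, od style), and unlinks it if you answer y.

    #include <stdio.h>

    #define NAMELEN 14      /* old-style entries: 2-byte inode + 14-byte name */

    struct dentry {
        unsigned short  d_ino;
        char            d_name[NAMELEN];
    };

    main()
    {
        struct dentry d;
        char raw[NAMELEN + 1], answer[32];
        int fd, i, c, funny;

        if ((fd = open(".", 0)) < 0) {          /* 0 means read only */
            perror(".");
            exit(1);
        }
        while (read(fd, (char *)&d, sizeof d) == sizeof d) {
            if (d.d_ino == 0)                   /* inactive entry: pointer bytes are \0 \0 */
                continue;
            funny = 0;
            for (i = 0; i < NAMELEN && d.d_name[i] != '\0'; i++) {
                raw[i] = d.d_name[i];
                c = d.d_name[i] & 0377;
                if (c < 040 || c >= 0177)       /* control, DEL, or non-ASCII byte */
                    funny = 1;
            }
            raw[i] = '\0';
            if (!funny)
                continue;
            printf("remove ");
            for (i = 0; raw[i] != '\0'; i++) {
                c = raw[i] & 0377;
                if (c > 040 && c < 0177)
                    putchar(c);
                else
                    printf("\\%03o", c);        /* odd bytes in octal, od style */
            }
            printf(" (y/n)? ");
            if (fgets(answer, sizeof answer, stdin) == NULL)
                break;
            if (answer[0] == 'y' && unlink(raw) < 0)
                perror("unlink");
        }
        exit(0);
    }

On a system with a different directory format this will read garbage and should not be used; the hand od method above is the safer bet there.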
cak (06/13/82)
The best way I've found to remove files with non-ASCII chars is find. The problem is that when you glob, the names come out with the high order bit turned off -- but find doesn't pull this stunt. So, if you can come up with an expression you can pass to find, you can say
    find . -name '<expr>' -exec rm -f {} \;
and get rid of it. Of course, if there's no way to glob it (all the chars are funny, say control chars), this won't work. At this point, I'll endorse the dired program which appeared recently in net.sources -- it's great for this kind of bogosity.
Chris Kent, Purdue CS
dmmartindale (06/13/82)
It isn't very difficult to modify namei to simply disallow creation of files with any chars > 127 (or control chars, if you like) in the name. When most of the users are students, this can save a lot of time which would otherwise be spent helping people get rid of files they created by mistake.
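As a rough sketch of the check involved, here is the test written as a standalone function rather than an actual kernel patch (where and how it would be hooked into namei varies from system to system and isn't shown; the function name is invented for the example). It takes a null-terminated name and rejects any byte outside printable ASCII:

    /*
     * Return 1 if the name contains only printable ASCII characters,
     * 0 if it contains a control character, DEL, or a byte > 127.
     */
    int
    name_ok(name)
    char *name;
    {
        int c;

        while ((c = *name++ & 0377) != '\0')
            if (c < 040 || c >= 0177)
                return 0;
        return 1;
    }

The kernel would call something like this on each name component as it is copied in and refuse the creation (returning an error such as EINVAL) when the check fails.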