cdash@boulder.UUCP (11/25/87)
yesterday, i got bit by rm. I was remotely logged in to a system over a network and had created a bunch of temp files. to delete them, i naturally typed in "rm t*" only the %$*#&^#@ network managed to drop the "t" and you all know what happened then. It wasn't too bad because with the archiving we do it was only 2 hours to get them back. Of course yesterday's changes got lost and had to be redone. The point is that there are two things a command interface could do:
 1) protect us from our own stupidity (i'm not convinced it should)
 2) protect us from "extended system" errors like dropping a character
but i'm not sure how you separate the two.
--
cdash aka cdash@boulder.colorado.edu aka ...hao!boulder!cdash
      aka ...nbires!boulder!cdash aka (303) 593-3492
koreth@ssyx.ucsc.edu (Steven Grimm) (11/25/87)
In article <1257@boulder.Colorado.EDU> cdash@boulder.Colorado.EDU (Charles Shub) writes:
>yesterday, i got bit by rm. I was remotely logged in to a system over a network
>and had created a bunch of temp files. to delete them, i naturally typed in
>"rm t*" only the %$*#&^#@ network managed to drop the "t" and you all know what
>happened then. It wasn't too bad because with the archiving we do it was only 2
>hours to get them back. Of course yesterday's changes got lost and had to be
>redone. The point is that there are two things a command interface could do:
> 1) protect us from our own stupidity (i'm not convinced it should)
> 2) protect us from "extended system" errors like dropping a character
>but i'm not sure how you separate the two.

Sounds like you need better error detection in your network software more than a better "rm". If the problem is common, try using rm -i, which will prompt you for each filename.
--
Steven Grimm
ARPA: koreth@ucscb.ucsc.edu    UUCP: ...!ucbvax!ucscc!ssyx!koreth
"Let's see what's out there."
New! Improved! Now 100% Artificial -- with NutraSour(TM)! No natural colors or preservatives!
gwyn@brl-smoke.ARPA (Doug Gwyn ) (11/26/87)
In article <1257@boulder.Colorado.EDU> cdash@boulder.Colorado.EDU (Charles Shub) writes:
>... "rm t*" only the %$*#&^#@ network managed to drop the "t" ...

Some "network" that must be, to lose data at the user interaction level!
sunil@hpcllmv.HP.COM (Sunil Bhargava) (11/27/87)
I had posted a solution to the rm * problem earlier. This involved having a file called -i in your directory (aliasing mkdir to a mkdir + echo > -i). What is wrong with that solution?
chris@mimsy.UUCP (Chris Torek) (11/28/87)
In article <6840002@hpcllmv.HP.COM> sunil@hpcllmv.HP.COM (Sunil Bhargava) writes:
>I had posted a solution to the rm * problem earlier. This involved having
>a file called -i in your directory (aliasing mkdir to a mkdir + echo > -i).
>What is wrong with that solution?

Among other things, it does not always work:

	% rm tmp. *	(oops!)
	% rm dir/*	(hmm)

(The latter example reminds me of another thing: Why is it that people have so much trouble figuring out how to remove files whose names start with `-'?)
--
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7690)
Domain: chris@mimsy.umd.edu    Path: uunet!mimsy!chris
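The hole Chris points out is easy to reproduce: the decoy -i file only helps when the glob expands in the directory that contains it. A minimal POSIX-sh sketch (the scratch directory and filenames are invented for illustration):

```shell
# Reproduce the gap in the "decoy -i file" trick: a glob rooted in a
# subdirectory (rm dir/*) never expands to the ./-i decoy, so rm gets
# no -i flag and deletes without prompting.
set -e
dir=$(mktemp -d)
cd "$dir"
mkdir sub
: > sub/keep.txt
: > ./-i              # the decoy file the suggested mkdir alias would create
rm sub/*              # expands to "sub/keep.txt" only; the decoy is not included
[ ! -e sub/keep.txt ] && echo "decoy bypassed: sub/keep.txt removed"
```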
paul@tut.UUCP (11/29/87)
In article <6738@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
< In article <1257@boulder.Colorado.EDU> cdash@boulder.Colorado.EDU (Charles Shub) writes:
< >... "rm t*" only the %$*#&^#@ network managed to drop the "t" ...
<
< Some "network" that must be, to lose data at the user interaction level!

I can think of a very common "network" that does just that: rs232. Error checking? Why would we want that? 8-)

At OSU, we have a laser printer that talks only rs232, no error detection. It will drop bits and bytes quite regularly. Some types of workstation keyboards talk to the host using rs232 also. Then there is the phone problem: this sort of thing happens quite often to me when using my 1200 baud modem. My personal opinion is that if someone relies on async without any error checking (or just parity), they are asking for trouble... I do not mean to imply that Mr. Shub is not accurately describing the situation, just pointing to a possible cause.

Actually, this sort of thing has had me thinking about adding a shell builtin called "rm", which checks all of its (unexpanded) arguments to see if any are "*", asks for confirmation if they are, then does the usual path search (sorta like (ack, bletch, phooy) MessyDos).
--
Paul
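Paul's parenthetical about *unexpanded* arguments is the crux: an ordinary command or shell function cannot perform his check, because the shell has already replaced the "*" before the command runs. A small sh demonstration (the showargs helper is invented; it just echoes what it receives):

```shell
# Show that a command sees expanded filenames, never the "*" itself,
# which is why Paul's check would have to live inside the shell.
set -e
showargs() { printf 'arg: %s\n' "$@"; }   # stand-in for a wrapper "rm"
dir=$(mktemp -d)
cd "$dir"
: > t1
: > t2
showargs t*       # the function receives "t1" and "t2", not "t*"
```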
allan@didsgn.UUCP (allan) (11/30/87)
In article <1257@boulder.Colorado.EDU>, cdash@boulder.Colorado.EDU (Charles Shub) writes:
> yesterday, i got bit by rm. I was remotely logged in to a system over a network
> and had created a bunch of temp files. to delete them, i naturally typed in
> "rm t*" only the %$*#&^#@ network managed to drop the "t" and you all know what
> happened then. ...
> The point is that there are two things a command interface could do:
> 1) protect us from our own stupidity (i'm not convinced it should)
> 2) protect us from "extended system" errors like dropping a character
> but i'm not sure how you separate the two.

Isn't this a classic reliability problem for the network? Your "extended system" problem is really a faulty network problem. If your network (what type is it?) had supported reliable transfers, it would have detected the lost "t".

Allan G. Schrum
..!gatech!rebel!didsgn!allan
roger@celtics.UUCP (Roger B.A. Klorese) (12/02/87)
In article <9555@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
>Why is it that people have so much trouble figuring out how to remove
>files whose names start with `-'?

Because, unless one knows and fully understands that globbing is done by the shell and not the program, one would expect, as with other operating systems, that the process is:

- pick up the command and its options from the entered command
- pick up the filenames to which the command is to be applied from the
  supplied filenames, expanding wildcards if necessary

In fact, the actual process, which is (basically):

- expand all wildcards
- execute the full command line as it appears, with the command plucking
  its options from the expanded command line

...is counterintuitive... after all, an option is an option, and a filename a filename... (or so it goes)
--
/// Roger B.A. Klorese, CELERITY (Northeast Area)
\\\ 40 Speen St., Framingham, MA 01701  +1 617 872-1552
celtics!roger@necntc.nec.com - necntc!celtics!roger
gwyn@brl-smoke.ARPA (Doug Gwyn ) (12/02/87)
In article <1890@celtics.UUCP> roger@celtics.UUCP (Roger B.A. Klorese) writes:
>Because, unless one knows and fully understands that globbing is done by
>the shell and not the program, one would expect, as with other operating
>systems...

I guess this adds support to the notion that the normal UNIX shell interface is not suitable for naive users. Back when I was first learning to use UNIX, nobody considered it unreasonable to read the UNIX tutorial contained near the beginning of the user documents that came with the system. I'm sure that the notion that * and ? were processed by the shell was made clear there. In fact that is a key "win" of UNIX over OSes that make applications deal with globbing.
chris@mimsy.UUCP (Chris Torek) (12/02/87)
>In article <9555@mimsy.UUCP> I asked:
>>Why is it that people have so much trouble figuring out how to remove
>>files whose names start with `-'?

In article <1890@celtics.UUCP> roger@celtics.UUCP (Roger B.A. Klorese) writes:
>one [might] expect, as with other operating systems, that the process is:
>
>- pick up the command and its options from the entered command
>- pick up the filenames to which the command is to be applied from the
>  supplied filenames, expanding wildcards if necessary

Even if naive users do hold this (incorrect) belief, why, after discovering the file called `-b' in the current directory and trying to `rm -b' and then `rm -*' and `rm ?b', do these people not think `grr, what a stupid rm, it expands the ?b and *then* looks to see if it has an option ... how do I tell the <censored> thing not to do that?' Experience tells me that they *do* think this, which makes the basic problem worse yet. In an attempt to get them to learn something about the system, I sometimes answer with this:

 0. Every file has more than one name. Tell me another name for the
    file `-b' in your current directory.

It has yet to work, so I expand a bit:

 1. Where is your home directory?

After a small delay, I usually get the answer `~'.

 2. Now what is another name for the file `-b' in your home directory?

Sometimes I have to supply the answer for this, too:

 3. One is ~/-b. Notice anything special about this name versus that
    other one?

Most of them catch on at this point, but sometimes it takes two more questions:

 4. What is the first character of each of those two names?
 5. What character precedes rm options?

At the end of the six-questions game, they have all figured it out. Do they learn something? I wish I could tell. . . .
--
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7690)
Domain: chris@mimsy.umd.edu    Path: uunet!mimsy!chris
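The lesson the six questions lead to takes only a couple of commands: any name for the file that does not begin with `-' will do. A sketch (the scratch directory is invented for illustration):

```shell
# "-b" and "./-b" name the same file, but only the first begins with
# the character that marks rm options.
set -e
dir=$(mktemp -d)
cd "$dir"
: > ./-b           # create the awkwardly named file
rm ./-b            # an alternate name that cannot be mistaken for an option
[ ! -e ./-b ] && echo "removed"
```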
ok@quintus.UUCP (Richard A. O'Keefe) (12/03/87)
In article <1890@celtics.UUCP> roger@celtics.UUCP (Roger B.A. Klorese) writes:
> >Because, unless one knows and fully understands that globbing is done by
> >the shell and not the program, one would expect, as with other operating
> >systems...

In article <6774@brl-smoke.ARPA>, gwyn@brl-smoke.ARPA (Doug Gwyn) writes:
> In fact that is a key "win" of UNIX over OSes that make applications deal
> with globbing.

Have you ever used TOPS-10? That was a system where globbing was done by the program, not the shell. Result? No two programs had exactly the same syntax for file names (some would let you quote strange characters by writing octal, some wouldn't, some allowed directories, some didn't, &c &c). And of course user-programs and commands HAD to use different syntax... Doug Gwyn is absolutely right: doing file-name expansion in the shell so that EVERY command does it EXACTLY the same way is wonderful.

If rm did its own wild-carding, you'd STILL have the "rm *" problem (after all, someone might *mean* that) and you'd have the additional problem of not being quite sure what wild-carding it accepted. (Ever tried a System V where some utilities did Berkeleyish ~user expansion, but nothing else did? Uniformity!)

Here is another good way to lose files: restore a tar tape into the wrong directory. Wild-cards? What wild-cards? It's a really good way to lose files, because it looks as though you still have them... With respect to the author of PDTAR, sometimes absolute file names are exactly the right thing to have on a tar tape.
roger@celtics.UUCP (Roger B.A. Klorese) (12/03/87)
In article <337@cresswell.quintus.UUCP> ok@quintus.UUCP (Richard A. O'Keefe) writes:
|In article <1890@celtics.UUCP> roger@celtics.UUCP (Roger B.A. Klorese) writes:
|> >Because, unless one knows and fully understands that globbing is done by
|> >the shell and not the program, one would expect, as with other operating
|> >systems...
|
|In article <6774@brl-smoke.ARPA>, gwyn@brl-smoke.ARPA (Doug Gwyn) writes:
|> In fact that is a key "win" of UNIX over OSes that make applications deal
|> with globbing.
|
|Have you ever used TOPS-10? That was a system where globbing was done by
|the program, not the shell. Result? No two programs had exactly the
|same syntax for file names (some would let you quote strange characters
|by writing octal, some wouldn't, some allowed directories, some didn't,
|&c &c). And of course user-programs and commands HAD to use different
|syntax... Doug Gwyn is absolutely right: doing file-name expansion in
|the shell so that EVERY command does it EXACTLY the same way is wonderful.

Have *you* ever used Multics or PRIMOS? The decision of which types of pattern matches are expanded by the command interpreter and which by the program can be set in the linking process.

My point, really, is that we need some way of determining in a command line which patterns are *filenames* - these should be expanded by the shell - and which are *(option patterns, network-node wildcards, etc.)* - things which cannot be expanded or pattern-matched against a directory, and should be passed to the program for expansion.
--
/// Roger B.A. Klorese, CELERITY (Northeast Area)
\\\ 40 Speen St., Framingham, MA 01701  +1 617 872-1552
celtics!roger@necntc.nec.com - necntc!celtics!roger
barmar@think.UUCP (12/04/87)
In article <1895@celtics.UUCP> roger@celtics.UUCP (Roger B.A. Klorese) writes:
>In article <337@cresswell.quintus.UUCP> ok@quintus.UUCP (Richard A. O'Keefe) writes:
>|Have you ever used TOPS-10? That was a system where globbing was done by
>|the program, not the shell. Result? No two programs had exactly the
>|same syntax for file names (some would let you quote strange characters
>|by writing octal, some wouldn't, some allowed directories, some didn't,
>|&c &c). And of course user-programs and commands HAD to use different
>|syntax... Doug Gwyn is absolutely right: doing file-name expansion in
>|the shell so that EVERY command does it EXACTLY the same way is wonderful.
>
>Have *you* ever used Multics or PRIMOS? The decision of which types of
>pattern matches are expanded by the command interpreter and which by the
>program can be set in the linking process.

Not QUITE correct, at least in the Multics case. Wildcards are never interpreted by the command processor. If a command wishes to treat an argument as a pathname it calls expand_pathname_ on it (this translates relative paths to absolute paths, as the system calls only accept absolute paths) and if wildcard expansion is appropriate then it calls hcs_$star_ (or one of a few variants).

If TOPS-10 doesn't provide a library that allows all commands to operate consistently, that is a TOPS-10 problem, but don't assume that all systems that don't expand in the shell are just like TOPS-10.

You complained about some programs allowing directories and some not. On both Unix and Multics users are required to use different commands to delete files and directories. It would make sense that in "rm *" the * should not match directories, and "rmdir *" should not match files. I know this isn't really a good answer, but on Multics it is possible to make any command that doesn't accept wildcards do so, using command line functions. If a command doesn't do wildcard processing, you can do:

	command [files <wildcard>]

(the [...] syntax on Multics is similar to Unix's `...`); the files function returns the pathnames of files that match the wildcard. Yes, I realize that this can be done on Unix using `find ...`.
---
Barry Margolin
Thinking Machines Corp.
barmar@think.com    seismo!think!barmar
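Barry's closing remark can be made concrete in sh: backquote command substitution plays roughly the role of the Multics [files ...] active function, handing a wildcard-ignorant command an explicit list of pathnames. A sketch (the filenames and scratch directory are invented):

```shell
# Emulate "[files *.txt]" with command substitution: find selects plain
# files matching the pattern, so even "rm" cannot touch the directory.
set -e
dir=$(mktemp -d)
cd "$dir"
: > a.txt
: > b.txt
mkdir subdir
rm `find . -type f -name '*.txt'`   # rm is handed ./a.txt and ./b.txt only
[ -d subdir ] && echo "files gone, directory kept"
```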
denbeste@bgsuvax.UUCP (12/04/87)
in article <9593@mimsy.UUCP>, chris@mimsy.UUCP (Chris Torek) says:
> In article <9555@mimsy.UUCP> I asked:
> >Why is it that people have so much trouble figuring out how to remove
> >files whose names start with `-'?
.. What else could you call the file to pacify rm ...
> 3. One is ~/-b. Notice anything special about this name
>    versus that other one?

I like this one. It is much more general than the method I use. I tell them to type:

	% man rm

and notice that rm has a (null) option, -, that causes it to not treat args beginning with a - as a switch, so all you have to do is type:

	% rm - -b

Of course, this one only works for rm, whereas Chris's will work for any program. I would be tempted to use ./-b myself, since it is independent of where you are.
---
William C. DenBesten
CSNET denbeste@research1.bgsu.edu
UUCP ...!cbosgd!osu-cis!bgsuvax!denbeste
Dept of Computer Science, Bowling Green State University, Bowling Green, OH 43403-0214
gwyn@brl-smoke.ARPA (Doug Gwyn ) (12/05/87)
In article <1413@bgsuvax.UUCP> denbeste@bgsuvax.UUCP (William C. DenBesten) writes:
>and notice that rm has a (null) option, -, that causes it to not treat args
>beginning with a - as a switch, so all you have to do is type:

Great, another violation of the command syntax standard, and one that is incompatible with previous practice as well. "-" traditionally has been shorthand for the standard input, as in

	myprog | cat header - trailer | troff | dimp

On sane systems, "--" on practically ANY command marks the end of the option arguments. It is much better to have a universal rule than a special hack in a particular command.
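The universal rule in practice, on a system whose rm follows the convention Doug describes (a sketch; the scratch directory is invented):

```shell
# "--" marks the end of options; everything after it is an operand,
# even a name that begins with "-".
set -e
dir=$(mktemp -d)
cd "$dir"
: > ./-b
rm -- -b           # rm treats "-b" as a filename, not as an option
[ ! -e ./-b ] && echo "removed"
```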
ggs@ulysses.homer.nj.att.com (Griff Smith) (12/06/87)
In article <12726@think.UUCP>, barmar@think.UUCP writes:
> >In article <337@cresswell.quintus.UUCP> ok@quintus.UUCP (Richard A. O'Keefe) writes:
> >|
> >|Have you ever used TOPS-10? That was a system where globbing was done by
> >|the program, not the shell. Result? No two programs had exactly the
> >|same syntax for file names (some would let you quote strange characters
> >|by writing octal, some wouldn't, some allowed directories, some didn't,
> >|&c &c).
>
> If TOPS-10 doesn't provide a library that allows all commands to
> operate consistently, that is a TOPS-10 problem, but don't assume that
> all systems that don't expand in the shell are just like TOPS-10.

As a veteran of the TOPS-10 SCAN/WILD wars, I agree with the first complaint. When command parsing is done in the application program there is too much opportunity for "creativity". When it's done in the shell it may be less beautiful, but it's more likely to be consistent.

DEC did have a library that "simplified" directory scanning ("WILD") and command arg parsing ("SCAN"). It required the user to build tables of expected arguments, default values, etc. Some people took the trouble to learn how to use it; many gave up and did their own quick and dirty versions. Among my associates it was a badge of honor to have written a few command scanners ("I'll look at yours if you'll admire mine"). The net result was confusion, just as with UNIX System commands that don't follow the conventions. On the plus side, commands that did use SCAN/WILD were much more robust than most UNIX commands; none of this crap about passing "junk" to atoi() and silently getting zero.
--
Griff Smith    AT&T (Bell Laboratories), Murray Hill
Phone: 1-201-582-7736
UUCP: {allegra|ihnp4}!ulysses!ggs
Internet: ggs@ulysses.uucp
bzs@bu-cs.bu.EDU (Barry Shein) (12/06/87)
I think we've just come back to the age-old problem of "standards vs. innovation". At what point will the shell no longer be the shell, and we may as well just leave the old one (and its documentation) intact and go on to the Nsh or whatever we want to call it? This is essentially what the Korn shell is; its intense resemblance to previous shells is merely a design decision rather than a mandate the author was required to work with (i.e. a modifier of the Bourne shell should probably feel much more pressure for backward compatibility before offering a replacement rather than an alternative.)

I could certainly see Roger Klorese's idea which he brings in from Multics implemented in a Macintoshian sort of way with resource files; a one-line text pattern a la getopt would probably suffice for starters. A program (shell) with such facilities could then be distributed and the community might decide. That's all the ksh did, and it seems to have been quite successful in attracting fans.

I don't see any other magic way to introduce features like this; nothing gets incorporated into standards for Unix without first coming into common use. This isn't OSI, where one gets a promise from everyone in the world to remove the inherent risk before any implementation occurs; I think many of us question that sort of approach. Of course, sounding out ideas for reaction is certainly not harmful, but its value is somewhat muted; it's hard to intelligently judge the utility of software without being able to actually put it to use.

-Barry Shein, Boston University
mikep@ism780c.UUCP (Michael A. Petonic) (12/06/87)
In article <1895@celtics.UUCP> roger@celtics.UUCP (Roger B.A. Klorese) writes:
>My point, really, is that we need some way of determining in a command
>line which patterns are *filenames* - these should be expanded by the
>shell - and which are *(option patterns, network-node wildcards, etc.)* -
>things which cannot be expanded or pattern-matched against a directory,
>and should be passed to the program for expansion.

Hey, wait a minute. This is easily accomplished... Use the backslash. I know, I know, the backslash is usually located out of the way, but that's a wimpy excuse.

I couldn't see adding a "feature" like smart expansion and thoroughly modifying the shell (and possibly the kernel) just to make things slightly simpler for a select few. The rest of us are used to globbing in the shell and use the backslash when we don't want globbing. No problem.

-MikeP
--------
Michael A. Petonic    (213) 453-8649 x3247
INTERACTIVE Systems Corporation
2401 Colorado Blvd., Santa Monica, CA. 90404
"My opinions in no way influence the price of tea in China."
{sdcrdcf|attunix|microsoft|sfmin}!ism780c!mikep
roger@celtics.UUCP (12/08/87)
In article <8145@ism780c.UUCP> mikep@ism780c.UUCP (Michael A. Petonic) writes:
|In article <1895@celtics.UUCP> roger@celtics.UUCP (c'est moi) writes:
|>My point, really, is that we need some way of determining in a command
|>line which patterns are *filenames* - these should be expanded by the
|>shell - and which are *(option patterns, network-node wildcards, etc.)* -
|>things which cannot be expanded or pattern-matched against a directory,
|>and should be passed to the program for expansion.
|
|Hey, wait a minute. This is easily accomplished... Use the
|backslash. I know, I know, the backslash is usually located out
|of the way, but that's a wimpy excuse.
|
|I couldn't see adding a "feature" like smart expansion and thoroughly
|modifying the shell (and possibly the kernel) just to make things
|slightly simpler for a select few. The rest of us are used to
|globbing in the shell and use the backslash when we don't want
|globbing. No problem.

Yeah, but the question is not when the *user* doesn't want globbing, but when globbing by the shell is *inappropriate* to the command, and the user expects wildcards to do pattern matches against *the appropriate pool of selections*, not against *filenames*. Explain to a user, please, why the user can get a list of filenames beginning with "cel" by typing

	ls cel*

...but in order to get a list of adjacent network nodes beginning with "cel", using the hypothetical "netlist" command, the user must type

	netlist cel\*

...makes no sense to me. (An even better example would be installations of products like Technology Concepts' "CommUnity" (alias Celerity's Accelnet/DNI). Why can I list local and NFS'd files using "ls cel*" but can only list files on a DECnet-connected system using "dnals cel\*"?)
--
/// Roger B.A. Klorese, CELERITY (Northeast Area)
\\\ 40 Speen St., Framingham, MA 01701  +1 617 872-1552
celtics!roger@necntc.nec.com - necntc!celtics!roger
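The asymmetry is easy to see with a stand-in for the hypothetical netlist (the shownodes function below is invented; it simply prints the one argument it receives):

```shell
# What the program sees with and without the backslash: the shell either
# substitutes matching filenames or passes the pattern through literally.
set -e
shownodes() { printf 'pattern seen by program: %s\n' "$1"; }
dir=$(mktemp -d)
cd "$dir"
: > celtic
shownodes cel*     # shell expands against filenames: program sees "celtic"
shownodes cel\*    # escaped: program sees "cel*" and could match nodes itself
```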
jc@minya.UUCP (John Chambers) (12/12/87)
> Explain to a user, please, why the user can get a list of filenames beginning
> with "cel" by typing
>	ls cel*
> ...but in order to get a list of adjacent network nodes beginning with "cel",
> using the hypothetical "netlist" command, the user must type
>	netlist cel\*
> ...makes no sense to me.

Nor to me. And there are "network" Unix systems where it works. For example, with the Newcastle Connection, you can find out about network nodes at the same level as your system by typing:

	ls /../*

Other wildcard expansions work similarly, because "/../" is implemented as a normal directory. (It is actually a special file, of course, within which is hidden the network.)

On a more general note, part of the problem is the widespread violation of the object-oriented design of Unix. Objects are, of course, called "files", and operators are called "processes". If you have a set of objects, you make them a set by linking them into a directory. You can then use various operators (such as ls or find or wildcard expansion) to extract subsets. There is no reason that this can't be used for network nodes, as it can be used for programs, directories, source files, object files, disk drives, or anything else whose name can be entered in a directory.

Instead, most network implementations are kludges that violate this useful design, usually by introducing some sort of special syntax for network nodes (host:file, host!user, something@host, etc.) which the standard Unix library programs don't understand, and which doesn't follow the model of a tree of directories. Then we go through the process of "reinventing the wheel", trying all sorts of ways to use a bad design, when we already had one that works well. It's really another case of Henry Spencer's .signature:

	Those who don't understand Unix are re-inventing it, poorly.

Eventually, people might realize that networked systems, like multi-disk file systems, should simply be combined in the same hierarchy that looks the same from any vantage point. Anything else (like Sun's NFS) is simply an interim kludge that interferes with effective applications of simple tools.
--
John Chambers <{adelie,ima,maynard,mit-eddie}!minya!{jc,root}> (617/484-6393)
reggie@pdn.UUCP (George W. Leach) (12/16/87)
In article <431@minya.UUCP>, jc@minya.UUCP (John Chambers) writes:
[discussion of how wildcard characters break with most network access to files on remote machines]
> ...makes no sense to me.
> Nor to me. And there are "network" Unix systems where it works. For example,
> with the Newcastle Connection, you can find out about network nodes at the
> same level as your system by typing:
>	ls /../*
> Other wildcard expansions work similarly, because "/../" is implemented as
> a normal directory. (It is actually a special file, of course, within which
> is hidden the network.)
> On a more general note, part of the problem is the widespread violation of
> the object-oriented design of Unix. Objects are, of course, called "files",
> and operators are called "processes"............
[text deleted here: discussion of how most networking solutions are a kludge that break the model of UNIX, where any and all operations that act upon files are not necessarily valid across a network connection]

The Eighth Edition UNIX addressed this problem as well. While at Bellcore I worked on a machine that was part of a collection of machines running v8, all connected to a Datakit switch, that formed a natural hierarchy above the various machines: /n/indra/i5/reggie would access my home directory from one of the other machines on the switch, e.g. koura, vishnu, matha, .....

Although we were disconnected after divestiture, there also existed a network of these Datakit switches. The file system hierarchy was extended to provide names for the site (e.g. mh for Murray Hill) and the particular lab in which the desired machine resided. For example, to reach my brother at Bell Labs in Liberty Corners, NJ, the path would have been /lc/garage/pierce/jpl2.

For more information on the v8 file system, the naming problem with networks of UNIX machines, and other similar solutions (e.g. the Newcastle Connection) and problems (e.g. the IBIS remote file system on 4.2 BSD) see:

	Rob Pike and P.J. Weinberger, "The Hideous Name",
	Summer 1985 USENIX Conference Proceedings,
	Portland, Oregon, June 11-14, 1985, pp. 563-568.

Perhaps someone at AT&T could expand upon this theme and describe the current scheme.
--
George W. Leach    Paradyne Corporation
{gatech,rutgers,attmail}!codas!pdn!reggie
Mail stop LF-207, P.O. Box 2826, Largo, FL 34649-2826
Phone: (813) 530-2376