barmar@think.COM (Barry Margolin) (12/02/87)
In article <6774@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>In fact that is a key "win" of UNIX over OSes that make applications deal with
>globbing.

Ahah, now you've hit one of my favorite complaints about Unix.

I do NOT think it is such a win that wildcard expansion is done by the shell, at least not when it is done in the haphazard style that Unix shells use. It assumes that all commands take filenames as arguments, and that any argument with wildcard characters is supposed to be a filename.

A very common counterexample is grep. Its first argument will often contain wildcard characters, for example

	grep "foo.*bar" <files>

I wonder how many new users get screwed when they forget to quote the first argument and it says "No match", so they assume that none of the files contain the pattern. (I think the Bourne shell "solved" this problem by making unmatched tokens expand into themselves, but the C shell just aborts the line.)

Other commands may want to take wildcards, although not necessarily to match filenames; for example

	who bar*

should list all the logged-in users whose names begin with "bar" (equivalent to "who | grep '^bar'").

It should be up to the command to decide the appropriate context for treating arguments as pathnames and performing wildcard expansion. This way, a command that knows it is dangerous, such as rm, can check whether it was called with a wildcard and perhaps be more careful. On Multics, the delete command does exactly this, querying "Are you sure you want to 'delete **' in <directory>?" unless -force is specified.

Globbing in the shell also severely limits the syntax of commands; I will admit that this could be seen as a benefit, because it forces conformity, but sometimes a minor syntax change can be useful. For example, there's no way to write a version of the cp or mv commands that takes an alternating list of source and destination pathnames, where the source pathnames are permitted to have wildcards.
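[The quoting hazard described above is easy to reproduce. A small Bourne-shell demonstration — every filename here is invented for the example:]

```shell
demo=/tmp/grepdemo.$$ ; mkdir "$demo" ; cd "$demo"
echo fooXbar > file1 ; echo nothing > file2 ; touch foo.abar

# Unquoted: the shell expands foo.*bar to the matching file foo.abar,
# so grep ends up searching for the pattern "foo.abar" instead.
grep -l foo.*bar file1 file2 || echo '(no match reported)'

# Quoted: grep receives the regular expression itself.
grep -l "foo.*bar" file1 file2          # prints: file1
```

[Under the Bourne shell the glob happens to match a file here; under csh, if no file had matched, the whole line would have aborted with "No match" and grep would never have run at all.]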
You also can't do something like Multics's

	rename foo.** foo.bar.==

(the == is replaced by whatever the ** matched) without writing a complicated script that uses grep and sed on the output of ls.

Finally, even when an argument is a pathname, it is sometimes not allowed to be multiple files. For example, diff takes pathnames, but it requires exactly two of them, and ar allows only one archive pathname to be specified. On Multics, a command with a syntax like this can check whether the argument contains wildcards and complain. Diff can check that it received exactly two pathnames, but it won't know whether this is simply because one wildcard happened to match exactly two files (maybe this was intentional on the user's part, but maybe it wasn't), and ar will simply treat the extra arguments as member files.

So does this mean that globbing MUST be done by the commands themselves? Well, yes and no. This is how it is done on Multics, although the actual matching is done by a system call for filenames (for efficiency: since Multics directories are not directly readable by user-mode code, this saves lots of data copying) and by a library subroutine for non-filenames. Some more modern systems allow commands to provide information to the command processor that tells it how to do the automatic parsing; in this case, this data would specify which arguments are pathnames that allow wildcards, and the command processor would automatically perform the expansion in the right cases.
---
Barry Margolin
Thinking Machines Corp.

barmar@think.com
seismo!think!barmar
gandalf@russell.STANFORD.EDU (Juergen Wagner) (12/03/87)
What you need may be something like the TOPS-20 style of interactive help facility. For many commands you can just hit the '?' key to get some hint on what is supposed to come next. This, however, requires some kind of interface to the program to be invoked. Encountering a line

	% foo -l frob

followed by typing a special character (e.g. '?'), your shell could fork

	"foo -help -l frob"

or something similar. This requires a convention on how to request this piece of help information from the program.

Another way (much simpler) is to do it in Macintosh style: for each program, keep a (canned text) help file around in a special subdirectory, which is typed out when the help char is encountered.

I reckon neither of these facilities would be too difficult to build into existing shells.

Juergen Wagner, gandalf@Russell.stanford.edu
Center for the Study of Language and Information (CSLI), Stanford CA
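[The simpler canned-help-file variant might be sketched like this; the helper name and the help directory are made up for illustration, not an existing facility:]

```shell
# show_help: print the canned help text for a command, if any exists.
# HELPDIR is a hypothetical location; a real shell would pick its own.
HELPDIR=${HELPDIR-/usr/lib/help}

show_help() {
	if [ -r "$HELPDIR/$1" ]
	then cat "$HELPDIR/$1"
	else echo "no help available for $1"
	fi
}
```

[A shell with a bound help character would call something like show_help on the first word of the pending command line.]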
gandalf@russell.STANFORD.EDU (Juergen Wagner) (12/03/87)
Sorry, I didn't come to the point in my last message. Of course, with this kind of message passing to a program, the shell could get much more information on how to parse the command line arguments.

Sorry,
--Juergen
gwyn@brl-smoke.ARPA (Doug Gwyn ) (12/03/87)
In article <12441@think.UUCP> barmar@sauron.UUCP (Barry Margolin) writes:
>For example, there's no way to write a version of the cp or mv commands
>that takes an alternating list of source and destination pathnames,

	$ apply -2 cp src1 dst1 src2 dst2 src3 dst3

or

	$ echo 'src1 dst1
	src2 dst2
	src3 dst3' | while read s d; do cp $s $d; done

>where the source pathnames are permitted to have wildcards.

If you allow this, then how do you ensure 1-1 correspondence between sources and destinations? Or are you assuming the destinations must be directories? If so, then a modification of the second method will work, if the wildcard arguments are quoted and "eval" is used to do the cp.

>You also can't do something like Multics's
>
>	rename foo.** foo.bar.==
>
>(the == is replaced by whatever the ** matched) without writing a
>complicated script that used grep and sed on the output of ls.

	$ for i in foo.*; do mv $i `echo $i | sed 's/foo/&.bar/'`; done

Notice that with this approach I can perform types of renaming that are well beyond the built-in ability of Multics's "rename". If I had to do any of these operations very often, I would make scripts or shell functions rather than type them in by hand each time.

The advantage of the UNIX "toolkit" approach to such issues is that the user of a command like "cp" is not limited to what the command's designer was able to anticipate. I actually do things analogous to my last example routinely, interactively, in ways that I am sure would never have been built into any utility command. I also encounter "integrated" utilities that attempt to offer all the functionality everyone could ever want, and somehow they usually don't seem to be able to do what I need.

The price paid for UNIX's flexibility is that one needs to learn how to use the tools effectively. But I would think that the expert user would want to learn that anyway.
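[The eval variant hinted at above — directory destinations, with the wildcard kept unexpanded until cp runs — might be sketched like this; all the filenames are invented for the demonstration:]

```shell
demo=/tmp/cpdemo.$$ ; mkdir "$demo" ; cd "$demo"
mkdir dst1 dst2
touch a1.txt a2.txt b1.txt

# One "pattern directory" pair per line; read leaves the pattern
# unexpanded, and eval expands it only when cp is finally invoked.
echo 'a*.txt dst1
b*.txt dst2' | while read s d; do eval cp "$s" "$d"; done

ls dst1        # lists a1.txt and a2.txt
```

[The 1-1 correspondence problem goes away here precisely because each destination is a directory, able to receive however many files its pattern matched.]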
paulsc@orca.UUCP (12/04/87)
I remember a couple people on "system staff" at one site talked about adding a "command line" argument to main(). Something like:

	int main(int argc, char *argv[], char *command_line);

They sounded serious and had the authority to do it. I never found out if they actually tried it or not.

The "new" exec*() system call(s) could take an extra argument (a pointer to the command line string). The "old" exec*() system call(s) could supply an empty string or NULL pointer or some other reasonable facsimile for a command line. The programs that make the most difference could be fixed first (the shells, rm, ...). Programs that didn't care about the command line could be left alone. Other programs could wait until you got to them. In particular, rm could check command_line for magic characters and behave in a more friendly manner.

Your special "fixed" programs wouldn't be directly portable to machines without your special kernel, but I think the differences would be manageable as "#ifdef"'s. (This would have been on PDP11's and VAX780's, so there would not be problems with programs that didn't know about the extra argument to main().)

I won't say it would be easy, but it certainly sounds doable. The above steps could be done one at a time, without leaving you with an unusable system in-between. On many machines, programs that don't know about the extra argument wouldn't be affected.

I think it sounds like an interesting experiment. If someone hasn't already done it, it would probably also make a nice USENIX paper.

Paul Scherf, Tektronix, Box 1000, MS 61-033, Wilsonville, OR, USA
paulsc@orca.GWD.Tek.COM tektronix!orca!paulsc
chris@mimsy.UUCP (Chris Torek) (12/04/87)
One suggestion I have heard, which makes some sense to me, would be to have the shell provide an environment variable giving the exact command line used to invoke any particular command:

	Yes master? rm a.out core *.o *.s *	.i
	                                  -----
	                                  oops, a tab!

would run `rm' with

	"CMDLINE=rm a.out core *.o *.s *\t.i"

in its environment. Programs that deem themselves potential troublemakers (rm) could then look at the original command line. Programs that wanted special globbing (grep) could do their own command line parsing, assuming that `glob' was provided as a separate program or a library routine. Programs that do not care (all the rest) would ignore $CMDLINE.
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7690)
Domain: chris@mimsy.umd.edu	Path: uunet!mimsy!chris
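[A rough sketch of how a troublemaker like rm might use such a variable. Everything here is hypothetical — the function name is invented, and CMDLINE is assumed to have been exported by a modified shell before the exec:]

```shell
# careful_rm: a hypothetical cautious rm that inspects the
# shell-supplied $CMDLINE for glob characters before doing real work.
careful_rm() {
	case "${CMDLINE-}" in
	*\**|*\?*|*\[*)
		echo "careful_rm: wildcards in original line: $CMDLINE" ;;
	esac
	# ...a real version would now query the user, then unlink "$@"...
}

CMDLINE='rm a.out core *.o *.s * .i'   # what the shell would have exported
careful_rm a.out core x.o y.s
```

[The point is that argv alone cannot distinguish "rm x.o y.o" typed out longhand from "rm *.o"; the pre-expansion line can.]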
howie@cunixc.UUCP (12/04/87)
In article <889@russell.STANFORD.EDU> gandalf@russell.Stanford.edu (Juergen Wagner) writes:
>What you need may be something like the TOPS-20 style of interactive help
>facility. For many commands you can just hit the '?' key to get some hint
>on what is supposed to come next. This, however, requires some kind of
>interface to the program to be invoked. Encountering a line
>	% foo -l frob
>followed by typing a special character (e.g. '?') your shell could fork
>	"foo -help -l frob"

We have developed a package which runs under BSD, sysV, and msdos, which implements TOPS-20-like help and command completion. I gave a short work-in-progress talk on it at the summer Usenix; see the September ;login: for a description. We have started writing various applications, the largest of which is like the TOPS-20 MM (mail manager) program. I think some work has gone towards a shell also, though that hasn't progressed too far. The shell would really need to be rewritten from scratch, and we don't have the time.
------------------------------------------------------------
Howie Kaye			howie@columbia.edu
Columbia University		hlkcu@cuvma.bitnet
Systems Group			...!rutgers!columbia!howie
gwyn@brl-smoke.ARPA (Doug Gwyn ) (12/04/87)
In article <9610@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
>	Yes master? rm a.out core *.o *.s *	.i
>would run `rm' with
>	"CMDLINE=rm a.out core *.o *.s *\t.i"
>in its environment.

What about

	rm `find . -name '*.o' -print`

That wouldn't do a hell of a lot of good in CMDLINE. If you argue then that the "expanded" command line should be in CMDLINE, well, how does that differ from the argv array?

Nope, the UNIX shell approach does this right. "User-friendly" interfaces can do what they want; as I've said before, it is unwise to try to make the UNIX shell environment into a naive-user interface.
eichin@athena.mit.edu (Mark W. Eichin) (12/05/87)
Rather than

>	int main(int argc, char *argv[], char *command_line);

I believe the un!x main is already of the form

	int main(int argc, char *argv[], char *environ[]);

(this is from the 4.3BSD man page for execl(3), rewritten as a prototype). Since we are talking about `system people' I assume we are not concerned with ANSI but with existing stuff.

Why break anything, when you could just pass an environment variable in? This would mean a modified exec, by wrapping something around it in the C library... then the child could do a getenv("INVOCATION"), and you could perhaps use this in shell scripts too. In fact...

	execve(name, argv, envp)
	char *name, *argv[], *envp[];	/* from execve(2) man page */
	{
		setenv(envp, "INVOCATION", name);
		return(real_execve(name, argv, envp));
	}

is all you would need [the syntax for setenv should be obvious and the code is left as an exercise for the reader :-] to experiment with the idea.

Mark Eichin <eichin@athena.mit.edu>
Disclaimer: The opinions and indenting style in this posting are mine, and not those of MIT or Project Athena. Of course, you can't prove they are mine, either... scary, isn't it.
gwyn@brl-smoke.ARPA (Doug Gwyn ) (12/05/87)
In article <2434@orca.TEK.COM> paulsc@orca.UUCP (Paul Scherf) writes:
>	int main(int argc, char *argv[], char *command_line);

Please don't violate system interface standards like this. Alternative methods are available. Thank you.
allbery@ncoast.UUCP (12/05/87)
As quoted from <12441@think.UUCP> by barmar@think.COM (Barry Margolin):
+---------------
| In article <6774@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
| >In fact that is a key "win" of UNIX over OSes that make applications deal with
| >globbing.
| 
| Ahah, now you've hit one of my favorite complaints about Unix.
| 
| I do NOT think it is such a win that wildcard expansion is done by the
| shell, at least not when it is done in the haphazard style that Unix
| shells use. It assumes that all commands take filenames as arguments,
| and that any argument with wildcard characters is supposed to be a
| filename.
+---------------

A counterargument is that your examples are contrary to the spirit of Unix; in particular, allowing "who" to perform globbing on user names isn't a whole lot different from "ls -[A-Za-z]".

+---------------
| Finally, even when an argument is a pathname, it is sometimes not
| allowed to be multiple files. For example, diff takes pathnames, but
| it requires exactly two of them, and ar allows only one archive
| pathname to be specified. On Multics, a command with a syntax like
| this can check whether the argument contains wildcards and complain.
+---------------

So, how many times have you abbreviated a filename which you know is unique with a well-placed "*"? I do it all the time, and I'd get mighty upset if the shell were changed such that this wouldn't work in, for example, "cd".

As for your "smart shell" -- having worked on one, I can say that it's not the best of solutions. What should the default case be for a new command? And have you *any* idea how much work it is to build command descriptions for all the regular commands? Not to mention describing the idiosyncrasies of various non-conformant commands (such as "tail -/[0-9]*[cbl]?f?/", or (perhaps especially) "dd")?
-- 
Brandon S. Allbery	necntc!ncoast!allbery@harvard.harvard.edu
{hoptoad,harvard!necntc,cbosgd,sun!mandrill!hal,uunet!hnsurg3}!ncoast!allbery
Moderator of comp.sources.misc
stpeters@dawn.steinmetz (Dick St.Peters) (12/05/87)
In article <9610@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
>One suggestion I have heard, which makes some sense to me, would
>be to have the shell provide an environment variable giving the
>exact command line used to invoke any particular command:

I like this!

(It pains me to say it, but I think you can get the unglobbed command line in VMS: lib$get_command or some such.)
--
Dick St.Peters
GE Corporate R&D, Schenectady, NY
stpeters@ge-crd.arpa
uunet!steinmetz!stpeters
mikep@ism780c.UUCP (Michael A. Petonic) (12/06/87)
In article <9610@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
>One suggestion I have heard, which makes some sense to me, would
>be to have the shell provide an environment variable giving the
>exact command line used to invoke any particular command:
>
>	Yes master? rm a.out core *.o *.s *	.i
>	                                  -----
>	                                  oops, a tab!
>
>would run `rm' with
>
>	"CMDLINE=rm a.out core *.o *.s *\t.i"
>
>in its environment. Programs that deem themselves potential
>troublemakers (rm) could then look at the original command line.
>Programs that wanted special globbing (grep) could do their own
>command line parsing, assuming that `glob' was provided as a separate
>program or a library routine. Programs that do not care (all the
>rest) would ignore $CMDLINE.

Ahhh, looks good, initially. But suppose the guy typed:

	rm /tmp/*	.i
	         ----
	         a real, wanted tab

and really did want to remove everything out of /tmp and also ".i" in the current directory? What would rm do? Do a y/n query? I guess this wouldn't be too much of a hassle in interactive mode, but what happens if rm is being run in a shell script and that's exactly what he wanted? Would rm have to have a special case for running in a shell script or in the background?

The way I feel about it, introducing "intelligence" in standard utilities beyond what was originally there tends to cloud up UNIX. If one keeps up this "creeping featurism", one runs the risk of having so many exceptions to general rules that UNIX becomes unwieldy.

-MikeP
--------
Michael A. Petonic			(213) 453-8649 x3247
INTERACTIVE Systems Corporation		"My opinions in no way influence
2401 Colorado Blvd.			the price of tea in China."
Santa Monica, CA. 90404
{sdcrdcf|attunix|microsoft|sfmin}!ism780c!mikep
wesommer@athena.mit.edu.UUCP (12/07/87)
In article <6356@ncoast.UUCP> allbery@ncoast.UUCP (Brandon Allbery) writes:
>In article <12441@think.UUCP> by barmar@think.COM (Barry Margolin) writes:
>> Finally, even when an argument is a pathname, it is sometimes not
>> allowed to be multiple files. For example, diff takes pathnames, but
>> it requires exactly two of them, and ar allows only one archive
>> pathname to be specified. On Multics, a command with a syntax like
>> this can check whether the argument contains wildcards and complain.
>So, how many times have you abbreviated a filename which you know is unique
>with a well-placed "*"?

Rarely; it all depends on what you are used to. On unix, using tcsh, I hit 'TAB', and have the shell or emacs complete the filename for me. On Multics, segments generally have short add_names which are easy to type -- for example, ">user_dir_dir>Multics>Margolin" is otherwise known as ">udd>m>barmar".

Bill Sommerfeld
wesommer@athena.mit.edu
henry@utzoo.UUCP (Henry Spencer) (12/08/87)
> ...I think the Bourne shell "solved" this
> problem by making unmatched tokens expand into themselves, but the C
> shell just aborts the line...

Well, if you will insist on using an obsolete (and non-standard) shell, them's the breaks. (only 1/2 :-))
-- 
Those who do not understand Unix are |  Henry Spencer @ U of Toronto Zoology
condemned to reinvent it, poorly.    | {allegra,ihnp4,decvax,utai}!utzoo!henry
dcornutt@murphy.UUCP (Dave Cornutt) (12/08/87)
In article <12441@think.UUCP>, barmar@think.COM (Barry Margolin) writes:
> In article <6774@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
> >In fact that is a key "win" of UNIX over OSes that make applications deal with
> >globbing.
> 
> Ahah, now you've hit one of my favorite complaints about Unix.
> 
> I do NOT think it is such a win that wildcard expansion is done by the
> shell, at least not when it is done in the haphazard style that Unix
> shells use. It assumes that all commands take filenames as arguments,
> and that any argument with wildcard characters is supposed to be a
> filename.
> 
> A very common counterexample is grep. Its first argument will often
> contain wildcard characters, for example
> 
> 	grep "foo.*bar" <files>

The problem here is that there are so many punctuation characters that are special to the shell that you have to get in the habit of quoting the pattern anyway, just to be safe. I do agree with you that the "No match" is confusing to novice users. But then, they shouldn't be using C shell anyway.

> I wonder how many new users get screwed when they forget to quote the
> first argument and it says "No match" so they assume that none of the
> files contain the pattern (I think the Bourne shell "solved" this
> problem by making unmatched tokens expand into themselves, but the C
> shell just aborts the line).

If you set "nonomatch", it behaves like the Bourne shell. This is documented; it's been in there at least since 4.2.

> Other commands may want to take
> wildcards, although not necessarily to match filenames; for example
> 
> 	who bar*
> 
> should list all the logged-in users whose names begin with "bar"
> (equivalent to "who | grep '^bar'").
> 
> It should be up to the command to decide the appropriate context for
> treating arguments as pathnames and performing wildcard expansion.

I've worked on a couple of systems that did this. It sounds great in principle, but if every command interprets the wildcard characters differently, you will start to go nuts trying to remember what a wildcard character does with a particular command. Take VMS BACKUP, for instance. The wildcard means different things depending on whether it's in the first arg or the second, or in a SELECT option, or what direction the data is going in, the phase of the moon, etc. It winds up being a very difficult-to-remember syntax. On the other hand, with C shell, if I put an asterisk in an arg, I *know* what it will do.

> This way, a command that knows it is dangerous, such as rm, can check
> whether it was called with a wildcard and perhaps be more careful. On
> Multics, the delete command does exactly this, querying "Are you sure
> you want to 'delete **' in <directory>?" unless -force is specified.

Barf! Gag! If I say 'rm *', I *mean* 'rm *'! Multics isn't exactly my model of an easy-to-use system. This is a little off the subject, though; I don't want to start the rm wars up again. I will just say that if you want a safe rm, there are plenty of ways to get one on Unix. On the other hand, if you want an "unsafe" rm on Multics, there is no convenient way to get it. I hate inflexible systems, especially ones that try to second-guess me.

> Globbing in the shell also severely limits the syntax of commands; I
> will admit that this could be seen as a benefit, because it forces
> conformity, but sometimes a minor syntax change can be useful. For
> example, there's no way to write a version of the cp or mv commands
> that takes an alternating list of source and destination pathnames,
> where the source pathnames are permitted to have wildcards. You also
> can't do something like Multics's
> 
> 	rename foo.** foo.bar.==
> 
> (the == is replaced by whatever the ** matched) without writing a
> complicated script that used grep and sed on the output of ls.
> 
> Finally, even when an argument is a pathname, it is sometimes not
> allowed to be multiple files. For example, diff takes pathnames, but
> it requires exactly two of them, and ar allows only one archive
> pathname to be specified. On Multics, a command with a syntax like
> this can check whether the argument contains wildcards and complain.
> Diff can check that it received exactly two pathnames, but it won't
> know whether this is simply because one wildcard happened to match
> exactly two files (maybe this was intentional on the user's part, but
> maybe it wasn't), and ar will simply treat the extra arguments as
> member files.

You mean if I have two files named "foo.1xxx" and "foo.2yyy" and I want to diff them, you won't let me type "diff foo.*"? But I *want* to be able to do this!

> So does this mean that globbing MUST be done by the commands
> themselves? Well, yes and no. This is how it is done on Multics,
> although the actual matching is done by a system call for filenames
> (for efficiency, since Multics directories are not directly readable
> by user-mode code, so it saves lots of data copying) and by a library
> subroutine for non-filenames.

VMS has a setup like this. It's an enormous pain in the ass. You have to call this RMS routine and give it the pattern, then keep calling another routine to get the names. And they aren't easy to use; there are all kinds of parameter blocks and things you have to set up.

> Some more modern systems allow commands
> to provide information to the command processor that tell it how to do
> the automatic parsing; in this case, this data would specify which
> arguments are pathnames that allow wildcards, and the command
> processor would automatically perform the expansion in the right
> cases.

Again, VMS has just such a setup, and again, it's a pain in the ass. First of all, since the command parser has to have knowledge of what the commands are and what their syntax is, you have to load in a bunch of tables in order to set up new commands. (You can't just write a program and run it -- the only way to run something without setting up a syntax table is to use a RUN command, which does not allow parameter passing. And I challenge the notion that any system in which it is necessary to use a RUN command qualifies as a modern system.) The other thing is that, while you complain that sh and csh enforce a rigid command syntax, VMS DCL enforces an even more rigid one -- because so many things are special to the parser, and *there is no way to bypass them.* And the available syntax for command parsing (a programming language in itself) will never quite do exactly what you want.

All in all, I think it's not the way to go, because it removes flexibility and doesn't give you anything in return. Even the convenience of not having to parse the args yourself in your program is offset by the inconvenience of having to write a syntax table and load it into DCL, and the gyrations you have to go through to access the parsed parameters in the program.

P.S.: In this column recently, I have seen a lot of talk about the virtues and failings of "the" UNIX user interface. My question is: what's this "the" stuff? Last time I checked, there was sh (Bourne shell, old and new), csh, ksh, tcsh, msh, and all manner of screen-oriented shell front ends. If you don't like any of these, you can install your own. That's one of the great virtues of UNIX!
---
Dave Cornutt, Gould Computer Systems, Ft. Lauderdale, FL
[Ignore header, mail to these addresses]
UUCP: ...!{sun,pur-ee,brl-bmd,uunet,bcopen,rb-dc1}!gould!dcornutt
 or ...!{ucf-cs,allegra,codas,hcx1}!novavax!gould!dcornutt
ARPA: dcornutt@gswd-vms.arpa

"The opinions expressed herein are not necessarily those of my employer, not necessarily mine, and probably not necessary."
jjw@celerity.UUCP (Jim ) (12/09/87)
One thing I have not seen in all this discussion about "wildcard characters" in rm is whether anything is accomplished if rm uses interactive mode any time it is invoked with wildcards. I assume that "-f" could be used to avoid interactive mode when large numbers of files are to be deleted and in shell scripts.

I don't think this automatic interactive mode would have saved me from many (if any) cases where I deleted a file inadvertently: I have deleted as many files I shouldn't have by typing the full name without fully engaging my brain as I have by using wildcards. If the system started spitting file names at me when I was deleting files, I would probably just start hitting the "y" key without really watching the names. If I were only trying to delete one file using wildcards, I would be likely to type "rm x*x \n y" without looking at the screen, thereby possibly deleting a file I wanted and retaining the one I wanted to delete. I would soon tire of having to always respond to a list of file names, so I would either get in the habit of always typing "rm -f" or alias rm to "rm -f". Are others that much different from me?

I have also lost many more files to disk failures and system failures than I ever have because wildcards are expanded in the shell. Proper backup procedures are required, and if they are in place the loss from an inadvertent file deletion is manageable. Those who are really worried about file loss can always create their own file deletion script/alias to stash the files away for a few days before final deletion.
-- 
					J. J. Whelan
root@cit5.oz (Steve Balogh) (12/09/87)
I have overcome the problem of accidentally deleting a whole directory of files (after accidentally doing it to my home directory ONCE only) by doing the following:

1. I have set up a bin directory under my home directory which contains commands and programs which I use on a regular basis.

2. I have changed my PATH to include $HOME/bin as the FIRST entry (or at least BEFORE /bin).

3. I have included a set of programs which simulates a trashcan by making a directory called $HOME/.TRASH and creating the following scripts:

	rm		will do a mv from the current directory to $HOME/.TRASH
	rummage		will do an ls of $HOME/.TRASH
	scavange	will retrieve an accidentally rm'ed file
	empty		will REALLY remove all files from $HOME/.TRASH

4. I automatically perform an "empty" command when logging out. (It could be said that this is dangerous, but it is a risk I am willing to take so as not to make the .TRASH directory too big.) It could also be done manually on a regular basis.

This system has saved many files which would have been accidentally deleted, and it sometimes also serves as a place to put temporary files (although this is not really recommended). I believe that the mv command only changes file pointers and does not actually copy a file, so even large files can be removed quickly with this method.

I am sure that there are better and more exotic ways of solving the "rm *" problem, but I find that the above works well for me.

Steve.
- - - - -  (It's my opinion and not my employer's)
Steve Balogh	VK3YMY			| steve@cit5.cit.oz (...oz.au)
Chisholm Institute of Technology	| steve%cit5.cit.oz@uunet.uu.net
PO Box 197, Caulfield East		|
Melbourne, AUSTRALIA. 3145		| {hplabs,mcvax,uunet,ukc}!munnari\
+61 3 573 2266 (Ans Machine)		|  !cit5.cit.oz!steve
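[A minimal sketch of the four scripts described above, written here as shell functions for brevity — the names are the poster's, everything else is assumed (his actual versions were separate scripts in $HOME/bin):]

```shell
TRASH=$HOME/.TRASH
mkdir -p "$TRASH"

rm()       { mv -- "$@" "$TRASH"; }     # shadows the real rm
rummage()  { ls "$TRASH"; }             # see what is in the trashcan
scavange() { mv "$TRASH/$1" .; }        # retrieve into the current directory
empty()    { /bin/rm -rf "$TRASH"/*; }  # REALLY remove, using the real rm
```

[Note the full path /bin/rm inside empty: with "rm" shadowed, that is the only way left to actually unlink anything. The mv-is-cheap claim holds only within one filesystem; a mv from another filesystem into $HOME/.TRASH does copy the data.]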
allbery@ncoast.UUCP (Phil Smith) (12/11/87)
As quoted from <1975@bloom-beacon.MIT.EDU> by wesommer@athena.mit.edu (William Sommerfeld):
+---------------
| In article <6356@ncoast.UUCP> allbery@ncoast.UUCP (Brandon Allbery) writes:
| >In article <12441@think.UUCP> by barmar@think.COM (Barry Margolin) writes:
| >So, how many times have you abbreviated a filename which you know is unique
| >with a well-placed "*"?
| Rarely; it all depends on what you are used to. On unix, using tcsh,
| I hit 'TAB', and have the shell or emacs complete the filename for me.
+---------------

"On unix"?! We run Unix on ncoast. No tcsh, no GNU Emacs (won't fit). AT&T UNIX System III is just as much Unix as 4BSD is... so I cope. The use of wildcards is the *only* way to do it here; that, or type the full pathname (even with only 14 characters max. per component, this can be painful).
-- 
Brandon S. Allbery	necntc!ncoast!allbery@harvard.harvard.edu
{hoptoad,harvard!necntc,cbosgd,sun!mandrill!hal,uunet!hnsurg3}!ncoast!allbery
Moderator of comp.sources.misc
jay@splut.UUCP (Jay Maynard) (12/14/87)
In article <9069@utzoo.UUCP>, henry@utzoo.UUCP (Henry Spencer) writes:
[about a problem with the C shell]
> Well, if you will insist on using an obsolete (and non-standard) shell,
> them's the breaks. (only 1/2 :-))

What are those of us who run an SVR2 without the Korn shell available supposed to do for a history mechanism, then? I'm willing to put up with some gotchas to get the ability to easily re-enter a command line, possibly with some changes...
-- 
Jay Maynard, K5ZC (@WB5BBW)...>splut!< | GEnie: JAYMAYNARD  CI$: 71036,1603
uucp: {uunet!nuchat,academ!uhnix1,{ihnp4,bellcore,killer}!tness1}!splut!jay
Never ascribe to malice that which can adequately be explained by stupidity.
The opinions herein are shared by none of my cats, much less anyone else.
dhesi@bsu-cs.UUCP (Rahul Dhesi) (12/14/87)
In article <8111@steinmetz.steinmetz.UUCP> dawn!stpeters@steinmetz.UUCP (Dick St.Peters) writes:
>(It pains me to say it, but I think you can get the
>unglobbed command line in VMS: lib$get_command or some such.)

In the VMS environment, "unglobbed" does not mean "ungarbled". Anything that isn't quoted, VMS changes to uppercase before you get to see it. Staunch VMS defenders consider this to be a highly desirable feature. Ahem.
-- 
Rahul Dhesi	UUCP: <backbones>!{iuvax,pur-ee,uunet}!bsu-cs!dhesi
gwyn@brl-smoke.ARPA (Doug Gwyn ) (12/16/87)
In article <288@splut.UUCP> jay@splut.UUCP (Jay Maynard) writes:
>What are those of us who run an SVR2 without the Korn shell available
>supposed to do for a history mechanism, then?

Obviously, you send me a copy of your source license and a magtape, and I send you back a SVR2 shell that does these things (and much more). They're trickling out of here slowly, because we don't have staff to deal with writing and mailing tapes, and I've been busy with other projects. However, another batch should be mailed this week.
allbery@ncoast.UUCP (Brandon Allbery) (12/21/87)
As quoted from <6843@brl-smoke.ARPA> by gwyn@brl-smoke.ARPA (Doug Gwyn):
+---------------
| In article <288@splut.UUCP> jay@splut.UUCP (Jay Maynard) writes:
| >What are those of us who run an SVR2 without the Korn shell available
| >supposed to do for a history mechanism, then?
| 
| Obviously, you send me a copy of your source license and a magtape
| and I send you back a SVR2 shell that does these things (and much
| more).
+---------------

Which answer begs another question: what are those of us who run an SVR2 (or SVR3) without the Korn shell available AND WITHOUT A SOURCE LICENSE supposed to do for a history mechanism, then? The last time I asked something like this, I was told that only cretinous people or businesses didn't buy source licenses. Kindly consider that not everyone has $50,000 to blow on a source license, especially for a $5,000 computer. (The 386 may be the best thing that ever happened to Un*x.)
-- 
Brandon S. Allbery	necntc!ncoast!allbery@harvard.harvard.edu
{hoptoad,harvard!necntc,cbosgd,sun!mandrill!hal,uunet!hnsurg3}!ncoast!allbery
Moderator of comp.sources.misc
gwyn@brl-smoke.ARPA (Doug Gwyn ) (12/22/87)
In article <6874@ncoast.UUCP> allbery@ncoast.UUCP (Brandon Allbery) writes:
>Which answer begs for another question: What are those of us who run an SVR2
>(or SVR3) without the Korn shell available AND WITHOUT A SOURCE LICENSE
>supposed to do for a history mechanism, then?

You can either ask your system supplier, who is required to have a source license, to get this for you, or you can suffer.
jay@splut.UUCP (Jay Maynard) (12/23/87)
In article <6843@brl-smoke.ARPA>, gwyn@brl-smoke.ARPA (Doug Gwyn) writes:
> In article <288@splut.UUCP> jay@splut.UUCP (Jay Maynard) writes:
[re: using csh, an "obsolete, non-compatible shell"]
> >What are those of us who run an SVR2 without the Korn shell available
> >supposed to do for a history mechanism, then?
> 
> Obviously, you send me a copy of your source license and a magtape
> and I send you back a SVR2 shell that does these things (and much
> more).

Source license? What's that? I doubt I can get one at all, much less at a price I can afford for my personal system... and yes, there are more of us floating in day by day.

Actually, this is a good opportunity to apologize for the mildly flamish tone of my prior article; I read it later and said "Oops."

Until then, I'll either have to hack up ash, or do something useful :-) :-) :-) :-) :-)... aw, what the heck. Me? Useful? :-) :-) :-)
-- 
Jay Maynard, K5ZC (@WB5BBW)...>splut!< | GEnie: JAYMAYNARD  CI$: 71036,1603
uucp: {uunet!nuchat,academ!uhnix1,{ihnp4,bellcore,killer}!tness1}!splut!jay
Never ascribe to malice that which can adequately be explained by stupidity.
The opinions herein are shared by none of my cats, much less anyone else.
gwyn@brl-smoke.ARPA (Doug Gwyn ) (01/06/88)
In article <295@splut.UUCP> jay@splut.UUCP (Jay Maynard) writes:
-In article <6843@brl-smoke.ARPA>, gwyn@brl-smoke.ARPA (Doug Gwyn) writes:
-> Obviously, you send me a copy of your source license and a magtape
-> and I send you back a SVR2 shell that does these things (and much
-> more).
-Source license? What's that? I doubt I can get it at all, much less at a
-price I can afford for my personal system...

Well, then, get whoever supplied your Bourne shell binary to send me the source license etc. and later supply you with an enhanced shell binary. Somewhere up the chain there has to be a source licensee.