fritz@catfish.caltech.edu (Fritz Nordby) (02/01/89)
In article <38@microsoft.UUCP> w-colinp@microsoft.uucp (Colin Plumb) writes:
>(Quick: who's run into Unix's 10K command-line limit?)

Me.  Often.  And probably other folks who've worked with large numbers
of source files.  Consider:

    $ pr `find . -type f -print | egrep '/([Mm]akefile|.*\.[ch])$'` | lpr

(Note the pattern anchors on a "/" rather than "^", since find prints
each path with a leading "./".)

Another example: have you looked at the way the "rcbook.t" program
(from the alt.gourmand recipes software) works?  It has to work around
this restriction.

BTW, the "10K command-line limit" is not exactly that; it's really a
"10 disk block command-line limit".  Yes, that's right: it used to be
a 5K limit in the days of 512-byte disk blocks, and on systems with
4K disk blocks I rather suspect that it's a 40K limit.

Moral: a restriction is a restriction, and no matter how lax or
trivial the restriction may seem today, eventually somebody will run
up hard against it.  (Anybody else remember when 64K was a lot of
memory?  And now we're running out of space with 32 bits?)

	Fritz Nordby.	fritz@vlsi.caltech.edu	cit-vax!cit-vlsi!fritz
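The usual way around the limit (a sketch, assuming an xargs along the
lines of the System V / BSD one is at hand) is to pipe the file names
to xargs instead of expanding them with backquotes: xargs re-invokes
the command with as many arguments as will fit, so the full list never
has to squeeze onto a single command line.

```shell
# Same job as the backquote version, but the file list is streamed,
# not expanded, so the 10-block limit never comes into play:
#
#   find . -type f -print | egrep '/([Mm]akefile|.*\.[ch])$' | xargs pr | lpr
#
# The batching is easy to see with echo -- five names, at most two
# arguments per invocation, gives three invocations (three lines):
printf '%s\n' a b c d e | xargs -n 2 echo
# -> a b
#    c d
#    e
```

The trade-off is that pr runs several times, so you get several sets
of page headers instead of one continuous listing; for lpr-bound
output that is usually acceptable.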