rcd@opus.UUCP (Dick Dunn) (07/07/84)
>From: Mike Caplinger <mike@RICE.ARPA>
>
> I can't believe that somebody is proposing ANOTHER standard that imposes
> a 6-character uniqueness limit on external names!  How much longer are
> we going to be stuck with somebody's 1960s linker implementation choice?

(This was in reference to the developing C standard.)

I agree that it's about time to tackle problems with lame-brain linkers!
Short names are only one of a whole raft of problems we've got with <most>
present-day linkers.

I suspect that a lot of the difficulty in getting language systems which
provide a reasonable module facility lies with linkers.  If you want a
language to flourish, you need separate compilation and the ability to
link with routines written in other languages.  (Yes, that was a bit of a
rash generalization.  Spare me the flames, please.)  That means you either
have to make do with the existing linker (which quite likely has NO
provision whatsoever for cross-checking interfaces at link time) or write
your own linker.
--
Dick Dunn	{hao,ucbvax,allegra}!nbires!rcd		(303)444-5710 x3086
   ...Are you making this up as you go along?
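To see what a 6-character, monocase uniqueness rule does in practice, here is a small sketch (in Python, purely for illustration; the identifier names are invented) that reduces identifiers the way such a linker would and reports which ones silently collide:

```python
def external_name(identifier, significant=6, monocase=True):
    """Reduce an identifier the way a 6-char, monocase linker would."""
    name = identifier[:significant]          # only the first 6 chars count
    if monocase:
        name = name.upper()                  # case distinctions are lost too
    return name

# Invented identifiers a programmer might reasonably choose:
idents = ["buffer_read", "buffer_write", "Buffer_init"]

seen = {}
for ident in idents:
    key = external_name(ident)
    if key in seen:
        print(f"collision: {ident!r} and {seen[key]!r} both become {key!r}")
    else:
        seen[key] = ident
```

All three names collapse to the single external symbol `BUFFER`, which is exactly the "figure out meaningful six-character names" tax the thread is complaining about.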
henry@utzoo.UUCP (Henry Spencer) (07/10/84)
Dick Dunn observes:

> (This was in reference to the developing C standard.)  I agree that it's
> about time to tackle problems with lame-brain linkers!  Short names are
> only one of a whole raft of problems we've got with <most> present-day
> linkers.

And Larry West likewise:

> The other point which bothers me, even more, is the limitation of six
> significant characters in external names.  It seems to me that the cost
> of converting a few linkers from 6 characters to some larger number
> (say, 16 -- even 10 or 12 would be a vast improvement) is much less than
> the cost of having programmers figure out meaningful six-character names
> to use.  There aren't really that many informative identifiers with six
> characters -- maybe a few hundred at most.  Add to the cost of figuring
> out a group of 6-character identifiers (also not conflicting with any
> system call or subroutine name) the cost of trying to decipher such
> things.  And who really has 6-char-max linkers that they plan to
> support, unchanged, for the next ten years?  I've never come across any.

The problem is that most of these deficiencies lie not with the *linkers*,
but with the *object* *module* *formats*.  Changing those would require
changing every compiler -- remember, in most non-Unix environments the
compilers generate object code directly -- and this is the job that nobody
can face.  Do you really want to tackle the job of fixing the output
module of every compiler ever written for the 360?  Sure, it could be
done, but the problems are monumental and the conversion period would be
agonizing for the customers.  Many of them, with good reason, would simply
refuse to cooperate.

The problem really is unfixable in the context of old systems.  The best
we can do is to make sure that *new* systems do it right.
--
				Henry Spencer @ U of Toronto Zoology
				{allegra,ihnp4,linus,decvax}!utzoo!henry
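Henry's point is easiest to see from a record layout. The sketch below packs a hypothetical fixed-width symbol-table record whose name field is 6 bytes (the field widths here are invented for illustration, not taken from any real format): every compiler that emits such records and every linker that reads them has the width wired in, so lengthening the name field changes the format for all of them at once.

```python
import struct

# Hypothetical fixed-width symbol record: 6-byte name, 1-byte type,
# 4-byte value.  Big-endian, 11 bytes total.
SYMBOL_RECORD = struct.Struct(">6sBI")

def emit_symbol(name, sym_type, value):
    """Write one symbol record; the name is silently cut to fit the field."""
    padded = name.upper().encode("ascii")[:6].ljust(6, b" ")
    return SYMBOL_RECORD.pack(padded, sym_type, value)

rec = emit_symbol("buffer_read", 0x01, 0x2000)
name, sym_type, value = SYMBOL_RECORD.unpack(rec)
print(name)        # the rest of the identifier is simply gone
```

Because the truncation happens when the record is *written*, a smarter linker alone cannot recover the lost characters; the format, and hence every producer of it, has to change first.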
hansen@pegasus.UUCP (Tony L. Hansen) (07/10/84)
> The problem really is unfixable in the context of old systems.
> The best we can do is to make sure that *new* systems do it right.

Interesting.  How do we make sure that *new* systems do it right?  By
standardizing on the old methods or standardizing on the new methods of
thinking?

If you standardize on the old method, then no one will consider having to
upgrade their loaders.  When building new compilers/loaders/linkers, who
will stop and think "Gee, this doesn't conform to the non-standard.  I'd
better think about this some more."?  Instead, they'll stop and think
"Gee, this conforms just fine to the standard of 6 characters, monocase.
I'm doing a great job!"

One possible way around the problem is to go to a dual standard: a
standard, preferred way of doing things (32 characters, dualcase) and a
sub-standard way of doing things (6 characters, monocase).  Those
compilers which only support the sub-standard can be billed as supporting
the ANSI sub-standard version of C.  Those that support global names of
32 characters and dualcase can be billed as supporting the full ANSI
standard.

Unless something is made an official part of the standard, there's no way
we're going to make new products conform to something that isn't a part
of the standard.  If you embrace an out-dated standard, there's no way
that the out-dated stuff will disappear, and we will always be haunted by
it.

					Tony Hansen
					pegasus!hansen

P.S.  A tool to consider writing would be a program to convert C source
from the full standard to the proposed sub-standard.  One easy way of
generating unique global names would be to prepend an underscore followed
by 5 digits to every global name defined and called by the user's
program.  That shouldn't be too hard to write.
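The tool in Tony's P.S. can be sketched in a few lines. This toy version (in Python, with a deliberately naive word-level substitution; a real tool would need an actual C parser so as not to rename text inside strings, comments, or struct members) maps each global name to `_NNNNN` exactly as he describes -- note that the result is 6 characters counting the underscore, so it fits the sub-standard limit. The list of global names is assumed to come from elsewhere, e.g. from running `nm` on the object files.

```python
import re

def mangle_globals(source, globals_to_rename):
    """Replace each listed global with _NNNNN, per Hansen's scheme.
    Toy version: word-level regex only; a real tool needs a C parser
    to skip string literals, comments, and member names."""
    mapping = {name: f"_{i:05d}"
               for i, name in enumerate(sorted(globals_to_rename), start=1)}

    def replace(match):
        word = match.group(0)
        return mapping.get(word, word)   # leave non-globals untouched

    return re.sub(r"[A-Za-z_]\w*", replace, source), mapping

src = "extern int buffer_read(char *p);\nint total = buffer_read(buf);\n"
new_src, mapping = mangle_globals(src, {"buffer_read", "total"})
print(new_src)
print(mapping)
```

Every occurrence of `buffer_read` and `total` becomes `_00001` and `_00002` respectively, while local names like `p` and `buf` are left alone.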
mjk@tty3b.UUCP (Mike Kelly) (07/10/84)
Henry Spencer points out that the problem with longer names is an object
file format problem.  He's right.  SVR2 implements flexnames in the
compiler, and it causes no small amount of trouble.  Sure, it's worth it
in the long run, but you find a lot of typos you didn't know existed in
the tenth or eleventh characters of long names.

Mike Kelly
jim@ism780b.UUCP (08/01/84)
#R:opus:-59800:ism780b:25500003:000:256
ism780b!jim    Jul 13 19:39:00 1984

> SVR2 implements flexnames in the compiler, and it causes no small
> amount of trouble.  Sure, it's worth it in the long run, but you find a
> lot of typos you didn't know existed in the tenth or eleventh
> characters of long names.

This is a *complaint*?