[comp.lang.c] Are the criteria for Unix and PC compilers different?

tr@wind.bellcore.com (tom reingold) (03/02/88)

This is mostly a philosophical question directed to those like me
who use both PC's and Unix.

How come compilers for the PC keep improving and make last year's
obsolete while we are totally satisfied with the one on Unix?  The
Unix ones have not been significantly changed in a long time.

Is it because we want hardware access to the PC, e.g. graphics,
keyboard, etc.?  Or is it because the PC compilers are only now
approaching the quality and completeness of the Unix ones?

What's your opinion?

Tom Reingold                    INTERNET:       tr@bellcore.bellcore.com
Bell Communications Research    UUCP:           rutgers!bellcore!tr
435 South St room 2L350         SOUNDNET:       (201) 829-4622 [work]
Morristown, NJ 07960                            (201) 287-2345 [home]

henry@utzoo.uucp (Henry Spencer) (03/05/88)

> How come compilers for the PC keep improving and make last year's
> obsolete while we are totally satisfied with the one on Unix?  The
> Unix ones have not been significantly changed in a long time.

Depends on *which* Unix ones you are talking about.  The System V ones
do get improved, as do those from some Unix-box manufacturers.  If yours
doesn't, complain to your supplier.  (If it's Berkeley, you are out of
luck, because nobody there gets paid to soup up the compiler.  This is
part of the price you pay for university software.)  Outfits like MIPS
work quite hard on compiler improvements.

> ... is it because the PC compilers are only now
> approaching the quality and completeness of the Unix ones?

Well, one can argue about that particular issue, but the real reason is
that the PC compilers are commercial products in a viciously competitive
marketplace.  Many of the Unix compilers have little or no competition.
-- 
Those who do not understand Unix are |  Henry Spencer @ U of Toronto Zoology
condemned to reinvent it, poorly.    | {allegra,ihnp4,decvax,utai}!utzoo!henry

backstro@silver.bacs.indiana.edu (Dave White) (03/05/88)

In article <5847@bellcore.bellcore.com> tr@wind.UUCP (tom reingold) writes:
>How come compilers for the PC keep improving and make last year's
>obsolete while we are totally satisfied with the one on Unix?  The
>Unix ones have not been significantly changed in a long time.
>
>Is it because we want hardware access to the PC, e.g. graphics,
>keyboard, etc.?  Or is it because the PC compilers are only now
>approaching the quality and completeness of the Unix ones?
MS-DOS C compilers keep improving because of competition:  no vendor can
assume that the competition won't come out with a product that will do
something it can't.

DOS-world compilers don't live in the same world as Unix compilers:
there the competition is fierce.  Many were released incomplete --
they generated lousy code, their libraries were incomplete, or they
didn't support every canonical variation of code generation for the
segmented architecture.  They might be buggy, too.

It used to be possible to release a lousy compiler because there was no
standard for libraries and such, and because the operating system didn't
suffer from a lousy code generator or slow libraries.  Microsoft
deserves credit for bringing out a DOS compiler that accepts Unix-style
C, and that comes with a fairly complete library.  The compiler that
comes with IBM Xenix 2.0 (SCO 2.06?  it's out of date, now) generated
lousy code, but I was able to cross-compile compress 4.0 for DOS;  Turbo
C 1.5 doesn't like it!

The PC compiler vendors broke down and developed support for arcane
memory models -- even huge model, with arrays >64K bytes, which involves
a subroutine call for every pointer manipulation.  When will they
realize that some of us really want the option of using 32-bit ints to
port Unix-born code?  Yes, the code would be slow, but that's better
than a program that won't run without weeks of debugging!
--
backstro@silver.bacs.indiana.edu

Ralf.Brown@B.GP.CS.CMU.EDU (03/06/88)

In article <1106@silver.bacs.indiana.edu>, backstro@silver.bacs.indiana.edu (Dave White) writes:
}The PC compiler vendors broke down and developed support for arcane
}memory models -- even huge model, with arrays >64K bytes, which involve
}a subroutine call to handle pointer manipulation.  When will they
}realize that some of us really want the option of using 32-bit ints to
}port Unix-born code?  Yes, the code would be slow, but that'd be better
}than not having the resulting program run without weeks of debugging!
}--
}backstro@silver.bacs.indiana.edu

#define int long

Need I say more?

--
{harvard,ucbvax}!b.gp.cs.cmu.edu!ralf -=-=- TalkNet: (412)268-3053 (school)
ARPA: RALF@B.GP.CS.CMU.EDU |"Tolerance means excusing the mistakes others make.
FIDO: Ralf Brown at 129/31 | Tact means not noticing them." --Arthur Schnitzler
BITnet: RALF%B.GP.CS.CMU.EDU@CMUCCVMA -=-=- DISCLAIMER? I claimed something?

blarson@skat.usc.edu (Bob Larson) (03/07/88)

In article <22314ad9@ralf.home> Ralf.Brown@B.GP.CS.CMU.EDU writes:
>In article <1106@silver.bacs.indiana.edu>, backstro@silver.bacs.indiana.edu (Dave White) writes:
>}When will they
>}realize that some of us really want the option of using 32-bit ints to
>}port Unix-born code?

>#define int long

* Does not handle implicit int declarations, such as undeclared
functions, function argument promotion, "register i;", etc.

* Does not work on compilers that don't allow #define of a keyword.

>Need I say more?

No, you already made a big enough fool of yourself.
--
Bob Larson	Arpa: Blarson@Ecla.Usc.Edu	blarson@skat.usc.edu
Uucp: {sdcrdcf,cit-vax}!oberon!skat!blarson
Prime mailing list:	info-prime-request%fns1@ecla.usc.edu
			oberon!fns1!info-prime-request

Ralf.Brown@B.GP.CS.CMU.EDU (03/07/88)

In article <7451@oberon.USC.EDU>, blarson@skat.usc.edu (Bob Larson) writes:
}In article <22314ad9@ralf.home> Ralf.Brown@B.GP.CS.CMU.EDU writes:
}>#define int long
}
}* Does not work on compilers that don't allow #define of a keyword.
}--
}Bob Larson      Arpa: Blarson@Ecla.Usc.Edu      blarson@skat.usc.edu
}Uucp: {sdcrdcf,cit-vax}!oberon!skat!blarson

I wasn't aware of any preprocessors smart enough to disallow #defines of
keywords.  The preprocessor's knowledge of C is limited to recognizing tokens;
it doesn't "know" C, it just does the textual substitution.

--
{harvard,ucbvax}!b.gp.cs.cmu.edu!ralf -=-=- TalkNet: (412)268-3053 (school)
ARPA: RALF@B.GP.CS.CMU.EDU |"Tolerance means excusing the mistakes others make.
FIDO: Ralf Brown at 129/31 | Tact means not noticing them." --Arthur Schnitzler
BITnet: RALF%B.GP.CS.CMU.EDU@CMUCCVMA -=-=- DISCLAIMER? I claimed something?

backstro@silver.bacs.indiana.edu (Dave White) (03/08/88)

In article <22314ad9@ralf.home> Ralf.Brown@B.GP.CS.CMU.EDU writes:
>In article <1106@silver.bacs.indiana.edu>, backstro@silver.bacs.indiana.edu
>(Dave White) writes:
>}When will [PC C compiler vendors] realize that some of us really want
>}the option of using 32-bit ints to port Unix-born code?  
>
>#define int long
>
>Need I say more?
It isn't that simple.  Your suggestion would make the compiler treat
ints and longs the same way in the code I'm compiling, but not in the
compiler's run-time library, which would need to be recompiled and
debugged.  alloc and friends, for example, take int-sized arguments on
my machine!

I want to port code, not support a compiler (the code is enough trouble
as it is :-)
If we want to converse further, perhaps we should do so by mail --
let's not clutter up the newsgroup!
--
backstro@silver.bacs.indiana.edu

exodus@uop.edu (G.Onufer) (03/08/88)

In article <22314ad9@ralf.home>, Ralf.Brown@B.GP.CS.CMU.EDU writes:
> #define int long
> 
> Need I say more?


Yes.

rbutterworth@watmath.waterloo.edu (Ray Butterworth) (03/10/88)

In article <22314ad9@ralf.home>, Ralf.Brown@B.GP.CS.CMU.EDU writes:
> In article <1106@silver.bacs.indiana.edu>, backstro@silver.bacs.indiana.edu (Dave White) writes:
> }When will they
> }realize that some of us really want the option of using 32-bit ints to
> }port Unix-born code?
> 
> #define int long
> Need I say more?

Yes.

I think "Unix-born" is a euphemism for "badly written" or "non-portable".

Code that assumes that "int" is 32 bits is usually written because
the programmer was lazy.  By making everything "int" he can avoid
having to make many declarations and having to include header files.

In particular, instead of having code like:
    extern int func1();
    extern int x;
    int func2(a) int a; { blah; blah; }
he will probably have:
    extern x;
    func2(a) { blah; blah; }
or maybe even
    x;
    func2(a) { blah; blah; }

"#define int long" won't help much with this code.

meissner@xyzzy.UUCP (Michael Meissner) (03/10/88)

In article <2232abb7@ralf.home> Ralf.Brown@B.GP.CS.CMU.EDU writes:
| In article <7451@oberon.USC.EDU>, blarson@skat.usc.edu (Bob Larson) writes:
| }In article <22314ad9@ralf.home> Ralf.Brown@B.GP.CS.CMU.EDU writes:
| }>#define int long
| }
| }* Does not work on compilers that don't allow #define of a keyword.
| 
| I wasn't aware of any preprocessors smart enough to disallow #defines of
| keywords.  The preprocessor's knowledge of C is limited to recognizing tokens;
| it doesn't "know" C, it just does the textual substitution.

In the traditional UNIX cpp, yes, /lib/cpp "only" recognizes tokens.
But it is not engraved in stone tablets that this is the one true
course.  Other compilers have the preprocessor integrated into the
compiler proper, and some of those do warn about and/or disallow
#define'ing keywords.  Early revs of the Data General C compiler, for
instance, gave warnings about that, as do some of the micro compilers.
-- 
Michael Meissner, Data General.		Uucp: ...!mcnc!rti!xyzzy!meissner
					Arpa/Csnet:  meissner@dg-rtp.DG.COM

peter@sugar.UUCP (Peter da Silva) (03/18/88)

In article ... backstro@silver.bacs.indiana.edu (Dave White) writes:
>When will they
>realize that some of us really want the option of using 32-bit ints to
>port Unix-born code?

* UNIX does not imply 32 BITS.

My first porting nightmare was getting a PDP-11 UNIX program with 16 bit ints
to run on a VAX under UNIX with 32 bit ints. Most everything went together
well, but there were places where the following assumptions were made:

	-1 == 0xFFFF
	32767<<1 < 0
	sizeof(int) == 2
	sizeof(char *) == 2

* 32 bit machine does not imply sizeof(int) == 4.

There are compilers for 32-bit machines for which sizeof(int) == 2.
This is a bit more efficient on the 68000, for example.

* sizeof(int) == 4 does not imply sizeof(int) == sizeof(char *).

In fact, on the Prime, sizeof(char *) does not even equal sizeof(float *).
-- 
-- Peter da Silva  `-_-'  ...!hoptoad!academ!uhnix1!sugar!peter
-- Disclaimer: These U aren't mere opinions... these are *values*.