[comp.os.minix] definition of NULL

mikem@amc.com (Mike McGinnis) (06/26/91)

>>*** BUG FIX ***
>>
>>In /usr/include/stdio.h, #define NULL 0
>>In /usr/include/*.h (all others), #define NULL ((void *)0)
>>
>>stdio.h should be updated asap.
>>
>>If I'm wrong about this, or if this will break anything, please let me
>>know.
>
>You are absolutely right. When I made this change, I was able to
>compile cdungeon (ie. Zork) in 16-bit mode with c68 on the Atari ST.
>Before, I had to use Gnu C in 32-bit mode. Let's make this change
>official...
>
>
>David
>

You are absolutely wrong ;^).

OK, One More Time...
This is your brain on (int)0, this is your brain on (pointer)0...

NULL should always be defined as the integer 0. It is the compiler's job to
infer the context and treat it as a null pointer wherever one is expected. The
one exception is function arguments, where the compiler may have no pointer
context to infer; there it should be cast explicitly to the type expected by
the called function, e.g. "(char *)NULL".
C compiler writers go to great pains to make this work for pointers and zero.
Please use it accordingly.
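
For instance, a minimal sketch of the function-argument case, using execl(),
whose variadic trailing argument gives the compiler no pointer type to infer:

#include <stddef.h>
#include <unistd.h>

int main()
{
    /* The list terminator must be cast: in a variadic call the compiler
       cannot know a pointer is expected, so a bare NULL (integer 0) could
       be passed with the wrong width. */
    execl("/bin/sh", "sh", "-c", "date", (char *)NULL);
    return 1;	/* reached only if execl() fails */
}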

In fun, to see this context sensitivity, try this:
main()
{
    char *c;

    if (c == 0)		/* integer 0 will draw no warning */
	;
    if (c == 1)		/* integer 1 will draw a warning */
	;
}

The VAX compiler produces the following warning for the comparison with 1, but
no warning for the comparison with zero.

foo.c, line 7: warning: illegal combination of pointer and integer, op ==

Most compilers will probably draw a warning on the comparison with 1,
especially if sizeof(int) != sizeof(pointer).
If it draws a warning on the comparison with 0, it is a weak compiler.

	Michael E. McGinnis
	Applied Microsystems Corporation, Redmond Washington.

"Help Mr. Wizard... I don't want to be an engineer anymore!"

david@doe.utoronto.ca (David Megginson) (06/26/91)

So, it looks like this. If you use a non-ANSI compiler on a system
that has 16-bit ints and 32-bit pointers (ie. Minix ST), then

	#define NULL (0)

will break just about half the Unix code in existence, because,
without prototypes, the compiler does not know when to promote
it to 32 bits in a function argument. On the other hand,

	#define NULL ((void *)0)

will break very little, since NULL should never be used for an integer
anyway.
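
A minimal sketch of that failure mode, assuming a 16-bit-int, 32-bit-pointer
compiler and an unprototyped (K&R-style) call; show() is a hypothetical
function:

#define NULL 0			/* the definition under discussion */

extern void show();		/* old-style declaration: no prototype */

main()
{
    show("hello", NULL);	  /* broken: with 16-bit ints, NULL is
				     pushed as 16 bits, but show() pops 32 */
    show("hello", (char *)NULL);  /* fine: the cast pushes a full pointer */
    return 0;
}

void show(s, p)
char *s, *p;
{
    /* with the uncast NULL above, p picks up garbage in its high bits */
}

With #define NULL ((void *)0), even the uncast call pushes a full 32-bit
pointer, which is why that definition rescues so much unprototyped code.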

I agree that

	#define NULL (0)

is probably better C style, but I would not be surprised if Minix 68K
users prefer the other so that they can compile programs like cdungeon
using 16-bit compilers. We will, of course, make the change only in
our local copies, and we promise not to whine when it breaks something
in the native Minix code :-)


David

-- 
////////////////////////////////////////////////////////////////////////
/  David Megginson                      david@doe.utoronto.ca          /
/  Centre for Medieval Studies          meggin@vm.epas.utoronto.ca     /
////////////////////////////////////////////////////////////////////////