[comp.lang.c] Casting NULL again

Peter@adm.UUCP (01/27/87)

Thanks for your comments on this matter. Unfortunately, most
of you misunderstood what I wanted. I realize that on some
machines ints are 16 bits and pointers are 32. I have a Mac and
that's the case there, and I often have to pass parameters as
(long)NULL to get the right number of bytes passed. That's
nothing to do with pointers per se; that's a "problem" of
dealing with long int parameters. I also realize that C has
been implemented on word architectures such as the DEC-20, which
has 36 bits per location. On such a machine the representation
of a char pointer is certainly different from other types of
pointers if packing of chars is wanted. I've programmed in C
on VAX/Unix, MS-DOS (Lattice C, Microsoft C) and the Macintosh
(Megamax C, Aztec C, LightspeedC). On all the systems I've used
(which are byte-oriented, of course), the size of a pointer to
char is always the same as the size of a pointer to int or double
or struct or whatever (with the exception that under Microsoft C you
can have pointers and "far" pointers, but that's because of the
80x86's quirky addressing scheme). My question is: what byte-oriented
machines out there have pointers to data that are different sizes?
I really just wanted some specific examples (machines/pointer sizes).
That's all.

Peter@Acadia.BITNET

guy@gorodish.UUCP (01/28/87)

>Thanks for your comments on this matter. Unfortunately, most
>of you misunderstood what I wanted. I realize that on some
>machines ints are 16 bits and pointers are 32. I have a Mac and
>that's the case there, and I often have to pass parameters as
>(long)NULL to get the right number of bytes passed.
>That's nothing to do with pointers per se; that's a "problem" of
>dealing with long int parameters.

The person who claimed that the reason you cast NULL to a particular
pointer type is that pointers to different types might have different
sizes is only partially correct.  The *real* reason is that pointers
to different types have different types!  Too many C programmers feel
that correct C is anything they can get away with, and too many
programmers get burned by this.

Yes, you can probably get away with not casting NULL, or casting to
"long", or casting it to "char *" in all cases, or something else
wrong like that.  However, you'll get surprised if you have a machine
where null pointers aren't represented with the same bit pattern as
an integral zero, or where pointers contain tags that indicate the
data type that they refer to.  (Casting them to "long" is just
totally wrong - this won't even work on "normal" machines if "long"
and the pointer type in question aren't the same size!)

Furthermore, programs like "lint" will complain if you don't cast
them.  Running "lint" is a good way of catching many bugs, some of
which manifest themselves to an inter-module procedure call checker
like "lint" as a mismatch in argument types.  C has a type system,
and if you use it, it can help you write correct code.

No, I don't know of any machines offhand that are byte-addressable
(but why is this relevant?  Do you think you'll never work on any
machine that isn't?  I certainly wouldn't make that prediction about
myself...) and where pointers to different types of objects have
different representations (the size isn't the only characteristic of
the representation that's important).  However, that's not an excuse
to drop the casts.  The fact that something happens to work on the
machines you know about doesn't say anything about whether it'll
work on the machine you next have to deal with.

greg@utcsri.UUCP (Gregory Smith) (01/28/87)

In article <3859@brl-adm.ARPA> Peter@adm.UUCP writes:
>Thanks for your comments on this matter. Unfortunately, most
>of you misunderstood what I wanted. I realize that on some
>machines ints are 16 bits and pointers are 32. I have a Mac and
>that's the case there, and I often have to pass parameters as
>(long)NULL to get the right number of bytes passed. That's
>nothing to do with pointers per se; that's a "problem" of
???
If you are passing a long 0 to a function, you indeed should be casting
it to long:
	fseek( file, (long)0, 0 );

Or, use a long 0 constant:
	fseek( file, 0L, 0 );

But (long)NULL?  If you are using this to pass a 0L, then it is a misleading
use of the NULL symbol. If you are passing a NULL pointer and using a (long)
cast to make it the right size, you are making a mistake.

lint will tell you about this mistake; a function expects a pointer and is
being passed a long. There are machines where a long is not the same size as
a pointer, and there may well be machines where (long)0 does not give the
same bit pattern as a NULL pointer. If you are passing a NULL pointer
to a function expecting a (char *), just write (char *)NULL. Guaranteed to
work, no extra trouble, more self-explanatory.
-- 
----------------------------------------------------------------------
Greg Smith     University of Toronto      UUCP: ..utzoo!utcsri!greg
Have vAX, will hack...

holtz@sdcsvax.UUCP (01/29/87)

In article <12211@sun.uucp> guy@sun.UUCP (Guy Harris) writes:
>  ... much deleted ...
>No, I don't know of any machines offhand that are byte-addressable
>(but why is this relevant?  Do you think you'll never work on any
>machine that isn't?  I certainly wouldn't make that prediction about
>myself...) and where pointers to different types of objects have
>different representations (the size isn't the only characteristic of
>the representation that's important).  However, that's not an excuse
>to drop the casts.  The fact that something happens to work on the
>machines you know about doesn't say anything about whether it'll
>work on the machine you next have to deal with.

I assume you meant 'are not byte-addressable' in the above.  An example
of such a machine is the CDC Cyber 700 series, which used a 60-bit
word that was the only addressable entity.  Packed into a word could
be ten 6-bit characters or five 12-bit chars  (I know, YUCH!!).  Pointers
to char in such an environment are drastically different from pointers
to int or float.  Although I never used C on such a machine, Fortran 77's
pass by reference on CHARACTER variables created such a monstrous piece
of code that I could never again assume pointers to different objects
were always the same...  How would you like having to deal with a 60-bit
pointer to a single 6-bit character from an assembler routine, where the
lower 18 bits were the base address, the upper 18 bits the array offset
(if any), and somewhere in between came the bit offset?  The nightmares
still linger...   (but then that's what I get for using Fortran)
-- 

Fred Holtz
holtz@sdcsvax.UCSD.EDU

guy@gorodish.UUCP (01/29/87)

>I assume you meant 'are not byte-addressable' in the above.

No, I know lots of machines that are byte-addressable and lots of
machines that are not byte-addressable.  What I said, although it may
have been a bit hard to read due to a long parenthetical aside, was
that I don't know of any machines that are byte-addressable *and* that
have different sizes for "char *" and other pointers.

lmiller@venera.isi.edu.UUCP (01/30/87)

In article <12211@sun.uucp> guy@sun.UUCP (Guy Harris) writes:
>>Thanks for your comments on this matter. Unfortunately, most
>>of you misunderstood what I wanted. I realize that on some
>>machines ints are 16 bits and pointers are 32. I have a Mac and
>>that's the case there, and I often have to pass parameters as
>>(long)NULL to get the right number of bytes passed.
>>That's nothing to do with pointers per se; that's a "problem" of
>>dealing with long int parameters.
>
>The person who claimed that the reason you cast NULL to a particular
>pointer type is that pointers to different types might have different
>sizes is only partially correct.  The *real* reason is that pointers
>to different types have different types!  Too many C programmers feel
>that correct C is anything they can get away with, and too many
>programmers get burned by this.
>
> etc., etc.

For what it's worth, I'd like to add my two cents in support of this point
of view.  Though it seems ridiculous now, there may come a time when automatic
verification of programs is reasonable.  Verification is aided by languages
with well-defined semantics.  It is impossible when we use tricks.  So as
programmers who also give at least lip service to "computer science", we
should try to make our programs as formally correct as possible.  And as
has been repeatedly noted, doing so aids portability, maintainability,
etc., etc.