[net.unix-wizards] NULL vs 0 - chapter and verse

crl@pur-phy.UUCP (Charles LaBrec) (01/21/84)

For the most part, I stand corrected.  It's amazing that when I've read
the manual before, the PRECISE meaning of NULL escaped me.  Rereading
the sections showed me that most of us have been wrong about NULL.
Only the *assignment* or *comparison* of 0 to a pointer produces a NULL
pointer.  A NULL pointer is *not* 0.  Therefore, I think in the future I
will not use "foo(0)" (with or without NULL).  However, it might still
be argued that this is legal C, since the manual also states that
"In preparing for the call to a function, *a copy is made of each actual
parameter*" (pg. 186, italics mine).  Sounds like an assignment, doesn't
it?

Guy however pointed out something about the 32/16 problem that does break
the standard--the fact that the subtraction of two pointers produces an
int that is the number of elements between the two pointers.  I don't see
a "portable" way of resolving this difficulty.

Since I was wrong about "NULL == 0", that part of my argument is invalid.
(By the way, I meant "NULL pointer" rather than what stdio #defines it to
be.)  However, I stand by the spirit of the paragraph--an implementation
must adhere to the letter of the manual, or the implementation is flawed.
I won't go so far as to say that "it ain't C", but it does make
portability a *big* problem.

By the way, I don't necessarily consider a program that does/doesn't
pass through lint to be correct/incorrect.  While this is almost always
the case, it is still someone's interpretation of the standard (I say
"someone" rather loosely, including corrections made over the years).

Thanks, Guy, for the exact quotes.

Charles LaBrec
UUCP:		pur-ee!Physics:crl, purdue!Physics:crl
INTERNET:	crl @ pur-phy.UUCP

p.s. Does everyone consider structure assignment and function passing and
     enums (flawed though they may be) to be part of the "official" standard?
     Has anyone heard of an updated version of the reference manual?

guy@rlgvax.UUCP (Guy Harris) (01/22/84)

The "standard" is best described as "'Version 7 Phototypesetter' C", rather than
V7 C, System III C, or System V C, or System V Release 2 C, or 4.xBSD C.  (There
are still a few archaic systems running "early V6", without the new system
calls, without Standard I/O, and without a "modern" C with "long", "typedef",
etc., etc. - but life goes on and most people assume that all UNIXes have V7
capabilities or more).  V7 added "enum"s and structure assignment, structure
function arguments, and structure-valued functions; System III C added "void"
(although PCC had "void" before it became "official") and new rules for
structure/union member names; System V enforced some parts of the standard
more strictly (old-style initializations and assignment operators were
decommissioned, and a multiple-module program could have only *one* definition
of an external variable); and System V Release 2 supported long variable names,
as did 4.xBSD C, which is otherwise System III C.  There may be some V7s out
there that don't support "enum" or structures-as-full-fledged-objects.

At this point I'd like to see the C Reference Manual reissued, describing at
least V7 C and probably System III C (4.xBSD, S3, and S5 support this, and
more and more systems are either S3/S5 based, or are going with 4.2BSD; the
PDP-11 V7s out there could probably pick up the S3 or S5 PDP-11 C compiler
without too much trouble).

	Guy Harris
	{seismo,ihnp4,allegra}!rlgvax!guy

alan@allegra.UUCP (Alan S. Driscoll) (01/22/84)

	p.s. Does everyone consider structure assignment and function passing
	     and enums (flawed though they may be) to be part of the "official"
	     standard?  Has anyone heard of an updated version of the reference
	     manual?


Yes, there is an updated version of the C Reference Manual, dated
September, 1980.  I have no idea whether it is available outside
of Bell Labs.  It talks about enum's and assignment of structures.


	Alan S. Driscoll
	AT&T Bell Laboratories

henry@utzoo.UUCP (Henry Spencer) (01/24/84)

Charles LaBrec observes:

  Guy however pointed out something about the 32/16 problem that does break
  the standard--the fact that the subtraction of two pointers produces an
  int that is the number of elements between the two pointers.  I don't see
  a "portable" way of resolving this difficulty.

Lots of fun.  In fact, it gets worse.  If you have a 32-bit machine
that uses all 32 bits for pointers, there may be *no* C data type big
enough to hold the difference of two pointers.  The difference should
clearly be signed (positive or negative depending on whether a > b or
a < b), but with 32 bits of pointer an unsigned long is just barely
large enough to hold the magnitude, and there's no room for the sign.

The ANSI-C-standard people are aware of this one (in fact, I first heard
about it from one of them); it's not clear that they can come up with
a clean and simple solution.
-- 
				Henry Spencer @ U of Toronto Zoology
				{allegra,ihnp4,linus,decvax}!utzoo!henry

mike@hcr.UUCP (Mike Tilson) (02/17/84)

Pointer subtraction on a 16/32 bit implementation does not break the
standard; it is simply an implementation restriction.  The subtraction
of two 32 bit pointers may well yield a 16 bit int -- so what?  In
practice, that means that under such an implementation it is unreasonable
to have single storage objects with more than 16 bits' worth of elements.
That is a restriction, not a logical inconsistency.

/Mike Tilson	decvax!hcr!hcrvax!mike