[comp.sys.mac.programmer] Pointer Subtraction in THINK C 3.0

swilson%thetone@Sun.COM (Scott Wilson) (08/05/88)

I was a little surprised when I ran this program under THINK C 3.0:

	#include <stdio.h>

	printi(i)
	int i;
	{
		printf("i = %d\n", i);
	}

	char buf[10];

	_main()		/* so I can redirect output */
	{
		char *cp1, *cp2;

		cp1 = buf;
		cp2 = buf + 5;
		printi(cp2 - cp1);
		printi((int) (cp2 - cp1));

		printi(&buf[5] - buf);
		printi((int) (&buf[5] - buf));
	}

and got the output:

	i = 0
	i = 5
	i = 0
	i = 5

It looks as though THINK C 3.0 is using long as the resulting type for
pointer subtraction.  (A four-byte long gets pushed on the stack instead
of a two-byte int; since the 68000 is big-endian, printi picks up the
long's upper two bytes, which are zero, so it prints 0.)
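
A quick way to confirm the size, by the way (a sketch I haven't actually
run under THINK C, but it should print 4 if the difference is a long and
2 if it's a plain int):

	#include <stdio.h>

	char buf[10];

	main()
	{
		char *cp1 = buf, *cp2 = buf + 5;

		/* sizeof the difference expression shows what type
		   the compiler gave it */
		printf("sizeof(cp2 - cp1) = %d\n", (int) sizeof(cp2 - cp1));
	}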

According to K&R:

	If two pointers to objects of the same type are subtracted,
	the result is converted ... to an int ...

However, K&R 2nd ed. (and I assume ANSI) says:

	If two pointers to objects of the same type are subtracted, ...
	The type of the result ... is defined as ptrdiff_t in the
	standard header <stddef.h>

Which definition does THINK C 3.0 claim to use?  Do they specify it?
Is there a stddef.h, and is ptrdiff_t defined to be long?
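
For what it's worth, here's how I'd expect the ANSI-portable version to
look (just a sketch, assuming a <stddef.h> really exists; I haven't been
able to check whether THINK C ships one):

	#include <stdio.h>
	#include <stddef.h>

	char buf[10];

	main()
	{
		ptrdiff_t d;

		d = &buf[5] - buf;		/* result type is ptrdiff_t */
		printf("d = %ld\n", (long) d);	/* print via long to be safe */
	}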

I'm sorry if this is an RTFM question, but I'm either at work with the C
manuals or at home with the THINK C manuals.

--
Scott Wilson		arpa: swilson@sun.com
Sun Microsystems	uucp: ...!sun!swilson
Mt. View, CA

clive@drutx.ATT.COM (Clive Steward) (08/08/88)

From article <63033@sun.uucp>, by swilson%thetone@Sun.COM (Scott Wilson):
> 
> It looks as though THINK C 3.0 is using long as the resulting type for
> pointer subtraction.

(does this seem wrong?)

[summarized]
> According to K&R: (subtracted pointers -> int) (pdp11 ptr size = int, right?)
> 	       ANSI: subtracted pointers -> mfgr defined, hopefully sensible

Well, maybe they got complaints, as I might have given last night, trying
to figure out why a program compiled with LSC 2.13 blew up when it had
worked fine under earlier versions.

I had:  txtmax = inbuf + (48 * 1024 - 1);	/* inbuf is malloc'd char[] */

    this produced a txtmax (the buffer limit) that came out _below_ inbuf.
    Apparently 2.13 followed the book rules again, where 3.0 deviates sensibly.

what worked, by the way: txtmax = inbuf + (48L * 1024L - 1L);
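
For the curious, here's a little sketch of the arithmetic that bit me
(48 * 1024 - 1 is 49151, which doesn't fit in a 16-bit signed int):

	#include <stdio.h>

	main()
	{
		int  n  = 48 * 1024 - 1;	/* overflows a 16-bit int; typically wraps to -16385 */
		long nl = 48L * 1024L - 1L;	/* done in long, stays 49151 */

		printf("int:  %d\n", n);
		printf("long: %ld\n", nl);
	}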

Personally, I think this is all a little too tricky, and am glad that
they've gone back to making the result of a subtraction the same size as
its arguments.  Seems it should never have been otherwise, and that the
K&R rule is just a slipup -- clearly two pointers can have a bigger
difference than an int can hold, depending on sizes.


Clive Steward

singer@endor.harvard.edu (Rich Siegel) (08/08/88)

In article <63033@sun.uucp> swilson@sun.UUCP () writes:

>It looks as though THINK C 3.0 is using long as the resulting type for
>pointer subtraction.  (A four-byte long gets pushed on the stack instead
>of a two-byte int; since the 68000 is big-endian, printi picks up the
>long's upper two bytes, which are zero, so it prints 0.)

	Right. Though I'm not a C god, my guess is that the difference of
two pointers is treated as a long (the size of a pointer), not an int.

	The solution? Either cast the result, or use prototypes.
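
For instance, something like this (just a sketch off the top of my head,
not tested):

	#include <stdio.h>

	void printi(int i);	/* prototype: a long argument is converted to int at the call */

	char buf[10];

	void printi(int i)
	{
		printf("i = %d\n", i);
	}

	main()
	{
		char *cp1 = buf, *cp2 = buf + 5;

		printi(cp2 - cp1);		/* OK: the prototype narrows the long difference */
		printi((int) (cp2 - cp1));	/* or cast it yourself, prototype or not */
	}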

		--Rich

Rich Siegel
Quality Assurance Technician
THINK Technologies Division, Symantec Corp.
Internet: singer@endor.harvard.edu
UUCP: ..harvard!endor!singer
Phone: (617) 275-4800 x305

awd@dbase.UUCP (Alastair Dallas) (08/12/88)

In article <63033@sun.uucp>, swilson%thetone@Sun.COM (Scott Wilson) writes:
> I was a little surprised when I ran this program under THINK C 3.0:
> 
>	<example deleted>
> 
> It looks as though THINK C 3.0 is using long as the resulting type for
> pointer subtraction.  (A four-byte long gets pushed on the stack instead
> of a two-byte int; since the 68000 is big-endian, printi picks up the
> long's upper two bytes, which are zero, so it prints 0.)
> 
> According to K&R:
> 
> 	If two pointers to objects of the same type are subtracted,
> 	the result is converted ... to an int ...
> 
> However, K&R 2nd ed. (and I assume ANSI) says:
> 
> 	If two pointers to objects of the same type are subtracted, ...
> 	The type of the result ... is defined as ptrdiff_t in the
> 	standard header <stddef.h>
> 

In an environment where pointers are 32-bit values (such as the Macintosh
and large-model MS-DOS) and unqualified ints are 16 bits (again, both the
Mac and MS-DOS), what's a compiler to do?  Clearly, there is the
possibility of data loss if ptr32 - ptr32 gets truncated to int16.  So,
yes, in the spirit of K&R, LSC renders an "int" of the minimum size
required to hold the result, in this case a 32-bit quantity (a long).
However, since you the programmer know that the difference between the
pointers will not be greater than 32767, you can safely cast (i.e.,
truncate) the pointer expression to an int.
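
If you'd rather not simply trust that assumption, here's a defensive
sketch: keep the difference in a long and check it before truncating.

	#include <stdio.h>

	char buf[10];

	main()
	{
		char *cp1 = buf, *cp2 = buf + 5;
		long d = cp2 - cp1;		/* hold the full 32-bit result */

		if (d >= -32768L && d <= 32767L)
			printf("fits in an int: %d\n", (int) d);
		else
			printf("difference %ld needs a long\n", d);
	}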

I wish THINK C were a little more on the mark when it comes to ANSI C--
I work with Microsoft C and Metaware High C on MS-DOS, and they are both
full-blown, up-to-the-minute ANSI draft implementations.  THINK merely
leans in ANSI's direction.  There are some nice preprocessor toys from
ANSI that I'd like to have, and the standard libraries and headers (like
<stddef.h> with its ptrdiff_t) would be nice, too.  But if you're using
the Mac libraries instead of the UNIX-compatible ones, it really doesn't
make too much difference.  <Insert my standard exhortations and praise
for THINK C here.>
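
In the meantime, a one-line stopgap I've been assuming would do on the
Mac (strictly my guess, not anything THINK supplies):

	/* my_stddef.h -- stand-in until a real <stddef.h> shows up */
	typedef long ptrdiff_t;	/* pointer differences are long-sized here */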

/alastair/

-- 

/* alastair */