[comp.sys.mac.programmer] THINKC 4.0.2 #define is short

tcm@moscom.UUCP (Tom Maszerowski) (11/14/90)

Here's a gotcha in THINK C 4.0.2 that got me.  It's probably in the
manual somewhere, but a cursory glance didn't reveal it ( I couldn't
find any reference to #define or the preprocessor anywhere in the index).
Anyway, this is what happened:

given:

#define   MaxAllocation  (1<<16)

sets MaxAllocation to 0.  

I believe that this is because the preprocessor assumes that
MaxAllocation is an int ( 16 bits ) but the result of the shift is
0x10000, which requires a long ( 32 bits ).  I realize that compiler
implementers are free to make an int any size they want, but this is the
first 68000 C compiler I've used where sizeof(int) != sizeof(long).  By
replacing the shift with 0x10000, MaxAllocation was defined correctly. 
BTW, this was in code I ported; I wouldn't use this style when it's
simpler to just use an explicit number. 
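
A minimal sketch of the gotcha, assuming a compiler with a 16-bit int such
as THINK C (the macro names here are illustrative, not from the original
code):

#include <stdio.h>

#define MaxAllocShift   (1 << 16)   /* comes out 0 when int is only 16 bits  */
#define MaxAllocLiteral 0x10000     /* workaround: too big for a 16-bit int,
                                       so the constant gets type long        */

int main(void)
{
    long a = MaxAllocShift;
    long b = MaxAllocLiteral;

    printf("shift version:   %ld\n", a);   /* 0 under a 16-bit int compiler */
    printf("literal version: %ld\n", b);   /* 65536 */
    return 0;
}

On a compiler with 32-bit int both lines print 65536, which is why the
ported code looked fine elsewhere.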

Is this in UMPG? I've ftp'd it but haven't printed anything yet.

-- 
Tom Maszerowski	tcm@moscom.com
		{rit,tropix,ur-valhalla}!moscom!tcm
*** Note Changed uucp Address ***

hanche@imf.unit.no (Harald Hanche-Olsen) (11/14/90)

In article <2374@moscom.UUCP> tcm@moscom.UUCP (Tom Maszerowski) writes:

   #define   MaxAllocation  (1<<16)

   sets MaxAllocation to 0.  

   I believe that this is because the preprocessor assumes that
   MaxAllocation is an int ( 16 bits )

Nope, the preprocessor assumes no such thing.  It does only textual
substitution, so wherever MaxAllocation appears in the program it's as
if it said (1<<16) instead.  You're on the right track, though...

   but the result of the shift is 0x10000, which requires a long ( 32 bits ).

Exactly.  To get the desired result, use (1L<<16).  AND make sure the
program doesn't try to stuff the result into an int...
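
To make the second caveat concrete, a small sketch (not Harald's code),
assuming THINK C's 16-bit int:

#include <stdio.h>

#define MaxAllocation (1L << 16)

int main(void)
{
    long ok     = MaxAllocation;        /* 65536, as intended                */
    int  whoops = (int) MaxAllocation;  /* upper bits lost if int is 16 bits */

    printf("long: %ld\n", ok);
    printf("int:  %d\n", whoops);
    return 0;
}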

   I realize that compiler
   implementers are free to make an int any size they want, but this is the
   first 68000 C compiler I've used where sizeof(int) != sizeof(long).

Yes, it's a pain, isn't it?  However, be aware that there is nothing at
all in the standard (ANSI or K&R) that says int == long; it's just that
programmers are so used to this always being the case that they tend
to be sloppy about it.  Writing portable code just ain't easy!
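
One defensive habit for ported code, sketched here as an illustration rather
than anything from the thread, is to make the int-size assumption explicit
with <limits.h>:

#include <stdio.h>
#include <limits.h>

#if INT_MAX >= 0x10000L
#define MaxAllocation (1 << 16)    /* int is wide enough to hold 65536   */
#else
#define MaxAllocation (1L << 16)   /* 16-bit int: force long arithmetic  */
#endif

int main(void)
{
    printf("MaxAllocation = %ld\n", (long) MaxAllocation);
    return 0;
}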

- Harald Hanche-Olsen <hanche@imf.unit.no>
  Division of Mathematical Sciences
  The Norwegian Institute of Technology
  N-7034 Trondheim, NORWAY

nick@cs.edinburgh.ac.uk (Nick Rothwell) (11/14/90)

In article <2374@moscom.UUCP>, tcm@moscom.UUCP (Tom Maszerowski) writes:
> 
> Here's a gotcha in THINK C 4.0.2 that got me.  It's probably in the
> manual somewhere, but a cursory glance didn't reveal it ( I couldn't
> find any reference to #define or the preprocessor anywhere in the index).
> Anyway, this is what happened:
> 
> given:
> 
> #define   MaxAllocation  (1<<16)
> 
> sets MaxAllocation to 0.  

Er, no, the #define is just textual substitution. It doesn't know about
ints, longs, or anything (perhaps besides comments and newlines and stuff
for macro parameter substitution).

I think "1<<16", in isolation, is meaningless in THINK C (well, it's 0).
Try "1L << 16" or something.

-- 
Nick Rothwell,	Laboratory for Foundations of Computer Science, Edinburgh.
		nick@lfcs.ed.ac.uk    <Atlantic Ocean>!mcsun!ukc!lfcs!nick
~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~
 "Now remember - and this is most important - you must think in Russian."

phils@chaos.cs.brandeis.edu (Phil Shapiro) (11/14/90)

In article <2374@moscom.UUCP> tcm@moscom.UUCP (Tom Maszerowski) writes:
   Here's a gotcha in THINK C 4.0.2 that got me.  It's probably in the
   manual somewhere, but a cursory glance didn't reveal it ( I couldn't
   find any reference to #define or the preprocessor anywhere in the index).
   Anyway, this is what happened:

   given:

   #define   MaxAllocation  (1<<16)

   sets MaxAllocation to 0.  

   I believe that this is because the preprocessor assumes that
   MaxAllocation is an int ( 16 bits ) but the result of the shift is
   0x10000, which requires a long ( 32 bits ). [ ... ]

Close.  Since ThC's int size (or 'word' size) is 16 bits, integer
constants that don't need to be long aren't made long.  If you
specify a constant that fits in the 16-bit integer range, it is stored
in 16 bits, and consequently the expression is evaluated using 16-bit
arithmetic.  To force an extension to a long integer, you can use:

#define MaxAllocation ((long)1 << 16)

or (preferred):

#define MaxAllocation (1L<<16)

To write strictly ANSI standard code, you would probably want to
use unsigned integers here, since the behavior of bit shifts on signed
integers is not fully specified by the ANSI standard.
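
In that spirit, the shift can be done on an unsigned long, assuming the
compiler accepts the ANSI UL suffix (a sketch along the lines of the advice
above, not verbatim from it):

#include <stdio.h>

#define MaxAllocation (1UL << 16)   /* unsigned long: no signed-shift issues */

int main(void)
{
    printf("MaxAllocation = %lu\n", (unsigned long) MaxAllocation);
    return 0;
}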

	-phil
--
   Phil Shapiro                           Technical Support Analyst
   Language Products Group                     Symantec Corporation
		Internet: phils@chaos.cs.brandeis.edu

pepke@gw.scri.fsu.edu (Eric Pepke) (11/14/90)

In article <2374@moscom.UUCP> tcm@moscom.UUCP (Tom Maszerowski) writes:
> I believe that this is because the preprocessor assumes that
> MaxAllocation is an int ( 16 bits ) but the result of the shift is
> 0x10000, which requires a long ( 32 bits ).

No, it depends on how you use it.  Preprocessing is textual.  Try this bit 
of code:

#include <stdio.h>

#define foo (1 << 16)
#define bar (1L << 16)

int main(void)
{
    long face, bork;

    printf("Foo = %ld\n", (long) foo);
    printf("Bar = %ld\n", (long) bar);
    face = foo;
    bork = bar;
    printf("Face = %ld\n", face);
    printf("Bork = %ld\n", bork);
    return 0;
}

It produces

Foo = 65536
Bar = 65536
Face = 0
Bork = 65536

> I realize that compiler
> implementers are free to make an int any size they want, but this is the
> first 68000 C compiler I've used where sizeof(int) != sizeof(long).

Whether this is "proper" or not is the subject of one of the loudest and 
least productive of the periodic flame wars that grace this newsgroup.  
It's like zen and Mount Everest: it's just there.  

Eric Pepke                                    INTERNET: pepke@gw.scri.fsu.edu
Supercomputer Computations Research Institute MFENET:   pepke@fsu
Florida State University                      SPAN:     scri::pepke
Tallahassee, FL 32306-4052                    BITNET:   pepke@fsu

Disclaimer: My employers seldom even LISTEN to my opinions.
Meta-disclaimer: Any society that needs disclaimers has too many lawyers.

drs@max.bnl.gov (Dave Stampf) (11/15/90)

In article <2374@moscom.UUCP> tcm@moscom.UUCP (Tom Maszerowski) writes:
>
>Here's a gotcha in THINK C 4.0.2 that got me.  It's probably in the
>manual somewhere, but a cursory glance didn't reveal it ( I couldn't
>find any reference to #define or the preprocessor anywhere in the index).
>Anyway, this is what happened:
>
>given:
>
>#define   MaxAllocation  (1<<16)
>
>sets MaxAllocation to 0.  
>
>I believe that this is because the preprocessor assumes that
>MaxAllocation is an int ( 16 bits ) but the result of the shift is
>0x10000, which requires a long ( 32 bits ).  I realize that compiler
>implementers are free to make an int any size they want, but this is the
>first 68000 C compiler I've used where sizeof(int) != sizeof(long).  By

	The preprocessor assumes nothing.  The word MaxAllocation is
replaced as *text* everywhere with (1<<16).  When you shift a 16-bit integer
16 places, you get 0.  To get what you want, try

#define MaxAllocation (1L<<16)

There are many compilers for 68000s and PCs that have sizeof(int) !=
sizeof(long).  In fact, the reason for giving programmers the sizeof operator
is precisely to allow for varying architectures - get used to it. 
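
As a quick illustration of that point (not from Dave's post), the sizes are
easy to check on any compiler:

#include <stdio.h>

int main(void)
{
    /* THINK C 4.x reports 2 and 4 here; many other 68000 compilers of the
       era report 4 and 4. */
    printf("sizeof(int)  = %lu\n", (unsigned long) sizeof(int));
    printf("sizeof(long) = %lu\n", (unsigned long) sizeof(long));
    return 0;
}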

	By the way, I suspect that you will frequently run into the problem 
of Think C routines returning an int when you *know* that the routine returns
a something*.  I have a note above my Mac that says "int != char*".  You may
want to do the same - it has saved me *hours* of debugging.
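
The trap Dave describes usually comes from a missing declaration: without
one, a pre-ANSI compiler assumes the routine returns int, and with a 16-bit
int that silently mangles a returned pointer.  A sketch with hypothetical
names:

#include <stdio.h>
#include <string.h>

/* Without this declaration in scope, an old-style compiler assumes
   GetName() returns int; with 16-bit ints the 32-bit pointer it actually
   returns would be truncated - hence the "int != char*" reminder. */
char *GetName(void);

static char name[16];

char *GetName(void)
{
    strcpy(name, "Macintosh");
    return name;
}

int main(void)
{
    char *p = GetName();   /* safe only because the declaration above exists */
    printf("%s\n", p);
    return 0;
}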

	< dave

(I agree, however, that it would be helpful if Symantec would describe the
operation of the preprocessor, and perhaps even provide a mode, like the -E
option on other compilers, where you could see the preprocessed text that is
fed to the compiler.)