[comp.lang.c] bits in an int vs. long?

logan@inpnms.UUCP (James Logan) (10/04/89)

I would like to take a poll: do any modern compilers on a 680x0,
80386, or RISC architecture use anything besides 32 bits for
their ints and longs?  Please email any comments on this.

My current project has the following definitions that I must
choose from when using UNIX system calls:  

#define LONG	long
#define BYTE	unsigned char
#define CHAR	char
(others, such as UWORD, etc.)

For a while I was using the variable types that the section 2 & 3
man pages declare to interface with the system calls and library
routines, and using the #define'ed types when sending and
receiving data to and from foreign microprocessors.  Now I have
been directed to use these #define'ed types for EVERYTHING.  :-(

There is no definition for int, so I have to use LONG.  The only
time I can see this falling apart is if we port to a UNIX system
with an odd-sized int or long.  (I realize that it is wrong to make
assumptions about the number of bits in an int or a long, BTW.  I
just can't convince anyone else.)

Unless there is a clear real-world argument against the
assumption that int's and long's are the same size, I will have
to treat the two as interchangeable.  Comments?

-- 
James Logan                       UUCP: uunet!inpnms!logan
Data General Telecommunications   Inet: logan%inpnms@uunet.uu.net
(301) 590-3069

jnh@ecemwl.ncsu.edu (Joseph N. Hall) (10/05/89)

The widely-used THINK C compiler for the Macintosh uses 16-bit ints.  MPW
C (from Apple), on the other hand, uses 32-bit ints.

v   v sssss|| joseph hall                      || 4116 Brewster Drive
 v v s   s || jnh@ecemwl.ncsu.edu (Internet)   || Raleigh, NC  27606
  v   sss  || SP Software/CAD Tool Developer, Mac Hacker and Keyboardist
-----------|| Disclaimer: NCSU may not share my views, but is welcome to.

gwyn@smoke.BRL.MIL (Doug Gwyn) (10/05/89)

In article <181@inpnms.UUCP> logan@inpnms.UUCP (James Logan) writes:
>Unless there is a clear real-world argument against the
>assumption that int's and long's are the same size, I will have
>to treat the two as interchangeable.  Comments?

I don't have a good solution for the problem you're stuck with,
since such assumptions should never have been made in the first
place.

The AT&T MC680x0 compiler I use is configured for sizeof(int)==2
and sizeof(long)==4.  That's a fairly common choice on smaller
CPU architectures.

I haven't seen many C implementations where longs are 64 bits,
but there are some and it is likely to become more common in the
future.

frotz@drivax.UUCP (Frotz) (10/06/89)

logan@inpnms.UUCP (James Logan) writes:

>There is no definition for int, so I have to use LONG.  The only
>time I can see this falling apart is if we port to a UNIX system
>with an odd-sized int or long.  (I realize that it is wrong to make
>assumptions about the number of bits in an int or a long, BTW.  I
>just can't convince anyone else.)

>Unless there is a clear real-world argument against the
>assumption that int's and long's are the same size, I will have
>to treat the two as interchangeable.  Comments?

Compilers for the Intel 186, 286, and 386 processors all use 16-bit
ints.  The i80386 allows 32-bit ints, but you need a 386 code
generator to get them.  If you are using a compiler that DOES NOT
GENERATE 386 code, you will most likely NOT get 32-bit ints...

It is my understanding that:

	sizeof(char) < sizeof(short) < sizeof(long)
	sizeof(short) == 2
	sizeof(long)  == 4

	'int' may be defined as a 'short' or a 'long' depending on the
hardware...  I have heard that there are processors in the world that
use 20-bit integers???

"My two bits...  clink... clink..."
--
Frotz

davidsen@crdos1.crd.ge.COM (Wm E Davidsen Jr) (10/06/89)

In article <252BB03A.2AEC@drivax.UUCP>, frotz@drivax.UUCP (Frotz) writes:

|  It is my understanding that:
|  
|  	sizeof(char) < sizeof(short) < sizeof(long)
|  	sizeof(short) == 2
|  	sizeof(long)  == 4

Actually sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long)

All of the integer types are defined as having a minimum number of
bits (or the equivalent minimum range, I don't remember which).  I
suspect that char could be the same size as short if someone made
chars large enough (short is 16 bits minimum).

This from the bible according to St ANSI ;-)

  On some machines the sizes of short, int, and long are identical.
Extrapolate to the unsigned types and to float/double, too.
-- 
bill davidsen	(davidsen@crdos1.crd.GE.COM -or- uunet!crdgw1!crdos1!davidsen)
"The world is filled with fools. They blindly follow their so-called
'reason' in the face of the church and common sense. Any fool can see
that the world is flat!" - anon