MSRS002@ECNCDC.BITNET ("THE DOCTOR.") (11/08/87)
How big is a word?  It seems to be common practice to implement INTEGERS
and CARDINALS as 16 bit words, and use LONGINT and LONGCARD for 32 bits.
If you're developing for a 32 bit computer, would it be appropriate to
make INTEGERS and CARDINALS 32 bits and either drop longs or make them
64 bits?  I notice that Wirth writes "usually 16 bits", which would seem
to indicate any word size would be appropriate.  This would raise some
more portability questions.

My own opinion would be to implement 32 bit integers and such; I just
wanted to hear some other opinions.

Tom Ruby
MSRS002@ECNCDC
BOTCHAIR@UOGUELPH.BITNET (Alex Bewley) (11/08/87)
A word, as usual, is system dependent.  On my PC it is 16 bits, and
can't be changed to anything else.  Why would you want to?  Cardinals,
integers and words are all the same size to allow for type conversion
(if the need should arise).

Alex Bewley
'Just this guy'
BOTCHAIR@UOGUELPH
alan@pdn.UUCP (Alan Lovejoy) (11/11/87)
In article <INFO-M2%87110717121037@UCF1VM> Info-Modula2 Distribution List
<INFO-M2%UCF1VM.bitnet@jade.berkeley.edu> writes:
>How big is a word? It seems to be common practice to implement INTEGERS
>and CARDINALS as 16 bit words, and use LONGINT and LONGCARD for 32 bits.
>
>My own opinion would be to implement 32 bit integers and such, I just wanted
>to hear some other opinions.

I think that 16 bits should be a *minimum* size for a word, and that 32
bits should be a *minimum* size for a longword.  Also
SIZE(WORD) = SIZE(CARDINAL) = SIZE(INTEGER) and
SIZE(LONGWORD) = SIZE(LONGCARD) = SIZE(LONGINT).

If the instruction set of the machine supports longwords greater than 32
bits, then a LONGWORD should be that size.  A WORD should be an
intermediate size between the shortest (byte?) and the longest operand
size.  LONGWORDs should be bigger than WORDs.

The BSI M2 committee intends to have 3 standard sizes for all number
types: "short", "normal" and "long".  A "SHORTWORD" is probably a byte.

--alan@pdn
schaub@sugar.UUCP (Markus Schaub) (11/12/87)
In article <INFO-M2%87110717121037@UCF1VM>, MSRS002@ECNCDC.BITNET
("THE DOCTOR.") writes:
> If you're developing for a 32 bit computer, would it be appropriate to make
> INTEGERS and CARDINALS 32 bits and either drop longs or make them 64 bits?
> I notice that Wirth writes "usually 16 bits" which would seem to indicate

Although it is a 32-bit computer, I would prefer CARDINAL and INTEGER
types with 16 bits, to save some space in RECORD and ARRAY definitions.
If you pack subranges ([0..255] -> 1 byte; [0..65535] -> 2 bytes) you
can use 32 bit CARDINALs and INTEGERs.  Finally, on a 32 bit machine you
have 32 x 32 bit multiplication, so why mess with 64 bit types when you
can use the same techniques used today for 32 bit types (routines in the
run-time system)?

-Markus
--
 // Markus Schaub      uunet!nuchat!sugar!schaub      (713) 523 8422
\X/ c/o Interface Technologies Corp, 3336 Richmond #323, Houston Tx 77098