[comp.windows.x] 29 bit resource id and garbage collection

sxm@philabs.philips.com (Sandeep Mehta) (11/02/89)

This question (or part of it) may be outdated depending on the current
state of the X11 protocol. 

According to the original ACM TOG paper, the server allocates 29-bit
resource identifiers to simplify implementation in languages which
have garbage collection built in.

Could someone explain how the resource id size, or not using the high
bits, affects GC?  This question probably has more to do with GC in
programming languages than with X itself.

thanks in advance.

sandeep 
--
sxm@philabs.philips.com
uunet!philabs!bebop!sxm                             ...to be or not to bop ?

jg@max.crl.dec.com (Jim Gettys) (11/02/89)

Languages like lisp, CLU, etc. often distinguish a pointer to some
object from an integer internally by having the high order bit(s) of
the integer set.  Among other things, this means that when a garbage
collector goes rummaging around to find garbage, it can tell objects
that may need garbage collection from integers on the stack.  Some
languages/implementations use as many as three bits of a word for
this purpose.
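
To make that concrete, here is a minimal sketch in C of one such
high-tag scheme.  The tag layout and names are hypothetical, not
taken from any particular Lisp or CLU implementation:

/* Hypothetical high-tag scheme: the top 3 bits of a 32-bit word are
 * a type tag, so an immediate integer (fixnum) carries at most 29
 * value bits.
 */
#include <stdint.h>
#include <stdio.h>

#define TAG_SHIFT   29u                 /* top 3 bits reserved for tags */
#define TAG_MASK    (0x7u << TAG_SHIFT)
#define TAG_FIXNUM  (0x4u << TAG_SHIFT) /* integers have a high bit set */
#define TAG_POINTER 0x0u                /* heap references are untagged */

static uint32_t make_fixnum(uint32_t value)  /* value must be < 2^29 */
{
    return value | TAG_FIXNUM;
}

/* The collector classifies a word found on the stack by its tag:
 * only untagged words are treated as references worth tracing. */
static int is_heap_reference(uint32_t word)
{
    return (word & TAG_MASK) == TAG_POINTER;
}

int main(void)
{
    uint32_t w = make_fixnum(0x001ABCDEu);   /* a 29-bit X resource ID */
    printf("GC traces it? %s\n", is_heap_reference(w) ? "yes" : "no");
    return 0;
}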

But the real point is that if an ID could have one of these bits on,
then all IDs in lisp might have to be represented by bignums, causing
a substantial performance penalty for such languages.  Better to just
avoid generating problems for people working in those languages by
requiring that IDs not use those bits.
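
A rough sketch of the resulting fast/slow path, again with
hypothetical names (fixnum/bignum in the usual Lisp sense).  A 29-bit
resource ID always stays on the cheap immediate path:

/* With 3 tag bits, only values below 2^29 fit in an immediate
 * fixnum; a full 32-bit ID with high bits set would have to be boxed
 * as a heap-allocated bignum.
 */
#include <stdint.h>
#include <stdio.h>

#define FIXNUM_LIMIT (1u << 29)   /* 29 usable bits, as in the protocol */

static const char *representation(uint32_t id)
{
    return id < FIXNUM_LIMIT ? "immediate fixnum"
                             : "heap-allocated bignum";
}

int main(void)
{
    printf("0x1FFFFFFF -> %s\n", representation(0x1FFFFFFFu)); /* fixnum */
    printf("0xA0000000 -> %s\n", representation(0xA0000000u)); /* bignum */
    return 0;
}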
					- Jim Gettys

rws@EXPO.LCS.MIT.EDU (Bob Scheifler) (11/05/89)

    But several lisp implementations I know of use the low order bits
    for type codes.

This is irrelevant (think about it).  What matters is how many bits
are available, not which bit positions an integer's bits are shifted
into in a particular implementation.  (CLU also uses the low order
bits on the 68000, and the high order bits on the VAX, for type
codes.)
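
To illustrate the point, a sketch of a hypothetical low-tag layout in
C: the integer's value is shifted up past the tag, so 29 value bits
remain available just as with high-order tags:

/* Hypothetical low-tag scheme: the low 3 bits of a word are the type
 * tag and the value is stored shifted left, so the representable
 * range is the same as with high-order tags.
 */
#include <stdint.h>
#include <stdio.h>

#define LOW_TAG_BITS 3u
#define TAG_FIXNUM   0x0u   /* low 3 bits identify an immediate integer */

static uint32_t make_fixnum(uint32_t value)   /* value must be < 2^29 */
{
    return (value << LOW_TAG_BITS) | TAG_FIXNUM;
}

static uint32_t fixnum_value(uint32_t word)
{
    return word >> LOW_TAG_BITS;
}

int main(void)
{
    uint32_t xid = 0x1FFFFFFFu;        /* largest 29-bit resource ID */
    uint32_t w = make_fixnum(xid);
    printf("round-trips? %s\n", fixnum_value(w) == xid ? "yes" : "no");
    return 0;
}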