mario@r3.cs.man.ac.uk (06/06/89)
I have a few questions concerning the semantics and implementation of the
BITS construct to appear in the latest Eiffel release.

As I understand it, one use of this construct is to define a "root" class,
BITS 32, from which all user-defined classes will implicitly inherit.  In
implementation terms, this means that all object identifiers must be no
more than 32 bits long.  In addition, some predefined classes such as
Integer and Real will also inherit from BITS 32.  This suggests to me that
I can now define a variable of type BITS 32, and assign both integers and
instances of user-defined classes to it, thus:

    v : BITS 32;
    s : Stack;
    ...
    v := 42;
    ...
    s.Create;
    v := s;

If this is so, how does the underlying run-time system distinguish between
these objects without reserving more than 32 bits for the variable?  The
usual solution to this problem (in Smalltalk, for example) is to reserve
one or more bits of each word as a tag, indicating whether the rest of the
word represents an integer or a pointer.  Of course, this restricts the
size of an integer to less than 32 bits, which is inconsistent with a
declaration of BITS 32.
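
To make the tagging scheme concrete, here is a rough C sketch of what I
have in mind (my own illustration of the Smalltalk-style technique, not
anything taken from the Eiffel implementation; the names and the one-bit
tag are made up for the example, and 32-bit pointers are assumed):

    #include <assert.h>
    #include <stdint.h>

    /* One 32-bit word.  The low bit is the tag:
       tag == 1: the other 31 bits hold a signed integer;
       tag == 0: the word is a pointer (objects are word-aligned,
                 so genuine pointers always have a 0 low bit). */
    typedef uint32_t word;

    #define IS_INT(w)  (((w) & 1u) != 0)

    static word make_int(int32_t i)     /* only 31 bits survive */
    {
        return ((uint32_t)i << 1) | 1u;
    }

    static int32_t int_value(word w)
    {
        return (int32_t)w >> 1;         /* arithmetic shift restores the sign */
    }

    static word make_ref(void *p)       /* assumes pointers fit in 32 bits */
    {
        return (word)(uintptr_t)p;
    }

    int main(void)
    {
        word v = make_int(42);
        assert(IS_INT(v) && int_value(v) == 42);

        word r = make_ref(&v);          /* aligned address: tag bit is 0 */
        assert(!IS_INT(r));

        /* The price of the tag: the largest immediate integer is
           2^30 - 1 = 1073741823, not the 2^31 - 1 of a full word. */
        assert(int_value(make_int(1073741823)) == 1073741823);
        return 0;
    }

Under any such scheme an "integer" stored in a BITS 32 variable has at
most 31 bits, which leads me to the alternatives below.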

So, either:

1. You can't mix integers and "normal" objects, or
2. A BITS 32 variable needs more than 32 bits, or
3. A BITS 32 integer is smaller than 32 bits, or
4. Something completely different is going on.

Any enlightenment would be most welcome.

Finally, it would seem that each class, BITS n, exports several operations
appropriate to that class: logical and, or and not, and the constants true
and false.  Does this mean that I can now perform meaningless operations,
such as "or-ing" a string with a window, by assigning them to variables of
type BITS 32?

Mario Wolczko

                 ______      Dept. of Computer Science   Internet: mario@ux.cs.man.ac.uk
               /~      ~\    The University              USENET: mcvax!ukc!man.cs.ux!mario
              (    __    )   Manchester M13 9PL          JANET: mario@uk.ac.man.cs.ux
               `-':  :`-'    U.K.                        Tel: +44-61-275 6146 (FAX: 6280)
            ____;  ;_____________the mushroom project____________________________________