[comp.windows.x] What is the LENGTH target for a selection?

guy@auspex.auspex.com (Guy Harris) (08/07/90)

In the ICCCM, in Table 2, it says that the target atom LENGTH means the
"number of bytes in selection".  However, said table lists all sorts of
*other* targets, more than one of which represents a format in which a
selection can be delivered.  For example, some word processor might be
willing to deliver a text selection in:

	STRING form, by stripping formatting/font/etc. information out;

	TEXT form, by simply coughing its "native" encoding out;

	ODIF form, by mapping it to ODIF;

etc.

Does the "number of bytes in the selection" mean the number of bytes it
takes up in the application's native representation, the number of bytes
it takes up in STRING form, the number of bytes it takes up in ODIF
form, etc.?

XView probably thinks it's supposed to be the number of bytes in STRING
format, since LENGTH gets mapped to the {Sun,X}View SELN_REQ_BYTESIZE
selection and the SunView documentation says that's "the number of bytes
in the selection's ascii contents".  Is that the correct interpretation?
Is some other interpretation correct?  Or is this a "convention that's
not a convention", i.e. is there no agreed-on conventional meaning for
"number of bytes in selection"?