d25001@mic.UUCP (10/24/86)
>MSC 4.0 defaults 'char' to 'signed char'. For standard ASCII there
>is no difference between 'signed char' and 'unsigned char'. When I
>get to IBM's extensions to ASCII the situation is much different!
>2) What possible justification is there for this default? Is not
>'char' primarily a logical (as opposed to mathematical) quantity? What
>I mean is, what is the definition of a negative 'a'? I can understand
>the desirability of allowing 'signed char' for gonzo programmers who
>won't use 'short', or who want to risk future compatibility of their
>code on the bet that useful characters will always remain 7-bit entities.
>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>
>Peace,
>
>Jeffrey William Gillette
>Humanities Computing Facility
>Duke University
>    duke!jwg

Just yesterday I received my upgrade to Microsoft Windows.  The new
documentation that comes with the upgrade talks about a 256-character
ANSI character set and gives a table of all of the printables.  Alas,
they do not document the number of the standards document from which
this list was drawn.  Anybody know more about this?

This is definitely NOT the usual IBM PC extended character set; many of
the characters appear in both sets, but not with the same bit patterns.

Carrington Dixon
UUCP: { convex, infoswx, texsun!rrm }!mcomp!mic!d25001