fostel@ncsu.UUCP (Gary Fostel) (05/11/84)
C'mon, it's not hard to imagine how the character set could make quite a difference -- lots of those very unpleasant sorts of problems. Sign extension: EBCDIC uses all eight bits -- are YOU sure your code/compiler is up to the stringent distinction between signed and unsigned excitement? Hash codes developed for ASCII values may perform differently with EBCDIC input. EBCDIC has a different layout pattern -- ASCII is quite linear and dense, while EBCDIC is, well, let's just say it's different; go find a green card (or is it yellow now?) and take a look. How many tricks do we all use from time to time, assumptions like lower case "b" and upper case "B" being related by a constant? Is it true in EBCDIC? And we all know that "B" = "A" + 1, right? But is it true in EBCDIC? And of course then there are a few peculiarities of EBCDIC in terms of where the control characters go. Imagine if you had to go through your code and find every place you ever stuck in an octal constant to be compared with, say, form feed, newline, tab, escape! For gosh sakes, those heathens at Big Blue don't use the same values! Now ask yourself: would Y-O-U want the job of completing this list, little by little, as you tried to convert your code to an EBCDIC system? Any wonder your counterpart at Amdahl came up with good reasons not to try? (By the way, the magic ASCII/EBCDIC bit in the PSW, which some of you may recall seeing in the IBM Principles of Operation, was never used and was dropped in the transition from 360 to 370. Any guesses why it was there originally?)

----GaryFostel----
jlw@ariel.UUCP (05/12/84)
The original IBM 360 assembler that we used in Kingston, NY in 1966 was actually a cross assembler which ran on IBSYS on a 7094 across the road. We used a three tape rotation system. This is probably why the 360 had an ASCII bit in the PSW. Some of the code we were working on in those days can still be found in the SE's room. CEDA = CPU Error Detection and Analysis and SEREP = System Environment Recording, Edit, and Print (prints out the machine check area after a machine check; what's where is very model dependent). Joseph L. Wood, III AT&T Information Systems Laboratories, Holmdel (201) 834-3759 ariel!jlw
guy@rlgvax.UUCP (05/13/84)
> The original IBM 360 assembler that we used in Kingston, NY in
> 1966 was actually a cross assembler which ran on IBSYS on a
> 7094 across the road. We used a three tape rotation system.
> This is probably why the 360 had an ASCII bit in the PSW.

But didn't the 7094 use BCDIC (the six-bit predecessor to EBCDIC) rather than ASCII? There was probably an ASCII bit there because IBM wanted to be able to support 8-bit ASCII when it came out - 7-bit ASCII was already out (or coming out soon), and there was an 8-bit version in the works - but it turned out the final version of 8-bit ASCII was not going to be what IBM expected it to be and not what the 360 implemented. For that reason, and because converting all the 360 hardware and software to ASCII would have been a major undertaking, IBM canned the ASCII mode in the 370.

Guy Harris
{seismo,ihnp4,allegra}!rlgvax!guy
stan@clyde.UUCP (Stan King) (05/14/84)
I'm not sure whether this point is clearly understood by the participants of net.arch: the USASCII-8 mode bit of the System/360 PSW was used ONLY for determining the representation of decimal results. NOTHING else in the hardware architecture makes assumptions about character set. I welcome people to prove me wrong, by mail, and I will summarize replies. I am wearied by statements that "IBM is an EBCDIC machine"; it is so, but only for the purposes of the decimal (radix=10) instructions. My point of reference is GA22-7000-4, File No. S/370-01, Sept. 1974.

Stan King                    phone: 201-386-7433
Bell Labs, Whippany, NJ      Cornet: 8+232-7433
room 2A-111                  uucp: clyde!stan
guy@rlgvax.UUCP (Guy Harris) (05/14/84)
> I'm not sure whether this point is clearly understood by the
> participants of net.arch: the USASCII-8 mode bit of the System/360 PSW
> was used ONLY for determining the representation of decimal results.

The point was clearly understood by at least some of us.

> NOTHING else in the hardware architecture makes assumptions about
> character set. I welcome people to prove me wrong, by mail, and I
> will summarize replies. I am wearied by statements that "IBM is an
> EBCDIC machine"; it is so, but only for the purposes of the decimal
> (radix=10) instructions.

True, although a lot of translating would have to be done for all the I/O devices. However, the question of software compatibility with other OSes (not a totally irrelevant question, considering Amdahl's UTS runs under VM along with other OSes) also comes up - character set translations would have to be done when shipping data between MVS/CMS/etc. and an ASCII UNIX. Furthermore, one would need two flavors of C compiler; one for UNIX and one for the other OSes.

Guy Harris
{seismo,ihnp4,allegra}!rlgvax!guy
zben@umcp-cs.UUCP (05/14/84)
>> EBCDIC has a different layout pattern -- ASCII is quite linear and
>> dense, while EBCDIC is, well, let's just say it's different; go find a
>> green card (or is it yellow now?) and take a look. How many tricks do
>> we all use from time to time, assumptions like lower case "b" and upper
>> case "B" being related by a constant? Is it true in EBCDIC? And we
>> all know that "B" = "A" + 1, right? But is it true in EBCDIC?

1. Yellow. :-)

2. Lower case "b" and upper case "B" *are* related by a constant. In fact, it is the same constant for all the letters... :-)

3. Well, "B" does equal "A" + 1, ***but*** "J" does *not* equal "I" + 1, nor does "S" equal "R" + 1...

I went around and around on this trying to convert my Pascal screen editor from the Univac 1100 to the IBM 4341 (right, one dinosaur to another) and was able to cobble together reasonable ISUPPER functions etc. It just turned out that one *cannot* inhibit the stupid period (.) prompt the thing gives you when ready to read, and that played hob with the screen editing. Don't *anybody* write back and tell me about "set prompt 3101". That substitutes a linefeed for the period. Unless you want to give up the top line of your terminal, that doesn't work either... Grump.

--
Ben Cranston  ...seismo!umcp-cs!zben  zben@umd2.ARPA
darrelj@sdcrdcf.UUCP (05/15/84)
Another difference between ASCII and EBCDIC which can throw off some programs: in ASCII, upper case characters have lower values than lower case; in EBCDIC, they are larger. The pattern of code assignment is such that various shortcut tricks differ even in availability. EBCDIC text can be shifted to upper case by just ORing X'40' (a blank) with the other characters (it will modify control characters, but these usually don't get past a terminal controller).

--
Darrel J. Van Buer, PhD
System Development Corp.
2500 Colorado Ave
Santa Monica, CA 90406
(213)820-4111 x5449
...{allegra,burdvax,cbosgd,hplabs,ihnp4,sdccsu3,trw-unix}!sdcrdcf!darrelj
VANBUER@USC-ECL.ARPA
phil@unisoft.UUCP (05/16/84)
>> > The original IBM 360 assembler that we used in Kingston, NY in
>> > 1966 was actually a cross assembler which ran on IBSYS on a
>> > 7094 across the road. We used a three tape rotation system.
>> > This is probably why the 360 had an ASCII bit in the PSW.
>>
>> But didn't the 7094 use BCDIC (the six-bit predecessor to EBCDIC) rather
>> than ASCII? There was probably an ASCII bit there because IBM wanted to
>> be able to support 8-bit ASCII when it came out - 7-bit ASCII was already
>> out (or coming out soon), and there was an 8-bit version in the works - but
>> it turned out the final version of 8-bit ASCII was not going to be what
>> IBM expected it to be and not what the 360 implemented. For that reason,
>> and because converting all the 360 hardware and software to ASCII would
>> have been a major undertaking, IBM canned the ASCII mode in the 370.
>>
>> Guy Harris
>> {seismo,ihnp4,allegra}!rlgvax!guy

No no. IBM provided the ASCII bit in the PSW to allow the packed decimal instructions (and a few other instructions of the same ilk) to handle ASCII character data going to and from numeric computations. This bit was used in the new 370s to indicate new modes of machine supervisor state operation. IBM announced at the time that it took a survey of customers to find out who was using the ASCII bit and found that nobody was. Thus, since bits in the PSW were getting in short supply, they reused it. There was never any intention by IBM to convert to ASCII, nor was this bit anything more than a numeric character conversion aid.
guy@rlgvax.UUCP (Guy Harris) (05/17/84)
> >> > The original IBM 360 assembler that we used in Kingston, NY in
> >> > 1966 was actually a cross assembler which ran on IBSYS on a
> >> > 7094 across the road. We used a three tape rotation system.
> >> > This is probably why the 360 had an ASCII bit in the PSW.
> >>
> >> But didn't the 7094 use BCDIC (the six-bit predecessor to EBCDIC) rather
> >> than ASCII? There was probably an ASCII bit there because IBM wanted to
> >> be able to support 8-bit ASCII when it came out - 7-bit ASCII was already
> >> out (or coming out soon), and there was an 8-bit version in the works - but
> >> it turned out the final version of 8-bit ASCII was not going to be what
> >> IBM expected it to be and not what the 360 implemented. For that reason,
> >> and because converting all the 360 hardware and software to ASCII would
> >> have been a major undertaking, IBM canned the ASCII mode in the 370.
> >>
> >> Guy Harris
> >> {seismo,ihnp4,allegra}!rlgvax!guy

> No no. IBM provided the ASCII bit in the PSW to allow the packed decimal
> instructions (and a few other instructions of the same ilk) to handle
> ASCII character data to/from numeric computations. This bit was used
> in the new 370s to indicate new modes of machine supervisor state
> operation. IBM announced at the time that it took a survey of customers
> to find out who was using the ASCII bit and found nobody was. Thus,
> since bits in the PSW were getting in short supply, they reused it.
> There was never any intention by IBM to convert to ASCII, nor was
> this bit anything more than a numeric character conversion aid.

There's a very easy way to handle ASCII character data to/from numeric computations without an ASCII bit: convert to EBCDIC using "UNPACK" (or whatever the BAL opcode was) or "EDIT"/"EDMK" and then convert to ASCII using the translate instruction.

The ASCII bit would have been most useful for programs running entirely with ASCII character strings, and such programs would have been useful only if IBM had converted to ASCII. In other words, there are easier ways to convert packed decimal to ASCII than providing a (non-user-settable! - LPSW is privileged) mode bit. True, the only *effect* of the ASCII bit is to change the behavior of PACK, UNPACK, EDIT, and EDMK; but that change isn't really useful to programs running in EBCDIC, only to programs running in ASCII.

Early in the life cycle of the 360, it might have been possible for IBM to convert to ASCII. Once enough of the machines got out there with EBCDIC software and data files (ooh, sorry, "datasets"), it was too late - the ASCII-8 standard didn't settle down in time. It's not surprising that nobody used it, considering 1) nobody was interested enough in ASCII and 2) IBM didn't support it except in the hardware.

Guy Harris
{seismo,ihnp4,allegra}!rlgvax!guy
rld@bentley.UUCP (Bob Duncanson) (05/22/84)
> From: darrelj@sdcrdcf.UUCP
> Date: Tue, 15-May-84 08:26:39 EDT
> Article-I.D.: sdcrdcf.1083
>
> ... EBCDIC text can be
> shifted to upper case by just ORing X'40' (a blank) ...

So? In ASCII, a lowercase character can be shifted to uppercase by SUBtracting an X'20' (also a blank). Your machine has a SUBtract, I hope. Converting upper to lower is ADDing the same X'20'. ADD usually takes no longer than OR.

--
             -------------
          -******-------------
        -**********-------------
       -************-------------     Bob Duncanson
       -************-------------     AT&T Bell Laboratories
        --********--------------      Piscataway, NJ
          ----------------------      {ihnp4,allegra,cbosgd}!bentley!rld
             ------------------
                ------------