[comp.arch] Why 36 bits.

cik@l.cc.purdue.edu (Herman Rubin) (07/28/89)

In article <3490016@wdl1.UUCP>, bobw@wdl1.UUCP (Robert Lee Wilson Jr) writes:
> And note that the tube type machines from which the 709x evolved were also
> 36 bitters, and in turn that 36 is a nice multiple of 12 which is the
> number of rows in an "IBM" card. Since old machines like the 704/701 read
> cards as binary images into core, and translated bit patterns into
> (EBCDIC) characters via internal software rather than something in an I/O
> channel, this was not a coincidence!
> The past marches on!
> Bob Wilson
> (bobw@wdl1.fac.ford.com)

This reasoning happens to be totally false.  It was not until much later
that cards were read by columns.  The 704 and 709 certainly read by ROWS,
not columns.  The first word read was the left 36 bits of the 9's row,
the next was bits 37-72, then the left 36 bits of the 8's row, etc.  The
right 8 columns were not read at all; this is why Fortran used only 72
columns.
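
For the curious, here is a rough sketch in modern C of what that
row-binary layout means for recovering a single column's hole pattern.
Nothing like this ran on a 704; the layout follows the description
above, but the names and conventions are mine:

#include <assert.h>
#include <stdint.h>

#define ROWS      12    /* card rows; the 9's row is read first        */
#define WORD_BITS 36    /* one machine word, kept in a uint64_t here   */

/* row_words[2*r] holds columns 1-36 of row r, row_words[2*r+1] holds
 * columns 37-72; column 1 is the most significant bit.  Returns the
 * 12-bit hole pattern of column col, high bit = first row read.      */
unsigned column_pattern(const uint64_t row_words[24], int col)
{
    unsigned pattern = 0;
    int r;

    assert(col >= 1 && col <= 72);  /* columns 73-80 never reach core */
    for (r = 0; r < ROWS; r++) {
        uint64_t w = row_words[2 * r + (col > 36)];
        pattern = (pattern << 1)
                | (unsigned)((w >> (WORD_BITS - 1 - (col - 1) % 36)) & 1);
    }
    return pattern;
}

The assert is the whole story: the last 8 columns have no source words
at all, which is exactly why Fortran stopped at column 72.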

The early computers used 6 bits (some even 5) per character.  Thus 36
bits was 6 characters.  The standard tapes were 7 tracks, including
parity.  There were several 6-bit character codes in use on different
machines.
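
The packing arithmetic is trivial but worth seeing once; a minimal
illustration in modern C (the names are mine, not any machine's API):

#include <stdint.h>

/* Pack six 6-bit codes into one 36-bit word, first code leftmost:
 * 6 * 6 == 36, so the word is filled exactly.                      */
uint64_t pack6(const unsigned codes[6])
{
    uint64_t word = 0;
    int i;

    for (i = 0; i < 6; i++)
        word = (word << 6) | (codes[i] & 077);
    return word;
}

/* Split a 36-bit word back into six 6-bit codes.                   */
void unpack6(uint64_t word, unsigned codes[6])
{
    int i;

    for (i = 0; i < 6; i++)
        codes[i] = (unsigned)((word >> (30 - 6 * i)) & 077);
}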

The CDC 1604 and 3600 were 48 bits, and the CDC 6x00 was 60 bits.  These
machines only had upper case and 6-bit character sets.  They did use the
more efficient column binary.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette, IN 47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

tjd@halley.UUCP (Tom Davidson) (07/29/89)

In article <1451@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:


> The CDC 1604 and 3600 were 48 bits, and the CDC 6x00 was 60 bits.  These
> machines only had upper case and 6-bit character sets.  They did use the
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> more efficient column binary.

At least in the case of the 6x00, 6-bit character sets were the norm.
The character set was called display code, the name coming from the fact
that the system console understood 6-bit characters. (get it? display code)

Under the timesharing operating systems KRONOS and NOS, there were two
other character sets in use.  Extended display code (more commonly called
6/12 ASCII) reserved the codes 74(8) and 76(8) as escapes into an
expanded set carried in the next 6 bits.  This made programming
rather fun, since not all characters were the same size.  Also, real
ASCII (yup!) could be had if you didn't mind using 8 bits out of every 12.
Packed ASCII (8 in 8) was (is) used in the network interfaces, where you
had 7.5 characters in one 60-bit word!
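
A rough sketch in modern C of both schemes; the 74(8)/76(8) escape rule
and the 8-in-8 packing are from the above, but the function names and
the exact alignment conventions are my own guesses:

#include <stddef.h>
#include <stdint.h>

#define ESC1 074    /* escapes into the expanded set, per the above */
#define ESC2 076

/* Count the characters in a stream of n 6-bit codes: an escape code
 * consumes the following 6 bits too, so characters are 6 or 12 bits
 * wide -- the "fun" part for anyone scanning text.                 */
size_t count_6_12_chars(const unsigned *codes, size_t n)
{
    size_t i = 0, chars = 0;

    while (i < n) {
        i += (codes[i] == ESC1 || codes[i] == ESC2) ? 2 : 1;
        chars++;
    }
    return chars;
}

/* Packed (8-in-8) ASCII: a 60-bit word yields 7 whole 8-bit
 * characters, and the leftover 4 bits continue into the next word
 * (my assumed alignment) -- hence 7.5 characters per word.         */
void unpack_packed_ascii(uint64_t word60, unsigned char out[7])
{
    int i;

    for (i = 0; i < 7; i++)
        out[i] = (unsigned char)((word60 >> (52 - 8 * i)) & 0xFF);
}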

Alas, most compilers understood only the 6-bit and 6/12 character sets.

bobw@wdl1.UUCP (Robert Lee Wilson Jr) (08/01/89)

Two people, at least, seem to have thought I said 704's, etc., read cards
columnwise. I did not think so, and don't believe I said so!
On the other hand I was wrong, just on a different point:
I was thinking it was nice that the card image, once read in, occupied a
whole number of words. Mr. Rubin is of course right that the 72-column
limit came from what was read in, so the divisibility of the total number
of bits (12 * 72 = 864, exactly 24 words of 36 bits) by the word length
was guaranteed anyway.
Bob Wilson
(bobw@wdl1.fac.ford.com)