philm@astroatc.UUCP (Phil Mason) (11/11/86)
In article <363@yabbie.rmit.oz> rcodi@yabbie.rmit.oz (Ian Donaldson) writes:
>In article <2447@hcr.UUCP>, mike@hcr.UUCP (Mike Tilson) writes:
> . . .
>The CDC Cyber 170 series uses this concept to advantage with most languages;
>since it has 60-bits (a silly number, I agree), it sets all 'bss' storage to
>0600000000000004nnnnn, where nnnnnn is the address of the storage.
> . . .
>Ian Donaldson.

The Cyber word length was selected to be 60 bits because of the number of
exact divisors it has: 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.  As you can
see, one can pack quite a variety of different-length data fields into one
word without having to worry about your 3 (or whatever) bit fields
extending over a word boundary!

CDC thought that nobody would ever use more than 64 different symbols for
I/O, so they made their "byte" six bits long.  Packing ten of them in a
word is convenient, to say the least.
-- 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Kirk  : Bones ?                | Phil Mason, Astronautics Technology Center
Bones : He's dead Jim.         | Madison, Wisconsin - "Eat Cheese or Die!"
- - - - - - - - - - - - - - - -| ...seismo-uwvax-astroatc!philm
I would really like to believe | ...ihnp4-nicmad/
that my employer shares all my |
opinions, but . . .            |
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
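The divisor claim, and the no-slop packing it buys, are easy to check; a
quick sketch in modern Python (illustrative arithmetic only, not period code):

```python
# Every exact divisor of 60 is a field width that tiles a word with no slop.
divisors = [d for d in range(2, 31) if 60 % d == 0]
assert divisors == [2, 3, 4, 5, 6, 10, 12, 15, 20, 30]

def fields(word, width):
    """Split a 60-bit word into 60//width equal fields, high field first."""
    assert 60 % width == 0, "only exact divisors avoid straddling a boundary"
    mask = (1 << width) - 1
    return [(word >> (width * i)) & mask
            for i in range(60 // width - 1, -1, -1)]

w = 0o7777 << 24                   # an arbitrary test pattern under 2**60
assert len(fields(w, 3)) == 20     # twenty 3-bit fields
assert len(fields(w, 6)) == 10     # ten 6-bit "characters"
assert len(fields(w, 15)) == 4     # four 15-bit quantities
```

No field ever crosses a word boundary, which is the property the post is
pointing at.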
bobr@zeus.UUCP (Robert Reed) (11/11/86)
In article <612@astroatc.UUCP> philm@astroatc.UUCP (Phil Mason) writes:
>The Cyber word length was selected to be 60 bits because of the number of
>exact divisors it has: 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.

That's a great myth.  Almost believable.  But isn't it true that the Cyber
word length was set at 60 bits to be compatible with the old CDC-6000
series?  Isn't the operative concern here to be a multiple of 6?  See, when
the 6000 was born, it was a successor of the CDC-3300 series, which used
36-bit words (like many contemporaries, such as Univac and Honeywell and
DEC).  Back in those days, most systems used 6-bit codes for characters,
and DEC, for example, had machines with word sizes of 12 (PDP-8) and 18
(LINC/PDP-15) bits.  If anything, the reason for 60 bits is historical
precedent.
--
Robert Reed, Tektronix CAE Systems Division, bobr@zeus.TEK
kim@amdahl.UUCP (Kim DeVaughn) (11/12/86)
In article <612@astroatc.UUCP>, philm@astroatc.UUCP (Phil Mason) writes:
> CDC thought that nobody would ever use more than 64
> different symbols for I/O so they made their "byte" six bits long.  Packing
> ten of them in a word is convenient, to say the least.

And then they discovered that people would use more than 64 symbols, and
had to come up with an escape kluge to get additional symbols.  So now
CDC Display Codes can be either 6-bits or 12-bits in length.

/kim

-- 
UUCP:  {sun,decwrl,hplabs,pyramid,ihnp4,seismo,oliveb}!amdahl!kim
DDD:   408-746-8462
USPS:  Amdahl Corp.  M/S 249,  1250 E. Arques Av,  Sunnyvale, CA 94086
CIS:   76535,25

[  Any thoughts or opinions which may or may not have been expressed  ]
[  herein are my own.  They are not necessarily those of my employer. ]
karl@haddock.UUCP (Karl Heuer) (11/12/86)
In article <612@astroatc.UUCP> philm@astroatc.UUCP (Phil Mason) writes:
>The Cyber word length was selected to be 60 bits because of the number of
>exact divisors it has: 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.

I thought it was an arbitrary decision based on some piece of hardware they
had to interface to.  (Hearsay.)

>CDC thought that nobody would ever use more than 64 different symbols for
>I/O so they made their "byte" six bits long.

Actually, I believe they used "byte" to denote a 12-bit quantity, so one
byte normally contains two characters.

>Packing ten of them in a word is convenient, to say the least.

Yeah.  I liked being able to store strings in integers instead of arrays!
(This was before FORTRAN had a character type.)

Karl W. Z. Heuer (ima!haddock!karl or karl@haddock.isc.com), The Walking Lint
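The strings-in-integers trick Karl mentions works because ten 6-bit codes
exactly fill one word.  A sketch (the blank-plus-A-Z code assignment below
is invented for illustration; it is not the real Display Code table):

```python
# Hypothetical 6-bit character set: code 0 is blank, A-Z are codes 1-26.
CHARSET = ' ' + ''.join(chr(ord('A') + i) for i in range(26))

def str_to_word(s):
    """Pack up to 10 characters into one 60-bit integer, blank-padded."""
    word = 0
    for ch in s.ljust(10)[:10]:
        word = (word << 6) | CHARSET.index(ch)
    return word

def word_to_str(word):
    """Recover the (right-trimmed) string from a packed word."""
    return ''.join(CHARSET[(word >> (6 * i)) & 0o77]
                   for i in range(9, -1, -1)).rstrip()

w = str_to_word('HELLO')
assert w < 2**60                  # one machine word holds the whole string
assert word_to_str(w) == 'HELLO'
```

This is why a pre-CHARACTER FORTRAN could treat a short string as an
ordinary integer variable.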
latham@bsdpkh.UUCP (Ken Latham) (11/12/86)
> In article <612@astroatc.UUCP>, philm@astroatc.UUCP (Phil Mason) writes:
>> CDC thought that nobody would ever use more than 64
>> different symbols for I/O so they made their "byte" six bits long.  Packing
>> ten of them in a word is convenient, to say the least.
>
> And then they discovered that people would use more than 64 symbols, and
> had to come up with an escape kluge to get additional symbols.  So now
> CDC Display Codes can be either 6-bits or 12-bits in length.

No, not really.  CDC uses prefix characters for additional display codes in
much the same way that ANSI uses ^[ to prefix an incoming (or outgoing)
control string.  You could no more say that ^[[2J (ANSI clear screen) is
32 bits long than you can call CDC codes 12 bits long.

CDC sends one of several prefix codes (micro, super, sub, shift ...) to
apply to the following character (sometimes, combined, they affect more
than one character).

This is definitely NOT a KLUDGE!!!  It is a valid way of extending display
codes, and it is far better than extending the bit length to include one
number for each display alternative you have.

Ken Latham
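The prefix scheme Ken describes can be sketched as a tiny decoder.
Everything below (the escape value, the uppercase/lowercase tables) is
invented for illustration and is not the real CDC Display Code assignment:

```python
ESCAPE = 0o76  # hypothetical prefix code value, not the real CDC one

# Hypothetical tables: basic set is uppercase, prefixed set is lowercase.
BASIC   = {i: chr(ord('A') + i) for i in range(26)}
SHIFTED = {i: chr(ord('a') + i) for i in range(26)}

def decode(codes):
    """Decode a stream of 6-bit codes where ESCAPE modifies the next code."""
    out, stream = [], iter(codes)
    for c in stream:
        if c == ESCAPE:
            out.append(SHIFTED[next(stream)])  # 12 bits on the wire
        else:
            out.append(BASIC[c])               # plain 6-bit code
    return ''.join(out)

# 'A', then an escaped 'a', then 'B': four codes, three characters.
assert decode([0, ESCAPE, 0, 1]) == 'AaB'
```

This is the sense in which both camps in the thread are right: the wire
format is uniform 6-bit codes, but some *characters* cost two of them.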
robert@gitpyr.gatech.EDU (Robert Viduya) (11/12/86)
>kim@amdahl.UUCP (Kim DeVaughn) (kim@amdahl.UUCP, <4169@amdahl.UUCP>):
> In article <612@astroatc.UUCP>, philm@astroatc.UUCP (Phil Mason) writes:
> > CDC thought that nobody would ever use more than 64
> > different symbols for I/O so they made their "byte" six bits long.
> And then they discovered that people would use more than 64 symbols, and
> had to come up with an escape kluge to get additional symbols.  So now
> CDC Display Codes can be either 6-bits or 12-bits in length.

This actually happened years ago.  Recently, they've decided to adopt the
ultimate kludge.  They've dropped all the 60-bit word and 6-bit character
nonsense and now are using 64-bit words and 8-bit bytes (not to mention
using the ASCII character set in their operating system :-).  Oh yeah,
their new architecture is also a twos-complement one.

robert
-- 
Robert Viduya                      robert@pyr.ocs.gatech.edu
Office of Computing Services       (404) 894-4660
Georgia Institute of Technology
Atlanta, Georgia 30332
philm@astroatc.UUCP (Phil Mason) (11/12/86)
In article <852@zeus.UUCP> bobr@zeus.UUCP (Robert Reed) writes:
>>The Cyber word length was selected to be 60 bits because of the number of
>>exact divisors it has: 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.
>
>That's a great myth.  Almost believable.  But isn't it true that the Cyber
>word length was set at 60 bits to be compatible with the old CDC-6000 series?
>Isn't the operative concern here to be a multiple of 6?  See, when the 6000
>was born, it was a successor of the CDC-3300 series, which used 36 bit words
>(like many contemporaries, such as Univac and Honeywell and DEC).  Back in
>those days, most systems used 6 bit codes for characters, and DEC, for
>example, had machines with word sizes of 12 (PDP-8) and 18 (LINC/PDP-15)
>bits.  If anything, the reason for 60 bits is historical precedent.

Yes, that is part of the reason.  One may well ask, "Why didn't CDC just
double the word length in the 6600 versus the 3600 machine: 36 to 72 bits?
Or perhaps just bump it up to 42 (a great choice, Hitchhiker's Guide fans!)
or 48 bits?"

In addition to being a multiple of 6, and having many bits for floating
point precision, you can also pack many different-sized data into 60-bit
words without slop.  60 bits works out the best of any reasonably small
word length in this regard.
-- 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Kirk  : Bones ?                | Phil Mason, Astronautics Technology Center
Bones : He's dead Jim.         | Madison, Wisconsin - "Eat Cheese or Die!"
- - - - - - - - - - - - - - - -| ...seismo-uwvax-astroatc!philm
I would really like to believe | ...ihnp4-nicmad/
that my employer shares all my |
opinions, but . . .            |
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
rick@seismo.CSS.GOV (Rick Adams) (11/13/86)
From "Assembly Language Programming for the Control Data 6000 Series and
the Cyber 70 Series" by Ralph Grishman, Algorithmics Press, 1974, page 39:

    As mentioned earlier, each word has 60 bits, relatively large as
    machine word sizes go.  This size permits a floating point number with
    about 15 decimal places accuracy, sufficient for virtually all
    applications.  A large word also permits several instructions to be
    put into one word, so that the number of memory accesses required to
    get out instructions is reduced.  Finally, 60 is a multiple of 2, 3,
    4, 5, and 6, so that several different subdivisions of the word may
    be conveniently made.

It is also interesting to note, later on page 43:

    Though 71 instructions isn't very many (most very large computers
    have several hundred), the 6600 instructions are sufficiently
    versatile and powerful and so fast that the 6600 can run circles
    around many other large computers with many more instructions.  Often
    an entire program loop on a 6600 will be faster than a single
    instruction on another machine that performs the same calculation!

So, the CDC 6600 was a RISC machine!

---rick
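Grishman's "about 15 decimal places" agrees with the commonly cited 48-bit
coefficient of the 6600 floating point format; the arithmetic is one line
(modern Python, purely as a check):

```python
import math

# A 48-bit binary coefficient resolves 48 * log10(2) decimal digits.
digits = 48 * math.log10(2)
assert 14.4 < digits < 14.5   # i.e. "about 15 decimal places"
```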
kim@amdahl.UUCP (Kim DeVaughn) (11/14/86)
In article <254@bsdpkh.UUCP>, latham@bsdpkh.UUCP (Ken Latham) writes:
> > And then they discovered that people would use more than 64 symbols, and
> > had to come up with an escape kluge to get additional symbols.  So now
> > CDC Display Codes can be either 6-bits or 12-bits in length.
>
> CDC uses prefix characters for additional display codes in
> much the same way that ANSI uses ^[ to prefix an incoming
> (outgoing) control string.
  ^^^^^^^^^^
> You could no more say that ^[[2J (ANSI clear screen) is
> 32-bits long, than you can call CDC codes 12 bits long.
>
> CDC sends one of several prefix codes ( micro, super, sub, shift ... )
> to apply to the following character ( sometimes combined they
> affect more than one character )

No, I wouldn't say <CSI>2J is a 32-bit character.  It is, as you pointed
out, a *control string*.  I find that to be quite different from the basic
character set a machine uses to represent textual information.  When the
"A" character is 6 bits and the "a" character is 12 bits, I would still
say that the character codes are either 6-bits or 12-bits.

> This is definitely NOT a KLUDGE !!!  It is a valid way of extending
> display codes.  It is far better than extending the bit length
> to include one number for each display alternative you have.

Sure, it's valid, but it sure makes writing things like a driver for a
Tektronix 4010 Graphics Display "interesting"; the CPU code was fairly
straightforward, but the PP code was a real mess.  And that was a *direct*
result of the 6/12-bit character set, which is why I call it a Kluge.

This was on a 6600 back in the early-mid 1970's ... the newer Cyber PP's
may have been improved so that doing such things is far less painful than
it was then.  Hindsight is wonderful, isn't it!

/kim

-- 
UUCP:  {sun,decwrl,hplabs,pyramid,ihnp4,seismo,oliveb}!amdahl!kim
DDD:   408-746-8462
USPS:  Amdahl Corp.  M/S 249,  1250 E. Arques Av,  Sunnyvale, CA 94086
CIS:   76535,25

[  Any thoughts or opinions which may or may not have been expressed  ]
[  herein are my own.  They are not necessarily those of my employer. ]
rcd@ico.UUCP (Dick Dunn) (11/15/86)
> >The Cyber word length was selected to be 60 bits because of the number of
> >exact divisors it has: 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.
>
> That's a great myth.  Almost believable...

Come on...it's neither myth nor the real reason.  Don't you think that
there are more considerations to word size than the number of factors???
(I know of a machine that had a 29-bit word--so there!:-)  There are about
eleventeen things to consider (see below).  Maybe this whole discussion
should have been in comp.arch...

>...But isn't it true that the Cyber word length was set at 60 bits to be
> compatible with the old CDC-6000 series?

The "Cyber" name was attached to the 6000 series later in its life.  The
move to the 64-bit word, 2's comp, and all that came along much later, but
the Cyber 7x machines were essentially the same hardware as the 6y00's
(y!=x, of course:-)

> Isn't the operative concern here to be a multiple of 6?

Sort of.  6-bit characters were a concern.  Another concern made it
desirable to have wordsize a multiple of 12.

>...See, when the 6000 was born, it was a successor of the CDC-3300
> series, which used 36 bit words

Flamers live for postings like this.  The CDC 3300 (and 3200...) - the
"lower 3000" series machines - were fairly slow machines and not really
intended for scientific/numerical work.  The 6600 (the first of the 6000
line) was in a completely different market...it might be considered some
sort of a successor to the 3600/3800 machines.  Oh yeah, and look it up
before you post it--the lower 3000 machines were =>24 bit<= word size and
the upper 3000 were =>48 bit<=.  Both were somewhat unusual for the day,
but the 48-bit word was much more useful for single-precision floating
point work than the 36 bits popular in those days.

Other considerations on word size:  The way the machine was actually
built, the 6600 used a memory module which was 4K x 12 bits.  One of
these modules was enough for the memory of a peripheral processor (PPU, a
little I/O guy, of which there were 10 in the standard configuration).
Stack them 5 wide and you get central memory; the memory was interleaved
in 4K chunks (since the processor design almost required at least 16
banks of memory for the context switch instruction, but I digress:-)  So
there's an argument for a multiple of 12.  I think the core modules may
have been used in other CDC machines like the 3000's or the 8090, but I'm
not sure on that.

Also, remember that this is a RISC-style machine.  It wants 2 forms of
instructions--those which contain address-size constants and those which
don't.  The ones that don't are 3-address reg-to-reg.  If you work out
the numbers, it comes out that 15 and 30 bit instructions are nice.

For a floating-point value with a healthy exponent and lots of precision
(to try to stay away from double precision as much as you can) you want
at least the 48 bits of the upper 3000's, maybe more.  60 is ok; 72
probably would have been a bit much.

Before the days of byte-addressable 32-bit designs, word size was a
balancing act among a lot of factors.
-- 
Dick Dunn    {hao,nbires,cbosgd}!ico!rcd    (303)449-2870
   ...Relax...don't worry...have a homebrew.
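Dick's point that 15- and 30-bit instructions come out nicely in a 60-bit
word is quick to verify by enumerating the parcel mixes that exactly fill
a word (a check, not an emulator; the 16/32-bit contrast is added for
illustration):

```python
# (short, long) parcel mixes with 15*short + 30*long == 60.
fits = [(short, long_) for short in range(5) for long_ in range(3)
        if 15 * short + 30 * long_ == 60]
assert fits == [(0, 2), (2, 1), (4, 0)]

# By contrast, 16/32-bit parcels could never tile a 60-bit word exactly:
assert all(16 * a + 32 * b != 60 for a in range(5) for b in range(3))
```

Every word boundary is also an instruction-parcel boundary, which is part
of what kept the fetch logic simple.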
jack@mcvax.uucp (Jack Jansen) (11/15/86)
In article <852@zeus.UUCP> bobr@zeus.UUCP (Robert Reed) writes:
>In article <612@astroatc.UUCP> philm@astroatc.UUCP (Phil Mason) writes:
>>The Cyber word length was selected to be 60 bits because of the number of
>>exact divisors it has: 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.
>
>That's a great myth.  Almost believable.  But isn't it true that the Cyber
>word length was set at 60 bits to be compatible with the old CDC-6000 series?
>Isn't the operative concern here to be a multiple of 6?

Well, a Cyber isn't really the successor of the CDC-6000 series; as far as
I know, it's the same thing.  At some point, they changed the name CDC-6000
series into CDC Cyber 60 (and then 70, etc.).

60 bits was very convenient, since it was not only 10 6-bit characters,
but also 4 15-bit instructions.

A long time ago, someone told me the following story to explain the 60-bit
word length.  Note that I don't really believe it; it's probably too good
to be true.

At some time, the DoD (or some equally powerful body) said they would, from
then on, only buy 64-bit machines which supported an 8-bit character set.
Control Data, being fairly sure that their machine would be bought *anyway*
because of the immense performance gap between it and its nearest
competitors, decided not only to build a machine with 6-bit characters, but
also to use 60-bit words, so that it would be virtually impossible to
transfer tapes containing any binary data from their machine to anything
else......
-- 
	Jack Jansen, jack@mcvax.UUCP
	The shell is my oyster.
capshaw@milano.UUCP (11/16/86)
Annals of the History of Computing, Vol. 2, No. 4 (October 1980) contains
the article "The CDC 6600 Project" by James E. Thornton.  In the article
Thornton writes:

    It was my good fortune to design the 6600 CPU.  [Seymour] Cray and I
    established a clean, simple, and logically very powerful instruction
    set, biased to scientific and binary processing. ...  The selection
    of 60-bit word length came after a lengthy investigation into the
    possibility of 64 bits.  Without going into it in depth, our octal
    background got the upper hand.  Another aspect of the 60-bit word,
    though, was how efficiently the small instruction format (15 bits)
    and the large instruction format (30 bits) fit.  I have long felt
    that a sixteenth bit would have demolished our clean and simple
    instruction set.  We were not ready for the vector and array
    processing to come much later.  We were also not from the school of
    variable-length string processing.

-- 
Dave Capshaw
ed@mtxinu.UUCP (Ed Gould) (11/18/86)
>Well, a Cyber isn't really the successor of the CDC-6000 series, as
>far as I know, it's the same thing.  At some point, they changed the
>name CDC-6000 series into CDC Cyber 60 (and then 70, etc.).

The story as I heard it is this:  When CDC sold 6600's - and later 6400's,
6500's, and 7600's - to the government, there was a clause in the sales
contract stating that if CDC *ever* sold these machines (specifically
these model numbers) to *anyone* at a lower price, the government would
receive a refund of the difference.

As the 6000 series aged (it was new in the mid 60's, still being sold in
the mid 70's), there was some desire to price it lower.  But CDC couldn't
face the millions of dollars the government would be due if they sold it
for less.  So they changed the paint job, renamed the machines Cyber, and
lowered the price.  No reimbursements to Uncle Sam.

This may all be legend, but that's the way I heard it.  What's it doing in
unix-wizards, anyway?
-- 
Ed Gould                mt Xinu, 2560 Ninth St., Berkeley, CA 94710 USA
{ucbvax,decvax}!mtxinu!ed   +1 415 644 0146

"A man of quality is not threatened by a woman of equality."