[comp.arch] The ENIAC & Decimal Arithmetic

webber@athos.rutgers.edu (Bob Webber) (03/18/90)

In article <1094@ra.cs.Virginia.EDU>, mac@ra.cs.Virginia.EDU (Alex Colvin) writes:
> >(One exception seems to be ENIAC, which seems to have used a decoded
> >decimal representation.  Can someone confirm that?  Can anyone explain
> >why they did it that way?)
> It's easier to detect dropped bits in such a number system.
> Some of the other early machines used 2-out-of-5 decimal digits.
> The MTBF of tubes in Eniac was expected to be hours.

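The error-detection point above is easy to see in miniature.  Here is a
small sketch (mine, not any particular machine's wiring) of why a
2-out-of-5 code catches a dropped bit: each decimal digit is one of the
C(5,2) = 10 five-bit patterns with exactly two bits set, so losing a bit
always leaves an illegal pattern.  The digit-to-pattern assignment here
is arbitrary; real machines varied.

```python
from itertools import combinations

# The ten legal patterns: every 5-bit pattern with exactly two bits set.
# Which pattern stands for which digit differed by machine; we just
# number them arbitrarily for illustration.
PATTERNS = [frozenset(c) for c in combinations(range(5), 2)]

def is_valid(bits):
    """A dropped (or spuriously added) bit is detected because the
    pattern no longer has exactly two bits set."""
    return sum(bits) == 2

good = (0, 1, 1, 0, 0)     # two bits on: a legal digit
dropped = (0, 1, 0, 0, 0)  # one bit lost -- no longer a legal digit
assert len(PATTERNS) == 10
assert is_valid(good)
assert not is_valid(dropped)
```

A plain binary digit has no such redundancy: drop a bit and you simply
get a different, equally legal, number.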
Section 7 of The ENIAC by John W. Mauchly (an essay in A History of Computing
in the Twentieth Century, edited by Metropolis, Howlett, and Rota, Academic
Press, 1980) is entitled ``Reasons for a Decimal ENIAC.''  It is enlightening.

The problem is that humans use computers, and humans work in base 10.
The main reason, according to Mauchly, for using base 10 was that it
made it easier for humans to figure out what was going on.  Remember
that the ENIAC was not a box that sat down in ``central computing'';
it was a laboratory for building computers.  The people who wanted to
use it wired it the way they wanted it (at least in the beginning).
Remember, this was before the days of people who toggled octal into DEC
boxes and read hex dumps on IBM mainframes.  Certainly the decade
counters on the ENIAC (which stored each base 10 digit as a single
``on'' stage in a ring of 10 tube pairs, and transmitted digits
similarly) were easier to see working than a binary counter (which was
known at the time) would have been.
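The one-on-in-ten idea can be sketched in a few lines.  This is an
illustration of the representation, not ENIAC's actual circuitry: the
stored digit is simply whichever single stage is ``on,'' so glancing at
the ten indicators reads the digit directly, and a count pulse moves
the ``on'' stage one position around the ring, carrying when it wraps
past 9.

```python
# Sketch of a one-of-ten decade ring counter (illustrative only).
class DecadeCounter:
    def __init__(self):
        # Exactly one of the ten stages is on; stage 0 on means digit 0.
        self.stages = [True] + [False] * 9

    @property
    def digit(self):
        # The digit is readable by inspection: the position of the
        # single "on" stage.
        return self.stages.index(True)

    def pulse(self):
        """Advance one position; return True on carry (wrap past 9)."""
        d = self.digit
        self.stages[d] = False
        nxt = (d + 1) % 10
        self.stages[nxt] = True
        return nxt == 0

c = DecadeCounter()
carries = sum(c.pulse() for _ in range(12))  # twelve count pulses
assert c.digit == 2 and carries == 1         # 12 pulses: digit 2, one carry
```

A binary counter of the same capacity needs fewer tubes per digit, but
its state is spread across weighted flip-flops and must be decoded
before a human can read it.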

A secondary reason mentioned is that the data came into the computer
on cards and went out on cards.  Cards traditionally contained base 10
numbers.  The ENIAC was generally used in tandem with other card
manipulation devices that were oriented toward base 10 (as well as
standard IBM readers and punches).  Indeed, at the time, people were
building up collections of cards containing math tables of interest,
owing to the pre-computer use of punched cards in computing.  The
ENIAC fit into this tradition nicely.  Thus, the claim was made that
converting back and forth between binary and decimal would actually
have increased the tube count.  In this context, it is worth
remembering that the ENIAC was originally designed to have only a
third the number of tubes it eventually had (once people decided they
liked the idea, they wanted it to be bigger and better, of course,
just like designs today).

--- BOB (webber@athos.rutgers.edu ; rutgers!athos.rutgers.edu!webber)