[comp.arch] BCD

torek@elf.ee.lbl.gov (Chris Torek) (02/16/91)

In article <1991Feb14.151831.15426@linus.mitre.org> bs@linus.mitre.org
(Robert D. Silverman) suggests once again that BCD arithmetic could be
replaced with long (i.e., 64 or more bits) integer arithmetic:

>>No COBOL programs would change -- only the compilers would change.

In article <4158@gazette.bcm.tmc.edu> rick@pavlov.ssctr.bcm.tmc.edu
(Richard H. Miller) writes:
>This is certainly not correct.

The quoted (>>) statement *is* correct.  However:

>... Thus, if you eliminated BCD, the calculations would be faster,

(there is even some disagreement as to this point, but ignoring that:)

>but every time you wanted to put the results out, you would have to
>convert the integer values to decimal format, edit the number into
>the output format and then output it.

(The conversion and editing can be done simultaneously.)

>Another important consideration is the fact that many files already are set
>up with decimal fields. If you change the compiler to handle integer only, you
>will either have to automatically convert input records from decimal to 
>integer so you now have conversion->processing->conversion, or you have to
>invest a lot of money in doing the systems design and maintenance to convert
>all of the BCD fields to either integer or character. [You now are talking
>about a fundamental change to the application which requires (or should) 
>the services of a systems analyst, programmers, testing and quality
>assurance.] 

This is all true.

>The bottom line is that programs would change or processing time will go up. 

This is not necessarily so.

The `processing time' to run some application is the real time
consumed, not the CPU time.  CPU usage affects real time indirectly.
If the system running the application is sufficiently I/O bound, or if
the gain in arithmetic time (if any) exceeds the loss in conversion
time (if any) by a sufficient amount (where `sufficient' again depends
on to what extent the system is I/O bound), the real time used will
be unchanged or reduced.

It should therefore be possible to compute:

	1) the amount of processing done
	2) the gain or loss in processing time
	3) the gain or loss in conversion time
	4) the percentage of `wasted' CPU time on the system in question

to find the overall effect of replacing BCD operations with extended
integer operations.  Note that conversion between decimal and binary
representations is not as hard as it seems at first (Richard O'Keefe,
for instance, has examples of clever conversion routines).

I suspect that the above computation has been done (at least roughly)
by at least one company (MIPS).
-- 
In-Real-Life: Chris Torek, Lawrence Berkeley Lab EE div (+1 415 486 5427)
Berkeley, CA		Domain:	torek@ee.lbl.gov