[comp.arch] Binary<->Decimal conversion

ok@goanna.oz.au (Richard O'Keefe) (02/13/90)

There was a lot of talk last year about integer multiply/divide and
whether they are important enough to go in RISCs.  One of the things
for which they seem useful is binary<->decimal conversion.
I have experimented with table-based binary<->decimal conversion.
At the moment my files are on a tape, but I mean to post sources for
table-based conversion routines to one of the sources newsgroups soon.
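To give the flavour of it, here is a much-simplified sketch of the
decimal-to-binary direction.  This is not the routine I'll be posting;
the table name and layout are made up for illustration.  The point is
that once a table of digit*10^position values has been built, the
conversion loop itself needs only additions and indexed loads:

    #include <stddef.h>

    /* dec_tab[p][d] holds d * 10^p; ten positions cover anything
       that fits in 32 bits.  The table is built once, so the
       multiplications here don't matter to the conversion itself. */
    static unsigned long dec_tab[10][10];

    static void init_dec_tab(void)
    {
        unsigned long pow = 1;
        int p, d;
        for (p = 0; p < 10; p++) {
            for (d = 0; d < 10; d++)
                dec_tab[p][d] = (unsigned long)d * pow;
            pow *= 10;
        }
    }

    /* Convert len decimal digits (no sign, no white space, overflow
       not checked) to a 32-bit unsigned value: additions only. */
    static unsigned long dec_to_bin(const char *s, size_t len)
    {
        unsigned long result = 0;
        size_t i;
        for (i = 0; i < len; i++)
            result += dec_tab[len - 1 - i][s[i] - '0'];
        return result;
    }

A real strtoul() also has to cope with signs, bases, white space and
overflow, which is where the overheads mentioned below come in.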

The bottom line is that table-based 32-bit-binary <-> byte-decimal
conversion not only performs well on RISCs that lack hardware multiply
and divide, but also beats the UNIX library functions on machines (VAX,
32?32, Clipper) that _have_ got hardware multiply and divide.  For
example, my version of strtoul() runs about 20-24% faster than the
BSD UNIX library routine on an Encore Multimax or a VAX.
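The other direction works out much the same way.  As a rough
illustration (again, not the actual code; the table and function names
here are invented), a 32-bit value can be turned into decimal digits by
subtracting tabled powers of ten, at most nine subtractions per digit,
so no divide instruction is ever issued:

    /* Powers of ten from 10^9 down to 10^0. */
    static const unsigned long pow10_tab[10] = {
        1000000000UL, 100000000UL, 10000000UL, 1000000UL, 100000UL,
        10000UL,      1000UL,      100UL,      10UL,      1UL
    };

    /* Write the decimal digits of v into buf (room for 10 chars)
       and return how many were written. */
    static int bin_to_dec(unsigned long v, char *buf)
    {
        int i, len = 0;
        for (i = 0; i < 10; i++) {
            int digit = 0;
            while (v >= pow10_tab[i]) {   /* at most 9 times round */
                v -= pow10_tab[i];
                digit++;
            }
            if (len > 0 || digit != 0 || i == 9)  /* drop leading zeros */
                buf[len++] = (char)('0' + digit);
        }
        return len;
    }

Even this digit-at-a-time form never divides; trading more table space
for fewer passes over the value is the obvious refinement.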

Of course the C functions have overheads (skipping leading white space
and so on) which do not apply to COBOL or PL/I; if data alignment and
size were known in advance, the table-based approach would look even better.