[comp.sys.next] printf conversion with "%.16g"

bates@wingra.stat.wisc.edu (Douglas Bates) (02/06/90)

printf on the NeXT behaves differently from printf on many other
machines I have tested (Sun-3, DECstation, Sequent, IBM RT, HP 9000,
AT&T 3B2).
Compiling and executing the following

#include <stdio.h>

int main(argc, argv)
     int argc;
     char **argv;
{
  double x = 1524.68;
  printf("x = %.16g\n", x);
  return 0;
}

produces the result

x = 1524.6800000000001

on a NeXT and

x = 1524.68

on all the other machines mentioned above.  According to the NeXT
documentation, the precision specification (16 in this case) is the
number of digits that would appear after the decimal point if the
value were represented in e format.  Since e format puts exactly one
digit before the decimal point, you get a total of 17 significant
digits (as above), and the 17th digit is enough to expose the binary
rounding error: 1524.68 has no exact binary representation, and the
nearest double is very slightly larger.  Manual pages on other
machines (Sun, for example) say that in g format the precision is the
maximum number of significant digits.
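
Under the significant-digits reading, the NeXT's output for %.16g is
what you would expect from %.17g.  The small test program below makes
the two readings easy to compare side by side; the outputs noted in
the comments assume IEEE double arithmetic and the significant-digits
interpretation.

#include <stdio.h>

int main(void)
{
  double x = 1524.68;

  /* 16 significant digits round the representation error away,
     and g format drops the trailing zeros: "1524.68" */
  printf("%%.16g gives %.16g\n", x);

  /* 17 significant digits expose the nearest double,
     "1524.6800000000001" -- the NeXT's result for %.16g */
  printf("%%.17g gives %.17g\n", x);

  /* in e format the precision really is digits after the decimal
     point: "1.5246800000000001e+03", 17 digits in all */
  printf("%%.16e gives %.16e\n", x);
  return 0;
}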

Needless to say, this difference causes some moderately ugly output
from the program when run on a NeXT.  Does anyone know whether any of
the standards (ANSI C?) defines what the precision specification
under g format in printf should mean?  Is NeXT "correct" and everyone
else wrong?
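
For what it's worth, a workaround that should produce the same output
under either interpretation is to ask for DBL_DIG significant digits
(15 for IEEE doubles), taken from the ANSI <float.h>.  A minimal
sketch, assuming an ANSI-conforming <float.h> is available:

#include <stdio.h>
#include <float.h>

int main(void)
{
  double x = 1524.68;

  /* DBL_DIG (15 for IEEE doubles) is the most decimal digits
     guaranteed to survive a decimal -> double -> decimal round
     trip, so a constant like 1524.68 comes back out as "1524.68"
     whether the precision counts 15 significant digits or, as on
     the NeXT, 16 */
  printf("x = %.*g\n", DBL_DIG, x);
  return 0;
}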