[comp.lang.c] Why trust stdio

hjm@cernvax.UUCP (Hubert Matthews) (11/15/88)

Why don't we avoid using any I/O routines except
read-a-char and write-a-char?  No smiley; I'm serious.  If
you want to write bomb-proof code, then you have to do all
the error handling yourself.  I don't like implementations
that barf when a number has a non-numeric character in the
middle of it.  I hate core-dumps.  In control
applications, one cannot afford to have a program that
crashes.  If I want to test a program, I turn the keyboard
upside down and bounce it a lot.  If it doesn't crash,
then it's acceptable; if it crashes, then it's broken.
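The kind of number-reading routine being argued for can be sketched in a few lines on top of getc() alone. This is a hypothetical illustration, not code from the post: a reader that reports failure to the caller when it meets a non-digit, instead of barfing or core-dumping. (It deliberately skips overflow checking to stay short.)

```c
#include <stdio.h>

/* Hypothetical sketch: read a decimal integer using only getc(),
   rejecting garbage instead of guessing.  Returns 1 and stores the
   value on success; returns 0 (pushing back the offending byte) if
   no digits were found, so the caller decides what happens next. */
int read_int(FILE *fp, long *out)
{
    long val = 0;
    int c, neg = 0, digits = 0;

    while ((c = getc(fp)) == ' ' || c == '\t' || c == '\n')
        ;                               /* skip leading whitespace */
    if (c == '-' || c == '+') {
        neg = (c == '-');
        c = getc(fp);
    }
    while (c >= '0' && c <= '9') {
        val = val * 10 + (c - '0');     /* no overflow check: sketch only */
        digits++;
        c = getc(fp);
    }
    if (c != EOF)
        ungetc(c, fp);                  /* leave the terminator for the caller */
    if (digits == 0)
        return 0;                       /* not a number: report, don't crash */
    *out = neg ? -val : val;
    return 1;
}
```

Fed "12x4", a routine like this returns 12 and leaves the 'x' on the stream; it is then the caller's decision whether that is an error, which is the whole point.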

The answer is to write as much of the code as possible
yourself, and use as little code provided by the
implementation as possible.  That means don't use printf,
don't use scanf AND DON'T USE GETS.  Use getc() and putc()
and that's it.  If they're broken, then you can't really
do anything useful on the machine anyway.  All the rest
can be derived from just those two - using any more is
putting your reputation (both as a company and as a
programmer) in the hands of someone you don't know,
whose bugs could ditch you at any time.  If I'm
programming a robot control program and I use a simple
scanf that crashes the machine because someone sneezed at
the keyboard and hit ESCAPE by mistake, who is going to
pay for the damage caused by the robot?  And what if I'm
controlling a nuclear reactor?
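As an illustration of deriving the rest from getc() alone, here is a hypothetical gets() replacement (the name read_line and its interface are this sketch's invention, not anything from the post): it can never overflow its buffer, always NUL-terminates, and quietly discards the tail of an over-long line rather than trampling memory.

```c
#include <stdio.h>
#include <stddef.h>

/* Hypothetical gets() replacement built only on getc(): stores at
   most size-1 characters of one line into buf, always NUL-terminates,
   and discards any excess input from that line.  Returns the number
   of characters stored, or -1 on immediate end-of-file. */
long read_line(FILE *fp, char *buf, size_t size)
{
    size_t n = 0;
    int c = EOF;

    if (size == 0)
        return -1;
    while ((c = getc(fp)) != EOF && c != '\n') {
        if (n + 1 < size)
            buf[n++] = (char)c;
        /* else: drop the overflow instead of corrupting memory */
    }
    buf[n] = '\0';
    if (c == EOF && n == 0)
        return -1;                      /* nothing read at all */
    return (long)n;
}
```

Turning the keyboard upside down and bouncing it on a routine like this wastes input, but it cannot crash the program - which is the acceptance test proposed above.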

So, use as small a set of input/output routines as
possible, and write your own library functions to do
things properly.  Help stamp out core-dumps!  If you use
your own routines which, naturally, are written in a
portable way, then you save yourself a lot of nasty
surprises when presented with a non-user-friendly version
of stdio (for n-u-f, read broken).

(Efficiency freaks might say that doing this is slow.
Well, the library routines have to do it anyway, and
correctness is *always* more important than speed.  Error
handling never comes for free.  Ever.)
-- 

	Hubert Matthews

frank@Morgan.COM (Frank Wortner) (11/17/88)

In article <879@cernvax.UUCP> hjm@cernvax.UUCP (Hubert Matthews) writes:
> [...]
>The answer is to write as much of the code as possible
>yourself, and use as little code provided by the
>implementation as possible.  That means don't use printf,
>don't use scanf AND DON'T USE GETS.  Use getc() and putc()
>and that's it.
>
>So, use as small a set of input/output routines as
>possible, and write your own library functions to do
>things properly.  Help stamp out core-dumps!  If you use
>your own routines which, naturally, are written in a
>portable way, then you save yourself a lot of nasty
>surprises when presented with a non-user-friendly version
>of stdio (for n-u-f, read broken).

I think I'd go crazy if I had to do that.  Library writing takes
*lots* of time and effort.  I know, I designed a few compiler runtime
libraries in a previous job.  That was a full-time job in itself.
Standards, even pseudo-standards like the C library, exist at least
partially to save everyone the trouble of designing *everything* from
scratch.

Of course, if we all did "roll our own," think of what would happen
if the resulting code had to be ported.  The debugging job would
include debugging the library as well as the program.

What about the compiler itself?  Can it be bug/surprise free?  I would
think that just as many implementation quirks would show up in a
compiler as in a stdio library.  I really don't want to have to write
a compiler so I can be sure of order of evaluation, side-effects,
implementation limits, etc.

Thanks, but, no thanks.

-- 
						Frank

"Computers are mistake amplifiers."