[comp.sys.mac] LightSpeed C 2.15 putchar fix, hope done in LSC3.0, etc.

clive@drutx.ATT.COM (Clive Steward) (02/23/88)

Well, I got into a discussion with the posting program, and lost both the
proper title and the concluding remarks from my previous message.

Anyway, we know how to fix this particular problem.

My other comment was on Unix guru-ish responses, of which I am
probably somewhat guilty myself.

In this case, the problem didn't turn out to be the char-int confusion
that the putc-getc twins create for many newcomers, and in fairness it
shouldn't have been.

Nonetheless, I think it's very important that this usage and the reasons 
be well understood, not glossed over with 'most C compilers promote
passed char arguments to ints anyway'.  It is VERY bad practice to depend on
anything of the sort; it's the sort of foolishness that makes people
hate C and its programmers.  And it's completely unnecessary.  
It makes a lot more sense just to do things right.

When you get to C++, the world changes again, and though some
promotions remain, and others are added, there is a real mire pit
waiting if you try to make things work by such 'inner' details.
Trust me on this one.

Declaring and using data types consistently is an easier and more enjoyable way to work.
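
For what it's worth, here's a minimal sketch of what I mean by consistent
declarations (plain stdio, nothing LightSpeed-specific assumed): the variable
that receives getc()'s return value is declared int, not char, so that EOF
can be told apart from every legitimate character.

        #include <stdio.h>

        main()
        {
                int c;          /* int, not char: getc() returns an int so
                                   that EOF has a value no char can take */

                while ((c = getc(stdin)) != EOF)
                        putc(c, stdout);   /* putc() takes an int as well */

                return 0;
        }

Nothing here leans on how arguments happen to be promoted; the types simply
match what the library expects.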


The other possibility in this case was that fflush() might have been
needed on stdout to get the output.  I missed this.  It turns out not
to be necessary in LightSpeed C, since both stdout and stderr are
initially set up with unbuffered output (though one may easily change
this).  In most Unix environments stdout, at least, is buffered, though
paradoxically printf may call fflush and mask this; with fputc, the
call to fflush would be necessary.
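
Since the buffering differs from library to library, the safest habit is to
flush explicitly when you care about seeing the output right away.  A small
sketch, assuming nothing about whether this particular stdout is buffered:

        #include <stdio.h>

        main()
        {
                fputc('x', stdout);
                fflush(stdout);         /* force the character out now,
                                           buffered stdout or not */
                return 0;
        }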

See the moral of the story?   Even if LightSpeed followed one Unix version
perfectly (maybe it does), we still wouldn't 'know' the answers.  I was at
fault for reacting with too much assurance myself.


Clive Steward

dc@gcm (Dave Caswell) (02/25/88)

)Nonetheless, I think it's very important that this usage and the reasons 
)be well understood, not glossed over with 'most C compilers promote
)passed char arguments to ints anyway'.  It is VERY bad practice to depend on
)anything of the sort; it's the sort of foolishness that makes people
)hate C and its programmers.  And it's completely unnecessary.  
)It makes a lot more sense just to do things right.
)
)When you get to C++, the world changes again, and though some
)promotions remain, and others are added, there is a real mire pit
)waiting if you try to make things work by such 'inner' details.
)Trust me on this one.

I trust K&R The C Programming Language.  I quote p. 184  
"First, any operands of type char or short are converted to int ..."

This is the language definition.

All C compilers promote passed char arguments to ints, just as 
they make 3+4 equal to 7, and I depend on both things happening.

Sorry to be so dogmatic, but I've read this stuff more than 100 times and it
isn't true.

chris@umbc3.UMD.EDU (Chris Schanzle) (02/28/88)

In article <413@white.gcm>, dc@gcm (Dave Caswell) writes:
> )Nonetheless, I think it's very important that this usage and the reasons 
> )be well understood, not glossed over with 'most C compilers promote
> )passed char arguments to ints anyway'.  It is VERY bad practice to depend on
> )anything of the sort; it's the sort of foolishness that makes people
> )hate C and its programmers.  And it's completely unnecessary.  
> )It makes a lot more sense just to do things right.

> "dc@gcm" writes (sorry, no name available & I couldn't mail)
> I trust K&R The C Programming Language.  I quote p. 184  
> "First, any operands of type char or short are converted to int ..."

True, this is the "bible" of programming, but I think you are trying
to compare apples and oranges here.  He was discussing passing char
and int ARGUMENTS - whether or not they automatically get cast to
ints.  Your quotation surprised me, so I looked it up on p. 184.  It is
under the subheading "Arithmetic conversions" - which is not the same
thing as passing arguments to functions!  K&R are saying
something like:
        char  c = 3;
        short x = 2;
        int   result;
        result = x + c;
Here, the variables x and c are converted to integers BEFORE the arithmetic
is done.

> This is the language definition.
The *arithmetic* conversions are!
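
One way to watch the arithmetic conversion happen (just a sketch, nothing
compiler-specific intended) is to print the size of a char + short
expression, which comes out as the size of an int:

        #include <stdio.h>

        main()
        {
                char  c = 3;
                short x = 2;

                /* both operands are promoted to int before the addition,
                   so the expression x + c has type int */
                printf("%d %d %d\n",
                       (int) sizeof(c), (int) sizeof(x), (int) sizeof(x + c));
                return 0;
        }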

-- 
ARPA   : chris@umbc3.UMD.EDU		BITNET : chris@umbc

"He was betrayed by the limits of his own potential."