[comp.lang.misc] Modern languages

sommar@enea.UUCP (Erland Sommarskog) (12/28/87)

I know, I know, this does not in the least belong in comp.os.vms/
INFO-VAX, but I just couldn't keep quiet any more. I have cross-posted
it to comp.lang.misc and will probably be damned for doing so. I have
directed follow-ups there as well. ARPA and BITnet people who must
flame me: Please mail me personally.

To comp.lang.misc people: You haven't missed anything important. It's
just the same my-language-is-better-than-yours quarrel.

J_ELORAN@FINJYU.BITNET writes:
>   (talking about C...) So why would we use 'formula translator for an
>   operating system?' Because some people have learned it and are NOT
>   bothering to learn new programming languages. I was part of this
>   'old programming languages' category, until I got thrilled enough
>   (inspired by other C programmers) to learn C. After I learned C,
>   I began to understand what the REAL difference between the 'old'
>   and the 'new' programming languages was. And it also affected my view
>   of the operating systems (along with the study of them).
>
>   Long live C!

But C is no new language. Well, maybe a little younger than Fortran and
Cobol, but not much. And they are all archaic. There is plenty of
motivation for the former two to continue living, namely the tons of
programs written in them. And with the current vogues and trends, the
same motives will keep C alive too long.

There *are* motives for using C, I shall not deny that, but they have
nothing to do with the qualities of the language itself. It just happens
to be that C is widespread and available on many machines; in particular
this is due to one of today's most popular operating systems (which,
by the way, has an archaic user interface). Note the difference between
VMS and Unix here. In VMS you can do system programming in virtually any
language, you don't have to stick to Bliss, whereas Unix more or less
forces you to use C.

Why do I find C archaic? Let me just say I think that a good language
should save you from many mistakes as early as possible. It should also
be readable, i.e. it shouldn't unnecessarily use artificial symbols.

Modern languages, but still in the 3rd generation, are for example
Ada, Modula-2 and Eiffel.

(The latter language, or rather the only current implementation of it,
demonstrates a perfect use of C. The Eiffel system runs on Unix and
produces C code as output. That is C: a portable assembler, not a
modern high-level language.)
-- 
Erland Sommarskog       
ENEA Data, Stockholm    
sommar@enea.UUCP        
                   C, it's a 3rd class language, you can tell by the name.

pase@ogcvax.UUCP (Douglas M. Pase) (12/31/87)

J_ELORAN@FINJYU.BITNET writes:
>   [...]
>
>   Long live C!

sommar@enea.UUCP(Erland Sommarskog) writes:
>
>But C is no new language. Well, maybe a little younger than Fortran and
>Cobol, but not much. And they are all archaic.

Someone once said that Lady Algol was an improvement on most of her successors.
That includes C, Pascal, and Modula-2 (but not earlier versions of FORTRAN).

> [Some naive comments on why C will outlive its usefulness, including
> incorrect remarks about the relationship between Unix and C]

>Why do I find C archaic? Let me just say I think that a good language
>should save you from many mistakes as early as possible.

This is a noble goal.  A major problem with this is the definition of what
constitutes a 'mistake'.  Stuffing a 16 bit value into an 8 bit slot under
some circumstances may constitute an error.  Other times it may not.
Explicitly specifying what should be obvious semantically causes programs
to be verbose and tedious, both to read and to write.

Overrunning the end of an array is usually accepted as an error.  BUT,
allowing array bounds to be determined at runtime can also be an extremely
useful feature.  It allows more flexible and efficient usage of memory.
Unfortunately, detecting array overruns at compile time and allowing run time
array definition are difficult to put together, and runtime bound checks are
expensive in terms of efficiency.

Some things are almost universally recognized as big no-nos.  An example might
be using a double precision floating point value as a structure pointer.  C
no more allows that type of operation than any other language.  But then,
the arguments never seem to be over that anyway.  They usually end up being
stylistic arguments (and everyone has their own opinion on what constitutes
'good' style/syntax).

Another source of criticism is features which at times may be useful, at
other times may be hazardous.  C takes a liberal view of the world.  It assumes
(sometimes rashly) that the programmer knows what s/he is doing.  The saw that
cuts the wood will also cut the hand.  We have known that for years, but we
still have saws.  (And no, I don't want to hear about how C is missing guards
that keep hands out -- the guards are there, but they must be used to be
effective.)

>It should also
>be readable, i.e. it shouldn't unnecessarily use artificial symbols.

Ah, the old verbosity argument.  Shorthand should never be used in business
because only those who know shorthand can read it.  Let's hear it for English
as a programming language.  C *is* readable to those who know C.  Now I do not
argue that C cannot be made unreadable, but then, so can APL, Lisp, Modula-2,
Pascal, and FORTRAN.  I will be the first to agree that some operators in C
are a bit strange, and frankly I don't know how to use them.  But every time
I try to include a particular op in a list of indefensible operators, someone
shows me a case I hadn't thought of in which it is indispensable, or at least
very useful.

>Modern languages, but still in the 3rd generation, are for example
>Ada, Modula-2 and Eiffel.
      ^^^^^^^^
Modula-2 is hardly a modern language, at least in the sense that it brings any
new ideas with it.  It is a rather poor extension of a language which, in its
purest form, is overly simple.  It is closer to a repackaging of an older
language than it is to being anything new.  For a more thorough criticism of
Modula-2, look up an article I wrote in the November 1985 ACM SIGPLAN Notices,
"System Programming in Modula-2".

Ada has much in its favor, but also its share of problems.  I know nothing of
Eiffel.

> [An illustration of C as a low level language]

I agree, C is a low level language, at least in the sense that it gives you
a programming model which is still very close to the machine.  It was intended
to be that way.  It will never replace higher level languages.  It was not
intended to.  I don't use C when I should use Lisp, APL, SmallTalk, ML, or
Prolog.  This shouldn't be a surprise to anyone -- I don't use a hammer to cut
boards, either.

>-- 
>Erland Sommarskog       
>ENEA Data, Stockholm
>sommar@enea.UUCP
>                   [Opinionated comment deleted]

--
Doug Pase   --   ...ucbvax!tektronix!ogcvax!pase  or  pase@cse.ogc.edu.csnet

sommar@enea.se (Erland Sommarskog) (01/01/88)

To appreciate a modern language you must be a modern programmer.
Some of the arguments Mr. Pase presents, and of which we have seen
and will see many times, seem just as conservative as those
that must have been around when assembler vs. Fortran was the issue.

Douglas M. Pase (pase@ogcvax.UUCP) writes:
>Erland Sommarskog (sommar@enea.UUCP) writes:
>>Why do I find C archaic? Let me just say I think that a good language
>>should save you from many mistakes as early as possible.
>
>This is a noble goal.  A major problem with this is the definition of what
>constitutes a 'mistake'.  Stuffing a 16 bit value into an 8 bit slot under
>some circumstances may constitute an error.  Other times it may not.
>Explicitly specifying what should be obvious semantically causes programs
>to be verbose and tedious, both to read and to write.

Even more tedious is the work to find the bug caused by such a mistake, 
which ought to be caught by the compiler or a run-time check. As for
reading, I think the explicit type conversions help understanding what's
happening.

>Overrunning the end of an array is usually accepted as an error.  BUT,
>allowing array bounds to be determined at runtime can also be an extremely
>useful feature.  It allows more flexible and efficient usage of memory.

A modern language should of course allow dynamic arrays. So does Ada.
Checking array bounds at compile time can only be done in rare cases,
since even if the array size is static, the index is not.

>array definition are difficult to put together, and runtime bound checks are
>expensive in terms of efficiency.

This, I'd say, is a really old-fashioned argument. In these days
when hardware is cheaper than software, buy faster hardware, don't
remove the array checks. Believe me, you save money just by having
the program crash on an array overrun rather than by searching for those
funny errors caused by data being unexpectedly overwritten.

>Another source of criticism is features which at times may be useful, at
>other times may be hazardous.  C takes a liberal view of the world.  It assumes
>(sometimes rashly) that the programmer knows what s/he is doing.  The saw that

And this is just so wrong, wrong, wrong! We have lived with computers so
long that we know that programmers very often are only dimly aware
of what they are doing. Or do you really believe in bug-free software?

>>Modern languages, but still in the 3rd generation, are for example
>>Ada, Modula-2 and Eiffel.
>      ^^^^^^^^
>Modula-2 is hardly a modern language, at least in the sense that it brings any
>new ideas with it.  It is a rather poor extension of a language which, in its
>purest form, is overly simple.  It is closer to a repackaging of an older

You may be very right here. I have only read through Wirth's book once,
and it's true, it's really just another Pascal dialect. Why I mentioned
it at all will be apparent below.

>Ada has much in its favor, but also its share of problems.  I know nothing of
>Eiffel.

Eiffel is *very* modern, at least with respect to its age: 2 years and 3
months by now. Check out comp.lang.ada, where Bertrand Meyer, the father
of the language, said something about it in an article recently.

>I agree, C is a low level language, at least in the sense that it gives you
>a programming model which is still very close to the machine.  It was intended
>to be that way.  It will never replace higher level languages.  It was not
>intended to.  I don't use C when I should use Lisp, APL, SmallTalk, ML, or
>Prolog.  This shouldn't be a supprise to anyone -- I don't use a hammer to cut
>boards, either.

Yes, but here is the big, big, big problem. There are many, many people
who believe it will. If C were only used instead of assembler in
low-level applications, I wouldn't mind. But it is used far more widely
than that. I work as a consultant and there seem to be many customers
that have decided to use C in all their further development and maybe
will even convert old code. Why? I don't know their motives, but I can
guess. They want something standardized so they can change hardware
without too much influence on the software. That doesn't leave many
languages. Pascal is useless without extensions, and the extensions are
non-standardized. Often their processors are micros, so Ada may not,
today, be realistic.
  But C is quite standardized and quite available. The only alternative
I could think of is Modula-2; however, I am not sure whether it's
generally available for an arbitrary processor. Neither do I know
whether it's standardized enough. I have some doubts about
the I/O, which is not part of the language.
  
-- 
Erland Sommarskog       
ENEA Data, Stockholm    
sommar@enea.UUCP        
                   C, it's a 3rd class language, you can tell by the name.

gudeman@arizona.edu (David Gudeman) (01/02/88)

In article  <2570@enea.se> sommar@enea.se (Erland Sommarskog) writes:
>To appreciate a modern language you must be a modern programmer...

I have been following the discussion in question, and this is a
completely unjustified slur.  Douglas Pase never said that very high
level languages are unnecessary, he only said that intermediate-level
languages like C are also necessary.

>... point-counter-point about security in compilers ...

>Even more tedious is the work to find the bug caused by such a mistake, 
>which ought to be caught by the compiler or a run-time check. As for
>reading, I think the explicit type conversions help understanding what's
>happening.

You are entitled to your opinion about what makes things readable, I
tend to agree, but there are a lot of people who disagree.  As to
compile-time checking, you can always use lint if you want extra
security.  One advantage of C is that it allows programmers to choose
a style they find comfortable.  Pascal, Ada, et al. try to put the
programmer into a straitjacket.  This is fine for companies who
want to restrict their programmers in this manner, but when I choose
my _own_ language, for my _own_ purposes, I'm picking one that lets me
do what I want.  (Probably Lisp or Icon :-)

>... discussion of array bounds checking and its inefficiency ...

>This, I'd say, is a really old-fashioned argument. In these days
>when hardware is cheaper than software, buy faster software, don't
>remove the array checks....

Now _that's_ an old argument.  You've already convinced us that the
only reason to use C is for efficiency.  If I'm programming an
application that is so cramped for resources that I'm willing to
resort to C, I certainly don't want it wasting time checking array
bounds.

>>... C takes a liberal view of the world.  It assumes (sometimes
>>rashly) that the programmer knows what s/he is doing....
>
>And this is just so wrong, wrong, wrong! We have lived with computers so
>long that we know that programmers very often are only dimly aware
>of what they are doing. Or do you really believe in bug-free software?

Did you see the parenthetical remark?  The writer of this sentence is
fully aware of the dangers of C.  But there are times when efficiency
is so important that the dangers are necessary.  You are blindly
parroting arguments that we have all heard a thousand times (I have
made the same arguments myself).  But the argument that "efficiency is
seldom important" is irrelevant, since Mr. Pase is arguing that C is a
good language in those cases where efficiency _is_ important.

>>I agree, C is a low level language, at least in the sense that it
>>gives you a programming model which is still very close to the
>>machine.  It was intended to be that way.  It will never replace
>>higher level languages.  It was not intended to....
>
>Yes, but here is the big, big, big problem. There are many, many people
>who believe it will. If C were only used instead of assembler in
>low-level applications, I wouldn't mind...

Then you were pretending Mr. Pase is a bad guy for rhetorical effect.
I don't think rhetorical effect is an excuse for rudeness.  In case
anyone is wondering, I have never met Douglas Pase, I just get tired
of hearing the same language debates over and over, often misdirected,
intolerant, and rude.

pase@ogcvax.UUCP (Douglas M. Pase) (01/02/88)

Since this is a reply to a reply, comments by sommar@enea.UUCP (Erland
Sommarskog) will be prefixed by `ES>'.  My comments (Douglas M. Pase
(pase@ogcvax.UUCP)) will be prefixed with `DP>'.

ES> To appreciate a modern language you must be a modern programmer.
ES> Some of the arguments Mr. Pase presents, and of which we have seen
ES> and will see many times, seem just as conservative as those
ES> that must have been around when assembler vs. Fortran was the issue.

I quite agree that I am conservative, as long as "conservative" is not being
used to mean "unwilling to accept change -- even when change is merited."
I am neither in favor of change nor against it.  I am in favor of flexible
languages and programmers understanding what they are doing.  It seems,
however, the intent of the previous paragraph was to claim my arguments
were reducible to an aversion to change.  This is simply not the case.

A far better summary of my opinions, I think, would be "The right tool for
the job."  I don't use C for combinatorial searches, and I don't use Prolog
for system work.  I don't use Modula-2 for anything.  (I once wrote a 15,000
line compiler in Modula-2, and later maintained a 12,000 line library.  The
same thing in C would have taken 1/3 the memory, 1/10 the time to write, and
run about 2x faster.)

ES> [...] a good language should save you from many mistakes as early as possible.

DP> This is a noble goal.  A major problem with this is the definition of what
DP> constitutes a 'mistake'.  [followed by an example...]

ES> [a response to the example, not the statement]  As for
ES> reading, I think the explicit type conversions help understanding what's
ES> happening.

This doesn't answer the question.  Is it just a matter of type casts?  What
other expressions are considered `mistakes'?  When are explicit type
conversions required?  Whenever the types do not match exactly?  Consider
the following list of conversions:

	int <- char	-- types are different, no information lost
	double <- float -- lower bits contain essentially garbage
	float <- double	-- possibly lose precision, small number becomes 0
	int <- float	-- ditto
	float <- int	-- may lose bits on the lower end of large ints

Problems could arise in all but one of these cases, but it is very rare
that such things trouble me.  However, it would be tedious and not in the
least bit more readable to me if I had to cast every piece of data whenever
I wanted a mixed expression.  (assume a is int, b is float, and c is char)

	(float)a + b * (float)(int)c

is much less readable to me than

	a + b * c

I may be unusual in this (I doubt it), but programming errors in this area
give me few problems.  What's more, I've never had much difficulty finding
them when they did occur.  C does what I expect, and I expect what it does.

One area where I believe explicit casts should be required, even though such
casts rarely change the bit-pattern of the representation of the value, is
where pointers are used.  In my earlier days I ignored casts to pointers
because the values did not change.  It proved to be the source of my most
obscure and difficult bugs.  It was not and is not my intention ever to
suggest that casts on pointers (when different types are involved) should
be ignored.  Our C compiler here gives warnings whenever pointer types do
not match.  But I believe that is as it should be.

DP> [arguments about array bounds-checks]

ES> A modern language should of course allow dynamic arrays.

That, more than almost anything else, is something I use.  C gives me that;
Modula-2 does not.

DP>  [Runtime] bound checks are expensive in terms of efficiency.

ES> This, I'd say, is a really old-fashioned argument.

Some things are simply timeless...

ES> when hardware is cheaper than software, buy faster hardware, don't
ES> remove the array checks. Believe me, you save money just by having
ES> the program crash on an array overrun rather than by searching for those
ES> funny errors caused by data being unexpectedly overwritten.

This is another error I have had few problems with.  I am always careful
to put the checks in, but I put them in where they are less likely to slow
the execution.  For example, I build the checks into the preliminary stages
of loops, rather than array accesses within loops.  That way I check once
at the beginning of the loop that the indexes will not overrun, rather than
a dozen times within each pass of the loop.  Few compilers have the smarts
to do equivalent things for me, and I'd usually end up doing it anyway.

This whole thing is really a minor point anyway.  I wouldn't condemn or
avoid a language because it had runtime array checking.  That sort of thing
just doesn't bite me very often.

DP> [C] assumes (rashly) that the programmer knows what s/he is doing.

ES> And this is just so wrong, wrong, wrong! We have lived with computers so
ES> long that we know that programmers very often are only dimly aware
ES> of what they are doing.

This is not the problem of a language, it is the problem of a programmer!  If
a person does not understand what s/he is doing, how will I have confidence in
what s/he has written, whatever the language!?!  A programmer is simply a
person who translates concepts from one language (that of the application) to
concepts in another language (that of the programming model).  It is very
much like natural language interpretation.  If I were to hire a Russian
to interpret for me, I would want one who understood both Russian and English.
A failure to understand either one would mean he was not an interpreter.  If I
hire a programmer, I want him to understand the machine (model) as well as the
application.

If you were to argue that programmers who know and understand C well are not
able to rapidly produce efficient and relatively bug-free code, or that C
is inordinately difficult to learn for the benefit it gives, I might be more
sympathetic.  Its expanding popularity serves as evidence against both of
those arguments.  All you have argued is that the standards for graduation
in computer science are much too lax.

ES> [C is used when other languages might be better, because C is more
ES> standardized and portable.]

Again I say, "The right tool for the job."  C has some very nice features,
but it is not the only language in town, and it does require a certain
amount of skill and caution in its use.  Does that mean it should be
avoided?  Absolutely not!  Programmers should know several languages, since
no single language can handle all applications (not even Ada).  Programmers
should also understand what they are doing.  If too few can make that claim,
then C is not the biggest problem we have.  We should require the same
professionalism of programmers that we require of most other engineering
and scientific disciplines.

ES> Erland Sommarskog       
ES> ENEA Data, Stockholm    
ES> sommar@enea.UUCP        
ES>                    [opinionated comment deleted, again]

--
Doug Pase   --   ...ucbvax!tektronix!ogcvax!pase  or  pase@cse.ogc.edu.csnet

reggie@pdnbah.UUCP (George Leach) (01/04/88)

In article <2570@enea.se> sommar@enea.UUCP(Erland Sommarskog) writes:
>Douglas M. Pase (pase@ogcvax.UUCP) writes:
>>Erland Sommarskog (sommar@enea.UUCP) writes:
>>>Why do I find C archaic? Let me just say I think that a good language
>>>should save you from many mistakes as early as possible.

     One quick comment on this -> I feel that a professional should be provided
with professional-level tools.  Pascal was fine for school, but not for getting
real work done.  Besides, use lint!

    [lots of discussion on why Doug likes C and Erland does not]

>This, I'd say, is a really old-fashioned argument. In these days
>when hardware is cheaper than software
      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

>I work as a consultant and there seem to be many customers
>that have decided to use C in all their further development and maybe
>will even convert old code. Why? I don't know their motives, but I can
>guess. 

     You answered your own question, not only in the text that I deleted
but also earlier in your article (see above).  Why do we still see people
tied to the same hardware for years and years?  Because it would be too
expensive to port the code to another platform.  There are enough COBOL
applications out there to keep maintenance programmers employed for decades
to come.  That is, until IBM discontinues those mainframes and forces their
customers to another architecture with another OS and another new language.


     I have seen projects shift hardware in mid-development.  Why?  Because C
was the implementation language and the code was written with portability in 
mind.  One can easily upgrade to newer hardware technology with little change
to the existing investment in software.  What other language is available on
everything from a PC up to a Cray?  Which languages and operating systems are
being moved onto newer architectures?  C may not be the latest in language 
technology, but that is not what the commercial world wants.  They want to get 
the job done and not have to throw resources at constantly reworking things.  
Why do you think there is so much interest in programmer productivity and 
software reuse?  I think that C (and UNIX) has shown the world the value of 
hardware independence.  The companies who have interests in proprietary systems
do not like this, but it is too late to stop the ball from rolling.  The 
computer industry has seen the leverage that a non-proprietary, machine 
independent system can provide.  UNIX and C are responsible.  Future OS and 
programming language designers must build upon this foundation in the future.  
There is no turning back now.



George W. Leach					Paradyne Corporation
{gatech,rutgers,attmail}!codas!pdn!reggie	Mail stop LF-207
Phone: (813) 530-2376				P.O. Box 2826
						Largo, FL  34649-2826

mason@tmsoft.UUCP (Dave Mason) (01/05/88)

Pase & Sommarskog have been bashing each other for a while now, and
both of them have made some valid points.  Let me just add another
viewpoint in case anybody else is interested in this debate.
(Hmmm 45 minutes later this has gotten rather long, but there are some
newish (and maybe even interesting :-) ideas, and only about 70 of the
lines are mine, the rest are context)

In article <1522@ogcvax.UUCP> pase@ogcvax.UUCP (Douglas M. Pase) writes:
>Since this is a reply to a reply, comments by sommar@enea.UUCP (Erland
>Sommarskog) will be prefixed by `ES>'.  My comments (Douglas M. Pase
>(pase@ogcvax.UUCP)) will be prefixed with `DP>'.
>
>ES> To appreciate a modern language you must be a modern programmer.
>I am neither in favor of change nor against it.  I am in favor of flexible
>languages and programmers understanding what they are doing.
>
>A far better summary of my opinions, I think, would be "The right tool for
>the job."  I don't use C for combinatorial searches, and I don't use Prolog
>for system work.  I don't use Modula-2 for anything.

I think part of the point here is that there aren't many `modern
languages'.  Particularly not in this discussion (here I don't mean
modern to mean `recently designed', but rather `designed with new and
better ideas').  C, Modula2, Pascal and to some extent Ada are
equivalent languages, 15 years old with minor syntactic sugar
differences.  The major differences are in how strictly the compilers
enforce language rules.  (Ada does have some `new' (and maybe even some
`better') ideas.  Prolog has some `new' (and probably `better' for some
application areas) ideas.)

>ES> [...] a good language should save you from many mistakes as early as possible.
>
>DP> This is a noble goal.  A major problem with this is the definition of what
>DP> constitutes a 'mistake'.  [followed by an example...]
>
>This doesn't answer the question.  Is it just a matter of type casts?  What
>other expressions are considered `mistakes'?  When are explicit type
>conversions required?  Whenever the types do not match exactly?
> ...
>I may be unusual in this (I doubt it), but programming errors in this area
>give me few problems.  What's more, I've never had much difficulty finding
>them when they did occur.  C does what I expect, and I expect what it does.

I feel there are 2 areas where C is more error prone than the Pascal
class of languages: parameters and pointer arithmetic.  I think the
new ANSI C definition with function prototypes addresses all the
problems with value parameters, and wish the compilers required
function prototypes all the time.  I feel previous C compilers left
too much of the consistency validation to `lint', which because it is
a separate program is too often forgotten.  var parameters in C are a
design flaw, but I don't think they cause TOO many bugs (not of the
lurking type anyway).  Pointer arithmetic is a trade off: lots of
power ==> potential errors.  I think it's worth the trade-off.

>DP>  [Runtime] bound checks are expensive in terms of efficiency.
>ES> This, I'd say, is a really old-fashioned argument.
>
>This is another error I have had few problems with.  I am always careful
>to put the checks in, but I put them in where they are less likely to slow
>the execution.  For example, I build the checks into the preliminary stages
>of loops, rather than array accesses within loops.  That way I check once
>at the beginning of the loop that the indexes will not overrun, rather than
>a dozen times within each pass of the loop.  Few compilers have the smarts
>to do equivalent things for me, and I'd usually end up doing it anyway.

I agree with both on this one.  Bounds checks are worth having (partly
because not all programmers have DP's discipline, and you usually
over-run bounds when the program has been in place & working for
years (unless you test it RIGHT)).  I have a language I've designed
that combines the best of pointer arithmetic & checked bounds and is
easy to compile.  (I.e. the efficiency of pointer arithmetic, with
checked ranges, but the check is done at compile time or outside the loop)
So I feel it's a language design problem, but that the Pascal
languages didn't solve it right.

>DP> [C] assumes (rashly) that the programmer knows what s/he is doing.
>ES> And this is just so wrong, wrong, wrong! We have lived with computers so
>ES> long that we know that programmers very often are only dimly aware
>ES> of what they are doing.
>
>This is not the problem of a language, it is the problem of a programmer!
Yes, and no.  At a certain level, we better hope the programmer knows
what they are doing (regrettably hope is often all we have).  But that
does not mean that the language should stand idly by and watch the
programmer foul up.  All of us as programmers can deal with certain
levels of complexity & detail (with order-of-magnitude differences
between programmers).  The more the programming language helps us deal
with and hide the complexity and detail, the more likely that Klutze
Programmer's weapons control system will work rather than start WWIII
(and the more likely that the genius programmers among us can build
Truly Great And Important Programs to Better Our Way Of Life).

>ES> [C is used when other languages might be better, because C is more
>ES> standardized and portable.]
That's the main reason I use it (although I'm an assembler hack at
heart, too, so the `portable assembler' nature of C attracted me).  I
used to use Pascal, but as I started using more different systems, the
only thing they had in common was C, and as I was getting too old to
re-write all my tools for every new environment, I standardized on C
(and Unix when I can get it). 
>
>Again I say, "The right tool for the job."
Good in theory, but if you want to stay portable, C is often the only
tool (except for Fortran, of course :-).  That doesn't mean I don't
use other languages.  But rarely for anything big.  If it's big, I'll
probably want it in several different environments, and then it's
worth my time writing the utility functions to give me the power that
the other (`better') language would have given me.

One of the most important reasons for knowing other languages though
is the introduction of other programming paradigms.  I'm currently
playing with LOGO, for exactly that reason (and cuz there's a couple
good books by Brian Harvey, and I'm considering teaching first year
computer science in Logo).

> ... We should require the same
>professionalism of programmers that we require of most other engineering
>and scientific disciplines.
Now that's a REALLY scary thought! (Bhopal, India; Challenger; Love Canal)

>ES> Erland Sommarskog
>ES> ENEA Data, Stockholm
>ES> sommar@enea.UUCP
>Doug Pase   --   ...ucbvax!tektronix!ogcvax!pase  or  pase@cse.ogc.edu.csnet

	../Dave Mason,	TM Software Associates	(Compilers & System Consulting)
			Ryerson Polytechnical Institute
	..!{utzoo uunet!mnetor utcsri utgpu lsuc}!tmsoft!mason

eugene@pioneer.arpa (Eugene Miya N.) (01/06/88)

In article <1949@pdn.UUCP> reggie@pdn.UUCP (George Leach) writes:
>to the existing investment in software.  What other language (C) is
>available on everything from a PC up to a Cray?  Which languages
>and operating systems are being moved onto newer architectures?
Plenty of languages: FORTRAN, Pascal, and C.  I'm just waiting for
PC/DOS on a Cray ;-).

More seriously, I'm growing to appreciate Icon (programming C since
1977), but these (Ada(tm), C, Pascal, Fortran, Icon, Basic, etc.)
are all imperative languages.  I don't think I would teach introductory
programming using C.  It's just a personal preference that I use C.
Talk about LISP (one person mentioned Prolog).  What about
Smalltalk-80(tm)?  Simula-67?  Other declarative functional languages?
Backus is working on an interesting new language named FL.  VAL/ID/SISAL
are all interesting languages.  Maybe we should talk about the
requirements of languages?  Performance was specifically placed second
with Ada because of problems in software development.  I don't know?
What's modern?  Wirth gives lots of Modula-2's credit to Mesa.
Talk to me about classes of languages.  Post the follow up.

From the Rock of Ages Home for Retired Hackers:

--eugene miya, NASA Ames Research Center, eugene@ames-aurora.ARPA
  "You trust the `reply' command with all those different mailers out there?"
  "Send mail, avoid follow-ups.  If enough, I'll summarize."
	ignore this time.
  {uunet,hplabs,hao,ihnp4,decwrl,allegra,tektronix}!ames!aurora!eugene

sommar@enea.se (Erland Sommarskog) (01/06/88)

Since this discussion is getting embarrassingly long, I'll try to stay
brief. I had intended to write no more articles on the subject, but
Doug Pase makes some points that tempt me to continue a little more.

On type conversions: (Explicit or not?)
  The key word here for me is data abstraction. Even integer to integer
may be wrong.
  Say we have a data structure which has an index array (IA). Each record 
contains a pointer to a data block. So to address a specific word we 
need two indexes, one in the IA and one in the block. We declare two types:
(Using Pascal here)
   SIA_range : (some interval);      (* Index in the IA *)
   DIL_range : (some interval);      (* Index in a block *)
Assignment from one type to the other does seem like an error. (It is
easy to swap the parameters in a procedure call.) The compiler I have 
doesn't mind, so I had to keep my eyes open for such mistakes. With Ada, 
for instance, I could leave this task entirely to the compiler.

It is these kinds of logical mistakes I am talking about. Not the risk
of losing some accuracy. (But allowing assignment of reals to integers
without explicit conversion was something I thought only Fortran did. 
Does C do that too?)

Array-index checking:
>This is another error I have had few problems with.  I am always careful
>to put the checks in, but I put them in where they are less likely to slow
>the execution.  For example, I build the checks into the preliminary stages

If you write the checks yourself, that seems like a source of error to me.
To put it more bluntly, it sounds like the NIH syndrome...

Another comment on range checking in loops: It's only safe to leave
the checks out if the index can't be altered while in the loop. I don't 
know about C, but this is certainly possible in Pascal. Ada, on the other
hand, regards the loop index as a constant.

Generally, a good compiler should give you the possibility to suppress
checking, either for the entire program, or just for a part of it. So you   
can always remove the checks on delivery. The question is just: Dare
you? Hostile users may provide data you've never dreamt of.

On "knowing what he's doing"
>ES> And this is just so wrong, wrong, wrong! We have lived with computers so
>ES> long that we know that programmers very often only are just dimly aware
>ES> of what they are doing.
>
>This is not the problem of a language, it is the problem of a programmer!  If

Yes. But he should get as much support as possible from the language. Thus,
the compiler should suspect any input.

> A programmer is simply a
>person who translates concepts from one language (that of the application) to
>concepts in another language (that of the programming model).  It is very
>similar to natural language interpreters.  If I were to hire a Russian
>to interpret for me, I would want one who understood both Russian and English.

Not really. Your Russian interpreter doesn't need to have a full notion
of what you are talking about as long as he gets the words right. But the 
programmer must. He must understand everything in the specification he has.
Very often he can't. The text may not be clear enough. The one who wrote it 
may not even know what he really meant, because he doesn't know what he wants.
  
What has this to do with C? Well, a powerful language with good facilities
for abstracting the data is a help here in seeing: "What is to be done?". 
Does C have good devices for this? I doubt it. Strong type checking is
a must here, I believe.

>ES>                    [opinionated comment deleted, again]
You'll see it soon again. It happens to have been part of my signature
file for a while. Anyone who takes it seriously has only himself to blame.


-- 
Erland Sommarskog       
ENEA Data, Stockholm    
sommar@enea.UUCP        
                   C, it's a 3rd class language, you can tell by the name.

eugene@pioneer.arpa (Eugene Miya N.) (01/07/88)

A slightly older reference, I have a newer one some where.

%A John Backus
%A John H. Williams
%A Edward L. Wimmers
%T FL Language Manual (Preliminary Version)
%I IBM Almaden Research Center
%C San Jose, CA
%R RJ 5339 (54809)
%D November 1986
%K FP successor,

From the Rock of Ages Home for Retired Hackers:

--eugene miya, NASA Ames Research Center, eugene@ames-aurora.ARPA
  "You trust the `reply' command with all those different mailers out there?"
  "Send mail, avoid follow-ups.  If enough, I'll summarize."
  {uunet,hplabs,hao,ihnp4,decwrl,allegra,tektronix}!ames!aurora!eugene

pase@ogcvax.UUCP (Douglas M. Pase) (01/08/88)

In article <enea.2582> sommar@enea.UUCP(Erland Sommarskog) writes:
>On type conversions: (Explicit or not?)
>  The key word here for me is data abstraction. Even integer to integer
>may be wrong.

Had I known you were referring to data abstractions we would have been in
agreement long ago.  I absolutely agree that they are extremely useful.
The Pascal derivatives, unfortunately, do not implement them either.  That
is one of the advantages I had in mind when I said earlier that Ada has some
good points.  CLU is another excellent language which has this feature.

>Array-index checking:
>If you write checks yourself, that seems like a source of error to me.
>More brutely said, it sound like the NIH syndrome...

If you* wait for the compiler or runtime system to tell you when you make a
mistake, that's lazy, perhaps even irresponsible programming to me.  If you
understood what you were doing, you would have reasonable confidence in your
code.  This does not mean that such checks are not useful, but a heavy
emphasis on them suggests such programmers don't know what they are doing.
Again, I am not maligning array checks -- I am saying that programmers
should understand their own code!

[*"You" in this paragraph is the "generic" form.  It does *not* specifically
refer to ES.]

>On "knowing what he's doing"
>Yes. But he should get as much support as possible from the language. Thus,
>the compiler should suspect any input.

Not the compiler -- THE PROGRAMMER!  The compiler can't help much when a user
enters data the program isn't prepared to handle.  In fact, it can't even
recover.  It can only blow up.  Programmers should make reasonable checks to
ensure the data conforms to expectations, BECAUSE COMPILERS CAN'T.

>> [A programmer is simply a translator]
>
>Not really. Your Russian interpreter doesn't need to have a full notion
>of what you are talking of as long as he gets the words right.

And how do you suppose he gets the words right if he doesn't understand what
he's saying?  Does "The vodka was good but the meat was rotten" ring a bell?

>What have this to with C? Well, a powerful language with good possibilities
>abstracting the data is a help here to see: "What is to be done?". 
>Does C have the good devices for this? I doubt. Strong type checking is
>a must here, I believe.

Wonderful.  You have just constructed a "straw man" to tear down.  You could
as easily have complained about C's lack of pattern matching ability, or the
fact that it has no backtracking.  My comments were never designed to defend
C for all applications, nor do they assert that improvements could not be
made.  They do not even assert that C is better than every other language.  I
do intend to say that C can be very useful as a tool to a programmer who takes
the time and effort to use it correctly, and those who condemn it as a
language are unjustified.

>Erland Sommarskog       
>ENEA Data, Stockholm    
>sommar@enea.UUCP        
>                   [I blame no one, but the comment is deleted, again]
--
Doug Pase   --   ...ucbvax!tektronix!ogcvax!pase  or  pase@cse.ogc.edu.csnet

craig@comp.lancs.ac.uk (Craig) (01/09/88)

>Talk about LISP (one person mentioned Prolog).  What about
>Smalltalk-80(tm)?  Simula-67?  Other declarative functional languages?
>Backus is working on an interesting new language named FL.  VAL/ID/SISAL

>Talk to me about classes of languages.  Post the follow up.

One of the classes of languages that has received little attention is the class
of concurrent languages.  VAL and SISAL are mentioned above and come under the
general class of data flow languages.  One aspect of the single
assignment data flow languages (e.g. SISAL) is that they appear to be functional
in nature.  Is this a fact of life?  Do all data flow languages need to be
functional?  I would say no; it is in fact
a product of our upbringing with von Neumann machines, iterative programming
structures etc.  The Single Assignment Rule (SAR) appears to be an attempt to
control the chaos of the iterative programming language construct.  If we accept
that iteration and recursion are equivalent, why not then just stick to
recursion?  Sure, make it look iterative (so that we all don't get technology 
shock :-)), but remove iteration from the program model (we could talk about
GOTOs if you like :-) :-)).  The actor model and continuations have a solution
to distributing recursion, so why not use it?

Anyway, back to the subject. I feel that few modern languages can afford
to ignore the problem of concurrency (be it parallelism or distribution), and
that the determination of that concurrency should be implicit (i.e. tasks
etc. are right out).  Those interested in implicit concurrency should
look at languages such as Ether (an actor language), SISAL (data flow),
or get my forthcoming internal report (ah ha, you say, all is revealed) on the 
implicitly parallel, non-single-assignment (more precisely, not very many
assignments) data flow language (AARC - Implicitly parallel Data Flow
Language, internal report LG-02-88, available end of Feb 1988). All reports
are free from the University of Lancaster at the address given below (or via
e-mail to me).


Craig.

-- 
UUCP:	 ...!seismo!mcvax!ukc!dcl-cs!craig| Post: University of Lancaster,
DARPA:	 craig%lancs.comp@ucl-cs          |	  Department of Computing,
JANET:	 craig@uk.ac.lancs.comp           |	  Bailrigg, Lancaster, UK.
Phone:	 +44 524 65201 Ext. 4476   	  |	  LA1 4YR

sommar@enea.se (Erland Sommarskog) (01/10/88)

According to the follow-ups I was a bit vague on some points. This is an
attempt to clarify.

>(Using Pascal here)
>   SIA_range : (some interval);      (* Index in the IA *)
>   DIL_range : (some interval);      (* Index in a block *)

I just used Pascal notation, being lazy. Note that Pascal allows
assignments from SIA_range to DIL_range. The intervals are only
there for range checks. As a whole, I have never promoted Pascal
as a "modern" language, and I never will. 

>the compiler should suspect any input.
That is, any input from the programmer, i.e. the code itself. Not the
input the programmer gets from *his* users. That input he must validate
and check himself. 

>Not really. Your Russian interpreter doesn't need to have a full notion
>of what you are talking of as long as he gets the words right. But the 
>programmer must. He must understand everything in the specification he has.
>Very often he can't. The text may not be too clear. The one who wrote it 
>may not even know what he really meant, because he knows not what he want.
If one speaker is (deliberately) vague or ambiguous, the interpreter can
(and should) keep this vagueness in his translation. The programmer
can't. He must find out the facts. Note also that the programmer is 
"translating" to a "language" on a lower level.
-- 
Erland Sommarskog       
ENEA Data, Stockholm    
sommar@enea.UUCP        
                                                   Post more reviews!

mac3n@babbage.acc.virginia.edu (Alex Colvin) (01/13/88)

In article <462@dcl-csvax.comp.lancs.ac.uk>, craig@comp.lancs.ac.uk (Craig) writes:
> that iteration and recursion are equivalent why not then just stick to
> recursion, sure make it look iterative (so that we all don't get technology 
> shock :-)) but remove iteration from the program model (we could talk about
> GOTOs if you like :-) :-)).  The actor model and continuations have a solution
> to distributing recursion, why not use it.

Iteration and recursion aren't quite the same: only some cases of
recursion are iterative.  Those are usually easier to describe with
iterative constructs. 

As for single-assignment languages, check out Lucid (the stream
language, not the Lisp sublanguage). 

One problem I have with Lucid (and Icon, for that matter) is that a
single unit can't easily generate several unsynchronized streams.  This
makes it really hard to generate intermittent debugging messages without
destroying your program.