[comp.lang.c] Modern languages

pase@ogcvax.UUCP (Douglas M. Pase) (12/31/87)

J_ELORAN@FINJYU.BITNET writes:
>   [...]
>
>   Long live C!

sommar@enea.UUCP(Erland Sommarskog) writes:
>
>But C is no new language. Well, maybe a little younger than Fortran and
>Cobol, but not much. And they are all archaic.

Someone once said that Lady Algol was an improvement on most of her successors.
That includes C, Pascal, and Modula-2 (but not earlier versions of FORTRAN).

> [Some naive comments on why C will outlive its usefulness, including
> incorrect remarks about the relationship between Unix and C]

>Why do I find C archaic? Let me just say I think that a good language
>should save you from many mistakes as early as possible.

This is a noble goal.  A major problem with this is the definition of what
constitutes a 'mistake'.  Stuffing a 16 bit value into an 8 bit slot under
some circumstances may constitute an error.  Other times it may not.
Explicitly specifying what should be obvious semantically causes programs
to be verbose and tedious, both to read and to write.
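
To make the point concrete, here is a small sketch (mine, not from the
article) of C quietly narrowing a value on assignment; whether that is a
mistake depends entirely on what the programmer meant:

	#include <stdio.h>

	main()
	{
		int  wide;
		char narrow;

		wide = 0x1234;			/* more than 8 bits of significance */
		narrow = wide;			/* C keeps only the low-order bits, without complaint */
		printf("%x\n", narrow & 0xff);	/* prints 34 */
	}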

Overrunning the end of an array is usually accepted as an error.  BUT,
allowing array bounds to be determined at runtime can also be an extremely
useful feature.  It allows more flexible and efficient usage of memory.
Unfortunately, detecting array overruns at compile time and allowing run time
array definition are difficult to put together, and runtime bound checks are
expensive in terms of efficiency.
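
As a sketch of that trade-off (my illustration, not the poster's): when the
size is known only at run time there is nothing for the compiler to check
against, and a hand-written check costs a comparison on every access:

	#include <stdio.h>
	#include <stdlib.h>

	int	vec_size;
	double	*vec;

	store(i, x)			/* checked store: safe, but pays a test per call */
	int i;
	double x;
	{
		if (i < 0 || i >= vec_size) {
			fprintf(stderr, "subscript %d out of range\n", i);
			exit(1);
		}
		vec[i] = x;
	}

	main()
	{
		vec_size = 100;		/* could just as easily come from user input */
		vec = (double *) malloc(vec_size * sizeof(double));
		store(5, 3.14);		/* fine */
		/* vec[200] = 3.14;	   unchecked: would quietly trample memory */
	}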

Some things are almost universally recognized as big no-nos.  An example might
be using a double precision floating point value as a structure pointer.  C
doesn't permit that sort of operation any more readily than other languages do.
But then,
the arguments never seem to be over that anyway.  They usually end up being
stylistic arguments (and everyone has their own opinion on what constitutes
'good' style/syntax).

Another source of criticism is features which at times may be useful, at
other times may be hazardous.  C takes a liberal view of the world.  It assumes
(sometimes rashly) that the programmer knows what s/he is doing.  The saw that
cuts the wood will also cut the hand.  We have known that for years, but we
still have saws.  (And no, I don't want to hear about how C is missing guards
that keep hands out -- the guards are there, but they must be used to be
effective.)

>It should also
>be readable, i.e. it shouldn't unnecessarily use artificial symbols.

Ah, the old verbosity argument.  Shorthand should never be used in business
because only those who know shorthand can read it.  Let's hear it for English
as a programming language.  C *is* readable to those who know C.  Now I do not
argue that C cannot be made unreadable, but then, so can APL, Lisp, Modula-2,
Pascal, and FORTRAN.  I will be the first to agree that some operators in C
are a bit strange, and frankly I don't know how to use them.  But every time
I try to include a particular op in a list of indefensible operators, someone
shows me a case I hadn't thought of in which it is indispensable, or at least
very useful.
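
For what it's worth, one such case (an example of mine, not from the article):
the comma operator lands on nearly everyone's list of indefensible operators,
yet it earns its keep when a loop has to step two things at once.

	#include <string.h>

	/* Reverse a string in place.  The comma operator lets the for loop
	 * initialize and advance both ends of the string together.
	 */
	reverse(s)
	char *s;
	{
		char *p, *q, t;

		if (*s == '\0')
			return;
		for (p = s, q = s + strlen(s) - 1; p < q; p++, q--) {
			t = *p;
			*p = *q;
			*q = t;
		}
	}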

>Modern languages, but still in the 3rd generation, are for example
>Ada, Modula-2 and Eiffel.
      ^^^^^^^^
Modula-2 is hardly a modern language, at least in the sense that it brings any
new ideas with it.  It is a rather poor extension of a language which, in its
purest form, is overly simple.  It is closer to a repackaging of an older
language than it is to being anything new.  For a more thorough criticism of
Modula-2, look up an article I wrote in the November 1985 ACM SIGPLAN Notices,
"System Programming in Modula-2".

Ada has much in its favor, but also its share of problems.  I know nothing of
Eiffel.

> [An illustration of C as a low level language]

I agree, C is a low level language, at least in the sense that it gives you
a programming model which is still very close to the machine.  It was intended
to be that way.  It will never replace higher level languages.  It was not
intended to.  I don't use C when I should use Lisp, APL, SmallTalk, ML, or
Prolog.  This shouldn't be a surprise to anyone -- I don't use a hammer to cut
boards, either.

>-- 
>Erland Sommarskog       
>ENEA Data, Stockholm
>sommar@enea.UUCP
>                   [Opinionated comment deleted]

--
Doug Pase   --   ...ucbvax!tektronix!ogcvax!pase  or  pase@cse.ogc.edu.csnet

V4039%TEMPLEVM.BITNET@CUNYVM.CUNY.EDU (Stan Horwitz) (01/02/88)

  Regarding the criticisms of C recently posted, as a new C programmer, just
as a hobby ... not work related, I find the whole language very strange
but intriguing.  It does offer the imaginative programmer lots of control
not easily available in higher level languages.  Its syntax is strange
but that can be fixed to some extent by simply creating a file with
#define statements defining things in easier terms.  I was tempted to
set up just such a file of definitions with the goal of making the syntax
appear similar to that of Pascal, which I know very well, but this is not
something one does when learning a language.  Some of the symbols used
for operations are insane.  If C's authors had used more care in
selecting symbols, it would have been a little more difficult
to make stupid errors.
   I have one question though.  The whole concept of C is in my opinion
innovative in that the authors have developed a very powerful higher level
assembly language.  It gives all the power of assembly language plus much
more without sacrificing too much in the way of efficiency of object code.
The question is, given the imagination of C's authors, why couldn't they
think of a better name for the language?  Where the heck did the
name come from?  Was the name C the result of a night of heavy drinking
or what?  Not that it really matters, but I am just curious.

   Happy New Year ... Stan Horwitz  V4039 at TEMPLEVM

farren@gethen.UUCP (Michael J. Farren) (01/06/88)

V4039%TEMPLEVM.BITNET@CUNYVM.CUNY.EDU (Stan Horwitz) writes:
>Its syntax is strange
>but that can be fixed to some extent by simply creating a file with
>#define statements defining things in easier terms.

This is an astoundingly bad idea.  Doing this may result in the program
being more understandable for you (at least, until you learn C a little
better), but will make it damn near indecipherable for anyone else.  A
programmer should NEVER make the assumption that his code will never
be read or modified by anyone else; make it clear enough that anyone
who knows the language can understand it with a little work.  Excessive
use of #define statements leads to an incredible mishmash - check the
Obfuscated C Contest entries for extreme examples.

>Some of the symbols used for operations are insane.

No, just unusual.  APL, on the other hand, is insane :-)

-- 
Michael J. Farren             | "INVESTIGATE your point of view, don't just 
{ucbvax, uunet, hoptoad}!     | dogmatize it!  Reflect on it and re-evaluate
        unisoft!gethen!farren | it.  You may want to change your mind someday."
gethen!farren@lll-winken.llnl.gov ----- Tom Reingold, from alt.flame 

noise@eneevax.UUCP (Johnson Noise) (01/07/88)

In article <519@gethen.UUCP> farren@gethen.UUCP (Michael J. Farren) writes:
>V4039%TEMPLEVM.BITNET@CUNYVM.CUNY.EDU (Stan Horwitz) writes:
>>Its syntax is strange

>>Some of the symbols used for operations are insane.
>
>No, just unusual.  APL, on the other hand, is insane :-)

	I definitely agree about APL (overstrikes? definitely the product
of a twisted mind (I know, limited symbols etc.)).  But, really guys, I
think that the syntax of C is not strange at all.  It all seems logical
if one has the mindset of an assembler programmer (I reserve the right to
restrict my generalization to VAX-11, MC68000, and, of course, PDP-11).
Many of the C constructs (both source and object) mirror assembly, which
is very cool.  Also, I'd rather type

char foo(i, j)
int i, j;
{
}

than the "more readable"

FUNCTION foo(i, j: INTEGER): CHAR;
BEGIN
END;

and so on and so forth.
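
To put the "mirrors assembly" point in concrete terms (a sketch of mine; the
exact instructions naturally depend on the compiler and the machine):

	/* Copy n bytes.  On a PDP-11 the loop body compiles to something very
	 * close to a single MOVB with autoincrement addressing on both operands.
	 */
	copy(dst, src, n)
	char *dst, *src;
	int n;
	{
		while (n-- > 0)
			*dst++ = *src++;
	}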

peter@sugar.UUCP (Peter da Silva) (01/07/88)

In article ... V4039%TEMPLEVM.BITNET@CUNYVM.CUNY.EDU (Stan Horwitz) writes:
>   Regarding the criticisms of C recently posted, as a new C programmer...
>                                              Its syntax is strange
> but that can be fixed to some extent by simply creating a file with
> #define statements defining things in easier terms.

NO! NO! NO! NO! NO! NO!

Sorry. This is one of the first things a beginning 'C' programmer seems to
want to do. I don't know why I never had the urge (especially since I was
a Pascal fanatic before I was introduced to 'C'), maybe it's a character
flaw... BUT...

This is the greatest way in the world to create totally unreadable and
unmaintainable code.

	1. No two people use the same set of #defines. You look at a
	   piece of code that's been pascalised, and you have no idea
	   whether it's even syntactically correct.

	2. It keeps you from really learning 'C'. So long as you think of 'C'
	   as just Pascal with a weird character set, you'll have no end
	   of problems with the language. Making it look like Pascal is
	   going to extend this learning period indefinitely.

	3. It makes debugging much harder, because the compiler will be giving
	   you error messages that have little relation to the source code.

> I was tempted to
> set up just such a file of definitions with the goal of making the syntax
> appear similar to that of Pascal which I know very well, but this is not
> something one does when learning a language.

My god, sanity prevails. Thank you. Now I know I won't be getting any
pascalised 'C' from you in comp.sources.

> Some of the symbols used
> for operations are insane.

Do you mean clinically insane, or just unusual?

> Where the heck did the
> name come from?  Was the name C the result of a night of heavy drinking
> or what?  Not that it really matters, but I am just curious?

C's predecessor was a language called B.
-- 
-- Peter da Silva  `-_-'  ...!hoptoad!academ!uhnix1!sugar!peter
-- Disclaimer: These U aren't mere opinions... these are *values*.

chip@ateng.UUCP (Chip Salzenberg) (01/12/88)

In article <1374@sugar.UUCP> peter@sugar.UUCP (Peter da Silva) writes:
>
>In article ... V4039%TEMPLEVM.BITNET@CUNYVM.CUNY.EDU (Stan Horwitz) writes:
>>Its syntax is strange
>>but that can be fixed to some extent by simply creating a file with
>>#define statements defining things in easier terms.
>
>This is the greatest way in the world to create totally unreadable and
>unmaintainable code.

So true.  Peter Bourne wrote the Bourne shell using macros that make the
program itself look like a shell script:

	/* Here is some code in Bourne shell style
	 */
	IF a == 1 THEN this; that; the_other FI
	WHILE condition()
	DO      stuff;
		more stuff
	OD

This code is so difficult to read and maintain that I gave up on it and
ported the C shell instead. (Thanks, Bill!)
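
For anyone who hasn't seen those sources, the macros behind that style run
roughly along these lines (reconstructed from memory, so treat them as a
sketch rather than the verbatim definitions):

	#define IF	if (
	#define THEN	) {
	#define ELSE	} else {
	#define ELIF	} else if (
	#define FI	; }
	#define WHILE	while (
	#define DO	) {
	#define OD	; }
	#define BEGIN	{
	#define END	}

Which is exactly the debugging problem mentioned earlier in this thread: the
compiler complains about braces and parentheses that never appear in the
source the programmer actually typed.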

-- 
Chip Salzenberg                 UUCP: "{codas,uunet}!ateng!chip"
A T Engineering                 My employer's opinions are a trade secret.
    "Anything that works is better than anything that doesn't."  -- me

gwyn@brl-smoke.ARPA (Doug Gwyn ) (01/12/88)

In article <149@ateng.UUCP> chip@ateng.UUCP (Chip Salzenberg) writes:
>So true.  Peter Bourne wrote the Bourne shell using macros that make the
>program itself look like a shell script:

Any relation to Steve?

>This code is so difficult to read and maintain that I gave up on it and
>ported the C shell instead. (Thanks, Bill!)

The Bourne shell sources were turned back into non-ALGOLized C several
years ago. (Thanks, Dave!)

john13@garfield.UUCP (John Russell) (01/12/88)

In article <519@gethen.UUCP> farren@gethen.UUCP (Michael J. Farren) writes:
>V4039%TEMPLEVM.BITNET@CUNYVM.CUNY.EDU (Stan Horwitz) writes:
>>Its syntax is strange
>>but that can be fixed to some extent by simply creating a file with
>>#define statements defining things in easier terms.
>
>This is an astoundingly bad idea.  Doing this may result in the program
>being more understandable for you (at least, until you learn C a little
>better), but will make it damn near indecipherable for anyone else.  

I wonder about this. One of my common uses for #defines is to write
functions with short, mnemonic names that _I_ am comfortable with, and then
provide verbose, self-explanatory macro names that automatically cast the
types of the parameters. I can write programs for my own use quickly, or
easy-to-understand code with a little more effort. I might even use "verbose"
mode for a tricky section where I want to make sure I don't get lost.

e.g.:

#define forward(x) fd((double)(x))

/* move graphics cursor forward by some predefined amount (a la Logo) */

fd(distance)
double distance;
{ ... }

main()
{
 double l1;
 int l2;
 short l3;
 long l4;

 /* ... assign some values to these variables ... */

 /* I don't want to assume that everyone can follow my notation, so... */

 forward(l1);
 forward(l2);
 forward(l3);
 forward(l4);
}

More readable to someone else, portable, and if I mix different types I
don't have to worry about what type C will promote the result to. Of course
this means I could pass a pointer without complaint, but you have to give
the programmer *some* credit :-).

John

PS I'm sure once you've programmed a few operating systems this becomes
academic.
-- 
"Operating systems room, prepare for shutdown."
"I never thought I'd be HAPPY to see our ratings do DOWN!"
		-- lots of these were sprinkled throughout the 
		   last broadcast episode of _Max_Headroom_