[comp.lang.ada] problems/risks due to programming language

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe, 2847 ) (02/22/90)

From hammondr@sunroof.crd.ge.com (Richard A Hammond):
>>     3. The AT&T breakdown a month ago was caused by a break statement
>>	in C. See the following mail (multiple forwarding headers deleted):
> 
>>| | In the switching software (written in C), there was a long
>>| | "do . . . while" construct, which contained
>>| |    a "switch" statement, which contained 
>>| |       an "if" clause, which contained a
>>| |          "break," which was intended for
>>| |       the "if" clause, but instead broke from
>>| |    the "switch" statement.
>>| | 
>>
>>	Again it looks like this bug wouldn't have occurred in another
>>	programming language.
% 
% What other programming language?  Only one without any GOTO or restricted
% GOTO (e.g. exit, break, ...).  This leaves out Ada!!!!!!
% 
%  [...]  for N in 1 .. MAX loop 
% 	    case ...
% 	    when ... =>
%                 if NEW_ITEMS(N) = FALSE then 
% 			-- some other useful work gets done here
%                         exit; 			-- exits loop, not if!
%                 end if; 
% 	    when ... =>
%   [...] 
% So, in the AT&T case using Ada we would have exited both the switch and the
% loop rather than just the switch.  Hardly an improvement!

   This is not a valid analogy.  In C, the case statement *requires* the
   use of a restricted GOTO in order to accomplish "normal" processing;
   at the end of the section of code processing a given case, one must
   use a restricted GOTO in order to prevent C from sending the flow of
   control straight into the section of code which was intended to process
   the NEXT case.  In other words, C requires the programmer to use a
   dangerous construct on a routine basis.

   With the if construct in C, the default is to exit the if construct 
   automatically, as opposed to continuing on to execute the section of
   code associated with the else part.  Thus, we have an inconsistency
   in C's design: with one flow-of-control construct (the switch), it is
   necessary to use a dangerous GOTO to achieve normal processing, whereas
   with a similar flow-of-control construct (the if-else), the default is
   reversed.  Given such a language design, it should not surprise anyone
   that programmers become confused, particularly when the constructs are
   being used together. 

   Ada, on the other hand, is consistent: in both the if and case statements,
   the default is to exit the construct once the code associated with the
   specified situation has been executed.  Ada also provides the exit
   statement, a restricted GOTO which permits a loop to be exited early,
   but this construct is not used (as is C's break) on a routine basis.  
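The fallthrough behavior described above can be shown in a few lines of C. This is a minimal sketch of my own (the function name `visit` and the string log are illustrative, not from any post): omitting `break` lets control run from one case's code straight into the next case's code, something an if/else arm never does.

```c
#include <string.h>

/* Minimal sketch (illustrative, not from the AT&T code): in a C
 * switch, a case label does NOT end the previous case's code.
 * Without a "break", control falls through into the next case. */
void visit(int x, char *log)
{
    switch (x) {
    case 1:
        strcat(log, "one ");
        /* no break here: execution falls through into case 2 */
    case 2:
        strcat(log, "two ");
        break;          /* the "restricted GOTO" needed to leave normally */
    case 3:
        strcat(log, "three ");
        break;
    }
}
```

Calling visit(1, log) on an empty buffer leaves "one two " in it: both the case-1 and case-2 code ran. An Ada case alternative would have executed only its own code.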

> In my limited experience the cases where Ada is introduced into a
> programming environment also introduce lots of other good software
> engineering practices.  For example, lots of people I know who
> program in C don't use LINT.   I view it as a deficiency of management
> and not of the language that they don't use available tools.

   This is certainly true; Brooks and others have noted that the good
   software engineering practices which are routinely introduced in
   conjunction with the Ada language are responsible for more of the 
   resulting improvements than the fact that the Ada language was
   introduced.  However, we cannot disregard the fact that Ada was
   specifically designed to provide maximal support for the software 
   engineering process.  C, on the other hand, was designed to provide
   maximal support for the compilation process.  Since compilers and 
   the CPU power required to operate them come far more cheaply than 
   programmers, and especially in view of the fact that better error
   prevention is worth much more than faster compilation, it would seem
   that the tradeoff made by Ada is certainly the one to be preferred.


   Bill Wolfe, wtwolfe@hubcap.clemson.edu

hammondr@sunroof.crd.ge.com (Richard A Hammond) (02/22/90)

In article <8103@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
...
>   This is not a valid analogy.  In C, the case statement *requires* the
>   use of a restricted GOTO in order to accomplish "normal" processing;
    ...
>   With the if construct in C, the default is to exit the if construct 
>   automatically, as opposed to continuing on to execute the section of
>   code associated with the else part.  Thus, we have an inconsistency
>   in C's design: with one flow-of-control construct (the switch), it is
>   necessary to use a dangerous GOTO to achieve normal processing, whereas
>   with a similar flow-of-control construct (the if-else), the default is
>   reversed.  Given such a language design, it should not surprise anyone
>   that programmers become confused, particularly when the constructs are
>   being used together. 

Bull!!! I've seen lots of errors in C code, and this is the first time I've
seen or heard of such an error.  If it really was all that confusing, I'd
expect to have seen it much more frequently.  In fact, the structures are quite
different, since a case in a switch has a sequence of statements without any
additional brackets, whereas an if statement controls only a single statement,
so it is usually followed by "{" and "}" to bracket multiple statements.
I expect it was an editing error that pulled the "break" inside the
if, rather than leaving it at the end of the case.

If that is true, then the same error could be made in Ada; it certainly is
not PREVENTED in Ada.  As the anti-nuclear protesters will gladly explain,
zero risk and a reduced risk are not the same!!!!

>   ...  C, on the other hand, was designed to provide
>   maximal support for the compilation process.

Where did you get this idea?  I believe a better statement is found in
AT&T Bell Laboratories Technical Journal, Vol 63, No 8, Part 2, pg 1686,
Oct. 1984

"A major trend in the development of C is toward stricter type checking,
along the lines of languages like Pascal.  However, in accordance with
what has been called the "spirit" of C (meaning a model of computation
that is close to that of the underlying hardware), many areas of the
language specification deliberately remain permissive."

This does not imply that it was meant to support the compilation process,
but rather, that without huge runtime libraries I could run something
through a C compiler and execute it on a bare machine.

Rich Hammond

jnixon@andrew.ATL.GE.COM (John F Nixon) (02/22/90)

billwolf (William Thomas Wolfe, 2847 ) writes:
>From hammondr@sunroof.crd.ge.com (Richard A Hammond):
>> So, in the AT&T case using Ada we would have exited both the switch and the
>> loop rather than just the switch.  Hardly an improvement!
>   This is not a valid analogy.  In C, the case statement *requires* the
>   use of a restricted GOTO in order to accomplish "normal" processing;

But we aren't talking about using the "break" in this sense; we are talking
about using the "break" to exit an "if", something which isn't part of C.

> In other words, C requires use [of] a dangerous construct on a routine basis.

Just as Ada requires the use of "exit" to leave the "loop" construct;
unless you use Ada's "goto"...

>   With the if construct in C, the default is to exit the if construct 
>   automatically, as opposed to continuing on to execute the section of
>   code associated with the else part.  Thus, we have an inconsistency
>   in C's design: with one flow-of-control construct... use(s) dangerous
>   GOTO [normally] whereas a similar flow-of-control construct... default is
>   reversed.  Given such a language design, it should not surprise anyone
>   that programmers become confused, particularly when the constructs are
>   being used together. 

This argument applies equally to Ada's "loop" construct versus Ada's
"if" construct.

>   Ada, on the other hand, is consistent: in both the if and case statements,

Ignoring the example presented (Ada's loop and exit).

>   the default is to exit the construct once the code associated with the
>   specified situation has been executed.  Ada also provides the exit
>   statement, a restricted GOTO which permits a loop to be exited early,
>   but this construct is not used (as is C's break) on a routine basis.  

Unless one uses the "loop" statement on a routine basis.  Bill may not, but
what if I do?  And if the reversed sense is such a bad example of
programming language design, then Ada is an example of bad programming
language design.  

I'm not trying to tag Ada, or praise C, but simply to make the point that
this case is *not* an example of an error that a language such as C or Ada 
detects.  Both programs are legal, both will compile, both are wrong.
Neither C nor Ada is free of flaws.  However, C has not made claims
such as the one which follows:

>   However, we cannot disregard the fact that Ada was
>   specifically designed to provide maximal support for the software 
>   engineering process.

I agree.

However, Ada doesn't pass the test in this case.  It is possible (nay,
inevitable) that someone will misuse the Ada exit statement.  And it is
likely that someone will correctly use the exit statement
in exactly this fashion.  Too bad you can't tell till runtime.



----
jnixon@atl.ge.com                    ...steinmetz!atl.decnet!jnxion

machaffi@fred.cs.washington.edu (Scott MacHaffie) (02/22/90)

In article <8103@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>   the NEXT case.  In other words, C requires the programmer to use a
>   dangerous construct on a routine basis.

(the dangerous construct is the "break" statement)
But if "break" were renamed "end case" then there wouldn't be any problem?

>   code associated with the else part.  Thus, we have an inconsistency
>   in C's design: with one flow-of-control construct (the switch), it is
>   necessary to use a dangerous GOTO to achieve normal processing, whereas

No, it is necessary to use a statement to indicate that the current case
statement is finished...like an "end case" or the next "when => " in ADA.

>   with a similar flow-of-control construct (the if-else), the default is
>   reversed.  Given such a language design, it should not surprise anyone
>   that programmers become confused, particularly when the constructs are
>   being used together. 

Some programmers become confused -- good programmers, and software engineers,
don't.

>   specified situation has been executed.  Ada also provides the exit
>   statement, a restricted GOTO which permits a loop to be exited early,
>   but this construct is not used (as is C's break) on a routine basis.  

In all the ADA I have seen, the preferred looping construct is a "loop"
with exit statements sprinkled throughout it. On the other hand, C's
"break" statement is only occasionally used to exit a loop prematurely.

>> In my limited experience the cases where Ada is introduced into a
>> programming environment also introduce lots of other good software
>> engineering practices.  For example, lots of people I know who
>> program in C don't use LINT.   I view it as a deficiency of management
>> and not of the language that they don't use available tools.

I don't use LINT.  I use compilers that check certain things I want checked,
like "no prototypes in scope".  LINT does not catch the kinds of mistakes
that I make.  How many ADA programmers do you know who use LINT?

>   This is certainly true; Brooks and others have noted that the good
>   software engineering practices which are routinely introduced in
>   conjunction with the Ada language are responsible for more of the 
>   resulting improvements than the fact that the Ada language was

Well, these practices are certainly NOT being introduced in the universities
(at least not here).

>   introduced.  However, we cannot disregard the fact that Ada was
>   specifically designed to provide maximal support for the software 
>   engineering process.  C, on the other hand, was designed to provide

No, ADA was designed to have everything. The fact that it (or some subset
of it) can be used to do software engineering doesn't mean it was designed
to do it.  Software engineering can be done in any language, including C.

>   maximal support for the compilation process.  Since compilers and 
>   the CPU power required to operate them come far more cheaply than 
>   programmers, and especially in view of the fact that better error
>   prevention is worth much more than faster compilation, it would seem
>   that the tradeoff made by Ada is certainly the one to be preferred.

The "tradeoff made by ADA?" Do you mean poor code generation versus code that
can be read by non-programmers? Or do you mean the language syntax that
creates nightmares for maintenance? (Does Ada allow you to redefine the
precedence of operators?) Or maybe you mean "language that the government
will use" versus "language that non-government related software companies
will use"?

		Scott MacHaffie

pierson@encore.com (Dan L. Pierson) (02/23/90)

In article <5017@csv.viccol.edu.au> dougcc@csv.viccol.edu.au (Douglas Miller) writes:
   Valid but utterly vacuous point, as ADA *was* designed to provide maximal
   support for software engineering.  I suppose it's possible that another
   (hidden?) design goal was to "have everything".  So what?

   >    Software engineering can be done in any language, including C.

   Irrelevant --- the claim here is that ADA provides *maximal* *support* for
   the software engineering process.  Like, if I said "Air travel is the
   fastest way to get to another city" and you said "You don't have to go by
   `plane.  You could go by car, or even on foot", then I'd look at you with a
   slightly glazed expression, right?   Sorry to labor this, but I've seen the
   above point made *too* many times.

This bit of ADA mythology (or dogma) has also been repeated too many times
for me to remain silent.  Yes, ADA did have a goal of maximal support
for the software engineering process.  However other goals (and the
committee requirements and design process) largely subverted that goal
by producing an excessively large, over-specified monster.

You can certainly do software engineering in ADA; it is in most ways a
better language for the purpose than C.  But other languages such as
Modula-3, Eiffel, and maybe Turing provide at least the software
engineering benefits of ADA (though not all the "nifty" features*) in
languages that are small enough to be useable, learnable, teachable,
and efficiently implementable in less than a decade.

I'm not interested in another C vs. ADA vs. my-favorite-language war,
but I'm just plain tired of the line that ADA is equivalent to
software engineering because the DOD and those who base their careers
on it say so.

*In fact, some of these "nifty" features present more opportunity for
misuse, and thus software engineering drawbacks, than benefits.
Operator overloading comes to mind...
--

                                            dan

In real life: Dan Pierson, Encore Computer Corporation, Research
UUCP: {talcott,linus,necis,decvax}!encore!pierson
Internet: pierson@encore.com

grimlok@hubcap.clemson.edu (Mike Percy) (02/23/90)

From article <8103@hubcap.clemson.edu>, by billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe, 2847 ):

[Ada is great, Ada is wonderful, nothing can beat Ada, if you don't program 
 in Ada you are stupider than worm-slime, blah blah blah]

I've used Ada Bill, I even liked it. But, I am really getting tired of reading
your diatribes against every other programming language, of reading your
praises of Ada to the exclusion of all others.  I use the language I feel
is best for the particular problem at hand.  I drop into assembly to 
optimize a speed-critical section of code.  I use C for portability (right,
I know, there are n fully verified Ada compilers running on x different
machines, so how can Ada not be portable?) and speed of coding (my speed
is higher in C than in Ada or any other language; this is subject to
change, of course).  I use COBOL at work because that is the language the
systems I maintain are written in, and I use a quasi-4GL when I can to spit
out 3GL code.  Sometimes I use SNOBOL, Lisp, Prolog, or Pascal.

I've seen some Ada code that must have been written by brain-damaged idiots.
And I've seen some beautifully written assembly.

The point is that programming practices are what make the difference.
It is entirely reasonable to engineer software in C or Prolog or...
Just as it is reasonably easy to mangle Ada.  Granted, Ada lends itself
to good software engineering practices; you could probably even say that it
encourages them.  However, it does not guarantee them, nor does the failure
to use Ada preclude good software engineering practices.

My guess is that Ada will gain a load of users and their respect when there 
is a Microsoft Ada or better yet (or worse yet) a Turbo Ada.

Anyway please try to keep your pro-Ada, anti-everything-else stuff in check. 

-- 

'I just couldn't convince the voters that Dukakis was Greek for
"Bubba".' -- Lloyd Benson explaining why the Democrats didn't carry
Texas

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe, 2847 ) (02/23/90)

From machaffi@fred.cs.washington.edu (Scott MacHaffie):
>>   code associated with the else part.  Thus, we have an inconsistency
>>   in C's design: with one flow-of-control construct (the switch), it is
>>   necessary to use a dangerous GOTO to achieve normal processing, whereas
> 
> No, it is necessary to use a statement to indicate that the current case
> statement is finished...like an "end case" or the next "when => " in ADA.

   Such a statement already exists: either the next "case Value:", or 
   the } which ends the switch.  Why is it necessary to use a "break"?
 
>>   with a similar flow-of-control construct (the if-else), the default is
>>   reversed.  Given such a language design, it should not surprise anyone
>>   that programmers become confused, particularly when the constructs are
>>   being used together. 
> 
> Some programmers become confused -- good programmers, and software 
> engineers, don't.

   The problem cannot simply be defined out of existence by saying,
   in essence, that good programmers don't make mistakes.  All human
   programmers make mistakes, and a well-designed language will help
   to minimize this particular tendency.  In this case, C does not. 

>>   This is certainly true; Brooks and others have noted that the good
>>   software engineering practices which are routinely introduced in
>>   conjunction with the Ada language are responsible for more of the 
>>   resulting improvements than the fact that the Ada language was
> 
> Well, these practices are certainly NOT being introduced in the 
> universities (at least not here).

   In that case, I strongly suggest that you immediately bring 
   this fact to the attention of the software engineering faculty 
   at washington.edu.  At other universities (e.g., Clemson), Ada
   *is* introduced in conjunction with software engineering.


   Bill Wolfe, wtwolfe@hubcap.clemson.edu

kassover@control.crd.ge.com (David Kassover) (02/23/90)

Wow, we are getting vicious, aren't we?

In article <PIERSON.90Feb22110501@xenna.encore.com> pierson@encore.com (Dan L. Pierson) writes:
>In article <5017@csv.viccol.edu.au> dougcc@csv.viccol.edu.au (Douglas Miller) writes:
>   Valid but utterly vacuous point, as ADA *was* designed to provide maximal
>   support for software engineering.  I suppose it's possible that another
>   (hidden?) design goal was to "have everything".  So what?
>
>   >    Software engineering can be done in any language, including C.
...
>
>This bit of ADA mythology (or dogma) has also been made too many times
>for me to remain silent.  Yes, ADA did have a goal of maximal support
>for the software engineering process.  However other goals (and the
...
>but I'm just plain tired of the line that ADA is equivalent to
>software engineering because the DOD and those who base their careers
>on it say so.
>Internet: pierson@encore.com

Ada vs C vs Pascal vs.... and mythology
 
 
I think what follows are verifiable facts.
I am not responsible for things not turning out as they were
intended...
 
 
Ada was designed by a committee (!!???  we can argue the merits
of that elsewhere) which had certain aspects of the computer
programming process that it wanted to address.  Many of these
aspects were based on real-life experience (e.g. real-time
programming, needing to acknowledge error conditions that make
continuance undesirable).  The committee succeeded in answering
most of the desires, and where there was an irresolvable
conflict, it chose.  Therefore, we have goto statements, no
conditional compile, etc.

Pascal was designed by an individual who had as his primary
concern the production of a testbed from which to teach computer
language design (or was it computer language compiler design?
NOT THE SAME THING!)  Furthermore, this individual was a
proponent of a restrictive structured programming theory.
Therefore, we have a rather incomplete language with some
extremely annoying features.

C, and Unix, were built, at least initially, by an individual who
wanted to take a crack at building an operating system from
scratch, on a more or less scrapped piece of hardware, while
employed by an employer which was more or less liberal about such
activities.  Therefore we have a language, and OS, which is
simple, and does some things very well indeed, but addresses
other things not at all.


Now, we're back to educated opinion.  Ada, as implemented, did
not achieve all its goals.  But I submit that it achieved many
of them;  that not all stated goals were even in the original
mix; and that the set of goals was broader in scope than the goal sets
of either C or Pascal.

A long time ago, when I knew less than I do now, I said that Ada,
as specified, was the embodiment of what a Pascal programmer
thought a Cobol programmer ought to have at his disposal.  I
stand by that statement today.  But still, for me, as a programmer
working on a production job, Ada is the language of choice, all
other things being equal  (which they rarely are).  For a
personal programming job that is not likely to ever be touched by
other than me, the language of choice is FORTRAN, simply because
that is the first computer programming language I learned after
BASIC (a very long time ago!), again all other things being equal.


I would like to see this newsgroup return to discussions of Ada
and implementations of Ada.  Or maybe we could create a
sub-newsgroup like comp.lang.flame??  8-)

Dave

lawhorn@optis31.uucp (Jeff Lawhorn) (02/23/90)

It seems to me that if the program in question had been
thoroughly tested it would not matter if it had been written in
C or Ada or Pascal or anything else.  The bug would have been
found, and AT&T's network would not have been knocked off the
air.  In **EVERY** language it is possible to write valid code
that does not do what the programmer wants.  This is why every
possible code path should be tested before it is placed into a
production system.  Let's hope that AT&T didn't learn not to use
C or Ada or whatever, but that they learned to test their
software better before using it in a production system.
--
Jeff Lawhorn
lawhorn@opti.uucp
opti!lawhorn@berick.uucp
ucsd!sdsu!berick!opti!lawhorn

dougcc@csv.viccol.edu.au (Douglas Miller) (02/23/90)

In article <10811@june.cs.washington.edu>, machaffi@fred.cs.washington.edu (Scott MacHaffie) writes:
> In article <8103@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>>   introduced.  However, we cannot disregard the fact that Ada was
>>   specifically designed to provide maximal support for the software 
>>   engineering process.  C, on the other hand, was designed to provide
> 
> No, ADA was designed to have everything.  The fact that it (or some subset
> of it) can be used to do software engineering doesn't mean it was designed
> to do it.

Valid but utterly vacuous point, as ADA *was* designed to provide maximal
support for software engineering.  I suppose it's possible that another
(hidden?) design goal was to "have everything".  So what?

>    Software engineering can be done in any language, including C.

Irrelevant --- the claim here is that ADA provides *maximal* *support* for
the software engineering process.  Like, if I said "Air travel is the
fastest way to get to another city" and you said "You don't have to go by
`plane.  You could go by car, or even on foot", then I'd look at you with a
slightly glazed expression, right?   Sorry to labor this, but I've seen the
above point made *too* many times.

dougcc@csv.viccol.edu.au (Douglas Miller) (02/23/90)

In article <8103@hubcap.clemson.edu>, billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe, 2847 ) writes:
>    Ada, on the other hand, is consistent: in both the if and case statements,
>    the default is to exit the construct once the code associated with the
>    specified situation has been executed.  Ada also provides the exit
>    statement, a restricted GOTO which permits a loop to be exited early,
>    but this construct is not used (as is C's break) on a routine basis.  

Huh?  Surely EXIT is the only and sensible way to exit a simple loop?  Why
do you call EXIT a `restricted GOTO'?  This makes as much sense to me as
describing a statement as performing a `restricted GOTO' from the
previous statement.

machaffi@fred.cs.washington.edu (Scott MacHaffie) (02/23/90)

In article <8126@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
%From machaffi@fred.cs.washington.edu (Scott MacHaffie):
%% No, it is necessary to use a statement to indicate that the current case
%% statement is finished...like an "end case" or the next "when => " in ADA.
%
%   Such a statement already exists: either the next "case Value:", or 
%   the } which ends the switch.  Why is it necessary to use a "break"?

example:
	switch (x) { /* x is a character, for example */
		case '0': case '1': ... case '9':
			print_digit(x);
			break;
		case 'a': ... case 'z':
			print_lowercase(x);
			break;
	}
The semantics of a C switch/case statement are different from the semantics
of an Ada case/when.

%   The problem cannot simply be defined out of existence by saying,
%   in essence, that good programmers don't make mistakes.  All human
%   programmers make mistakes, and a well-designed language will help
%   to minimize this particular tendency.  In this case, C does not. 

Good programmers understand the language they are using -- good programmers
are literate. No language can eliminate errors. Good software engineering
practices should be used to (try to) catch language-specific errors.

%%%   This is certainly true; Brooks and others have noted that the good
%%%   software engineering practices which are routinely introduced in
%%%   conjunction with the Ada language are responsible for more of the 
%%%   resulting improvements than the fact that the Ada language was
%% 
%% Well, these practices are certainly NOT being introduced in the 
%% universities (at least not here).
%
%   In that case, I strongly suggest that you immediately bring 
%   this fact to the attention of the software engineering faculty 
%   at washington.edu.  At other universities (e.g., Clemson), Ada
%   *is* introduced in conjunction with software engineering.

Software engineering faculty?  I wish. None of the undergraduate classes
here touch software engineering, and I think at most one of the graduate
classes does. Anyone who wants to be a software engineer here has to
pick it up from other sources.

			Scott MacHaffie

hammondr@sunroof.crd.ge.com (Richard A Hammond) (02/24/90)

In article <8126@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
 .... [ in discussion of the AT&T bug]
>   The problem cannot simply be defined out of existence by saying,
>   in essence, that good programmers don't make mistakes.  All human
>   programmers make mistakes, and a well-designed language will help
>   to minimize this particular tendency.  In this case, C does not. 
       ^^^^^^^^

Uh Oh! I agree with Bill.  The problem here is that the original request was
for cases where a different language would have PREVENTED the error, that is,
reduced the probability to 0.0.

Another example was a FORTRAN program fragment:

	DO 10 I = 1.100
which the compiler treated as:	DO10I = 1.100
while the programmer wanted:	DO 10 I = 1, 100

Using another language that was sensitive to spaces between characters would
have PREVENTED the problem.

Now the AT&T problem in C & Ada was:

	C Version			Ada Version
  do {				OUTER:	loop 
	...				    ...
	switch ...			    case ... is
	    {
	    case ... :				when ... =>
		if ( ...) {			    if ... then
		    ...					...
		    break;				exit; -- exits OUTER!!
		}				    end if;
		...				    ...
		break;				when ... =>
	    } /* end case */		    end case;
	...				    ...
  } while ( ...) ;			    exit OUTER when ... ;
					end loop OUTER;

I agree with Bill that Ada would minimize the chance of getting an "exit"
in there, since it doesn't need one at the end of each case.
However, it wouldn't PREVENT it, since the Ada and C fragments are both
not only legal, but sensible.
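Rendering the C column above as a compilable fragment makes the point concrete. This is my own minimal reconstruction of the *pattern*, not AT&T's actual code; the function name, loop bound, and counters are invented for illustration:

```c
/* Hypothetical reconstruction of the pattern above (NOT AT&T's
 * code): the "break" inside the "if" leaves the enclosing
 * "switch", NOT the "if", so the do-while keeps iterating. */
int count_iterations(void)
{
    int i = 0, n = 0;
    do {
        switch (i) {
        case 0:
            if (i == 0) {
                break;   /* intended for the "if"; actually exits the switch */
            }
            /* work here is silently skipped on the first pass */
            break;
        default:
            break;
        }
        n++;             /* still reached every time: "break" never left the loop */
        i++;
    } while (i < 3);
    return n;            /* all three iterations ran */
}
```

count_iterations() returns 3: the stray "break" skipped the rest of its case but never terminated the loop, whereas in the Ada column the analogous "exit" would have left loop OUTER entirely.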

So I claim that the AT&T problem does not fall in the same class as the
FORTRAN do loop problem, switching to Ada would have made the mistake less
likely but not removed the possibility of it entirely.  Particularly if the
programmer really had the misconception that exit/break left if statements.

Rich Hammond

mph@lion.inmos.co.uk (Mike Harrison) (02/24/90)

In article <5017@csv.viccol.edu.au> dougcc@csv.viccol.edu.au (Douglas Miller) writes:
>Valid but utterly vacuous point, as ADA *was* designed to provide maximal
>support for software engineering.  I suppose its possible that another
>(hidden?) design goal was to "have everything".  So what?
>
Wrong ! - Ada was designed primarily to save DoD money, secondarily to support
very long in-service life software (by simplifying the maintenance process), 
with all kinds of other goodies as a tertiary aim.

Way back in (about) 1975 HOLWG showed that DoD was spending huge sums on s/w 
in embedded systems, which were programmed in >300 languages (including ~70
different, often incompatible versions of JOVIAL).
The idea of Ada (formerly Ironman, etc.) was to provide a *single* language in
which almost all embedded operational military s/w would be written; then it
would only be necessary to keep one kind of programmer - an Ada programmer.

Those of us working on Ada and its environments (things which led to Stoneman)
in those days were serious about this people portability (which was the prime
motivation for the NO SUBSETS, NO SUPERSETS policy).

Whether Ada, its implementers, DoD or anyone else succeeded in these aims is a
matter of personal taste, but I believe that the aims were good and the spirit
which motivated most of the early workers was laudable.

[Whatever became of paths and boxes?]

Mike,




Michael P. Harrison - Software Group - Inmos Ltd. UK.
-----------------------------------------------------------
UK : mph@inmos.co.uk             with STANDARD_DISCLAIMERS;
US : mph@inmos.com               use  STANDARD_DISCLAIMERS;

oplinger@proton.crd.ge.com (B. S. Oplinger) (02/24/90)

Consider the following (slightly modified from the example by Scott MacHaffie):

example:
        switch (x) { /* x is a character, for example */
                case '0': 
                        number_of_groups ++;
                case '1': ... case '9':
                        print_digit(x);
                        break;
                case 'a': ... case 'z':
                        print_lowercase(x);
                        break;
        }


In my example, 0 is just like any other number except that it signals
the start of a new group, hence the increment on the number-of-groups
counter.  This is just the problem with the syntax of the C case
(switch) statement. It is designed so that a case may do some work and
then 'share' the work of the following case statement. I will submit
that this general attitude is found throughout C. Ada, although not a
perfect language, is almost always presented with some discussion of
software engineering. I feel that this helps promote good work habits.

On LINT, I would never program without it (in C). There is no excuse,
with today's languages running on today's machines (hell, even the PC),
for LINT checking not to be done during a compile. I believe it should
also be an option that must be turned off, not on.

various random thoughts from:

brian
oplinger@crd.ge.com

<#include standard.disclaimer>

sommar@enea.se (Erland Sommarskog) (02/25/90)

Scott MacHaffie (machaffi@fred.cs.washington.edu.cs.washington.edu) writes:
)Bill Wolfe (billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu) writes:
))   the NEXT case.  In other words, C requires the programmer to use a
))   dangerous construct on a routine basis.
)
)(the dangerous construct is the "break" statement)
)But if "break" were renamed "end case" then there wouldn't be any problem?
)
))   code associated with the else part.  Thus, we have an inconsistency
))   in C's design: with one flow-of-control construct (the switch), it is
))   necessary to use a dangerous GOTO to achieve normal processing, whereas
)
)No, it is necessary to use a statement to indicate that the current case
)statement is finished...like an "end case" or the next "when =) " in ADA.

I don't speak C, so I might have misunderstood something, but I'm
under the impression that you may omit the "break" statement,
achieving the effect that you execute the code for the next case too.
Sometimes possibly a nifty feature, but it seems to me that it is a good
source of errors. Whether it's called "break" or "end case" has no
importance: you may inadvertently forget it in either case.
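
As a minimal sketch of the trap (function and variable names are
illustrative, not from any real switching system): a forgotten "break"
silently merges one case into the next.

```c
/* count() is meant to tally vowels and other characters separately,
   but the forgotten "break" after the vowel case falls through, so
   every vowel is also counted as "other". */
void count(char c, int *vowels, int *others)
{
    switch (c) {
    case 'a': case 'e': case 'i': case 'o': case 'u':
        (*vowels)++;
        /* "break" forgotten here -- control falls through */
    default:
        (*others)++;
    }
}
```

The compiler accepts this without complaint; only the counts come out
wrong.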

)I don't use LINT. I use compilers that check certain things I want checked,
)like "no prototypes in scope".  LINT does not catch the kinds of mistakes
)that I make.  How many ADA programmers do you know of use LINT?

Ada programmers don't use lint, they don't have to.

From another article by Scott MacHaffie:
)Good programmers understand the language they are using -- good programmers
)are literate. No language can eliminate errors. Good software engineering
)practices should be used to (try to) catch language-specific errors.

And good languages should have as few unnecessary traps as possible
to help the software engineer spend his efforts on essentials.
-- 
Erland Sommarskog - ENEA Data, Stockholm - sommar@enea.se

sommar@enea.se (Erland Sommarskog) (02/25/90)

Dan L. Pierson (pierson@encore.com) writes:
>*In fact, some of these "nifty" features present more opportunity for
>misuse, and thus software engineering drawbacks, than benefits.
>Operator overloading comes to mind...

Of course you could do really ugly things with operator overloading,
particularly in combination with RENAME. But in a good software
engineering environment with code reviews you would never be allowed 
to do them.

Operator overloading is an important feature. No wonder that
both C++ and Eiffel have incorporated varieties thereof. (Eiffel
doesn't really have overloading, but rather infix routines.)

Why? Well, in my daily work, which is in Pascal, I have to struggle
with quadword integers. An expression like a := a + (b*c) DIV d + e
becomes when a, b and e are quads:

    a := Add_quads(a, Add_quads(
           Div_quads(Mult_quads(b, Conv_lw_to_qw(c)), Conv_lw_to_qw(d)), e))

which is not very readable. Even more important: if I have
written the standard integer version above and later discover that
I have to change to quads, I don't have to change any existing code.
(Of course the example above can be alleviated in various ways. The
routines used are the standard routines at the site I am on.)
-- 
Erland Sommarskog - ENEA Data, Stockholm - sommar@enea.se

kassover@jupiter.crd.ge.com (David Kassover) (02/26/90)

In article <5498@crdgw1.crd.ge.com> hammondr@sunroof.crd.ge.com (Richard A Hammond) writes:
...
>I agree with Bill that Ada would minimize the chance of getting an "exit"
>in there, since it doesn't need one at the end of each case.
>However, it wouldn't PREVENT it, since the Ada and C fragments are both
>not only legal, but sensible.


Way back in the mists of time, before FORTRAN77 and Ratfor were
inflicted on us, I was doing computer programming with a fortran
preprocessor variously called FLEXTRAN, FLEX, FLECS, etc.
 
One of the features of this preprocessor was that it provided 4
varieties of case statement, as well as an IF-THEN-ELSE
construct.  All four had optional catchall statements.
 
We had the SELECT and the CONDITIONAL.
 
SELECT was similar to Ada Case.  One selected an expression, and
specified cases for each.  CONDITIONAL was more like an
IF-THEN-ELSE, and was not strictly necessary.  That is, one
entered the conditional, and executed the code associated with
the first of a list of conditions that was true.
 
There were also the ENTER SELECT and the ENTER CONDITIONAL.
Exactly like SELECT and CONDITIONAL, except that once a case was
selected, control passed through to all subsequent cases,
including a catchall if present.  Similar to the C Switch.

I have no idea why someone went to the trouble of providing more
than a functionally complete set of control structures, but it
was very handy, *IF* the programmer didn't choose the wrong one
by mistake, or get confused as to which one he wanted.
 
Oh, well, nothing has changed...
 
By the way, if anyone could suggest a way to make a preprocessor
whose output would be legal FORTRAN *and* a correct program,
if the fortran language did not have an ASSIGNED GOTO construct,
I would appreciate it.  My mind doesn't work that way...

misu_ss@uhura.cc.rochester.edu (What`s in a name?) (02/27/90)

In article <806@enea.se> sommar@enea.se (Erland Sommarskog) writes:
>Scott MacHaffie (machaffi@fred.cs.washington.edu.cs.washington.edu) writes:
>)Bill Wolfe (billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu) writes:
>))   the NEXT case.  In other words, C requires the programmer to use a
>))   dangerous construct on a routine basis.

>))	[stuff deleted]

>))   code associated with the else part.  Thus, we have an inconsistency
>))   in C's design: with one flow-of-control construct (the switch), it is
>))   necessary to use a dangerous GOTO to achieve normal processing, whereas
>)
>)No, it is necessary to use a statement to indicate that the current case
>)statement is finished...like an "end case" or the next "when =) " in ADA.
>
>I don't speak C, so I might have missunderstood something, but I'm
>under the impression that you may exclude the "break" statement
>achieving the effect that you execute the code for the next case too.
>Sometimes possibly a nifty feature, but it seems to me that is a good
>source of errors. Whether it's called "break" or "end case" has no
>importance. You may inadvertantly forget it in both cases.
>

Yupper.  To quote the Holy Scriptures:

"Falling through cases is a mixed blessing.  On the positive side, it allows
several cases to be attached to a single action... But it also implies that
normally each case must end with a break to prevent falling through to the
next.  Falling through from one case to another is not robust, being prone
to disintegration when the program is modified.  With the exception of
multiple labels for a single computation, fall-throughs should be used
sparingly and commented.
	"As a matter of good form, put a break after the last case even
thought it is logically unnecessary.  Some day when another case gets added
at the end, this bit of defensive programming will save you." (K&R p.59)

So, yes I'll agree this is a slight flaw in the case statement, but I'm not
sure I buy all this garbage about "break" being a "dangerous construct."
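
The defensive trailing break the Scriptures prescribe looks like this,
as a minimal sketch (names illustrative):

```c
/* Map a character after a backslash to the character it denotes.
   The break after the default case is logically unnecessary today,
   but it protects the day someone appends another case below it. */
int escape_char(int c)
{
    int r;
    switch (c) {
    case 'n':
        r = '\n';
        break;
    case 't':
        r = '\t';
        break;
    default:
        r = c;
        break;          /* defensive: saves you later */
    }
    return r;
}
```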

There has been a war on GOTO constructs for some time, and while it certainly
began with reason (remember BASIC without labels???:-)), it has gone well past
the point of diminishing returns.  One really must go to great lengths to
program anything significant without some sort of flow control, which at the
machine level breaks down into conditional or unconditional jumping... (Yes,
a GOTO (EEEEEEKKKKK!!!!)).  

So, we want to be reasonable in our jumps, giving them certain well defined
conditions and having only certain types.  So we *do* indeed, for the most
part want to get rid of the monstrosity known as "GOTO label" (To say
nothing of the kind of line I could write in TRS-80 basic: 
    
     20 IF X OR 2 100 /* This jumps to line 100 if the 2 bit is set in X */
).

In fact I would argue that all GOTO like constructs should be conditional
and local.  And so they are (almost) in C.  For, while, switch, do... -- all
local, and all conditional except one:  break (Yeah, C has goto but everyone
knows never to use *that*).  But break is rather well defined in C.  It exits
from the innermost for, while, do or switch statement.  It goes to an easily
noticeable place near it in the code, unlike goto (let's go label hunting!).
Now I could argue that making the break statement conditional i.e. :

break(expression);  performs break if expression is true, nothing if false.

would be a good idea.  For one thing certain loop breaking would be easier
to write.  Admittedly the common uses in switch statements would require
three extra characters...  I don't think it's a big enough deal to be worth
changing.  Break simply isn't that dangerous.  I'm not so sure GOTO is all
that dangerous if treated with the proper respect (But then I've done
assembly programming...:-)). 
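
As a minimal sketch (illustrative names), the conditional break is
already spellable in today's C, at the cost of those few extra
characters:

```c
/* Return the index of the first element equal to key, or -1.
   "if (expr) break;" is the conditional break spelled by hand. */
int find_first(const int *a, int n, int key)
{
    int i, found = -1;
    for (i = 0; i < n; i++) {
        if (a[i] == key) {      /* conditional break, spelled out */
            found = i;
            break;
        }
    }
    return found;
}
```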

Arguments for the break statement:  Do you ever want to break out of a
while, for or do construct?  ("Oh, no!" I hear some say, "Properly written
programs *never* need do that."  My answer:  Yes, you are right and BTW,
every reasonable language is Turing Equivalent -- go program on a turing
machine.) The only option without some kind of break statement, is to use
flag variables and a couple of extra conditionals (Quoth my algorithms
professor, "Flag variables are as ugly as gotos, but they can't be helped in
Standard Pascal...").  
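
The flag-variable alternative looks like this, a minimal sketch with
illustrative names: the same search as a break would give, but with an
extra variable and an extra conditional in the loop test.

```c
/* Search without break, Standard-Pascal style: a flag variable and
   a compound loop condition replace the early exit. */
int find_flag(const int *a, int n, int key)
{
    int i = 0, found = 0;
    while (i < n && !found) {   /* extra conditional every iteration */
        if (a[i] == key)
            found = 1;          /* the flag stands in for "break" */
        else
            i++;
    }
    return found ? i : -1;
}
```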

The thing is, lots of things in programming languages are pretty dangerous,
but these things can be very useful.  No one in their right mind can claim that
pointer manipulation is any less dangerous than a flat out goto label
statement (IMNSHO, it's tremendously more dangerous), but would any
reasonable programmer trade it for the world?  There is simply too much that
becomes easy and elegant when you use pointers into real memory instead of
having to create your own integer index pointers into arrays simulating
memory in order to implement, say, a tree.  We did this in my algorithms
class, in order to learn about how pointers are implemented.  It's not that
hard, just painful and complicated.  I'll take pointers any day of the
week.  

I'm saying that we need not be quite so anal-retentive about the constructs
we allow in programming languages.  Admittedly, C is not for the faint of
heart, and I'll be surprised if it catches on in the big business world.
There is too much need for programming and not enough programmers who can
handle a C environment.  That's ok by me -- I'm really not interested in
that sort of dp anyway (although the poor saps ought to get something better
than C*BOL to use).  But the one thing that makes C more dangerous, more hairy,
more able to strike fear into the heart of the novice programmer than even
the demonic teenage Mike S. with his TRS-80 model III spaghetti basic
interpreter using Machine Language subroutines left and right, is C's use of
pointers and operators.  Not the silly break statement or some small
weakness in the switch.  

C pointers and operators, while hairy, are extremely elegant.  This is why
so many people like to program in C.  But admit it, they are not trivial to
understand.  So my main point is this:  

Anyone who has any business programming in C, should be able to handle a
switch statement where cases fall through.  

And it's true, every C programmer gets steeped in the heritage of the
switch.  Why the third commandment of C states quite clearly:

"Thou shalt put a break statement at the end of each case in a switch, even
unto the last in which it is not logically necessary."

So what's the problem?  If you feel so strongly about the switch statement,
then you are obviously of a different school than C'ers.  

There are two main schools of looking at language implementation.  One,
typified by Standard Pascal, says that the programmer probably doesn't know
what s/he is doing and that a language should not let him/her do anything
even slightly out of the ordinary.  The second, typified by C, says that the
programmer knows or should know what s/he is doing and so "I'll just do what
s/he tells me as long as I understand what s/he is saying."  If you are a
type I programmer, using C must feel like walking a tightrope between the
towers of the world trade center.  If you are a type II programmer, using SP
is like trying to answer the four-year-old who has learned how to ask, "Why?"

The best languages are obviously not going to go whole hog in either
direction, but in general you should use what you feel comfortable with.  I
like C, You like Ada, so use what you like.  I think K&R made a reasonable
choice with break.  I like being able to fall through a switch, even if I
have to take a little care with it.  Do you think I type *anything* in C
without taking a certain amount of care?  

							--mike


-- 
Michael Sullivan,                    Society for the Incurably Pompous
		     		-*-*-*-*- 
"In the usual way":  That's a phrase that mathematicians use to let you know
they're smarter than you are.				--Norman Stein

karl@haddock.ima.isc.com (Karl Heuer) (02/27/90)

I'm redirecting followups to comp.lang.misc.

In article <5479@ur-cc.UUCP> Michael Sullivan writes:
>[A lot of strawman arguments]

I don't think anybody is seriously arguing the position that you seem to be
attacking.  I also don't know how closely my position matches that of the
other participants in this debate, but I'll state mine as a series of claims
for you to agree with or rebut as you like.

It's meaningless to say "the `break' keyword is {good,bad}"; you have to have
an alternative to compare against.  I will use the notation `C with feature X'
to mean `a hypothetical language which is a lot like C but which has feature
X'; then we can make statements like `C with X would be a better tool than C'.

I am not arguing that C should be changed in this regard (except by adding
warning messages where appropriate).  Thus, the existence of a large body of
code written in C-as-it-is-today is not a factor in such comparisons.

We need to distinguish between keywords and the operations they denote.  It
turns out to be useful to distinguish two operations, which I will call
`break-switch' and `break-loop', both of which are denoted by the keyword
`break' (depending on context).  I will use the term `fall-through' to denote
the operation of flowing from the end of one case into the beginning of the
next one; this operation is denoted in C by the absence of a `break' keyword
in reachable flow at the end of a case block.

Claim 0.  Much use of fall-through in C is simply attaching several cases to a
single action.  (In fact, this is the `positive side' you quoted from the Holy
Scriptures.)  Some other languages achieve this by allowing multiple values to
be associated with a single case label, instead.  Thus, C with multiple valued
cases would not need fall-through nearly as often.
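
A minimal sketch of that benign use in today's C (names illustrative):
several labels attached to one action, with fall-through doing nothing
except simulate a multiple-valued case.

```c
/* Multiple case labels, one action: the "positive side" of
   fall-through that multiple-valued cases would subsume. */
int is_hex_digit(int c)
{
    switch (c) {
    case '0': case '1': case '2': case '3': case '4':
    case '5': case '6': case '7': case '8': case '9':
    case 'a': case 'b': case 'c': case 'd': case 'e': case 'f':
        return 1;
    default:
        return 0;
    }
}
```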

Claim 1.  A break-switch is rarely needed at any point other than at the end
of a case block.  At the end of a case block, break-switch is needed much more
often than fall-through.

Claim 2.  As a general principle, if there is a default action it should be
the most common of the alternatives.  This tends to minimize certain kinds of
user errors.

Definitions.  Let `CX' be the language C with multiple valued cases, with
automatic break-switch at the end of each case block, and with the `break'
keyword denoting only the break-loop operation.  `CX with jumpcase' is CX
with an explicit keyword to denote the fall-through operations (overriding the
default behavior of an automatic break-switch).

Claim 3.  CX with jumpcase would be better than C.

Claim 4.  CX itself would be better than CX with jumpcase (and hence better
than C); the extra keyword doesn't buy you enough to be worth adding, and it
would destroy the commutativity of case blocks, which is a useful property of
CX.

Claim 5.  Because C, not CX, is what we actually have, and because of what I
said in Claim 1, a useful feature of C compilers and/or checkers (lint) is the
ability to produce a warning if (a) a `break' keyword denoting a break-switch
operation appears anywhere other than at the end of a case block, or (b) a
(reachable) fall-through operation occurs at the end of a case block.

Claim 6.  In the case of lint, at least, such warnings should be enabled by
default.  There should be lintpragmas (e.g. /*SWITCH*/, /*FALLTHROUGH*/) that
can selectively disable them.
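
A minimal sketch of the convention (illustrative names): the
/*FALLTHROUGH*/ comment marks the one deliberate fall-through, so a
checker can warn about all the accidental ones while staying quiet
here.

```c
/* 0 starts a new group but is otherwise classified like 1; the
   lintpragma documents that the fall-through is intentional. */
int classify(int x, int *groups)
{
    switch (x) {
    case 0:
        (*groups)++;
        /*FALLTHROUGH*/
    case 1:
        return 1;       /* 0 and 1 share this classification */
    default:
        return 0;
    }
}
```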

Karl W. Z. Heuer (karl@ima.ima.isc.com or harvard!ima!karl), The Walking Lint

dougcc@csv.viccol.edu.au (Douglas Miller) (02/27/90)

In article <4174@ganymede.inmos.co.uk>, mph@lion.inmos.co.uk (Mike Harrison) writes:
> In article <5017@csv.viccol.edu.au> dougcc@csv.viccol.edu.au (Douglas Miller) writes:
>>Valid but utterly vacuous point, as ADA *was* designed to provide maximal
>>support for software engineering.  I suppose its possible that another
>>(hidden?) design goal was to "have everything".  So what?
>>
> Wrong ! - Ada was designed primarily to save DoD money, secondarily to support
> very long in-service life software (by simplifying the maintenance process), 
> with all kinds of other goodies as a tertiary aim.

I know, these were the *requirements*.  ADA was *designed* to meet these
requirements by providing maximal support for software engineering.

Alternatively I could argue that the requirements you mention basically
define what software engineering is, and you are just nitpicking.

sommar@enea.se (Erland Sommarskog) (03/02/90)

Michael Sullivan (misu_ss@uhura.cc.rochester.edu) writes:
>"Thou shalt put a break statement at the end of each case in a switch, even
>unto the last in which it is not logically necessary."
>
>So what's the problem?

The problem?  Such rules should be enforced by the compiler, not the
programmer.  The more rules you put into a language which the programmer
is to obey without the compiler verifying them, the more likely it is
that casual errors slip in.  And the more the compiler checks for me,
the more I can concentrate on the essentials: trying to solve the
real-world problem at hand.
-- 
Erland Sommarskog - ENEA Data, Stockholm - sommar@enea.se

karl@haddock.ima.isc.com (Karl Heuer) (03/03/90)

In article <1004@micropen> dave@micropen (David F. Carlson) writes:
>What break does is *very* well defined and is no more prone to
>misinterpretation than any other non-linear control flow statement ...

Yes, it's well defined, but what it's defined to do is bad.

For a formal treatment of the above statement, I refer you to my article
<16039@haddock.ima.isc.com>, posted to comp.lang.misc (also .c and .ada) with
this same title.  I haven't seen any rebuttals yet.

>A multi-case switch is very handy in many situations ...

Yeah.  I wish C had this feature, instead of simulating it with fallthrough.

>That you ask the question of the usefulness of break-per-case/multiple-cases
>implies that you haven't sufficient experience with the construct to judge
>its merits/weaknesses.

I don't know about the person you were addressing, but I think I've had
sufficient experience with it.  I certainly question its usefulness in
comparison to something reasonable, like the language I described in my other
article.

In fact, even if you insist that the comparison must be between C and
plain-C-without-break-switch, I think I'd still go for the latter.  I believe
the benefit of not requiring an overloaded keyword to do a break-switch
exceeds the cost of having to use a goto to merge related cases.
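
As a minimal sketch of that trade-off, written in today's C with
illustrative names: the related cases merge through an explicit goto,
visible at the jump site, instead of through a silent fall-through.

```c
/* Case 0 does its extra work, then merges explicitly with case 1.
   In real C one could just fall through; the goto shows what the
   merge would look like if fall-through were not the default. */
int handle(int x, int *groups)
{
    switch (x) {
    case 0:
        (*groups)++;
        goto shared;        /* explicit merge, not fall-through */
    case 1:
    shared:
        return 1;           /* 0 and 1 share this action */
    default:
        return 0;
    }
}
```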

Karl W. Z. Heuer (karl@ima.ima.isc.com or harvard!ima!karl), The Walking Lint