[comp.lang.ada] vs Ada - Don't forget the prime directive!

afes0isi@ZACH.FIT.EDU (Sam Harbaugh-AFES PROJECT) (06/16/91)

As I read the arguments of Ada vs <insert name of favorite language>
I recall the prime directive from starfleet command, "Achieve
lower life cycle costs". Since the DoD reports spending 80% of
life cycle costs on maintenance, the software support system
requirements should be dominated by maintainability, not
development.  The software support system should thus be composed
of tools, languages and methods more suitable to maintenance
than development, when a choice must be made.  Thus having one
standard validated language, no dialects allowed, leads to
a larger qualified maintenance workforce and stability for
maintenance tool development.
Here are some beliefs I hold regarding Ada:
1. Ada is more readable than any other computer language I have seen.
I can better understand what the programmer told the computer
to do and what the designer intended to have the computer do.
2. The cost of computing is cut in half every three years. Thus,
while I lament that Ada "costs more initially", that initial cost
matters less and less, but only if I have chosen a software-first
design approach and haven't purchased my computer hardware before
beginning my software design.  This ever-decreasing cost of hardware
can yield further benefit if I design and build software which can be
reused and ported to ever more powerful and cheaper hardware as the
system is deployed.  I believe that Ada is the best language I have
ever seen for software-first design and for designing for portability
and reusability.
3. The computer language must be selected with respect to
system requirements, not language features, and Ada was designed to
meet the DoD system requirements; therefore, it is no surprise to
me that Ada best fulfills the DoD's needs.
--
There, now I feel better!
sam harbaugh saharbaugh@ROO.FIT.EDU        
-----------

erwin@trwacs.UUCP (Harry Erwin) (06/17/91)

Ada is clearly more maintainable. It is also much easier
to prove Ada code correct (because it avoids programming
constructs that result in non-commutative parallel code).
Barry Boehm has noted, however, that there are major types
of software that Ada is very poor at implementing (simulations,
parallel processing in open environments, test generation
code, anything involving pointers to functions, anything
that approaches the full generality of a Turing Machine--
although you can still implement a Turing Machine in Ada--
most AI applications). Al Perlis took much the same position,
although he was markedly less enamored with Ada.

As a performance engineer, I find Ada unattractive. If the
conversion ratio for C is 5 MLI per source statement, then
Fortran generally comes in at 6-7 MLI and Ada at 10-12.
And these source statements are no more powerful than
C statements; there's just a lot more bounds checking,
and other overhead, going on. As long as applications
press the capabilities of the host hardware, alternatives
to Ada will be needed (and not just for the applications
listed above where Ada is clumsy at best). Another factor
is that it's very easy to write inefficient code in Ada.
The machine architecture is deliberately hidden from the
coder, and the natural implementations are usually the
most expensive. These factors come together in code that
will never be as efficient as the corresponding C code
and that requires 10-20 times as many CPU cycles until
tuned during integration and test. It's hard to overcome
those handicaps.

Harry Erwin   erwin@trwacs.fp.trw.com
--The usual disclaimers. 

-- 
Harry Erwin
Internet: erwin@trwacs.fp.trw.com

jls@netcom.COM (Jim Showalter) (06/18/91)

>As a performance engineer, I find Ada unattractive. If the
>conversion ratio for C is 5 MLI per source statement, then
>Fortran generally comes in at 6-7 MLI and Ada at 10-12.
>And these source statements are no more powerful than
>C statements; there's just a lot more bounds checking,
>and other overhead, going on.

Uh, bounds checking is not generally regarded as "overhead":
many people prefer to think of it as the sort of thing that
helps prevent your expensive satellite from spiraling slowly
into the sun. Claiming that C is faster than Ada because C
doesn't deign to check whether or not it is attempting to
execute data (having walked a pointer off into hyperspace
silently) is not a very compelling argument for its use. Besides,
I can quite easily eliminate such "overhead" from Ada if I
am determined to make it execute as dangerously as C--it's
just that in Ada I'm at least given the CHOICE.

As for the claim that the rest of the source statements are "no
more powerful than C statements", I beg to differ. Can you show
me the C statement equivalent of a task rendezvous, for example?

>The machine architecture is deliberately hidden from the
>coder,

Thereby aiding portability, reuse, and maintenance--among the
stated goals of Ada.

>These factors come together in code that
>will never be as efficient as the corresponding C code
>and that requires 10-20 times as many CPU cycles until
>tuned during integration and test. It's hard to overcome
>those handicaps.

Other than by, as you said one sentence earlier, tuning during
integration and test? Incidentally, I question your 10-20 X
figures. See other postings in this thread concerning the
relative speed of, for example, DEC Ada vs DEC C. The Ada
compilers keep speeding up, and in many cases I'm aware of
equal or exceed the speed of comparable C compilers. If your
data is more than a few years old, it is obsolete. If it
is current, I'd like more detail.
-- 
*** LIMITLESS SOFTWARE, Inc: Jim Showalter, jls@netcom.com, (408) 243-0630 ****
*Proven solutions to software problems. Consulting and training on all aspects*
*of software development. Management/process/methodology. Architecture/design/*
*reuse. Quality/productivity. Risk reduction. EFFECTIVE OO usage. Ada/C++.    *

erwin@trwacs.UUCP (Harry Erwin) (06/19/91)

jls@netcom.COM (Jim Showalter) writes:

>>As a performance engineer, I find Ada unattractive. If the
>>conversion ratio for C is 5 MLI per source statement, then
>>Fortran generally comes in at 6-7 MLI and Ada at 10-12.
>>And these source statements are no more powerful than
>>C statements; there's just a lot more bounds checking,
>>and other overhead, going on.

>Uh, bounds checking is not generally regarded as "overhead":
>many people prefer to think of it as the sort of thing that
>helps prevent your expensive satellite from spiraling slowly
>into the sun. Claiming that C is faster than Ada because C
>doesn't deign to check whether or not it is attempting to
>execute data (having walked a pointer off into hyperspace
>silently) is not a very compelling argument for its use. Besides,
>I can quite easily eliminate such "overhead" from Ada if I
>am determined to make it execute as dangerously as C--it's
>just that in Ada I'm at least given the CHOICE.

Actually, the 10-12 MLI/statement is with error checking turned 
off. With it turned on, we're in the 15-20 range. The data
are recent and for an efficient compiler.

>As for the claim that the rest of the source statements are "no
>more powerful than C statements", I beg to differ. Can you show
>me the C statement equivalent of a task rendezvous, for example?

The Ada task rendezvous is notorious in the performance engineering
community as a feature to be avoided in real-time and near-real-time
applications. I have written a set of C++ classes that give me
a multi-tasking environment when I need it, and they're a lot
more efficient than Ada. I don't usually use them, preferring
instead to use the operating system tasking mechanism, but I
will use them before I use Ada tasking.

>>The machine architecture is deliberately hidden from the
>>coder,

>Thereby aiding portability, reuse, and maintenance--among the
>stated goals of Ada.

My experience is that hiding the machine architecture from the
coder usually decreases the performance of the code by a
factor of at least 5.

>>These factors come together in code that
>>will never be as efficient as the corresponding C code
>>and that requires 10-20 times as many CPU cycles until
>>tuned during integration and test. It's hard to overcome
>>those handicaps.

>Other than by, as you said one sentence earlier, tuning during
>integration and test? Incidentally, I question your 10-20 X
>figures. See other postings in this thread concerning the
>relative speed of, for example, DEC Ada vs DEC C. The Ada
>compilers keep speeding up, and in many cases I'm aware of
>equal or exceed the speed of comparable C compilers. If your
>data is more than a few years old, it is obsolete. If it
>is current, I'd like more detail.

The data are weeks to months old and generally proprietary.
My experience over the last 12 years is that Ada, Pascal,
and similar languages (CMS2, Modula-2, etc.) generate code
that always has to be tuned in integration and test to
overcome implementation inefficiencies. The performance
improvement during tuning is almost always at least 10-to-1,
and frequently reflects subtle characteristics of the
hardware architecture, which those languages are intended
to hide from the programmer. Tuning is less frequently
needed for C code, and the inefficiencies to be overcome
are significantly less. Hence, by using Ada, you're
reducing your programming and maintenance costs (which
are also programming costs) but decreasing your performance
and increasing your integration and test costs (which
is expensive anyway). You pays your money and takes your
choice...

-- 
Harry Erwin
Internet: erwin@trwacs.fp.trw.com

mfeldman@seas.gwu.edu (Michael Feldman) (06/19/91)

In article <313@trwacs.UUCP> erwin@trwacs.UUCP (Harry Erwin) writes:
>
>The data are weeks to months old and generally proprietary.
Don't you get bored reading about proprietary data which therefore can't
be independently corroborated? Why bother to discuss it on the net?

Who gains and who loses by not making all this performance data really
public?

Mike

blakemor@software.org (Alex Blakemore) (06/19/91)

In article <311@trwacs.UUCP> erwin@trwacs.UUCP (Harry Erwin) writes:
> Barry Boehm has noted, however, that there are major types
> of software that Ada is very poor at implementing (simulations,
> parallel processing in open environments, test generation
> code, anything involving pointers to functions, anything
> that approaches the full generality of a Turing Machine--
> although you can still implement a Turing Machine in Ada--
> most AI applications). 

  I know of several people who write simulators in Ada for NASA and
have done so for years.  From what I've seen of their code, Ada
has been a natural fit.  Why do you think Ada is "very poor" in this
area?  That Barry Boehm or anyone else says so is not a very convincing
argument.  Can you at least post a reference to this seminal work?

  My last company developed two successful AI applications in Ada.
They even made money (natural language translation within a limited domain).
I agree Ada doesn't support all the run-time flexibility typically used
by AI hackers.  Ada is not a natural tool for experimenting with AI concepts
and algorithms, but once you've chosen a technique, data structure, or
algorithm, you can most likely express it in Ada.  The president of a
company I used to work for would assert that even AI programs consist of
80-90% ordinary software (managing files, communication, DBMS, printing,
whatever) and that the novel "AI" algorithm is only part of a complete
application.  If this is true, the support for software engineering in Ada
could really help in the development of that part of the application.

  Why is it more difficult to write test generators in Ada than
any other language?

  Of course you can simulate a Turing machine in Ada, and of course
the language was designed to discourage arbitrary use of gotos, etc.
That's good.

  By the way (nit-picking at semantics), it's not just that Ada is
"very poor at implementing" certain systems; Ada CAN'T implement any
type of software.  It's only a tool.
-- 
---------------------------------------------------------------------
Alex Blakemore           blakemore@software.org        (703) 742-7125
Software Productivity Consortium  2214 Rock Hill Rd, Herndon VA 22070

jls@netcom.COM (Jim Showalter) (06/20/91)

>Actually, the 10-12 MLI/statement is with error checking turned 
>off. With it turned on, we're in the 15-20 range. The data
>are recent and for an efficient compiler.

I refer you to other posts in this thread comparing Ada and
C compilers for counterexamples. Sounds to me like your compiler
needs some work.

>The Ada task rendezvous is notorious in the performance engineering
>community as a feature to be avoided in real-time and near-real-time
>applications.

I guess it kind of depends on how you define "performance engineering
community". The people I've been working with over the last four years
certainly qualify as members of that community, and they have successfully
used Ada tasking on numerous applications with hard real time scheduling
deadlines (a telecommunications switch and a series of ships with a
distributed real time approach to managing ship operations come to mind).

If you're not familiar with the rate monotonic scheduling algorithm, you
might do well to go read up on it.

>I have written a set of C++ classes that give me
>a multi-tasking environment when I need it, and they're a lot
>more efficient than Ada.

Prove it. Provide numbers. Specify platforms, operating systems,
compilers, etc. Otherwise, this is just anecdotal.

>I don't usually use them, preferring
>instead to use the operating system tasking mechanism, but I
>will use them before I use Ada tasking.

Uh, in case you've forgotten, one of the reasons Ada has tasking
is for embedded systems where the overhead of an operating system
is simply unacceptable. Claiming that you've solved the problem
of real time performance by escaping to a real time operating
system is fine, provided you HAVE a real time operating system.
If, on the other hand, you've got a bare CPU, you basically
have two choices: 1) use a language other than Ada and cobble
together your own homebrew version of a real time kernel (which
as we all know is both a very efficient use of one's time and
highly portable) or 2) use Ada and rely on the real time kernel
provided by the Ada vendor to support Ada tasking.

>My experience is that hiding the machine architecture from the
>coder usually decreases the performance of the code by a
>factor of at least 5.

What sorts of things do you build? I would agree that writing
an abstract device driver is both not a good idea and probably
a contradiction in terms. On the other hand, writing a DBMS
that depends for most of its implementation on highly machine-
dependent calls is rather shortsighted if you have any intention
of ever running said DBMS on anything other than one platform.

Nothing is black-and-white; there are appropriate times to get
down to the base hardware to make something go really fast, but
there are lots of other times when getting down to the base hardware
is a really dumb idea. That's why Ada supports the range of extremes
(including inlining and machine code insertion).

>The data are weeks to months old and generally proprietary.

Yeah, isn't that always the way? "I could prove that Ada is
too slow to be used for real time, it's just that the data
I have is secret". And yet, I have NON-secret success stories
concerning Ada on hard real time projects. For example, call
up Rational and ask for their writeup on the Bofors Ship System
2000 project--they'll be happy to share this with anybody.

>Hence, by using Ada, you're
>reducing your programming and maintenance costs (which
>are also programming costs) but decreasing your performance
>and increasing your integration and test costs (which
>is expensive anyway).

Actually, integration and test is pretty darned simple with
Ada, since the formal specifications and such make things plug
together better than with most languages. There are even some studies
that prove (from real project data) that integration and test takes less
time on Ada projects than for comparable projects in other languages.
I think you meant to say "decreasing your performance and increasing your
tuning costs". Considering that maintenance represents the biggest
chunk of the pie by far, and considering that tuning is a relatively
straightforward "final pass", I think you've just made a 
real strong argument for using Ada instead of C as a means of
reducing total lifecycle software development costs, which has
been my argument all along.
-- 
*** LIMITLESS SOFTWARE, Inc: Jim Showalter, jls@netcom.com, (408) 243-0630 ****
*Proven solutions to software problems. Consulting and training on all aspects*
*of software development. Management/process/methodology. Architecture/design/*
*reuse. Quality/productivity. Risk reduction. EFFECTIVE OO usage. Ada/C++.    *

jls@netcom.COM (Jim Showalter) (06/20/91)

>In article <311@trwacs.UUCP> erwin@trwacs.UUCP (Harry Erwin) writes:
>> Barry Boehm has noted, however, that there are major types
>> of software that Ada is very poor at implementing
>> [others in list deleted] test generation code

Actually, while I was at Rational we wrote a tool that runs over the
DIANA tree for a unit and automatically generates a unit test for that
unit. Works great. Or doesn't this count for some reason?
-- 
*** LIMITLESS SOFTWARE, Inc: Jim Showalter, jls@netcom.com, (408) 243-0630 ****
*Proven solutions to software problems. Consulting and training on all aspects*
*of software development. Management/process/methodology. Architecture/design/*
*reuse. Quality/productivity. Risk reduction. EFFECTIVE OO usage. Ada/C++.    *

sampson@cod.NOSC.MIL (Charles H. Sampson) (06/20/91)

In article <313@trwacs.UUCP> erwin@trwacs.UUCP (Harry Erwin) writes:

>My experience is that hiding the machine architecture from the
>coder usually decreases the performance of the code by a
>factor of at least 5.

     Is this statement what was intended?  It seems to be saying that
if you write two versions of a program, version A in a language that hides
the machine architecture and version B in a language that makes it easy to
get at the architecture, then the execution time of version A will be five
times that of version B.  I find this extremely surprising.  Usually when
such comparisons are made, version B's language is assembly, and the
slowdown attributed to version A's language is stated as a percentage,
most of the time less than 100%.  Notice that the key adjective in the
statement is
_usually_, not _sometimes_.  Even for _sometimes_ I'd consider a factor of
five surprising, except for some very special small programs that fully
exploit a special hardware feature.

>The data are weeks to months old and generally proprietary.
>My experience over the last 12 years is that Ada, Pascal,
>and similar languages (CMS2, Modula-2, etc.) generate code
>that always has to be tuned in integration and test to
>overcome implementation inefficiencies. The performance
>improvement during tuning is almost always at least 10-to-1,
>and frequently reflects subtle characteristics of the
>hardware architecture, which those languages are intended
>to hide from the programmer. Tuning is less frequently
>needed for C code, and the inefficiencies to be overcome
>are significantly less. Hence, by using Ada, you're
>reducing your programming and maintenance costs (which
>are also programming costs) but decreasing your performance
>and increasing your integration and test costs (which
>is expensive anyway).  ...

     I think that there is a moral obligation to publish these results.
They are contrary to _every_ Ada project I know about.  At the beginning
we were surprised to find out that our test and integration costs and
time were reduced when using Ada.  We expected to pay a high development
cost, to be recouped during maintenance, the accepted belief of the day.
We were surprised to find out that our total development time using Ada
was about the same as before, with more time spent in design and coding,
less in testing and integration.  We now schedule projects based on this
experience.  (For _we_ in this paragraph, read _I and every experienced
Ada person I know_.)

     That 10-to-1 performance improvement during tuning is yet another
surprising claim.  I was quite pleased with myself a few years ago when
I approached 2-to-1 in tuning an old program that had never been tuned
before and that job involved inline insertion of assembly code in three
inner loops!  (The name of the language that would allow me to do that
is withheld to protect innocent ears.)

                             Charlie

khalfall@loria.crin.fr (Adel Khalfallah) (06/23/91)

To: jls@netcom.COM (Jim Showalter)
In-reply-to: jls@netcom.COM's message of 19 Jun 91 18:26:01 GMT
Subject: Re: vs Ada - Don't forget the prime directive!
--text follows this line--
The following is extracted from a paper published at the Ada-Europe
international conference, Athens, Greece, May 1991.  It appeared in LNCS
499.  The title of the paper is "Ada in Safety Critical Applications,"
by A. Welz.  It is the conclusion of a study carried out as part of the
development of the Inertial Measurement Unit, a flight-control subsystem
of the European Fighter Aircraft (EFA):

	The language Ada is no less safe than other languages. Because
	of its strong typing, the predefined exception mechanism and
	the standard tasking features, it has an even greater
	advantage compared to other languages. There are reasonable
	alternatives to restricting Ada to a 'Pascal subset' as
	required in the EFA Safe Ada Study. With some precise rules,
	Ada  fulfills all requirements of safety critical avionic
	applications. The adherence to these safety rules can be
	controlled with appropriate tools and methods.
 
--
-------------------------------------------------------------------------------
| Adel KHALFALLAH  |                       | Insert here your favourite motto |
| CRIN BP239       | e-mail:               |                                  |
| 54500 VANDOEUVRE | khalfall@loria.crin.fr|                                  |
| FRANCE           |                       |                                  |
-------------------------------------------------------------------------------