[comp.sw.components] Ada 9X objectives

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe, 2847 ) (10/03/89)

  In an earlier comp.lang.ada article I included a copy of a recent 
  article from Stephen Crawley in comp.sw.components.  I'd like to 
  comment here on some of the points raised. 

> Well how come ADA seems to be largely irrelevant outside of 
> the defense sector?  

  That depends strongly on your definition of "largely irrelevant";
  there is a large and growing number of non-defense projects and 
  companies using Ada.  The new generation of highly optimizing 
  Ada compilers deserves at least some of the credit for this 
  substantial and accelerating growth.

> ADA 83 was 5 - 10 years out of date even before it was finalised.  Unless 
> it gets a RADICAL overhaul, ADA 9x will be 10 - 15 years out of date.
> Doubtless, the reactionaries and religious zealots from the software 
> engineering industry will make sure that much of the important work done 
> by researchers over the last 15 years (like GC technology, functional
> programming, designing languages to make formal verification possible) 
> will be ignored ... just like they did for ADA 83.

  In fact, this is not correct.  Ada 83 explicitly provides for garbage
  collection as an optional compiler service.  The rule that functions
  must not modify their parameters was probably a direct result of
  functional programming ideas.  Finally, formal verification is a
  major goal of the software engineering community, and Ada was designed
  to support it to as great an extent as possible.  For example, the
  use of the termination model of exception handling was (at least in
  part) motivated by formal verification considerations. 
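
  Both rules are easy to show in a few lines of Ada.  Here is a minimal
  sketch (subprogram names invented for illustration): the first part
  shows that a function parameter is read-only, so mutation must go
  through a procedure; the second shows the termination model, in which
  a raise abandons the rest of the frame and control goes straight to
  the handler, never back to the point of the raise.

      with Text_IO; use Text_IO;

      procedure Rules_Demo is

         --  Legal: a function parameter is "in" by default (read-only),
         --  so a function cannot modify its argument.
         function Double (X : Integer) return Integer is
         begin
            return 2 * X;
         end Double;

         --  A function with an "in out" parameter is illegal in Ada 83;
         --  mutation of an argument must go through a procedure instead.
         procedure Double_In_Place (X : in out Integer) is
         begin
            X := 2 * X;
         end Double_In_Place;

         N : Integer := 21;

      begin
         N := Double (N);       --  N = 42
         Double_In_Place (N);   --  N = 84

         --  Termination model: the raise abandons the rest of this
         --  inner block; execution never resumes at the raise point.
         begin
            raise Constraint_Error;
         exception
            when Constraint_Error =>
               Put_Line ("handled; the inner block completes here");
         end;
      end Rules_Demo;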

> Production language design should be an on-going evolutionary process.
> The language design should be reviewed regularly to incorporate new proven
> ideas from research languages and the results of experience from the
> production language itself.  A new language version every 2 years sounds
> about right to me. 

  This is too frequent; five years might be reasonable, but not two.
  I don't think the compiler validation suites, etc., would be able to
  respond meaningfully to a revision cycle which was THAT frequent.
 
> What about all the software in old versions of the language?  Who does 
> the job of converting it I hear you ask?  It should be the task of the 
> people who build programming support environments to write conversion 
> tools to largely automate the task of converting code from one version 
> of the PL to the next one.

  The US Government is actively planning to maximize the use of
  automatic translation technology during the transition from Ada
  83 to Ada 9X.  

> Maybe these ideas are not workable right now ... production programming
> support environments aren't really up to the task yet.  But this is the 
> direction the Software Engineering industry should be aiming.  The process
> of change in computing is inevitable; we should be going with the flow
> not trying to hold back the tide.

  On this I agree.  But another good reason to revise no more often 
  than every five years is to give new ideas a chance to mature.  Once 
  a new idea has proven itself, and is reasonably agreed to be something 
  production languages should have, there should be a process by which 
  production languages incorporate it; that is what the Ada 9X scheduled 
  revision process should accomplish.


  Bill Wolfe, wtwolfe@hubcap.clemson.edu
 

scc@cl.cam.ac.uk (Stephen Crawley) (10/04/89)

I wrote:
>> Well how come ADA seems to be largely irrelevant outside of 
>> the defense sector?  

Bill Wolfe replies:
> That depends strongly on your definition of "largely irrelevant";
> there is a large and growing number of non-defense projects and 
> companies using Ada.  The new generation of highly optimizing 
> Ada compilers deserves at least some of the credit for this 
> substantial and accelerating growth.

OK, I'll clarify myself.  Undoubtedly there are companies moving to
Ada for non-defence work.  But there seem to be MORE companies 
moving to other languages such as C++ and (I hate to say it) C.
This is only my perception of what is going on.  Does anyone have 
any meaningful statistics on recent trends in programming language 
usage?

>> ADA 83 was 5 - 10 years out of date even before it was finalised.  Unless 
>> it gets a RADICAL overhaul, ADA 9x will be 10 - 15 years out of date.
>> Doubtless, the reactionaries and religious zealots from the software 
>> engineering industry will make sure that much of the important work done 
>> by researchers over the last 15 years (like GC technology, functional
>> programming, designing languages to make formal verification possible) 
>> will be ignored ... just like they did for ADA 83.

> In fact, this is not correct. 

ADA 83 most certainly was 5 - 10 years out of date!  And given that
ADA 9x will be going through the same long, drawn-out process as ADA 83,
I can't see why it should be any less out of date. 

> Ada 83 explicitly provides for garbage
> collection as an optional compiler service.

But they cocked it up.  Optional garbage collection is close to useless, 
since you can't depend on it being there ... unless you write code that
assumes a particular ADA compiler.  This particular lesson should have 
been learned from Algol-68.  Maybe some of the ADA design team knew this
... but the reactionaries won the day.
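
The practical upshot is that portable code cannot wait for a collector
that may not be there; it must reclaim storage by hand with the
language-defined generic UNCHECKED_DEALLOCATION, which every ADA 83
compiler must supply.  A minimal sketch (type names hypothetical):

    with Unchecked_Deallocation;

    procedure Manual_Reclaim is

       type Node;
       type Node_Ptr is access Node;
       type Node is record
          Value : Integer;
          Next  : Node_Ptr;
       end record;

       -- Instantiate the language-defined generic for this access type.
       procedure Free is new Unchecked_Deallocation (Node, Node_Ptr);

       P : Node_Ptr := new Node'(Value => 1, Next => null);

    begin
       Free (P);  -- P is set back to null.  Omit this and the storage
                  -- leaks on every compiler that does not collect.
    end Manual_Reclaim;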

> The rule that functions
> must not modify their parameters was probably a direct result of
> functional programming ideas.  

I doubt it very much.  It is more likely that it was a result of bad
experiences with FORTRAN and PASCAL "VAR" parameters.

> Finally, formal verification is a
> major goal of the software engineering community, and Ada was designed
> to support it to as great an extent as possible.  For example, the
> use of the termination model of exception handling was (at least in
> part) motivated by formal verification considerations. 

Excuse me while I laugh ...

Verifying ADA 83 is a fundamentally intractable problem for any number
of reasons.  I don't believe anyone has even managed to formally define
the semantics of ADA 83!  Maybe some members of the ADA design team did 
have verification in their minds ... but others didn't, and the others
won the day.

>> Production language design should be an on-going evolutionary process.
>> ... A new language version every 2 years sounds about right to me.

> This is too frequent; five years might be reasonable, but not two.
> I don't think the compiler validation suites, etc., would be able to
> respond meaningfully to a revision cycle which was THAT frequent.

There is no reason why the revisions should not be pipelined.  And
what is wrong with some people using pre-validated compilers?  After
all, that is how much of the rest of the computing industry works at
the moment.  It is often better to use a new, somewhat flakey compiler
now if it offers significant benefits.

> But another good reason to revise no more often 
> than every five years is to give new ideas a chance to mature.  Once 
> a new idea has proven itself, and is reasonably agreed to be something 
> production languages should have, there should be a process by which 
> production languages incorporate it; that is what the Ada 9X scheduled 
> revision process should accomplish.

The time from a new idea being introduced to its being mature is much 
less than 5 years.  Besides, new ideas are developed in parallel, not 
serially.  The problem is that too many people in industry are too 
busy meeting project deadlines to keep up with research.  The result 
is that it takes far too long for mature ideas to be perceived as such, 
and hence to come into general use.

You would do well to consider the ongoing development of the Eiffel 
language and environment.  Currently, there seems to be a minor 
revision cycle of ~6 months and a major cycle of ~2 years.  Nobody
seems to be complaining ...

-- Steve

ted@nmsu.edu (Ted Dunning) (10/04/89)

In article <935@scaup.cl.cam.ac.uk> scc@cl.cam.ac.uk (Stephen Crawley) writes:


   Bill Wolfe replies:

   > Finally, formal verification is a
   > major goal of the software engineering community, and Ada was designed
   > to support it to as great an extent as possible.  For example, the
   > use of the termination model of exception handling was (at least in
   > part) motivated by formal verification considerations. 

   Excuse me while I laugh ...

   Verifying ADA 83 is a fundamentally intractable problem for any number
   of reasons.  I don't believe anyone has even managed to formally define
   the semantics of ADA 83!  Maybe some members of the ADA design team did 
   have verification in their minds ... but others didn't, and the others
   won the day.


compare this situation to that of scheme where the formal semantics of
the language _have_ been defined, and they are concise enough to fit on
a couple of pages.  in fact, they are so simple and straightforward
that the formal semantics can be used in an undergraduate computer
science class.

even more interesting, these formal semantics were automatically
derived from a running scheme program which provides an executable
model of the semantics.  this allows much simpler testing of the
semantics than just writing down the equations and having people stare
at them.



--
ted@nmsu.edu
			remember, when extensions and subsets are outlawed,
			only outlaws will have extensions or subsets

peirce@claris.com (Michael Peirce) (10/06/89)

In article <935@scaup.cl.cam.ac.uk> scc@cl.cam.ac.uk (Stephen Crawley) writes:
>
>>> Production language design should be an on-going evolutionary process.
>>> ... A new language version every 2 years sounds about right to me.
>
>> This is too frequent; five years might be reasonable, but not two.
>> I don't think the compiler validation suites, etc., would be able to
>> respond meaningfully to a revision cycle which was THAT frequent.
>
>There is no reason why the revisions should not be pipelined.  And
>what is wrong with some people using pre-validated compilers?  After
>all, that is how much of the rest of the computing industry works at
>the moment.  It is often better to use a new, somewhat flakey compiler
>now if it offers significant benefits.
>

Are you serious???  It's better to use a somewhat flakey compiler???

Those of us trying to solve real world problems can't afford to use
a "slightly flakey compiler".  When shipping product to make money to
feed the kids, spending days tracking down weird bugs caused by flakey
compilers isn't the way to stay in business!

The ivory tower world makes some terrific contributions, but they can
keep their flakey compilers until such time as they aren't flakey any
more, thank you very much.  Personally, I usually skip any compiler that
hasn't reached at least release 2.0.  They're just not worth the trouble!


Claris Corp. | Michael R. Peirce
-------------+--------------------------------------
             | 5201 Patrick Henry Drive MS-C4
             | Box 58168
             | Santa Clara, CA 95051-8168
             | (408) 987-7319
             | AppleLink: peirce1
             | Internet:  peirce@claris.com
             | uucp:      {ames,decwrl,apple,sun}!claris!peirce

simpson@trwarcadia.uucp (Scott Simpson) (10/07/89)

In article <10602@claris.com> peirce@claris.com (Michael Peirce) writes:
>In article <935@scaup.cl.cam.ac.uk> scc@cl.cam.ac.uk (Stephen Crawley) writes:
>>There is no reason why the revisions should not be pipelined.  And
>>what is wrong with some people using pre-validated compilers?  After
>>all, that is how much of the rest of the computing industry works at
>>the moment.  It is often better to use a new, somewhat flakey compiler
>>now if it offers significant benefits.
>Are you serious???  It's better to use a somewhat flakey compiler???
>
>Those of us trying to solve real world problems can't afford to use
>a "slightly flakey compiler".  When shipping product to make money to
>feed the kids, spending days tracking down weird bugs caused by flakey
>compilers isn't the way to stay in business!

I agree.  If you spend more time debugging your tool than creating your
application, you are spending too much time on someone else's product.
This is an easy and dangerous trap to fall into.  Interestingly, Barry
Boehm's COCOMO model lists the following effort multipliers for VIRT, or
virtual machine volatility (the "virtual machine" is the software tool
you are building on, e.g., compiler, database, etc.):

	Rating (Intermediate COCOMO):	Low	Nominal	High	Very High
	Effort multiplier:		0.87	1.00	1.15	1.30

These multipliers climb quickly, reflecting the additional time you
must spend debugging the tool you are using rather than building your
application.
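
To see what those multipliers mean in effort terms, here is a small Ada
sketch (the 100 person-month nominal figure is invented for illustration;
the multipliers are Boehm's, from the table above).  Holding every other
cost driver at 1.0, the adjusted effort is just the nominal effort scaled
by the VIRT multiplier:

    with Text_IO;

    procedure Virt_Demo is

       type Rating is (Low, Nominal, High, Very_High);

       -- The VIRT effort multipliers from the table above.
       VIRT : constant array (Rating) of Float :=
         (Low => 0.87, Nominal => 1.00, High => 1.15, Very_High => 1.30);

       -- Hypothetical nominal estimate for some project, person-months.
       Nominal_Effort : constant Float := 100.0;

       package Flt_IO is new Text_IO.Float_IO (Float);

    begin
       for R in Rating loop
          Text_IO.Put (Rating'Image (R) & " => ");
          Flt_IO.Put (Nominal_Effort * VIRT (R),
                      Fore => 3, Aft => 1, Exp => 0);
          Text_IO.Put_Line (" person-months");
       end loop;
    end Virt_Demo;

On those figures, a very-high-volatility toolset costs 30 percent more
effort than nominal, and about 50 percent more (1.30/0.87) than a stable
one: the point about flakey compilers, in hard numbers.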
	Scott Simpson
	TRW Space and Defense Sector
	usc!trwarcadia!simpson  	(UUCP)
	trwarcadia!simpson@usc.edu	(Internet)

jcardow@blackbird.afit.af.mil (James E. Cardow) (10/10/89)

rcd@ico.ISC.COM (Dick Dunn) writes:

>> ...Consider the ten to twenty year development cycle for large projects...

>If you have a ten-year development cycle for a software project, you're
>going to be producing obsolete software!  You can't help it.  Ten years is
>just too long for anything tied to computers--the technology moves too
>fast.

>You've got to get software up and working, and performing at least some of
>the needed functions *soon*.  You also need it to be adaptable, so that it
>can evolve as needs and technology change.

I must admit that my comments were made with only my own experience in mind, 
that being large DoD-sponsored projects with developments spanning two
to three computer generations.  However, that is the primary Ada environment.
In the military software support world we are for the most part just entering
support of JOVIAL systems.  Having been responsible for "selling" Ada
to the people attempting to prepare for "new" software, I'm convinced that 
injecting new technology, especially evolving technology, may very well be 
a cure more painful than the disease.  

Consider the problem in a real context: system software of 100,000+ 
lines of code, with supporting software at a 4:1 ratio.  Add to that 
simulator software that must function exactly like the real thing.  Now 
add unique hardware, especially processors.  If the system were 
stagnant and the budget available, the conversion to a new language would
be simple (or at least simpler).  But reality says the money for change is
small, and the user demand for improvements is large.  The changes come as
modification of 10 percent of a unit here, 5 percent there.  The only real
opportunity is when major systems are affected, but that is rare.

>What I'm getting at is that I think we're trying to address the wrong
>problem.  Rather than trying to solve "How do we deal with long development
>cycles?" we should be solving "How do we shorten the development cycles?"
>-- 
In the years I have spent chasing DoD software I have always worried about
how to get it delivered closer to the expected date; the idea of shorter 
never occurred to me.  But I'm changing roles now to teach software 
engineering and would greatly love to discuss ways to shorten the 
development cycle, or ways to inject new technology into old systems.  If you
have ideas on the subject within the context of large, complex systems, or 
know of any work in these areas, let me know.

As a side note, Ada can be added to older systems.  It takes convincing 
people that the benefits over the long run are worth the effort.