[comp.software-eng] Productivity and error rates for Ada projects

wtwolfe@hubcap.clemson.edu (Bill Wolfe) (03/04/90)

   From the November 1988 issue of IEEE Software, page 89 ("Large 
   Ada Projects Show Productivity Gains"): Productivity ranged 
   from 550 to 704 lines per staff-month at the 1.2-million-line 
   level -- a sharp contrast with the average productivity of the 
   1,500 systems in productivity consultant Lawrence Putnam's 
   database: only 77 lines per staff-month.  Reusable software
   developed on the project was counted only once, and reusable
   software not developed on the project was not counted at all.
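   For reference, here is a quick back-of-the-envelope check of the
   ratio implied by these figures, using only the numbers quoted above:

   ```c
   #include <stdio.h>

   int main(void)
   {
       double putnam_avg = 77.0;  /* lines/staff-month, Putnam database average */

       /* Quoted Ada range: 550 to 704 lines per staff-month */
       printf("low end:  %.1fx\n", 550.0 / putnam_avg);  /* about 7.1x */
       printf("high end: %.1fx\n", 704.0 / putnam_avg);  /* about 9.1x */
       return 0;
   }
   ```

   So the cited projects ran roughly seven to nine times the database
   average, whatever one thinks that average is worth.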

   Excerpts from a recent NASA internal study were recently
   published in the September/October 1989 SIGAda Ada Letters 
   (page 58): by the third Ada project, 42% of code was reused, 
   productivity was 33.9 noncomment lines per staff-day (that's 
   746 lines per staff-month), and there were only 1.0 defects per 
   thousand lines of code.  The study recommended that NASA should
   adopt Ada as its standard programming language. 
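   The per-day and per-month figures in the study are consistent if one
   assumes roughly 22 working days per staff-month (the 22-day figure is
   my assumption, not stated in the study):

   ```c
   #include <stdio.h>

   int main(void)
   {
       double lines_per_day      = 33.9;  /* NASA study: noncomment lines per staff-day */
       double workdays_per_month = 22.0;  /* assumed; not stated in the study */

       /* 33.9 * 22 = 745.8, which rounds to the quoted 746 lines/staff-month */
       printf("lines per staff-month: %.1f\n", lines_per_day * workdays_per_month);
       return 0;
   }
   ```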

   Does anyone know of any empirical results regarding the level of
   productivity and defect rate associated with C-language projects?

   It would be interesting to compare them to the results cited above.

 
   Bill Wolfe, wtwolfe@hubcap.clemson.edu

tada@athena.mit.edu (Michael J Zehr) (03/04/90)

In article <8221@hubcap.clemson.edu> wtwolfe@hubcap.clemson.edu (Bill Wolfe) writes:
>   From the November 1988 issue of IEEE Software, page 89 ("Large 
>   Ada Projects Show Productivity Gains"): 
>   [ada productivity higher and error rate lower than for C]
>
>   Does anyone know of any empirical results regarding the level of
>   productivity and defect rate associated with C-language projects?
>
>   It would be interesting to compare them to the results cited above.
>
>   Bill Wolfe, wtwolfe@hubcap.clemson.edu

One of the projects I've worked on is a hybrid of C and a 4GL called
Stratagem (sold by Computer Associates).  We don't have hard numbers on
productivity or errors, but I have a few observations.

1. Beyond a shadow of a doubt, productivity (measured in debugged code
modules) is much higher for the 4GL part than the C part.  Errors are
less common, and tend to be easier to find (though this might be partly
because more people had better training in the 4GL than in C).

2. If we had done the entire project in Stratagem, it would have been
finished much earlier, and there would have been fewer bugs.  And it
would have been totally useless for our client!

There is simply no way we could get the kind of speed we needed from the
program without resorting to C (or some other low-level language; note,
though, that there was really no question about which language we had to
use to go with Stratagem).

Sometimes you need to use a low-level language to do a job efficiently
at run-time.  Our company policy has always been to use the highest
level language that will yield acceptable run-time performance.  We use
C only when we have to.

[On a related issue, I noticed a much greater correlation between the
errors and the original coder than I did between errors and the language
being used.  And for a while I was doing most of the error-finding and
debugging for the project....]

Comparing Ada and C as programming languages is much like comparing
buses and cars as transportation.  They are each good for some things
and bad for some things.

-michael j zehr

xrtnt@amarna.gsfc.nasa.gov (Nigel Tzeng) (03/05/90)

In article <8221@hubcap.clemson.edu>, wtwolfe@hubcap.clemson.edu (Bill Wolfe) writes...

>   Excerpts from a recent NASA internal study were recently
>   published in the September/October 1989 SIGAda Ada Letters 
>   (page 58): by the third Ada project, 42% of code was reused, 
>   productivity was 33.9 noncomment lines per staff-day (that's 
>   746 lines per staff-month), and there were only 1.0 defects per 
>   thousand lines of code.  The study recommended that NASA should
>   adopt Ada as its standard programming language. 
> 

I believe that at the Software Engineering Symposium at Goddard last year 
there was a report on the response of the various NASA centers on this issue.
There was support from most centers for adopting Ada, but the primary
real-time shops wanted both C and Ada adopted.  If I can find my notes from
that conference I can get more info (as in why the various centers disagreed).

I recall that the majority of the centers, while generally supporting Ada,
wanted to evaluate the impact of adopting it before doing so.

Nigel Tzeng
xrtnt@csdr.gsfc.nasa.gov

-------------------------------------------------------------------------------
No Nifty Sayings...This space unintentionally left Blank...

karl@haddock.ima.isc.com (Karl Heuer) (03/05/90)

In article <1990Mar3.235039.13870@athena.mit.edu> tada@athena.mit.edu (Michael J Zehr) writes:
>On a related issue, i noticed a much greater correlation between the
>errors and the original coder than i did between errors and the language
>being used.

I think you've just provided the key observation.

Karl W. Z. Heuer (karl@ima.ima.isc.com or harvard!ima!karl), The Walking Lint

djones@megatest.UUCP (Dave Jones) (03/07/90)

From article <8221@hubcap.clemson.edu>, by wtwolfe@hubcap.clemson.edu (Bill Wolfe):
> 
>    From the November 1988 issue of IEEE Software, page 89 ("Large 
>    Ada Projects Show Productivity Gains"): Productivity ranged 
>    from 550 to 704 lines per staff-month at the 1.2-million-line 
>    level -- a sharp contrast with the average productivity of the 
>    1,500 systems in productivity consultant Lawrence Putnam's 
>    database: only 77 lines per staff-month.



Does this mean that if I program in C rather than Ada, I can get
the job done with one-ninth the expenditure in lines of code?  That would
be an improvement, I'll say.  Over the last year, programming in C, I've
been turning it out at the rate of 1900 lines a month.  If I could get that
down to, say, 200, that would be great!

Or does it mean that if I program in Ada rather than C, I'll get nine
times as much real work done? Sign me up. Nine production compilers in a
year. I could make a few bucks at that rate.

Or is it possibly an inherently meaningless statistic, made even more
worthless by a complete lack of controls?

Cut me some slack, Jack.

mitchell@community-chest.uucp (George Mitchell) (03/07/90)

In article <12236@goofy.megatest.UUCP> djones@megatest.UUCP (Dave Jones) wrote:
`Does this mean that if I program in C rather than Ada, I can get
`the job done with one-ninth the expenditure in lines of code? 
` ....
`Or does it mean that if I program in Ada rather than C, I'll get nine
`times as much real work done?  Sign me up.

If you look at some of the work of T. Capers Jones, you will see that,
as third-generation languages, C and Ada show little difference in
SLOC count per function point.
--
/s/ George   vmail:  703/883-6029
email:  gmitchel@mitre.org    [alt: mitchell@community-chest.mitre.org]
snail:  GB Mitchell, MITRE, MS Z676, 7525 Colshire Dr, McLean, VA  22102