[comp.object] ada-c++ productivity

johnsonc@wdl1.wdl.loral.com (Craig C Johnson) (03/08/91)

Can anyone provide references which address productivity (either development
or maintenance) utilizing a full object-oriented language (say C++) vs.
Ada?  I see lots of flames and anecdotal data but little hard data.

I'll be happy to summarize to the net.

Craig Johnson
johnsonc@wdl1.wdl.loral.com

wiese@forwiss.uni-passau.de (Joachim Wiese) (03/10/91)

johnsonc@wdl1.wdl.loral.com (Craig C Johnson) writes:

>Can anyone provide references which address productivity (either development
>or maintenance) utilizing a full object-oriented language (say C++) vs.
>Ada?  I see lots of flames and anecdotal data but little hard data.

C++ is a "better C" but not a full OO-language. C++ is an OO-language that
 offers a lot of flexibility. To much flexibility and to
much pointers (to much C) to lead to quality software and productivity.
 I would rather be interested in comparing a _real_ 
 "full OO-language" as  EIFFEL vs. ADA.

A book that addresses such issues as productivity, reusability and
maintenance is "Object-Oriented Software Construction"
by Bertrand Meyer, Prentice Hall, 1988.
It compares features of ADA and EIFFEL (exception handling, modularisation etc.).

-- 
--------  O   Joachim Wiese - \O/  ---------------------------   O  ---------
-------- /!\  Uni. Passau----  !   wiese@forwiss.uni-passau.de  /!\ ---------
-------- / \  GERMANY ------- / \  ---------------------------  / \ ---------

jbuck@galileo.berkeley.edu (Joe Buck) (03/14/91)

In article <1991Mar10.151220.2581@forwiss.uni-passau.de>, wiese@forwiss.uni-passau.de (Joachim Wiese) writes:
|> C++ is a "better C" but not a full OO-language. It is an OO-language that
|> offers a lot of flexibility: too much flexibility and too many
|> pointers (too much C) to lead to quality software and productivity.
|> I would rather be interested in comparing a _real_
|> "full OO-language" such as EIFFEL vs. ADA.

C++ and Eiffel are full object-oriented languages; Ada is not.  Some
definitions, from Grady Booch's "Object-Oriented Design With Applications",
which has been generally recognized in this group as one of the best texts
available:

"Object-oriented programming is a method of implementation in which programs
are organized as cooperative collections of objects, each of which represents
an instance of some class, and whose classes are all members of a hierarchy
of classes united via inheritance relationships."

He quotes Cardelli and Wegner's definition:

"A language is object-oriented if and only if it satisfies the following
requirements:

- It supports objects that are data abstractions with an interface of named
  operations and a hidden local state

- Objects have an associated type (class)

- Types (classes) may inherit attributes from supertypes (superclasses)"

Ada lacks attribute #3 and is therefore not an object-oriented language.
Languages that satisfy the first two clauses but not the third are called
by Cardelli and Wegner "object-based" languages; Ada is an object-based
language.  C++ satisfies all three attributes.
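
To make the three requirements concrete, here is a minimal C++ sketch
(an invented illustration of mine, not something from Booch or Wegner;
the class names are made up):

#include <iostream>

// Requirement 1: an object with named operations and hidden local state.
class Account {
public:
    Account() : balance(0) {}
    void deposit(long cents) { balance += cents; }
    long current() const { return balance; }
private:
    long balance;              // the hidden local state
};

// Requirement 3: a type that inherits attributes from a supertype.
class SavingsAccount : public Account {
public:
    void credit_interest() { deposit(current() / 100); }  // reuses inherited ops
};

int main() {
    SavingsAccount s;          // Requirement 2: the object has an associated class
    s.deposit(10000);
    s.credit_interest();
    std::cout << s.current() << "\n";   // prints 10100
}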

Eiffel certainly has some nice features lacking in C++, but then C++
has some nice features lacking in Eiffel.  And of course it is possible
to write bad programs in any language.


--
Joe Buck
jbuck@galileo.berkeley.edu	 {uunet,ucbvax}!galileo.berkeley.edu!jbuck	

eachus@aries.mitre.org (Robert I. Eachus) (03/15/91)

In article <11966@pasteur.Berkeley.EDU> jbuck@galileo.berkeley.edu (Joe Buck) writes:

   C++ and Eiffel are full object-oriented languages; Ada is not....
   [lots deleted]

   "A language is object-oriented if and only if it satisfies the following
   requirements:

   - It supports objects that are data abstractions with an interface of named
     operations and a hidden local state

   - Objects have an associated type (class)

   - Types (classes) may inherit attributes from supertypes (superclasses)"

   Ada lacks attribute #3 and is therefore not an object-oriented language.
   Languages that satisfy the first two clauses but not the third are called
   by Cardelli and Wegner "object-based" languages; Ada is an object-based
   language.  C++ satisfies all three attributes... [more deleted]

   I guess you never heard of (Ada) derived types?  There are some
problems with using derived types to implement Smalltalk-like class
hierarchies; these are being addressed in Ada 9X.  There are also
people like me who prefer to build class hierarchies using generics,
and you build such hierarchies in a much different fashion than in
Smalltalk, but Ada is definitely an OO language.
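
(For readers who know C++ but not Ada: a rough, invented sketch of the
parameterized style in C++ templates, which play much the same role as
Ada generics.  This is only an analogy, not Eachus's actual technique.)

// A generic bounded queue in the spirit of an Ada generic package.
template <typename Element, int Max = 16>
class Queue {
public:
    Queue() : first(0), count(0) {}
    bool put(const Element& e) {           // reject on overflow
        if (count == Max) return false;
        items[(first + count++) % Max] = e;
        return true;
    }
    bool get(Element& out) {               // reject on underflow
        if (count == 0) return false;
        out = items[first];
        first = (first + 1) % Max;
        --count;
        return true;
    }
private:
    Element items[Max];                    // Element must be default-constructible
    int first, count;
};

// Each instantiation is a distinct concrete type, much as each Ada
// generic instantiation yields a distinct package.
Queue<int>  int_queue;
Queue<char> char_queue;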

   Now if you want to argue about whether multiple inheritance in Ada
(or any language) is a good idea, that is subject to debate.

--

					Robert I. Eachus

with STANDARD_DISCLAIMER;
use  STANDARD_DISCLAIMER;
function MESSAGE (TEXT: in CLEVER_IDEAS) return BETTER_IDEAS is...

jordan@aero.org (Larry M. Jordan) (03/16/91)

"Something's" missing from this definition of OOP--Dynamic binding.
(also, OOPs don't require classes--see SELF).  I'd say we dismiss
the term OOP and talk about langs in term of features.

Inh. and dyn. binding are not enough.  There are other language
features I find desirable--parameterized types and exceptions.  What 
about library management?  What about environements with integrated 
debuggers? (I'd sure hate to build anything bigger than a hello world 
program without a library manager.  ).  Some may finding tasking essential.  
I did a comparison of lang. features a while back.  Some may find 
this interesting. 

Language	Inheritance	DynamicBnd'g	Type[T]	Except.	Task 	LibMgr
C++		yes-MI/class	yes-vft		no[1]	no[2]	no	hand
Objective-C	yes-SI/class	yes-tlu		no	no	no	hand
Smalltalk	yes-SI/class	yes-tlu		no	no	no	?
SELF		yes-MI/object	yes-?		no	no	no	?
TP6.0		yes-SI/class	yes-vft		no	no	no	auto
Eiffel		yes-MI/class	yes-vft		yes	yes	no	auto
Ada		yes[3]		no		no[4]	yes	yes	auto

MI=mult. inh.
SI=single inh.
vft=virtual function tables
tlu=run-time table lookup with caching
class=class-based inh.
object=object-based inh.
hand=hand crank a make file
auto=automated
?=don't know

[1] available next version
[2] available next version?
[3] derived types are not assignment compatible without casting.
[4] parameterized procedures and packages but not types		

I hate the C'ness of C++, but I find myself implementing many things in
C++ just because of inheritance and dynamic binding.  If Ada is ever
to become mainstream (and I seriously hope it does) inheritance and
dyn. binding had better be incorporated into the language.
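
For readers unfamiliar with the jargon, a minimal invented C++ sketch of
what dynamic binding buys you ("vft" in the table above refers to the
virtual function table the compiler builds for exactly this kind of call):

#include <iostream>

struct Shape {
    virtual double area() const = 0;   // bound at run time through the vft
    virtual ~Shape() {}
};

struct Circle : Shape {
    double r;
    Circle(double r_) : r(r_) {}
    double area() const { return 3.14159265 * r * r; }
};

struct Square : Shape {
    double s;
    Square(double s_) : s(s_) {}
    double area() const { return s * s; }
};

// One call site, many behaviors: no switch statement to maintain.
double total_area(Shape* shapes[], int n) {
    double t = 0;
    for (int i = 0; i < n; ++i)
        t += shapes[i]->area();
    return t;
}

int main() {
    Circle c(1.0); Square q(2.0);
    Shape* v[] = { &c, &q };
    std::cout << total_area(v, 2) << "\n";   // about 7.14
}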

--Larry

craig@elaine35.Stanford.EDU (Craig Chambers) (03/16/91)

In article <1991Mar15.224626.27077@aero.org> jordan@aero.org (Larry M. Jordan) writes:
>Language	Inheritance	DynamicBnd'g	Type[T]	Except.	Task 	LibMgr
>C++		yes-MI/class	yes-vft		no[1]	no[2]	no	hand
>Objective-C	yes-SI/class	yes-tlu		no	no	no	hand
>Smalltalk	yes-SI/class	yes-tlu		no	no	no	?
>SELF		yes-MI/object	yes-?		no	no	no	?
>TP6.0		yes-SI/class	yes-vft		no	no	no	auto
>Eiffel		yes-MI/class	yes-vft		yes	yes	no	auto
>Ada		yes[3]		no		no[4]	yes	yes	auto

I'd argue that the mechanism for implementing dynamic binding is not
so important, certainly not as important as many of the other
features you list.  But if you want to include it, Self's
implementation of dynamic binding is based on hash table lookup with
possible optimizations like in-line caching (also present in the
ParcPlace Smalltalk-80 implementation), static binding, and inlining.
I'd also guess that Eiffel uses tlu instead of vft to implement
dynamic binding.
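
(A toy, invented C++ sketch of the table-lookup style of dispatch, as
opposed to an indexed virtual function table; real Smalltalk and Self
implementations are far more sophisticated, with hashing, caches, and
dynamic compilation:)

#include <cstdio>
#include <cstring>

typedef void (*Method)(void* self);                 // a "compiled method"

struct Entry { const char* selector; Method method; };

static void print_point(void*) { std::puts("a Point"); }

static Entry point_methods[] = {                    // the class's method table
    { "printOn", print_point },
    { 0, 0 }
};

// tlu-style dispatch: find the method by selector name at run time.
void send(Entry* methods, const char* selector, void* self) {
    for (Entry* e = methods; e->selector; ++e)
        if (std::strcmp(e->selector, selector) == 0) { e->method(self); return; }
    std::puts("doesNotUnderstand:");                // no such method
}

int main() {
    send(point_methods, "printOn", 0);   // looked up by name, then called
    send(point_methods, "area", 0);      // falls through to the error case
}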

Both Smalltalk and Self support multiple lightweight communicating
threads within a single address space, so those values should probably
read "yes" (depending on what you mean by "Task").

Also, both Smalltalk and Self implementations are effectively one big
library manager.  If you require some sort of separate compilation,
then they don't have it, but if you just want automatic linking and
compilation after programming changes or the ability to invoke code of
objects/classes written by other people, then they are nice
environments.

-- Craig Chambers

P.S.  The earlier mention of "Cardelli and Wegner's" definition of OO
should be attributed to just Wegner.  Wegner's definitions are overly
narrow and constrained in my view, and I believe that Cardelli would
want to be able to form his own definitions.

johnson@cs.uiuc.EDU (Ralph Johnson) (03/17/91)

|> Language	Inheritance	DynamicBnd'g	Type[T]	Except.	Task 	LibMgr
|> Smalltalk	yes-SI/class	yes-tlu		no	no	no	?

The latest versions of Smalltalk-80 from ParcPlace (ObjectWorks for
Smalltalk 2.5 and 4.0) have exceptions.  Smalltalk-80 has always had
processes and semaphores, but the user interface was not compatible
with running things in the background until 4.0.  I know, because I
hacked the user interface to make background processes useful.

There have been several versions of Smalltalk-80 that had multiple
inheritance, but the Smalltalk implementors never cared for it very
much so it is not well supported and probably isn't in the ParcPlace
official versions.

Last, but not least, I'm implementing (more accurately, my students
are implementing) a type system for Smalltalk, so someday that column
may change, too.

Ralph Johnson -- University of Illinois at Urbana-Champaign

ted@grebyn.com (Ted Holden) (03/17/91)

Productivity of C++ users will vary according to skills, experience
levels, tools available, such as the fabulous new Borland interface, and
the task at hand.  An idea of productivity in Ada projects may be had
from the Feb. 11 issue of Federal Computer Week:

   "The GSA Board of Contract appeals issued a ruling last month that
   could affect how the military evaluates the cost effectiveness of Ada
   software.

   "The board upheld a decision by the Air Force to award a contract to
    a high-priced bidder based on a measure of productivity that equals
    three lines of Ada code per programmer per day.

    "A lower priced bidder, and others in the Ada community, said this
    standard is much too low.  The protester in the case, DynaLantic
    Corp, offered an average of ten lines of code per day per
    programmer.

    "Three lines of code per day is absurd [as if ten wasn't], said
    Ralph Crafts, editor of a newsletter on Ada, and an expert
    witness for the protester.....

Whether any realistic combination of events exists which could reduce
Pascal, C, or C++ programmers to this level of productivity is anybody's
guess;  my own opinion is that most C programmers would require a bullet
through the brain to be brought to such a level.

The really comical thing about this is the way in which Ada gurus cite
"productivity" as the main advantage of Ada.  Apparently, they use the
phrase in somewhat the same vein as Mr. Hussein uses terms like "moral
victory".

Ted Holden

jls@rutabaga.Rational.COM (Jim Showalter) (03/17/91)

Semantics. An object is defined as something with state that suffers
actions. Thus, a Boolean global variable is an object, and two seconds
of reflection will tell you that just about anything qualifies as an
object. Ada does a fine job of representing objects, thus, it is
object-oriented.

But not, of course, by THIS definition:

>"A language is object-oriented if and only if it satisfies the following
>requirements:

>- It supports objects that are data abstractions with an interface of named
>  operations and a hidden local state

>- Objects have an associated type (class)

>- Types (classes) may inherit attributes from supertypes (superclasses)"

But wait, the only thing missing is #3 (by the way, derived types get you
part way there). And #3 doesn't even CONTAIN the word "object". What it
talks about is "inheritance".

Aha! So the argument comes down to whether a language is inheritance
oriented or not, which I view as a much more significant distinction.
Object-oriented certainly includes Ada. Inheritance-oriented certainly
does not include Ada (where are you, 9x?).

Personally, I'm much more interested in whether or not a language is
SOFTWARE ENGINEERING oriented or not. Ada and C++ certainly qualify,
with spec/body separation, formalization of interface, opaque types,
strong typing, etc. Languages like C and FORTRAN and COBOL don't make
the cut.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

jls@rutabaga.Rational.COM (Jim Showalter) (03/17/91)

>I hate the C'ness of C++, but I find myself implementing many things in
>C++ just because of inheritance and dynamic binding.  If Ada is ever
>to become mainstream (and I seriously hope it does) inheritance and
>dyn. binding had better be incorporated into the language.

Enlighten me. How is it that many of the largest software systems
ever attempted--including all of the flight control software for
North America and all the software for the Space Station--are being
written in Ada, even though Ada doesn't have "dynamic binding"?

Second question: assume Ada got dynamic binding tomorrow. What could
be done with it that can't be done with it today?
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

ftpam1@acad3.alaska.edu (MUNTS PHILLIP A) (03/17/91)

In article <1991Mar16.205228.4268@grebyn.com>, ted@grebyn.com (Ted Holden) writes...
>Productivity of C++ users will vary according to skills, experience
>levels, tools available, such as the fabulous new Borland interface, and
>the task at hand.  An idea of productivity in Ada projects may be had
>from the Feb. 11 issue of Federal Computer Week:

..

>    "Three lines of code per day is absurd [as if ten wasn't], said
>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>    witness for the protester.....
> 
>Whether any realistic combination of events exists which could reduce
>Pascal, C, or C++ programmers to this level of productivity is anybody's
>guess;  my own opinion is that most C programmers would require a bullet
>through the brain to be brought to such a level.

..

     Actually, 10 lines per day isn't all that unreasonable, averaged over
a year, for example.  The key word is AVERAGED: on a good day I may write
hundreds of lines of code; I may spend the next day trying to find an 
obscure bug in one of those lines.

     A programmer also does a lot of other things like attending meetings,
reading documentation, writing documentation, studying a problem, answering
the phone, filling out forms, reformatting that [censored] hard disk, etc.
Then there is waiting for the network to come back up, putting paper in the
laser printer...

     In theory, you are supposed to spend most of your time DESIGNING rather
than CODING, as well.  I have spent days perfecting algorithms that were set
down in code in hours or even minutes.

     In the ideal case, the manager would chain his slaves, er, employees to
their computer until the job is done.  In practice things don't work out
that way.  The whole idea of "lines per day" is pretty artificial anyway.
(The contract says x lines per day so we'll just pad things out with a little
whitespace...)  What matters is whether a product is delivered on time or
not, and if it meets the spec.

CAVEAT: Most of what I do is in assembly language or Turbo Pascal.  I don't
particularly like C or C++ (or assembly language for that matter) and I haven't
found an Ada compiler that generates decent code.  (I am constrained to the
low end of the marketplace, being self- rather than government-employed.)

Philip Munts N7AHL
NRA Extremist, etc.
University of Alaska, Fairbanks

rreid@ecst.csuchico.edu (Ralph Reid III) (03/17/91)

In article <1991Mar16.205228.4268@grebyn.com> ted@grebyn.UUCP (Ted Holden) writes:
> . . .
>from the Feb. 11 issue of Federal Computer Week:
>
>   "The GSA Board of Contract appeals issued a ruling last month that
>   could affect how the military evaluates the cost effectiveness of Ada
>   software.
>
>   "The board upheld a decision by the Air Force to award a contract to
>    a high-priced bidder based on a measure of productivity that equals
>    three lines of Ada code per programmer per day.
>
>    "A lower priced bidder, and others in the Ada community, said this
>    standard is much too low.  The protester in the case, DynaLantic
>    Corp, offered an average of ten lines of code per day per
>    programmer.
>
>    "Three lines of code per day is absurd [as if ten wasn't], said
>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>    witness for the protester.....
> . . .


I don't know where these companies are digging up these kinds of
unproductive machine operators (I hesitate to call them real
programmers), but they would never get through the computer science
program here at Chico State.  It kind of makes me wonder what schools
they came from, if they even have degrees.  The kind of productivity
discussed in this article sounds like the level I might expect from
beginning programming students at a junior college.  I would like to
know what in this world could reduce a serious programmer's
productivity to these levels.

-- 
Ralph.  SAAC member.
ARS: N6BNO
Compuserve: 72250.3521@compuserve.com
email: rreid@cscihp.ecst.csuchico.edu

csq031@umaxc.weeg.uiowa.edu (03/18/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu> rreid@ecst.csuchico.edu (Ralph Reid III) writes:
>I don't know where these companies are digging up these kinds of
>unproductive machine operators (I hesitate to call them real
>programmers), but they would never get through the computer science
>program here at Chico State.  It kind of makes me wonder what schools
>they came from, if they even have degrees.  The kind of productivity
>discussed in this article sounds like the level I might expect from
>beginning programming students at a junior college.  I would like to
>know what in this world could reduce a serious programmer's
>productivity to these levels.
>
>-- 
Lines of code per day is an absurd measure at best.  Using it in contracts
is just a way of lulling people who care about such things into thinking
that programmers (and software companies) know what they're doing and
that their output is quantifiable.

The actual situation (as most people know) is that when you set out to
write something non-trivial, that hasn't been done already, you're
much more like Lewis and Clark setting out in canoes than you are like
a machinist putting a block of steel into the lathe. This scares the
living shit out of bean counters.

But anyway, if you measure lines/day after the fact, i.e. after the
program has been designed, written, tested, documented, beta-tested, and
accepted as done by the customer, you'll find 3-10 lines of code per
day per programmer to be fairly respectable.  If the program really
works well, most customers wouldn't care if they only did 1 line per
day, so long as it was finished in a timely manner.


--
             Kent Williams --- williams@umaxc.weeg.uiowa.edu 
"'Is this heaven?' --- 'No, this is Iowa'" - from the movie "Field of Dreams"
"This isn't heaven, ... this is Cleveland" - Harry Allard, in "The Stupids Die"

jls@rutabaga.Rational.COM (Jim Showalter) (03/18/91)

>Whether any realistic combination of events exists which could reduce
>Pascal, C, or C++ programmers to this level of productivity is anybody's
>guess;  my own opinion is that most C programmers would require a bullet
>through the brain to be brought to such a level.

1) If those Pascal, C, or C++ programmers were required to operate under
   DoD-Std-2167/A standards, their productivity would drop by two orders
   of magnitude automatically. This is not a language issue.

2) SLOC/day tends to decrease drastically as a function of complexity.
   Complexity depends on a number of factors, including sheer size,
   number of programmers, number of configurations, number of targets,
   number of hosts, number of languages, number of contractors, etc etc
   etc.

   I've been meaning to ask Mr. "I Live In The Real World" Holden this
   question for two years: how complex are the systems on which Mr. Holden
   works? If the answer is that he's working with three other guys in a garage
   writing device drivers for PC's, I'm sorry, but I'm really not very
   impressed--one should EXPECT high SLOC/day rates for what is, in essence,
   a solved problem (i.e., programming in the small). It is programming in
   the large that is still a black art for most companies, and it is on
   such projects that low productivity rates are experienced. That Ada
   tends to be the language of choice on such projects should not be used
   against it when the rates are low--the REASON Ada is the language of
   choice is that other languages, including COBOL, FORTRAN, and C, are
   simply not up to the job.

3) I have personally contracted on a 4,000,000 SLOC C/C++ project
   that was lucky to achieve 3 lines a day, on a GOOD day. The programmers
   had not, as Mr. Holden claims, been shot in the head--they were just
   suffering the same liabilities as anybody else who is trying to build
   extremely large systems using a stone-knives-and-bearskins kind of
   technology and paradigm.

4) I am able to program in Pascal, C, C++, and Ada. Can Mr. Holden make
   the same claim, or does he damn Ada from, as I suspect is the case,
   a position of relative ignorance? He certainly SOUNDS ignorant.

>The really comical thing about this is the way in which Ada gurus cite
>"productivity" as the main advantage of Ada.

Productivity rates range from no gain to order of magnitude gains. We
have lots of success stories backed up by factual accounting data if
Mr. Holden would care to read them.

But I suspect he would find the truth inconvenient.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

mfeldman@seas.gwu.edu (Michael Feldman) (03/18/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu> rreid@ecst.csuchico.edu (Ralph Reid III) writes:
>>    a high-priced bidder based on a measure of productivity that equals
>>    three lines of Ada code per programmer per day.
>>
>>    "A lower priced bidder, and others in the Ada community, said this
>>    standard is much too low.  The protester in the case, DynaLantic
>>    Corp, offered an average of ten lines of code per day per
>>    programmer.
>>
>>    "Three lines of code per day is absurd [as if ten wasn't], said
>>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>>    witness for the protester.....
>> . . .
Hmmmm. Interesting to see this pop up on the net. It happens I was also
involved in this case (somehow they got the idea I was an expert in
something) and so have read the decision. You may recall that I posted
a question about productivity measures a while back; now you know why.

The answers I got were varied and interesting; the only generalization
in them was that nothing can be generalized. Going back to the Ed Yourdon et al
books of 15 years ago, one finds that in those days end-to-end software
productivity was cited in the neighborhood of 5-10 LOC per day, independent
of language. Yourdon et al were making those numbers an argument for 
high-level languages, because 5-10 assembler instructions and 5-10 lines
of HLL code clearly have different functionality, yet the studies showed
that the 5-10 was "language-independent." The purpose of "structured
programming" and all that was to raise the productivity. More recent
reports showed that the range indeed doubled, to 10-20, after the use of
these "new programming technologies," as they were called then.

In preparing for the case I dug into the literature, good professor that
I am. Everything I could find in the literature on Ada productivity on real
projects pointed somewhere in the neighborhood of 30-40 LOC/day. Remember
that these are _average_ numbers, end-to-end, including documentation and
all that. (sources: Reifer's productivity studies, WAdaS and TRi-Ada papers,
and the like). The 10 LOC/day offered by the company in question for a
relatively small project (~15k LOC) was, in my opinion, _quite conservative_
given the numbers I just cited. The company knew the application area.

That the USAF people should have derided the 10 LOC estimate as _overly 
optimistic_, preferring an absurdly low number of 3 LOC, sends me a
message that the government (or at least some people in it) are afraid
of Ada and/or not really interested in productivity. I'm under nondisclosure
or I'd post some details that are so sad they're funny. The administrative
law judge's decision is public record, so it's safe to comment. This guy
not only upheld the award (I figured he would because this was a contract 
where USAF had a lot of discretion) but gratuitously accused the company
of "low-balling," that is, bidding TOO-HIGH productivity in order to get the
contract.

Given the "where cost-effective" clause in the legislative mandate for
Ada, this case could be an awful precedent, because the government can
accuse anyone bidding high productivity (arguing Ada's cost-effectiveness)
of low-balling. Those in the government determined to resist any kind
of progress (it doesn't even have to be _Ada_ progress) can have a field
day shooting down high(er) productivity estimates. Let's hope most of the
government is too smart or has too high a sense of integrity to pull
tricks like this. Stay tuned: there may be appeals in the case.

My work in this weird case convinced me even more that this whole crazy
industry still doesn't really know how to estimate things. LOC/day is
not a terrific measure of anything. It's all we seem to have, though.
And people use it as though it were gospel.

Mike Feldman

euamts@eua.ericsson.se (Mats Henricson) (03/18/91)

csq031@umaxc.weeg.uiowa.edu writes:

>Lines of code per day is an absurd measure at best.  Using it in contracts
>is just a way of lulling people who care about such things into thinking
>that programmers (and software companies) know what they're doing and
>that their output is quantifiable.

>The actual situation (as most people know) is that when you set out to
>write something non-trivial, that hasn't been done already, you're
>much more like Lewis and Clark setting out in canoes than you are like
>a machinist putting a block of steel into the lathe. This scares the
>living shit out of bean counters.

>But anyway, if you measure lines/day after the fact, i.e. after the
>program has been designed, written, tested, documented, beta-tested, and
>accepted as done by the customer, you'll find 3-10 lines of code per
>day per programmer to be fairly respectable.  If the program really
>works well, most customers wouldn't care if they only did 1 line per
>day, so long as it was finished in a timely manner.

I don't understand this line-counting AT ALL.  Not just that it's a stupid
way of measuring productivity, but the fact that one line of code can
mean so much.  If I as a C++ programmer have a huge, easy-to-browse, well-
documented and well-designed library of classes, one line of code can
do SO much work, without me ever having to care HOW it works.  When I
can in the future get my dirty hands on such libraries, you can start talking
about productivity.  I could design my application for a while, and then
throw well-designed classes into a file, and let them do the work.
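
(A hedged illustration with today's off-the-shelf machinery rather than
the kind of in-house class library Mats describes: the single call to
the library's sort routine below is "one line that does so much work".)

#include <algorithm>
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> names;
    names.push_back("Wirth"); names.push_back("Meyer");
    names.push_back("Stroustrup"); names.push_back("Ichbiah");

    std::sort(names.begin(), names.end());   // the one-liner; HOW is hidden

    for (std::size_t i = 0; i < names.size(); ++i)
        std::cout << names[i] << "\n";
}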

This is why I think that OO-languages will be the long-term winner where
productivity is concerned.

Mats Henricson

klapper@oravax.UUCP (Carl Klapper) (03/18/91)

In article <4921@ns-mx.uiowa.edu>, csq031@umaxc.weeg.uiowa.edu writes:
> Lines of code per day is an absurd measure at best.  Using it in contracts
> is just a way of lulling people who care about such things into thinking
> that programmers (and software companies) know what they're doing and
> that their output is quantifiable.
> 
> The actual situation (as most people know) is that when you set out to
> write something non-trivial, that hasn't been done already, you're
> much more like Lewis and Clark setting out in canoes than you are like
> a machinist putting a block of steel into the lathe. This scares the
> living shit out of bean counters.

There are, however, many cases where the productivity of the job which 
the program assists can be estimated, with and without the program.
The (presumably positive) differential of productivity with the program
over productivity without it provides an economically valid measure
of the value of the program. Expressing this measure in percentage terms
would remove any bias related to the volume of usage and thus provide
a measure of the productivity of the programmers over the life of the project.
Dividing by the number of hours spent on the project provides a measure
of hourly productivity.
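
(An invented illustration: if clerks process 100 claims a day without the
program and 125 with it, the program is worth a 25% productivity gain; if
the project consumed 5,000 programmer-hours, that is 0.005% of gain per
programmer-hour, a figure that can be compared across projects of very
different sizes and usage volumes.)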

In simple terms, "the ends justify the means" so we ought to measure
the value of the end result rather than the number of keystrokes, the number
of lines of code. This will also "scare the living shit out of the bean 
counters" because it is a post hoc measure. That is, managers of computer
programmers will still have to take risks and can only moderate those risks
by knowing thoroughly their project's task and tools. Only after the scores
come in will they know whether their team was good.

I do not mean to suggest that this measure is easy to calculate.
It should include code maintenance in the before and after use measures;
but, in most cases, time spent on maintenance can only be guessed.
Separating the programmers' contribution from that of the platform
could prove difficult. Assessing the individual contributions of project
members is an unwieldy task, though this might be ameliorated by dividing
the work into modules and only allowing one programmer to work on each module.
We will have to devise multipliers for reusable modules written on the project
and estimates of the value of those modules where they have been inserted.
Finally, the productivity of the job which the program assists may be hard
or impossible to measure, and the program could even create new uses 
for itself.

I only claim that this measure, which I may style the "Machiavellian measure",
makes economic sense whereas the LOC productivity measure plainly does not.
In fact, LOC are a liability, not an asset, a use or even abuse of resources
rather than a useful product. We would not reward a chef for the number
of lines in his recipe, but a program listing is just a recipe. For the chef,
the proof is in the pudding. Surely, for the programmer, the proof is in
the execution and use of the program.

+-----------------------------+--------------------------------------------+
|  			 _    | Carl Klapper				   |
|  Love Your Mother.	(_)   | Odyssey Research Associates, Inc.	   |
|                      earth  | 301A Harris B. Dates Drive		   |
|  Sell your car.	      | Ithaca, NY  14850			   |
|  			      | (607) 277-2020				   |
|                             | klapper@oracorp.com			   |
+-----------------------------+--------------------------------------------+

arny@cbnewsl.att.com (arny.b.engelson) (03/19/91)

In article <1991Mar16.205228.4268@grebyn.com> ted@grebyn.UUCP (Ted Holden) writes:
>   "The GSA Board of Contract appeals issued a ruling last month that
>   could affect how the military evaluates the cost effectiveness of Ada
>   software.
>
>   "The board upheld a decision by the Air Force to award a contract to
>    a high-priced bidder based on a measure of productivity that equals
>    three lines of Ada code per programmer per day.
>
>    "A lower priced bidder, and others in the Ada community, said this
>    standard is much too low.  The protester in the case, DynaLantic
>    Corp, offered an average of ten lines of code per day per
>    programmer.
>
>    "Three lines of code per day is absurd [as if ten wasn't], said
>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>    witness for the protester.....
>
>Whether any realistic combination of events exists which could reduce
>Pascal, C, or C++ programmers to this level of productivity is anybody's
>guess;  my own opinion is that most C programmers would require a bullet
>through the brain to be brought to such a level.
>
>Ted Holden

You don't quote prices for many large Air Force proposals, do you Ted?
A quoted productivity rate of 10 lines per staff day for the entire
development cycle is not unusual, REGARDLESS OF THE LANGUAGE.  The
apparently dismal productivity is caused by time taken to do the many
other things required by the customer (Air Force), including preparing
for and holding requirements, design, and test reviews, preparing the
many documents required, etc.  I have seen very similar productivity
figures for C, Ada, and other languages in this type of job.  The actual
coding phase goes along merrily at 50 to 200 lines per day (depending
on the programmer, and not including overtime :-) ).  Too bad the Air
Force won't simply take our word for it that the code works, and that
it does everything they want it to, and that if it ever has to be
changed, we'll all be around to make those changes for them.

You really ought to look deeper into things before posting an inflammatory
article based on a one column article in Federal Computer Week (or
wherever).  By the way, you (and everyone else) should go read the
Ada 9X Mapping Document and the Mapping Rationale Document, since the
availability of that language spells the downfall of C/C++  :-).
Wait, it's a joke, stop the language war, it's a joke.  But the documents
ARE very interesting reading.

  -- Arny Engelson   att!wayback!arny   (arny@wayback.att.com)

martin@edwards-saftd-2.af.mil (03/19/91)

In article <1991Mar16.205228.4268@grebyn.com>, ted@grebyn.com (Ted Holden) writes:
> 
> The really comical thing about this is the way in which Ada gurus cite
> "productivity" as the main advantage of Ada.  Apparently, they use the
> phrase in somewhat the same vein as Mr. Hussein uses terms like "moral
> victory".
> 

Ada was not developed with the goal of improving raw "productivity" in terms of
lines of code per day.  Rather, it was designed to provide reliable and
maintainable code for large, long-lived, continuously evolving, performance
constrained, highly reliable embedded systems. In that harsh environment,
software maintenance costs are several times the cost of the original
development.

Ada was designed to reduce lifecycle costs by reducing maintenance costs.  This
was done at the expense of increased costs during the development of original
software. It is intended that the need to develop original software will be
substantially reduced through software component reuse.

This is a long term solution to a long term problem.  We are still some time
away from proving that it is the proper solution, even though many now believe
that it is.  

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - 
Gary S. Martin                !  (805)277-4509  DSN 527-4509
6510th Test Wing/TSWS         !  Martin@Edwards-SAFTD-2.af.mil
Edwards AFB, CA 93523-5000    ! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - 

stachour@sctc.com (Paul Stachour) (03/19/91)

>>from the Feb. 11 issue of Federal Computer Week:
>>
>>   "The board upheld a decision by the Air Force to award a contract to
>>    a high-priced bidder based on a measure of productivity that equals
>>    three lines of Ada code per programmer per day.
  ...
> I would like to
> know what in this world could reduce a serious programmer's
> productivity to these levels.
>email: rreid@cscihp.ecst.csuchico.edu

-----

   This question reminds me of the discussion in Gerald Weinberg's
"The Psychology of Computer Programming".  If I remember it right,
it was a discussion between two programmers.  Programmer B had just
completed reprogramming a task originally given to programmer A.
A was complaining about the speed of B's code, pointing out that
A's original code was much faster.  B replied with something to
the effect of << But your code isn't correct.  Mine is.  If mine
doesn't have to be correct, I can make it as fast as you want. >>

   The point about productivity is that it can only be properly
measured for "correct" code.  And knowing what the characteristics
of the code are is part of what makes it correct or not.  For example,
suppose I have a set of C code.  I compile it on 6 different C compilers.
It compiles without error, executes, and gives the same "right"
answer on all.  Now I give this code to 7 different programmers.
They each compile, link, load, and execute it.  For those 7
programmers, each of whom used a different C compiler, it gave
the wrong answers.  Yes, I may have had great productivity
in my original code.  But if everyone else gets the wrong answers
because I don't know about a critical environmental
component, or fail to document it, or fail to note that most
C compilers for physical-memory multi-tasking systems will not
get the answer right, is my code correct?  And given the time
for all of them to discover it, how much productivity was there
really?

   I can give you a great lines-of-code productivity figure if:

  1)  I can presume all my inputs are correct.
  2)  I can presume all the subroutines I call are correct.
  3)  I can presume that the hardware never flips bits.
      Nor are there any other hardware errors.
  4)  I can presume that the compiler always generates correct code.
  5)  I can presume that I have enough memory and CPU cycles so
      that I can always think "clarity" instead of "make it fit"
      or "do swapping yourself".
  6)  All my users speak English so that I don't have to worry about
      how to handle multi-lingual error messages.
  7)  My cleanup routines will always get called on power-down
      or any other kill-condition so that my disk-data will
      always be consistent.
  8)  The client has a clear, simple, and unambiguous specification so
      that I can spend all my time designing and implementing instead
      of trying to figure out what he really needs, instead of what he
      says he wants (which is full of contradictions).
  9)  I don't have to be compatible with existing applications that
      process the same data.
 10)  I have no numerical problems with overflow or zero-division.
 11)  I am working on a small enough project, in one single location,
      that I can get questions answered promptly.
 12)  I am allowed to work with hardware that is documented, instead of
      "new improved hardware" for which no documentation yet exists,
      and I have to play trial-and-error to see what it does now
      (which may be very different next week).
 13)  When I'm forced onto new hardware and software in the middle
      of a project, they are actually compatible with the old.

  All of these, and more, are assumptions that are often false.

  Sure, in the context of a simple, university-level, class assignment
that got thrown away as soon as done, I have great productivity also.
How many students at the university where you go/teach could write a
reusable component for something as simple as a stack?  I'd bet
that 95% couldn't.  I base that on me, my colleagues, and those I teach.
I base that on the fact that most hardware can't even do a shift
properly (well, at least the way that the documentation says it does it).
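
(As a gauge of what "reusable" demands, a hedged C++ sketch of a generic
stack; note that most of the lines exist to handle the seldom-happen
cases, and even this ignores allocation policy, concurrency, and
persistence:)

#include <cstddef>
#include <stdexcept>

template <typename T, std::size_t Capacity = 64>
class Stack {
public:
    Stack() : top(0) {}
    void push(const T& x) {
        if (top == Capacity) throw std::overflow_error("stack full");
        data[top++] = x;
    }
    T pop() {
        if (top == 0) throw std::underflow_error("stack empty");
        return data[--top];
    }
    bool empty() const { return top == 0; }
private:
    T data[Capacity];        // T must be default-constructible and assignable
    std::size_t top;
};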

  If one accepts the premise that 90% of good-code is for the conditions
that happen less than 10% of the time (personally, I'll even say 1%
of the time), then one can get a 10-fold increase in productivity by
merely ignoring error-conditions.  And if you buy component code
from a commercial vendor, as we sometimes do, you can make some
pretty big discoveries of bugs.  We bought some code.  We've used it
for several years.  In the first year, we found only a few bugs.
In the second year, we really started to use the code, and found a few
more.  In the third year, we discovered that there were enough
seldom-happen items not covered by the code that it was not useful.
And so we are re-writing it.  And this time (I hope) we will test it.

  I could go on and on about the groups that I've seen and worked
close to that claimed to have "great productivity".  A few did.
But most merely turned out code fast.  And that code was full of
headaches for a long time to come.

  Enough for now.

  ...Paul
-- 
Paul Stachour          SCTC, 1210 W. County Rd E, Suite 100           
stachour@sctc.com          Arden Hills, MN  55112
                             [1]-(612) 482-7467

adam@visix.com (03/19/91)

[jordan]I hate the C'ness of C++, but I find myself implementing many things in
[jordan]C++ just because of inheritance and dynamic binding.  If Ada is ever
[jordan]to become mainstream (and I seriously hope it does) inheritance and
[jordan]dyn. binding had better be incorporated into the language.

[jls]	Enlighten me. How is it that many of the largest software systems
[jls]	ever attempted--including all of the flight control software for
[jls]	North America and all the software for the Space Station--are being
[jls]	written in Ada, even though Ada doesn't have "dynamic binding"?

Surely we all know that there is no simple reason why languages
succeed or fail.  The presence or absence of a single feature almost
never makes or breaks a language.  You can't
even say success depends entirely on technical factors; you must also
consider politics, history, and plain dumb luck.

[jls]	Second question: assume Ada got dynamic binding tomorrow. What could
[jls]	be done with it that can't be done with it today?

No real-world language is so deficient that things "can't be done".
Some people feel that their problems are more easily solved using a
language with dynamic binding, so they see Ada as less convenient.
This is a very different question from the one you raise.

That's all I want to say.

Adam

jls@rutabaga.Rational.COM (Jim Showalter) (03/19/91)

>That the USAF people should have derided the 10 LOC estimate as _overly 
>optimistic_, preferring an absurdly low number of 3 LOC, sends me a
>message that the government (or at least some people in it) are afraid
>of Ada and/or not really interested in productivity.

Indeed, in many cases productivity is unwelcome because it reduces the
amount of money a contractor can charge the government. See what happens
when people try to end-run free-market economics?

By the way, I did some checking around after posting my reply to Mr.
"I Live In The Real World" Holden, and our technical consultants are
averaging about 150KSLOC/year of debugged, fielded, reasonably well
documented Ada code (of course, we're not writing under 2167A).

Numbers like these really aren't all that remarkable. What IS remarkable
is that the industry standard is two orders of magnitude lower.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

jls@rutabaga.Rational.COM (Jim Showalter) (03/19/91)

>You really ought to look deeper into things before posting an inflammatory
>article based on a one column article in Federal Computer Week (or
>wherever). 

You don't understand--this is Ted "I Live in the Real World" Holden, who
has never let his brute ignorance stand in the way of his expressing a
groundless opinion.

P.S. Ted believes the earth is 600 years old.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

jls@rutabaga.Rational.COM (Jim Showalter) (03/19/91)

We should measure functionality/day, not SLOC/day. This obsession with
SLOC hinders reuse, since reusing code makes the SLOC numbers look lower.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

klimas@iccgcc.decnet.ab.com (03/20/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu>, rreid@ecst.csuchico.edu (Ralph Reid III) writes:
> In article <1991Mar16.205228.4268@grebyn.com> ted@grebyn.UUCP (Ted Holden) writes:
>> . . .
>>from the Feb. 11 issue of Federal Computer Week:
>>
>>   "The GSA Board of Contract appeals issued a ruling last month that
>>   could affect how the military evaluates the cost effectiveness of Ada
>>   software.
>>
>>   "The board upheld a decision by the Air Force to award a contract to
>>    a high-priced bidder based on a measure of productivity that equals
>>    three lines of Ada code per programmer per day.
>>
>>    "A lower priced bidder, and others in the Ada community, said this
>>    standard is much too low.  The protester in the case, DynaLantic
>>    Corp, offered an average of ten lines of code per day per
>>    programmer.
>>
>>    "Three lines of code per day is absurd [as if ten wasn't], said
>>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>>    witness for the protester.....
>> . . .
> 
> 
> I don't know where these companies are digging up these kinds of
> unproductive machine operators (I hesitate to call them real
> programmers), but they would never get through the computer science
> program here at Chico State.  It kind of makes me wonder what schools
> they came from, if they even have degrees.  The kind of productivity
> discussed in this article sounds like the level I might expect from
> beginning programming students at a junior college.  I would like to
> know what in this world could reduce a serious programmer's
> productivity to these levels.
        Ten lines of code per man-day is quite believable in a corporate
	environment!  $50-$75/line of code is typical even in non-military
	applications (e.g. Mentor Graphics' C++-based RELEASE 8.0 supposedly
	has about a million lines of code and cost the company $75 million).

	I believe the difference in code quality, testing, documentation
	and the usual intergalactic corporate overhead are the problems.
	On the other hand I'm not so sure I'd want shareware in a cruise
	missile either!

> 
> -- 
> Ralph.  SAAC member.
> ARS: N6BNO
> Compuserve: 72250.3521@compuserve.com
> email: rreid@cscihp.ecst.csuchico.edu

klimas@iccgcc.decnet.ab.com (03/20/91)

> |> Language	Inheritance	DynamicBnd'g	Type[T]	Except.	Task 	LibMgr
> |> Smalltalk	yes-SI&MI/class	yes-tlu		no	no	yes	yes

1.) Smalltalk does support multiple inheritance.  The framework is still in
place in Smalltalk/V in class Behavior.  There are third-party add-on
products for MI on Smalltalk/V.  There are a number of simple "mixin"
examples that one can cite as arguments for MI, but the real problem
seems to be that the added conceptual complexity of MI makes real-world
applications using it difficult to manage.  A. Borning and D. Ingalls
documented this in "Multiple Inheritance in Smalltalk-80", Proc. of
National Conf. on AI, 1982.

2.)There are several library management tools for Smalltalk.  They
range in price, functionality and utility all over the spectrum.

Low end 

	Carleton APP Manager- part of Digitalk Goodies

medium 
	Instantiations App Manager for ST-80
	Zuniq App Manager for Digitalk ST and 
	Coopers&Lybrand Softpert Application Manager for Digitalk

high end
	Object Technology International ENVY Manager for multiuser 
	online Smalltalk CMS and library management.

One of these tools will do the job depending upon your Smalltalk library 
management needs.

3.) Some very impressive things are happening with Digitalk's Smalltalk/V-PM
running on OS/2 that warrant qualifying Smalltalk as supporting true
preemptive, multithreaded multitasking.

jbuck@galileo.berkeley.edu (Joe Buck) (03/20/91)

In article <jls.669170821@rutabaga>, jls@rutabaga.Rational.COM (Jim Showalter) writes:
> >I hate the C'ness of C++, but I find myself implementing many things in
> >C++ just because of inheritance and dynamic binding.  If Ada is ever
> >to become mainstream (and I seriously hope it does) inheritance and
> >dyn. binding had better be incorporated into the language.
> 
> Enlighten me. How is it that many of the largest software systems
> ever attempted--including all of the flight control software for
> North America and all the software for the Space Station--are being
> written in Ada, even though Ada doesn't have "dynamic binding"?

Most of us learned about "Turing equivalence" in school.  You can write
any program in assembly language.  The software for the Space Station
(which may never be launched anyway) is being written in Ada because of
government mandate, not necessarily because Ada is the best language for
the job.

> Second question: assume Ada got dynamic binding tomorrow. What could
> be done with it that can't be done with it today?

Absolutely nothing (you can do object-oriented programming in assembly
language).  However, programmer productivity would increase when solving
large problems if their language directly supported the concepts that
they are using to solve the problem, and the total life-cycle cost of
the software would decrease.
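
(The flavor of the difference, as an invented sketch: dynamic binding can
be hand-rolled with function pointers in C-style code, but then the
programmer, not the compiler, builds and maintains the dispatch table.)

#include <cstdio>

// The hand-rolled "vtable": nothing stops a wrong or missing entry.
struct ShapeOps { double (*area)(void* self); };

struct Circle { ShapeOps* ops; double r; };

double circle_area(void* self) {
    Circle* c = (Circle*)self;
    return 3.14159265 * c->r * c->r;
}

int main() {
    ShapeOps circle_ops = { circle_area };   // wired up manually
    Circle c = { &circle_ops, 2.0 };
    std::printf("%f\n", c.ops->area(&c));    // the explicit form of c.area()
}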

--
Joe Buck
jbuck@galileo.berkeley.edu	 {uunet,ucbvax}!galileo.berkeley.edu!jbuck	

jls@rutabaga.Rational.COM (Jim Showalter) (03/20/91)

Nice post! To your list of things that reduce productivity, I'd add:

-- Requirements never change in midstream

-- I'm not required to conform to 2167/A

In my experience--which seems to mirror yours--often the claims made
for SLOC/day turn out to be not so much for debugged, documented,
tested, fielded code, but for error-ridden trash that winds up as
shelfware.

In short, most SLOC/day rates are really just a measure of some hacker's
TYPING SPEED.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

bacon@ucunix.san.uc.edu (Edward M. Bacon) (03/21/91)

In article <jls.669170444@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
>Semantics. An object is defined as something with state that suffers
>actions. Thus, a Boolean global variable is an object, and two seconds
>of reflection will tell you that just about anything qualifies as an
>object. Ada does a fine job of representing objects, thus, it is
>object-oriented.

Five years ago, when I first started using Ada and learning about OOD, 
it rapidly became obvious that there were two distinct uses for the word
"Object": 1) the Object-Oriented camp's and 2) the Ada LRM's.  It seemed as if at some
time in the past the two diverged, never to meet again.  Neither camp seems 
willing to change, so we poor grunts have to tell them apart by context.  
I don't know what the history of this is, and I don't know which has the 
more valid copyright on the word, but I think if you held a vote today 
the O-O camp would win by a landslide.

ted@grebyn.com (Ted Holden) (03/21/91)

In article <jls.669262321@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:

>1) If those Pascal, C, or C++ programmers were required to operate under
>   DoD-Std-2167/A standards, their productivity would drop by two orders
>   of magnitude automatically. This is not a language issue.

Agreed.  The language issue (Ada) is simply another symptom of the same
disease (terminal stupidity) as the other symptom (2167/A).

>2) SLOC/day tends to decrease drastically as a function of complexity.
>   Complexity depends on a number of factors, including sheer size,
>   number of programmers, number of configurations, number of targets,
>   number of hosts, number of languages, number of contractors, etc etc
>   etc.

Object oriented languages are a partial solution to the problem;  the
best answer American computer science has yet devised.  Pinson and
Wiener's book on C++ and OOP describes means of using these techniques to
eliminate complexity and vastly simplify maintenance of programs.  Too
bad Ada doesn't have these features.  Ada simply adds (greatly) to the
complexity which programmers must deal with, and offers no paybacks or
quid-pro-quo for the added complexity.

>   I've been meaning to ask Mr. "I Live In The Real World" Holden this
>   question for two years: how complex are the systems on which Mr. Holden
>   works? 

One such is the popular VMUSIC multipart musical routine for PCs
(formerly thought to be impossible), available on BBSs.  If programmed
in Ada, it would sound like two or three dogs growling at each other,
that is, if it could be programmed in Ada.  I doubt it.

>   If the answer is that he's working with three other guys in a garage
>   writing device drivers for PC's, I'm sorry, but I'm really not very
>   impressed--one should EXPECT high SLOC/day rates for what is, in essence,
>   a solved problem (i.e., programming in the small). It is programming in
>   the large that is still a black art for most companies, and it is on
>   such projects that low productivity rates are experienced. That Ada
>   tends to be the language of choice on such projects should not be used
>   against it when the rates are low--the REASON Ada is the language of
>   choice is that other languages, including COBOL, FORTRAN, and C, are
>   simply not up to the job.

Bullshit.  Unix is written in C, as are WordPerfect, Ami, and most modern
software.  The bulk of American scientific software is in Fortran.  The
bulk of American business software is in Cobol, and there are even
object-oriented versions of Cobol out now.  That makes Ada more of a
dinosaur than Cobol.

>3) I have personally contracted on a 4,000,000 SLOC C/C++ project
>   that was lucky to achieve 3 lines a day, on a GOOD day. The programmers
>   had not, as Mr. Holden claims, been shot in the head--they were just
>   suffering the same liabilities as anybody else who is trying to build
>   extremely large systems using a stone-knives-and-bearskins kind of
>   technology and paradigm.

You need to trade the stone knives in for modern software tools.  A
recent article in Dr. Dobb's Journal (about Smalltalk) provided a stark
contrast with the situation you describe; it mentioned situations in
which productivity equivalent to programmers putting out 200,000 lines
of code a day could be achieved, and claimed that such will be needed if
today's software problems are to be solved.

>4) I am able to program in Pascal, C, C++, and Ada. Can Mr. Holden make
>   the same claim, or does he damn Ada from, as I suspect is the case,
>   a position of relative ignorance? He certainly SOUNDS ignorant.

I damn Ada from the various horror stories I read and hear regarding it.
I have managed to avoid it in my personal life, other than having to
write interfaces between it and low-level file-handling routines written
in C.  Doing that, I personally watched an Ada compiler take 25 minutes
to compile a 30 line program into a 600K byte executable;  I never saw a
C compiler do that.  
 
>>The really comical thing about this is the way in which Ada gurus cite
>>"productivity" as the main advantage of Ada.

>Productivity rates range from no gain to order of magnitude gains. We
>have lots of success stories backed up by factual accounting data if
>Mr. Holden would care to read them.

I found the tales on the Adawoe BBS to be more of an indication of the
real effects of Ada than the kind of bullshit you're describing.


Ted Holden

jls@rutabaga.Rational.COM (Jim Showalter) (03/21/91)

I think multiple inheritance is a solution in search of a problem.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

jls@rutabaga.Rational.COM (Jim Showalter) (03/21/91)

>The software for the Space Station
>is being written in Ada because of
>government mandate, not necessarily because Ada is the best language for
>the job.

People always get hung up on the mandate thing, but it is instructive
to consider WHY there is a mandate in the FIRST place. The reason?:
because the DoD determined that the tower of Babel of languages they
were suffering with (which included just oodles of C and FORTRAN) was
not doing the job. Ada is precisely the best language for the job because
all others were found--through empirical testing--to be woefully inadequate.

In short, C and FORTRAN and COBOL and LISP and Smalltalk-80 and all the
others had their chance, and blew it.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

Sid Kitchel <kitchel@iuvax.cs.indiana.edu> (03/21/91)

jls@rutabaga.Rational.COM (Jim Showalter) writes:
|->People always get hung up on the mandate thing, but it is instructive
|->to consider WHY there is a mandate in the FIRST place. The reason?:
|->because the DoD determined that the tower of Babel of languages they
|->were suffering with (which included just oodles of C and FORTRAN) was
|->not doing the job. Ada is precisely the best language for the job because
|->all others were found--through empirical testing--to be woefully inadequate.

|->In short, C and FORTRAN and COBOL and LISP and Smalltalk-80 and all the
|->others had their chance, and blew it.

	Whoa!!! Let's interject a little programming language history here.
Smalltalk-80 prima facie should not be in your list because the problems arose
in the 60s and 70s. The languages that caused the most problems in the DOD's
Tower of Babel were the multitude of assembly languages used for speed and
compactness. (Way back in them dark ages, deployed computers were often
equipped with 32K or less of RAM.) To a lesser extent, languages such as
JOVIAL and CMS-2 were also causing problems.
	Most of us academic types (even when we are ex-military) doubt 
that Ada is the solution to the original problem or any problem.

						Former naval person,
						(reformed CMS-2 user and
						 OODBS developer)
						   --Sid
-- 
Sid Kitchel...............WARNING: allergic to smileys and hearts....
Computer Science Dept.                         kitchel@cs.indiana.edu
Indiana University                              kitchel@iubacs.BITNET
Bloomington, Indiana  47405-4101........................(812)855-9226

tom@ssd.csd.harris.com (Tom Horsley) (03/21/91)

>>>>> Regarding Re: ada-c++ productivity; jls@rutabaga.Rational.COM (Jim Showalter) adds:

jls> I think multiple inheritance is a solution in search of a problem.

So far my (limited) personal experience with C++ tends to make me agree with
this. Lots of times I have been working on something and thought, "Yeah,
multiple inheritance might be just the way to go with this". Then I would
continue work, and pretty soon I would have a solution, and MI never came
up. It's like it sounds good, but I just never find a use for it.

Perhaps the day will come when it does turn out to be just what I need. I
still think it sounds good, I just haven't needed it so far...
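
For concreteness, here is the sort of minimal sketch (C++, with names
entirely of my own invention) that MI advocates usually show--one class
satisfying two independent interfaces at once:

    #include <iostream>

    // Two independent capabilities, each an abstract interface.
    class Printable {
    public:
        virtual void print() const = 0;
        virtual ~Printable() {}
    };

    class Storable {
    public:
        virtual void store() const = 0;
        virtual ~Storable() {}
    };

    // Multiple inheritance: one class fulfilling both contracts.
    class Record : public Printable, public Storable {
    public:
        void print() const { std::cout << "Record::print\n"; }
        void store() const { std::cout << "Record::store\n"; }
    };

    int main() {
        Record r;
        r.print();   // usable wherever a Printable is expected
        r.store();   // usable wherever a Storable is expected
        return 0;
    }

Whether that buys anything over a single-inheritance design plus plain
composition is, of course, exactly what I keep failing to find.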
--
======================================================================
domain: tahorsley@csd.harris.com       USMail: Tom Horsley
  uucp: ...!uunet!hcx1!tahorsley               511 Kingbird Circle
                                               Delray Beach, FL  33444
+==== Censorship is the only form of Obscenity ======================+
|     (Wait, I forgot government tobacco subsidies...)               |
+====================================================================+

andyk@kermit.UUCP (Andy Klapper) (03/21/91)

In article <1991Mar18.072252.23378@eua.ericsson.se> euamts@eua.ericsson.se (Mats Henricson) writes:
>csq031@umaxc.weeg.uiowa.edu writes:
>
>>Lines of code per day is an absurd measure at best.  Using it in contracts


For sheer power of expression per line of code I think APL wins hands down!
I, personally, would not write anything over 200 lines in APL because of the
complexity involved.  I guess the point is, LINES OF CODE PER DAY DOES NOT
MEASURE THE PRODUCTIVITY OF A PROGRAMMER, OR HOW APPROPRIATE A LANGUAGE IS FOR
DOING A PARTICULAR JOB.

Use the right tool for the job!


-- 
The Stepstone Corporation                    Andy Klapper
75 Glen Rd.                                  andyk@stepstone.com
Sandy Hook, CT 06482                         uunet!stpstn!andyk
(203) 426-1875    fax (203)270-0106

brad@terminus.umd.edu (Brad Balfour) (03/22/91)

In article <12159@pasteur.Berkeley.EDU> jbuck@galileo.berkeley.edu (Joe Buck) writes:
>The software for the Space Station
>(which may never be launched anyway) is being written in Ada because of
>government mandate, not necessarily because Ada is the best language for
>the job.
>
>--
>Joe Buck
>jbuck@galileo.berkeley.edu	 {uunet,ucbvax}!galileo.berkeley.edu!jbuck

In the interest of accuracy, I think that it is important to note that
NASA and the Space Station project were under no mandate to use a particular
language for the Space Station software. In fact, a series of detailed
evaluations were made several years ago and it was the decision of the
Space Station Program Office that Ada was the best choice for that project.

NASA was not forced into using Ada, they chose to use Ada. Similarly,
the FAA was not forced to use Ada on its Advanced Automation System, but
chose to use Ada.

Brad Balfour
EVB Software Engineering, Inc.
brad@terminus.umd.edu

sabbagh@acf5.NYU.EDU (sabbagh) (03/22/91)

>People always get hung up on the mandate thing, but it is instructive
>to consider WHY there is a mandate in the FIRST place. The reason?:
>because the DoD determined that the tower of Babel of languages they
>were suffering with (which included just oodles of C and FORTRAN) was
>not doing the job. Ada is precisely the best language for the job because
>all others were found--through empirical testing--to be woefully inadequate.

>In short, C and FORTRAN and COBOL and LISP and Smalltalk-80 and all the
>others had their chance, and blew it.

Bzzzt.  Wrong, but thanks for playing.

The original intent of Ada was to replace the "tower of Babel" of languages
that was present in EMBEDDED SYSTEMS PROGRAMMING. This family includes
languages like Forth, JOVIAL, and a number of others.  Embedded systems
are those found in missiles, submarines, airplanes, etc., where the
computers are used to control sensors, actuators, etc.

C, Fortran, COBOL, Lisp, Smalltalk-80 never "had their chance and blew it".
In fact, DoD looked at the scientific (Fortran) and business (COBOL) worlds
and asked: "why can't we standardize on embedded systems programming?".
The "mandate" applies only to contractors working on software to be embedded
in equipment, and I have heard that DoD has stepped back from even this
position.


Hadil G. Sabbagh
E-mail:		sabbagh@cs.nyu.edu
Voice:		(212) 998-3125
Snail:		Courant Institute of Math. Sci.
		251 Mercer St.
		New York,NY 10012

"Injustice anywhere is a threat to justice everywhere."
					- Martin Luther King, Jr.
Disclaimer: This is not a disclaimer.

jimad@microsoft.UUCP (Jim ADCOCK) (03/22/91)

In article <4921@ns-mx.uiowa.edu> csq031@umaxc.weeg.uiowa.edu () writes:

|Lines of code per day is an absurd measure at best.  Using it in contracts
|is just a way of lulling people who care about such things into thinking
|that programmers (and software companies) know what they're doing and
|that their output is quantifiable.

I used to work for a hardware company [can you say "software-IC"]
where many of the managers thought measuring lines of code per day
was a good idea.  I suggested that they measure the productivity
of their hardware engineers by measuring the number of IC's they 
designed into the digital circuits per day.  "What do you mean",
the managers responded in horror, "we pay our hardware designers to get the
job done with the least amount of ICs possible!"  

They never did figure out the analogy.

Maybe LOC managers could be taught to pay their software engineers
on a "bugs per day" basis -- at least such would make it clear that they're
getting what they pay for.

jls@rutabaga.Rational.COM (Jim Showalter) (03/22/91)

Accurate terminology is important. Focussing on objects blurs the
distinctions between "pure" OO languages and languages like Ada,
because by the standard definition of an object, any language that
supports a Boolean variable is object-oriented (an object, after
all, is just something that has state and suffers actions that affect
that state--a Boolean variable certainly satisfies this simple set
of criteria). A much more accurate distinction is that between
INHERITANCE-oriented languages and non-inheritance oriented languages.
An inheritance-oriented language supports the notion of classes, instances
of classes, subclasses, inheritance of operations, local overriding of
inherited operations, etc. THAT is the fundamental distinction that
should be emphasized, and that is why I'd like to retire the term
object-oriented as largely irrelevant and/or misleading.
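
To make the checklist concrete, here is a minimal C++ illustration
(the names are purely illustrative) of classes, instances, subclasses,
inherited operations, and local overriding:

    #include <iostream>

    class Shape {                         // a class
    public:
        virtual void draw() const { std::cout << "generic shape\n"; }
        virtual ~Shape() {}
    };

    class Circle : public Shape {         // a subclass
    public:
        void draw() const { std::cout << "circle\n"; }  // local override
    };

    int main() {
        Circle c;                         // an instance of a class
        Shape& s = c;
        s.draw();                         // prints "circle": the subclass's
                                          // overriding operation is selected
        return 0;
    }

A language with only Boolean variables and records gives you none of
this, which is the whole point of the distinction.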
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

jls@rutabaga.Rational.COM (Jim Showalter) (03/22/91)

>Agreed.  The language issue (Ada) is simply another symptom of the same
>disease (terminal stupidity) as the other symptom (2167/A).

This is a contentless argument, since you provide nothing to back up
the claim that Ada is the result of stupidity. Your uninformed biases
don't constitute evidence.

>Object oriented languages are a partial solution to the problem;  the
>best answer American computer science has yet devised.

Actually, this is complete bullshit. Stroustrup himself says that C++
addresses 5% of the actual problem, the other 95% consisting of problems
in design, architecture, and process. Computer science (actually, more
properly, software engineering) has developed some methods for addressing
this other 95% of the problem.

>Weiner's book on C++ and OOP describe means of using these techniques to
>eliminate complexity and vastly simplify maintenance of programs.  Too
>bad Ada doesn't have these features.  Ada simply adds (greatly) to the
>complexity which programmers must deal with, and offers no paybacks or
>quid-pro-quo for the added complexity.

Again, contentless argument. In what way does Ada add to the complexity?
Ada introduces strong typing, separation of specification and implementation,
separate compilation, algorithmic and metatype parameterization (via
generics), opaque types, and exceptions. These features were the result
of years of first-rate research into ways to REDUCE the complexity of
programs. Oddly enough, these SAME features constitute 80% of the difference
between C++ and C.
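
For the record, here is what two of those features look like in
practice--separation of specification from implementation, plus an
opaque type--sketched in C++ terms, since C++ inherits them as well
(the Counter class and file names are purely illustrative):

    // counter.h -- the specification; clients see only the operations
    class Counter {
    public:
        Counter();
        void increment();
        int  value() const;
    private:
        int count;               // representation hidden from clients
    };

    // counter.cpp -- the implementation, compiled separately; it can
    // change without any client code being rewritten
    #include "counter.h"

    Counter::Counter() : count(0) {}
    void Counter::increment() { ++count; }
    int  Counter::value() const { return count; }

In Ada the same split is the package specification versus the package
body, with the representation tucked into the spec's private part.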

>One such is the popular VMUSIC multipart musical routine for PCs
>(formerly thought to be impossible), available on BBSs.

You didn't answer my question (why am I not surprised?). What I asked
was: "How complex is the stuff you work on?". Your response here tells
me nothing. I want to know:

a) How many SLOC is it?
b) How many people worked on it?
c) How many development platforms did the team work on?
d) How many configurations does it support?
e) How many targets does it run on?
f) How many separate processes does it contain?
g) What is the method of inter-process communication?
h) What documentation standard was it developed under?
i) How many subcontractors were involved?
j) How many years did it take?

On first blush, something called a "routine" seems more like a
two-guys-in-a-garage-typing-away project than something even
remotely as complex as the sorts of projects I'm impressed by.

>If programmed
>in Ada, it would sound like two or three dogs growling at each other,

Again, this is not supported by evidence. You merely reveal your
prejudice against Ada with no justification. This is generally called
"ignorance".

>that is, if it could be programmed in Ada.  I doubt it.

Are you aware of the concept of Turing equivalence? Anything can be
programmed in anything. If you find a counterexample, you get a Nobel.

>Bullshit. Unix is written in C,

Yeah, and that's one of the reasons IBM had to rewrite it from scratch
for the RS6000. I used to work in a UNIX shop (we made 32-bit superminis
[didn't everybody, once upon a time?]). Looking at the guts of the kernel
was a VERY scary experience. One guy, describing UNIX, paraphrased the
description of a helicopter: "A bunch of spare parts flying in loose
formation". Every time I execute a UNIX command and my system doesn't
crash, I'm amazed.

>WordPerfect, Ami, and most modern
>software.

"Most" seems rather excessive, since you then go on to say:

>The bulk of American scientific software is in Fortran.  The
>bulk of American business software is in Cobol,

Which is it? After all, COBOL represents 65% of ALL software in the
world. (Incidentally, COBOL and FORTRAN are acronyms, so one typically
writes them in all uppercase.)

>I damn Ada from the various horror stories I read and hear regarding it.

Uh huh. So the short answer is that you don't know how to write in Ada.
I figured as much. You say you've read various horror stories. Interesting:
have you ever bothered to read success stories--there are many of those,
you know. Or would reading that Ada had saved some sites millions and
millions of dollars be too inconvenient for your worldview?

Most people, when they don't know enough about something to form an
opinion based on fact, have enough sense to keep their mouth shut. You,
on the other hand, don't let your nearly complete ignorance of Ada keep
you from shooting off your mouth about it.

>I have managed to avoid it in my personal life, other than having to
>write interfaces between it and low-level file-handling routines written
>in C.  Doing that, I personally watched an Ada compiler take 25 minutes
>to compile a 30 line program into a 600K byte executable;  I never saw a
>C compiler do that.  

So you had a crappy compiler. Does that make the LANGUAGE bad? Try to
keep in mind that Ada pushes the state of the art of compiler technology
and that early compilers ate dust bunnies. Compiler technology has 
continued to improve, and today you can buy Ada compilers that produce
code every bit as tight as the best C compilers.

Furthermore, Rational sells compilers that perform incremental
compilation, and for many classes of change--even to specs (even
to the implementation of the private types in specs)--we can
compile in the change in a matter of seconds. Indeed, our effective
batch compilation rate often exceeds 100KSLOC/minute. I've never seen
a C compiler do THAT.

>>Productivity rates range from no gain to order of magnitude gains. We
>>have lots of success stories backed up by factual accounting data if
>>Mr. Holden would care to read them.
>I found the tales on the Adawoe BBS to be more of an indication of the
>real effects of Ada than the kind of bullshit you're describing.

So anecdotal evidence is better, in your eyes, than the balance sheets
of major corporations? So far you've proven to my satisfaction that
you'd make a lousy Ada programmer or software engineer, but this last
comment of yours demonstrates, in addition, that you'd make a lousy
scientist or accountant. What DO you do for a living, anyway?
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

amanda@visix.com (Amanda Walker) (03/22/91)

My own feeling about this whole issue is that the actual choice of language
is pretty much a red herring.  My own experience has led me to the conclusion
that the two factors which most strongly influence software engineering
productivity are the self-discipline of the programmers involved and the
structure of their surrounding organization.

Ada itself is neither good nor bad, merely unwieldy.  The same can be
said for C++.  However, given equivalently good compilers and
programming environments, the choice of language is irrelevant.
Productivity (for which I find SLOC/day to be a metric of dubious
utility at best) depends almost entirely upon the people doing the
programming.

From a pragmatic point of view, however, there are a lot more (and a
lot better) programming tools for supporting C and C++ than there are
for Ada (likewise with programmer experience).  For projects in which
the development budget is a valid concern, they are quite often more
appropriate for the job than Ada would be.  On the other hand, if the
DoD is paying you huge amounts of money to develop software for them,
Ada may well be a better choice.  The reasons remain pragmatic, not
based on inherent advantages of one language or the other.

--
Amanda Walker						      amanda@visix.com
Visix Software Inc.					...!uunet!visix!amanda
-- 
X Windows: It could be worse, but it'll take time...

jjs@sei.cmu.edu (Jeffrey Stewart) (03/22/91)

Well heck, Ted.  I didn't want to get into a language war, but since you
started it...

First, let me say that I've done projects in both C and Ada, and have been
formally trained in both languages.

Much has been made of C's concise syntax--that, in fact, it is a great
attribute of the language.  However, this conciseness has led to horribly
complex semantics, which I feel is one of C's greatest drawbacks.  Also,
much of what was left out of C's capabilities was put back in as the
"standard libraries".  I'll say more about those later.  A C culture has
developed around the complex semantics ("anyone who doesn't know C shouldn't
be programming in it").  In fact, rather than being used to judge the
language (it probably shouldn't have these problems), knowledge of the
complex semantics, and the required workarounds, is considered a
prerequisite of being a competent C programmer.

Ada, however, has a far more uniform set of semantics, which made learning
the language a joy for me, because once I learned something, I found that
I could apply it equally in many different areas, and it would always work
the way I thought (i.e., no "side-effects").  

Now on to the "standard libraries".  Part of the size of the Ada syntax is
that much of what is in the C libraries is built into the Ada language
itself.  We can argue about whether that was a valid design decision, but I
think it was, given the design goal of portability, and the testing required
to ensure that a compiler achieves that goal.  (I'll talk about testing in a
moment, too.)

I wish my brother were on the net.  He also knows both Ada and C, is forced
to work in C, and wishes he could work in Ada.  His Engineer's Degree
(halfway between a Masters and a Doctorate) thesis required development of
a plasma simulation, in C, to be ported from a VAX to a Cray and then to a
PC.  The horror stories about the, um, loose interpretation of whatever
passes as a spec for the C language are too numerous to detail here.  Some
of those problems had to do with the syntax and semantics, but many more had
to do with those "standard" libraries, which are anything but standard.

My brother is now in the commercial world, and is tasked with porting a
300KLOC C application to "any platform he can".  This has given him ample
opportunity to discover what a mess the C world really is.  He's keeping
notes, and may write a master's thesis about it when he resumes work on his
software engineering degree. 

I know that the C community realizes what a mess it's in, because there is
now an effort to create a validation suite for C compilation systems built
to the ANSI standard.  Of course, the Ada community anticipated this need,
and has been doing compiler validations all along.

Finally, let me quote P.J. Plauger, head of the ANSI C standardization
committee, who noted "Beyond 100,000 lines of code, you should probably be
coding in Ada."

Couldn't have said it better myself.


 

ark@alice.att.com (Andrew Koenig) (03/22/91)

In article <jls.669525137@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:

> I think multiple inheritance is a solution in search of a problem.

I think computers are a solution in search of a problem.
-- 
				--Andrew Koenig
				  ark@europa.att.com

jls@rutabaga.Rational.COM (Jim Showalter) (03/24/91)

>Ada itself is neither good nor bad, merely unwieldy.  The same can be
>said for C++.

In what way are these languages, which provide strong typing, separation
of specification and implementation, opaque types, genericity, and
exception handling "unwieldy"? I would counter that a language that
permits functions accepting an indeterminate number of arguments of
arbitrary type--such as C--is considerably MORE unwieldy. It is CERTAINLY
more dangerous.
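
To see the danger, consider this minimal sketch (the mismatch below is
deliberate, and the language gives the compiler no general obligation
to catch it):

    #include <cstdio>

    int main() {
        // printf() takes an indeterminate number of arguments of
        // arbitrary type.  The call below compiles--arguments cannot,
        // in general, be checked against the format string--yet its
        // behavior is undefined: %s expects a char*, not an int.
        printf("%s\n", 42);
        return 0;
    }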

>From a pragmatic point of view, however, there are a lot more (and a
>lot better) programming tools for supporting C and C++ than there are
>for Ada

I would agree with "a lot more", but not with "a lot better". We provide
tools for Ada development that scale to the largest software projects
ever attempted, with integrated CM, full symbolic everything, incremental
compilation, dynamic type binding (yes, even in a statically-bound language),
universal host capabilities, and some of the tightest back-end code generated
anywhere. Those familiar with our system consistently deem it best of show
among all software development environments.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

chip@tct.uucp (Chip Salzenberg) (03/25/91)

The C standard library does not cover all OS features.  Portability from
one C implementation to another involves not only language compatibility,
but OS compatibility.  Horror stories about C porting have ZERO relevance
to the language unless it is demonstrated that the standard C libraries,
as opposed to the BSD/SysV/MS-DOS/etc. libraries, were incompatible.
-- 
Chip Salzenberg at Teltronics/TCT     <chip@tct.uucp>, <uunet!pdn!tct!chip>
   "All this is conjecture of course, since I *only* post in the nude.
    Nothing comes between me and my t.b.  Nothing."   -- Bill Coderre

diamond@jit345.swstokyo.dec.com (Norman Diamond) (03/25/91)

In article <1991Mar21.024445.8746@grebyn.com> ted@grebyn.UUCP (Ted Holden) writes:

>Object oriented languages are a partial solution to the problem;  the
>best answer American computer science has yet devised.
             -------------------------
Never mind that European computer science devised it, eh?
--
Norman Diamond       diamond@tkov50.enet.dec.com
If this were the company's opinion, I wouldn't be allowed to post it.

sdl@lyra.mitre.org (Steven D. Litvinchouk) (03/25/91)

In article <1991Mar15.224626.27077@aero.org> jordan@aero.org (Larry M. Jordan) writes:

> I hate the C'ness of C++, but I find myself implementing many things in
> C++ just because of inheritance and dynamic binding.  If Ada is ever
> to become mainstream (and I seriously hope it does) inheritance and
> dyn. binding had better be incorporated into the language.

As I understand it, single inheritance with constrained polymorphism
will probably be incorporated in Ada 9X.  The Ada community is now
aware that the old argument "Use Ada--it sure beats Fortran!" is
wearing thin.


--
Steven Litvintchouk
MITRE Corporation
Burlington Road
Bedford, MA  01730
(617)271-7753
ARPA:  sdl@mbunix.mitre.org
UUCP:  ...{att,decvax,genrad,necntc,ll-xn,philabs,utzoo}!linus!sdl
	"Where does he get those wonderful toys?"

sdl@lyra.mitre.org (Steven D. Litvinchouk) (03/25/91)

In article <1991Mar20.201633.15564@ucunix.san.uc.edu> bacon@ucunix.san.uc.edu (Edward M. Bacon) writes:

> Five years ago, when I first started using Ada and learning about OOD, 
> it rapidly became obvious there were two distinct uses for the word "Object",
> 1) the Object-Oriented camp and 2) the Ada LRM.  It seemed as if at some 
> time in the past the two diverged, never to meet again.  Neither camp seems 
> willing to change, so we poor grunts have to tell them apart by context.  
> I don't know what the history of this is....

The use (or misuse) of the term "object-oriented" in the Ada community
traces back, I believe, to their *misunderstanding* of Grady Booch's
original textbook, "Software Engineering with Ada."  Grady Booch was
advocating object-oriented design, and showing how these designs could
be done in Ada.  Booch did *not* say that Ada was an "object-oriented
programming language" in the Smalltalk or C++ sense.  Nevertheless,
some others in the Ada community seem to have misunderstood this
distinction.


--
Steven Litvintchouk
MITRE Corporation
Burlington Road
Bedford, MA  01730
(617)271-7753
ARPA:  sdl@mbunix.mitre.org
UUCP:  ...{att,decvax,genrad,necntc,ll-xn,philabs,utzoo}!linus!sdl
	"Where does he get those wonderful toys?"

sdl@lyra.mitre.org (Steven D. Litvinchouk) (03/25/91)

In article <jls.669589592@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:

> INHERITANCE-oriented languages and non-inheritance oriented languages.
> An inheritance-oriented language supports the notion of classes, instances
> of classes, subclasses, inheritance of operations, local overriding of
> inherited operations, etc. THAT is the fundamental distinction that
> should be emphasized, and that is why I'd like to retire the term
> object-oriented as largely irrelevant and/or misleading.

This raises an issue I would like to see discussed more often.
Namely, does a language need to support modules or packages in
addition to classes, in order to be considered truly
"software-engineering oriented" as well as "inheritance-oriented?"
Eiffel supports classes, but not groupings of these class declarations
into packages or modules.  Classes are apparently the library units of
management as well as semantic declarations in Eiffel.  On the other
hand, Modula-3 considers classes just a kind of type declaration, and
you can have several such class declarations in a module.

Personally, I have never found classes alone to be a complete
substitute for packages or modules.  I frequently need to consider
several related classes as comprising a single concept, which must be
developed as a single software unit (module).  At least the
Smalltalk-80 environment supports class categories.
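
In C++ terms, the nearest thing is grouping several related classes
into one named unit via a namespace--a sketch, with names of my own
invention:

    // geometry.h -- one unit ("module") holding several related classes
    namespace geometry {

        class Point {
        public:
            Point(double x, double y) : x_(x), y_(y) {}
            double x() const { return x_; }
            double y() const { return y_; }
        private:
            double x_, y_;
        };

        class Segment {      // a second class, same unit, one concept
        public:
            Segment(const Point& a, const Point& b) : a_(a), b_(b) {}
            const Point& from() const { return a_; }
            const Point& to()   const { return b_; }
        private:
            Point a_, b_;
        };

    }  // namespace geometry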


--
Steven Litvintchouk
MITRE Corporation
Burlington Road
Bedford, MA  01730
(617)271-7753
ARPA:  sdl@mbunix.mitre.org
UUCP:  ...{att,decvax,genrad,necntc,ll-xn,philabs,utzoo}!linus!sdl
	"Where does he get those wonderful toys?"

barriost@gtephx.UUCP (Tim Barrios) (03/26/91)

first of all, this whole issue of productivity is missing the point. 
in the real world, new development is not the issue, software
maintenance is.  productivity has more to do with things like
maintainability, readability, and, yes, reusability than with how fast
some hacker can get yet-another sort/stack/queue/random number routine
developed.

try writing the same function/object/class/whatever in both C++ and
Ada and give it to groups of people and ask them to make a change to
its logic (eg, add a message/operation/function) and see which
language is easier to change.  anyone who really knows both languages
(ie, not Ted Holden, see below) would agree that in general, Ada is
probably more maintainable.

yes, i know, good and bad programs can be written in any language.  
it's more of a cultural thing than a syntax issue.  C/C++ grew out of
the Unix/hacker culture whereas Ada has grown out of the software
engineering community.  as an example, take a look at the overall
readability of Ada code in Booch's Ada book vs Lippman's C++ book.  or,
browse through 'reuse' library routines in both languages and see
which are more understandable.

plus, in new development, implementing at a code level is such an
insignificant portion of the overall life-cycle that the productivity
of its generation is hardly the issue.  what are the issues (and i
think both sides of this discussion would agree) are things like
requirements to design to implementation decomposition (OO
decomposition).

In article <1991Mar21.024445.8746@grebyn.com>, ted@grebyn.com (Ted Holden) writes:
> In article <jls.669262321@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
> >   I've been meaning to ask Mr. "I Live In The Real World" Holden this
> >   question for two years: how complex are the systems on which Mr. Holden
> >   works? 
> 
> One such is the popular VMUSIC multipart musical routine for PCs
> (formerly thought to be impossible), available on BBSs.  If programmed
> in Ada, it would sound like two or three dogs growling at each other,
> that is, if it could be programmed in Ada.  I doubt it.

i don't think this would qualify as part of the large project
domain for which Ada was intended.  where i work, we develop
[commercial] systems that are on the order of 2+ million lines of code
with teams of hundreds of engineers.  for the past 12 or so years, we
have done this work in a Pascal variant which has several similarities
to Ada.  we have recently done some newer products in C and have had
less-than-positive experiences with them, especially in the
maintainability/understandability area.  let's face it, C is just
a step above assembly language and C++ has all the power of OOP but
with the same cryptic syntax as C.

> >4) I am able to program in Pascal, C, C++, and Ada. Can Mr. Holden make
> >   the same claim, or does he damn Ada from, as I suspect is the case,
> >   a position of relative ignorance? He certainly SOUNDS ignorant.
> 
> I damn Ada from the various horror stories I read and hear regarding it.
> I have managed to avoid it in my personal life, other than having to
> write interfaces between it and low-level file-handling routines written
> in C.  Doing that, I personally watched an Ada compiler take 25 minutes
> to compile a 30 line program into a 600K byte executable;  I never saw a
> C compiler do that.  

an expression from an old Steve Martin (that's right, the comedian)
skit comes to mind when i read Holden's comments: "criticize things
you don't know about".  i find this amazing that you blow so much how
air about something you know absolutely nothing about.  let's here
from the many people who have worked on real projects in both
languages.  as if he had any credibility to start with given his inept
arguments, Ted Holden's postings now have a whole new reason to be
ignored.

-- 
Tim Barrios, AG Communication Systems, Phoenix, AZ
UUCP: ...!{ncar!noao!asuvax | uunet!zardoz!hrc | att}!gtephx!barriost
Internet: gtephx!barriost@asuvax.eas.asu.edu
voice: (602) 582-7101        fax:   (602) 581-4022

westley@thunderchief.uucp (Terry J. Westley) (03/26/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu> rreid@ecst.csuchico.edu (Ralph Reid III) writes:
>In article <1991Mar16.205228.4268@grebyn.com> ted@grebyn.UUCP (Ted Holden) writes:
>>    "Three lines of code per day is absurd [as if ten wasn't], said
>>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>>    witness for the protester.....
>
>I would like to
>know what in this world could reduce a serious programmer's
>productivity to these levels.

non-tailored DOD-STD-2167A

-- 
Terry J. Westley 
Calspan Corporation, P.O. Box 400, Buffalo, NY 14225
westley%planck.uucp@acsu.buffalo.edu | "planck!hercules!westley"@acsu.buffalo.edu

jls@rutabaga.Rational.COM (Jim Showalter) (03/26/91)

>In fact, DoD looked at the scientific (Fortran) and business (COBOL) worlds
>and asked: "why can't we standardize on embedded systems programming?".
>The "mandate" applies only to contractors working on software to be embedded
>in equipment, and I have heard that DoD has stepped back from even this
>position.

To quote you verbatim: "Bzzzt.  Wrong, but thanks for playing."  The
mandate was not only recently STRENGTHENED, it was EXTENDED to apply
to non-embedded stuff (including MIS). Waivers are almost non-existent,
and getting rarer by the day. Why? Because Ada has proven itself over
and over again, and each new success story confirms the wisdom of the
original mandate.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

jls@rutabaga.Rational.COM (Jim Showalter) (03/26/91)

>Horror stories about C porting have ZERO relevance
>to the language

But they DO have relevance to discussions of the relative merits
of Ada vs C with respect to porting, which was the topic of the
original post.

C is separable from its standard libraries. That's the PROBLEM.
Ada comes complete with a standard set of predefined packages.
Note the term "standard". Ada written solely against these
predefined packages stands a very good chance of being portable
with ease across a wide variety of different platforms.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

chip@tct.uucp (Chip Salzenberg) (03/28/91)

[ Relevance to comp.object: C++ is based on ANSI C, and will therefore
  (I presume) include the entire ANSI C library. ]

According to jls@rutabaga.Rational.COM (Jim Showalter):
>C is separable from its standard libraries. That's the PROBLEM.

C has a standard library: the one defined by ANSI.  All hosted
implementations of ANSI C have that entire library.  There is such a
thing as non-hosted ANSI C, but that's irrelevant to this discussion.

>Ada comes complete with a standard set of predefined packages.
>Note the term "standard". Ada written solely against these
>predefined packages stands a very good chance of being portable
>with ease across a wide variety of different platforms.

s/Ada/C:

    C comes complete with a standard set of predefined routines and
    variables.  Note the term "standard".  C written solely against
    these predefined routines and variables stands a very good chance
    of being portable with ease across a wide variety of different
    platforms.

Moral:  Portability is not an Ada exclusive.

OS libraries, by definition, are not standard.  If a given C program
is not portable across environments, then dependence on OS libraries
is often the reason.  That is not the fault of ANSI, it is the fault
of the programmer.
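
A concrete sketch (the file name is mine, purely illustrative):

    #include <cstdio>

    // Written solely against the standard library: fopen() is defined
    // by ANSI, so this should behave identically on any hosted
    // implementation.
    int main() {
        FILE* f = fopen("data.txt", "r");
        if (f == NULL) {
            fprintf(stderr, "cannot open data.txt\n");
            return 1;
        }
        // By contrast, open("data.txt", O_RDONLY) is a POSIX OS call,
        // not part of the language standard--exactly the dependence
        // that breaks ports to non-UNIX systems.
        fclose(f);
        return 0;
    }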

If you still don't understand, I could retype it a little slower.
-- 
Chip Salzenberg at Teltronics/TCT     <chip@tct.uucp>, <uunet!pdn!tct!chip>
   "All this is conjecture of course, since I *only* post in the nude.
    Nothing comes between me and my t.b.  Nothing."   -- Bill Coderre

emery@aries.mitre.org (David Emery) (03/28/91)

There is a wide body of experience (and at least one book) on porting
C code.  Here are some observations:
	First:  How do you *know* if a compiler implements full ANSI
C?  We have some experience here moving ANSI code across compilers and
coming up with some unpleasant surprises, such as "Feature not
implemented" or "syntax error".    
	The same thing is even more true with C++.  The language
standardization effort is just now starting.
	Second:  How do you know if the entire library is implemented
correctly?  For instance, Microsoft C has signal handling routines in
its MS-DOS compiler, but it only handles Control-C.  There's no way to
get correct fork semantics on (single-tasking) MS-DOS.  
	Finally:  I've seen lots of C code that assumes int and *char
are interchangeable.  Despite every book's advice to the contrary, it's
easy to write non-portable C code.  The problem with C is that it's
awfully hard to find these places when you're in the middle of a port.
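
The assumption is easy to state precisely, and easy to get burned by;
a minimal sketch:

    #include <cstdio>

    int main() {
        // "int and char* are interchangeable" is really a claim that
        // these two sizes agree--something no standard guarantees.  On
        // a 16-bit DOS compiler with 32-bit ("far") pointers, or on a
        // 64-bit machine, they differ, and old code that stuffs
        // pointers into ints quietly truncates them.
        printf("sizeof(int)    = %u\n", (unsigned)sizeof(int));
        printf("sizeof(char *) = %u\n", (unsigned)sizeof(char *));
        return 0;
    }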

C does have a lot more "default standard" library interfaces, and a
growing set of standard interfaces (e.g. most of the POSIX work).  Ada
is just starting into the "standard interfaces" business.   

However, the biggest advantage of Ada for portability is compiler
validation, that guarantees the compiler implements Ada semantics
correctly.  If C really wants portability, it needs a similar
validation suite.

			dave emery
			emery@aries.mitre.org

ark@alice.att.com (Andrew Koenig) (03/30/91)

In article <EMERY.91Mar28111404@aries.mitre.org> emery@aries.mitre.org (David Emery) writes:

> However, the biggest advantage of Ada for portability is compiler
> validation, that guarantees the compiler implements Ada semantics
> correctly.  If C really wants portability, it needs a similar
> validation suite.

I know a guy who wrote a library of generic algorithms in Ada.
He told me that he had a lot of trouble moving it from one
implementation to another, because even though the implementations
he had used had been `validated,' they weren't correct.

Testing can confirm the presence of bugs, but not their absence.
-- 
				--Andrew Koenig
				  ark@europa.att.com

barriost@gtephx.UUCP (Tim Barrios) (04/01/91)

[i posted this a while back but think it got lost in our messed up
local news system...]

first of all, this whole issue of productivity is missing the point. 
in the real world, new development is not the issue, software
maintenance is.  productivity has more to do with things like
maintainability, readability, and, yes, reusability than with how fast
some hacker can get yet-another sort/stack/queue/random number routine
developed.

try writing the same function/object/class/whatever in both C++ and
Ada and give it to groups of people and ask them to make a change to
its logic (eg, add a message/operation/function) and see which
language is easier to change.  anyone who really knows both languages
(ie, not Ted Holden, see below) would agree that in general, Ada is
probably more maintainable.

yes, i know, good and bad programs can be written in any language.  
it's more of a cultural thing than a syntax issue.  C/C++ grew out of
the Unix/hacker culture (as did i, originally) whereas Ada has grown
out of the software engineering community.

plus, in new development, implementing at a code level is such an
insignificant portion of the overall life-cycle that the productivity
of its generation is hardly the issue.  what are the issues (and i
think both sides of this discussion would agree) are things like
requirements to design to implementation decomposition (OO
decomposition).

In article <1991Mar21.024445.8746@grebyn.com>, ted@grebyn.com (Ted Holden) writes:
> In article <jls.669262321@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
> >   I've been meaning to ask Mr. "I Live In The Real World" Holden this
> >   question for two years: how complex are the systems on which Mr. Holden
> >   works? 
> 
> One such is the popular VMUSIC multipart musical routine for PCs
> (formerly thought to be impossible), available on BBSs.  If programmed
> in Ada, it would sound like two or three dogs growling at each other,
> that is, if it could be programmed in Ada.  I doubt it.

i don't think this would qualify as part of the large project
domain for which Ada was intended.  where i work, we develop
[commercial] systems that are on the order of 2+ million lines of code
with teams of hundreds of engineers.  for the past 12 or so years, we
have done this work in a Pascal variant which has several similarities
to Ada.  we have recently done some newer products in C and have had
less-than-positive experiences with them, especially in the
maintainability/understandability area.  let's face it, C is just
a step above assembly language and C++ has all the power of OOP but
with the same maintainability characteristics as C.

> >4) I am able to program in Pascal, C, C++, and Ada. Can Mr. Holden make
> >   the same claim, or does he damn Ada from, as I suspect is the case,
> >   a position of relative ignorance? He certainly SOUNDS ignorant.
> 
> I damn Ada from the various horror stories I read and hear regarding it.
> I have managed to avoid it in my personal life, other than having to
> write interfaces between it and low-level file-handling routines written
> in C.  Doing that, I personally watched an Ada compiler take 25 minutes
> to compile a 30 line program into a 600K byte executable;  I never saw a
> C compiler do that.  

an expression from an old Steve Martin (that's right, the comedian)
skit comes to mind when i read Holden's comments: "criticize things
you don't know about".  i find it amazing that you blow so much hot
air about something you know absolutely nothing about.  let's hear
from the many people who have worked on real projects in both
languages.  as if he had any credibility to start with given his inept
arguments, Ted Holden's postings now have a whole new reason to be
ignored.

-- 
Tim Barrios, AG Communication Systems, Phoenix, AZ
UUCP: ...!{ncar!noao!asuvax | uunet!zardoz!hrc | att}!gtephx!barriost
Internet: gtephx!barriost@asuvax.eas.asu.edu
voice: (602) 582-7101        fax:   (602) 581-4022

chip@tct.com (Chip Salzenberg) (04/02/91)

According to emery@aries.mitre.org (David Emery):
>There is a wide body of experience (and at least one book) on porting
>C code.  Here are some observations:

"At least one book"?  Shirly, you jest.  There are dozens.

> First:  How do you *know* if a compiler implements full ANSI C?

There are validation suites for ANSI C.  Of course, no validation
suite -- not even the gummint's -- can guarantee that a compiler is
bug-free.

> The same thing is even more true with C++.  The language
>standardization effort is just now starting.

Indeed.  There is no C++ standard to validate -- yet.

> Second:  How do you know if the entire library is implemented correctly?

In ANSI C, no distinction is made between "compiler" and "library",
though such a distinction is frequently made in informal communication, based on
common implementation techniques.

>For instance, Microsoft C has signal handling routines in its MS-DOS
>compiler, but it only handles Control-C.

Microsoft may well be compliant with the standard anyway, even though
it is (gasp!) unlike UNIX.  Have you read the relevant portions of the
ANSI standard?  No, of course not, how silly of me to ask...

>There's no way to get correct fork semantics on (single-tasking) MS-DOS.

There is no ANSI C function named "fork()".

> Finally:  I've seen lots of C code that assumes int and *char
>are interchangeable.

I trust you mean "char *".  Yup, there are lots of incompetent code
grinders in the world, people who use BFI (Brute Force and Ignorance)
instead of skill and knowledge.  There are also a lot of truly skilled
and motivated people using C.  And Fortran for that matter.  So?

>The problem with C is that it's awfully hard to find these places when
>you're in the middle of a port.

Can you say "lint"?  I knew you could...
-- 
Chip Salzenberg at Teltronics/TCT     <chip@tct.com>, <uunet!pdn!tct!chip>
   "All this is conjecture of course, since I *only* post in the nude.
    Nothing comes between me and my t.b.  Nothing."   -- Bill Coderre