[comp.lang.ada] ada-c++ productivity

johnsonc@wdl1.wdl.loral.com (Craig C Johnson) (03/08/91)

Can anyone provide references which address productivity (either development
or maintenance) utilizing a full object-oriented language (say C++) vs.
Ada?  I see lots of flames and anecdotal data but little hard data.

I'll be happy to summarize to the net.

Craig Johnson
johnsonc@wdl1.wdl.loral.com

jls@rutabaga.Rational.COM (Jim Showalter) (03/09/91)

>Can anyone provide references which address productivity (either development
>or maintenance) utilizing a full object-oriented language (say C++) vs.
>Ada?

Oh good grief. I'm so sick of Ada being called non-object-oriented. An
Ada type is an object, as is anything with state. What you really mean
is non-inheritance-oriented.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
                   -- "When I want your opinion, I'll beat it out of you."

wiese@forwiss.uni-passau.de (Joachim Wiese) (03/10/91)

johnsonc@wdl1.wdl.loral.com (Craig C Johnson) writes:

>Can anyone provide references which address productivity (either development
>or maintenance) utilizing a full object-oriented language (say C++) vs.
>Ada?  I see lots of flames and anecdotal data but little hard data.

C++ is a "better C" but not a full OO language.  C++ offers a lot of
flexibility -- too much flexibility and too many pointers (too much C)
to lead to quality software and productivity.  I would rather be
interested in comparing a _real_ "full OO language" such as EIFFEL
vs. ADA.

A book that addresses such issues as productivity, reusability and
maintenance is "Object-Oriented Software Construction"
                 by Bertrand Meyer,
                 Prentice Hall, 1988.
It compares features of ADA and EIFFEL (exception handling, modularisation etc.)
 



-- 
--------  O   Joachim Wiese - \O/  ---------------------------   O  ---------
-------- /!\  Uni. Passau----  !   wiese@forwiss.uni-passau.de  /!\ ---------
-------- / \  GERMANY ------- / \  ---------------------------  / \ ---------

jbuck@galileo.berkeley.edu (Joe Buck) (03/14/91)

In article <1991Mar10.151220.2581@forwiss.uni-passau.de>, wiese@forwiss.uni-passau.de (Joachim Wiese) writes:
|> C++ is a "better C" but not a full OO language.  C++ offers a lot of
|> flexibility -- too much flexibility and too many pointers (too much C)
|> to lead to quality software and productivity.  I would rather be
|> interested in comparing a _real_ "full OO language" such as EIFFEL
|> vs. ADA.

C++ and Eiffel are full object-oriented languages; Ada is not.  Some
definitions, from Grady Booch's "Object-Oriented Design With Applications",
which has been generally recognized in this group as one of the best texts
available:

"Object-oriented programming is a method of implementation in which programs
are organized as cooperative collections of objects, each of which represents
an instance of some class, and whose classes are all members of a hierarchy
of classes united via inheritance relationships."

He quotes Cardelli and Wegner's definition:

"A language is object-oriented if and only if it satisfies the following
requirements:

- It supports objects that are data abstractions with an interface of named
  operations and a hidden local state

- Objects have an associated type (class)

- Types (classes) may inherit attributes from supertypes (superclasses)"

Ada lacks attribute #3 and is therefore not an object-oriented language.
Languages that satisfy the first two clauses but not the third are called
by Cardelli and Wegner "object-based" languages; Ada is an object-based
language.  C++ satisfies all three attributes.

Eiffel certainly has some nice features lacking in C++, but then C++
has some nice features lacking in Eiffel.  And of course it is possible
to write bad programs in any language.


--
Joe Buck
jbuck@galileo.berkeley.edu	 {uunet,ucbvax}!galileo.berkeley.edu!jbuck	

eachus@aries.mitre.org (Robert I. Eachus) (03/15/91)

In article <11966@pasteur.Berkeley.EDU> jbuck@galileo.berkeley.edu (Joe Buck) writes:

   C++ and Eiffel are full object-oriented languages; Ada is not....
   [lots deleted]

   "A language is object-oriented if and only if it satisfies the following
   requirements:

   - It supports objects that are data abstractions with an interface of named
     operations and a hidden local state

   - Objects have an associated type (class)

   - Types (classes) may inherit attributes from supertypes (superclasses)"

   Ada lacks attribute #3 and is therefore not an object-oriented language.
   Languages that satisfy the first two clauses but not the third are called
   by Cardelli and Wegner "object-based" languages; Ada is an object-based
   language.  C++ satisfies all three attributes... [more deleted]

   I guess you never heard of (Ada) derived types?  There are some
problems with using derived types to implement Smalltalk-like class
hierarchies; these are being addressed in Ada 9X.  There are also
people like me who prefer to build class hierarchies using generics,
and you build such hierarchies in a much different fashion than in
Smalltalk, but Ada is definitely an OOP.

   Now if you want to argue about whether multiple inheritance in Ada
(or any language) is a good idea, that is subject to debate.

--

					Robert I. Eachus

with STANDARD_DISCLAIMER;
use  STANDARD_DISCLAIMER;
function MESSAGE (TEXT: in CLEVER_IDEAS) return BETTER_IDEAS is...

jordan@aero.org (Larry M. Jordan) (03/16/91)

"Something's" missing from this definition of OOP--dynamic binding.
(Also, OOPs don't require classes--see SELF.)  I'd say we dismiss
the term OOP and talk about languages in terms of features.

Inheritance and dynamic binding are not enough.  There are other language
features I find desirable--parameterized types and exceptions.  What
about library management?  What about environments with integrated
debuggers?  (I'd sure hate to build anything bigger than a hello-world
program without a library manager.)  Some may find tasking essential.
I did a comparison of language features a while back.  Some may find
this interesting.

Language	Inheritance	DynamicBnd'g	Type[T]	Except.	Task 	LibMgr
C++		yes-MI/class	yes-vft		no[1]	no[2]	no	hand
Objective-C	yes-SI/class	yes-tlu		no	no	no	hand
Smalltalk	yes-SI/class	yes-tlu		no	no	no	?
SELF		yes-MI/object	yes-?		no	no	no	?
TP6.0		yes-SI/class	yes-vft		no	no	no	auto
Eiffel		yes-MI/class	yes-vft		yes	yes	no	auto
Ada		yes[3]		no		no[4]	yes	yes	auto

MI=mult. inh.
SI=single inh.
vft=virtual function tables
tlu=run-time table lookup with caching
class=class-based inh.
object=object-based inh.
hand=hand crank a make file
auto=automated
?=don't know

[1] available next version
[2] available next version?
[3] derived types are not assignment compatible without casting.
[4] parameterized procedures and packages but not types		

I hate the C'ness of C++, but I find myself implementing many things in
C++ just because of inheritance and dynamic binding.  If Ada is ever
to become mainstream (and I seriously hope it does) inheritance and
dyn. binding had better be incorporated into the language.

--Larry

craig@elaine35.Stanford.EDU (Craig Chambers) (03/16/91)

In article <1991Mar15.224626.27077@aero.org> jordan@aero.org (Larry M. Jordan) writes:
>Language	Inheritance	DynamicBnd'g	Type[T]	Except.	Task 	LibMgr
>C++		yes-MI/class	yes-vft		no[1]	no[2]	no	hand
>Objective-C	yes-SI/class	yes-tlu		no	no	no	hand
>Smalltalk	yes-SI/class	yes-tlu		no	no	no	?
>SELF		yes-MI/object	yes-?		no	no	no	?
>TP6.0		yes-SI/class	yes-vft		no	no	no	auto
>Eiffel		yes-MI/class	yes-vft		yes	yes	no	auto
>Ada		yes[3]		no		no[4]	yes	yes	auto

I'd argue that the mechanism for implementing dynamic binding is not
so important, certainly not as important as many of the other
features you list.  But if you want to include it, Self's
implementation of dynamic binding is based on hash table lookup with
possible optimizations like in-line caching (also present in the
ParcPlace Smalltalk-80 implementation), static binding, and inlining.
I'd also guess that Eiffel uses tlu instead of vft to implement
dynamic binding.

Both Smalltalk and Self support multiple lightweight communicating
threads within a single address space, so those values should probably
read "yes" (depending on what you mean by "Task").

Also, both Smalltalk and Self implementations are effectively one big
library manager.  If you require some sort of separate compilation,
then they don't have it, but if you just want automatic linking and
compilation after programming changes or the ability to invoke code of
objects/classes written by other people, then they are nice
environments.

-- Craig Chambers

P.S.  The earlier mention of "Cardelli and Wegner's" definition of OO
should be attributed to just Wegner.  Wegner's definitions are overly
narrow and constrained in my view, and I believe that Cardelli would
want to be able to form his own definitions.

johnson@cs.uiuc.EDU (Ralph Johnson) (03/17/91)

|> Language	Inheritance	DynamicBnd'g	Type[T]	Except.	Task 	LibMgr
|> Smalltalk	yes-SI/class	yes-tlu		no	no	no	?

The latest versions of Smalltalk-80 from ParcPlace (ObjectWorks for
Smalltalk 2.5 and 4.0) have exceptions.  Smalltalk-80 has always had
processes and semaphores, but the user interface was not compatible
with running things in the background until 4.0.  I know, because I
hacked the user interface to make background processes useful.

There have been several versions of Smalltalk-80 that had multiple
inheritance, but the Smalltalk implementors never cared for it very
much so it is not well supported and probably isn't in the ParcPlace
official versions.

Last, but not least, I'm implementing (more accurately, my students
are implementing) a type system for Smalltalk, so someday that column
may change, too.

Ralph Johnson -- University of Illinois at Urbana-Champaign

ted@grebyn.com (Ted Holden) (03/17/91)

Productivity of C++ users will vary according to skills, experience
levels, tools available, such as the fabulous new Borland interface, and
the task at hand.  An idea of productivity in Ada projects may be had
from the Feb. 11 issue of Federal Computer Week:

   "The GSA Board of Contract appeals issued a ruling last month that
   could affect how the military evaluates the cost effectiveness of Ada
   software.

   "The board upheld a decision by the Air Force to award a contract to
    a high-priced bidder based on a measure of productivity that equals
    three lines of Ada code per programmer per day.

    "A lower priced bidder, and others in the Ada community, said this
    standard is much too low.  The protester in the case, DynaLantic
    Corp, offered an average of ten lines of code per day per
    programmer.

    "Three lines of code per day is absurd [as if ten wasn't], said
    Ralph Crafts, editor of a newsletter on Ada, and an expert
    witness for the protester.....

Whether any realistic combination of events exists which could reduce
Pascal, C, or C++ programmers to this level of productivity is anybody's
guess;  my own opinion is that most C programmers would require a bullet
through the brain to be brought to such a level.

The really comical thing about this is the way in which Ada gurus cite
"productivity" as the main advantage of Ada.  Apparently, they use the
phrase in somewhat the same vein as Mr. Hussein uses terms like "moral
victory".

Ted Holden

jls@rutabaga.Rational.COM (Jim Showalter) (03/17/91)

Semantics. An object is defined as something with state that suffers
actions. Thus, a Boolean global variable is an object, and two seconds
of reflection will tell you that just about anything qualifies as an
object. Ada does a fine job of representing objects, thus, it is
object-oriented.

But not, of course, by THIS definition:

>"A language is object-oriented if and only if it satisfies the following
>requirements:

>- It supports objects that are data abstractions with an interface of named
>  operations and a hidden local state

>- Objects have an associated type (class)

>- Types (classes) may inherit attributes from supertypes (superclasses)"

But wait, the only thing missing is #3 (by the way, derived types get you
part way there). And #3 doesn't even CONTAIN the word "object". What it
talks about is "inheritance".

Aha! So the argument comes down to whether a language is inheritance
oriented or not, which I view as a much more significant distinction.
Object-oriented certainly includes Ada. Inheritance-oriented certainly
does not include Ada (where are you, 9x?).

Personally, I'm much more interested in whether or not a language is
SOFTWARE ENGINEERING oriented. Ada and C++ certainly qualify,
with spec/body separation, formalization of interface, opaque types,
strong typing, etc. Languages like C and FORTRAN and COBOL don't make
the cut.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

jls@rutabaga.Rational.COM (Jim Showalter) (03/17/91)

>I hate the C'ness of C++, but I find myself implementing many things in
>C++ just because of inheritance and dynamic binding.  If Ada is ever
>to become mainstream (and I seriously hope it does) inheritance and
>dyn. binding had better be incorporated into the language.

Enlighten me. How is it that many of the largest software systems
ever attempted--including all of the flight control software for
North America and all the software for the Space Station--are being
written in Ada, even though Ada doesn't have "dynamic binding"?

Second question: assume Ada got dynamic binding tomorrow. What could
be done with it that can't be done with it today?
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

ftpam1@acad3.alaska.edu (MUNTS PHILLIP A) (03/17/91)

In article <1991Mar16.205228.4268@grebyn.com>, ted@grebyn.com (Ted Holden) writes...
>Productivity of C++ users will vary according to skills, experience
>levels, tools available, such as the fabulous new Borland interface, and
>the task at hand.  An idea of productivity in Ada projects may be had
>from the Feb. 11 issue of Federal Computer Week:

..

>    "Three lines of code per day is absurd [as if ten wasn't], said
>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>    witness for the protester.....
> 
>Whether any realistic combination of events exists which could reduce
>Pascal, C, or C++ programmers to this level of productivity is anybody's
>guess;  my own opinion is that most C programmers would require a bullet
>through the brain to be brought to such a level.

..

     Actually, 10 lines per day isn't all that unreasonable, averaged over
a year for example.  The key word is AVERAGED: on a good day I may write
hundreds of lines of code; I may spend the next day trying to find an
obscure bug in one of those lines.

     A programmer also does a lot of other things like attending meetings,
reading documentation, writing documentation, studying a problem, answering
the phone, filling out forms, reformatting that [censored] hard disk, etc.
Then there is waiting for the network to come back up, putting paper in the
laser printer...

     In theory, you are supposed to spend most of your time DESIGNING rather
than CODING, as well.  I have spent days perfecting algorithms that were set
down in code in hours or even minutes.

     In the ideal case, the manager would chain his slaves, er, employees to
their computer until the job is done.  In practice things don't work out
that way.  The whole idea of "lines per day" is pretty artificial anyway.
(The contract says x lines per day so we'll just pad things out with a little
whitespace...)  What matters is whether a product is delivered on time or
not, and if it meets the spec.

CAVEAT: Most of what I do is in assembly language or Turbo Pascal.  I don't
particularly like C or C++ (or assembly language for that matter) and I haven't
found an Ada compiler that generates decent code.  (I am constrained to the
low end of the marketplace, being self- rather than government-employed.)

Philip Munts N7AHL
NRA Extremist, etc.
University of Alaska, Fairbanks

rreid@ecst.csuchico.edu (Ralph Reid III) (03/17/91)

In article <1991Mar16.205228.4268@grebyn.com> ted@grebyn.UUCP (Ted Holden) writes:
> . . .
>from the Feb. 11 issue of Federal Computer Week:
>
>   "The GSA Board of Contract appeals issued a ruling last month that
>   could affect how the military evaluates the cost effectiveness of Ada
>   software.
>
>   "The board upheld a decision by the Air Force to award a contract to
>    a high-priced bidder based on a measure of productivity that equals
>    three lines of Ada code per programmer per day.
>
>    "A lower priced bidder, and others in the Ada community, said this
>    standard is much too low.  The protester in the case, DynaLantic
>    Corp, offered an average of ten lines of code per day per
>    programmer.
>
>    "Three lines of code per day is absurd [as if ten wasn't], said
>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>    witness for the protester.....
> . . .


I don't know where these companies are digging up these kinds of
unproductive machine operators (I hesitate to call them real
programmers), but they would never get through the computer science
program here at Chico State.  It kind of makes me wonder what schools
they came from, if they even have degrees.  The kind of productivity
discussed in this article sounds like the level I might expect from
beginning programming students at a junior college.  I would like to
know what in this world could reduce a serious programmer's
productivity to these levels.

-- 
Ralph.  SAAC member.
ARS: N6BNO
Compuserve: 72250.3521@compuserve.com
email: rreid@cscihp.ecst.csuchico.edu

csq031@umaxc.weeg.uiowa.edu (03/18/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu> rreid@ecst.csuchico.edu (Ralph Reid III) writes:
>I don't know where these companies are digging up these kind of
>unproductive machine operators (I hesitate to call them real
>programmers), but they would never get through the computer science
>program here at Chico State.  It kind of makes me wonder what schools
>they came from, if they even have degrees.  The kind of productivity
>discussed in this article sounds like the level I might expect from
>beginning programming students at a junior college.  I would like to
>know what in this world could reduce a serious programmer's
>productivity to these levels.
>
>-- 
Lines of code per day is an absurd measure at best.  Using it in contracts
is just a way of lulling people who care about such things into thinking
that programmers (and software companies) know what they're doing and
that their output is quantifiable.

The actual situation (as most people know) is that when you set out to
write something non-trivial, that hasn't been done already, you're
much more like Lewis and Clark setting out in canoes than you are like
a machinist putting a block of steel into the lathe. This scares the
living shit out of bean counters.

But anyway, if you measure lines/day after the fact, i.e after the
program has been designed, written, tested, documented, beta-tested, and
accepted as done by the customer, you'll find 3-10 lines of code a
day per programmer to be fairly respectable.  If the program really
works well, most customers wouldn't care if they only did 1 line per
day, so long as it was finished in a timely manner.


--
             Kent Williams --- williams@umaxc.weeg.uiowa.edu 
"'Is this heaven?' --- 'No, this is Iowa'" - from the movie "Field of Dreams"
"This isn't heaven, ... this is Cleveland" - Harry Allard, in "The Stupids Die"

jls@rutabaga.Rational.COM (Jim Showalter) (03/18/91)

>Whether any realistic combination of events exists which could reduce
>Pascal, C, or C++ programmers to this level of productivity is anybody's
>guess;  my own opinion is that most C programmers would require a bullet
>through the brain to be brought to such a level.

1) If those Pascal, C, or C++ programmers were required to operate under
   DoD-Std-2167/A standards, their productivity would drop by two orders
   of magnitude automatically. This is not a language issue.

2) SLOC/day tends to decrease drastically as a function of complexity.
   Complexity depends on a number of factors, including sheer size,
   number of programmers, number of configurations, number of targets,
   number of hosts, number of languages, number of contractors, etc etc
   etc.

   I've been meaning to ask Mr. "I Live In The Real World" Holden this
   question for two years: how complex are the systems on which Mr. Holden
   works? If the answer is that he's working with three other guys in a garage
   writing device drivers for PC's, I'm sorry, but I'm really not very
   impressed--one should EXPECT high SLOC/day rates for what is, in essence,
   a solved problem (i.e., programming in the small). It is programming in
   the large that is still a black art for most companies, and it is on
   such projects that low productivity rates are experienced. That Ada
   tends to be the language of choice on such projects should not be used
   against it when the rates are low--the REASON Ada is the language of
   choice is that other languages, including COBOL, FORTRAN, and C, are
   simply not up to the job.

3) I have personally contracted on a 4,000,000 SLOC C/C++ project
   that was lucky to achieve 3 lines a day, on a GOOD day. The programmers
   had not, as Mr. Holden claims, been shot in the head--they were just
   suffering the same liabilities as anybody else who is trying to build
   extremely large systems using a stone-knives-and-bearskins kind of
   technology and paradigm.

4) I am able to program in Pascal, C, C++, and Ada. Can Mr. Holden make
   the same claim, or does he damn Ada from, as I suspect is the case,
   a position of relative ignorance? He certainly SOUNDS ignorant.

>The really comical thing about this is the way in which Ada gurus cite
>"productivity" as the main advantage of Ada.

Productivity rates range from no gain to order of magnitude gains. We
have lots of success stories backed up by factual accounting data if
Mr. Holden would care to read them.

But I suspect he would find the truth inconvenient.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

mfeldman@seas.gwu.edu (Michael Feldman) (03/18/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu> rreid@ecst.csuchico.edu (Ralph Reid III) writes:
>>    a high-priced bidder based on a measure of productivity that equals
>>    three lines of Ada code per programmer per day.
>>
>>    "A lower priced bidder, and others in the Ada community, said this
>>    standard is much too low.  The protester in the case, DynaLantic
>>    Corp, offered an average of ten lines of code per day per
>>    programmer.
>>
>>    "Three lines of code per day is absurd [as if ten wasn't], said
>>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>>    witness for the protester.....
>> . . .
Hmmmm. Interesting to see this pop up on the net. It happens I was also
involved in this case (somehow they got the idea I was an expert in
something) and so have read the decision. You may recall that I posted
a question about productivity measures a while back; now you know why.

The answers I got were varied and interesting; the only generalization
in them was that nothing can be generalized. Going back to the Ed Yourdon et al
books of 15 years ago, one finds that in those days end-to-end software
productivity was cited in the neighborhood of 5-10 LOC per day, independent
of language. Yourdon et al were making those numbers an argument for 
high-level languages, because 5-10 assembler instructions and 5-10 lines
of HLL code clearly have different functionality, yet the studies showed
that the 5-10 was "language-independent." The purpose of "structured
programming" and all that was to raise the productivity. More recent
reports showed that the range had indeed doubled, to 10-20, after using
these "new programming technologies," as they were called then.

In preparing for the case I dug into the literature, good professor that
I am. Everything I could find in the literature on Ada productivity on real
projects pointed somewhere in the neighborhood of 30-40 LOC/day. Remember
that these are _average_ numbers, end-to-end, including documentation and
all that. (sources: Reifer's productivity studies, WAdaS and TRi-Ada papers,
and the like). The 10 LOC/day offered by the company in question for a
relatively small project (~15k LOC) was, in my opinion, _quite conservative_
given the numbers I just cited. The company knew the application area.

That the USAF people should have derided the 10 LOC estimate as _overly 
optimistic_, preferring an absurdly low number of 3 LOC, sends me a
message that the government (or at least some people in it) is afraid
of Ada and/or not really interested in productivity. I'm under nondisclosure
or I'd post some details that are so sad they're funny. The administrative
law judge's decision is public record, so it's safe to comment. This guy
not only upheld the award (I figured he would because this was a contract 
where USAF had a lot of discretion) but gratuitously accused the company
of "low-balling," that is, bidding TOO-HIGH productivity in order to get the
contract.

Given the "where cost-effective" clause in the legislative mandate for
Ada, this case could be an awful precedent, because the government can
accuse anyone bidding high productivity (arguing Ada's cost-effectiveness)
of low-balling. Those in the government determined to resist any kind
of progress (it doesn't even have to be _Ada_ progress) can have a field
day shooting down high(er) productivity estimates. Let's hope most of the
government is too smart or has too high a sense of integrity to pull
tricks like this. Stay tuned: there may be appeals in the case.

My work in this weird case convinced me even more that this whole crazy
industry still doesn't really know how to estimate things. LOC/day is
not a terrific measure of anything. It's all we seem to have, though.
And people use it as though it were gospel.

Mike Feldman

matt@mozartasd.contel.com (Matthew S. Granger) (03/18/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu>, rreid@ecst.csuchico.edu (Ralph Reid III) writes:
|> 
|> I don't know where these companies are digging up these kind of
|> unproductive machine operators (I hesitate to call them real
|> programmers), but they would never get through the computer science
|> program here at Chico State.  It kind of makes me wonder what schools
|> they came from, if they even have degrees.  The kind of productivity
|> discussed in this article sounds like the level I might expect from
|> beginning programming students at a junior college.  I would like to
|> know what in this world could reduce a serious programmer's
|> productivity to these levels.
|> 

Ralph, Ralph, Ralph

You are missing the point. We unproductive machine operators could spit out
hundreds of lines of code a day if all we had to worry about was our grade 
point average. When producing a million-line Command and Control System, or maybe
the life-critical software for the avionics of an airliner, there is more going
on than just writing code. Documentation, Design meetings, Documentation, more
Design meetings, Documentation, code writing, code walk throughs, unit testing,
more testing, integration testing ad nauseam......

And then your company gets gobbled up by a monolithic corporation.
-- 
Matt Granger 		    
Contel Federal Systems      (I guess that's GTE now)
Chantilly, VA 22301	    

The opinions expressed here are in no way reflective of the policies of Contel Federal Systems, its parent company or any of its subsidiaries. All statements carry no warranties implied or otherwise.

simonian@x102c.ess.harris.com (simonian richard 66449) (03/18/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu> rreid@ecst.csuchico.edu
(Ralph Reid III) writes:
>I don't know where these companies are digging up these kind of
>unproductive machine operators (I hesitate to call them real
>programmers), but they would never get through the computer science
>program here at Chico State.  It kind of makes me wonder what schools
>...

PLEASE keep in mind that when we talk about LOC productivity in the
Real World, it is averaged over the entire software lifecycle, including
requirements analysis.  The actual time spent coding may be as little as
20% of the lifecycle.  And obviously during that coding period, the
programmers are writing a lot more than 10 LOC/day.  There is substantial
evidence (e.g., the SoftCost database from RCI) showing that Ada programmers
become more productive than those using other languages.  I would contend that this
would hold true for C++ as well on a large project (I've used both 
extensively).


Richard Simonian
Harris Space Systems Corp.  407-633-3800
simonian@x102c.ess.harris.com
rsimonian@nasamail.nasa.gov

arny@cbnewsl.att.com (arny.b.engelson) (03/19/91)

In article <1991Mar16.205228.4268@grebyn.com> ted@grebyn.UUCP (Ted Holden) writes:
>   "The GSA Board of Contract appeals issued a ruling last month that
>   could affect how the military evaluates the cost effectiveness of Ada
>   software.
>
>   "The board upheld a decision by the Air Force to award a contract to
>    a high-priced bidder based on a measure of productivity that equals
>    three lines of Ada code per programmer per day.
>
>    "A lower priced bidder, and others in the Ada community, said this
>    standard is much too low.  The protester in the case, DynaLantic
>    Corp, offered an average of ten lines of code per day per
>    programmer.
>
>    "Three lines of code per day is absurd [as if ten wasn't], said
>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>    witness for the protester.....
>
>Whether any realistic combination of events exists which could reduce
>Pascal, C, or C++ programmers to this level of productivity is anybody's
>guess;  my own opinion is that most C programmers would require a bullet
>through the brain to be brought to such a level.
>
>Ted Holden

You don't quote prices for many large Air Force proposals, do you Ted?
A quoted productivity rate of 10 lines per staff day for the entire
development cycle is not unusual, REGARDLESS OF THE LANGUAGE.  The
apparently dismal productivity is caused by time taken to do the many
other things required by the customer (Air Force), including preparing
for and holding requirements, design, and test reviews, preparing the
many documents required, etc.  I have seen very similar productivity
figures for C, Ada, and other languages in this type of job.  The actual
coding phase goes along merrily at 50 to 200 lines per day (depending
on the programmer, and not including overtime :-) ).  Too bad the Air
Force won't simply take our word for it that the code works, and that
it does everything they want it to, and that if it ever has to be
changed, we'll all be around to make those changes for them.

You really ought to look deeper into things before posting an inflammatory
article based on a one-column article in Federal Computer Week (or
wherever).  By the way, you (and everyone else) should go read the
Ada 9X Mapping Document and the Mapping Rationale Document, since the
availability of that language spells the downfall of C/C++  :-).
Wait, it's a joke, stop the language war, it's a joke.  But the documents
ARE very interesting reading.

  -- Arny Engelson   att!wayback!arny   (arny@wayback.att.com)

martin@edwards-saftd-2.af.mil (03/19/91)

In article <1991Mar16.205228.4268@grebyn.com>, ted@grebyn.com (Ted Holden) writes:
> 
> The really comical thing about this is the way in which Ada gurus cite
> "productivity" as the main advantage of Ada.  Apparently, they use the
> phrase in somewhat the same vein as Mr. Hussein uses terms like "moral
> victory".
> 

Ada was not developed with the goal of improving raw "productivity" in terms of
lines of code per day.  Rather, it was designed to provide reliable and
maintainable code for large, long-lived, continuously evolving, performance
constrained, highly reliable embedded systems. In that harsh environment,
software maintenance costs are several times the cost of the original
development.

Ada was designed to reduce lifecycle costs by reducing maintenance costs.  This
was done at the expense of increased costs during the development of original
software. It is intended that the need to develop original software will be
substantially reduced through software component reuse.

This is a long term solution to a long term problem.  We are still some time
away from proving that it is the proper solution, even though many now believe
that it is.  

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - 
Gary S. Martin                !  (805)277-4509  DSN 527-4509
6510th Test Wing/TSWS         !  Martin@Edwards-SAFTD-2.af.mil
Edwards AFB, CA 93523-5000    ! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - 

stachour@sctc.com (Paul Stachour) (03/19/91)

>>from the Feb. 11 issue of Federal Computer Week:
>>
>>   "The board upheld a decision by the Air Force to award a contract to
>>    a high-priced bidder based on a measure of productivity that equals
>>    three lines of Ada code per programmer per day.
  ...
> I would like to
> know what in this world could reduce a serious programmer's
> productivity to these levels.
>email: rreid@cscihp.ecst.csuchico.edu

-----

   This question reminds me of the discussion in Gerald Weinberg's
"The Psychology of Computer Programming".  If I remember it right,
it was a discussion between two programmers. Programmer B had just
completed reprogramming a task originally given to programmer A.
A was complaining about the speed of B's code, pointing out that
A's original code was much faster.  B replied with something to
the effect that << But your code isn't correct.  Mine is.  If mine
doesn't have to be correct, I can make it as fast as you want. >>

   The point about productivity is that it can only be properly
measured for "correct" code.  And knowing the characteristics
of the code is what makes it correct or not.  For example, suppose I
have a body of C code.  I compile it with 6 different C compilers.
It compiles without error, executes, and gives the same "right"
answer on all of them.  Now I give this code to 7 different programmers.
They each compile, link, load, and execute it.  For those 7
programmers, each of whom used a different C compiler, it
gave the wrong answers.  Yes, I may have had great productivity
in writing my original code.  But if everyone else gets the wrong
answers because I didn't know about a critical environmental
component, or failed to document it, or failed to note that most
C compilers for physical-memory multi-tasking systems will not
get the answer right, is my code correct?  And given the time
for all of them to discover the problem, how much productivity was
there really?

   I can give you a great lines-of-code productivity figure if:

  1)  I can presume all my inputs are correct.
  2)  I can presume all the subroutines I call are correct.
  3)  I can presume that the hardware never flips bits.
      Nor are there any other hardware errors.
  4)  I can presume that the compiler always generates correct code.
  5)  I can presume that I have enough memory and CPU cycles so
      that I can always think "clarity" instead of "make it fit"
      or "do swapping yourself".
  6)  All my users speak English so that I don't have to worry about
      how to handle multi-lingual error messages.
  7)  My cleanup routines will always get called on power-down
      or any other kill-condition so that my disk-data will
      always be consistent.
  8)  The client has a clear, simple, and unambiguous specification so
      that I can spend all my time designing and implementing instead
      of trying to figure out what he really needs, instead of what he
      says he wants (which is full of contradictions).
  9)  I don't have to be compatible with existing applications that
      process the same data.
 10)  I have no numerical problems with overflow or zero-division.
 11)  I am working on a small enough project, in one single location,
      that I can get questions answered promptly.
 12)  I am allowed to work with hardware that is documented, instead of
      "new improved hardware" for which no documentation yet exists,
      and I have to play trial-and-error to see what it does now
      (which may be very different next week).
 13)  When I'm forced onto new hardware and software in the middle
      of a project, they are actually compatible with the old.

  All of these, and more, are assumptions that are often false.

  Sure, in the context of a simple, university-level class assignment
that gets thrown away as soon as it's done, I have great productivity too.
How many students at the university where you go/teach could write a
reusable component for something as simple as a stack?  I'd bet
that 95% couldn't.  I base that on me, my colleagues, and those I teach.
I base that on the fact that most hardware can't even do a shift
properly (well, at least the way that the documentation says it does it).

  If one accepts the premise that 90% of good code is for the conditions
that happen less than 10% of the time (personally, I'll even say 1%
of the time), then one can get a 10-fold increase in productivity by
merely ignoring error-conditions.  And if you buy component code
from a commercial vendor, as we sometimes do, you can make some
pretty big discoveries of bugs.  We bought some code.  We've used it
for several years.  In the first year, we found only a few bugs.
In the second year, we really started to use the code, and found a few
more.  In the third year, we discovered that there were enough
seldom-happen cases not covered by the code that it was not useful.
And so we are re-writing it.  And this time (I hope) we will test it.

  I could go on and on about the groups that I've seen and worked
close to that claimed to have "great productivity".  A few did.
But most merely turned out code fast.  And that code was full of
headaches for a long time to come.

  Enough for now.

  ...Paul
-- 
Paul Stachour          SCTC, 1210 W. County Rd E, Suite 100           
stachour@sctc.com          Arden Hills, MN  55112
                             [1]-(612) 482-7467

adam@visix.com (03/19/91)

[jordan]I hate the C'ness of C++, but I find myself implementing many things in
[jordan]C++ just because of inheritance and dynamic binding.  If Ada is ever
[jordan]to become mainstream (and I seriously hope it does) inheritance and
[jordan]dyn. binding had better be incorporated into the language.

[jls]	Enlighten me. How is it that many of the largest software systems
[jls]	ever attempted--including all of the flight control software for
[jls]	North America and all the software for the Space Station--are being
[jls]	written in Ada, even though Ada doesn't have "dynamic binding"?

Surely we all know that there is no simple reason why languages
succeed or fail.  The presence or absence of a single feature
almost never makes or breaks a language.  You can't
even say success depends entirely on technical factors; you must also
consider politics, history, and plain dumb luck.

[jls]	Second question: assume Ada got dynamic binding tomorrow. What could
[jls]	be done with it that can't be done with it today?

No real-world language is so deficient that things "can't be done".
Some people feel that their problems are more easily solved using a
language with dynamic binding, so they see Ada as less convenient.
This is a very different question from the one you raise.

That's all I want to say.

Adam

jls@rutabaga.Rational.COM (Jim Showalter) (03/19/91)

>You really ought to look deeper into things before posting an inflammatory
>article based on a one column article in Federal Computer Week (or
>wherever). 

You don't understand--this is Ted "I Live in the Real World" Holden, who
has never let his brute ignorance stand in the way of his expressing a
groundless opinion.

P.S. Ted believes the earth is 600 years old.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

klimas@iccgcc.decnet.ab.com (03/20/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu>, rreid@ecst.csuchico.edu (Ralph Reid III) writes:
> In article <1991Mar16.205228.4268@grebyn.com> ted@grebyn.UUCP (Ted Holden) writes:
>> . . .
>>from the Feb. 11 issue of Federal Computer Week:
>>
>>   "The GSA Board of Contract appeals issued a ruling last month that
>>   could effect how the military evaluates the cost effectiveness of Ada
>>   software.
>>
>>   "The board upheld a decision by the Air Force to award a contract to
>>    a high-priced bidder based on a measure of productivity that equals
>>    three lines of Ada code per programmer per day.
>>
>>    "A lower priced bidder, and others in the Ada community, said this
>>    standard is much too low.  The protester in the case, DynaLantic
>>    Corp, offered an average of ten lines of code per day per
>>    programmer.
>>
>>    "Three lines of code per day is absurd [as if ten wasn't], said
>>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>>    witness for the protester.....
>> . . .
> 
> 
> I don't know where these companies are digging up these kind of
> unproductive machine operators (I hesitate to call them real
> programmers), but they would never get through the computer science
> program here at Chico State.  It kind of makes me wonder what schools
> they came from, if they even have degrees.  The kind of productivity
> discussed in this article sounds like the level I might expect from
> beginning programming students at a junior college.  I would like to
> know what in this world could reduce a serious programmer's
> productivity to these levels.
        Ten lines of code per man-day is quite believable in a corporate
	environment!  $50-$75 per line of code is typical even in non-military
	applications (e.g. Mentor Graphics' C++ based RELEASE 8.0 supposedly
	has about a million lines of code and cost the company $75 million).

	I believe the difference in code quality, testing, documentation,
	and the usual intergalactic corporate overhead are the problems.
	On the other hand, I'm not so sure I'd want shareware in a cruise
	missile either!
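The arithmetic behind these cost figures is easy to sanity-check. A minimal sketch, assuming an illustrative loaded labor rate (the $400/staff-day figure below is invented for the example, not taken from the thread):

```python
# Back-of-the-envelope check of the cost-per-line figures quoted above.

TOTAL_COST = 75_000_000   # ~$75 million for Mentor Graphics Release 8.0, per the post
TOTAL_LINES = 1_000_000   # ~1 million lines of code, per the post

cost_per_line = TOTAL_COST / TOTAL_LINES
print(f"Cost per line: ${cost_per_line:.0f}")   # $75/line, matching the quoted range

# At an *assumed* loaded rate of $400 per staff-day, $75/line implies:
LOADED_RATE_PER_DAY = 400   # illustrative assumption
lines_per_day = LOADED_RATE_PER_DAY / cost_per_line
print(f"Implied productivity: {lines_per_day:.1f} lines/staff-day")
```

Under that assumed rate, the implied productivity comes out to roughly 5 lines per staff-day, which is in the same ballpark as the ten-line figure argued over earlier in the thread.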

> 
> -- 
> Ralph.  SAAC member.
> ARS: N6BNO
> Compuserve: 72250.3521@compuserve.com
> email: rreid@cscihp.ecst.csuchico.edu

klimas@iccgcc.decnet.ab.com (03/20/91)

> |> Language	Inheritance	DynamicBnd'g	Type[T]	Except.	Task 	LibMgr
> |> Smalltalk	yes-SI&MI/class	yes-tlu		no	no	yes	yes

1.) Smalltalk does support multiple inheritance.  The framework is still in place
in Smalltalk/V in class Behavior.  There are third-party add-on
products for MI in Smalltalk/V.  There are a number of simple "mixin"
examples that one can cite as arguments for MI, but the real problem
seems to be that the added complexity of MI from a conceptual standpoint
makes real-world applications using MI difficult to manage.  A. Borning
and D. Ingalls documented this in "Multiple Inheritance in Smalltalk-80",
Proc. of National Conf. on AI, 1982.

2.) There are several library management tools for Smalltalk.  They
range in price, functionality, and utility all over the spectrum:

Low end 

	Carleton APP Manager- part of Digitalk Goodies

medium 
	Instantiations App Manager for ST-80
	Zuniq App Manager for Digitalk ST and 
	Coopers&Lybrand Softpert Application Manager for Digitalk

high end
	Object Technology International ENVY Manager for multiuser 
	online Smalltalk CMS and library management.

One of these tools will do the job depending upon your Smalltalk library 
management needs.

3.) Some very impressive things are happening with Digitalk's Smalltalk/V-PM
running on OS/2 that warrant qualifying ST as supporting true preemptive,
multithreaded multitasking.

jls@rutabaga.Rational.COM (Jim Showalter) (03/20/91)

Nice post! To your list of assumptions that inflate productivity figures, I'd add:

-- Requirements never change in midstream

-- I'm not required to conform to 2167/A

In my experience--which seems to mirror yours--often the claims made
for SLOC/day turn out to be not so much for debugged, documented,
tested, fielded code, but for error-ridden trash that winds up as
shelfware.

In short, most SLOC/day rates are really just a measure of some hacker's
TYPING SPEED.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

ryer@inmet.inmet.com (03/20/91)

In this recap of the LOC/day issue, I think one point has gone unmentioned:

   LOC/day measured in DOD projects is *deliverable*, *product* lines per
   day.  Prototypes, things that had to be re-written, testcases, tools
   built to help produce the ultimate product, etc., don't count in the
   delivered code, but the labor that goes into them does count in the
   total person-days.  The size of this non-product and/or non-deliverable code
   can be large in comparison to the deliverable product.
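The accounting effect described above can be sketched with a quick calculation. All the numbers below are invented for illustration; nothing in the thread gives actual project figures:

```python
# Hypothetical project: counting only *deliverable* lines against *total*
# labor drags the apparent LOC/day rate far below the raw coding rate.
# Every number here is an invented illustration.

deliverable_loc = 50_000       # product code that actually ships
non_deliverable_loc = 60_000   # prototypes, test cases, tools, rewrites

raw_rate = 100                 # lines/day while actually coding
coding_days = (deliverable_loc + non_deliverable_loc) / raw_rate   # 1,100 days
review_doc_days = 4_000        # reviews, required documents, other effort

apparent_rate = deliverable_loc / (coding_days + review_doc_days)
print(f"Apparent productivity: {apparent_rate:.1f} deliverable lines/day")
```

Even with programmers coding at 100 lines/day, the apparent rate in this sketch lands just under 10 deliverable lines per staff-day, right where the contested figures in this thread sit.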

Mike Ryer
Intermetrics

jls@rutabaga.Rational.COM (Jim Showalter) (03/21/91)

I think multiple inheritance is a solution in search of a problem.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

garys@cs.tamu.edu (Gary W Smith) (03/21/91)

My own personal experiences on the matter:

  =>
    I used to work for General Dynamics on F-16 avionics.  This was
    pre-Ada Jovial but I think the arguments apply to Ada as well.
    I (and 30 others) spent over a year on things such as a specification
    document, requirements document, design document, building a prototype
    in Pascal, PDL, etc. before we even started thinking about coding
    in Jovial.  The actual coding took less than 2 months, followed
    by an extensive code review and test period.

    Some of these self-employed, academic people just don't understand
    the development process for defense projects.

  <=
    I do feel, however, that there is a certain amount of waste in the
    process.  For the most part, the defense dept./GD feel that 2167A is 
    a concrete rule that must be followed.  What I want to know is where
    has it been shown that a document-driven waterfall model is the *best*
    process to follow in the design of software.

    Another problem that I saw at GD was the amount of charging.
    The Air Force paid by the hour.  GD was under no pressure to keep
    the hours down and their basic solution to any problem was to
    hire more people.  I realize that a lot of this does not hold
    anymore with the recent layoffs at GD.  I was there during the
    Reagan buildups and when local boy Jim Wright was still big in
    the House.

- gary

--
"Eliminate the whole degree-and-grading system and 
     then you'll get real education."
Robert M. Pirsig - Zen and the Art of Motorcycle Maintenance

depriest@dtoa1.dt.navy.mil (Depriest) (03/22/91)

In article <13570@helios.TAMU.EDU> garys@cs.tamu.edu (Gary W Smith) writes:

>    I do feel, however, that there is a certain amount of waste in the
>    process.  For the most part, the defense dept./GD feel that 2167A is
>    a concrete rule that must be followed.  What I want to know is where
>    has it been shown that a document-driven waterfall model is the *best*
>    process to follow in the design of software.

The main benefit that I see in 2167-type documentation does not lie
within the design/build phase but in the post-deployment software support
environment. The Navy's thrust is for complete organic software maintenance
capability for life of type. Software support responsibility typically
transitions after first deployment, which is several _years_ after the
software Critical Design Review. It's likely that the software engineers
who inherit the code at transition will not be the same folks who had
the review/approval authority at CDR. The extensive 2167 documentation
will help close the familiarity gap.


-Mike DePriest                              |  I had to lay off my
-Naval Aviation Depot Jacksonville          |  disclaimer after Dick
-depriest@dtoa1.dt.navy.mil                 |  Cheney cancelled A-12.

sdl@lyra.mitre.org (Steven D. Litvinchouk) (03/25/91)

In article <1991Mar15.224626.27077@aero.org> jordan@aero.org (Larry M. Jordan) writes:

> I hate the C'ness of C++, but I find myself implementing many things in
> C++ just because of inheritance and dynamic binding.  If Ada is ever
> to become mainstream (and I seriously hope it does) inheritance and
> dyn. binding had better be incorporated into the language.

As I understand it, single inheritance with constrained polymorphism
will probably be incorporated in Ada 9X.  The Ada community is now
aware that the old argument "Use Ada--it sure beats Fortran!" is
wearing thin.


--
Steven Litvintchouk
MITRE Corporation
Burlington Road
Bedford, MA  01730
(617)271-7753
ARPA:  sdl@mbunix.mitre.org
UUCP:  ...{att,decvax,genrad,necntc,ll-xn,philabs,utzoo}!linus!sdl
	"Where does he get those wonderful toys?"

westley@thunderchief.uucp (Terry J. Westley) (03/26/91)

In article <1991Mar17.142756.25676@ecst.csuchico.edu> rreid@ecst.csuchico.edu (Ralph Reid III) writes:
>In article <1991Mar16.205228.4268@grebyn.com> ted@grebyn.UUCP (Ted Holden) writes:
>>    "Three lines of code per day is absurd [as if ten wasn't], said
>>    Ralph Crafts, editor of a newsletter on Ada, and an expert
>>    witness for the protester.....
>
>I would like to
>know what in this world could reduce a serious programmer's
>productivity to these levels.

non-tailored DOD-STD-2167A

-- 
Terry J. Westley 
Calspan Corporation, P.O. Box 400, Buffalo, NY 14225
westley%planck.uucp@acsu.buffalo.edu | "planck!hercules!westley"@acsu.buffalo.edu

jls@rutabaga.Rational.COM (Jim Showalter) (03/26/91)

>The main benefit that I see in 2167-type documentation does not lie
>within the design/build phase but in the post-deployment software support
>environment. The Navy's thrust is for complete organic software maintenance
>capability for life of type. Software support responsibility typically
>transitions after first deployment, which is several _years_ after the
>software Critical Design Review. It's likely that the software engineers
>who inherit the code at transition will not be the same folks who had
>the review/approval authority at CDR. The extensive 2167 documentation
>will help close the familiarity gap.

I agree that this is the theory, but, sadly, much of the 2167/A documentation
I've been subjected to over the past several years has been essentially
worthless for understanding the documented system. When this happens,
the extensive documentation just makes a very expensive doorstop.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

mfeldman@seas.gwu.edu (Michael Feldman) (03/26/91)

In article <jls.669954776@rutabaga> jls@rutabaga.Rational.COM (Jim Showalter) writes:
   ... stuff deleted ...
>>software Critical Design Review. It's likely that the software engineers
>>who inherit the code at transition will not be the same folks who had
>>the review/approval authority at CDR. The extensive 2167 documentation
>>will help close the familiarity gap.
>
>I agree that this is the theory, but, sadly, much of the 2167/A documentation
>I've been subjected to over the past several years has been essentially
>worthless for understanding the documented system. When this happens,
>the extensive documentation just makes a very expensive doorstop.

Hmmm. Deja vu all over again. 20 years ago I worked for a company that
required flowcharts and all sorts of other paper junk to be submitted
when programs were turned over to production. This was not a contractor,
the software was all internal stuff. The gatekeeper of the production
library had a little form with nice little boxes to check.

The flowcharts were usually generated, after the program was readied
for production, by Autoflow. Remember Autoflow? The program that produced
a 20-page flowchart on a line printer to document a 1-page Fortran
routine? Yeah, that's the one. That one little program killed a _lot_
of trees.

Unfortunately I never had the guts to test my theory that it didn't
matter if the flowchart matched a _different_ program. I can tell you for
sure that _nobody_ ever read the charts; they were useless. The only
things that really mattered were the record-layout diagrams, to show the
file structures. Unfortunately these almost never matched the code...

Don't flame at this fuzzy-headed academic for not understanding. I'm not
down on documentation, 2167A or otherwise, as long as it's useful.

Mike

jls@rutabaga.Rational.COM (Jim Showalter) (03/27/91)

>Unfortunately I never had the guts to test my theory that it didn't
>matter if the flowchart matched a _different_ program. I can tell you for
>sure that _nobody_ ever read the charts; they were useless.

I'd be willing to bet your theory is correct: I know a guy who worked
at a Big Contractor who wrote an SDD that discussed, at one point,
non-existent conversion routines from furlongs to cubits. Nobody ever
noticed.
--
***** DISCLAIMER: The opinions expressed herein are my own. Duh. Like you'd
ever be able to find a company (or, for that matter, very many people) with
opinions like mine. 
              -- "When I want your opinion, I'll read it in your entrails."

ryer@inmet.inmet.com (03/30/91)

Long ago, someone at Intermetrics put a joke entry in the list of variables
in an 800 page formal design document for NASA.  They found it, and made us
take it out.  However, USUALLY design documents are only carefully read by 
separate IV&V contractors who only make sure that the document meets the 
standards for design documents.

Mike "Would rather look at the code" Ryer

depriest@dtoa1.dt.navy.mil (Depriest) (04/01/91)

In article <20600094@inmet> ryer@inmet.inmet.com writes:
>However, USUALLY design documents are only carefully read by
>separate IV&V contractors who only make sure that the document meets the
>standards for design documents.

I have recent experience with a major aircraft program under 2167 where
the organic IV&V agent (NWC China Lake) went over the design documents
with a fine-tooth comb and forced major improvements in the quality
of the _content_ - not just the compliance with the DID.

IMHO, the usefulness of the documentation produced by a developer is
directly related to how well the _customer_ understands what the documents
are for and how committed the _customer_ is to having them done:
   -  right,
   -  at the right time,
   -  for the right reasons.

I've been on both sides of the issue, and I know that I have a reluctance
to do things that I think have little added value to my tasking if the
customer doesn't express a real interest in them either. However, if
the customer believes in the 'why' of the tasks, I'm willing to do them
to the best of my ability as long as she's willing to pay.

This is a common issue in more than one facet of DoD contracting. We
face the same problems every time we ask for Logistic Support Analysis
(MIL-STD-1388).

- Mike DePriest (depriest@dtoa1.dt.navy.mil)
- Software Technology Branch Manager
- Naval Aviation Depot Jacksonville