"VAXTSD::SCHILLING1@atc.alcoa.COM".UUCP (06/16/87)
Reply-To: SCHILLING%ATC.ALCOA.COM@RELAY.CS.NET

This may seem incredibly naive to those who have long experience in the Defense contracting game, but it deserves saying anyway: If the benefits of software reuse are real, then --

- A product that makes significant reuse of software should be ready sooner than one that represents all new code.
- Saying that another way, a product that makes significant reuse of software could be started later in the system development cycle.
- The reliability of a product that makes significant reuse of software should be higher than that of an equivalent product that represents all new code.
- Other quality characteristics of a product that makes significant reuse of software should be much more predictable than those of one that represents all new code.
- The maintenance cost of a product that makes significant reuse of software should be lower.

If all of the above is really true, then the customer should be willing to pay MORE for a product that makes significant reuse of software than for one that represents all new code.

Pete Schilling                  CSNET: SCHILLING@ATC.ALCOA.COM
Technical Specialist            PHONE: 412/337-2724
Process Control and             PAPER MAIL: Aluminum Co. of America
Computer Technology Divn.       Alcoa Center, PA 15069
Alcoa Laboratories              USA
schimsky@NADC.ARPA (D. Schimsky) (06/17/87)
As a "Government" manager of major software developments, let me offer that most of the comments I've seen here relative to "re-use" appear academic.

1. No, I would NOT pay more for a product that takes advantage of re-use, since virtually my entire concern is to LOWER the price of software development. There is no way, short of pushing my nose in it, that I will believe a paper analysis showing me how much I will save tomorrow if only I give you more today.

2. Short of absolute mandate (translate to "I order you to..."), the detailed decisions about methods employed in the development of software are made on a very local level by the people first in line for the responsibility. These are the Project Engineers, or Program Officers, or some equivalent title. I believe I know what their opinion is concerning re-use, but it would be interesting to hear them speak.

3. By what magic do you expect an organization as large, diverse, and, sometimes, divergent as this one to adopt a methodology that begs for standardization, cooperation, flexibility, and a non-parochial attitude, when you would do well to find one small, isolated group, anywhere, doing the same?
munck@MITRE-BEDFORD.ARPA.UUCP (06/18/87)
I don't know if I've ever seen a more straightforward explanation of the DoD's Software Crisis than this:

> As a "Government" manager of major software developments, ... There
> is no way, short of pushing my nose in it, that I will believe a
> paper analysis showing me how much _I_ will save tomorrow if only I
> give you more today.

Remember that the _I_ (underlining mine) is really "the poor slob who'll be maintaining this system years after I get a job in Industry." Remember too, that EVERYBODY'S project is too important (major) to take the risk of being the first to use any new technique.

Of course, with FORTRAN, JOVIAL, CMS-2, and/or the Chinese Army approach to programming, there's no risk. We KNOW that it will be late, over budget, and buggy. No surprises, no risks.
                                        -- Bob Munck, MITRE
dday@mimsy.UUCP (Dennis Doubleday) (06/25/87)
In article <4658@utah-cs.UUCP> shebs@utah-cs.UUCP (Stanley Shebs) writes:

>An interesting analogy. It says a lot about prevailing software culture:
>
>1. Available chips do not always meet requirements exactly. For instance,
>a board might need 3 NAND gates, but the 7400 has 4. EEs just ignore the
>extra gate, or tie its pins to something stable. In a similar situation,
>software people fume and gnash their teeth over "wasted space".

I've seen this standardized-circuits analogy to software packages a number of times, and I don't really buy it. Software people are attempting to deal with a much larger and less well-defined problem domain than hardware people. Otherwise, we wouldn't even need software. We could just design hardware to handle every application.

>2. Running wires around boards loses some performance, relative to cramming
>everything onto a single chip. All the techniques for modules, objects, etc,
>tend to slow things down. Again, software types tear their hair out and
>vow to recode everything into one assembly language procedure.

Performance is an important issue in many time-critical applications. I don't know anybody who wants to code everything in assembler. I do know people who are WILLING to code in assembler if it's the only way timing constraints can be met.

>In short, I believe there are no technical problems or issues with reuse;
>it's the software culture that has to change. At present, the prevailing
>attitude is that the densely-coded, highly-optimized, do-everything program
>is a sort of ideal to which everyone aspires.

I don't think you're up to date on what's happening, at least in the Ada community. I just completed a 13.5K source line Ada program, of which 7.5K source lines were contributed by reusable utility packages that I got from the Ada repository (abstract string, stack, list, counted ordered set, and binary tree data types, as well as packages for command line interface, lexical analysis of Ada, and parsing).
--
UUCP: seismo!mimsy!dday                         Dennis Doubleday
CSNet: dday@mimsy                               University of Maryland
ARPA: dday@brillig.umd.edu                      College Park, MD 20742
Fan of: Chicago Cubs, Chicago Bears, OU Sooners (301) 454-6154
pase@ogcvax.UUCP (07/07/87)
In article <glacier.17113> jbn@glacier.UUCP (John B. Nagle) writes: > > The trouble with this idea is that we have no good way to express >algorithms "abstractly". [...] Well, I'm not sure just where the limits are, but polymorphic types can go a long way towards what you have been describing. It seems that a uniform notation for operators + the ability to define additional operators + polymorphically typed structures are about all you need. Several functional languages already provide an adequate basis for these features. One such language is called LML, or Lazy ML. Current language definitions tend to concentrate on the novel features rather than attempt to make LML a full-blown "production" language, and therefore may be missing some of your favorite features. However, my point is that we may well be closer to your objective than some of us realize. I apologize for the brevity of this article -- if I have been too vague, send me e-mail and I will be more specific. -- Doug Pase -- ...ucbvax!tektronix!ogcvax!pase or pase@Oregon-Grad.csnet