[comp.software-eng] Software Engineering Digest v5n37

soft-eng@MITRE.ARPA (Alok Nigam) (10/13/88)

Soft-Eng Digest             Wed, 12 Oct 88       Volume 5 : Issue 37

Today's Topics:
                    ?Relationship of OOP and JSD?
                   Cost / Schedule estimation tools
                 Cynic's Guide to SE #6: Forthcoming
              Cynic's Guide to SE #6: Forthcoming Revolt
            Description of software development practices
                     Formal specification techniques
                         Formal Requirements
                        Glass tty's vs. micros
                              KnowledgeWare
                    PDL Processor Wanted (2 msgs)
----------------------------------------------------------------------

Date: 7 Oct 88 16:54:26 GMT
From: ai.etl.army.mil!mike@ames.arpa  (Mike McDonnell)
Subject: ?Relationship of OOP and JSD?

We here at ETL are embarking on a medium-sized software development
project.  We want to use "structured techniques", but we need help.  I
like JSP and JSD because of Jackson's "entities", which seem to be
something like the "objects" in C++, Flavors, and the Common Lisp
Object System (CLOS).

It seems to me that what Jackson was after was a sort of object-oriented
programming [OOP] method that he developed using the tools of his day
(early 70's I think).  His idea of modeling objects rather than
procedures is attractive, but the implementation seems complicated.
Would it be possible to do a simpler and more natural implementation of
Jackson's ideas using a more modern programming language?  Also, the
languages available to me (C, C++, Common Lisp and CLOS) don't have
coroutine linkages.  I'm not sure how to simulate them.  I also don't
understand what he means by "inverting a program", but it seems
unnatural.
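
My best guess at inversion is something like the following sketch (in C, my
own reconstruction, certainly not Jackson's notation): a consumer that
"wants" its own read loop is turned inside out into a subroutine called
once per record, with its position in the loop saved between calls.

    #include <stdio.h>

    /* Input structure, as Jackson might diagram it:
         FILE = HEADER ; DETAIL*     (a sequence, then an iteration)
       Uninverted, the consumer would own a read loop.  Inverted, it is
       a subroutine fed one record per call; 'state' remembers where in
       the structure it was suspended. */
    enum where { EXPECT_HEADER, EXPECT_DETAIL };
    static enum where state = EXPECT_HEADER;
    static int details = 0;     /* consumer locals survive between calls */

    void consume(const char *record)
    {
        switch (state) {
        case EXPECT_HEADER:
            printf("report: %s\n", record);
            state = EXPECT_DETAIL;      /* "suspend" after the header */
            break;
        case EXPECT_DETAIL:
            details++;
            printf("  detail %d: %s\n", details, record);
            break;                      /* stay in the iteration */
        }
    }

    int main(void)
    {
        consume("October summary");     /* the producer drives the pair */
        consume("first item");
        consume("second item");
        return 0;
    }

If that is what inversion amounts to, it is exactly the sort of state-saving
bookkeeping that objects, or a real coroutine linkage, would do for you.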

I need to know more before I can commit to a design method.  Are my
conclusions about the relationship between JSD and OOP justified?  If
there is a separate discipline of program and system design that has
evolved to the maturity of JSP and JSD and is based on modern concepts
of OOP, what is it?  So many questions, so little time...

Apologies if this has been thrashed out here already, but I am a new
reader of this interest group.

------------------------------

Date: Mon, 10 Oct 88 10:00:55 SET
From: Nigel Head <ESC1111@ESOC.BITNET>
Subject: Cost / Schedule estimation tools

After some investigation of the available tools/methodologies, and based on
some 10 years' experience of doing the job by hand (only for medium-sized
projects, though), I have a couple of comments which may interest somebody (!).

Many (most?) of the proposed methodologies seem to start from LOC (lines
of code) and work on from there, attempting to evaluate the influence of
various environmental factors (technology levels, staff capabilities, etc.) to
predict implementation costs and schedule bounds for the given LOC.

So - firstly, the answers you get are only as good as the LOC estimate you
feed in (and getting even within a factor of 2 on LOC is a VERY difficult
job for any reasonably sized project); secondly, the specification of the
environmental factors has a very large impact on the results that come out:
with the COCOMO models I can get a factor of 2 or 3 variation without having
to make any outrageous modifications to some of these (subjective)
judgements.
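
To make that concrete, here is the arithmetic involved (a minimal sketch in
C: the a and b constants are Boehm's published organic-mode values for
intermediate COCOMO, but the 50 KLOC size and the two EAF figures are
invented purely to show the swing):

    #include <stdio.h>
    #include <math.h>

    /* Intermediate COCOMO: person-months = a * KLOC^b * EAF, where EAF
       is the product of 15 subjective cost-driver multipliers. */
    double cocomo_effort(double kloc, double eaf)
    {
        const double a = 3.2, b = 1.05;     /* organic mode */
        return a * pow(kloc, b) * eaf;
    }

    int main(void)
    {
        double kloc = 50.0;                 /* hypothetical project */
        /* Two defensible sets of ratings for the same team and job: */
        printf("optimistic  (EAF 0.7): %4.0f PM\n", cocomo_effort(kloc, 0.7));
        printf("pessimistic (EAF 1.6): %4.0f PM\n", cocomo_effort(kloc, 1.6));
        return 0;
    }

Neither set of ratings is outrageous, yet the two estimates differ by a
factor of about 2.3.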

To use these things at all safely it seems to me that one would have to
calibrate the model VERY carefully with historical information from ONE'S OWN
project team working on an IDENTICAL type of development - in other words, to
use one's own experience - so why do we need a tool other than a spreadsheet?

After being negative about all of this, I must say that I have one positive
feeling (which I am currently trying to work on), which is that I am pleasantly
surprised by the accuracy of the Function Point Analysis approach to getting
the LOC figures in the first place. I have compared the results of an FPA
estimate of some of our past developments with the actual LOC which resulted.
The results are well within the sort of accuracy which I try to quote at
the Requirements Definition stage (plus/minus about 15%) and, for some reason
which I don't quite understand, the subjective factors seem to have less
influence. This is based on a demo version of a PC tool called 'Before You
Leap' (cute, huh?) - I haven't yet managed to track down the original IBM
manual which defines the model itself (can anybody help?).
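
For anyone who hasn't met FPA, the basic arithmetic goes roughly like this
(a minimal sketch in C: the weights are Albrecht's average-complexity
figures, but the counts, the GSC total and the LOC-per-FP ratio are invented
for illustration and would themselves need calibrating):

    #include <stdio.h>

    int main(void)
    {
        /* Counts for a hypothetical system, all at average complexity: */
        int inputs = 20, outputs = 15, inquiries = 10, files = 8, ifaces = 2;
        double ufp = 4*inputs + 5*outputs + 4*inquiries + 10*files + 7*ifaces;

        /* 14 general system characteristics, each rated 0..5; suppose
           they total 30.  The adjustment can move the count by +/-35%. */
        double vaf = 0.65 + 0.01 * 30;
        double fp  = ufp * vaf;

        double loc_per_fp = 105.0;          /* illustrative 3GL figure */
        printf("%.0f adjusted FP -> roughly %.0f LOC\n", fp, fp * loc_per_fp);
        return 0;
    }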

Again - I expect careful calibration would be required before using the thing
with real money (or my neck) at stake. Does anyone have comments, experience,
or other contributions to a discussion of FPA, and of the complexity of the
data which would need to be gathered to calibrate it?

I would be very interested in any opinions in this area. I don't see that
we'll be able to continue doing things by hand much longer, and it's important,
given management's tendency to treat anything issuing from a computer as
absolute truth, that whatever tools we use be realistic and that we know how
much weight to place on the tool's opinion.

------------------------------

Date: 7 Oct 88 02:26:00 GMT
From: a.cs.uiuc.edu!m.cs.uiuc.edu!wsmith@uxc.cso.uiuc.edu
Subject: Cynic's Guide to SE #6: Forthcoming

>A good rule of thumb I've observed is that anytime a project can
>be done by one or (maybe) two people, a PC is probably your best choice.

I don't believe this, because of the lack of memory management on all but the
most recent PCs.  If a bug causes my system to crash because of wild pointers
(which are easy enough to create on a Unix system and even more so on
segmented architectures like the 80x86), I lose a good bit of time waiting
for the system to reboot.
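
The sort of bug I mean takes only a couple of lines (a minimal, deliberately
broken sketch in C):

    #include <stdio.h>

    int main(void)
    {
        int *p;         /* never initialized - points who-knows-where */
        *p = 42;        /* wild store: on Unix, a segmentation fault and
                           a core dump; on an unprotected PC, possibly
                           silent corruption and then a hung machine */
        printf("never reached\n");
        return 0;
    }

On Unix that dies politely and leaves evidence; on most PCs it gets to
scribble first.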

Also, it is quite easy for one person to develop a project that won't fit
within the memory constraints of a PC (a problem which a large virtual
address space would solve).

I agree with your premise that the tools should be selected so that they are
appropriate for the problem at hand.  My current project is 20k lines and 200+
source files, and I think that without the simple facility of a core dump to
save the state of the program when a fatal runtime error occurs, I would still
be focusing on debugging instead of on future enhancements.  Also, the
project is big enough that each module has its own directory.  I don't
have much respect for PC make facilities, especially since a PC's limited
ability to run concurrent processes gets in the way of the nested makes
needed for the nested directory structure.

------------------------------

Date: Mon, 10 Oct 88 06:05:24 CDT
From: Scott Guthery <spar!ascway!guthery@decwrl.dec.com>
Subject: Cynic's Guide to SE #6: Forthcoming Revolt

While we're on the topic of empowering students with PCs, I might observe
that the rot that destroyed our secondary schools seems to be spreading
rapidly to our universities.  Judging from the computer science degrees we
see, today's graduates have less to offer, and expect to have more done for
them, than those of 5 or 10 years ago.  Generally speaking, their knowledge
is of the Byte magazine variety and has a historical perspective of about
6 months.  Dumping more Macs and PCs on 'em will only make matters
worse.  Whatever happened to books and libraries?  Knowledge is not
a desk accessory nor will it ever be.  Like most revolts, the forthcoming
one will be firmly founded in ignorance and stupidity. Powerful tools
(technical, economic, or social) in hands connected to empty heads
make for big fun ... stay tuned.

------------------------------

Date: 9 Oct 88 03:33:59 GMT
From: wucs1!wuibc!gmat@uunet.uu.net  (Gregory Martin Amaya Tormo)
Subject: Description of software development practices

         I have to do a short (5 page) report on software engineering as it
affects the MIS department.  I am hereby requesting first-hand experiences
on how YOU manage software development at your place of business.  I am
interested in:

        1) What role does software development play in your company?

        2) How does software development fit into your MIS department?

        3) What strategy does your MIS department use for software
                development?

        4) What are the stakes your MIS department must consider (costs,
                benefits, risks)?

        I am also interested in how CASE tools fit into your work
environment, what you do with project management, and whether you deal with
user groups within your company in your system development.

        I want to write a report based on real-world experience (since our
prof is always telling us about his real-world experiences, snoozzzzzz).

        Please mail any responses (just a few paragraphs at the most,
please) to the following email address, not the posting reply address.


                David Deitch, Computer Connection
                dwd0238@wucec1.wustl.bitnet
                Fido 1:100/22

------------------------------

Date: 7 Oct 88 19:26:02 GMT
From: mcvax!ukc!etive!lfcs!sean@uunet.uu.net  (Sean Matthews (AI))
Subject: Formal specification techniques

I am interested to find out how much interest there is in formal program
specification techniques around the world.

This is where the formal description of what the program does is
separated from how it does it, so that instead of being lost in the
detail of implementation, the designer can concentrate on the
concepts and then prove, using mathematics, that the specification
is free from contradictions, does what it is supposed to do, and has
various other properties.
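
To give the flavour of the separation (a minimal sketch of my own, in C
with assertions standing in for real VDM or Z notation): the specification
below says only WHAT the result must satisfy; the loop is one of many
possible HOWs, and is checked against the spec at run time.

    #include <assert.h>
    #include <stdio.h>

    /* Specification of integer square root:
         pre:   n >= 0
         post:  r*r <= n < (r+1)*(r+1)
       Nothing above says how r is to be found. */
    int isqrt(int n)
    {
        int r = 0;
        assert(n >= 0);                              /* precondition */
        while ((r + 1) * (r + 1) <= n)               /* one possible "how" */
            r++;
        assert(r * r <= n && n < (r + 1) * (r + 1)); /* postcondition */
        return r;
    }

    int main(void)
    {
        printf("isqrt(10) = %d\n", isqrt(10));       /* prints 3 */
        return 0;
    }

The point of VDM or Z is that one reasons about the pre- and post-conditions
themselves, before any loop exists to test.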

There are two ways that are normally used to do this (apart from the
slightly suspect `specification' languages like logic and functional
programming languages):

there is the logic and sets approach of systems like

        VDM (the Vienna Development Method)
and
        Z (developed by the Programming Research Group in Oxford).

Then there is the algebraic approach of languages like Clear.

These techniques are in use at the moment in some parts of the world for
real development projects.  But how widely are they used?  I have heard
that IBM's Federal Systems Division uses formal specification techniques in
its `clean room'.

It is my impression that the set-theoretic approach is more common at the
moment, and I think there is a lot of work going on quietly that I have not
heard about.

Mail me if you have experience of this sort of technique, if you know
of examples, or if you just have thoughts on the area; if there is
sufficient response I will post a summary.

------------------------------

Date: 5 Oct 88 17:36:50 GMT
From: oliveb!intelca!mipos3!omepd!psu-cs!warren@ames.arpa  (Warren Harrison)
Subject: Formal Requirements

I am currently studying the use of formal requirements methodologies and would
be interested in hearing from anyone whose company uses one.  In particular
I'd like to know which one you're using, why, and if you currently have
automated support for it.  If sufficient interest is shown I will post a
summary.  I'd also be interested in learning the same about specifications.

------------------------------

Date: Tue, 11 Oct 88 01:46:59 PDT
From: Mike (I'll think of something yet) Meyer <mwm@violet.Berkeley.EDU>
Subject: Glass tty's vs. micros

Glass ttys? Why not just make people use card punches? :-)

Seriously, a state-of-the-art micro(*) pretty much makes a glass tty
tied to a supermini look pitiful(**) as a software development
environment - just so long as you've only got a few people working on
the project. More than that, and the problems of sharing files on a
micro begin to eat you alive.

But why settle for either, when you can have the best of both worlds?
Put an ethernet card in the micros, and let your Unix box play file
server for the group. That way, you can choose from anywhere in the
spectrum from "development on a micro" to "development on a mainframe
- using bitmapped display tubes running local window managers". To a
degree, each developer can also pick a place on the spectrum that
suits them best.

        <mike

(*) A state-of-the-art micro (meaning: the best stuff I could go out
and buy now) would be a 32-bit processor running at 16MHz or faster,
8Meg or more of 32-bit wide memory, a couple of hundred meg of SCSI
disk, and a megapixel display. Software could be Unix, or something
that's learned from the last 20 years of OS research and has most of
the Unix tools - or replacements that have been improved by experience
- running on it.

(**) Not because the tools are better. They may be, but not by a great
deal. It's just that emacs on a glass tty is a lousy substitute for
real windows on a bitmapped display.

------------------------------

Date: 9 October 88 16:19-PAC
From: COSTEST@IDUI1.BITNET
Subject: KnowledgeWare

If you're looking for some printed info on KnowledgeWare's
Information Engineering Workbench/Analyst Workstation look
in the March 1988 issue of IEEE Software page 97.  I wrote
a product review that appeared in the Software Reviews column.

In summary, I was favorably impressed with the product.  Since
that time I have obtained a copy of the Design Workstation but haven't
had much time to "play" with it.  I recently talked with one of
the KnowledgeWare marketing people, who advised me to try to get my
hands on an 80386 machine if I intend to do much serious work with
the tools.  What little experience I have with the Design WS suggests
that accessing some of the diagrams (a data structure diagram in
particular) can become a significant time sink on my PC/AT.

I would also say that their use of color windows is nice and they
do seem to have good product support people available by phone.

------------------------------

Date: 10 Oct 88 23:59:46 GMT
From: mailrus!uflorida!haven!vrdxhq!verdix!qtc!marc@rutgers.edu  (Marc Frommer)
Subject: PDL Processor Wanted

I have been using Program Design Languages (i.e., pseudo-code) during the
design phase for the last few years, and am interested in whether anyone knows
of any automated processors.  I would like the processor to take care of
indenting on printouts, compiling cross-reference lists of modules, producing
hierarchy charts of modules, etc.  Any information regarding packages that run
on either UNIX or VMS would be appreciated.

------------------------------

Date: 11 Oct 88 15:23:25 GMT
From: nascom!rar@gatech.edu  (Alan Ramacher)
Subject: PDL Processor Wanted

Caine, Farber & Gordon, 1010 East Union St, Pasadena, CA 91106, (818) 449-3070,
have a product (PDL) that appears to fit your needs.  I used it several
years ago and found it useful.  Their literature would indicate that
the product has continued to improve.  Hope this helps.

------------------------------

End of Soft-Eng Digest
******************************