[comp.software-eng] Soft-Eng Digest V4 #7

MDAY@XX.LCS.MIT.EDU (Moderator, Mark S. Day) (02/11/88)

Soft-Eng Digest             Wed,  10 Feb 88       Volume 4 : Issue  7

Today's Topics:
                     Contest for an ETA10 Model P
               Coordinating Software Development (2 msgs)
      Correctness in parallel and distributed systems. (2 msgs)
                          Correctness Proofs
                         Cubicles vs. offices
                  Is it Art or Engineering? (4 msgs)
                       Workstation productivity
----------------------------------------------------------------------

Date: 3 Feb 88 15:32:51 GMT
From: cgs@umd5.umd.edu  (Chris Sylvain)
Subject: Contest for an ETA10 Model P

[ This is from the Jan. 7 issue of _Electronic Design_, in the section
  entitled "Design Alert" ]

	The Promise of Supercomputers Lures Future Designers

By 1990, well over 1000 supercomputers will be operating worldwide, compared
to 300 today. To prepare the next generation of designers to run these
powerful machines, ETA Systems, of St. Paul, Minn., is sponsoring a nationwide
science competition for high school students. An ETA10 Model P supercomputer
will be installed at the school attended by the winners.

Entrants work as teams made up of three students and a teacher coach to devise
scientific problems and solutions that call for a supercomputer's horsepower.
The four finalist teams will attend a seven-week learning institute this
summer at ETA's headquarters in St. Paul. After receiving supercomputer
training, the finalists, who will receive $3000 stipends (their teachers a
$7000 stipend), will run their computational solutions on the ETA10-P
supercomputer. The winning team will be announced during the last week.
Additional prizes will be awarded to the finalist teacher coaches, team
members, and schools.

For more information, call Marcia Shilling at ETA at (612) 853-6538.       

[ Just think: what if a high school near you won the contest, and they
  entered due to your urging?  Selfish motivation, yes, but the prize... ]

   ARPA: cgs@umd5.UMD.EDU     BITNET: cgs%umd5@umd2
   UUCP: ..!uunet!umd5.umd.edu!cgs

------------------------------

Date: 2 Feb 88 14:22:21 GMT
From: mcvax!ukc!stc!datlog!dlhpedg!cl@uunet.uu.net  (Charles Lambert)
Subject: Coordinating Software Development

We're not really in dispute, here.  I'm sure that the engineering and
manufacturing disciplines have produced a wealth of good terms;  I'm simply
not aware of them all nor convinced that the ones I know are most appropriate.

I have three criteria for a good term: it should be concise, unambiguous
and evocative.

>We call that a "build tree" or, more specifically, a "build
>directory-tree".  (I don't know if Merriam Webster would approve of
>using dual attributive nouns
> ...
>We call the private instances "private build-trees" or "experimental
>build-trees".

This is just the kind of linguistic forest I want to avoid; it makes for
congested documents and long-winded speech.  Any noun requiring that degree
of qualification is ambiguous, and the resulting term is not concise.

>It seems to me that the term "view" is just as bad (if not worse) than
>the term "build".  The word "view" also has several meanings...
>...or it can be a projection of an object on an engineering drawing.

"View" seems evocative for just that reason.  It is the projection of two or
more (sparsely populated) "builds" to produce a superimposed collection.  A
"build" is to a "view" what a scalar element is to a vector. This is
essential to the way we share stable (or beta-testable) sources.
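
To make the superimposition concrete, here is a minimal sketch in C; the
directory names and the two-layer search are made up, and a real view layers
several build-trees and does rather more bookkeeping.  A file is resolved by
searching the private build-tree first and falling back to the shared stable
one.

#include <stdio.h>
#include <unistd.h>

/* Return the first tree under which 'relpath' exists, searching in order.
 * Trees earlier in the list shadow later ones, which is the whole point of
 * a view: a sparse private build-tree overlays the shared stable one.
 */
static const char *resolve(const char *relpath, const char **trees,
                           int ntrees, char *buf, size_t buflen)
{
    int i;

    for (i = 0; i < ntrees; i++) {
        snprintf(buf, buflen, "%s/%s", trees[i], relpath);
        if (access(buf, F_OK) == 0)
            return buf;          /* found in the highest-priority layer */
    }
    return NULL;                 /* not present anywhere in the view */
}

int main(void)
{
    const char *view[] = { "/proj/builds/private/cl", "/proj/builds/stable" };
    char path[1024];
    const char *hit = resolve("parser/gram.c", view, 2, path, sizeof path);

    printf("%s\n", hit ? hit : "not found in view");
    return 0;
}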

>By the way, we avoid maintaining multiple, actively
>used, versions of a given module's code whenever possible.  We have
>found that it is very difficult to propagate bug fixes common to
>several product lines to each product without having to "un-fix" some
>bugs that are unique to a product.

Avoiding multiple concurrent changes to the same module is ideal - an ideal
we are unable to maintain in practice.  The mechanisms of source-sharing
and "reconciliation" in our prototype system are intended to take the pain
out of propagating fixes amongst product lines and amongst concurrent
developments of the same line.

Most of the responses I have received to the latter idea suggest that it is
hopelessly ambitious.  Our experience doesn't yet support that view.  There
are some subtle complexities in merging concurrent developments but we
are presently managing by a combination of automation (using SCCS delta
inclusion) and assisted inspection.  We have not yet exhausted the potential
for automatic assistance.

Charles Lambert		Data Logic UK, Harrow, England.

------------------------------

Date: 4 Feb 88 22:55:08 GMT
From: brspyr1!tim@itsgw.rpi.edu  (Tim Northrup)
Subject: Coordinating Software Development 

|> Okay, suppose your company is working on the next release of a product
|> already in the field.  An important customer reports a critical bug,
|> ...
|> 
| RCS supports this very nicely...

Yes, RCS and SCCS can both do this, but it doesn't solve the problem.

What happens is that the ongoing work for the next release already has
some things changed (enhancements, fixes, whatever).  It is often very
difficult to make a "quick fix" for a customer in the field once new
release work begins without getting unwanted/untested code into it.

Once new release work has begun, unless you have a parallel set of
code which contains only the currently released product, you need
to extract an older revision of EVERYTHING that has changed from
RCS/SCCS/whatever, not just the module you are making a fix to.

For this reason, we maintain separate areas for each release.  Yeah, it
takes more disk space, and yeah you have to make the fix in more than
one place (the release the customer has, and the next release), but
it requires a lot less work than extracting back revisions of everything
from SCCS, making the fix, recompiling, and magically merging the
fix back into SCCS (and praying that it merged correctly).

-- Tim
-- 
tim@brspyr1.BRS.Com
uunet!steinmetz!brspyr1!tim

Tim "The Enchanter" Northrup  

------------------------------

Date: 4 Feb 88 02:18:55 GMT
From: dgreen@locus.ucla.edu  (Dan Greening)
Subject: Correctness in parallel and distributed systems.

In article <899@hubcap.UUCP> steve@hubcap.UUCP ("Steve" Stevenson) writes:
>I would like to find a reasonably up-to-date bibliography on
>proving parallel programs correct.  I would also like same for
>distributed programs, say for the huge hypercube numerical programs being
>developed.

Can't give a bibliography, but Leslie Lamport is working on some
things related to this.  He appeared at a UCLA seminar last year.  He
claimed to be writing a book on semantics of parallel programs.  If
this feeble brain recalls correctly, he used a control flow model of
computation.

Data flow and functional approaches to parallelism produce programs
that are much easier to discuss in a theoretic framework.  I suggest
you look at

  J. Backus, Can Programming be Liberated from the von Neumann Style?
  A Functional Style and Its Algebra of Programs, Communications of the
  ACM, 21(8):613-641 (August 1978).

This classic article supplies an interesting set of program
transformations that can assist in proving functional programs
correct.
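
For a flavour of the notation (quoted from memory, so treat the details with
caution): the paper's running example defines inner product by composing
three primitives, and proofs about such definitions proceed by equational
rewriting with the algebra's laws.

% Inner product in FP: insert-plus, apply-to-all-times, transpose.
\[ \mathrm{IP} \;\equiv\; (/\!+) \circ (\alpha\,\times) \circ \mathrm{Trans} \]
% Laws used in proofs, e.g. the meaning of composition and the
% distribution of apply-to-all over it:
\[ (f \circ g):x = f:(g:x), \qquad
   \alpha f \circ \alpha g = \alpha(f \circ g) \]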

The Science Citation Index will provide a list of successor
articles.

A major problem with control-flow (i.e., "normal" multiprocessing)
and any other non-applicative system is the indeterminacy of memory
stores and fetches.  This also applies to some data flow languages,
though in a much more restricted sense.  "Id Nouveau" from MIT comes
immediately to mind.  In that language, non-strict evaluation can
cause deadlock problems.  But I'd much rather prove an Id Nouveau
program correct than a Concurrent Pascal program correct.  Yikes!
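
As a small illustration of that indeterminacy (a sketch only, using POSIX
threads as a stand-in for any shared-memory multiprocessor): two threads
updating a shared counter without synchronization give a result that varies
from run to run, and nothing in the program text pins it down.

#include <pthread.h>
#include <stdio.h>

static long counter = 0;        /* shared store, updated without any locking */

static void *bump(void *arg)
{
    int i;

    (void)arg;
    for (i = 0; i < 1000000; i++)
        counter++;              /* read-modify-write race: interleaving is
                                   indeterminate */
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;

    pthread_create(&t1, NULL, bump, NULL);
    pthread_create(&t2, NULL, bump, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    /* 2000000 if the increments never interleaved; typically it prints less. */
    printf("counter = %ld\n", counter);
    return 0;
}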

Good luck!

Dan Greening  Internet  dgreen@CS.UCLA.EDU
              UUCP      ..!{sdcrdcf,ihnp4,trwspp,ucbvax}!ucla-cs!dgreen
	      USPS      3436 Boelter Hall / UCLA / Los Angeles, CA 90024-1596

------------------------------

Date: 4 Feb 88 13:27:14 GMT
From: steinmetz!macbeth!kerpelma@itsgw.rpi.edu  (dan kerpelman)
Subject: Correctness in parallel and distributed systems.

This is a little old but...

Susan Owicki, David Gries, _Verifying Properties of Parallel Programs: An
Axiomatic Approach_, Communications of the ACM, Volume 19, Number 5, May 1976.
 

Dan Kerpelman                         ARPAnet: kerpelman@ge-crd.arpa
GE Corporate R&D                      UUCP   : crd!kerpelman@steinmetz.UUCP
Schenectady, NY                       GEnet  : ctsvax::kerpelman
USA                                   phone  : (518) 387-5086

------------------------------

Date: 6 Feb 88 18:39:02 GMT
From: nuchat!steve@uunet.uu.net  (Steve Nuchia)
Subject: Correctness Proofs

I'm a member of the "program after proof" camp, but having the
computer do the complete proof in all its gory detail is probably
a good idea since I seldom do more than convince myself that my
proof sketches are right.

In my experience though, the volume of detail in the proof isn't
the problem.  The problem is the lack of a formal statement of
the specification - it's damned hard to prove "user friendly"
or "secure", for instance.  Proving that all your searches search
and your sorts sort might make you feel better, but until you
prove that the overall program meets its objective you might
as well sit under a coconut tree.  Such a proof requires a
formal statement of the objective, and I have never seen one
of those for a significant program.
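
Even the easy cases take some care to state.  A formal objective for the
sort example might read like this (a sketch; perm(A, A_0) asserts that A is
a rearrangement of A_0):

\[ \mathit{sorted}(A) \;\equiv\;
   \forall i.\ 1 \le i < n \Rightarrow A[i] \le A[i+1] \]
\[ \{\, A = A_0 \,\}\ \ \mathit{sort}(A)\ \
   \{\, \mathit{sorted}(A) \wedge \mathit{perm}(A, A_0) \,\} \]

That much is tractable; nothing comparable has ever been written down for
"user friendly".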

In summary, I think automatic proof verifiers would be in common,
if not wide, use _now_ if there were something useful for them
to prove.  And having the verifiers won't (directly) do anything
to bring that about - work on specification languages and
methodologies might.
-- 
Steve Nuchia	    | [...] but the machine would probably be allowed no mercy.
uunet!nuchat!steve  | In other words then, if a machine is expected to be
(713) 334 6720	    | infallible, it cannot be intelligent.  - Alan Turing, 1947

------------------------------

Date: 1 Feb 88 17:40:11 GMT
From: mnetor!utzoo!dciem!nrcaer!cognos!bobb@uunet.uu.net  (Bob Barr)
Subject: Cubicles vs. offices

I appreciate your dilemma.  About 2 years ago our company
planned to build a new building, which we now inhabit.  It was
announced that the "open office" concept was to be used.  This
is the architectural propaganda for "cubicles -- no offices".  I
was certain that the distraction factor would drive down my
productivity, and that of everyone who worked for me.  Hence, I
did a little research at 2 local university libraries, trying to
locate studies done on the issue of how cubicles/offices affect
productivity.  What I found -- mostly in architectural journals
-- addresses your questions and frustration.  Briefly, the
studies collectively seem to conclude that:

>  "open offices" are good for clerical workers, particularly
secretaries, who need to see a lot of people during their day,
and who don't do much work that requires intense concentration.
>  open offices are poor for the morale of all workers,
particularly because they feel a lack of privacy.
>  the major factor that determines people's degree of
satisfaction with their work environment is how much physical
space they have (or appear to have -- a good illusion often
helping immensely, which I can attest to since I have begun
sitting beside a large window).
>  the major reasons for adopting an open-office concept are
cost-saving reasons: 1) it is cheaper to light an open office
because the wiring is simpler and the diffusion effect from
central lights reduces the number of individual lighting units
needed, 2) it is cheaper to heat (and air-condition) cubicled
offices, because there is less ductwork that needs to be
installed in the building and air circulates more freely -- no
doubt a major consideration here in the Great White North, and
3) it is allegedly cheaper to maintain the furnishings with
cubicles, because (in the long run) when the need to change how
the work areas are laid out occurs, cubicles can be dismantled
and re-shuffled much more easily than knocking down walls and
putting up new ones.
>  No one mentions the tax advantages.  I know nothing about the
tax situation myself.

No one in the studies I looked at tackles the issue of
productivity for professional workers head-on -- probably
because of the difficulty in measuring white-collar
productivity.  They hint that it is counter-productive for
people who need to concentrate, and that people leave their
desks a lot to gain privacy, but no one seems willing to try to
measure how much disgruntlement affects productivity.

About a year ago I changed jobs and departments.  I had to move
into a smaller workspace (another cubicle).  Since I had less
storage space for documents, I was forced to throw out all the
articles I had gathered on this subject, so I can't give you my
bibliography.  It may help you to know, however, that there are
articles out there in architectural journals on this subject.

By the way, I presented all my findings to the person in charge
of designing the new building.  They were ignored.

I wish you well in convincing your people that cubicles are a
mistake.

------------------------------

Date: 6 Feb 88 01:36:35 GMT
From: agate!garnet.berkeley.edu!csm@ucbvax.Berkeley.EDU  (Brad Sherman)
Subject: Is it Art or Engineering?

There seems to be a perception among programmers that
current software development is not really "engineering."

What do "real" engineers do that we do not?

Is there anything in programming that is analogous to
the term "tolerance" in engineering?

	Brad Sherman - (You See Bee)

------------------------------

Date: 6 Feb 88 17:59:53 GMT
From: apte@cs.duke.edu  (jitendra apte)
Subject: Is it Art or Engineering?

How about the following:

Designing and writing programs that are expected to yield acceptable results
most of the time, but not guaranteeing good results for all possible inputs.
This is a typical attitude when writing programs to solve problems which are
known to have very time-consuming exact solutions, but which can probably be
solved to near exactness using heuristic methods (NP problems and the more
expensive P problems).
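
A minimal sketch (in C, with made-up numbers) of that attitude: a greedy
heuristic for the NP-hard 0/1 knapsack problem.  It runs in about n log n
time and is usually acceptable, but nothing guarantees it finds the exact
optimum for every input.

#include <stdio.h>
#include <stdlib.h>

struct item { double weight, value; };

static int by_density_desc(const void *a, const void *b)
{
    const struct item *x = a, *y = b;
    double dx = x->value / x->weight, dy = y->value / y->weight;

    return (dx < dy) - (dx > dy);   /* descending by value per unit weight */
}

int main(void)
{
    struct item items[] = { {10, 60}, {20, 100}, {30, 120}, {15, 45} };
    int n = sizeof items / sizeof items[0];
    double capacity = 50, load = 0, value = 0;
    int i;

    qsort(items, n, sizeof items[0], by_density_desc);
    for (i = 0; i < n; i++)         /* take each item greedily if it fits */
        if (load + items[i].weight <= capacity) {
            load += items[i].weight;
            value += items[i].value;
        }
    printf("heuristic value = %.0f (the exact optimum may be higher)\n",
           value);
    return 0;
}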

jitendra apte.

------------------------------

Date: 7 Feb 88 02:51:53 GMT
From: xanth!kent@mcnc.org  (Kent Paul Dolan)
Subject: Is it Art or Engineering?

Real engineers build bridges that work when delivered.

If by tolerance you mean the kind you measure with micrometers, sure.
That is what all those convergence criteria are in numerical analysis.
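
For instance (a throwaway sketch): iterate until successive estimates agree
to within the tolerance you chose, exactly as you would stop lapping a part
once it measures within spec.

#include <stdio.h>
#include <math.h>

/* Newton's method for sqrt(a), a >= 0, stopping on a chosen tolerance. */
static double newton_sqrt(double a, double tol)
{
    double x = a > 1.0 ? a : 1.0;        /* crude starting guess */

    for (;;) {
        double next = 0.5 * (x + a / x); /* Newton step for f(x) = x*x - a */
        if (fabs(next - x) < tol)        /* convergence criterion: the
                                            "tolerance" */
            return next;
        x = next;
    }
}

int main(void)
{
    printf("sqrt(2) ~= %.12f\n", newton_sqrt(2.0, 1e-12));
    return 0;
}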

If you mean the wooden beams in some old houses built to twelve times the
needed strength, then again yes.  We do it when each routine checks its
inputs, even when they are created within another subroutine of the same
program, and when we check the return code on the "close()" call in C,
which folklore says never fails, but which has actually been known to do
so.  We call it paranoia!  ;-)
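
In code, the paranoia looks something like this (a sketch; the file name and
message text are made up).  close() can fail, for example when a networked
file system reports a deferred write error, so the paranoid routine checks
it like any other call.

#include <stdio.h>
#include <errno.h>
#include <string.h>
#include <unistd.h>
#include <fcntl.h>

/* Write 'text' to 'path', returning 0 on success and -1 on any failure,
 * including a failure reported only at close() time. */
int write_report(const char *path, const char *text)
{
    int fd = open(path, O_WRONLY | O_CREAT | O_TRUNC, 0644);

    if (fd < 0)
        return -1;
    if (write(fd, text, strlen(text)) < 0) {
        close(fd);
        return -1;
    }
    if (close(fd) < 0) {            /* the "never fails" call, checked anyway */
        fprintf(stderr, "close %s: %s\n", path, strerror(errno));
        return -1;
    }
    return 0;
}

int main(void)
{
    return write_report("/tmp/report.txt", "all tests passed\n") == 0 ? 0 : 1;
}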

Kent, the man from xanth.

------------------------------

Date: 8 Feb 88 12:30:11 GMT
From: genrad!panda!teddy!svb@eddie.mit.edu  (Stephen V. Boyle)
Subject: Is it Art or Engineering?

Since my background is Chemical Engineering, I'll use examples from that field.

In Chem. E., when I needed to design a heat exchanger, I used a set of
references that told me what the constants were for the materials the heat
exchanger was going to operate on, and the *standard* design equations for
the exchanger itself.  I don't mean to imply that every problem in Chem.
engineering (or any other engineering) design is easily and completely
reduced to "look up the numbers", but it sure happens a heck of a lot more
often than when I'm writing software.
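
To make the contrast concrete, here is roughly what that kind of design
calculation looks like (a sketch with invented numbers; in practice the
coefficient U and the approach temperatures come straight out of the
references): size the exchanger from Q = U * A * dT_lm, where dT_lm is the
log-mean temperature difference.

#include <stdio.h>
#include <math.h>

/* Log-mean temperature difference for terminal approaches dt1, dt2 (K). */
static double lmtd(double dt1, double dt2)
{
    if (fabs(dt1 - dt2) < 1e-9)
        return dt1;                      /* limit case: equal approaches */
    return (dt1 - dt2) / log(dt1 / dt2);
}

int main(void)
{
    double duty = 250000.0;              /* Q, required heat duty in W */
    double U = 850.0;                    /* overall coefficient, W/(m^2 K),
                                            looked up in the references */
    double dTlm = lmtd(60.0, 25.0);      /* hot-end and cold-end approaches */
    double area = duty / (U * dTlm);     /* required area, A = Q / (U dT_lm) */

    printf("dT_lm = %.1f K, required area = %.2f m^2\n", dTlm, area);
    return 0;
}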

Sure, there are some examples of "canned" routines and algorithms (quicksort
is what immediately comes to mind), but in general, unless I or someone else
in my engineering group has read or remembers and makes known a solution to a
past problem, I'm doomed to re-create the solution.  The closest thing I can
think of to what I consider "real" engineering in software is the user-
interface building tools that let you quickly design a screen layout and
which then generate the corresponding code.  I guess what I consider the
critical difference is the ability to put together little pieces of the
problem that are relatively well known, without having to generate a custom
solution for every application.  This would allow software people to spend
time on what makes the current application unique.

Now before people get their keyboards cranked up, I want to make it clear
that I am aware of algorithm and code libraries, but they are incomplete
solutions to what I am describing.  (There is no "Perry's Handbook" for
Software Engineering.)  Plus, none of the above even whispers about
scheduling.  I'm not going to get into that tar pit; I've depressed myself
enough for a Monday morning.

... !{decvax,linus,wjh12,mit-eddie,masscomp}!genrad!svb
Steve Boyle  
GenRad Inc,  Production Test Division
MS 06, 300 Baker Ave, Concord, Mass.  01742

------------------------------

Date: 4 Feb 88 17:56:23 GMT
From: amcc!keithw@nosc.mil  (da staff)
Subject: Workstation productivity

    I am trying to justify the purchase of workstations for each
programmer at this site.  Does anyone have any studies that show how
much a workstation can increase productivity for a programmer?

   By workstation, I mean a computer with a high-resolution monitor, some
sort of windowing environment, and the ability to do more than one
thing at a time (multi-tasking).  Examples of this are the Sun 3/50 and
Apollo DN3000 systems. 

    Thanks in advance,

    Keith Wilke
    Applied Micro Circuits Corp.
    6195 Lusk Blvd.
    (619) 450-9333 x 5581
    San Diego, Ca 92121

    UUCP: ucsd!nosc!amcc!keithw
    

------------------------------

End of Soft-Eng Digest
******************************