[net.lang.c] Program design techniques?

freeman@spar.UUCP (Jay Freeman) (08/01/85)

The recent discussion on "writing code" seems to suggest that program design
techniques are worth discussing:  I wonder if there is any consensus -- or
any interesting diversity -- of such methods.

I myself try to think about a program for as long as possible before writing
anything -- sometimes one will be on "back burner" in my head for months.
This practice often helps me come up with ways to simplify things.  Then I
usually go through at least an informal preliminary design, often making
written notes, deciding what kind of abstract data types are necessary,
and thinking some about what kind of flow control will be required to hook
them together.  For complicated and featureful programs, I also find it
useful to think hard about a development path that will allow me to bring
the system up a piece at a time.

This last is fraught with opportunity:  Implementing from the top down can
produce an interim situation in which you have an elaborate control
structure or driver which is untestable because it's not hooked to anything
that does anything; implementing from the bottom up can result in lots of
little primitives that are untestable because there is no way to coordinate
their activities cooperatively.

I find abstract data types, pseudocode, and a HIPOish kind of process
decomposition fairly useful.  I rarely use the kind of box-and-arrow flow
charts that introductory books are full of.  I have sometimes used
Warnier-Orr diagrams.
-- 
Jay Reynolds Freeman (Schlumberger Palo Alto Research)(canonical disclaimer)

roy@phri.UUCP (Roy Smith) (08/04/85)

> I myself try to think about a program for as long as possible before writing
> anything -- sometimes one will be on "back burner" in my head for months.

	I do this too.  Occasionally I actually get around to writing the
code.  By the time I reach the point of typing it in, it's such an
anticlimax that I don't usually bother. :-)  I find flowcharts very good
for pretending to do something useful when the boss walks in.

	Most of my pre-typing thought goes into designing data structures.
It took a long time for me to figure out (i.e. to finally learn from other
people's preaching and my mistakes) that data structures are more important
than code.  If you have well laid out data, the code to manipulate it is
obvious.  If the data structures are awkward, no amount of code will make it
work right, or be easy to understand.
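
	A hedged illustration of the point (not from the posting; all
names are invented for the example): once the data is laid out as a
linked list of records, the manipulating code falls out almost
mechanically.

```c
#include <stddef.h>

/* Illustrative sketch: with the data structure fixed, the code that
   manipulates it is obvious.  Names here are made up for the example. */
struct item {
    int          value;
    struct item *next;
};

/* Prepend a node to the list; returns the new head. */
struct item *push(struct item *head, struct item *node)
{
    node->next = head;
    return node;
}

/* Sum the values in the list. */
int total(const struct item *head)
{
    int sum = 0;
    for (; head != NULL; head = head->next)
        sum += head->value;
    return sum;
}
```

Neither function needs a comment explaining *why* it works; the struct
declaration already says it.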

	One of my favorite examples is the Version 6 Unix file system.  Once
you see the layout of the inodes, directories, superblock and freelists, it
is trivial to figure out what you need to do to create a file, remove a
file, read and write a specific block, etc.  (I'm not picking on post-v6
file systems, I just don't know their details well enough to talk about
them).
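
	For instance, the v6 on-disk directory entry is just 16 bytes: a
2-byte inode number followed by a 14-byte name.  A sketch (field and
function names here are mine, not the kernel's) shows how the operations
become obvious once you've seen the layout:

```c
#include <string.h>

/* Sketch of the Version 6 on-disk directory entry: 16 bytes, a 2-byte
   inode number followed by a 14-byte name (NUL-padded, not necessarily
   NUL-terminated).  Names below are illustrative, not the kernel's. */
struct v6_dirent {
    unsigned short d_ino;       /* inode number; 0 marks a free slot */
    char           d_name[14];  /* file name */
};

/* Removing a name from a directory is just clearing the slot. */
void v6_unlink_slot(struct v6_dirent *dp)
{
    dp->d_ino = 0;
}

/* Creating an entry is finding a free slot and filling it in.
   Returns the slot index, or -1 if the directory is full. */
int v6_add_entry(struct v6_dirent *dir, int nslots,
                 unsigned short ino, const char *name)
{
    int i;
    for (i = 0; i < nslots; i++) {
        if (dir[i].d_ino == 0) {
            dir[i].d_ino = ino;
            strncpy(dir[i].d_name, name, sizeof dir[i].d_name);
            return i;
        }
    }
    return -1;
}
```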

	Once I actually start writing code, I begin with a stripped down
version of whatever it is.  No options, no bells and whistles, no worries
about handling special cases.  Once I get this working I can start adding
in features.  Before this goes too far, the problems with whatever data
structures I picked become obvious, and I start adjusting those and
incrementally build up the program.

	This rarely leads directly to a finished product.  For a system of
any reasonable size (no, I don't know how many lines of C that is), I will
usually reach some point where the program is more patch than planning.  At
that point, I chuck it and start from scratch, incorporating the lessons I
learned from the experimental version.

	The key is to be able to recognize when it's time to start over.
Do it too soon, and you won't get anything out of the experiment.  Do it
too late and you waste time beating a dead horse.  Sometimes the hardest
part of all this is telling your boss "Yeah, it works, but you can't use it
yet because I want to rewrite it all".

> a HIPOish kind of process decomposition fairly useful
> I have sometimes used Warnier-Orr diagrams.

	What does "HIPOish" mean?  What are Warnier-Orr diagrams?
-- 
Roy Smith <allegra!phri!roy>
System Administrator, Public Health Research Institute
455 First Avenue, New York, NY 10016

throopw@rtp47.UUCP (Wayne Throop) (08/04/85)

I am a top-down, design-by-refinement advocate.  However, that isn't
what I wanted to post about.  A paragraph in the referenced posting made
me think of a debugging/development technique that is familiar to Lisp
users, but less so to C users.  It might be useful, so I'll describe it.

> This last is fraught with opportunity:  Implementing from the top down can
> produce an interim situation in which you have an elaborate control
> structure or driver which is untestable because it's not hooked to anything
> that does anything; implementing from the bottom up can result in lots of
> little primitives that are untestable because there is no way to coordinate
> their activities cooperatively.

Given a moderately good debugger, you can set breakpoints on stubs, and
"do them by hand", allowing the control structures to be exercised.
For example, say that a stub is supposed to fill in a data structure
from input text, and return the data structure.  If the input routines
aren't fleshed out yet, one can simply write the stub to return a local
data item, and fill it in using the debugger when the breakpoint on that
stub is taken.
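
A sketch of what such a stub might look like (the struct and names are
invented for the example):

```c
/* Illustrative stub for a not-yet-written input routine.  It returns a
   canned record so the caller's control flow can be exercised; with a
   breakpoint set inside it, the fields can instead be filled in by
   hand from the debugger before returning. */
struct record {
    int  id;
    char name[16];
};

struct record *parse_input(void)   /* stub for the real parser */
{
    static struct record canned = { 42, "placeholder" };
    /* breakpoint here: edit 'canned' in the debugger, then continue */
    return &canned;
}
```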

The same idea works for bottom-up design, but in reverse.  One can write
a top-level stub which simply calls each of the primitives you are
trying to check out.  The debugger can be used to direct control to each
primitive in the proper order "by hand".
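
A sketch of the reverse case (the primitives here are trivial
placeholders, and all names are invented for the example):

```c
#include <assert.h>

/* Illustrative bottom-up test rig: a throwaway driver that calls each
   primitive in the order the finished program would, so the debugger
   can walk through them "by hand".  The flags only exist to make the
   placeholder primitives observable. */
static int initialized, loaded, processed;

void init_tables(void) { initialized = 1; }
void load_data(void)   { assert(initialized); loaded = 1; }
void process(void)     { assert(loaded); processed = 1; }

/* throwaway top-level stub, to be replaced by the real main loop */
void driver(void)
{
    init_tables();
    load_data();
    process();
}
```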

This technique is very useful when building a prototype, or in testing
a partially completed implementation.  It is what is sometimes referred
to as "debugging a program into existence".  Its main advantage is
that, having tested part of the program, when an addition breaks what
used to work, one knows where to look for the bug (in the newly added
code).  All existing code is (fairly well) debugged already.

My apologies if this technique is already well-known, or passé.
-- 
Wayne Throop at Data General, RTP, NC
<the-known-world>!mcnc!rti-sel!rtp47!throopw

jmc@ptsfa.UUCP (Jerry Carlin) (08/07/85)

> > I have sometimes used Warnier-Orr diagrams.
> 
> 	What does "HIPOish" mean?  What are Warnier-Orr diagrams?

Warnier-Orr diagrams were created by Jean-Dominique Warnier in France and
Ken Orr in the US. They are typically included in the "DSSD" (Data Structured
Systems Development) methodology. I have found them hard to use to represent
"break" and "continue" as well as "if ((c = getchar()) != EOF)".
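
As a hedged illustration of the kind of C that resists diagramming (the
function name and the '#'-comment convention are invented here): an
embedded assignment in the loop test, plus break and continue exits.

```c
#include <stdio.h>

/* Copy 'in' to 'out', dropping lines that start with '#'.  The loop
   test's embedded assignment and the break/continue exits are exactly
   the constructs that are awkward in a Warnier-Orr diagram. */
long copy_skipping_comments(FILE *in, FILE *out)
{
    int c;
    long copied = 0;
    int at_line_start = 1;

    while ((c = getc(in)) != EOF) {          /* the idiom in question */
        if (at_line_start && c == '#') {
            while ((c = getc(in)) != EOF && c != '\n')
                ;                            /* discard the comment */
            if (c == EOF)
                break;
            continue;                        /* still at line start */
        }
        at_line_start = (c == '\n');
        putc(c, out);
        copied++;
    }
    return copied;
}
```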

To slightly change the subject, has anyone out there used the full DSSD
or other design approaches (such as Yourdon) for UNIX/C programs? I am a
part of a project looking at this and I have heard little but vendor claims.
Even biased testimonials (positive or negative) would be appreciated but
I'd really like to hear about any comparative studies.
-- 
voice= 415 823-2441
uucp={ihnp4,dual}!ptsfa!jmc

pvk@ixn5f.UUCP (Pat Kauffold) (08/09/85)

HIPO means "Hierarchy Input Process Output".

I think it was an IBM-devised methodology, ca. 1968.

Pat Kauffold
AT&T Bell Laboratories
IX-1F-373
1200 E. Warrenville Rd.
Naperville, IL 60566
312-979-1918