[comp.object] Object-Oriented Requirements Analysis: An Introduction

eberard@ajpo.sei.cmu.edu (Edward Berard) (11/23/89)

			       PROLOGUE

Before I begin a discussion of object-oriented requirements analysis
(OORA), I must make several very important points.

First, as I have stated in several articles in the past, the
development part of the object-oriented life-cycle is best
accomplished using a recursive/parallel approach, i.e., "analyze a
little, design a little, implement a little, and test a little."
Therefore, even though I may talk about OORA as if it were performed
at only one place during development, in reality, it may very likely
be accomplished at many places.

Second, although it may be preceded by such things as a feasibility
study, I will treat OORA as if it were the very first thing to be
accomplished during the development of a software product. For
example, if you were to attempt an object-oriented approach using the
classic waterfall life-cycle model (not recommended, but still
possible), OORA would be followed by object-oriented design (OOD),
which would be followed by object-oriented programming (OOP), and so
forth. [During the recursive/parallel life-cycle, if OORA is used, it
is the first process accomplished during each recursive application of
"analyze a little, design a little, ..."]

Third, OORA is not always appropriate. If the project is small and/or
non-critical, OOP may be all that is required. As we move to larger
efforts, OOD becomes a necessary precursor to OOP. For still larger
and more critical projects, we often turn to OORA, followed by OOD,
followed by OOP.

[For organizations which continually develop (and maintain)
significant amounts of software, an on-going object-oriented domain
analysis (OODA) effort is strongly recommended. (Remember, domain
analysis is best accomplished independent of any particular project.)
The deliverables from an OODA effort are used by the OORA analyst, and
the OORA analyst contributes products to the OODA effort.]

Object-oriented requirements analysis (sometimes referred to as
object-oriented analysis (OOA)) was virtually unheard of in the
object-oriented programming community until very recently. There are
two main reasons why it is now being seriously considered:

	- When object-oriented technology is applied to large and/or
	  critical projects, the bottom-up approach, so common in OOP,
	  often proves insufficient. Very few people, for example, can
	  contemplate a project of, say, 100,000 lines of code, and
	  adequately identify most of the low-level components without
	  some form of analysis. As object-oriented thinking moves
	  into the mainstream, people are realizing that OOP
	  techniques (e.g., identify classes, create objects, and send
	  messages) _alone_ may be very inadequate for large, critical
	  efforts.

	- Object-oriented technology is now being seriously considered
	  by people who are accustomed to thinking in terms of
	  life-cycle methodologies. These people include, for example,
	  those developing large business applications, critical
	  real-time applications, and large, complex software in
	  general.

			       HISTORY

Although E.W. Dijkstra first described what he called "structured
programming" in 1969, it was not until December of 1973 that many
programmers became aware of the term. (The December 1973 issue of
Datamation was devoted to the subject.) In 1974, the first article on
"structured design" was published in the IBM Systems Journal, and in
late 1976, Tom McCabe began publicizing something called "structured
testing."

By 1976, there was some early work going on in the area of "structured
analysis," e.g., [SofTech, 1976]. (See also [Ross, 1977] and [Ross and
Schoman, 1977].) In 1977, Gane and Sarson were describing their version
of structured analysis ([Gane and Sarson, 1977]), which they later
published in book form ([Gane and Sarson, 1979]). However, it was not
until Tom DeMarco published his best-selling book ([DeMarco, 1979])
that structured analysis began to become generally accepted.

By 1983, it had become apparent that "standard" structured analysis
was not adequate for real-time systems, e.g., it did very little in
the area of such things as interrupts and scheduling of concurrent
processes. A new, "real time" flavor of structured analysis began to
emerge (often combined with an equally revised version of structured
design). See, for example, [Gomaa, 1984], [Hatley and Pirbhai, 1988],
and [Ward and Mellor, 1985]. This "real time" trend is also very
apparent in modern OORA.

[Note: For those of you who are not as concerned with real time issues
as the rest of us, OORA techniques can also be easily applied to
non-real-time systems.]

As with OOD, virtually all of the work in OORA has taken place
within the Ada community. (See, e.g., the bibliography at the end of
this article.)

In an earlier article, I described how Grady Booch introduced the
concept of "object-oriented design" to the Ada community. By 1986, OOD
(in some form or another) was being used on a significant percentage
of Ada development efforts. However, there were some serious problems
reported, e.g.:

	- Traceability was difficult. (Traceability is a measure of
	  the ease with which a concept, requirement, or idea may be
	  traced from one point in the software life-cycle to another
	  point.) For example, contractors were often (if not
	  exclusively) furnished with _functional_ requirements, and
	  encouraged to develop _object-oriented_ code. Tracing
	  functional requirements to functional code was relatively
	  easy since the localization (i.e., around functions)
	  remained constant throughout the development effort.
	  Changing localizations (i.e., from functional to
	  object-oriented) in the middle of development made tracing
	  requirements (and, hence, acceptance testing) very
	  difficult.

	- Testing and integration became a nightmare. A very common
	  practice was to divide a large effort up into several (more
	  manageable) _functional_ pieces. Each of these functional
	  pieces was given to a separate team which then designed and
	  coded the piece in an "object-oriented" manner. Given that
	  objects are _not_ localized in a functional manner, this
	  meant that the characteristics of a given object were often
	  distributed unevenly among the functional pieces.

	  What this meant was that each team had a very small
	  probability of gaining a _complete_ understanding of any
	  particular object. For example, one team might see object X
	  as having attributes A, B, and C, and another team would see
	  object X as having attributes B, D, and F. These differences
	  only became apparent when it came time to integrate the work
	  of both teams, and the first team attempted to hand their
	  version of object X off to the second team. Since
	  integration often occurs fairly late in development, this
	  made the required changes difficult and expensive.

	- Needless to say, there was also a great deal of duplicated
	  effort, since each separate team often (re-)developed
	  objects which were already in use by other teams.

	- Software engineers found that working with two vastly
	  different paradigms was difficult. For example, many of them
	  had problems when attempting to "bridge the gap" between
	  structured analysis and object-oriented design. (See, e.g.,
	  [Gray, 1988].)

By 1986, it had become obvious that some form of object-oriented
requirements analysis was needed. Most efforts to develop an OORA
methodology used either "classic" structured analysis, a real-time
version of structured analysis, or a combination of either of these
approaches with entity-relationship diagrams, as a starting point.
(See, e.g., [Anderson et al, 1989], [Bailin, 1989], [Coad and Yourdon,
1989], [Khalsa, 1989], [Shlaer and Mellor, 1988], [Smith and Tockey,
1988], [Stoecklin et al, 1988], and [Ward, 1989].) To someone with a
strong background in "conventional" object-oriented programming, these
approaches seem strangely out of place.

Also in 1986, my clients, who were suffering from the problems I
mentioned earlier, began to demand that I provide them with an
object-oriented requirements methodology. By that time, I was also
undertaking some large (~300,000 lines of code) object-oriented
development efforts of my own. I needed some practical approach to
OORA for my own company.

My staff and I began to investigate the problem. From our consulting
and training efforts with other organizations, and from our own
experience, we knew the following:

	- Attempting to base an OORA methodology on conventional
	  requirements analysis techniques was a mistake. Whether you
	  referred to these approaches as "functional decomposition"
	  or "event-driven," they did not localize information around
	  objects, and they ignored virtually all aspects of
	  object-oriented technology.

	- We would have to supply some form of graphical techniques. A
	  purely textual approach was undesirable. Quite frankly, most
	  of the time, people relate better to pictures than they do
	  to words. When we began, we did not know which graphics we
	  would use, but one of our most important criteria was that
	  the graphics be object-oriented, or directly support
	  object-oriented thinking.

	- We had to have some mechanism for creating a specification
	  for objects of interest. Structured analysis, for example,
	  has a "data dictionary." We thought that we should have an
	  "object dictionary," but we were not sure how to describe
	  the "entries," i.e., the objects of interest.

	- Merely specifying objects of interest was not enough. We had
	  to have some mechanism(s) for showing how these objects were
	  related, and how they would interact. It was a given that
	  these techniques had to be graphical.

	- Reusability was a key issue. We had evolved an OOD
	  methodology which emphasized reusable objects, and we
	  planned to extend this thinking into the OORA methodology.

	- Whatever methods we came up with had to accurately reflect
	  object-oriented thinking. Since we were already familiar
	  with the standard object-oriented literature, we felt we had
	  that area covered.

	- The methodology had to be programming-language-independent
	  to the highest degree possible. Developing a methodology
	  which "reeked of Smalltalk," might not be all that
	  applicable, for example, to projects which were going to be
	  implemented in Eiffel, C++, CLOS, or Self.

	- The methodology had to be:

		- Pragmatic: real people, working on real projects,
		  under real constraints, had to be able to use the
		  methodology.

		- Quantifiable: as much as possible, the methodology
		  had to be precisely defined so that it was
		  repeatable, and so that viable alternatives could be
		  evaluated.

		- Widely applicable: the approach should be one that
		  can be used, with little or no modification, on a
		  complete spectrum of applications, e.g., from
		  business, to scientific, to embedded real-time
		  applications.

		- Tailorable: Each organization (and, often, each
		  project) may wish to emphasize different things.
		  Some may want to delete items, others will want to
		  add things, and still others may want to re-arrange
		  the overall process.

Over the past (almost) four years, the OORA process, as I will
describe it, has grown and matured. Techniques, graphics, and
documentation have all been modified based on experience. The OORA
process continues to mature.

		     A SUGGESTED OORA METHODOLOGY

[Truth in advertising: As I mentioned earlier, there is more than one
school of thought on how OORA might be accomplished, e.g., [Coad and
Yourdon, 1989] and [Shlaer and Mellor, 1988]. I share some ideas in
common with other OORA methodologists, but both they and I would
agree that what I am about to describe is significantly different
from other OORA approaches. (I would like to think that, since I
have been at it longer, my approach is at a "higher state
of evolution." ;-))

More truth in advertising: Although I will describe this process as if
it were to be accomplished in one contiguous block of time, this is
most often not the case. It is best accomplished in a
recursive/parallel life-cycle.

Still more truth in advertising: I will list the steps to be
accomplished as if they were in sequential order. However, in
practice, some steps may be re-arranged (within limits), and many may
be accomplished in parallel, e.g., it is possible to be working on
step 1 and step 9 at the same time.]

The general OORA process is:

	1. Identify the sources of requirements information.

	2. Characterize the sources of requirements information.

	3. Identify candidate objects.

	4. Build object-oriented models of both the problem, and the
	   potential solution, as necessary.

	5. Re-localize the information around the appropriate
	   candidate objects.

	6. Select, create, and verify documentation for candidate
	   objects.

	7. Assign the candidate objects to the appropriate section of
	   the object-oriented requirements specification (OORS).

	8. Develop and refine the Qualifications Section of the OORS.

	9. Develop and refine the Precise and Concise System Description.

Now let's briefly look at each of these steps in slightly more detail.

There is nothing specifically object-oriented about the first two
steps. Regardless of the approach to requirements analysis one
chooses, they will have to be accomplished. Sources of requirements
information can include: existing requirements documents,
knowledgeable people, existing software (including prototypes), and
standards documents.

[Note: If there is already a pre-existing set of "functional"
requirements, there are systematic techniques for converting these
requirements into object-oriented requirements. Keep in mind that one
does _not_ have to do functional requirements analysis first. You can
start off with object-oriented requirements.]

Characterizing the sources of requirements information is actually a
two-part process. You must characterize the source of the information,
and you must also characterize the information provided by the source.
The source may be characterized in terms of availability, authority,
credibility, responsiveness, longevity, ease of access, and types of
information provided. The information provided by the source may be
characterized in terms of form (e.g., textual, graphical,
machine-readable, and verbal), completeness, how current the
information is, which aspects of the product it addresses, and
understandability, among others.
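
To make this characterization step a bit more tangible, here is a minimal
sketch, in Python (used purely for illustration; nothing in the OORA
process depends on any particular language), of how the two kinds of
characterization might be recorded. All of the field names are my own
invention, not part of any standard checklist.

	from dataclasses import dataclass, field
	from typing import List

	@dataclass
	class RequirementsSource:
	    """Hypothetical record characterizing one source of requirements."""
	    name: str
	    availability: str = ""
	    authority: str = ""
	    credibility: str = ""
	    responsiveness: str = ""
	    longevity: str = ""
	    ease_of_access: str = ""
	    types_of_information: List[str] = field(default_factory=list)

	@dataclass
	class ProvidedInformation:
	    """Hypothetical record characterizing what a given source provides."""
	    form: str = ""            # e.g., "textual", "graphical", "verbal"
	    completeness: str = ""
	    currency: str = ""        # how up to date the information is
	    aspects_addressed: List[str] = field(default_factory=list)
	    understandability: str = ""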

Identification of candidate objects requires that we define what
"objects" are. I most frequently deal with the following objects:
classes, instances (of classes), metaclasses, subsystems, and systems
of objects. ("Subsystems" and "systems of objects" are large
object-oriented entities. I will defer their definitions, construction
techniques, and usages to another time.) Metaclasses are classes whose
instances themselves are classes.
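
Since "a class whose instances are themselves classes" can be hard to
visualize, here is a minimal sketch in Python (chosen purely for
illustration; the OORA process itself is language-independent). The
class names are hypothetical.

	# A metaclass is a class whose instances are themselves classes.
	class Described(type):
	    """Metaclass that attaches a description to every class it creates."""
	    def __new__(mcls, name, bases, namespace):
	        cls = super().__new__(mcls, name, bases, namespace)
	        cls.description = namespace.get("__doc__") or "(no description)"
	        return cls

	class Sensor(metaclass=Described):
	    """A hypothetical monitored device."""

	probe = Sensor()
	assert isinstance(Sensor, Described)  # the class is an instance of the metaclass
	assert isinstance(probe, Sensor)      # the object is an instance of the class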

[There are quite a number of issues I am _not_ discussing here, e.g.,
suppose it is a given that the implementation language is a "classless"
object-oriented language, such as Self, or that the target language does
not support metaclasses.]

Each object will have to be documented in some form. Classes,
metaclasses, and, in rare cases, instances, are usually documented
with an Object and Class Specification (OCS, pronounced "ox"). OCSs
have five main sections:

	- a precise and concise description (this can be thought of as
	  an "executive summary" although it is more than that)

	- a set of graphical representations showing both the static
	  relationships, and the dynamic behavior of the object (the
	  graphical techniques used here include semantic networks,
	  state transition diagrams, and Petri net graphs)

	- a description of the required and suffered operations (and
	  their associated methods)

	- a description of the state information for the object

	- any constants and exceptions exported by the object
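
As a rough illustration only (an OCS is a specification document, not
code), the five sections could be mirrored by a simple record structure.
The following Python sketch, with invented field names, shows one way a
tool might store such entries.

	from dataclasses import dataclass, field
	from typing import Dict, List

	@dataclass
	class ObjectAndClassSpecification:
	    """Hypothetical, highly simplified OCS entry."""
	    precise_and_concise_description: str
	    graphical_representations: List[str] = field(default_factory=list)
	    required_operations: Dict[str, str] = field(default_factory=dict)
	    suffered_operations: Dict[str, str] = field(default_factory=dict)
	    state_information: Dict[str, str] = field(default_factory=dict)
	    exported_constants: Dict[str, str] = field(default_factory=dict)
	    exported_exceptions: List[str] = field(default_factory=list)

	stack_ocs = ObjectAndClassSpecification(
	    precise_and_concise_description="A bounded stack of items.",
	    suffered_operations={"push": "add an item", "pop": "remove the top item"},
	    exported_exceptions=["Overflow", "Underflow"])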

Object-oriented models of both the problem and solution can be
fashioned from the following set of graphical techniques:

	- semantic networks
	- state transition diagrams
	- Petri net graphs
	- timing diagrams
	- object-message diagrams
	- graphics for subsystems
	- graphics for systems of objects

Each of these graphical models can have an accompanying set of
"engineering drawing notes," and should also be supported by the
"object dictionary" Models can also be created using automated
techniques, e.g., Smalltalk, Trellis, Actor, and Prograph.

A natural human tendency is to (re)localize "everything we know about
an item" in one place. In the case of a candidate object, this will
most often be a mixture of object-specific information, and
application-specific (i.e., context-specific) information. To ensure a
high degree of reusability, we must take care to separate the
characteristics of the object "in isolation" from how the object is
used in the application at hand.
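
A small Python sketch of that separation, with invented names: the
reusable class carries only what is true of the object "in isolation,"
while facts about how this particular application uses it are kept in
the application's own declarations.

	# Object-specific: true of a temperature reading wherever it is reused.
	class Temperature:
	    """A hypothetical reusable class."""
	    def __init__(self, degrees_celsius: float):
	        self.degrees_celsius = degrees_celsius

	# Application-specific (context-specific): meaningful only to this
	# application, so it stays out of the reusable class itself.
	BOILER_ALARM_THRESHOLD = Temperature(95.0)
	SAMPLING_INTERVAL_SECONDS = 10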

By this time, we should be able to examine our (hopefully automated)
library of reusable objects and be able to select items which closely
match our requirements. If we do not find a suitable match, we will
have to create some new documentation (e.g., an OCS) for our candidate
object. (Note, however, that depending on the details of our OORA
process, we may choose not to document some objects, e.g., those objects
for which we will not be creating software.) In any case, we must
verify that the selected or created object is appropriate.

[One OCS may inspire several different designs, and each design may be
implemented in several different ways (e.g., in multiple programming
languages). The "object dictionary" may be a full-blown reusability
system, e.g., McDonnell Douglas's AMPEE, Rockwell's ROSES, and others.
The reusability system may store other items along with the OCS, e.g.,
the design(s) and the source code.]

If you insist on "doing all of your object-oriented analysis at once,"
you will very likely be creating an object-oriented requirements
specification (OORS). The OORS is divided into two parts: an
object-general section, and an application-specific section.
Oversimplifying, reusable objects are placed into the object-general
section, and application-specific objects, along with an
object-oriented systems specification, are placed in the
application-specific section.

Although an object may be reused many times within a single
application, we may have a different set of qualifications we wish to
place upon the object, depending on which context the object finds
itself in. (This is another, commonly-encountered object-oriented
software engineering issue. Each new context does not necessarily
require a new subclass.) For example, we may have defined a "name
class" such that its instances are 40 character strings. It might not
be desirable to create a new subclass simply because a specific
context calls for names no longer than 37 characters.
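
A minimal sketch of that idea, again in Python and with invented names:
the tighter, context-specific qualification is recorded (and checked)
where the object is used, rather than by deriving a new subclass.

	class Name:
	    """Reusable class: instances hold name strings of at most 40 characters."""
	    MAX_LENGTH = 40

	    def __init__(self, text: str):
	        if len(text) > self.MAX_LENGTH:
	            raise ValueError("name longer than %d characters" % self.MAX_LENGTH)
	        self.text = text

	# Application-specific qualification: this context accepts names of no
	# more than 37 characters.  No new subclass is created.
	CONTEXT_NAME_LIMIT = 37

	def accept_name_for_this_context(text: str) -> Name:
	    if len(text) > CONTEXT_NAME_LIMIT:
	        raise ValueError("this context restricts names to %d characters"
	                         % CONTEXT_NAME_LIMIT)
	    return Name(text)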

At some point during the OORA process, the OORA analysts must create an
accurate description of the system which will be delivered to the
client. This description will most likely be a combination of textual
and graphical information. This description, along with cost and
schedule estimates, legal documents, and other appropriate items,
belongs in the application-specific section of the OORS.

Yes, there is much more I could say, but I am already feeling very
guilty about the length of this message. Thanks for listening.

				-- Ed Berard
				   Berard Software Engineering, Inc.
				   18620 Mateney Road
				   Germantown, Maryland 20874
				   Phone: (301) 353-9652
				   FAX:   (301) 353-9272
				   E-Mail: eberard@ajpo.sei.cmu.edu 

			     BIBLIOGRAPHY

[Anderson et al, 1989]. J.A. Anderson, J. McDonald, L. Holland, and E.
Scranage, "Automated Object-Oriented Requirements Analysis and
Design," Proceedings of the Sixth Washington Ada Symposium, June
26-29, 1989, pp. 265 - 272.

[Bailin, 1989]. S. C. Bailin, "An Object-Oriented Requirements
Specification Method," Communications of the ACM, Vol. 32, No. 5, May
1989, pp. 608 - 623.

[Booch, 1986]. G. Booch, "Object Oriented Development," IEEE
Transactions on Software Engineering, Vol. SE-12, No. 2, February
1986, pp. 211 - 221.

[Coad, 1988]. P. Coad, "Object-Oriented Requirements Analysis (OORA):
A Practitioner's Crib Sheet," Proceedings of Ada Expo 1988, Galaxy
Productions, Frederick, Maryland, 1988, 9 pages.

[Coad and Yourdon, 1989]. P. Coad and E. Yourdon, OOA --
Object-Oriented Analysis, Prentice-Hall, Englewood Cliffs, New Jersey,
1989.

[DeMarco, 1979]. T. DeMarco, Structured Analysis and System
Specification, Yourdon Press, New York, New York, 1979.

[Freeman, 1979].  P. Freeman, "A Perspective on Requirements Analysis
and Specification," IBM Design '79 Symposium, 1979.  Reprinted in
[Freeman and Wasserman, 1980], pp. 86 - 96.

[Freeman and Wasserman, 1980].  P. Freeman and A. I. Wasserman,
Editors, Tutorial on Software Design Techniques, Third Edition,
Catalog No. EHO161-0, Institute of Electrical and Electronic
Engineers, New York, New York, 1980.

[Freeman and Wasserman, 1983].  P. Freeman and A. I. Wasserman,
Editors, Tutorial on Software Design Techniques, Fourth Edition,
Catalog No. EHO205-5, IEEE Computer Society Press, Silver Spring,
Maryland, 1983.

[Gane and Sarson, 1977].  C. Gane and T. Sarson, Structured Systems
Analysis: Tools and Techniques, Improved System Technologies, New
York, New York, 1977.

[Gane and Sarson, 1979].  C. Gane and T. Sarson, Structured Systems
Analysis: Tools and Techniques, Prentice-Hall, Englewood Cliffs, New
Jersey, 1979.

[Gomaa, 1984].  H. Gomaa, "A Software Design Method for Real-Time
Systems," Communications of the ACM, Vol. 27, No. 9, September 1984,
pp. 938 - 949.

[Gray, 1988]. L. Gray, "Transitioning from Structured Analysis to
Object-Oriented Design," Proceedings of the Fifth Washington Ada
Symposium, June 27 - 30, 1988, Association for Computing Machinery,
New York, New York, 1988, pp. 151 - 162.

[Hatley and Pirbhai, 1988]. D.J. Hatley and I.A. Pirbhai, Strategies
for Real-Time System Specification, Dorset House Publishing, New York,
New York, 1988.

[Khalsa, 1989]. G.K. Khalsa, "Using Object Modeling to Transform
Structured Analysis Into Object-Oriented Design," Proceedings of the
Sixth Washington Ada Symposium, June 26-29, 1989, pp. 201 - 212.

[Ross, 1977].  D. T. Ross, "Structured Analysis (SA): A Language for
Communicating Ideas," IEEE Transactions on Software Engineering, Vol.
SE-3, No. 1, January 1977, pp. 16 - 34.  Reprinted in [Freeman and
Wasserman, 1983], pp. 96 - 114.

[Ross and Schoman, 1977].  D. T. Ross and K. E. Schoman, "Structured
Analysis for Requirements Definition," IEEE Transactions on Software
Engineering, Vol. SE-3, No. 1, January 1977, pp. 6 - 15.  Reprinted in
[Freeman and Wasserman, 1983], pp. 86 - 95.

[Shlaer and Mellor, 1988]. S. Shlaer and S.J. Mellor, Object-Oriented
Systems Analysis: Modeling the World In Data, Yourdon Press:
Prentice-Hall, Englewood Cliffs, New Jersey, 1988.

[Smith and Tockey, 1988]. M. K. Smith and S.R. Tockey, "An Integrated
Approach to Software Requirements Definition Using Objects,"
Proceedings of Ada Expo 1988, Galaxy Productions, Frederick, Maryland,
1988, 21 pages.

[SofTech, 1976]. An Introduction to SADT: Structured Analysis and
Design Technique, SofTech, Inc., Waltham, Massachusetts, 1976.

[Stoecklin et al, 1988]. S.E. Stoecklin, E.J. Adams, and S. Smith,
"Object-Oriented Analysis," Proceedings of the Fifth Washington Ada
Symposium, June 27 - 30, 1988, Association for Computing Machinery,
New York, New York, 1988, pp. 133 - 138.

[Ward and Mellor, 1985].  P. T. Ward and S. J. Mellor, Structured
Development for Real-Time Systems, Volumes 1, 2 and 3, Yourdon Press,
New York, New York, 1985.

[Ward, 1989]. P.T. Ward, "How to Integrate Object Orientation with
Structured Analysis and Design," IEEE Software, Vol. 6, No. 2, March
1989, pp. 74 - 82.

dwiggins@atsun.a-t.com (Don Dwiggins) (12/05/89)

In article <628@ajpo.sei.cmu.edu> eberard (Edward Berard) writes:
   First, as I have stated in several articles in the past, the
   development part of the object-oriented life-cycle is best
   accomplished using a recursive/parallel approach, i.e., "analyze a
   little, design a little, implement a little, and test a little."

You've mentioned this approach in several articles, but I don't remember
seeing a published reference for it in your otherwise comprehensive
bibliographies.  I must admit that the name leaves me a "little" uneasy
(:-); read naively, there doesn't seem to be much structure to it.  How does
one know when to do which, and when to stop and do something else?  How is
progress to be measured in this model?  etc...

It does remind me a bit of Boehm's spiral model.  One possible point of
difference is that Boehm's model is designed to improve the management of
risk factors (indeed, it seems that risk management is what led Boehm to
propose this model), and doesn't explicitly presuppose or preclude object
(or any other) orientation to analysis, design, implementation, etc.

I'd be interested in a "comparison and contrast" of the two approaches, and
a discussion of the justification for the quoted sentence.

--
Don Dwiggins				"Solvitur Ambulando"
Ashton-Tate, Inc.
dwiggins@ashtate.a-t.com
dwiggins@ashtate.uucp

daven@ibmpcug.co.uk (D R Newman) (12/08/89)

dwiggins@atsun.a-t.com (Don Dwiggins) wrote:
>In article <628@ajpo.sei.cmu.edu> eberard (Edward Berard) writes:
>>..the development part of the object-oriented life-cycle is best accomplished
>>using a recursive/parallel approach, i.e., "analyze a little, design a little,
>>implement a little, and test a little."
>.. but I don't remember seeing a published reference for it... the name leaves
>me a "little" uneasy (:-); read naively, there doesn't seem to be much
>structure to it. How does one know when to do which, and when to stop and do
>something else? How is progress to be measured in this model? etc...

There are many references to this iterative (and recursive) approach to
product development. This process is called "Action research" by those
involved in rural development projects in the Third World, and goes under
other names when used by business studies people to explain the least risky
way to develop innovations. These people have already studied in depth the
problems that Don Dwiggins mentioned, and have answers.

I could put together a list of references, if there are any readers who
believe, as I do, that software development is not unique, but shares problems
common to all technological innovation and implantation. So far, I have only
found AI people willing to look to other disciplines for answers to their
problems, so I doubt that anyone here would be interested.

Dave Newman, Consultants in Appropriate Technology
daven@ibmpcug.co.uk (on Usenet) or gn:davenewman (on IGC/CDP networks).
-- 
Automatic Disclaimer:
The views expressed above are those of the author alone and may not
represent the views of the IBM PC User Group.

hallett@pet16.uucp (Jeff Hallett x5163 ) (12/08/89)

In article <DWIGGINS.89Dec4143109@atsun.a-t.com> dwiggins@atsun.a-t.com (Don Dwiggins) writes:
>In article <628@ajpo.sei.cmu.edu> eberard (Edward Berard) writes:
>   First, as I have stated in several articles in the past, the
>   development part of the object-oriented life-cycle is best
>   accomplished using a recursive/parallel approach, i.e., "analyze a
>   little, design a little, implement a little, and test a little."
	
Oh, I don't know about that.  I personally do not like this approach
for any kind of development (note: for requirements, yes; general
development, no).

>It does remind me a bit of Boehm's spiral model.  One possible point of

No small coincidence that it should.  Boehm's spiral model, in
general, is nothing new and neither is this approach.  I personally
favor an iterative approach in OOA/D - 1> study the classes required;
2> study the instance variables; 3> formulate a message set; 4> create
relational links; 5> create a preliminary information model; 6>
perform message-passing scenarios to test requirements; 7> curse that
it doesn't work and refine starting with step 1 again.  This process
tends to converge quickly after only a few "start-overs".


--
	     Jeffrey A. Hallett, PET Software Engineering
      GE Medical Systems, W641, PO Box 414, Milwaukee, WI  53201
	    (414) 548-5163 : EMAIL -  hallett@gemed.ge.com
		  Est natura hominum novitatis avida

marc@dumbcat.UUCP (Marco S Hyman) (12/11/89)

In article <DWIGGINS.89Dec4143109@atsun.a-t.com> dwiggins@atsun.a-t.com (Don Dwiggins) writes:
    In article <628@ajpo.sei.cmu.edu> eberard (Edward Berard) writes:
       development part of the object-oriented life-cycle is best
       accomplished using a recursive/parallel approach, i.e., "analyze a
       little, design a little, implement a little, and test a little."
    
    I must admit that the name leaves me a "little" uneasy
    (:-); read naively, there doesn't seem to be much structure to it.  How does
    one know when to do which, and when to stop and do something else?  How is
    progress to be measured in this model?  etc...
    
Let me turn things around a bit.  With any large project, large being more
than one calendar year, the analyze -> design -> implement -> test
structure eventually delivers code that can be a year or more out of date.
While the programmers were doing the design -> implement -> test phases of
the project the requirements may have changed.

As to when it's time to change phases: I prefer to do it as soon as possible.
That is, as soon as an outline of the requirements is done, a *prototype*
design is started.  The design will, of course, point out areas where the
requirements have to be modified or added to.  Once a design is sketched out,
it's time to do a *prototype* implementation.  Of course, the implementation
will point out shortcomings of the design.  The sooner there is something
to show the end user, the better.  Until a prototype of the product gets
into the end user's hands, the feedback loop isn't complete (my opinion).
I don't see test as a separate step.  Call it validation.  The design
validates the requirements, the implementation validates the design, and
the user ultimately validates the implementation.  Programmers test their
code as it is written.

By bringing the ultimate user into the process as soon as possible, the
chances of delivering a usable program/system are much improved.  I think
this is what Brooks called 'growing a program' in his _No Silver Bullet_
article.  When developing a generic product, the hard part is finding the
appropriate user surrogate.

None of this has much to do with objects or OO anything.  Every successful
large project I've worked on has used this method.  (You can easily tell if
this method is used:  The requirements document isn't signed off until just
before the product ships.)  But: it's no panacea.  Not every large project
I've worked on using this method has been successful.
-- 
// marc				{ames,pyramid,sun}!pacbell!dumbcat!marc

dwiggins@atsun.a-t.com (Don Dwiggins) (12/15/89)

You write:
   I could put together a list of references, if there are any readers who
   believe, as I do, that software development is not unique, but shares
   problems common to all technological innovation and implantation. So far,
   I have only found AI people willing to look to other disciplines for
   answers to their problems, so I doubt that anyone here would be
   interested.

Yes, I'd definitely be interested in references, and I suspect that there
are others who would be, too; please post.

I presume that the harsh tone of the last sentence derives from some
disappointing personal experience; let me assure you that there are software
folks who are only too happy to learn from any source that seems to have
some relevance.  For example, Watts Humphrey of SEI has based his work on
that of W. Edwards Deming, with notably successful results (see, for example,
his article in the March 1988 issue of IEEE Software).  There have also been
discussions on the net about the parallels between software engineering and
other engineering disciplines, particularly electrical, mechanical, and
architectural.

To clarify any misunderstanding, my questions about the recursive/parallel
approach were sincere; I see some potential dangers in the approach, and am
interested to see how they're addressed.

--
Don Dwiggins				"Solvitur Ambulando"
Ashton-Tate, Inc.
dwiggins@ashtate.a-t.com