[comp.software-eng] Experiences with Defect Prevention

marick@m.cs.uiuc.edu (06/27/90)

You should read this:

"Experiences with Defect Prevention".  Mays, et. al.  IBM Systems
Journal, Vol. 29, No. 1, 1990.

Abstract:

"Defect prevention is the process of improving quality and
productivity by preventing the injection of defects into a product.
It consists of four elements integrated into the development process:
(1) causal analysis meetings to identify the root cause of defects and
suggest preventive actions; (2) an action team to implement the
preventive actions; (3) kickoff meetings to increase awareness of
quality issues specific to each development stage; and (4) data
collection and tracking of associated data.  The Defect Prevention
Process has been successfully implemented in a variety of
organizations within IBM, some for more than six years.  This paper
discusses the steps needed to implement this process and the results
that may be obtained.  Data on quality, process costs, benefits, and
practical experiences are also presented.  Insights into the nature
of programming errors and the application of this process to a variety
of working environments are discussed."


Three things attract me about this process:

1.  Often, the recommendations of software engineering pundits are
global and revolutionary.  Many ignore the need for gradual, steady
process improvement.  Consequently, you get far too many prescriptions
like "Throw AWAY that error-prone C code!  DISPENSE with your
investment in training and tools!  Paradise lies just beyond the
learning curve!"  and far too few comments like "You can reduce the
number of misused-equal-sign errors in C if you make the habit of
writing all your tests as 'if (CONST = var)', rather than the other
way around".  This technique helps you discover what you can improve
while you're waiting for the revolution that we all hope will come.

2.  The recognition that the way to do things better is to think hard
about what you do wrong.  Obvious, but it doesn't happen unless
it's made to happen.

3.  The recognition that grovelling in your errors won't be effective
unless there's time set aside to implement solutions.

Brian Marick
Motorola @ University of Illinois
marick@cs.uiuc.edu, uiucdcs!marick

bwb@sei.cmu.edu (Bruce Benson) (06/28/90)

In article <39400109@m.cs.uiuc.edu> marick@m.cs.uiuc.edu writes:
>Three things attract me about this process:

>1.  Often, the recommendations of software engineering pundits are
>global and revolutionary.  Many ignore the need for gradual, steady

"Silver bullets".  Also, until we software folks solve our own problems, then
nonsoftware bosses will continue to try and solve it for us.

>number of misused-equal-sign errors in C if you make the habit of
>writing all your tests as 'if (CONST == var)', rather than the other
>way around".  This technique helps you discover what you can improve
>while you're waiting for the revolution that we all hope will come.

In a maintenance environment, we did the unheard-of thing of looking at
the errors that were being made (those caught by formal testing).  We
concluded that most of the errors were caused by overconfident programmers
sitting down at the terminal and making code changes on the fly.  These
were simple errors *quickly* fixed by the programmer (and possibly
introducing more errors for the same reason).  The solution we picked,
at first resisted by the programmers, was to annotate a printed listing
with all anticipated changes (no matter how few).  This simple discipline
*dramatically* reduced the number of simple errors (the large bulk of
errors).  The programmers were quickly won over to this "obsolete" way
of planning changes (they liked creating fewer errors).

In looking at the solutions to errors reported by the customer, the large
majority of the problems would have been caught by the simple expedient
of testing the code.  Functional testing missed too much of the actual
written code.  The programmers were required to *structurally* test their
code: i.e., exercise every line of code in the program (or every line
changed plus supporting code).  This simple, somewhat mindless, approach
to testing also greatly annoyed the formal testers, as they found
very few errors in delivered code.

The key was that we *LOOKED* at the *actual* errors we were making and at the
fixes to the errors.  This was contrary to the culture of the individual
programmer and hands-off supervision.  These very simple and fundamental
changes helped the individual get better as a professional programmer.
They required that the programmers adopt a mental discipline about
their work; it *reduced* the fun factor of programming, which was made
up for by the satisfaction of less buggy code (and of annoying the testers).

>2.  The recognition that the way to do things better is to think hard
>about what you do wrong.  Obvious, but it doesn't happen unless
>it's made to happen.

Three cheers for common sense that ain't so common.  I think one of the
flaws (a bit of heresy here) in the approaches SEI advocates is to focus
last on examining the actual errors one makes (SEIers correct me if I am
wrong - referring to level 5 of Watts Humphrey's maturity model).

>3.  The recognition that grovelling in your errors won't be effective
>unless there's time set aside to implement solutions.

For low maturity software organizations (about 85% of us), *simple* solutions,
appropriately applied, will bring about *big* improvements.  You got to
solve the simple problems before the fancy tools and techniques can provide
you any real improvements in quality and productivity.

The hard part:  If you know how to help someone stay on a diet, or exercise
regularly, or eat right, then you have the people skills to get programmers
(or yourself as a programmer) to exercise essential disciplines.  You must,
as a programmer/supervisor/manager, build an environment that encourages these
activities.

* Bruce Benson                   + Internet  - bwb@sei.cmu.edu +       +
* Software Engineering Institute + Compuserv - 76226,3407      +    >--|>
* Carnegie Mellon University     + Voice     - 412 268 8496    +       +
* Pittsburgh PA 15213-3890       +                             +  US Air Force

marick@m.cs.uiuc.edu (06/29/90)

> /* Written 10:02 am  Jun 28, 1990 by bwb@sei.cmu.edu in m.cs.uiuc.edu:comp.software-eng */
> For low maturity software organizations (about 85% of us), *simple* solutions,
> appropriately applied, will bring about *big* improvements.  You got to
> solve the simple problems before the fancy tools and techniques can provide
> you any real improvements in quality and productivity.

Agreed.  In the IBM paper, the action team (those who implement solutions) is
3-4 people for organizations of 20-50 or 8-10 for larger organizations
of 200-250.  These people spend about 10% of their time on this task
-- this comes out to less than 1% of the total people-time.  People
spend an average of 24 person-hours per action -- they're not
implementing monolithic silver bullets.

Brian Marick
Motorola @ University of Illinois
marick@cs.uiuc.edu, uiucdcs!marick

pdsmith@bbn.com (Peter D. Smith) (06/30/90)

In article <7669@fy.sei.cmu.edu> bwb@sei.cmu.edu (Bruce Benson) writes:
>For low maturity software organizations (about 85% of us), *simple* solutions,
>appropriately applied, will bring about *big* improvements.  You got to
>solve the simple problems before the fancy tools and techniques can provide
>you any real improvements in quality and productivity.

[Large amounts of text removed]

I concur.  Where I used to work, the time to resolve bugs (and, as a side
effect, the number of bugs found) dropped after the QA department started
to list, in every weekly staff meeting, who had bugs over 30 days old.  Our
average response time dropped from over a year (sic!) to two weeks.

					Peter D. Smith