[net.works] "look for bugs"

@RUTGERS.ARPA:BRYAN@SU-SIERRA.ARPA (02/05/85)

From: Doug Bryan <BRYAN@SU-SIERRA.ARPA>



I agree with Charlie Levy's statement that most of a programmer's time
is spent in his step 3a, looking for bugs.  This situation should alarm
us as computer scientists.  Our current "programming environments", which
today are barely more than good operating systems, are not doing the job.

Programming tools which automate design and bug detection are greatly needed.
Such things as graphical design and simulation tools have been in use in the
hardware world for quite some time and need to be added to software
environments.

Although verification will not be "state-of-the-practice" for some time,
program annotation is a very powerful bug detection tool which is nearly
ready for industrial use.
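
To make the idea concrete, here is a minimal sketch in C of annotation in
its humblest executable form: assert() calls recording the preconditions
and postconditions that a real annotation tool would check statically or
monitor at run time.  The bsearch_int routine is invented for illustration,
not taken from any existing system.

    #include <assert.h>
    #include <stddef.h>

    /* An annotated binary search (hypothetical example).  The assert()
       calls state the programmer's intent; an annotation checker (or
       the run-time system) flags any execution that violates them. */
    int bsearch_int(const int *a, size_t n, int key)
    {
        assert(a != NULL);                /* precondition: valid array  */
        for (size_t i = 1; i < n; i++)
            assert(a[i - 1] <= a[i]);     /* precondition: sorted input */

        size_t lo = 0, hi = n;
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;  /* avoids overflow of lo + hi */
            if (a[mid] < key)
                lo = mid + 1;
            else
                hi = mid;
        }
        assert(lo == n || a[lo] >= key);  /* postcondition: lo is the   */
                                          /* first index not below key  */
        return (lo < n && a[lo] == key) ? (int)lo : -1;
    }

Many bugs surface this way the first time the routine runs, as a failed
assertion at the faulty spot, rather than days later as a wrong answer.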

A great deal of the debugging done on a software system is done late in
the life cycle or after its first use.  This debugging time could be
reduced if programming environments contained tools specifically for this
phase of the life cycle: configuration management and automated testing
tools, for instance.
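
A hedged sketch of what the automated-testing half might look like, again
in C: a table-driven regression harness (the clamp() routine and its test
cases are invented for illustration).  The value late in the life cycle is
that the whole table re-runs mechanically after every fix, so a change that
breaks old behavior is caught without anyone re-testing by hand.

    #include <stdio.h>

    /* Routine under test (hypothetical). */
    static int clamp(int x, int lo, int hi)
    {
        return x < lo ? lo : (x > hi ? hi : x);
    }

    struct test { int x, lo, hi, want; };

    int main(void)
    {
        static const struct test tests[] = {
            {  5, 0, 10,  5 },   /* in range: unchanged       */
            { -3, 0, 10,  0 },   /* below range: clamped up   */
            { 42, 0, 10, 10 },   /* above range: clamped down */
            {  0, 0,  0,  0 },   /* degenerate range          */
        };
        int failures = 0;

        for (size_t i = 0; i < sizeof tests / sizeof tests[0]; i++) {
            int got = clamp(tests[i].x, tests[i].lo, tests[i].hi);
            if (got != tests[i].want) {
                printf("FAIL case %zu: clamp(%d, %d, %d) = %d, want %d\n",
                       i, tests[i].x, tests[i].lo, tests[i].hi,
                       got, tests[i].want);
                failures++;
            }
        }
        printf("%d failure(s)\n", failures);
        return failures != 0;
    }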

I am not saying that I have any great new ideas here.  I am just saying that
if we recognize debugging as the most costly phase of software development,
then software costs can be reduced by placing more research and development
emphasis on the environment rather than the language itself or the standard
tools like compilers, linkers, loaders, editors. . . .

doug bryan
Stanford University

-------

doug@terak.UUCP (Doug Pardee) (02/06/85)

-----------
This commentary is rated "R".  It contains subject matter which is
considered heretical by anyone with less than 17 years experience,
and less experienced readers must be accompanied by a grizzled
old-timer.
-----------

> I agree with Charlie Levy's statement that most of a programmer's time
> is spent in his step 3a, looking for bugs.  This situation should alarm
> us as computer scientists.  Our current "programming environments", which
> today are barely more than good operating systems, are not doing the job.
> 
> Programming tools which automate design and bug detection are greatly needed.

I can give you a very good reason why programmers spend so much time
looking for bugs.  You're staring at it right now.  The interactive
terminal connected to a timesharing system.

Waaaay back in "the good old days", we used to get *ONE* crack at
running our programs each day.  (You will recall, perhaps, that an
IBM 360/40 was considered a mainframe, but it was slower than an
Apple ][).  Programs were punched on cards, and listed off-line on
equipment such as the IBM 407 accounting machine.

We produced more programs, bigger programs, and better programs,
under those conditions than programmers do under current conditions.
But it wasn't as much "fun" because we didn't get to play with the
computer very much.

We didn't spend much time debugging.  We spent our time preventing
bugs.  Not: "Oh I see the problem, I'll just change this a little
bit and try it again... Oh damn it still didn't work, *now* what's
wrong?"  Instead it was "Does this code still make sense?  What can
I think of that would cause it to freak out?  Can that happen?"

It wasn't unusual at all to find and fix ten or more bugs each day,
without a computer run.  And since there was plenty of time to
work on the program, the fixes were done right instead of kludging.
There was no inducement to produce half-baked programs (have you
looked at the "BUGS" section of the UNIX(tm) manual pages?  Why
hasn't somebody FIXED those after all these years?)

OK, enough history lesson.  The moral is that programmers will spend
a lot of time debugging if they don't take the time to prevent bugs.
And that means time reviewing the PROGRAM, away from the computer.
-- 
Doug Pardee -- Terak Corp. -- !{hao,ihnp4,decvax}!noao!terak!doug

david@daisy.UUCP (David Schachter) (02/09/85)

Why not a programming methodology that emphasizes >>avoidance<< of bugs?  Do
it right the first time!  I use such a methodology, but it is entirely manual:
neither the language I use nor the programming environment in which I use it
has any automatic support for preventing bugs.

Make an analogy: writing a program is similar to talking to a human being.
Yet the computer always misunderstands what we say: we have to repeat and
repeat, with minor modifications on every repetition, to have the computer
understand.  This is called "debugging".  But when talking to a human being,
our discourse rarely requires repetition.  There is enough redundancy in
the language and the communications channel to avoid misinterpretation most
of the time.

Perhaps a programming environment that "understands" programming and, more
importantly, the application to which the program will be put is needed.  Such
an environment would be able to deduce what you meant and correct minor errors.
Just a small matter of AI programming (Artificial Intelligence).  [That is
sarcasm, of course.]  Last year, some chaps from M.I.T. presented the Thursday
Afternoon Lecture to the R & D staff here at Daisy and discussed a start on
such a project.  They called it "The Programmer's Apprentice."  I wasn't too
enamoured of it: it seemed like a smart macro processor more than anything
else.  But at least it is a start at >>program synthesis<<.  Tell the machine
>>what<< you want.  Let it figure out >>how<< to get it done (what sort of
routine to use, how to structure the menus, etc.).  Do it right the first time!

In college, I used a batch PL/I interpreter called PL/C from Cornell.  It was
reasonably good at correcting simple errors.  This saved two re-runs for me,
on the average.

[This article represents only my opinions, not those of my company.]  {N.F.Q.}