[comp.unix.wizards] Ware Ware Wizardjin

kemnitz@gaia.berkeley.edu (Greg Kemnitz) (04/07/91)

While we who have mastered the intricacies of the computer and have high
typing speeds may not "get any more work done" on fast computers with bitmapped
graphics, etc, there is an entire world of new people getting work done on
computers who would not be using them if it weren't for those Computationally
Expensive and Wasteful (tm) interfaces.  Gee, listening to some wizards, you'd
think the bad old days had come back when computer time was more important
than human time, and Herculean feats of engineering were required to make
the computer do much of anything.

I suppose some of us miss the days of yore when computers were the altars
through which the common people worshipped us, rather than being things that
the "common people" use to get their work done.  We may look down our noses
at those who think a device driver has something to do with auto racing, and
for whom the options to IOCTL are not the stuff of mortal feuds, but they are
the people who pay our salaries and justify our existence.

-----------------------------------------------------------------------
Greg Kemnitz                  |      "I ran out of the room - I
Postgres Chief Programmer     |      didn't want to be killed by a pile
278 Cory Hall, UCB            |      of VMS manuals" :-)
(415) 642-7520                |
kemnitz@postgres.berkeley.edu |      --A friend at DEC Palo Alto in the Quake

ed@mtxinu.COM (Ed Gould) (04/07/91)

>Gee, listening to some wizards, you'd think the bad old days had
>come back when computer time was more important than human time,
>and Herculean feats of engineering were required to make the computer
>do much of anything.

That's not the point at all.  The point is that most of the cycles
used by fancy GUIs don't help productivity - or usability by novices -
at all.  What they do is paint extra goo on the screen, and they
do it badly at that.  There are at least two high-powered systems
for managing bitmapped displays that do not waste all those cycles
(MGR and Plan 9).  Even the Macintosh makes better use of its
graphics and processor resources than does X.  Remember the original
Mac?  It had a fairly slow 68000 (not an '010 or '020) and a *total*
of 128KB of RAM.  A lot of people got very useful work done on it.
Many of them were novices.

-- 
Ed Gould			No longer formally affiliated with,
ed@mtxinu.COM			and certainly not speaking for, mt Xinu.

"I'll fight them as a woman, not a lady.  I'll fight them as an engineer."

kemnitz@POSTGRES.BERKELEY.EDU (Greg Kemnitz) (04/08/91)

I suppose my initial response may have been due to inattentive reading of this
thread - sorry about that.

>>Gee, listening to some wizards, you'd think the bad old days had
>>come back when computer time was more important than human time,
>>and Herculean feats of engineering were required to make the computer
>>do much of anything.
>
>That's not the point at all.  The point is that most of the cycles
>used by fancy GUIs don't help productivity - or usability by novices -
>at all.  What they do is paint extra goo on the screen, and they
>do it badly at that.  There are at least two high-powered systems
>for managing bitmapped displays that do not waste all those cycles
>(MGR and Plan 9).  Even the Macintosh makes better use of its
>graphics and processor resources than does X.  Remember the original
>Mac?  It had a fairly slow 68000 (not an '010 or '020) and a *total*
>of 128KB of RAM.  A lot of people got very useful work done on it.
>Many of them were novices.

X now has been elevated to the status of "Standard", and now that this has
happened, there will be little work on such protocols outside of a few
research labs for some time.  Truthfully I don't know whether this is good or
bad; it seems that machines like the Sparc II and DEC 5K have finally gotten
enough moxie to run X well and their prices will be in the "easily affordable"
range in a couple years or less.  Also, the standard means that finally there'll
be the possibility of easy-to-use, graphical software for UNIX that has decent
manuals, regular releases, 24-hour phone support, and enough sales volume to
get the unit price under $1,000.  Only when this happens will UNIX cease to be
a "niche" OS.

When I first encountered X, I thought that it was truly horrible - it was
painfully slow, quite a pain to program, and binaries linked to it tended
to fill up the disk rather quickly, especially as toolkit upon toolkit was 
layered on top of it.  But it appears that it is easier to wait for fast
machines rather than to design standard graphics protocols that aren't bloated,
politically acceptable masses.  Also, it appears that the de facto trend in
industry is to hope hardware improves fast enough to let poorly written
software run well rather than writing software properly, and it is hard to
argue that this strategy has been a complete failure, even if it is a sloppy
approach.
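
To give a feel for what "quite a pain to program" means, here is a rough
sketch (not from any real program, error handling mostly omitted) of about
the least Xlib code that will put an empty window on the screen and wait
for a keypress - and this is before any toolkit gets layered on top:

    #include <X11/Xlib.h>
    #include <stdio.h>

    int
    main()
    {
        Display *dpy;
        Window   win;
        XEvent   ev;

        if ((dpy = XOpenDisplay(NULL)) == NULL) {
            fprintf(stderr, "can't open display\n");
            return 1;
        }

        /* One request to create the window, another to map it. */
        win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                  0, 0, 300, 200, 1,
                                  BlackPixel(dpy, DefaultScreen(dpy)),
                                  WhitePixel(dpy, DefaultScreen(dpy)));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        /* The client runs its own event loop even to do nothing at all. */
        for (;;) {
            XNextEvent(dpy, &ev);
            if (ev.type == KeyPress)
                break;
        }
        XCloseDisplay(dpy);
        return 0;
    }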

-----------------------------------------------------------------------
Greg Kemnitz                  |      "I ran out of the room - I
Postgres Chief Programmer     |      didn't want to be killed by a pile
278 Cory Hall, UCB            |      of VMS manuals" :-)
(415) 642-7520                |
kemnitz@postgres.berkeley.edu |      --A friend at DEC Palo Alto in the Quake

gwyn@smoke.brl.mil (Doug Gwyn) (04/08/91)

In article <12535@pasteur.Berkeley.EDU> kemnitz@gaia.berkeley.edu (Greg Kemnitz) writes:
>I suppose some of us miss the days of yore when computers were the altars
>through which the common people worshipped us, rather than being things that
>the "common people" use to get their work done.  We may look down our noses
>at those who think a device driver has something to do with auto racing, and
>for whom the options to IOCTL are not the stuff of mortal feuds, but they are
>the people who pay our salaries and justify our existence.

I hear this sort of thing fairly often, and don't know who started it.
I've been programming computers for around 25 years, and have never been
"worshipped", nor would I have wanted that.  Also, my existence is most
emphatically not justified by being a slave to others.

Perhaps what many of the old-timers miss most is the expectation that
people who use computers would know what they are doing.  The idea that
an arbitrary naive human should be able to properly use a given tool
without training or understanding is even more wrong for computing than
it is for other tools (e.g. automobiles, airplanes, guns, power saws).
I hate to think how much time I've lost trying to help computer users
who could have helped themselves if they had spent even a
few hours of study before proceeding to mess around with the computer.

cgy@cs.brown.edu (Curtis Yarvin) (04/08/91)

In article <15751@smoke.brl.mil> gwyn@smoke.brl.mil (Doug Gwyn) writes:
>In article <12535@pasteur.Berkeley.EDU> kemnitz@gaia.berkeley.edu (Greg Kemnitz) writes:
>>I suppose some of us miss the days of yore when computers were the altars
>>through which the common people worshipped us, rather than being things that
>>the "common people" use to get their work done.  We may look down our noses
>>at those who think a device driver has something to do with auto racing, and
>>for whom the options to IOCTL are not the stuff of mortal feuds, but they are
>>the people who pay our salaries and justify our existence.
>
>Perhaps what many of the old-timers miss most is the expectation that
>people who use computers would know what they are doing.  The idea that
>an arbitrary naive human should be able to properly use a given tool
>without training or understanding is even more wrong for computing than
>it is for other tools (e.g. automobiles, airplanes, guns, power saws).
>I hate to think how much time I've lost trying to help computer users
>who could have helped themselves if they had spent even a
>few hours of study before proceeding to mess around with the computer.

I'm a rank newbie compared to you (4 years with unix), but I beg to differ.

My observation is that an _inquisitive_ user can learn to use any software
with a simple user interface and a help facility.  It's true that studying
the manual helps; but I think the problem is the demise (or at least the
outnumberment) of curiosity.  A lot of users fear and loathe the computer,
and want to get their work done while learning as little about it as
possible.  By contrast, the inquisitive user is intrigued by the machine,
and actually enjoys learning.  These stereotypes are of course extreme and
exaggerated, but my point is serious.  As the use of computers has
spread outside the technical community, the inquisitive users have become
outnumbered.  Hence the birth of touchy-feely interfaces, which are not
designed to give the user clean access to the machinery, but rather designed
to shield him (or her) from it.

razdan@phx.mcd.mot.com (Anshuman Razdan) (04/09/91)

In article <9104072151.AA28702@gaia> kemnitz@POSTGRES.BERKELEY.EDU (Greg Kemnitz) writes:

   When I first encountered X, I thought that it was truly horrible - it was
   painfully slow, quite a pain to program, and binaries linked to it tended
   to fill up the disk rather quickly, especially as toolkit upon toolkit was 
   layered on top of it.  But it appears that it is easier to wait for fast

...stuff deleted

The shared libs did the trick for me. Now the binaries do not
fill up the disk.
--

Anshuman Razdan

************************************************************
* razdan@toy			Test and Methodology Group *
*							   *
* razdan@phx.mcd.mot.com	Diablo Plant, Tempe  Az    *
************************************************************

tr@SAMADAMS.PRINCETON.EDU (Tom Reingold) (04/09/91)

In article <9104072151.AA28702@gaia> kemnitz@POSTGRES.BERKELEY.EDU
(Greg Kemnitz) writes:

$ [...]
$ But it appears that it is easier to wait for fast machines rather than
$ to design standard graphics protocols that aren't bloated, politically
$ acceptable masses.  Also, it appears that the de facto trend in
$ industry is to hope hardware improves fast enough to let poorly written
$ software run well rather than writing software properly, and it is hard
$ to argue that this strategy has been a complete failure, even if it is
$ a sloppy approach.

Being a detail-minded programmer, I really hate to agree with this, but
I am finding it's sometimes correct.  I love properly designed programs
and I hate poorly designed and implemented programs, but I now work on
a product that suffered the design flaws recently covered in this
thread.  I am beginning to see that although the excess size and
duplicated function cost the user, the speed at getting the user what
he wants is also a major factor.  If he doesn't get what he wants
*soon*, even if improperly done, it may be a lost opportunity.

We hope to take some time and fix things up in our product.  If we
don't manage to find the time, it will be because we will be doing a
major rewrite anyway.  The major rewrite will be spawned by the need to
add large amounts of functionality, making our current product
obsolete.  If it's obsolete, tightening it up would be futile.  In
doing the rewrite, we hope to learn from our mistakes of excess size
and redundancy.
--
        Tom Reingold
        tr@samadams.princeton.edu  OR  ...!princeton!samadams!tr
        "Warning: Do not drive with Auto-Shade in place.  Remove
        from windshield before starting ignition."

rcd@ico.isc.com (Dick Dunn) (04/09/91)

kemnitz@gaia.berkeley.edu (Greg Kemnitz) writes:
> While we who have mastered the intricacies of the computer and have high
> typing speeds may not "get any more work done" on fast computers with bitmapped
> graphics, etc, there is an entire world of new people getting work done on
> computers who would not be using them if it weren't for those Computationally
> Expensive and Wasteful (tm) interfaces...

Wrong on several counts:
	- Ordinary folks were getting work done on computers before all the
	  gee-whiz goo was added on.
	- The complaints were not aimed at convenient interfaces.  The
	  first target was the excessive complexity of today's interfaces.
	  If anything, those of us who work with computers all day long,
	  and construct software for a living, are BETTER able to deal
	  with the complexity than the more naive users.
	- The second target was wasted resources.  If machine resources
	  are being wasted, they're wasted every bit as much for the naive
	  user as for the wizard.

At the core of the argument is a non-sequitur--that more people are using
computers, and current interfaces are enormously complex and inefficient,
therefore the complexity and inefficiency are necessary to make the
machines usable.  Correlation does not imply causality.  In fact, there is
some reasonable evidence that more people would use computers, and learn to
do more with them, if the user interfaces were simple and efficient.

Worse yet, all the arcana in user interfaces has the tendency to create a
secondary priesthood among the user community--the high priests remain the
ones who create the software, the ordinary priests are the users who can
figure out how to use the software.

>...Gee, listening to some wizards, you'd
> think the bad old days had come back when computer time was more important
> than human time, and Herculian feats of engineering were required to make
> the computer do much of anything.

There's a bit of an appeal to a guilt trip in this.

For me, the moaning about wasted computer time comes down to the fact that
machines are easily an order of magnitude faster, larger (memory), and
cheaper than a decade ago--however that multiplies out to some total
magical increase--yet we've barely carried a factor of two improvement out
through all the layers of software crap to the end user.  The point is NOT
that we (programmers) are coveting the increased capacity; it's that we're
not delivering it where it belongs--to the end user.  For example, a window
manager that uses 10% of your memory and 5-10% of your CPU to give you bas-
relief window borders and cute icons is robbing you--the user--blind.
-- 
Dick Dunn     rcd@ico.isc.com -or- ico!rcd       Boulder, CO   (303)449-2870
   ...Lately it occurs to me what a long, strange trip it's been.

ed@mtxinu.COM (Ed Gould) (04/09/91)

>X now has been elevated to the status of "Standard", and now that
>this has happened, there will be little work on such protocols
>outside of a few research labs for some time.  Truthfully I don't
>know whether this is good or bad; it seems that machines like the
>Sparc II and DEC 5K have finally gotten enough moxie to run X well
>and their prices will be in the "easily affordable" range in a
>couple years or less.

It is true that X has largely become a de facto standard.  What
that should mean, really, is that the X *protocol* has become
standard.  It also happens to mean that the current X *implementation*
has become standard.  That's where I have problems accepting X.
The current implementation is too large and too slow.  There is no
good technical reason that a small, efficient X server couldn't be
written.  The same is, to a somewhat lesser degree, true of the
client code and toolkits.

>Also, the standard means that finally there'll be the possibility
>of easy-to-use, graphical software for UNIX that has decent manuals,
>regular releases, 24-hour phone support, and enough sales volume
>to get the unit price under $1,000.  Only when this happens will
>UNIX cease to be a "niche" OS.

I have no argument with easy to use, well documented software.  I
have a problem with it only when it eats more of my system than I
believe it deserves.  I don't even have a problem with features,
so long as they have a real purpose, and aren't just somebody's
ill-considered idea of a neat hack that "will just take a few lines
of code."  3-D pushbuttons on a GUI just aren't worth the cycles.

-- 
Ed Gould			No longer formally affiliated with,
ed@mtxinu.COM			and certainly not speaking for, mt Xinu.

"I'll fight them as a woman, not a lady.  I'll fight them as an engineer."

pauld@stowe.cs.washington.edu (Paul Barton-Davis) (04/10/91)

In article <1991Apr9.020525.13001@mtxinu.COM> ed@mtxinu.COM (Ed Gould) writes:

>It is true that X has largely become a de facto standard.  What
>that should mean, really, is that the X *protocol* has become
>standard.  It also happens to mean that the current X *implementation*
>has become standard.  That's where I have problems accepting X.
>The current implementation is too large and too slow.  There is no
>good technical reason that a small, efficient X server couldn't be
>written.  The same is, to a somewhat lesser degree, true of the
>client code and toolkits.

I remember thinking, when X and NeWS were still battling for the
console, that it's a tragedy X could not have learnt more from the
PostScript world and used cheap protocol requests.  The classic
example I remember was "how many bytes does it take X or NeWS
to ask for a 720 point `A' to be displayed?"  The implementation,
well, I don't know enough about graphics h/w to comment.  But the protocol
itself is responsible for sucking up so many of the cycles this group
has been complaining about.
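
To make the comparison concrete (a sketch only, not a measured benchmark;
the XLFD font name is hypothetical, and the display and window are assumed
to have been opened elsewhere):

    #include <X11/Xlib.h>

    void
    draw_big_a(Display *dpy, Window win)
    {
        /* In PostScript/NeWS the whole job is a couple of short lines of
         * interpreted text, roughly:
         *
         *     /Times-Roman findfont 720 scalefont setfont
         *     100 100 moveto (A) show
         *
         * The X client below issues several separate protocol requests,
         * each with its own fixed header, and can only ask for bitmap
         * fonts the server already happens to have: */
        Font f  = XLoadFont(dpy, "-adobe-times-medium-r-normal--*-7200-*");  /* OpenFont  */
        GC   gc = XCreateGC(dpy, win, 0, NULL);                              /* CreateGC  */

        XSetFont(dpy, gc, f);                          /* ChangeGC request      */
        XDrawString(dpy, win, gc, 100, 700, "A", 1);   /* text drawing request  */
        XFlush(dpy);

        XFreeGC(dpy, gc);
        XUnloadFont(dpy, f);
    }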

-- 
Paul Barton-Davis			<pauld@cs.washington.edu>
UW Computer Science Lab		``to shatter tradition makes us feel free''

rbj@uunet.UU.NET (Root Boy Jim) (04/10/91)

<9104072151.AA28702@gaia> kemnitz@POSTGRES.BERKELEY.EDU (Greg Kemnitz) writes:
>Somebody wrote:
>>>Gee, listening to some wizards, you'd think the bad old days had
>>>come back when computer time was more important than human time,
>>>and Herculean feats of engineering were required to make the computer
>>>do much of anything.
>>
>>That's not the point at all. 

Actually, that is the point.

>When I first encountered X, I thought that it was truly horrible

You are correct.

>- it was
>painfully slow, quite a pain to program, and binaries linked to it tended
>to fill up the disk rather quickly, especially as toolkit upon toolkit was 
>layered on top of it.

That's part of it. But the real problem is that it's too difficult to
program. You can't possibly remember all those include files, arguments,
and function calls. You'll need a whole shelf of manuals to write
the simplest code.

Another thing I hate about X is the protocol. There are missing
fields (requests are implicitly numbered; replies include this
number but not the function code), yet large blocks of space are
wasted in many requests. Some packets are timestamped, others aren't.
The protocol was designed to do only what it does; it is very difficult
to capture and replay an X session with any reliability (I know,
I've done it). Device independence? Hogwash! Unlike NeWS, the
interpreter is in the wrong place. Yeah the server swaps bytes,
but the client must use the server's pixel sizes, colormaps, etc.
Once a given conversation has been recorded, it is not portable
to a different server, and there is no obvious translation.
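
For anyone who hasn't stared at the wire format, here is a rough sketch of
the fixed headers (paraphrased from the protocol spec; the struct and field
names are mine, not Xlib's, and they assume the compiler adds no padding):

    /* Every request starts with a 4-byte header.  The sequence number is
     * NOT in it - both ends simply count requests as they go by. */
    struct x_request_header {
        unsigned char  opcode;    /* major opcode - the "function code"     */
        unsigned char  detail;    /* minor opcode or request-specific data  */
        unsigned short length;    /* total request length in 4-byte units   */
    };

    /* Every reply is at least 32 bytes.  It carries the sequence number of
     * the request it answers, but not that request's opcode, so anything
     * that wants to capture or replay a session has to track the whole
     * request stream itself just to know what each reply means. */
    struct x_reply_header {
        unsigned char  type;      /* 1 = Reply                              */
        unsigned char  detail;    /* reply-specific data                    */
        unsigned short sequence;  /* low 16 bits of the request count       */
        unsigned long  length;    /* extra reply data beyond the 32 bytes,  */
                                  /* counted in 4-byte units                */
        /* ...followed by 24 more bytes of reply fields or padding... */
    };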
-- 
		[rbj@uunet 1] stty sane
		unknown mode: sane

mike@bria.UUCP (Michael Stefanik) (04/10/91)

In an article, ico.isc.com!rcd (Dick Dunn) writes:
|For me, the moaning about wasted computer time comes down to the fact that
|machines are easily an order of magnitude faster, larger (memory), and
|cheaper than a decade ago--however that multiplies out to some total
|magical increase--yet we've barely carried a factor of two improvement out
|through all the layers of software crap to the end user.  The point is NOT
|that we (programmers) are coveting the increased capacity; it's that we're
|not delivering it where it belongs--to the end user.  For example, a window
|manager that uses 10% of your memory and 5-10% of your CPU to give you bas-
|relief window borders and cute icons is robbing you--the user--blind.

I believe this to be 100% correct, and I think that Dick deserves a public
pat on the back (consider yourself "patted" :-)

All of the cutesy three dimensional shaded windows that explode in 32 colors
as they open are a complete waste of time and money.  Users are concerned
about *getting the job done* and nothing that I have seen in recent years
has really advanced this point.  In other, less eloquent words: "If you pour
chocolate on a turd, that don't make it no candy bar."
-- 
Michael Stefanik, MGI Inc, Los Angeles | Opinions stated are never realistic
Title of the week: Systems Engineer    | UUCP: ...!uunet!bria!mike
-------------------------------------------------------------------------------
If MS-DOS didn't exist, who would UNIX programmers have to make fun of?

tb@Materna.DE (Torsten Beyer) (04/10/91)

mike@bria.UUCP (Michael Stefanik) writes:

>All of the cutesy three dimensional shaded windows that explode in 32 colors
>as they open are a complete waste of time and money.  Users are concerned
>about *getting the job done* and nothing that I have seen in recent years
>has really advanced this point.  In other, less eloquent words: "If you pour

Not quite right.  I've seen quite a lot of users who really didn't know what
they wanted, but who had been exposed to whizbang interfaces for too long a
time to understand that this was certainly not what they needed.  The
bottom line is that in order for me to get my stuff sold I have to play the
game and incorporate all that silly 3D (when's 4D to come :-) stuff into my
programs.  Although they get slower, and in certain cases harder to use, this is
what a lot of customers want.

Any idea how to bring such people back to the right way?


			ciao
				-Torsten
--
Torsten Beyer                      e-mail : tb@Materna.DE
Dr. Materna GmbH		   VOX    : +49 231 5599 225
Vosskuhle 37			   FAX    : +49 231 5599 100
D-4600 Dortmund 1,  West Germany

jerry@TALOS.UUCP (Jerry Gitomer) (04/10/91)

kemnitz@POSTGRES.BERKELEY.EDU (Greg Kemnitz) writes:

|When I first encountered X, I thought that it was truly horrible - it was
|painfully slow, quite a pain to program, and binaries linked to it tended
|to fill up the disk rather quickly, especially as toolkit upon toolkit was
|layered on top of it.  But it appears that it is easier to wait for fast
|machines rather than to design standard graphics protocols that aren't bloated,
|politically acceptable masses.  Also, it appears that the de facto trend in
|industry is to hope hardware improves fast enough to let poorly written
|software run well rather than writing software properly, and it is hard to
|argue that this strategy has been a complete failure, even if it is a sloppy
|approach.

	I take exception to it being "hard to argue that this strategy 
	has been a complete failure".  What I see is an industry that has
	forgotten its brief past, ignored any research done more than ten
	years ago, and is constantly moving from fad to fad.

	Over 25 years ago IBM published data (I only have a copy of a
	final published chart) illustrating that doubling the size of
	a program quadrupled the cost of developing that program -- and
	this held true regardless of the programming language used.
	Therefore, IMHO, today's baroque software is not only a complete
	failure, but a fiasco.  I believe that baroque is a synonym for
	broke.  This bloated software is (generally speaking) noted for
	substituting features no one really needs for the reliability that
	every serious user needs.

				Jerry

-- 
Jerry Gitomer at National Political Resources Inc, Alexandria, VA USA
I am apolitical, have no resources, and speak only for myself.
Ma Bell (703)683-9090  (UUCP: (until 4/15)  ...uupsi!pbs!npri6!jerry 

mjr@hussar.dco.dec.com (Marcus J. Ranum) (04/11/91)

tb@Materna.DE (Torsten Beyer) writes:
>[...] Although they get slower, and in certain cases harder to use, this is
>what a lot of customers want.
>Any idea of how to bring such people back to the right way ?

	Make your software conceptually and intellectually consistent,
with a simple user interface, and make it faster and cheaper than the
glitz pink Cadillacs, and it might sell. The "gotcha" here is that there
are many, many customers who won't even look at something that doesn't
run under [insert your favorite bloated window thing].

	Basically, what we're talking about here is a counter-revolution. We
are facing the same kind of situation that the American automobile
industry faced: features VS performance and reliability. Someone, some
day, is going to play the role of Honda and produce cheap, effective,
and reliable code in a commodity-mode. How many perfectly good applications
have you seen crash because of interactions with stupidities in window
managers? I don't know of any studies, but my intuition is that simple
code is not only faster, it is more reliable. N-d widgets increase the
bugginess of window managers proportionally to N. ;)  (wait for the 4-d
widgets and the "backing temporal store" X extension).

	The way to bring people back to the "right way" is to be cheaper,
better, faster, and to protect users' investments in hardware by allowing
it to support MORE software without an upgrade. Would I be able to sell
a product that implemented 60% of the functionality of "foo" but cost
less, was less buggy, and could support twice as many concurrent users
as "foo" without requiring a hardware upgrade? The gotcha is if "foo" is
a standard and a FIPS and all that...


mjr.
--
	"I don't care if my lettuce has DDT on it - as long as it's crisp"
		Jorma Kaukonen

als@bohra.cpg.oz.au (Anthony Shipman) (04/11/91)

In article <1991Apr10.181600.20926@decuac.dec.com>, mjr@hussar.dco.dec.com (Marcus J. Ranum) writes:
> industry faced: features VS performance and reliability. Someone, some
> day, is going to play the role of Honda and produce cheap, effective,
> and reliable code in a commodity-mode. How many perfectly good applications

No they won't.  The look-and-feel copyrights, patents, etc. will ensure that
only the establishment can write such software.  If newcomers want to, they will
have to pay royalties through the nose, ensuring there is no price competition.

-- 
Anthony Shipman                 "You've got to be taught before it's too late,
Computer Power Group             Before you are six or seven or eight,
19 Cato St., East Hawthorn,      To hate all the people your relatives hate,
Melbourne, Australia             You've got to be carefully taught."  R&H

adoyle@bbn.com (Allan Doyle) (04/11/91)

In article <9104081805.AA14112@samadams.Princeton.EDU> tr@SAMADAMS.PRINCETON.EDU (Tom Reingold) writes:
>and I hate poorly designed and implemented programs, but I now work on
>a product that suffered the design flaws recently covered in this
>thread.  I am beginning to see that although the excess size and
>duplicated function cost the user, the speed at getting the user what
>he wants is also a major factor.  If he doesn't get what he wants
>*soon*, even if improperly done, it may be a lost opportunity.
>
>We hope to take some time and fix things up in our product.  If we
>don't manage to find the time, it will be because we will be doing a
>major rewrite anyway.  The major rewrite will be spawned by the need to
>add large amounts of functionality, making our current product
>obsolete.  If it's obsolete, tightening it up would be futile.  In
>doing the rewrite, we hope to learn from our mistakes of excess size
>and redundancy.

Let's step back a minute and look at the shop-worn analogy of software
to cars. People have flogged the car/program user-interface issue to
death - to the point where I'm reluctant to mention the auto industry.
But here goes...

In the U.S. it takes up to seven years for a car to go from concept to
the showroom. (The cycle is getting shorter and 7 may be too long, no
flames please). It takes the Japanese about 3. Just accept the fact
that it takes more than 1 year. Yet new models come out every year.
Yes, some of the models are just tweaks of features on top of old models
but there are still *new* models every year. How do they do it? They
use a design pipeline. While Design A is in production, Design B
is in retooling, Design C is in prototyping, Design D is in concept 
formation, etc. Design A gets dumped, design B moves into production,
etc.

How many companies do this with software? I suspect it is near heresy
to suggest that a SW company begin design of a new rev of software using
new technology before the previous rev is out the door. Look at how
long it takes Lotus to get from 2.0 to 3.0, Apple to get from 6.3 to 7.0,
etc. If you want to introduce new technology on a regular, timely basis,
why not start working with the new technology sooner?

I suspect this has to do with culture. Programmers don't want to work on
1990 technology if they see a group working on 1991 technology. The good
ones in the 1990 group would sooner quit and go work for company X that
promises them a shot at their 1991 project.

It also has to do with management. How many SW company executives have
a background in manufacturing? How many work for the near-term profits?
It takes a pretty long view to see that starting a rewrite of a product
that's not even out the door might be a real revenue enhancer in the 
future.

Are there any venture capitalists out there who want to fund two or three
groups of programmers before the first product gets out the door?
[Give me a call if you are one :-) ]



Allan Doyle                                        adoyle@bbn.com
Bolt Beranek and Newman,  Incorporated             +1 (617) 873-3398
10 Moulton Street, Cambridge, MA 02138             USA

yarvin-norman@cs.yale.edu (Norman Yarvin) (04/12/91)

ed@mtxinu.COM (Ed Gould) writes:
>  There is no
>good technical reason that a small, efficient X server couldn't be
>written.

To initialize a connection and open a window with the X protocol takes 12
bytes sent from client to server, followed by ~130 bytes (minimum) sent from
server to client, followed by 40 bytes (minimum) from client to server then
32 bytes from server to client.
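
For reference, the 12 bytes the client sends first look roughly like this
(layout per the protocol spec, but the struct and field names are my own,
the two authorization strings that normally follow are omitted, and the
size assumes the compiler adds no padding):

    struct x_conn_client_prefix {
        unsigned char  byte_order;      /* 'B' = MSB first, 'l' = LSB first   */
        unsigned char  pad1;
        unsigned short protocol_major;  /* 11                                 */
        unsigned short protocol_minor;  /* 0                                  */
        unsigned short auth_name_len;   /* authorization protocol name length */
        unsigned short auth_data_len;   /* authorization protocol data length */
        unsigned short pad2;
    };                                  /* 12 bytes total                     */

The ~130-byte-and-up answer that comes back carries the server's vendor
string, its pixmap formats, and a description of every screen, whether or
not the client cares about any of it.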

Pure communication time is possibly the least of the overheads which this
profligacy generates.  Time gets spent generating and interpreting this
stream.  Since programmers would hate to do this themselves, a library must
be written to keep them happy, adding another layer of overhead.  Since the
raw byte stream is nearly unusable the library has to be completely general,
but that makes it tedious to use; this leads to other libraries on top of the
basic one.  These libraries add overhead both in time and in memory; if
shared libraries are not used the latter cost becomes nearly prohibitive.
All these seem to me to largely be consequences of the design of the
protocol.

--
Norman Yarvin					yarvin-norman@cs.yale.edu
 "We are at the moment in such a deep state of confusion that we are bound to
 learn something from it."  -- Sir Rudolph Peirls

rcd@ico.isc.com (Dick Dunn) (04/13/91)

mike@bria.UUCP (Michael Stefanik) writes:
> In an article, ico.isc.com!rcd (Dick Dunn) writes:
> |[various bits of whining about resources wasted by code bloat]

> I believe this to be 100% correct, and I think that Dick deserves a public
> pat on the back (consider yourself "patted" :-)

Dick deserves nothing of the sort.  He hasn't *done* anything; it's all
talk.  He's repeated the same tired carping about how the world is going to
hell in a handbasket.  Worse still, he's posting just to comp.unix.wizards,
where he knows he'll have a sympathetic audience.  It has about as much
significance as a bunch of old men sitting on the porch of the country
store, spinning yarns about how the world was when they were young.

In fact, I know Dick well enough to be able to tell you that he has done
some work on an X server; he may just be trying to atone (however
slightly!) for that sin.  His words pale before his actions.

Face it, folks...talking about simpler code while other folks are writing
complex, bloated code equals more complex, bloated code.
-- 
Dick Dunn     rcd@ico.isc.com -or- ico!rcd       Boulder, CO   (303)449-2870
   ...While you were reading this, Motif grew by another kilobyte.

wcs@cbnewsh.att.com (Bill Stewart) (04/13/91)

In article <63660@bbn.BBN.COM> adoyle@vax.bbn.com (Allan Doyle) writes:
]but there are still *new* models every year. How do they do it? They
]use a design pipeline. While Design A is in production, Design B
]is in retooling, Design C is in prototyping, Design D is in concept ..
]How many companies do this with software? I suspect it is near heresy
]to suggest that a SW company begin design of a new rev of software using
]new technology before the previous rev is out the door. Look at how

In the traditional top-down methodology for large software projects,
this is done all the time.  The project isn't done just by hackers,
it's got market-analyzers, requirements-writers, system-designers,
system-engineers, code-writers, system-testers, etc.  
If the project is big enough to need lots of people, when the
requirements-writers get done writing the requirements for version N,
they start writing the requirements for version N+1. 

One big problem with this approach is lack of feedback.
You never really understand the problem at the beginning.
You might make a bad design decision early on, but you can't tell
until you learn the things you learn by implementing pieces of the
system based on that design decision, which may be much later.

This is why things like prototyping are critical, and why
military systems ALWAYS have cost overruns - the procurement process
forces an extremely rigid formal design system without much feedback;
the people who evaluate the initial phases tend to be bean-counters
who don't understand the implications of the decisions; and it's easier
*contractually* to try and implement an atrocious design
than to go back and fix the initial decisions - especially if the
bad decisions were requirements written by the customer.

If the design cycles are too long, there's often no way out -
except the Mythical Man-Month solution :-) of more and more bodies.

But if you've got a clean development process with a small core of
good people who know the whole system, you can still do pipelining.
Most large systems have a basic capability with a lot of modular
features built on top of it.  Typical things you do in later
releases are to add features that didn't get in Rel. 1, and to make
the core system go faster or add capabilities while upward-compatible.
If you do it well, this can all go on in parallel.
-- 
				Pray for peace;		  Bill
# Bill Stewart 908-949-0705 erebus.att.com!wcs AT&T Bell Labs 4M-312 Holmdel NJ
# Actually, it's *two* drummers, and we're not marching, we're *dancing*.
# But that's the general idea.

mouse@thunder.mcrcim.mcgill.edu (der Mouse) (04/13/91)

In article <71242@brunix.UUCP>, cgy@cs.brown.edu (Curtis Yarvin) writes:
> In article <15751@smoke.brl.mil> gwyn@smoke.brl.mil (Doug Gwyn) writes:
>> Perhaps what many of the old-timers miss most is the expectation
>> that people who use computers would know what they are doing.
[or at least have enough of an idea to avoid major blunders]
> My observation is that an _inquisitive_ user can learn to use any
> software with a simple user interface and a help facility.  It's true
> that studying the manual helps; but I think the problem is the demise
> (or at least the outnumberment) of curiosity.

Partially, but there's also an unrealistic expectation.  Nobody expects
to go driving without knowing how to drive.  When someone uses (say) a
paint sprayer without knowing how it works and how to use it, and (say)
manages to explode a pressurized container of paint all over
everything, everybody says "why didn't you learn how to use it", not
"paint sprayers should be novice-friendly".

Why should computers be any different?

> A lot of users fear and loathe the computer, and want to get their
> work done while learning as little about it as possible.  By
> contrast, the inquisitive user is intrigued by the machine, and
> actually enjoys learning.

True of any tool.  Unfortunately the North American school system
appears to be actively designed to kill curiosity and love of learning;
not coincidentally, I believe, this dreadful disease of expecting
absolutely anyone to be able to use a very complex tool with zero
training is at its worst in North America.

Of course, the priesthood types don't help matters any.  I prefer to
teach the person with the problem how to deal with it without help,
sort of along the lines of the proverb I think I heard as "give a man a
fish and you feed him for a day; teach a man to fish and you feed him
for a lifetime.".  (Besides, it means that next time that person
doesn't bother me with the same question all over again, and I can
continue reading netnews uninterrupted :-)

					der Mouse

			old: mcgill-vision!mouse
			new: mouse@larry.mcrcim.mcgill.edu

rbj@uunet.UU.NET (Root Boy Jim) (04/16/91)

In an article, ico.isc.com!rcd (Dick Dunn) writes:
?I've repeated the same tired carping about how the world is going to
?hell in a handbasket. 

No, it's going to Hell in a Bucket.
-- 
		[rbj@uunet 1] stty sane
		unknown mode: sane

gwyn@smoke.brl.mil (Doug Gwyn) (04/16/91)

In article <1991Apr12.233838.27407@cbnewsh.att.com> wcs@cbnewsh.att.com (Bill Stewart 908-949-0705 erebus.att.com!wcs) writes:
>military systems ALWAYS have cost overruns

While a lot of your points were valid, it is not true that all military
procurements have cost overruns.  I would wager that most do not.

gwyn@smoke.brl.mil (Doug Gwyn) (04/16/91)

In article <1991Apr13.101654.21974@thunder.mcrcim.mcgill.edu> mouse@thunder.mcrcim.mcgill.edu (der Mouse) writes:
>Partially, but there's also an unrealistic expectation.  Nobody expects
>to go driving without knowing how to drive.  When someone uses (say) a
>paint sprayer without knowing how it works and how to use it, and (say)
>manages to explode a pressurized container of paint all over
>everything, everybody says "why didn't you learn how to use it", not
>"paint sprayers should be novice-friendly".

Actually there is an alarming trend toward making manufacturers
liable for all product misuse.  For example, I've seen lawn mowers
labeled with warnings not to stick one's feet into the blade area,
in a feeble attempt to stave off lawsuits from stupid people (or their
estates).  In DC recently a law was passed to hold gun manufacturers
liable for the consequences of misuse of their products.  (Fortunately
important members of the US Congress told the DC administration that
they were about to lose their appropriations if that law was left on
the books.)

>Why should computers be any different?

Well, we were hoping that with computers we could emphasize the logical.

>Unfortunately the North American school system
>appears to be actively designed to kill curiosity and love of learning;

There is certainly something fundamentally wrong with US public education,
probably the very notion that it should be a governmental function.

>I prefer to teach the person with the problem how to deal with it
>without help

Right, but inefficient if postponed until after the usage problems
have already started to arise.