[comp.sw.components] a comment on Using components

eugene@eos.UUCP (Eugene Miya) (05/11/89)

I'll read this group for a while.  It was beginning to sound like an
Ada buzzword group.  (Sorry, I came up through big software
projects, and I'm feeling punchy.)

In a nice article <11293@bloom-beacon.MIT.EDU> tada@athena.mit.edu 
by Michael Zehr:

>Most of the discussion so far seems to be directed at retrieving
>components, but there's been very little said about using them.
>
>background: i haven't done any programming in ADA, so if there's a
>definition of component that only applies to ADA, i don't know it.  i'm
>basing this on my "feel" of component -- namely a group of code that's
>intended for reuse.  

Any tool which is so specialized that it can't be used with any other
language has serious problems.  It would deserve to die quietly
in a corner (there are exceptions of course).  The problem with SOME of
the Ada people (those that don't code, but manage) is that they expect
their programmers' language to solve all of THEIR problems.

>my experience so far has been that the more general a component is, the
>slower it runs.

This is part of the reason why Unix developed.  No Unix flame wars please.
See my defense below.  The point is that performance is, to a large degree,
the main reason for using computers (though not the only one).

>for another example, some languages have arrays as their basic data
>unit. "+" becomes an array operator.  but if all you need to do is add
>two scalar quantities, you're hit with a performance penalty.

Small point of order: this is typically resolved at compile time, not at
execution time, so you are typically not penalized for it.  (This is called
overloading; neat stuff.)
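
To make that concrete, here is a toy sketch in C++ (my own example, with a
made-up Vec type, not anything from Michael's posting).  Both "+"s exist at
once, but the compiler picks between them, so adding two scalars still
compiles down to a plain scalar add:

    struct Vec { double data[4]; };

    /* "+" on arrays: element-by-element addition. */
    Vec operator+(const Vec &a, const Vec &b)
    {
        Vec r;
        for (int i = 0; i < 4; ++i)
            r.data[i] = a.data[i] + b.data[i];
        return r;
    }

    int main()
    {
        double x = 1.0, y = 2.0;
        double s = x + y;        /* built-in scalar add, chosen at compile time */

        Vec a = {{1, 2, 3, 4}}, b = {{5, 6, 7, 8}};
        Vec c = a + b;           /* the Vec overload, also chosen at compile time */
        return (s == 3.0 && c.data[0] == 6.0) ? 0 : 1;
    }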

>so far, the trend has largely been towards larger and larger components:
> ..... abstraction from s/w to h/w
>
>hardware trends have been similar: emergence of CISC
>
>but recently, the hardware trend has reversed -- use very simple instructions
>(i'm referring to RISC of course.)

You are not alone in thinking about this.  The problem is: how do you
build something that has never existed before?  I am referring to tools.
The problem is the management of complexity.  We are discussing this
in a local ACM/SIGGRAPH group; we call it TIGSV: Technical Interest Group
in "Scientific Visualization" [here's a buzzword I can introduce]....

Components are one thing, reuse is another.  There is the issue of
making them and maintaining them.  In the Beginning, there were scientific
libraries (seems the reuse-people forgot about these, oh well, how else
to go from stone wheel to steel-belted radials).  Now, these were good.

Then came Unix and its software tools philosophy.  I began as an MVT hacker,
and at first it was strange to me.  People mistook the command and file
naming as bad and discarded it without seeing and understanding the
functionality [BTW: LISP has its good points as well, but they are not as
widespread].  They also created other problems based on their closed,
inflexible thinking about computers.

It was a philosophy of minimalism, but it was also a simple, very subtle,
logical next step.  See, to use a library you had to edit in a call,
compile, and then link before you could run anything.  Unix introduced
co-routines and parallelism for the masses (among other things).  So instead
of edit, compile, link, run, you did:
	troff filename
or, to an extreme:
	chem filename | pic | refer | tbl | eqn | troff
Yes, you learn a lot of commands, but then we want to do a lot with computers.
You can learn lots of little commands, or learn lots of little options.
But each of the above takes only a few minutes to learn (except troff
or TeX or Scribe, etc.... ok, a generalization) [this is the learning problem].
But getting back, it's not just text.  Graphics started the same way:
	graph | plot
but if you wanted more you could do
	spline | graph | plot
and so on.
This is actually better documented in two of Jon Louis Bentley's
Programming Pearls columns in CACM from a couple of years back.  Unfortunately
neither appeared in the books PP or More PPs.  One column was entitled
"Little Languages" (Aug 86) and the other was a face-off by two guest
Oysters: Don Knuth and Doug McIlroy (June 86).  The latter column is the more
important.  I recommend them if you have not read them.

Then there was the Xerox/SRI ARC/Apple/NeXT line of work.  This looks
good.  How can 10K words say it? ;)

So what's next?  THAT is where the real challenge is.  I think it
will have to deal with parallel architectures.  But there are some major
intellectual hurdles there.  Using Sequents and Crays and Connection Machines
is kind of neat in this regard.  It's the limits of silicon.  It's only
going to get harder before it gets easier.

>saving time building the application is more important than
>reducing the running time of the application.  
>but my experience on applications has led to the conclusion that a lot
>of performance is being sacrificed for the decrease in application
>building time.
>
>do others agree that this is a problem in the software industry, or is
>it just me?  if it is a problem, how should we face it and fix it
>before it becomes worse?
>
>(sorry for the length of this)

True (on build time, and on some performance loss being covered by faster
machines).  You cannot expect gains in both performance and functionality
at this time, even with combined advances in hardware and software.  Just
ask Xerox or Apple or NeXT.  There are many similar analogies in the
physical world.

The real question is how much a user or builder is willing to change.
Change is what software is all about.

You have no reason to be sorry; you only lose those short-sighted,
short-attention-span people that our country is so notorious for
producing (bad managers). ;)

Longish signature follows "Type 'n' now"

Another gross generalization from

--eugene miya, NASA Ames Research Center, eugene@aurora.arc.nasa.gov
  resident cynic at the Rock of Ages Home for Retired Hackers:
  "You trust the `reply' command with all those different mailers out there?"
  "If my mail does not reach you, please accept my apology."
  {ncar,decwrl,hplabs,uunet}!ames!eugene
  				Live free or die.

duane@ginosko.samsung.com (Andrew Duane) (05/11/89)

In article <3579@eos.UUCP>, eugene@eos.UUCP (Eugene Miya) writes:
> In a nice article <11293@bloom-beacon.MIT.EDU> tada@athena.mit.edu 
> by Michael Zehr:
> >saving time building the application is more important than
> >reducing the running time of the application.  
> >but my experience on applications has led to the conclusion that a lot
> >of performance is being sacrificed for the decrease in application
> >building time.
> >
> >do others agree that this is a problem in the software industry, or is
> >it just me?  if it is a problem, how should we face it and fix it
> >before it becomes worse?
> 
> True (on build time, and on some performance loss being covered by faster
> machines).  You cannot expect gains in both performance and functionality
> at this time, even with combined advances in hardware and software.  Just
> ask Xerox or Apple or NeXT.  There are many similar analogies in the
> physical world.
> 
> The real question is how much a user or builder is willing to change.
> Change is what software is all about.
> 
> You have no reason to be sorry; you only lose those short-sighted,
> short-attention-span people that our country is so notorious for
> producing (bad managers). ;)


EXTREMELY true!! A recent project I worked on had exactly this problem.
The project was years underway, and "almost" to beta, before someone
noticed "hey! this runs like a slow, fat pig!" So a performance czar
was appointed to magically fix everything. Naturally, he didn't, since
the problems were built into the design methodology. The attitude during
construction was "let's make it run; we can make version 1.1 run fast."
In spite of trying to get it shipped ASAP, we designed almost everything
from scratch, rather than using anything off-the-shelf that might have saved
us development time. This ended up contributing to our demise...

At this same company, after at least 10 years in the software business,
management (at least a small part of mgmt) was convinced that maybe we
ought to have a software library so the technology we created for one
project could be transferred to the next. This idea also died.

BTW, for an interesting perspective on those short-sighted, short-attention-
span bad managers, this month's Scientific American had another good article
on the state of American manufacturing and industry. One of the authors'
main points was that business today does not look toward long-term gains
and productivity, but toward short-term, make-a-fast-buck thinking. While the
article covered all businesses, the analogy to much of today's s/w engineering
is obvious. How many times is the first (if not ONLY) priority to GET IT
SHIPPED! How ironic that these same managers often can't see the utility
in reusing existing tools.

Well, I've rambled long enough. Gotta earn that paycheck somehow...

Andrew L. Duane (JOT-7)  w:(508)-685-7200 X122  h:(603)-434-7934
Samsung Software America	 decvax!cg-atla!ginosko!duane
1 Corporate Drive			  uunet/
Andover, MA.   01810

Only my cat shares my opinions, and she's still learning "lint".

eugene@eos.UUCP (Eugene Miya) (05/12/89)

A couple more thoughts (from sitting in a meeting) and a clarification:
Clarification:  oh, I forgot to mention the really neat thing I thought
about Unix software tools: they kept much of the library functionality
while also being executable programs.  There was the spline program
and the spline function.  Few systems do that, and Unix also lacks what
others call "user friendly interfaces" in favor of getting the job done.
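
Roughly, the pattern looks like this (a toy C++ sketch with made-up names;
it is not the real spline source): the work sits in an ordinary function that
any program can link against, and a thin main() turns the same code into a
stdin/stdout filter:

    #include <cstdio>

    /* The "library" part: callable from any program that links it in.
       (A trivial stand-in for something like spline's smoothing.) */
    double smooth(double prev, double cur)
    {
        return (prev + cur) / 2.0;
    }

    /* The "program" part: the same routine as a stdin/stdout filter,
       so it also composes in pipelines. */
    int main()
    {
        double prev = 0.0, cur;
        while (std::scanf("%lf", &cur) == 1) {
            std::printf("%g\n", smooth(prev, cur));
            prev = cur;
        }
        return 0;
    }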

The thoughts:  I was thinking about what makes a tool (or component) succeed,
and these go for hardware as well as software:
1) Very important: who gets out there first.
2) Cost: cheap, free-ware.
3) Good, reasonable quality.

Sort of in this order.  Consider that we are still using Fortran, Cobol, etc.
I contact Backus about once every 6 months and it totally amazes him that
Fortran is still alive ;) (it's neat, one can ask him "why column major?"
and he comes back and says "Oh, 704 storage handling, easy to check ends of
rows in H/W.").  Witness the success of Convex being the first in that
marketplace (and FPS).  Being first defines the terminology, creates the
concept, sets the standards.  (Timing is important, too; you can be too far
ahead.)

Cheap, free software (hardware): that's how Unix made it: $150 for the first
licenses to universities.  I think Steve Jobs noted how powerful an incentive
this was.  Contrast that with certain LISP AI packages held by companies,
as well as specialized programs.  Can you imagine paying $100K for such
a package?  (Hope so, short of custom software.)  This, BTW, is what separates
CS from other sciences: no communication without cost.  If you want
to succeed, make it free or cheap, except...

It has to be reasonably well made.  The quality must be fairly good.
One hopes one does it right the first time.  Fortran had lots of "Oops!"
If it's cheap but shoddy, maybe you won't be able to give it away.  If it's
big and expensive, maybe it's too big.

Longish signature follows "Type 'n' now"

Another gross generalization from

--eugene miya, NASA Ames Research Center, eugene@aurora.arc.nasa.gov
  resident cynic at the Rock of Ages Home for Retired Hackers:
  "You trust the `reply' command with all those different mailers out there?"
  "If my mail does not reach you, please accept my apology."
  {ncar,decwrl,hplabs,uunet}!ames!eugene
  				Live free or die.

gm@romeo.cs.duke.edu (Greg McGary) (05/12/89)

In article <3594@eos.UUCP> eugene@eos.UUCP (Eugene Miya) writes:
>... the really neat thing I thought
>about Unix software tools: they kept much of the library functionality
>while also being executable programs.  There was the spline program
>and the spline function.

Yes, this is a `really neat thing' but not something I usually think
of as a feature of UNIX.  If anything, this isn't done anywhere nearly
often enough.  The contents of libc probably contain only 5-10% of
commonly useful funtions that are reimplemented or cut/pasted all over
the system.  The integrated environments like lisp and smalltalk are
better for reusability, since the fire-wall that separates programs and
library-functions isn't present, but even these suffer from re-invention
of the wheel when programmers don't bother to research existing code.

Too often I've wanted to lift a function or two out of some random UNIX
utility only to find that the interface was all boogered up with
references to global data structures.  To this day, I'm shocked at how
many programmers blithely put pages full of global variables in their
programs...
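
To show what I mean, a made-up fragment (not lifted from any real utility):

    /* Hard to lift: the function only works alongside its globals. */
    int g_width;
    int g_count;
    int next_column_global(void)
    {
        return g_count++ % g_width;
    }

    /* Easy to lift: all of its state comes in through the interface. */
    int next_column(int count, int width)
    {
        return count % width;
    }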

A few years ago I found a real jewel.  (I think it was System-V.0) 
The library function getwd(3) was implemented by calling
popen("/bin/pwd", "r")!  Long live US{G,DL}.

-- Greg McGary
-- 4201 University Drive #102, Durham, NC 27707       voice: (919) 490-6037
-- {decvax,hplabs,seismo,mcnc}!duke!gm                 data: (919) 493-5953
--                                  gm@cs.duke.edu

davecb@yunexus.UUCP (David Collier-Brown) (05/13/89)

| In article <3594@eos.UUCP> eugene@eos.UUCP (Eugene Miya) writes:
| ... the really neat thing I thought
| about Unix software tools: they kept much of the library functionality
| while also being executable programs.  There was the spline program
| and the spline function.

In article <14479@duke.cs.duke.edu> gm@romeo.UUCP (Greg McGary) writes:
| Yes, this is a `really neat thing' but not something I usually think
| of as a feature of UNIX.  If anything, this isn't done anywhere nearly
| often enough.

   Hmmn.  
   If memory serves, that was a carry-over from Multics.  The Multics
 standards manual strongly suggested you write a set of functions which
 fit a standardized external interface, and directed you to something
 called the subsystems utility package, which would ease the writing
 of such an interface.

   To return to the present, a standard interface package for either
 a line-of-text style or an icon-and-menu style might make a component 
 which would make writing frameworks for other components easier....
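
   Something along these lines, say, for the line-of-text style (C++ for
 concreteness; the names and details are purely illustrative):

    #include <cctype>
    #include <iostream>
    #include <string>

    /* The standardized external interface every component fits. */
    struct LineComponent {
        virtual std::string process(const std::string &line) = 0;
        virtual ~LineComponent() {}
    };

    /* The shared "subsystem utility": turns any component into a filter. */
    int run_filter(LineComponent &c)
    {
        std::string line;
        while (std::getline(std::cin, line))
            std::cout << c.process(line) << '\n';
        return 0;
    }

    /* One example component: upper-case its input. */
    struct Upcase : LineComponent {
        std::string process(const std::string &line)
        {
            std::string out(line);
            for (std::string::size_type i = 0; i < out.size(); ++i)
                out[i] = std::toupper((unsigned char) out[i]);
            return out;
        }
    };

    int main()
    {
        Upcase u;
        return run_filter(u);
    }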


 --dave (7 more days until monomania!) c-b
--
David Collier-Brown,  | {toronto area...}lethe!dave
72 Abitibi Ave.,      |  Joyce C-B:
Willowdale, Ontario,  |     He's so smart he's dumb.
CANADA. 223-8968      |