[comp.arch] Once more into the niche!

cdshaw@cs.UAlberta.CA (Chris Shaw) (07/24/90)

In article <2392@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>A good applied scientist would not exclude a tool on the grounds that it 
>could be misused. A good applied scientist does not try to use "standard"
>methods when there are better ways. A good applied scientist allows the
>introduction of new methods, and even invents them if he can think of them
>and they are appropriate.

This is total pablum, Herman Rubin. We're debating what the "better way" is.
You seem to think that the only purpose of code is to run fast. 

Wrongo. 

One of the most important purposes of most code is to communicate to the
reader what's going on. Excessive use of GOTOs, for example, markedly reduces
readability. There ARE better alternatives, such as breaks or named block
exits, and they are just as easy to understand. GOTOs are not necessary.
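
A minimal sketch in C of what I mean (a hypothetical search routine; the
names are made up). The second version says exactly the same thing as the
first, but the reader doesn't have to hunt for a label to see where control
goes:

    /* goto version: the exit point is a label somewhere below the loop */
    int find_goto(int *a, int n, int key)
    {
        int i;
        for (i = 0; i < n; i++)
            if (a[i] == key)
                goto found;
        return -1;
    found:
        return i;
    }

    /* structured version: the early return is the "block quit" */
    int find_plain(int *a, int n, int key)
    {
        int i;
        for (i = 0; i < n; i++)
            if (a[i] == key)
                return i;   /* leave the loop and the routine in one step */
        return -1;
    }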

I have a classic applied science anecdote that illustrates this. The Earth
Scientists at lab X use a device called a core heater. It's a heater with a
cylindrical chamber in which you put a core sample to heat it to some
specific temperature. Well, the heater also happens to be an excellent hot
dog cooker, so members of lab X took to cooking their snacks in this
thing. Which was fine, everybody thought, until, while heating a particularly
valuable core sample, the heater caught fire from all the animal fat in
the heating chamber, destroying both the sample and the heater.

Needless to say, a NO HOT DOGS rule was put into effect by the lab supervisor.

Now the point of this story is simply this. You CAN use certain tools for
unorthodox tasks. The question really is "what is appropriate use?" True,
the heater fire was caused by the grease in the chamber. The rule
could have been "always clean the heater", but that relies heavily on the
discipline of the hot dog eaters to clean up diligently after every use.
Anyone acquainted with human nature will know that 100% human reliability is
unattainable, so a simple rule supplants a perhaps more accommodating one.

>Too many of them are the type Bruce Karsh brings up. Those who say that 
>one must not use gotos, or that one should contort the problem to fit in
>the context of Fortran or Pascal or C, those who want RISC so that the
>compiler can optimize the code, those who would replace insertions with
>subroutine calls, etc.

Why too many? What's your criterion for judging that these things are wrong?
My guess is that you think these things are wrong because they make your
programs run too slowly. Which, in the small, is probably right. I have no
doubt whatsoever that if you use gotos, program in assembler, use a
well-designed CISC, and do your inlining by hand, your code will go fast. So what?

I also have no doubt that this speed will be bought at the expense of large
quantities of programmer time; that the coder will be the only one who
understands the code; that even he will not understand how it works after
putting it down for 6 months; that the code will run on one machine only;
and that the cost per line will be astronomical.
There will still be deviously subtle bugs, because spaghetti code is
hard to understand.

But the punch line is that you *can't* guarantee that
you've gotten every optimization available without personally looking at
every inch of code with the checklist for your bag of tricks. Which is what
a compiler does anyway, so why do it yourself?

On the other hand, if I write the code in a HLL, sans GOTOs, on a RISC with a
highly optimizing compiler that does appropriate inlining automatically,
anybody will be able to read and fix the program. I'll also be able to complete
3 or 4 other projects while you're twiddling bits. Moreover, when the new faster
machine comes along, I'll be able to port the program. When the new, faster,
better compiler comes along, I can recompile all 5 of my projects and have
them run faster still, thanks to some other group's efforts at learning how to
twiddle bits *in general* to get speedup *everywhere*.
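
By "appropriate inlining automatically" I mean something like this (a sketch
with made-up names): the helper stays a small, readable function in the
source, and the optimizer expands the call for me instead of my pasting its
body into every call site by hand.

    /* small helper, kept as a function for readability (hypothetical) */
    static double sq(double x) { return x * x; }

    double norm2(const double *v, int n)
    {
        double s = 0.0;
        int i;
        for (i = 0; i < n; i++)
            s += sq(v[i]);  /* a good optimizing compiler inlines this call */
        return s;
    }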

My point is that nobody is reliable enough to do the hand-tweak process
across the board. Well-built, currently-available compilers are that reliable,
and what's more, their reliability is the same regardless of program size,
whether it's 10 lines or 10,000,000.

>Probably most of those so-called scientists would have a difficult time
>optimizing the code, so they want the machine to do it for a restricted
>set of code. These people are legion. The others are scarce.

Let's translate this. What you're really saying is that you are a member of
a scarce minority group that knows how to hand-tune little bits of code. Well,
bully for you, Herman Rubin. What's implicit in this claim, of course, is
that you are somehow a better person for this skill, and that other people
should be like you. 

Well, to put it bluntly, "The others are scarce" because their abilities
are of little general value. Which isn't to say that hand-tuners are
entirely useless, just that they occupy a very narrow niche. The problem
is that you think that your niche is all there is.

>Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
--
Chris Shaw     University of Alberta
cdshaw@cs.UAlberta.ca           Now with new, minty Internet flavour!
CatchPhrase: Bogus as HELL !

cliffc@libya.rice.edu (Cliff Click) (07/25/90)

>In article <2392@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>> [ Why he hacks in assembly, and why the ability to do so should be taught. ]

In article <1990Jul24.030035.9169@cs.UAlberta.CA> cdshaw@cs.UAlberta.CA (Chris Shaw) writes:
> [ A lot of stuff I agree with, about the purpose of compilers in the world. ]
>
>But the punch line is that you *can't* guarantee that
>you've gotten every optimization available without personally looking at
>every inch of code with the checklist for your bag of tricks. Which is what
>a compiler does anyway, so why do it yourself?

Ahh, but here Herman's point is correct:  Herman (like human programmers
everywhere) has many, many more "bags of tricks" than any compiler.  If the
cost of doing your job in a HLL & compiling & running for time T is GREATER
than the cost of doing your job in assembly & running for time T/10, then
assembly MAKES SENSE (pick any factor other than 10 you might want; I'm
also assuming TIME == MONEY).
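
To put rough numbers on that (every figure below is invented, purely to show
the shape of the trade-off):

    /* Back-of-the-envelope break-even: is the assembly version worth it?
     * All of these numbers are made up. */
    #include <stdio.h>

    int main(void)
    {
        double runs      = 52.0;   /* runs per year                          */
        double hll_hrs   = 10.0;   /* machine hours per run, HLL version (T) */
        double asm_hrs   = 1.0;    /* machine hours per run, assembly (T/10) */
        double extra_dev = 160.0;  /* extra programmer hours to hand-code it */
        double cpu_rate  = 5.0;    /* dollars per machine hour               */
        double dev_rate  = 30.0;   /* dollars per programmer hour            */

        double saved = runs * (hll_hrs - asm_hrs) * cpu_rate;
        double spent = extra_dev * dev_rate;

        printf("machine cost saved: $%.0f, programmer cost added: $%.0f\n",
               saved, spent);
        return 0;
    }

With these particular numbers the HLL version wins; crank up the number of
runs, or the cost of a machine hour, and the assembly version starts to win.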

There *IS* a middle ground; sometimes it's assembly, sometimes it's HLL, 
sometimes a little of both.
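
A sketch of the "little of both" case (the names are invented, and the
assembly file itself isn't shown): the one hot routine is hand-coded, and
everything around it stays in the HLL.

    /* hypothetical hand-tuned inner loop, written separately in assembly */
    extern double dot_asm(const double *x, const double *y, int n);

    double correlate(const double *x, const double *y, int n)
    {
        double d = dot_asm(x, y, n);   /* 1% of the code, 90% of the time */
        return d / n;                  /* the rest stays in portable C    */
    }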

>>Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
>Chris Shaw     University of Alberta
Cliff Click

-- 
Cliff Click                
cliffc@owlnet.rice.edu       

peter@ficc.ferranti.com (Peter da Silva) (07/25/90)

In article <10185@brazos.Rice.edu> cliffc@libya.rice.edu (Cliff Click) writes:
> There *IS* a middle ground; sometimes it's assembly, sometimes it's HLL, 
> sometimes a little of both.

I don't think there's anyone who would disagree with that. It's just that
Herman Rubin isn't satisfied with using a HLL and switching to assembly
when appropriate. He wants his HLL to let him use all his assembly tricks
while remaining high level. And there ain't no such animal.
-- 
Peter da Silva.   `-_-'
+1 713 274 5180.   'U`
<peter@ficc.ferranti.com>