[comp.misc] Another 1.3 wish.

ewhac@well.UUCP (Leo 'Bols Ewhac' Schwab) (08/03/87)

[ Followups redirected to comp.misc, where they like to talk about such
things. ]

In article <434@sugar.UUCP> peter@sugar.UUCP (Peter da Silva) writes:
>[Flame-throwers on "stun", captain]
>
>And why explain in 68000 code?  [ ... ]
>I bought this machine because the operating system requires an absolute
>minimum mucking about in assembly. There's no reason on god's green earth
>why that code had to be in assembly.  [ ... ]
>Right. da Silva's rule #0x7FFE: if conversion of some example from assembly
>to some high level language is easy, then it should have been expressed
>in high level language in the first place.
>
	AHEM!

	I am of the opinion that assembly code is *not used enough* these
days.  The assumption that high-level languages are "good enough" simply
does not sit right with me.

	Expressing a new or complicated algorithm in a high-level language
is a useful endeavor.  It allows you to clearly formulate and express the
problem/procedure in machine-interpretable form.  However, once you have the
basic algorithm down, your next step should logically be to translate that
algorithm into assembly by hand.  Compilers can do a good job, but never as
good as a human.
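
	To make the point concrete, here is a toy example, sketched in C
rather than 68000 so everyone can follow it (the function names are mine,
not from any real program):

/*
 * Clearing an array of longs.  clear_indexed() is the algorithm as you
 * would first write it down; clear_pointer() is the strength-reduced,
 * pointer-walking form a human naturally produces when translating by
 * hand, whether or not a given compiler would have found it.
 */
void clear_indexed(long *buf, int n)
{
	int i;

	for (i = 0; i < n; i++)
		buf[i] = 0;
}

void clear_pointer(long *buf, int n)
{
	long *p = buf;
	long *end = buf + n;

	while (p < end)			/* one pointer bump per element */
		*p++ = 0;
}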

	I point to accomplishments of the past.  SpaceWars fit in 4K of core
on the PDP systems (as I understand it).  I have a copy of a very powerful
music program for the 8080 that runs in 4K.  I have a copy of an 8080 BASIC
interpreter that has multi-line user-defined functions, IF-THEN-ELSE,
automatic structure indenting, error trapping, formatted I/O, and matrix
functions.  It's 15K in size, and runs great in 32K.  Sonix is all assembly
code, and its core is 29K in size (I think).

	Meanwhile, MicroSlush is proud of the fact that the core of their
AmigaBASIC interpreter runs in "only 80K" of RAM.  Truly pathetic.

	I'm not knocking high-level languages.  I use them a lot, and would
be very lost without them.  In fact, if you know how a particular compiler
behaves, you can get some very efficient code out of it.  But it should be
agreed that assembly has its place in the world, and should not be
arbitrarily poo-poohed.

	I really should hone my 68K assembly skills.  I think I'm getting
lazy....

>Fear not, you're in good company. Donald E. Knuth (yes, that Knuth) is one
>of the worst offenders. His "Art of computer programming" is almost useless
>for day-to-day use because of "MIX".  [ ... ]

	You didn't read his introduction to the book.  It's a cookbook of
algorithms, with sample implementations in a hypothetical language.  It's up
to you to translate the *algorithm*, not the code, over to what you want.  I
suspect he invented a hypothetical language to force his readers to use the
algorithms he presented, not just his code.  If he'd written it in Pascal or
something, people would be tempted to drop the code into a machine
unmodified, and they wouldn't learn anything.  Anyone can transliterate
code, but can you use the *algorithm*?

	Final note:  This is not a personal attack.  I simply feel that
assembly code should be used more often than it is currently.

_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_
Leo L. Schwab -- The Guy in The Cape	ihnp4!ptsfa -\
 \_ -_	 Bike shrunk by popular demand,	      dual ---> !{well,unicom}!ewhac
O----^o	 But it's still the only way to fly.  hplabs / (pronounced "AE-wack")
"Work FOR?  I don't work FOR anybody!  I'm just having fun."  -- The Doctor

mwm@eris.BERKELEY.EDU (Mike (My watch has windows) Meyer) (08/04/87)

In article <3664@well.UUCP> ewhac@well.UUCP (Leo (My glasses have gate arrays) Schwab) writes:
<Compilers can do a good job, but never as good as a human.

That's what you get for dealing with shoddy compilers - which most C
compilers qualify as. A good compiler will optimize things to a degree
that most humans can't match. A good example is that VMS C (one of the
exceptions to the above; I'm not meaning to slight anyone, it's just
the one I'm familiar with) takes something like:

#include <stdio.h>

main() {
	register int	a, b ;
	a = 6 ;
	b = 8 ;
	printf("%d", a + b) ;
	}

and turns it into:

	push	14			; I'm not using real VAX mnemonics...
	push	<address of "%d">	; so that those not exposed to
	jump	printf			; such see what's going on.

Sure, on 8080 and similar (eighty-eighty sux family, maybe?) where
stack operations are expensive, using a compiler - no matter how good
- usually loses to even a moderately good human. But on modern
architectures, where the difference between a stack access and a
global access is small, it doesn't matter as much. On these, the
compiler knowing some non-obvious sequences can make up for that.

<	I'm not knocking high-level languages.  I use them a lot, and would
<be very lost without them.  In fact, if you know how a particular compiler
<behaves, you can get some very efficient code out of it.  But it should be
<agreed that assembly has its place in the world, and should not be
<arbitrarily poo-poohed.

Yes, and that place is in doing things that your high-level language
isn't capable of. And - in time-critical pieces of code - in squeezing
the last microsecond out. But for other things, it just isn't worth
it.

In most cases, there are *better* ways to improve code if it's in a
high level language than by mapping the algorithm to assembler. Two
examples follow:

Once upon a time, a group wrote a microcode compiler for their
machine. They then rewrote the microcode in their new high level
language. The next step was to rewrite some of the code to use better
algorithms (now that they understood it). Result: the high-level
version was 75% of the size and ran in 66% of the time of the original
assembler version.

A second time, someone had an application that was spending a *lot* of
its time in the math library. Like 90+%. Mostly doing sin() and
cos() calls. The "rewrite it in assembler" school would have led to a
few weeks of work tweaking those routines for 10 or 20% more speed.
Not bad, considering how long this thing took to run. However, the
code was getting the same values over and over again, so writing a new
version of sin and cos that cached values from calls to the real thing
and returned them resulted in a greater than 50% reduction in runtime.
After which, the 10 or 20% from the assembler tweaking wasn't worth
the two weeks it would have taken.
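
In case the trick isn't obvious, here's a minimal sketch of that caching
idea (a one-entry cache, purely illustrative -- the real fix presumably
kept more history than this):

#include <math.h>

/*
 * Wrap the library sin() with a one-entry cache.  If the caller keeps
 * asking for the same angle, only the first call pays for the real
 * computation.  A cached_cos() would look identical.
 */
double cached_sin(double x)
{
	static double last_x;
	static double last_result;
	static int have_last = 0;

	if (!have_last || x != last_x) {	/* cache miss: call libm */
		last_x = x;
		last_result = sin(x);
		have_last = 1;
	}
	return last_result;
}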

	<mike

--
How many times do you have to fall			Mike Meyer
While people stand there gawking?			mwm@berkeley.edu
How many times do you have to fall			ucbvax!mwm
Before you end up walking?				mwm@ucbjade.BITNET

nick%ed.cheops@its63b.ed.ac.uk (Nick Rothwell, Laboratory for Foundations of Computer Science,) (08/04/87)

In article <3664@well.UUCP> ewhac@well.UUCP (Leo (My glasses have gate arrays) Schwab) writes:
>	Expressing a new or complicated algorithm in a high-level language
>is a useful endeavor.  It allows you to clearly formulate and express the
>problem/procedure in machine-interpretable form.  However, once you have the
>basic algorithm down, your next step should logically be to translate that
>algorithm into assembly by hand.  Compilers can do a good job, but never as
>good as a human.
>
>	I point to accomplishments of the past.  SpaceWars fit in 4K of core
>on the PDP systems (as I understand it).  I have a copy of a very powerful
>music program for the 8080 that runs in 4K.  I have a copy of an 8080 BASIC
>interpreter that has multi-line user-defined functions, IF-THEN-ELSE,
>automatic structure indenting, error trapping, formatted I/O, and matrix
>functions.  It's 15K in size, and runs great in 32K.  Sonix is all assembly
>code, and its core is 29K in size (I think).
>
>	Meanwhile, MicroSlush is proud of the fact that the core of their
>AmigaBASIC interpreter runs in "only 80K" of RAM.  Truly pathetic.

*Sigh*. Tell me: why should I, developing sophisticated applications in
a strongly typed language on a 4Meg Sun, pause to contemplate doing things
in assembly language?
   Re: quality of code - Perhaps hand-assembly beats compilers IN EXTREME
CASES in the code generation phase, but on the whole a compiler
can generate code as well as any assembly-hacker. I know people who ship
their compilers with a challenge to hand-assemble better code -
nobody's taken up the challenge yet.
   Re: program size - well, your 4K SpaceWars fits into, what, $2 worth of
memory? Why waste your time with the details of hand-assembly when a few
cents worth of memory will solve the problem for you? Such paranoid fear of
using memory has caused untold amounts of horrible software to be produced
in the past. I guess SpaceWars is impressively small - so's somebody who
can calculate really fast on a slide rule. But I've got other things to
think about these days.
   I've been using various 68000 machines for the past couple of years, and I
still don't know any 68000 code. My *general* knowledge of compiler construction
lets me get whatever efficiency I want out of my code.
   By the way, I would use assembly code for device drivers, operating system
kernels, this sort of thing - but that wasn't really your argument. Anyway,
I know of projects where the ENTIRE system was written in a high-level language -
yes, every last statement.
   And I know what it's like to use a small machine - I have a 56k Terak at
home. I've developed sizeable Pascal (& nothing else!) applications on it. And
I've found that, in the time it would take me to hand-code critical sections of
these applications to go faster/be smaller, I could hire myself out as a
consultant, make some cash, and go and buy myself a bigger, faster machine. So
there.
-- 
Nick Rothwell,   Laboratory for Foundations of Computer Science,
                 University of Edinburgh.
                 ARPA:    nick%{cstvax,itspna}.ed.ac.uk@cs.ucl.ac.uk
                 JANET:   nick@uk.ac.ed.{cstvax,itspna}
                 UUCP:    <Atlantic Ocean>!mcvax!ukc!{cstvax,itspna}!nick
~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~
"Nothing's forgotten. Nothing is ever forgotten."   (Herne)

john@frog.UUCP (John Woods, Software) (08/08/87)

In article <3664@well.UUCP>, ewhac@well.UUCP (Leo 'Bols Ewhac' Schwab) writes:
> In article <434@sugar.UUCP> peter@sugar.UUCP (Peter da Silva) writes:
> >[Flame-throwers on "stun", captain]
> >And why explain in 68000 code?  [ ... ]
> >Right. da Silva's rule #0x7FFE: if conversion of some example from assembly
> >to some high level language is easy, then it should have been expressed
> >in high level language in the first place.
> 	I am of the opinion that assembly code is *not used enough* these
> days.  The assumption that high-level languages are "good enough" simply
> does not jibe with me. [pining for the days of 4K 8080 code deleted]
> 
> >Fear not, you're in good company. Donald E. Knuth (yes, that Knuth) is one
> >of the worst offenders. His "Art of computer programming" is almost useless
> >for day-to-day use because of "MIX".  [ ... ]
> 
> Anyone can transliterate code, but can you use the *algorithm*?
> 
AHEM yourself.  My complaint with Knuth's MIX nonsense was that his algorithms
were expressed in English just as turgid and knotted as his assembly code was.
Concrete example:  I wanted to remind myself how to do a heapsort.  I found
Knuth's expression of it just as unreadable in English as in MIX ("goto" my
FOOT!).  I went to another of my algorithm books (title forgotten, I think it
was by Aho (of the Dragon Book) and someone else), and found the heapsort
algorithm expressed in high level pseudocode, using recursion (the assembly
programmer's bane! or so it seems...) -- a true marvel of clarity: the English
was SUPERFLUOUS to the high-level description, and it translated into
C (including the part which had been left as the traditional Exercise
for the Reader) in minutes.
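
Just to show how mechanical that translation is, here's the standard
algorithm in plain C (my own sketch, with an iterative siftdown instead
of the recursive one, and not copied from anyone's text):

/*
 * siftdown() restores the max-heap property below node i of a[0..n-1];
 * heapsort_ints() builds the heap, then repeatedly swaps the largest
 * remaining element to the end of the unsorted region.
 */
static void siftdown(int a[], int i, int n)
{
	int child, tmp;

	while ((child = 2 * i + 1) < n) {
		if (child + 1 < n && a[child + 1] > a[child])
			child++;		/* take the larger child */
		if (a[i] >= a[child])
			break;			/* heap property holds */
		tmp = a[i]; a[i] = a[child]; a[child] = tmp;
		i = child;
	}
}

void heapsort_ints(int a[], int n)
{
	int i, tmp;

	for (i = n / 2 - 1; i >= 0; i--)	/* build the heap */
		siftdown(a, i, n);
	for (i = n - 1; i > 0; i--) {		/* extract the maximum */
		tmp = a[0]; a[0] = a[i]; a[i] = tmp;
		siftdown(a, 0, i);
	}
}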

For your archetypical program that does something wonderful in 4K of
absolutely tight 8080 assembly code -- how long did someone sweat over every
last byte?  How many *more* wonderful things could that program have done
if that time had been spent extending it?  That is what most high-level
fanatics are interested in.

--
John Woods, Charles River Data Systems, Framingham MA, (617) 626-1101
...!decvax!frog!john, ...!mit-eddie!jfw, jfw%mit-ccc@MIT-XX.ARPA

"The Unicorn is a pony that has been tragically
disfigured by radiation burns." -- Dr. Science

ins_anmy@jhunix.UUCP (Norman Yarvin) (08/09/87)

In article <4564@jade.BERKELEY.EDU> mwm@eris.BERKELEY.EDU (Mike (My watch has windows) Meyer) writes:
>In article <3664@well.UUCP> ewhac@well.UUCP (Leo (My glasses have gate arrays) Schwab) writes:
><Compilers can do a good job, but never as good as a human.
>
>That's what you get for dealing with shoddy compilers - which most C
>compilers qualify as. A good compiler will optimize things to a degree
>that most humans can't match. A good example is that VMS C (one of the
>exceptions to the above; I'm not meaning to slight anyone, it's just
>the one I'm familiar with) takes something like:
>

Note that the size difference between machine-language programs and C programs
is almost all due to the use of library functions and C startup code
(especially on those machines which don't support a Unix-like environment
too well).

A case in point: I wrote a C program to perform a subset of the functions of
the system's "who" program.  At first it was about 10K of program.  Then I
cut out all the calls to high-level I/O routines, and the resulting program was
about 2.5K.

Of course, even that bit of optimization took a little thought.
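
The flavor of the change was roughly this (a sketch only, not the actual
program; the utmp field names and the file's path vary from system to
system -- this assumes a 4BSD-style <utmp.h>):

#include <sys/types.h>
#include <fcntl.h>
#include <unistd.h>
#include <utmp.h>

/*
 * List login names using only open/read/write, so that none of the
 * stdio machinery gets linked into the program.
 */
int main(void)
{
	struct utmp u;
	int fd, n;

	fd = open("/etc/utmp", O_RDONLY);
	if (fd < 0)
		return 1;
	while (read(fd, (char *) &u, sizeof u) == (int) sizeof u) {
		if (u.ut_name[0] == '\0')	/* unused slot */
			continue;
		for (n = 0; n < (int) sizeof u.ut_name && u.ut_name[n]; n++)
			;			/* name may not be NUL-terminated */
		write(1, u.ut_name, n);
		write(1, "\n", 1);
	}
	close(fd);
	return 0;
}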

The moral of programming: What you get is the time you spend, except in COBOL.

							 Norman Yarvin
(seismo!umcp-cs | ihnp4!whuxcc | allegra!hopkins) !jhunix!ins_anmy

	"Almost doesn't count. But it almost does."