[net.arch] top down vs. bottom up design (was: strange sex)

johnl@ima.UUCP (John R. Levine) (07/19/86)

In article <802@tekig4.UUCP> jerryn@tekig4.UUCP (Jerry Nelson) writes:
>In article <2900019@ztivax.UUCP> david@ztivax.UUCP writes:
>>I know of one system which was completely developed [software first.] Some
>>software people wrote "the perfect language" and the "perfect OS
>>concepts" and then some smart HW folks developed the hardware to
>>support it.... It has got to be the world's most un-portable system.
>>David Smyth
>Hold it!  Are you telling me that there really is such a thing as portability?

This isn't portability, this is reimplementability.  The IBM 360 instruction
set has been implemented at least 25 times, with a performance range of at
least 1000 between the slowest (360/25, ca. 1965) and the fastest (3090,
ca. 1985 or maybe the Fujitsu engine.)  The PDP-11 instruction set has been
implemented about a dozen times, more if you count PDP-11 mode on a VAX.
The 360 and PDP-11 were designed more or less software first, although in both
cases changes in hardware technology and software technology have made the
architectures look a little tired by now.  In both cases you can take object
code from the early models and run it on more recent models and it still works
(by and large -- I'm thinking more about user programs than operating systems
here.)

But this misses the point.  People aren't implementing the PDP-11 much any more,
and there are good reasons for that.  When they designed the '11 about 15 years
ago, it seemed that 16 bits of address were a lot for a computer of that size.
What do you know, memories grew faster than they expected.  The 360 has the
same problem, although they can be excused a little by noting that they started
a lot earlier, and they never dreamed that their architecture would still be the
dominant one over 20 years later.  (They can also be cursed because the 360's
addressing was shrunk from 32 bits to 24 by what were obviously warts, even
at the time.)
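
To make the wart concrete: since only 24 of the 32 address bits were
significant, software (IBM's own included) took to stashing flags in the
unused high byte of address words.  A sketch of the trick in C -- the flag
name is made up, but BAL and plenty of OS/360 conventions (like the
end-of-parameter-list bit) did essentially this:

    /* Pack a flag into the unused high byte of a 24-bit address. */
    #define ADDR_MASK 0x00FFFFFFL
    #define FLAG_LAST 0x80000000L  /* hypothetical: "last in chain" */

    unsigned long pack(unsigned long addr, unsigned long flags)
    {
        return (addr & ADDR_MASK) | flags;
    }

    /* Once addresses grow past 24 bits, (word & ADDR_MASK) silently */
    /* truncates them -- the high byte is no longer free.            */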

It's perfectly wonderful to design your computer so that it is well matched to
the software that is going to be run on it, but your computer will go nowhere
if its implementation is static.  All of the successful architectures have had
multiple implementations, going back to the IBM 704 in the mid 1950's, and
that's not going to change.  I've seen precious little design that is set up
to work smoothly in the face of reimplementation, except maybe for Intel
encouraging us to pretend that the 8086 had segments because subsequent
reimplementations really would have them.
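
For those who haven't stared at an 8086 manual: a real-mode "segment" is
just a 16-bit number shifted left four bits and added to the offset,
yielding a 20-bit physical address.  The pretense Intel asked for was to
treat segment values as opaque handles instead of doing that arithmetic
yourself.  A sketch (phys() is my name, not Intel's):

    /* 8086 real-mode addressing: physical = segment * 16 + offset. */
    unsigned long phys(unsigned int seg, unsigned int off)
    {
        return ((unsigned long)seg << 4) + (unsigned long)off;
    }

Code that honored the fiction moved to the 286 intact, where a segment
value became a selector indexing a descriptor table; code that "knew"
seg+1 names memory 16 bytes further along did not.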

I'd be interested in hearing from the RISC crowd -- how much do they tune
their designs to the precise technology available today, and how much do
they expect to carry over into their next generation?
-- 
John R. Levine, Javelin Software Corp., Cambridge MA +1 617 494 1400
{ ihnp4 | decvax | cbosgd | harvard | yale }!ima!johnl, Levine@YALE.EDU
The opinions expressed herein are solely those of a 12-year-old hacker
who has broken into my account and not those of any person or organization.

henry@utzoo.UUCP (Henry Spencer) (07/21/86)

> I'd be interested in hearing from the RISC crowd -- how much do they tune
> their designs to the precise technology available today, and how much do
> they expect to carry over into their next generation?

An underemphasized but important point of much of the RISC work is that
near-total use of high-level languages and a semi-standardized operating
system largely eliminates hardware dependencies everywhere except in the
bowels of the kernel and the compilers.  The "architecture" these machines
present to their customers is at a higher level, almost independent of the
instruction set, register configuration, etc.  Hence continuity at the
lowest level is not that important.  (It remains useful, mind you, since
redoing the kernel guts and the compilers is not an overnight job.)
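
Concretely, a program written to the language and the system interface
never mentions the instruction set at all.  The toy below (my example,
nobody's benchmark) compiles and runs unchanged on either of the machines
I mention below; only the compiler's code generator and a few kernel files
know what the registers look like.

    /* Counts input lines.  Nothing here depends on word size,   */
    /* byte order, or register layout -- only on C semantics and */
    /* the stdio interface.                                      */
    #include <stdio.h>

    int main()
    {
        int c, n = 0;

        while ((c = getchar()) != EOF)
            if (c == '\n')
                n++;
        printf("%d lines\n", n);
        return 0;
    }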

This is increasingly true in the Unix world in general, in fact.  My own
shop is about to convert from a little-endian 16-bit machine using DEC
floating-point format to a big-endian 32-bit machine using IEEE floating-
point.  The two architectures resemble each other only vaguely.  The only
real hassles I expect are from stupid and unnecessary divergence between
the two versions of the operating system.  I really don't *care* that the
underlying machine is changing in just about every way, because at the
level I work at, those changes won't show.
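
The one way code at my level does see such a change is by peeking below
the language -- aliasing bytes onto larger objects, say.  Here's a quick
probe that tells the two machines apart, exactly the sort of dependency
portable code must avoid (my own throwaway test, not part of either
system):

    /* Prints the byte order of the machine it runs on. */
    #include <stdio.h>

    int main()
    {
        int one = 1;
        char *p = (char *)&one;

        printf("%s-endian\n", *p ? "little" : "big");
        return 0;
    }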
-- 
EDEC:  Stupidly non-standard
brain-damaged incompatible	Henry Spencer @ U of Toronto Zoology
proprietary protocol used.	{allegra,ihnp4,decvax,pyramid}!utzoo!henry