[comp.arch] Architectural design

cik@l.cc.purdue.edu (Herman Rubin) (10/24/90)

In article <27089@mimsy.umd.edu>, chris@mimsy.umd.edu (Chris Torek) writes:
> In article <2661@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin)
> hyperbolizes (is that a word? :-) ):
> >... to deliberately restruct
> (`restrict', I think, not `restructure', not that it matters too much
> to my particular reply)
> >an architecture to what a language provides is at best an insult to
> >all thinking people.
> 
> Hardly.  If someone builds a machine designed to run COBOL programs fast,
> I may not want to *use* it, but it is in no way an *insult*.  If company X
> thinks there is a market for such machines, and can find investors, why
> should I be insulted?

I suspect that even in a totally business-oriented situation, a machine
which can ONLY run Cobol fast would be an insult to any thinking programmer.

In a university environment, with a very heterogeneous collection of jobs,
it is far worse.  Even in an industrial environment, I am almost certain
that Bob Silverman and Peter Montgomery use computers for more than
primality testing and factorization.

> Now, if Herman Rubin thinks a machine with instructions Y and Z can be
> built and would be such a great leap forward, I suggest he found his
> own company.  There are plenty of investors.  Instead of griping about
> the ones investing in MIPS and SPARC, why not find your own?

A great leap forward, no.  More efficient, yes.  The great leap forward
that von Neumann made has been almost entirely eliminated by workarounds
that are usually, but not always, more efficient.

Should a new computer be built every time an applications programmer can
come up with an instruction that improves efficiency?  Obviously not.  Are
their ideas taken into account?  Essentially not.  I was told 25 years
ago by an IBM researcher that, well before the design for the 360 was
firm, the researchers complained about the total lack of communication
between the fixed-point and floating-point registers.  Conversion is a
real beast on that machine.  Nothing was done, and the situation has not 
improved in the entire family of computers.
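
To make the complaint concrete: with no path between the two register
files, a fixed-to-float conversion has to assemble the bit pattern of
the floating-point number in the fixed-point registers, push it through
memory, and repair it with a floating-point subtract.  Below is a
minimal sketch in C of that style of workaround.  It uses the
well-known IEEE 754 magic-number form of the trick rather than the
360's hexadecimal format, and the function name and constants are
illustrative, not IBM's actual sequence.

#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Convert a signed 32-bit integer to a double without a hardware
   fixed-to-float instruction: build the double's bit pattern in
   integer registers, shuffle it through memory, and fix it up with
   one floating-point subtract. */
double int_to_double(int32_t n)
{
    /* 0x4330000000000000 is the double 2**52; the low 32 mantissa
       bits are free to hold the biased integer n + 2**31. */
    uint64_t bits = 0x4330000000000000ULL
                  | ((uint32_t)n ^ 0x80000000u);
    double d;
    memcpy(&d, &bits, sizeof d);      /* the trip through memory */
    return d - 4503601774854144.0;    /* subtract 2**52 + 2**31 */
}

int main(void)
{
    printf("%f %f\n", int_to_double(-12345), int_to_double(42));
    return 0;
}

A handful of instructions and a round trip through memory where one
register-to-register instruction would do.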

As for the last part, it is not clear that I could continue to function
as a researcher in mathematics and statistics if I did so.  I am one of
those who give condolences to my friends who become department chairmen.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette, IN 47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet)	{purdue,pur-ee}!l.cc!cik(UUCP)

jpk@ingres.com (Jon Krueger) (10/27/90)

From article <2675@l.cc.purdue.edu>, by cik@l.cc.purdue.edu (Herman Rubin):
> I suspect that even in a totally business-oriented situation, a machine
> which can ONLY run Cobol fast would be an insult to any thinking programmer.

You suspect wrong.  Thinking programmers are not insulted by
processors designed to perform specific operations well.

> In a university environment, with a very heterogeneous collection of jobs,
> it is far worse.  Even in an industrial environment, I am almost certain
> that Bob Silverman and Peter Montgomery use computers for more than
> primality testing and factorization.

Good morning, Herman.  You shouldn't have imbibed so freely with those
little guys and their nine-pins.  It is now 1990.  The changes in the
last twenty years may come as a bit of a shock to you.  It is now
pretty common for universities to own more than a single computer.  In
fact, it's not unusual for individuals to own or have access to more
than a single computer.

You may want to take some time to acquaint yourself with the effects
this has had on computer use.  Bob and Peter do indeed use computers
for primality testing and factorization and other applications.  They
aren't all the same computers.  If Bob and Peter have a need to run
Cobol fast, it is unlikely that they would want to run their primality
testing and factorization on the same machine.  And it is silly to
criticize either machine for doing its task well.

You might also consider the effects this has had on architectural
design.  It may be hard to comprehend twenty years of changes all at
once.  Perhaps it would help to pick up where you left off and review
the changes that occurred in the previous twenty years.  Timesharing,
compilers, programming languages, operating systems, function
libraries, the beginnings of software tools -- you do remember them
being useful in their way?  Enough to affect processor design?  Even at
some cost in performance?  Just how long were you asleep, Herman?

-- Jon
--

Jon Krueger, jpk@ingres.com