[net.arch] flexible-instruction-set machines

henry@utzoo.UUCP (Henry Spencer) (06/18/85)

> 	A reasonable approach to producing CISC machines ... to have the 
> 	instruction set be changeable for different languages.  
> 
> Who is working on flexible instruction set computers now-a-days?

Actually, my impression was that most everyone had been discouraged by
the B1700's flaws.  The concept was fine, but performance was not, and
there was also the nasty problem that with different microcode for each
language, it becomes very difficult to mix languages (e.g. to call Fortran
number-crunching routines from a program written in something else).

Burroughs also did a lousy job on software and marketing, as I understand
it, from the viewpoint of researchers and non-Cobol production shops.
-- 
				Henry Spencer @ U of Toronto Zoology
				{allegra,ihnp4,linus,decvax}!utzoo!henry

hes@ecsvax.UUCP (Henry Schaffer) (06/20/85)

The Microdata Corp. had several *user* microprogrammable minis
back in the 1969-1972 time frame.  As I remember, it was more of
a "do your own firmware" kind of setup - rather than loading new
instructions on the fly during program execution.  If anyone really
cares I could look a bit further at the Micro 800 and 1600 Reference
Manuals.
--henry schaffer

mjl@ritcv.UUCP (Mike Lutz) (06/22/85)

In article <5694@utzoo.UUCP> henry@utzoo.UUCP (Henry Spencer) writes:
>Actually, my impression was that most everyone had been discouraged by
>the B1700's flaws.  The concept was fine, but performance was not, and
>there was also the nasty problem that with different microcode for each
>language, it becomes very difficult to mix languages (e.g. to call Fortran
>number-crunching routines from a program written in something else).
>Burroughs also did a lousy job on software and marketing, as I understand
>it, from the viewpoint of researchers and non-Cobol production shops.

Ok, the straight poop from an old poop who actually used the B1700
(microcode & all) in graduate school:

1. Performance was not spectacular in the absolute sense, but was quite
   good given the market Burroughs staked out for the machine.
   Remember: it was designed to go head-to-head with the IBM System/3,
   which was essentially an RPG machine used by small companies and
   service bureaus.  For roughly the same money, the B1700 provided a
   multitasking, virtual memory, multilanguage, multiarchitecture
   system.  It is hardly the engineers' fault that Burroughs chose to
   squirrel away most of the neat stuff (99.9999999% of the Burroughs
   sales force had *no* idea what it was they were selling).
   [Digression: our image processing group used to run their
   reconstruction algorithms on the machine because it had virtual
   memory and the CDC6400 didn't have enough real memory to hold their
   matrices].

2. Performance for any one language was sub-optimal (easily demonstrated:
   even if the architecture were perfect, the implementation could always
   be improved by tossing the microcode & doing everything in hardware).
   However, performance across *all* languages & architectures was better
   than other systems in its price range.

3. It was an ideal machine for trying out new architectural ideas.  Microcode
   was relatively easy to write (in a high level assembly language).
   The internal registers all had special properties appropriate for
   different phases of the instruction cycle.  The impression one got
   was that the information you needed was where you wanted it when you
   wanted it (sounds like Unix, eh? :-).  Memory was bit addressable,
   and ALU operations could easily be tailored for any bit width.  Some
   of the emulators exploited this: one instruction set based opcode
   length on static frequency statistics (see the sketch after this
   list); another determined the address length from the number of
   distinct identifiers.

4. In general, the software (micro assembler, compilers, utilities)
   was modern, if not futuristic, for the early '70s.  The operating
   system and other systems programs were written in SDL, an ALGOL &
   PL/I hybrid with data structures and no gotos -- of course, it had
   its own specialized architecture.  The main problem with MCP (the
   operating system) was that it was single threaded (i.e., it executed
   in its own dedicated context, rather than duplicating the context
   for each process like Unix).  If one process blocked on disk I/O,
   the operating system blocked for all intents and purposes.  It would
   try to run other processes, but if they made a system request, they
   blocked even if the request required no I/O.  Arguably a flaw, but
   MCP provided services (like virtual memory) which were unusual even
   for large systems, with as little as 48Kbytes of real memory!

5. The biggest problem, as Henry noted, was the inability to create a
   program using routines written in different languages.  I know our
   group did some preliminary research on this issue, but we dropped it
   when other more exciting projects came along.
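
As promised under point 3, a minimal sketch (in C) of deriving opcode
lengths from static frequency statistics.  The opcode names and the
frequency table are invented for illustration -- the B1700 S-languages
used their own measured tables -- but the idea is the same: a Huffman
code over static opcode frequencies gives a shorter mean opcode length
than a fixed-width encoding.

    /* Build a Huffman code over invented static opcode frequencies
     * and compare the mean opcode length against 3-bit fixed-width
     * opcodes.  Repeatedly merging the two lightest live nodes builds
     * the tree; a symbol's code length is its depth in that tree. */
    #include <stdio.h>

    #define NSYM 8

    static const char *name[NSYM] =
        { "LOAD", "STORE", "BRANCH", "ADD", "COMPARE", "CALL", "SHIFT", "MUL" };
    static const double freq[NSYM] =     /* invented static frequencies */
        { 0.30, 0.20, 0.15, 0.12, 0.10, 0.07, 0.04, 0.02 };

    int main(void)
    {
        double w[2 * NSYM - 1];          /* node weights         */
        int live[2 * NSYM - 1];          /* node not yet merged? */
        int parent[2 * NSYM - 1];
        int nodes = NSYM, i;
        double mean = 0.0;

        for (i = 0; i < NSYM; i++) {
            w[i] = freq[i]; live[i] = 1; parent[i] = -1;
        }

        /* Repeatedly merge the two lightest live nodes. */
        while (nodes < 2 * NSYM - 1) {
            int a = -1, b = -1;
            for (i = 0; i < nodes; i++) {
                if (!live[i]) continue;
                if (a < 0 || w[i] < w[a]) { b = a; a = i; }
                else if (b < 0 || w[i] < w[b]) b = i;
            }
            w[nodes] = w[a] + w[b];
            live[nodes] = 1; parent[nodes] = -1;
            live[a] = live[b] = 0;
            parent[a] = parent[b] = nodes;
            nodes++;
        }

        /* Each leaf's depth is its opcode length in bits. */
        for (i = 0; i < NSYM; i++) {
            int bits = 0, p = i;
            while (parent[p] >= 0) { p = parent[p]; bits++; }
            printf("%-8s freq %.2f -> %d bits\n", name[i], freq[i], bits);
            mean += freq[i] * bits;
        }
        printf("mean opcode length: %.2f bits (vs. 3 bits fixed)\n", mean);
        return 0;
    }

With these made-up numbers the mean comes out to about 2.7 bits versus
3 bits fixed; the more skewed the real distribution, the bigger the win.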

All in all, I liked the machine.  It was diametrically opposed to
current RISC ideas, but it was a clean implementation of its
philosophy, and one worth looking at when comparing architectural
concepts.

Anyone else on the net from SUNY/Buffalo want to support or refute
this?  Are you out there Bill Hopkins?

P.S. I also worked on the Microdata 1600 -- puke! blecch!  It was an
ugly microarchitecture, impossible to use, and slow as mud.  Other than
that, it was perfect.
-- 
Mike Lutz	Rochester Institute of Technology, Rochester NY
UUCP:		{allegra,seismo}!rochester!ritcv!mjl
CSNET:		mjl%rit@csnet-relay.ARPA

jer@peora.UUCP (J. Eric Roskos) (06/24/85)

> Actually, my impression was that most everyone had been discouraged by
> the B1700's flaws.

This is most certainly untrue!  Before I got involved in my research
on multiprocessor memory primitives, I was very active in this area
myself.

It is unfortunate that the popular interest in RISC machines has so
obscured the equally active (in fact, possibly more widespread) research
on CISC machines.

But if you want to see what is going on, just read the proceedings of one
of the annual Conferences on Microprogramming, MICRO-n, where n is a number
around 18 or so at present, I think (when I published in it, it was 14).

There is quite a lot of research going on.  Some of the most interesting is
in "vertical migration"; this involves programs which analyze the usage of
instruction sequences in existing programs, and migrate portions of the
programs into microcode, based on various optimization criteria.
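
As a rough illustration of the analysis step only, here is a toy C
sketch: tally adjacent opcode pairs in a dynamic trace and flag the hot
pairs as candidates for migration into microcode.  The opcode set, the
trace, and the threshold are all invented; the actual tools in the
literature analyze longer sequences against much richer cost models.

    /* Toy "vertical migration" profiler: count adjacent opcode pairs
     * in an instruction trace; frequently executed pairs are candidates
     * for replacement by a single microcoded operation. */
    #include <stdio.h>
    #include <string.h>

    #define NOP 6
    static const char *op[NOP] = { "LOAD", "ADD", "STORE", "CMP", "BR", "CALL" };

    int main(void)
    {
        /* An invented dynamic trace, as opcode indices into op[]. */
        int trace[] = { 0, 1, 2, 0, 1, 2, 3, 4, 0, 1, 2, 5, 0, 1, 2, 3, 4 };
        int n = sizeof trace / sizeof trace[0];
        int count[NOP][NOP];
        int i, j;

        memset(count, 0, sizeof count);
        for (i = 0; i + 1 < n; i++)          /* tally adjacent pairs */
            count[trace[i]][trace[i + 1]]++;

        for (i = 0; i < NOP; i++)
            for (j = 0; j < NOP; j++)
                if (count[i][j] > 1)         /* crude migration threshold */
                    printf("%s+%s executed %d times -> microcode candidate\n",
                           op[i], op[j], count[i][j]);
        return 0;
    }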

Besides, I don't see where the complaints against the B1700 come from.
When I was an undergraduate, I worked with people who had been around since
the days of the early IBM machines; and I recall how many of them very
fondly remembered the B1700. (In fact, the manager of the DP shop at the
college I attended often complained that the newer machines didn't have
this or that feature of the B1700.) However, I have no experience with it,
so I can hardly judge that myself.
-- 
Shyy-Anzr:  J. Eric Roskos
UUCP:       ..!{decvax,ucbvax,ihnp4}!vax135!petsd!peora!jer
US Mail:    MS 795; Perkin-Elmer SDC;
	    2486 Sand Lake Road, Orlando, FL 32809-7642

	    "Erny vfgf qba'g hfr Xbqnpuebzr."

warren@tikal.UUCP (Warren Seltzer) (06/25/85)

Interpret this line with the natural language currently in your cortex ;-)

As a user of the B1700, in a harsh commercial environment, I would like to
add that the software was at least three levels above all the competition
available at the time.  When the B1700 came out, the competing IBM machine
was not even able to print and compute at the same time!  I was told that
the IBM salesmen claimed that spooling "didn't make the printer any faster".

The thing was great; it would run three Cobol programs at the same time.
(I admit it publicly, I've programmed Cobol ...)  Several years later, the
Z80 CP/M system came out and, with twice the memory, couldn't walk and chew
gum at the same time -- I was sorely disappointed.

The great weakness of the B1700 was its miserable reliability.  The static
charges from the printer would bring down the CPU for days.  The disk drives
crashed heads, the tape drives would skip out of alignment weekly, but they
would stream!  The MCP software would write one block after another without
stopping, if things went just right.  We became close friends with the
repair crew, but not with our management.

The Cobol compiler supported RECURSION!!  You could write those miserable
imitation subroutines that Cobol had (PERFORMs) and it would correctly
recurse!

When it ran, it was great.  Burroughs could have built a version that 
supported the architecture of its Medium and Large series of machines, but
the B1700 eventually died out.

  WHY RISC SUCCEEDS -- AND VARIABLE MICROCODE DOESN'T

The success of RISC is based as much on physics as on software.  The
cost of variable microcode depends on the cost of fast RAM.  It's faster
to simply give the fast RAM to the user, and make it available as a set
of sliding register windows (a la Patterson), than to force another
level of interpretation.
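
For the record, here is a toy C model of what sliding register windows
buy you; the window size, the overlap, and the absence of overflow
handling are all simplifications (a real RISC traps and spills old
windows to memory when the register file fills up):

    /* Toy model of sliding register windows: the fast RAM is one big
     * register file; a call slides the visible window so the caller's
     * "out" registers become the callee's "in" registers, passing
     * arguments with no loads or stores. */
    #include <stdio.h>

    #define NREGS   64      /* total fast RAM, seen as registers  */
    #define WINDOW  16      /* registers visible at any moment    */
    #define OVERLAP  4      /* out-registers shared with a callee */

    static int regfile[NREGS];
    static int cwp = 0;     /* current window pointer */

    /* Register r (0..WINDOW-1) of the current window. */
    static int *reg(int r) { return &regfile[(cwp + r) % NREGS]; }

    static void call(void) { cwp = (cwp + WINDOW - OVERLAP) % NREGS; }
    static void ret(void)  { cwp = (cwp + NREGS - (WINDOW - OVERLAP)) % NREGS; }

    int main(void)
    {
        *reg(WINDOW - OVERLAP) = 42;          /* caller writes first out-reg */
        call();                               /* slide the window ...        */
        printf("callee sees %d\n", *reg(0));  /* ... callee reads in-reg 0   */
        ret();
        return 0;
    }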

  RISC ~ CISC

The machine that the B1700 used to INTERPRET all that variable microcode
was very simple.  The registers were used in an innovative way.  Move
something to the X register, move something else to the Y register.  Now,
to get the sum, read the "X+Y" register.  To get the product, read the
"X*Y" register, and so on.  Parallel execution!  If the Burroughs B1700
series were alive today, Burroughs (or is it Burroughs-Univac?)
could build a RISC, sliding register windows and all, and have it
interpret that old Cobol microcode.  But since they control the
compiler, they could change the code generated for those old Cobol
programs, even generate RISC instructions, if they wanted to.  There are
THREE levels of control:  1) the code the Cobol compiler generates, 2) the
microcode that interprets the generated code, and 3) the hardware that
interprets the microcode of level 2.
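
To make that register scheme concrete, here is a toy C model of level
3: reading a "result register" simply selects one of several ALU
outputs, all computed combinationally and in parallel from the current
X and Y.  Only the "X+Y" and "X*Y" registers come from the description
above; the AND function is an assumed extra for the sake of the sketch.

    /* Toy model of the B1700-style operand/result registers: write X
     * and Y once, then read whichever function of them you want. */
    #include <stdio.h>

    static unsigned X, Y;                    /* operand registers */

    /* "Result registers": each read is a combinational ALU output. */
    static unsigned read_sum(void)  { return X + Y; }  /* the "X+Y" register */
    static unsigned read_prod(void) { return X * Y; }  /* the "X*Y" register */
    static unsigned read_and(void)  { return X & Y; }  /* assumed function   */

    int main(void)
    {
        X = 6; Y = 7;                        /* move the operands in    */
        printf("X+Y = %u\n", read_sum());    /* then just read whatever */
        printf("X*Y = %u\n", read_prod());   /* results you need -- all */
        printf("X&Y = %u\n", read_and());    /* computed in parallel    */
        return 0;
    }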

As electronics migrates through VLSI to giga-chips, the idea may come back.
The thing is still viable; all they need to do is write new microcode for C
and burn all that Cobol stuff!

	teltone:warren