[comp.arch] 8086 design goals

sbq@verdix.com (Sam Quiring) (05/10/89)

In article <3312@bd.sei.cmu.edu> firth@sei.cmu.edu (Robert Firth) writes:
>In article <912@aber-cs.UUCP> pcg@cs.aber.ac.uk (Piercarlo Grandi) writes:
>
>>Well, the problem with the Intel architecture is that it was designed for
>>Pascal, whose pointers can only point to heap allocated objects (and no
>>arithmetic on them is allowed).
>
>Leaving aside the question whether the Intel 8086 architecture was
>"designed" for anything, let alone Pascal ...

The Intel 8086 was definitely *not* designed with Pascal in mind (I did get
a good belly laugh out of that statement).  It was designed to provide an
upgrade path for 8080/8085 assembly language software (see note below), to
have 16-bit arithmetic (the 8080 had 8-bit arithmetic), and to be able to
address one megabyte of memory.  The 8085 gave Intel a lot of problems in
production, so they were very conservative about how much silicon the 8086
was allowed to use.  I believe it has about 29,000 transistor equivalents (the
486 has 1.2M!).  The 8086 was announced and available with PL/M-86 support
in June 1978.

Above all else, the 8086 was designed to make Intel a lot of money.  You may
not like the architecture, but I'd say it met all its design goals.

Note: Intel provided an asm80 -> asm86 converter program, written by
John Crawford, architect of the 386/486.

Sam Quiring
Verdix Western Operations
uunet!vrdxhq!verdix!sbq

PS: I joined Intel in July of 1977 and worked on ASM86 (just the assembler,
not the assembly language!).

pcg@aber-cs.UUCP (Piercarlo Grandi) (05/15/89)

In article <362@verdix.verdix.com> sbq@verdix.com (Sam Quiring) writes:
    In article <3312@bd.sei.cmu.edu> firth@sei.cmu.edu (Robert Firth) writes:
    >In article <912@aber-cs.UUCP> pcg@cs.aber.ac.uk (Piercarlo Grandi) writes:
    >
    >>Well, the problem with the Intel architecture is that it was designed for
    >>Pascal, whose pointers can only point to heap allocated objects (and no
    >>arithmetic on them is allowed).
    >
    >Leaving aside the question whether the Intel 8086 architecture was
    >"designed" for anything, let alone Pascal ...
    
    The Intel 8086 was definitely *not* designed with Pascal in mind (I did get
    a good belly laugh out of that statement).

It is cheap to have a good belly laugh at unsupported and fairly
inaccurate statements...  Try reading a book about the 286 by Isaacson and
Albert (Addison).

    It was designed to provide an
    upgrade path for 8080/8085 assembly language software (see note below), to
    have 16-bit arithmetic (the 8080 had 8-bit arithmetic), and to be able to
    address one megabyte of memory. [ .... ]

Laudable marketing goals for the 8086.  Too bad I was speaking about the
"Intel" architecture, in the context of segmentation/Multics.  If you want
additional corroboration that the Intel guys were oriented toward Pascal-like
languages (e.g. Ada), consider not just the segmentation scheme, which fits
Pascal exactly and causes problems for C, but also the ENTER, LEAVE and BOUND
instructions of the 286, and the ridiculous idea of putting the ring bits of
the 286 segment descriptors in the middle of the pointer, thus making 32-bit
address arithmetic difficult, which is again horrible for C but irrelevant
for Pascal-like languages.
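
For the record, ENTER builds in one instruction exactly the kind of stack
frame, complete with a display of enclosing frame pointers, that a
nested-procedure language like Pascal or Ada wants and that C never uses;
BOUND is likewise a one-instruction Pascal array bounds check (it raises
interrupt 5 when an index falls outside a bounds pair in memory).  Here is
a rough C model of ENTER's behaviour -- my own sketch from the published
description, with made-up names standing in for the CPU state, not Intel's
pseudocode:

    #include <stdint.h>
    #include <string.h>

    static uint8_t  stack[0x10000];         /* models the SS segment     */
    static uint16_t BP, SP = 0xFFFE;        /* frame and stack pointers  */

    static void push16(uint16_t w)
    {
        SP -= 2;
        memcpy(&stack[SP], &w, 2);
    }

    static uint16_t read16(uint16_t at)
    {
        uint16_t w;
        memcpy(&w, &stack[at], 2);
        return w;
    }

    /* ENTER size,level: allocate a frame and copy the enclosing display */
    static void enter(uint16_t size, uint8_t level)
    {
        uint16_t frame;
        uint8_t  i;

        level %= 32;
        push16(BP);                         /* save caller's frame ptr   */
        frame = SP;
        if (level > 0) {
            for (i = 1; i < level; i++) {   /* copy caller's display     */
                BP -= 2;
                push16(read16(BP));
            }
            push16(frame);                  /* this frame's display slot */
        }
        BP = frame;                         /* new frame pointer         */
        SP -= size;                         /* room for locals           */
    }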

    Above all else, the 8086 was designed to make Intel a lot of money.  You may
    not like the architecture, but I'd say it met all its design goals.

Its *marketing* goals! Ok....


Further historical note: when the 8086 and 80286 were designed, C and Unix
were but a gleam in many CS dept. eyes, and Pascal reigned...

Now that C/Unix are important, the 80386 has been given a more C/Unix 
orientation (of course by *adding* features...)
-- 
Piercarlo "Peter" Grandi           | ARPA: pcg%cs.aber.ac.uk@nsfnet-relay.ac.uk
Dept of CS, UCW Aberystwyth        | UUCP: ...!mcvax!ukc!aber-cs!pcg
Penglais, Aberystwyth SY23 3BZ, UK | INET: pcg@cs.aber.ac.uk

rcd@ico.ISC.COM (Dick Dunn) (05/17/89)

In article <362@verdix.verdix.com>, sbq@verdix.com (Sam Quiring) writes:
...
> >In article <912@aber-cs.UUCP> pcg@cs.aber.ac.uk (Piercarlo Grandi) writes:
> >>Well, the problem with the Intel architecture is that it was designed for
> >>Pascal, whose pointers can only point to heap allocated objects ...
...
> The Intel 8086 was definitely *not* designed with Pascal in mind (I did get
> a good belly laugh out of that statement)...

Done laughing?  OK, good, it's our turn to laugh now.  The 8086 architects
most definitely had Pascal in mind when they designed the chip.  The
segment registers reflect this quite directly.  They had wanted more
segment registers to avoid thrashing, but concerns about chip size forced
them to use what they considered the bare minimum of four.  These were
quite closely tied to the notion of a Pascal-ish model of memory: code,
data (global variables), stack (local variables and parameters), and the
extra-segment register, which was nominally for the heap but also the "out"
for cases (such as var parameters) where you didn't know the segment.  BP
also reflects a (perceived) need of a Pascal implementation, and the
addressing modes correspond to what were expected to be common access forms
in Pascal.
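
To make the mapping concrete, here is a small illustration (written in C
for brevity, with made-up names; a sketch of the intended usage, not any
particular compiler's documented output):

    int total;                      /* global variables      -> DS        */

    char *heap_item;                /* heap objects, reached through far  */
                                    /* pointers loaded into ES as needed  */

    void add(int amount)            /* code                  -> CS        */
    {
        int doubled = amount * 2;   /* locals and parameters -> SS, [BP]  */
        total += doubled;
    }

    /* A var parameter may refer to a global, a local, or a heap object,
       so the callee cannot assume a segment and usually reaches it
       through ES as well. */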

In fact, ES did turn out to be a bottleneck with compilers that didn't pay
careful attention to it.  Probably the conceptual failure here was in
underestimating the frequency of use of var parameters and overestimating
the role of the heap.

One of the architects of the 8086 had done a lot of work with Pascal before
he went to Intel.  Shortly after the 8086 came out, two of the architects
left Intel and started a company to do--guess what?  Pascal compilers and
associated development tools for the 8086!  Trust me; they were thinking
Pascal during the (all too brief) design period.
-- 
Dick Dunn      UUCP: {ncar,nbires}!ico!rcd           (303)449-2870
   ...Relax...don't worry...have a homebrew.

baum@Apple.COM (Allen J. Baum) (05/17/89)

[]
Various people have said stuff about whether the 8086 was designed for PASCAL.
An anecdotal kind of story:

When I was consulting for Apple (when the LISA project was just starting
up, and they were trying to decide which processor to use), a couple of
people who had been consultants to Intel gave a presentation on the
architecture and how it was designed to run PASCAL.  I believe that Steve
Glanville was one
of those people. The implication was that they had been involved in the
architectural definition of the chip.

More anecdote: I suggested another option, rather than the 8086. I
investigated it for a week, then came back and told them not to screw
around, the 8086 would run rings around it. However, they were so
enamored with the idea that they stuck with it for a year or so, then
gave up. By that time, the 68000 was actually available, and they
easily chose that over the 8086.
--
		  baum@apple.com		(408)974-3385
{decwrl,hplabs}!amdahl!apple!baum

mcdonald@uxe.cso.uiuc.edu (05/18/89)

>
>>Well, the problem with the Intel architecture is that it was designed for
>>Pascal, whose pointers can only point to heap allocated objects (and no
>>arithmetic on them is allowed).
>
>Leaving aside the question whether the Intel 8086 architecture was
>"designed" for anything, let alone Pascal ...

Why are segments "desirable" for Pascal? I can see why multiple
data segments are undesirable for C, but not why they are DESIRABLE
for Pascal. 

In any case, even if segments were desirable, wouldn't there still
be NO good reason(*) for limiting segments to sizes smaller than the
memory space of the processor? What if a programmer wants to have
most of his address space filled with one big array?  An 8086 still
makes a Pascal array larger than 64K a mess.
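
To see what "a mess" means in practice: only the 16-bit offset half of a
far pointer takes part in ordinary address arithmetic, so stepping past a
64K boundary means renormalizing the segment:offset pair by hand, roughly
like this (a sketch with made-up helper names, not any compiler's actual
output; element size of one byte for brevity):

    #include <stdint.h>

    /* real-mode address formation: 20-bit linear = segment*16 + offset */
    static uint32_t linear(uint16_t seg, uint16_t off)
    {
        return ((uint32_t)seg << 4) + (uint32_t)off;
    }

    /* "huge" indexing: fold the carry back into the segment so the
       offset never wraps */
    static void huge_index(uint16_t seg, uint16_t off, uint32_t i,
                           uint16_t *out_seg, uint16_t *out_off)
    {
        uint32_t addr = linear(seg, off) + i;

        *out_seg = (uint16_t)(addr >> 4);
        *out_off = (uint16_t)(addr & 0xF);
    }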

Doug McDonald

(*) other than simply and truly running out of silicon area? 

roger@telesoft.UUCP (Roger Arnold @prodigal) (05/18/89)

Since this news thread is dealing with ancient Intel history, I'd like
to ask a question I've long been curious about and have never seen a
definitive answer to.  Maybe enough time has gone by to get a straight
answer from someone involved with the project and no longer working 
for Intel...

What I'd like to know is: what was the story behind the undocumented
instructions in the 8085?  And, as a bonus question, do the chips
currently being produced still include those instructions?

The 8085 was really a much better chip than Intel advertised.  The
undocumented instructions extended the 8080 instruction set in what I
considered to be a cleaner and more natural manner than the glitzier
extensions in the Z80.  As a compiler writer, I would have found them
very useful.  I did use them in assembly code on my old IMSAI machine.
But Intel chose to pretend that they weren't there, and marketed the
8085 strictly as an upgraded 8080, with an easier system interface and
more efficient bus.  

I suppose they didn't want to invite comparison between the 8085
extensions and those of the Z80, which were out first.  And/or they
didn't want to steal any thunder from the upcoming 8086.  But it always
struck me as weird to have those nice extensions fully implemented
in the silicon and usable by anyone who knew about them, but never
even acknowledged by the chip's maker.  

- Roger Arnold				..ucsd!telesoft!roger

davidsen@sungod.steinmetz (William Davidsen) (05/18/89)

In article <370@telesoft.UUCP> roger@telesoft.UUCP (Roger Arnold @prodigal) writes:

| The 8085 was really a much better chip than Intel advertised.  The
| undocumented instructions extended the 8080 instruction set in what I
| considered to be a cleaner and more natural manner than the glitzier
| extensions in the Z80.  

  Agreed. I was maintaining a compiler at that time, and just went ahead
and put the extensions in.  Having two-byte indirect loads and stores was a
major win.

  At that time GE was designing a hardcopy terminal called the TermiNet
2000, and I sent a copy of the article in DDJ (Dr. Dobb's Journal) to
Nelson Rosenstein in
Waynesboro. Some time later they sent me a copy of a contract between GE
and Intel for "a superset of the 8085 chip with added instructions."
If you ordered the GE part number you would get the instructions,
documented.

  I assume that Intel has not taken out the instructions, since that
would take a design effort.

  There were also (as I recall) 12 undocumented instructions in the Z80,
at least in the Zilog version, of which only a few were useful to a sane
person. Some of the others were really neat, but not useful.
	bill davidsen		(davidsen@crdos1.crd.GE.COM)
  {uunet | philabs}!crdgw1!crdos1!davidsen
"Stupidity, like virtue, is its own reward" -me

peter@ficc.uu.net (Peter da Silva) (05/18/89)

One thing I don't understand in this discussion... what feature of Pascal
made them think no object should ever be larger than 64K? It's this 64K limit
that's the main problem in the 8086 family... the segments themselves are
a relatively minor problem. Even for 'C' there's no rule against, or real
difficulty with, putting one object (or at most a few objects) in a segment.
-- 
Peter da Silva, Xenix Support, Ferranti International Controls Corporation.

Business: uunet.uu.net!ficc!peter, peter@ficc.uu.net, +1 713 274 5180.
Personal: ...!texbell!sugar!peter, peter@sugar.hackercorp.com.

trebor@biar.UUCP (Robert J Woodhead) (05/19/89)

In article <13822@steinmetz.ge.com> davidsen@crdos1.UUCP (bill davidsen) writes:
>  There were also (as I recall) 12 undocumented instructions in the Z80,
>at least in the Zilog version, of which only a few were useful to a sane
>person. Some of the others were really neat, but not useful.

The 6502 (at least some versions) had many undocumented instructions.  I
believe that they were written up in an ancient issue of Byte.  In particular,
there was a Store Immediate counterpart to Load Immediate, which stuck the
register contents in the byte following the instruction and skipped over it.
And there was
the aptly named HCF ``Halt and Catch Fire'' instruction.  This instruction
locked the CPU, made it insensitive to anything but a power-down, and rapidly
cycled the address lines.

-- 
Robert J Woodhead, Biar Games, Inc.  !uunet!biar!trebor | trebor@biar.UUCP
"The lamb will lie down with the lion, but the lamb won't get much sleep."
     -- Woody Allen.

pcg@aber-cs.UUCP (Piercarlo Grandi) (05/23/89)

In article <31963@sri-unix.SRI.COM> peter@ficc.uu.net (Peter da Silva) writes:

    One thing I don't understand in this discussion... what feature of Pascal
    made them think no object should ever be larger than 64K? It's this 64K
    limit that's the main problem in the 8086 family...

Just a moment; first of all, the 8086 and the 80286 are two different
things.  Let's look at the 8086 first: it is a 16 BIT machine.  The fact
that it has four segment registers is really like the separate I/D spaces
of the PDP-11.  It is still a 16-bit machine, only your program can address
up to 256K, if you can live with not being able to have pointers from one
space to another (fine for Pascal, not fine for C -- or Algol 68 :-> --
where you can have pointers to globals or to locals; pointers to
procedures are a different matter).

You also have a funny banking scheme to address 1M of physical memory,
much like the one the PDP-11 had in its MMU, and it would be irrelevant to
user programs except that in real mode they can change the "MMU" registers
(the segment registers) themselves.  I know of at least one machine (the
Altos 586) that, using an 8086 with an external "protected mode" MMU,
implements a diabolically fast PDP-11/70 "equivalent", i.e. 64K+64K of
addressability per user program.  (Historical note: this scheme was also
used in the Onyx C8002 with the Z8002.)

    the segments themselves are a relatively minor problem. Even for 'C'
    there's no rule against, or real difficulty with, putting one object
    (or at most a few objects) in a segment.

The 80286 is different.  It is a 16-bit machine whose far addresses are
32-bit selector:offset pairs backed by segment descriptors.  Here lies the
rub: those are descriptors, not pointers.  In particular, the ring bits sit
in the middle of the address, and this makes reinterpreting such an address
as a flat pointer hard, because pointer arithmetic across it is not easy.
Again, there is no problem for Pascal (no pointer arithmetic) and for most
languages, but for C there certainly is one.  It is not impossible -- you
have the large/huge "models", etc. -- but it is not pleasant.

Summary: the 8086 is 16-bit, so the 64K limitation is inherent; the 80286
is 16+16-bit (for addresses), so the 64K limitation is still there, even
though it could have been dispensed with had the obstacles to 32-bit
address arithmetic been removed.
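
For reference, the part of a 286 far address that gets in the way is the
selector.  Its layout, written out below as C masks (the macro names are
mine), is a descriptor-table index plus protection bits, so nothing can
carry from the offset into the selector the way real-mode arithmetic lets
a carry move from the offset into the segment:

    #include <stdint.h>

    #define SEL_RPL(s)    ((uint16_t)(s) & 0x0003)   /* ring (privilege)  */
    #define SEL_TI(s)     (((uint16_t)(s) >> 2) & 1) /* 0 = GDT, 1 = LDT  */
    #define SEL_INDEX(s)  ((uint16_t)(s) >> 3)       /* descriptor index  */

    /* Consecutive descriptors differ by 8 in selector value, and nothing
       relates their base addresses, so (selector, 0xFFFF) + 1 has no
       meaningful answer; objects bigger than 64K need help from the
       run-time or the operating system instead. */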
-- 
Piercarlo "Peter" Grandi           | ARPA: pcg%cs.aber.ac.uk@nsfnet-relay.ac.uk
Dept of CS, UCW Aberystwyth        | UUCP: ...!mcvax!ukc!aber-cs!pcg
Penglais, Aberystwyth SY23 3BZ, UK | INET: pcg@cs.aber.ac.uk

iwm@ic.ac.uk (Ian Moor) (06/03/89)

In article <4230@ficc.uu.net> peter@ficc.uu.net (Peter da Silva) writes:

   One thing I don't understand in this discussion... what feature of Pascal
   made them think no object should ever be larger than 64K? 

Possibly the Berkeley Pascal compiler's use of 16-bit indexing for arrays :-)

Ian W Moor
  UUCP: uunet!mcvax!ukc!icdoc!iwm     
  ARPA: iwm@doc.ic.ac.uk
  JANET: iwm@uk.ac.ic.doc
           
 Department of Computing   We don't need no documentation,
 Imperial College.         We don't need no source control,
 180 Queensgate            No dark sarcasm in the boardroom,
 London SW7 UK.            Manager! leave those programmers alone!
