[net.arch] What if IBM had chosen the 68000?...

scott@gitpyr.UUCP (Scott Holt) (11/20/85)

   IBM did use the 68000 in one of their micros: the 9000 (I think) from IBM
Instruments. It is a 68000-based lab computer designed primarily for
data collection and analysis. It's been out for several years (I think it's
pre-XT), and IBM has even re-packaged it as a business machine.
   The cost of the machine is a bit higher than a PC's, but I believe this is
because many interfaces for lab equipment come standard, not because of
a lot of bus decoding circuitry, as someone mentioned earlier. The general
impression I got from the reviews I have read and from talking to some of the
people who have used the one we have here is that it is a decent machine. It did
not, however, have a very serious impact on the micro market, and I doubt that
if IBM had used the 68000 in the PC it would have made much of a difference...
the market was just not ready for the power of a 68000 in a home/business
machine.
-- 
---------
I'll stop procrastinating tomorrow.

Scott Holt
Georgia Tech Po Box 36199
Atlanta, GA 30332
       
USENET: scott@gitpyr
BITNET: CCASTSH AT GITVM1

thau@h-sc1.UUCP (robert thau) (11/21/85)

> 2) CP/M Software (8080) is given no place to migrate.  CP/M programs and
>    6502 programs all have a high degree of processor loyalty that C programs
>    for 16 bit CPU's don't.  You *can't* port a cp/m program to a 68000
>    without a total rewrite.  (This may be a good thing!)  What this means
>    is that CP/M doesn't die, and maintains strength the same way the Apple
>    ][ and Commodore Architectures hang on.  The result: CP/M and the 6502
>    are the only serious contenders against IBM.

WRONG.  Even the 8086 gave CP/M software no place to migrate.  It is true that
the register set-up is similar to that of the 8080.  However, the instruction
sets are not in one-to-one correspondence.  I recall a BYTE article several
years back which compared a few 8080-to-8086 translators which were on the
market at the time; all of them had to expand one instruction to three in some
cases, and many could be tricked into far worse.  There were the inevitable
problems with translation of operating system calls.  Lastly, any directly
translated software would be unable to use more than 64K bytes.
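To make the mismatch concrete, here is a toy sketch (in Python, just to
tabulate; the three-entry table is illustrative and nothing like a complete
translator, though the register renaming A->AL, B->CH, DE->DX, HL->BX is the
one those products used):

```python
# Toy sketch of why mechanical 8080-to-8086 translation bloats code.
# Some 8080 instructions map 1:1 onto the 8086; others have no direct
# equivalent and must be expanded into a sequence.
TRANSLATION_TABLE = {
    "MOV A,B": ["MOV AL,CH"],    # 1:1 -- registers just get renamed
    "XCHG":    ["XCHG DX,BX"],   # 1:1 -- swap DE/HL becomes swap DX/BX
    # XTHL (exchange HL with the top of the stack) has no 8086
    # counterpart, since the 8086 cannot use SP in a memory operand:
    "XTHL":    ["POP AX",        # AX := old top of stack
                "XCHG AX,BX",    # swap it with HL's stand-in, BX
                "PUSH AX"],      # push the old BX back
                # ...and since this clobbers AX, a careful translator
                # must save/restore it too, expanding the code further.
}

def translate(program):
    """Mechanically expand a list of 8080 mnemonics into 8086 code."""
    out = []
    for insn in program:
        out.extend(TRANSLATION_TABLE[insn])
    return out

print(translate(["MOV A,B", "XTHL"]))  # 2 instructions in, 4 out
```

Multiply that expansion over a whole program, add the OS-call and 64K
problems, and "migration" starts to look like a rewrite.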

A bit of history:  one of the biggest problems with the IBM PC during its first
year out was that there was no software available.  In fact, somebody came out
with (of all things) an 8080 coprocessor board, called, I believe, Baby Blue,
so that people could run available CP/M software on otherwise useless big blue
paperweights.  If the chips really were *that* compatible, this never could
have flown.

farren@well.UUCP (Mike Farren) (11/22/85)

After eliminating a LOT of verbiage:
In article <428@ecn-pc.UUCP>, wdm@ecn-pc.UUCP (Tex) writes:
> In article <456@looking.UUCP> brad@looking.UUCP (Brad Templeton) writes:
> >>
> >>    Think what the world would be like now if IBM had decided to go with
> >>    the Motorola family of chips for the PC series.  WOW!!  We would
> >>    really have some systems out there.  
> >
> >2) CP/M Software (8080) is given no place to migrate. (...)
> 
>     Is a sizable percentage of ms-dos software old cp/m software? It
>     would surprise me if it were.  Or maybe I should say it would sur-
>     prise me if it weren't rewritten in a major way, seeing as how the
>     operating systems are not at all alike.  I would have rather they
>     were rewritten for the 68000 environment.
 	
	In fact, MOST of the software first available for the IBM PC
(that which made it a popular product) was old CP/M software (WordStar,
dBASE II, etc.).  Without that quickly available software base, the PC
would have languished for far longer than it did.  Also, without that
starting point, it would have taken a lot longer for the state of the
software art (as far as PCs go) to advance to where it is now.
   The two operating systems are very much alike.  There are even hooks
in MS-DOS to allow I/O calls in EXACTLY the same fashion as CP/M!
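The resemblance shows in the function numbers themselves. A rough sketch
(Python only to tabulate; these are the well-known shared calls -- CP/M takes
the number in register C, DOS 1.x accepts the same values in AH via INT 21h,
and even keeps CP/M's CALL 5 entry point):

```python
# Function numbers meaning the same thing to CP/M 2.2's BDOS and to
# early MS-DOS's INT 21h -- the low DOS calls were copied from CP/M so
# translated programs could keep their call numbers unchanged.
SHARED_CALLS = {
    1:  "console input with echo",
    2:  "console output",
    9:  "print string terminated by '$'",
    15: "open file (via FCB)",
    16: "close file (via FCB)",
    20: "sequential read (via FCB)",
    21: "sequential write (via FCB)",
}

for number, meaning in sorted(SHARED_CALLS.items()):
    print(f"function {number:2d}: {meaning}")
```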


-- 
           Mike Farren
           uucp: {dual, hplabs}!well!farren
           Fido: Sci-Fido, Fidonode 125/84, (415)655-0667
           USnail: 390 Alcatraz Ave., Oakland, CA 94618

rcd@opus.UUCP (Dick Dunn) (11/22/85)

> I didn't really want to get dragged into this, but a comparison of the sizes
> of executables (using size(1), and only adding up .text) of the stuff in
> /bin and /usr/bin on a 68k UNIX (Sun-2) versus a 286 UNIX SYS V shows that the
> 286 binaries are only 65% of the size of the 68k binaries.  I think Brad's
> argument *is* valid.

If you compare sizes, you're not just comparing processors.  You're
comparing compilers (as well??? mostly???)

My comparisons show that 186 code is about 70% larger than 68010 code...but
I'm going against my preceding argument.  186 code <<as generated by the
Intermetrics C compiler>> is about 70% larger than 68010 code <<as
generated by OUR 68010 compiler>> when measured <<on the set of test cases
which concerned me at that time>>.
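For anyone who wants to repeat the tally, a sketch (assuming a BSD-style
size(1) that prints one header line with the text size in column 1 -- adjust
the column and header handling for your system):

```shell
# Sum the .text sizes of everything in /bin and /usr/bin.
# size(1)'s complaints about scripts and other non-binaries are
# discarded; awk skips the header line and totals column 1.
size /bin/* /usr/bin/* 2>/dev/null |
    awk 'NR > 1 { total += $1 } END { print total }'
```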
-- 
Dick Dunn	{hao,ucbvax,allegra}!nbires!rcd		(303)444-5710 x3086
   ...If you get confused just listen to the music play...

brownc@utah-cs.UUCP (Eric C. Brown) (11/22/85)

In article <768@h-sc1.UUCP> thau@h-sc1.UUCP (robert thau) writes:
>> 2) CP/M Software (8080) is given no place to migrate.  CP/M programs and
>>    6502 programs all have a high degree of processor loyalty that C programs
>>    for 16 bit CPU's don't.  What this means
>>    is that CP/M doesn't die, and maintains strength the same way the Apple
>>    ][ and Commodore Architectures hang on.  
>WRONG.  Even the 8086 gave CP/M software no place to migrate.  It is true that
>the register set-up is similar to that of the 8080.  However, the instruction
>sets are not in one-to-one correspondence.  I recall a BYTE article several
>years back which compared a few 8080-to-8086 translators which were on the
>market at the time; all of them had to expand one instruction to three in some
>cases, and many could be tricked into far worse.  There were the inevitable
>problems with translation of operating system calls.  Lastly, any directly
>translated software would be unable to use more than 64K bytes.
>

Actually, the software *did* migrate; Microsoft Basic and early WordStar
versions for the IBM PC were nothing more than (poorly) translated copies of
8080 Microsoft Basic and WordStar.  Try disassembling IBM Cassette Basic or
WordStar 3.0 or so.  As for the OS call translation problems, MS-DOS 1.0
was essentially CP/M 2.2 with a different file structure.  The majority of
programs that did nothing but open, close, read, and write text files could
be ported without any major changes.  On your final point, you are correct,
most translated software didn't use more than 64K.  WordStar simply swapped
back and forth within its buffer of 20K or so, regardless of how much memory
was actually there.  That's why RAMDisks were so popular; if you couldn't
get at more than 64K above DOS, it made sense to load the rest of the stuff
into RAM and use it as a disk.

Eric C. Brown
brownc@utah-cs

jmoore@mips.UUCP (Jim Moore) (11/22/85)

And what if Eleanor Roosevelt had been able to fly?

peter@graffiti.UUCP (Peter da Silva) (11/23/85)

> would have languished for far longer than it did.  Also, without that
> starting point, it would have taken a lot longer for the state of the
> software art (as far as PCs go) to advance to where it is now.

Maybe to advance to the point it was at in 1982 or 1983. Since then, more
effort has been spent on bypassing the limitations of the "OS" and CPU
of the IBM PC than on advancing the state of the art.

Evidence:

	#1 witness: Sidekick, a program that would be totally redundant
	   in a multitasking operating system.

	#2 witness: The rash of "integrated" applications on the market.
	   As has been shown by the Macintosh "Switcher" program and the
	   multijob programs for the PC, most if not all users are better
	   served by a series of co-operating programs.

	#3 witness: Concurrent this, concurrent that, concurrent the other
	   thing. None of these multitaskers is as easy to use, nor as
	   versatile and reliable, as RSX... an O/S running on a machine
	   with even worse memory restrictions... let alone UNIX.

	#4 witness: The AMIGA. This is where the PC should have been years
	   ago.
-- 
Name: Peter da Silva
Graphic: `-_-'
UUCP: ...!shell!{graffiti,baylor}!peter
IAEF: ...!kitty!baylor!peter

ark@alice.UUCP (Andrew Koenig) (11/24/85)

Well, IBM *DID* choose the 68000... to use in a little box
they called the CS9000.  Apparently it didn't catch on.

As to why they chose the 8088 instead of the 8086, Philip Estridge
explained it in an interview a few years ago.  At the time they
had to start manufacturing, the 8086 was simply much more expensive,
and they couldn't afford to use it in a product that would be able
to sell at the price they wanted to charge.

dave@heurikon.UUCP (Dave Scidmore) (11/27/85)

> Well, IBM *DID* choose the 68000... to use in a little box
> they called the CS9000.  Apparently it didn't catch on.
> 
As I recall (my memory may be faulty), the CS9000 was a lab machine designed
for an entirely different market than the "PC." It seems obvious to me that
a machine designed for lab use would generally be more expensive than a
business machine and would incorporate features that the business market would
not be willing to pay for. To expect a machine designed for one
application to "catch on" in another does not make sense. In addition, the
machine did not have nearly the marketing thrust behind it that the
PC did. Proof of this is that it is such a little-known machine.

Also, I have heard some talk that IBM did not make the decision on which
processor was to go in the PC; some small firm did, and was later bought out
by IBM. Somewhere down the line, IBM had to make the internal decision to go
with the 8088 in as big a way as they did. Companies the size of IBM
simply do not put the vast amounts of money required into sales and third
party support for a new product without first checking out the sources for
all of the components. In the type of analysis required to get a product
to market, the factors that weigh most heavily are not technical at all.
The heaviest factors are things such as the stability of the company
supplying parts, the cost of the parts, how that cost affects the cost of
the end product, and whether the price of the end product will be acceptable
to the marketplace. In the final analysis, it was probably a balance of all
factors that made IBM choose the 8088 as the right tool for the job. To say
that it is all black and white, that IBM based their decision entirely on
performance or entirely on the market, or that some small company nobody
ever heard of made the decision on which processor to use for IBM, is to view
the corporate decision making process much too simplistically.

						Dave Scidmore

hsu@eneevax.UUCP (Dave Hsu) (11/27/85)

In article <143@heurikon.UUCP> dave@heurikon.UUCP (Dave Scidmore) writes:
>> Well, IBM *DID* choose the 68000... to use in a little box
>> they called the CS9000.  Apparently it didn't catch on.
>> 
>As I recall (my memory may be faulty) the CS9000 was a lab machine designed
>for an entirely different market than the "PC." It seems obvious to me that
>a machine designed for lab use would generally be more expensive than a
>business machine and incorporate features that the business market would not
>be willing to pay for. To expect a machine that was designed for one
>application to "catch on" in another does not make sense. In addition the
>machine mentioned did not have nearly the marketing thrust behind it that the
>PC did. Proof of this is the fact that it is such a little known machine.
>
I don't remember seeing much hoopla about the 9000 either.  The PC, by
contrast, enjoyed such popularity as an untried product that IBM employees
sometimes found themselves at the end of a 6+ month waiting list.
Although (and I know some of you must share this suspicion) I believe that
IBM at times uses about three times as many parts as necessary to accomplish
anything, another problem facing the 9000 was that its parts density, according
to the Byte/IBM interview, exceeded commercial manufacturing limits until only
a few months before its release.  Not exactly a tiny machine for what it did,
the 9000 was ill-suited to being mass-marketed and serviced.

-dave
[std disclaimer goes here; I have no connection to Ichi Bichi Motors, or any
 other company sharing their initials.]
-- 
Spoken: David Hsu	ARPA: 	hsu @ eneevax.umd.edu	hsu @ mit-prep.arpa
UUCP:	{seismo,allegra}!umcp-cs!eneevax!hsu		BITnet: CF522 @ UMDD
USnail: Communication & Signal Processing Lab, Dept of Electrical Engineering
	University of Maryland,  College Park, MD 20742

"I realized my destiny when God came to me in a dream and told me
 that I was to be King of All Maryland..."
	-King Tom II; after King Tom III; after King Tom II; after Prince Fred..