[net.micro.pc] 64K segments are good for you

brad@looking.UUCP (Brad Templeton) (03/04/85)

I've struggled quite a bit with 64K address limitations, and I'll be
the first to say that they are a pain.

But don't knock segmentation so quickly.  Have you ever wondered why
those 68000 workstations are so slow and memory-wasteful, and how a
PC/AT can beat most of them in performance?  The answer is segmentation.
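
For the record, the 8086 forms a 20-bit physical address by shifting a
16-bit segment register left four bits and adding a 16-bit offset, so a
pointer that stays inside one segment needs only the 16-bit offset.  A
minimal sketch of the arithmetic (my illustration, in portable C):

    #include <stdio.h>

    /* 8086 address formation: physical = (segment << 4) + offset.
       Segment and offset are each 16 bits; the sum is 20 bits. */
    unsigned long physical(unsigned int segment, unsigned int offset)
    {
        return ((unsigned long)segment << 4) + offset;
    }

    int main(void)
    {
        /* segment 0x1234, offset 0x0010 -> physical 0x12350 */
        printf("%05lX\n", physical(0x1234, 0x0010));
        return 0;
    }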

Most Unix applications run in 64K, as they were designed for that.
So let them run with 16-bit pointers, and boy are they fast and small.
You can't run Unix on a 68000 these days without 20 megs of disk and
1 meg of RAM.
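
To see what 16-bit pointers buy, compare a list node whose link is a
full machine pointer with one whose link is a 16-bit offset into a 64K
arena.  A sketch (mine, any C compiler will do):

    #include <stdio.h>
    #include <stdint.h>

    /* link as a full pointer vs. link as a 16-bit offset into an arena */
    struct big_node   { struct big_node *next; short val; };
    struct small_node { uint16_t next;         short val; };

    int main(void)
    {
        printf("pointer link: %u bytes/node\n",
               (unsigned)sizeof(struct big_node));
        printf("16-bit link:  %u bytes/node\n",
               (unsigned)sizeof(struct small_node));
        return 0;
    }

Half the size per node, so twice as many nodes fit in the same memory.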

You can actually be up and running Unix on a 256K Z8001 or 8086 system,
and often beat out the 68000s.

Segmentation offers you a nice deal: small programs stay as small as they
can be, and large programs are still possible, at a cost.  And with
hardware designed for this, there could be no penalty at all.
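
Concretely, the deal is two pointer flavours: a near pointer is just a
16-bit offset inside the current segment (one indexed load), while a far
pointer carries segment and offset, reaches anywhere, and costs twice the
space plus a segment-register load.  DOS compilers spell this with
non-standard near/far keywords; here is my emulation in portable C:

    #include <stdint.h>
    #include <stdio.h>

    typedef uint16_t near_ptr;                     /* offset only: 2 bytes */
    typedef struct { uint16_t seg, off; } far_ptr; /* seg + offset: 4 bytes */

    static uint8_t arena[1u << 16];    /* one simulated 64K segment */

    /* near dereference: a single indexed load, no segment arithmetic */
    static uint8_t load_near(near_ptr p) { return arena[p]; }

    int main(void)
    {
        arena[0x0010] = 42;
        near_ptr p = 0x0010;
        far_ptr  q = { 0x1234, 0x0010 };
        printf("near: %u bytes, *p = %u\n", (unsigned)sizeof p, load_near(p));
        printf("far:  %u bytes\n", (unsigned)sizeof q);
        return 0;
    }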

If it turns out that you can do well by limiting single objects to 64K,
you aren't bothering anybody but the numerical analysts.  I've yet to
see a non-numerical program with an object that large.  And what are they
doing on a 16-bit processor anyway?
-- 
Brad Templeton, Looking Glass Software Ltd. - Waterloo, Ontario 519/884-7473