[net.micro] 4->8->16->32->64 bit micros

planting@uwvax.UUCP (W. Harry Plantinga) (12/17/84)

In article <280@oakhill.UUCP> davet@oakhill.UUCP (Dave Trissel)
writes concerning the need for 64 bit microprocessors:

>There are a lot of finer points that can be made in this discussion but I
>believe the bottom line boils down to  the requirement of supporting data
>or addresses of more than 32-bits in size.  I simply do not find much
>interest around for either.  

It is true that there has been little interest so far in address
spaces larger than 32 bits.  However, that will change, and in fact
there has already been _some_ interest.

In their paper "A Massive Memory Machine," H. Garcia-Molina, R.
Lipton, and J. Valdes* make a case for the need for machines with
massive amounts of actual (non-virtual) semiconductor memory.  (For
"massive" read 10-100 gigabytes or more.)  The argument is basically
that while some classes of problems can be solved efficiently by existing
supercomputers, other problems essentially require large amounts of
actual memory, and can do with a slower processing speed.

They go on to show some types of problems (for example, large databases)
which will run much faster on a slow machine with a huge amount of
physical memory than on a processor with infinite clock speed but
limited (e.g. 100 megabytes of) physical memory.

At any rate, who's to say no one will ever want more than 4 gigabytes
of physical memory?

			Harry Plantinga
			planting@uwvax
			{allegra,heurikon,ihnp4,seismo}!uwvax!planting

*"A Massive Memory Machine," IEEE Transactions on Computers, Vol.
c-33, No. 5, May 1984, 391-99.