[net.physics] reversible computing

KATZ@USC-ISIF.ARPA (06/28/85)

From:  Alan R. Katz <KATZ@USC-ISIF.ARPA>

In response to the question of tunneling:

I have not read all of the Sci Amer article, so I can't answer about this
particular point.  However, Richard Feynman (and others) have constructed
Quantum Mechanical models of reversible computers, so such machines
are realizable in the Quantum world.

The trouble is that these models consist of constructing a weird
Hamiltonian for a theoretical system.  They do not really say
how an actual system could be constructed.  The one(s) mentioned
in Sci Amer may have problems, but in principle, you could do it.
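
For what it's worth, the basic connection is easy to see (my own
illustration, nothing like Feynman's actual construction): a reversible
gate is just a permutation of its input states, a permutation matrix is
unitary, and a unitary evolution is exactly what some Hamiltonian can
generate.  For example, the 2-bit controlled-NOT, checked in Python/numpy:

import numpy as np

# The controlled-NOT flips b exactly when a is 1; it is reversible.
def cnot(a, b):
    return (a, b ^ a)

# Build its 4x4 matrix over the basis states |a b>, indexed by 2*a + b.
U = np.zeros((4, 4))
for a in (0, 1):
    for b in (0, 1):
        out_a, out_b = cnot(a, b)
        U[2 * out_a + out_b, 2 * a + b] = 1.0   # column = input, row = output

assert np.allclose(U @ U.T, np.eye(4))   # a permutation matrix: unitary
assert np.allclose(U @ U, np.eye(4))     # CNOT is also its own inverse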

For those who have not read the article, it is about making computers
that generate essentially no heat, and use essentially no energy.  The
key is using reversible microscopic processes (such as billiard ball
collisions) as gates in the computer.  Because the computation is
reversible, any noise will just "undo" the computation a little, rather
than mess it up.  Since energy is conserved, the actual computation takes
essentially no energy.  You get strange things like the number of
bits being conserved (so when you generate an answer, you also must
generate its complement).
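
To make the bit-conservation point concrete, here is a little sketch in
Python (mine, not the article's) of the Fredkin gate, the reversible
controlled-swap associated with the billiard-ball model.  It is its own
inverse and never changes the number of 1 bits, so to get x AND y out of
it you also get a "garbage" bit carrying (not x) AND y:

def fredkin(c, a, b):
    """Reversible controlled swap: if c is 1, swap a and b."""
    return (c, b, a) if c else (c, a, b)

# Its own inverse, and the number of 1 bits is conserved.
for bits in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
    assert fredkin(*fredkin(*bits)) == bits
    assert sum(fredkin(*bits)) == sum(bits)

def reversible_and(x, y):
    """AND from one Fredkin gate plus a constant 0 input.  The answer
    x AND y comes with a garbage bit (not x) AND y, which has to be
    kept (or uncomputed) for the whole computation to stay reversible."""
    _, garbage, answer = fredkin(x, y, 0)
    return answer, garbage

for x in (0, 1):
    for y in (0, 1):
        answer, garbage = reversible_and(x, y)
        assert answer == (x & y) and garbage == ((1 - x) & y)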

The article discusses various ways to do these computations with
billiard ball type collisions, but then you must take quantum mechanics into
account.  Feynman and others have done this and it still all works,
at least theoretically.


			Alan
-------

trt@rti-sel.UUCP (Tom Truscott) (06/30/85)

I was surprised that the Sci Amer article "The Fundamental Physical
Limits of Computation" spent so many pages on "is a minimum amount
of energy required" and almost none on "some other questions."

The question that interests me most is: how fast can computers go?
The article only suggests that "extremely fast events can take place
without any loss of energy" but hints that there might be problems
in determining the exact time at which the event took place.
But in a sequential machine the transition times are crucial!!
Years ago a letter to the CACM (I think) suggested an upper bound
on sequential transitions/second.  It used the uncertainty principle
to relate ergs/transition to transitions/second, and then E=mc^2
to determine the point at which the computer became a black hole (!).
Now, this particular line of argument might be flawed,
but surely 'MIPS' is as interesting as ergs/operation.
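
For flavor, here is my own back-of-the-envelope version of that argument
(a rough sketch, not the CACM letter's actual reasoning or numbers): take
delta-E * delta-t ~ hbar as the energy tied up in a transition lasting
delta-t, convert it to mass via E=mc^2, and ask when the Schwarzschild
radius of that mass reaches the distance light travels in one transition.
The crossover comes out near the Planck time, i.e. very roughly 10^43
sequential transitions/second:

import math

hbar = 1.054571817e-34    # J*s
G    = 6.67430e-11        # m^3 / (kg s^2)
c    = 2.99792458e8       # m/s

def schwarzschild_radius_for_transition(dt):
    """Mass-equivalent of the ~hbar/dt of energy tied up in one
    transition of duration dt, and its Schwarzschild radius 2*G*m/c^2."""
    energy = hbar / dt        # uncertainty-principle energy scale
    mass = energy / c**2      # E = m c^2
    return 2 * G * mass / c**2

# "Black hole" point: the radius above equals the distance light travels
# in one transition, c*dt.  Solving gives dt = sqrt(2*G*hbar/c^5),
# i.e. about the Planck time.
dt_limit = math.sqrt(2 * G * hbar / c**5)
print("limiting transition time: %.1e s" % dt_limit)              # ~7.6e-44 s
print("limiting rate: %.1e transitions/sec" % (1.0 / dt_limit))   # ~1.3e43
print("radius at that point: %.1e m" % schwarzschild_radius_for_transition(dt_limit))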

Another interesting question is: how small can a computer memory be made?
If there is some minimum volume per bit, then (with signals limited
to the speed of light) no memory access is O(1)
and binary search is not O(log n).
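
As a rough sketch of why (my own illustration, with an arbitrary assumed
volume per bit): pack n bits into a sphere and the radius, hence the
round-trip time to the farthest bit, grows like n^(1/3):

import math

BIT_VOLUME = 1e-30    # m^3 per bit -- an arbitrary assumed minimum
C = 2.99792458e8      # m/s

def worst_case_access_time(n_bits):
    """Round-trip light-speed delay to the farthest bit when n_bits are
    packed into a sphere at BIT_VOLUME per bit."""
    radius = (3.0 * n_bits * BIT_VOLUME / (4.0 * math.pi)) ** (1.0 / 3.0)
    return 2.0 * radius / C

for n in (10**6, 10**12, 10**18, 10**24):
    t_access = worst_case_access_time(n)
    t_search = math.log2(n) * t_access    # worst case: every probe lands far away
    print("n=%.0e  one access: %.1e s  binary search (worst case): %.1e s"
          % (n, t_access, t_search))
# Access time grows like n**(1/3), so "O(1) memory" and hence "O(log n)
# search" quietly assume the physics away.
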
I wish the authors had spent more time on such things.
Posing interesting questions can be just as illuminating as answering them.
	Tom Truscott