[net.arch] electrons as a bound on memory size

jqj@gvax.cs.cornell.edu (J Q Johnson) (09/16/86)

JRB3 writes:
>The point is that our memory is based on electrons.

It has been argued that the number of electrons in the universe puts a bound
on the potential maximum memory size of a computer.  Reflection should convince
you that this argument is specious:  the real limitation is the number of
different energy states possible, and hence is constrained only by the Pauli
exclusion principle and by your imagination.  As a simple example, you can
encode more than 1 bit of data with a single electron in the shell of a
hydrogen atom simply by using different energy states.  For that matter you
can encode data within a nucleus by using different energy states.
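
A quick sketch in C makes the gain concrete (the level counts are invented
for illustration; compile with -lm):

    #include <math.h>
    #include <stdio.h>

    /* If one electron can sit in any of n distinguishable energy levels,
     * it encodes log2(n) bits rather than just 1.  The level counts
     * below are invented for illustration only.
     */
    int main(void)
    {
        int levels[] = { 2, 4, 16, 1024 };
        int i;

        for (i = 0; i < 4; i++)
            printf("%5d levels -> %6.2f bits per electron\n",
                   levels[i], log2((double)levels[i]));
        return 0;
    }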

Going one step further out, why use matter to encode data at all?   If you 
want bulk sequential storage, simply modulate an EM wave directed at a
distant point (that destination might be a reflector of some kind if you
know you will want to retrieve your data at the same location you generated
it).  Result:  effectively infinite bulk storage at the cost of a finite
and small number of electrons.  Granted it's not random access, but neither
are lots of existing technologies.
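
To put rough numbers on the scheme, here is a sketch in C; the modulation
rate and reflector distance are invented assumptions, not a design:

    #include <stdio.h>

    /* Bits "stored" in an EM delay line = modulation rate times the
     * round-trip time to the reflector.  Rate and distance are assumed.
     */
    int main(void)
    {
        const double c = 3.0e8;     /* speed of light, m/s */
        double rate = 1.0e9;        /* assumed modulation rate, bits/s */
        double distance = 9.46e15;  /* assumed reflector distance: ~1 light-year, m */
        double round_trip = 2.0 * distance / c;    /* seconds */

        printf("bits in flight: %.3e\n", rate * round_trip);
        return 0;
    }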

Of course, such storage techniques must come to grips with uncertainty, but
then, so must a memory based on the presence of electrons (note that even the
number of electrons in a container cannot be precisely specified!).

nather@ut-sally.UUCP (Ed Nather) (09/17/86)

In article <505@gvax.cs.cornell.edu>, jqj@gvax.cs.cornell.edu (J Q Johnson) writes:
> Going one step further out, why use matter to encode data at all?   If you 
> want bulk sequential storage, simply modulate an EM wave directed at a
> distant point (that destination might be a reflector of some kind if you
> know you will want to retrieve your data at the same location you generated
> it).  Result:  effectively infinite bulk storage at the cost of a finite
> and small number of electrons. 

I'm not sure what "effectively infinite" means, but I do know that any EM
"wave" is made up of individual photons, which must appear in groups of 5 or
more (using current technology) to generate a detectable signal.
Even if you could get it down to 2 photons (1 == off, 2 == on, 100% detection
efficiency) the storage will be far from infinite -- it will be, in fact,
finite. (*gasp*).
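
The arithmetic, sketched in C (the energy budget and carrier frequency are
invented for illustration):

    #include <stdio.h>

    /* Photon-counting limit: a fixed energy budget buys a fixed number
     * of photons at a given frequency, hence a finite bit count even at
     * 2 photons per bit.  Budget and frequency are assumed.
     */
    int main(void)
    {
        const double h = 6.626e-34;  /* Planck's constant, J*s */
        double nu = 1.0e9;           /* assumed carrier frequency, Hz */
        double budget = 1.0;         /* assumed energy budget, J */
        double photons = budget / (h * nu);

        printf("photons available:      %.3e\n", photons);
        printf("bits at 2 photons/bit:  %.3e\n", photons / 2.0);
        return 0;
    }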

There is also an energy requirement: if it were as you describe, you could
make a jim-dandy perpetual motion machine that way.


-- 
Ed Nather
Astronomy Dept, U of Texas @ Austin
{allegra,ihnp4}!{noao,ut-sally}!utastro!nather
nather@astro.AS.UTEXAS.EDU

smdev@csustan.UUCP (Scott Hazen Mueller) (09/18/86)

In article <> nather@ut-sally.UUCP (Ed Nather) writes:
>In article <>, jqj@gvax.cs.cornell.edu (J Q Johnson) writes:
>> Going one step further out, why use matter to encode data at all?   If you 
>> want bulk sequential storage, simply modulate an EM wave directed at a
>> distant point ...  Result:  effectively infinite bulk storage at the cost
>> of a finite and small number of electrons. 
>
> ...any EM "wave" is made up of individual photons, which must appear in groups
>...  2 photons (1 == off, 2 == on, 100% detection efficiency) the storage will
> be far from infinite -- it will be, in fact, finite. (*gasp*).  
>
>Ed Nather

Why bother with a presence/absence test at all?  What the original poster
was saying is that, since matter/energy (to the best of my knowledge) always
exhibits the property of multiple energy levels, it is possible to encode
multiple bits in each particle's energy level.  A useful analogy is the
distinction between bits (1s and 0s) and baud (detectable state changes):
one state change may encode several bits.  Likewise, a photon may occupy a
very large number of energy levels; probably not an infinite number.
Digressing for a moment into physics, we have E = h*nu (if you can draw a
Greek nu on an alphanumeric terminal, more power to you); this shows that if
we vary the energy of our photons, we vary their frequency.  Voila!  FM --
and we already know that this works...
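
In C, the bit/baud point is one line of arithmetic (the symbol rate and
level count are invented; compile with -lm):

    #include <math.h>
    #include <stdio.h>

    /* A symbol drawn from m distinguishable levels carries log2(m) bits,
     * so bit rate = symbol (baud) rate * log2(m).  Numbers are assumed.
     */
    int main(void)
    {
        double baud = 1200.0;       /* assumed symbol rate, symbols/s */
        int m = 16;                 /* assumed number of levels */

        printf("%g baud x log2(%d) = %g bits/s\n",
               baud, m, baud * log2((double)m));
        return 0;
    }
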
   Since the universe is finite, any storage mechanism will be by definition
finite, but the constraint is not something simple like the total number of
electrons, but rather the total number of energy states occupied by all
mass/energy in the universe.  In fact, the universe already encodes *all*
information on itself in itself :->
                           \scott
-- 
Scott Hazen Mueller                         lll-crg.arpa!csustan!smdev
City of Turlock                             work:  (209) 668-5590 -or- 5628
901 South Walnut Avenue                     home:  (209) 527-1203
Turlock, CA 95380                           <Insert pithy saying here...>

cbbrowne@watnot.UUCP (Christopher Browne) (09/18/86)

In article <505@gvax.cs.cornell.edu> jqj@gvax.cs.cornell.edu (J Q Johnson) writes:
>JRB3 writes:
>>The point is that our memory is based on electrons.
>
>It has been argued that the number of electrons in the universe puts a bound
>on the potential maximum memory size of a computer.  Reflection should convince
>you that this argument is specious:  the real limitation is the number of
>different energy states possible, and hence is constrained only by the Pauli
>exclusion principle and by your imagination.  As a simple example, you can
>encode more than 1 bit of data with a single electron in the shell of a
>hydrogen atom simply by using different energy states.  For that matter you
>can encode data within a nucleus by using different energy states.
>
It should be remembered that if we are trying to model all of the electrons in
the universe at the beginning (assuming that more have not come into existence,
which really fouls things up anyway), we must consider all of the possible
energy states for each electron.  Thus we need much more than one bit per
electron to be modeled.  There aren't enough electrons to allow storage of the
data, as well as to run the computer & program to process the data.   :-)
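
A toy version of the counting problem in C (both numbers are invented
assumptions; compile with -lm):

    #include <math.h>
    #include <stdio.h>

    /* Modelling n electrons, each in one of s possible states, takes
     * n*log2(s) bits; n electrons used as 1-bit cells supply only n
     * bits.  Electron count and states-per-electron are assumed.
     */
    int main(void)
    {
        double n = 1.0e80;          /* assumed electron count */
        double s = 1024.0;          /* assumed states per electron */

        printf("bits needed:    %.3e\n", n * log2(s));
        printf("bits available: %.3e  (1 bit per electron)\n", n);
        return 0;
    }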

-- 
            Christopher Browne
            University of Waterloo
            Faculty of Mathematics

	        "To do is to be."  -- Aristotle
         	"To be is to do."  -- Socrates
         	"Do be do be do."  -- Sinatra
         	"Do be a do bee."  -- Miss Sally of Romper Room fame.
        	"Yabba dabba do."  -- Fred Flintstone
		"DO...BEGIN..END"  -- Niklaus Wirth

peters@cubsvax.UUCP (Peter S. Shenkin) (09/22/86)

In article <ut-sally.5745> nather@ut-sally.UUCP (Ed Nather) writes:
>In article <505@gvax.cs.cornell.edu>, jqj@gvax.cs.cornell.edu (J Q Johnson) 
>  writes:

>> 			...why use matter to encode data at all? ...simply 
>> modulate an EM wave directed at a
>> distant point (that destination might be a reflector of some kind if you
>> know you will want to retrieve your data at the same location you generated
>> it).  Result:  effectively infinite bulk storage at the cost of a finite
>> and small number of electrons. 

>								...any EM
>"wave" is made up of individual photons, which must appear in groups
>(using current technology) of 5 or more to generate a detectable signal.
>Even if you could get it down to 2 photons (1 == off, 2 == on, 100% detection
>efficiency) the storage will be far from infinite -- it will be, in fact,
>finite. (*gasp*).  
>
>There is also an energy requirement...

OK, let's try to quantify this a bit.  The mass of the universe is thought to
be ca. 1e37 kg;  if all this were converted to radiant energy via
Einstein's relationship (E=mc**2) this corresponds to about 1e54 J.  Now,
a photon has energy given by Planck (E=hv, where v stands for the Greek "nu",
the frequency);  but presumably one has to use photons of E >= ~kT (T the
temperature, k Boltzmann's constant) lest the signal be lost in thermal noise.
At the present epoch, T =~ 3K (the 3-degree-K cosmic background radiation),
so kT is about 4e-23 J.  Let's say it takes 10 kT to
encode a bit at an acceptable signal-to-noise level;  then the
number of bits one can encode is (1e54/4e-22)=~2e75;  that is, (2 x 10^75).
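
Here is the same arithmetic as a sketch in C, using the figures above; it
also evaluates an assumed cooler epoch, anticipating the next point:

    #include <stdio.h>

    /* Bit bound = E_total / (10*k*T), with the post's figures: 1e54 J
     * of radiant energy, 10 kT per bit.  Evaluated at T = 3 K (today)
     * and at an assumed T = 0.3 K to show the bound rising as 1/T.
     */
    int main(void)
    {
        const double k = 1.38e-23;  /* Boltzmann's constant, J/K */
        double e_total = 1.0e54;    /* radiant-energy figure from above, J */
        double temps[] = { 3.0, 0.3 };
        int i;

        for (i = 0; i < 2; i++)
            printf("T = %3.1f K: %.3e bits\n",
                   temps[i], e_total / (10.0 * k * temps[i]));
        return 0;
    }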

Note that as the universe cools down, it gets cheaper to convey a bit, so
that the number of bits it is possible to store is continually going up!
(I pass over any difference between storing and conveying information,
because how could you use stored information if it weren't conveyed?)
This would seem to imply (perhaps stretching things a bit, but what the hell --
you only live once!) that at the moment of the big bang there was no information
present in the universe (since T was infinite, I believe).

Peter S. Shenkin	 Columbia Univ. Biology Dept., NY, NY  10027
{philabs,rna}!cubsvax!peters		cubsvax!peters@columbia.ARPA

nather@ut-sally.UUCP (Ed Nather) (09/23/86)

In article <541@cubsvax.UUCP>, peters@cubsvax.UUCP (Peter S. Shenkin) writes:
> Let's say it takes 10 kT to
> encode a bit at an acceptable signal-to-noise level;  then the
> number of bits one can encode is (1e54/4e-22)=~2e75;  that is, (2 x 10^75).
> 
> Note that as the universe cools down, it gets cheaper to convey a bit, so
> that the number of bits it is possible to store is continually going up!
> (I pass over any difference between storing and conveying information,
> because how could you use stored information if it weren't conveyed?)
> This would seem to imply (perhaps stretching things a bit, but what the hell --
> you only live once!) that at the moment of the big bang there was no information
> present in the universe (since T was infinite, I believe).
> 

Well, probably not infinite but large enough so the point stands pretty well
anyway.  Your point is interesting in the context of the "communication"
problem in the early universe.  Briefly, it arises if you go so far back
toward the Big Bang that the whole universe becomes very tiny -- and the
time after time zero is correspondingly small.  The standard argument says
that the individual parts of the universe didn't have time to communicate with
each other (assuming the speed of light then was what it is now) so how did
they all know to assume the same conditions?  The 3 degree "fireball" radiation
is remarkably uniform as we now see it, so communication MUST have been
possible.  So maybe we can't extrapolate back to that time "linearly" --
maybe the expansion from time zero wasn't uniform at the present rate;
perhaps the initial expansion was "inflated" (exponential), starting slower
but speeding up to the rate we see today, "after the communication took place."

If this seems implausible to you, welcome to 1986!  But it is the most widely
accepted cosmological model available today.  The "inflationary universe"
model was developed to explain away the inability of the differing parts of
the very early universe to communicate (and interact, so the system came to
thermal equilibrium).

Your point suggests an alternative explanation, which I find interesting:
maybe the different parts of the early universe really had nothing much to
say to each other!  If so, the "communication" problem simply disappears.

{If we get a Nobel prize for this, we split it 50/50.  OK?}

-- 
Ed Nather
Astronomy Dept, U of Texas @ Austin
{allegra,ihnp4}!{noao,ut-sally}!utastro!nather
nather@astro.AS.UTEXAS.EDU

franka@mmintl.UUCP (Frank Adams) (09/24/86)

In article <541@cubsvax.UUCP> peters@cubsvax.UUCP (Peter S. Shenkin) writes:
>This would seem to imply (perhaps stretching things a bit, but what the
>hell -- you only live once!) that at the moment of the big bang there was
>no information present in the universe (since T was infinite, I believe).

I'm not sure about the reasoning, but I believe this conclusion is correct.

Frank Adams                           ihnp4!philabs!pwa-b!mmintl!franka
Multimate International    52 Oakland Ave North    E. Hartford, CT 06108

kay@warwick.UUCP (Kay Dekker) (09/27/86)

In article <505@gvax.cs.cornell.edu>
	jqj@gvax.cs.cornell.edu (J Q Johnson) writes:

>Going one step further out, why use matter to encode data at all?   If you 
>want bulk sequential storage, simply modulate an EM wave directed at a
>distant point (that destination might be a reflector of some kind if you
>know you will want to retrieve your data at the same location you generated
>it).  Result:  effectively infinite bulk storage at the cost of a finite
>and small number of electrons.  Granted it's not random access, but neither
>are lots of existing technologies.

I seem to remember that this idea has already been proposed; read
_His_Master's_Voice_, by Stanislaw Lem.  This contains probably the
most outrageous (and *universal*) method of data storage I've ever seen...

							Kay.

-- 
"Jung'f n tbbq ohqql?  V'yy gryy lbh - n tbbq ohqql tbrf vagb gbja, trgf
 n pbhcyr bs oybjwbof, gura pbzrf onpx naq tvirf lbh bar."

peters@cubsvax.UUCP (Peter S. Shenkin) (09/30/86)

In article <mmintl.1832> franka@mmintl.UUCP (Frank Adams) writes:
>In article <541@cubsvax.UUCP> peters@cubsvax.UUCP (Peter S. Shenkin) writes:
>>This would seem to imply (perhaps stretching things a bit, but what the
>>hell -- you only live once!) that at the moment of the big bang there was
>>no information present in the universe (since T was infinite, I believe).
>
>I'm not sure about the reasoning, but I believe this conclusion is correct.

Your remark reminds me of a syllogism mentioned by a philosophy professor I once
had.  He said:  ...all camels are over 100 feet tall;  everything over 100 feet
tall has four legs;  therefore, all camels have four legs....  I always thought
this was a good metaphor for science.

You might argue that not all camels have four legs (there are birth defects,
accidents, etc.), but that's experimental error....

Peter S. Shenkin	 Columbia Univ. Biology Dept., NY, NY  10027
{philabs,rna}!cubsvax!peters		cubsvax!peters@columbia.ARPA