[comp.ai] How much info can the brain hold?

svissag@hubcap.clemson.edu (Steve L Vissage II) (11/30/90)

I've heard estimates of how many neurons the human brain contains, somewhere 
in the trillions, I believe.  Has there ever been a reliable estimate of
how much information, in bits or other computer-relevant units, can be
contained in that structure?
 
I ask because, if we don't know exactly HOW we store information (we still
don't know, do we?), can we estimate how much?
 
Steve L Vissage II

6600dt@ucsbuxa.ucsb.edu (Dave Goggin) (11/30/90)

In article <11941@hubcap.clemson.edu> svissag@hubcap.clemson.edu (Steve L Vissage II) writes:

>I've heard estimates of how many neurons the human brain contains, somewhere 
>in the trillions, I believe.  Has there ever been a reliable estimate of
>how much information, in bits or other computer-relevant units, can be
>contained in that structure?
> 
>I ask because, if we don't know exactly HOW we store information (we still
>don't know, do we?)  can we estimate how much?
>

I'd follow up with another question of brain-computer
comparison.  It is known that much of the
brain's power comes from the high degree of
parallel processing involved.  What is the speed
(in MHz, or other units) at which the brain runs, and
how does it vary with state?  Also, how does this
compare with existing parallel-processing hardware?

*dt*
 

frank@bruce.cs.monash.OZ.AU (Frank Breen) (11/30/90)

In <7492@hub.ucsb.edu> 6600dt@ucsbuxa.ucsb.edu (Dave Goggin) writes:

>In article <11941@hubcap.clemson.edu> svissag@hubcap.clemson.edu (Steve L Vissage II) writes:

>>  Has there ever been a reliable estimate of
>>how much information, in bits or other computer-relevant units, can be
>>contained in that structure?

>  What is the speed
>(in MHz, or other units) at which the brain runs, and
>how does it vary with state?  Also, how does this
>compare with existing parallel-processing hardware?

In one book by Richard Dawkins (The Selfish Gene, I think) he
estimates roughly how many bits per second of input the brain
is processing.  I can't remember the details, but it seemed
surprisingly low, on the order of real-time video.

He worked it out from the resolution of our vision, hearing, etc.
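
For scale, here is a back-of-envelope estimate in the same spirit.  The
figures are illustrative assumptions of mine, not Dawkins's actual numbers:

    # Rough sensory-bandwidth estimate (all figures assumed for
    # illustration, not taken from The Selfish Gene).
    optic_nerve_fibers = 1e6      # axons per optic nerve, rough figure
    bits_per_fiber = 10           # assumed effective bits/s per fiber
    visual_bits = 2 * optic_nerve_fibers * bits_per_fiber
    print(f"{visual_bits:.0e} bits/s")   # ~2e7 bits/s, video-like rates

Vary the inputs within reason and the total stays within a couple of
orders of magnitude of a video stream, which fits the "surprisingly low"
impression.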

Do people (who've read it) think this means anything, or what?

-- 
Frank Breen            "I am a warrior in the landscape of my mind"
Monash Uni                                                -unknown?

turner@webb.psych.ufl.edu (Carl Turner) (11/30/90)

In article <11941@hubcap.clemson.edu> svissag@hubcap.clemson.edu (Steve L Vissage II) writes:
>I've heard estimates of how many neurons the human brain contains, somewhere 
>in the trillions, I believe.  Has there ever been a reliable estimate of
>how much information, in bits or other computer-relevant units, can be
>contained in that structure?
> 
>Steve L Vissage II

The estimates of the number of neurons in the brain range from 30 billion
to 100 billion.

Most information-processing theories of memory assume that long-term
memory (LTM) is essentially unlimited.  Failures to remember are due to
either 1) a failure to encode the information (move it from STM to LTM)
or 2) a failure to retrieve the info (it's there but you just can't get
it out).  Storage capacity is not assumed to be the problem; I say 
"assumed" because, as you have probably already guessed, this is probably
impossible to prove.

The essential works on memory are by Shiffrin, Atkinson, and Tulving.
Check them out.

ObA.I.

An attempt to relate the question and follow-up article to a significant
problem in artificial intelligence: if human memory is "essentially
unlimited," what import does this have for a full (machine) working model
of human intelligence?  Let me state this another way: given a massively
parallel piece of computer hardware, is it possible to find and implement
algorithms that allow one to consider the storage there "essentially
unlimited"?
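
One concrete data point (my example, not part of the question above): for
a Hopfield-style associative memory, theoretical capacity grows only
linearly with the number of units, so "essentially unlimited" storage
would demand algorithms that scale far better.  A minimal sketch, assuming
the standard ~0.14N capacity estimate for random patterns:

    # A Hopfield associative memory stores roughly 0.138 * N random
    # patterns in N fully connected units before recall degrades.
    def hopfield_capacity(n_units: int) -> int:
        return int(0.138 * n_units)

    print(hopfield_capacity(10**11))   # ~1.4e10 patterns for 1e11 units

Linear scaling in units (quadratic in connections) is a long way from
unlimited.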

Carl Turner
turner@webb.psych.ufl.edu

eliot@phoenix.Princeton.EDU (Eliot Handelman) (11/30/90)

In article <11941@hubcap.clemson.edu> svissag@hubcap.clemson.edu (Steve L Vissage II) writes:
;I've heard estimates of how many neurons the human brain contains, somewhere 
;in the trillions, I believe.  Has there ever been a reliable estimate of
;how much information, in bits or other computer-relevant units, can be
;contained in that structure?
; 
;I ask because, if we don't know exactly HOW we store information (we still
;don't know, do we?)  can we estimate how much?

Yes. You can store some trillion things but no more. If you exceed the
number of neurons, then things will start "falling out." This causes
first baldness, then senility. This is why very intelligent men are
usually bald. They have replaced their hair with weightier bits of
information.

Well, gotta go -- my UFO's waiting.

--beep beep

ins_atge@jhunix.HCF.JHU.EDU (Thomas G Edwards) (12/01/90)

In article <7492@hub.ucsb.edu> 6600dt@ucsbuxa.ucsb.edu (Dave Goggin) writes:
>I'd follow up with another question of brain-computer
>comparison.  It is known that much of the
>brain's power comes from the high degree of
>parallel processing involved.  What is the speed
>(in MHz, or other units) at which the brain runs, and
>how does it vary with state?  Also, how does this
>compare with existing parallel-processing hardware?

The brain wins on number of processing units when compared to existing
parallel machines (such as the Connection Machine, with only 64K
processors).  This would be true even if you counted only neurons, and
not glial cells and other support mechanisms, which could play some
as-yet-unknown cognitive role.

Most neurons fire at less than 100 Hz.  But there is a large question
of whether individual pulses really mean anything by themselves
everywhere in the brain.  While there is some indication that in some
places a single pulse might be meaningful, much of the brain probably
works by frequency and phase modulation of pulse trains.  One aspect
which has been looked into is the synchronous firing of neurons
associated with different parts of the same object in the visual field.
This type of "phase-locking attentional mechanism" may explain
why there are limits to some kinds of short-term memory (like
how long a train of digits you can repeat back to me if I tell it
to you, or the maximum number of individually tracked objects in the
visual field): there may be only a limited number of "phase slots" into
which these attended stimuli can fit.
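
The arithmetic behind the phase-slot idea is easy to sketch.  If each
attended item occupies one cycle of a fast oscillation nested inside one
cycle of a slower attentional oscillation, the slot count is just the
frequency ratio.  The 40 Hz and 7 Hz figures below are assumptions for
illustration, not measurements:

    # Phase slots as a frequency ratio (illustrative numbers only).
    fast_hz = 40.0    # assumed per-item "binding" oscillation
    slow_hz = 7.0     # assumed attentional sweep frequency
    print(f"{fast_hz / slow_hz:.1f} phase slots")   # ~5.7, near 7 +/- 2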

Second, a lot more goes on in neurons than action potentials.
For example, long-term potentiation is going on, and that doesn't
easily translate into a clock speed.  Also, there are much slower
DC potentials working across cells.

And third, neurons have fairly large connectivity (up to about 10,000
inputs), which is a few orders of magnitude greater than the connectivity
of your average processor in a parallel machine (I'd guess no more than
32 if you are in a hypercube).
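
The hypercube figure is easy to check: an n-dimensional hypercube gives
each node exactly n links, so a 2^16-processor machine has only 16 per
node.  A quick sketch, using the rough synapse count from above:

    import math

    processors = 64 * 1024                       # e.g. a 64K Connection Machine
    links_per_node = int(math.log2(processors))  # hypercube degree = 16
    synapses_per_neuron = 10_000
    print(links_per_node, synapses_per_neuron / links_per_node)
    # 16 links per node vs. ~625x more synapses per neuron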

-Tom

sklarew@eniac.seas.upenn.edu (Dann Sklarew) (12/02/90)

In article <7492@hub.ucsb.edu> 6600dt@ucsbuxa.ucsb.edu (Dave Goggin) writes:
>In article <11941@hubcap.clemson.edu> svissag@hubcap.clemson.edu (Steve L Vissage II) writes:
>
>>I've heard estimates of how many neurons the human brain contains, somewhere 
>>in the trillions, I believe.  Has there ever been a reliable estimate of
>>how much information, in bits or other computer-relevant units, can be
>>contained in that structure?
>> 
>
>I'd follow up with another question of brain-computer
>comparison.  It is known that much of the
>brain's power comes from the high degree of
>parallel processing involved.  What is the speed
>(in MHz, or other units) at which the brain runs, and
>how does it vary with state?  Also, how does this
>compare with existing parallel-processing hardware?
> 

Computational neuroscientist Terrence J. Sejnowski (Salk Institute) has
cited the best current estimate of brain complexity as 10^14
synapses: "If we assume that synapses are sites of information storage,
then we can make a rough estimate for the total information stored in
the brain (given that each synapse stores only a few bits). . . around
10^14 bits."  Given an activation rate of 10 per second for each
synapse, he states that the brain must be performing at least 10^15
operations per second.  This is a full five orders of magnitude above
the capacity of the Connection Machine, one of the largest of today's
highly parallel computers.  Thus, beyond the total number of neurons in
the brain (10^12 according to neurobiologist Eric Kandel), it is this
intricate synaptic connectivity that allows the brain to out-perform
advanced VLSI technologies.
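
A quick check of that arithmetic under the stated assumptions (the "few
bits" value and the Connection Machine rate below are my assumed figures):

    synapses = 1e14
    bits_per_synapse = 2          # "a few bits" -- assumed value
    activations_per_sec = 10      # per synapse, from the estimate above
    storage_bits = synapses * bits_per_synapse     # ~2e14 bits
    ops_per_sec = synapses * activations_per_sec   # 1e15 ops/s
    cm_ops_per_sec = 1e10         # rough Connection Machine rate, assumed
    print(f"{ops_per_sec / cm_ops_per_sec:.0e}")   # 1e+05: five orders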

Ref: 

Sejnowski, Terrence. 1989. In "The Computer and The Brain: Perspectives 
	on Human and Artificial Intelligence," edited by Jean R. Brink 
	and C. Roland Haden. New York: Elsevier.

Includes a figure of the logarithm of the number of elementary operations
per second performed by the largest digital computers, plotted as a
function of time (a line estimated to reach the aforementioned brain
capability by 2020).
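
That extrapolation implies a steady doubling rate.  Assuming the largest
machine of 1990 ran at roughly 10^10 operations per second (my figure, not
the book's), reaching 10^15 by 2020 requires:

    import math

    ops_1990 = 1e10      # assumed largest-machine rate circa 1990
    ops_brain = 1e15     # Sejnowski's brain estimate
    doublings = math.log2(ops_brain / ops_1990)
    print(f"doubling every {(2020 - 1990) / doublings:.1f} years")   # ~1.8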

smoliar@vaxa.isi.edu (Stephen Smoliar) (12/03/90)

In article <33870@netnews.upenn.edu> sklarew@eniac.seas.upenn.edu (Dann
Sklarew) writes:
>
>Computational neuroscientist Terrence J. Sejnowski (Salk Institute) has 
>cited the best current estimate of brain complexity as 10^14 
>synapses: "If we assume that synapses are sites of information storage, 
>then we can make a rough estimate for the total information stored in 
>the brain (given that each synapse stores only a few bits). . . around 
>10^14 bits."

I would like to contest the assumption behind this calculation.  Actually, what
I want to contest is the validity of using a phrase like "sites of information
storage."  Therefore, I shall see your Sejnowski quote and raise you one
Edelman quote.  This comes from the exposition of his theory of memory as
a process of REcategorization (a theory recently explained quite lucidly
by Oliver Sacks in the November 22 issue of THE NEW YORK REVIEW in an article
entitled "Neurology and the Soul").  Here is Edelman talking about
recategorization on page 266 of his book, NEURAL DARWINISM:

	Inasmuch as the recategorization carried out by classification
	couples has a variable element and because categorization is a
	continually ACTIVE selective process of disjunctive partitioning
	of a world that exists "without labels," the static idea of
	information as it is used in communication theory is not very
	appropriate.  It is true that, as the amount of categorization
	of members of a set increases, there is an accrual of adaptive
	behavior and of response generalization;  it is also likely that
	this is accompanied by a decrease in degeneracy [redundancy] in
	the neural circuits mediating the response.  But inasmuch as
	there is always a trade-off between specificity and range in
	selective systems (see chapter 2), and because there is, in
	general, no prior prefixed or coded relation between an animal's
	behavior and objects and events in its present environment, it
	is not illuminating to talk of information (except A POSTERIORI,
	as an observer).  It is equally fruitless to attempt to measure
	the capacity of such a system in information theoretical terms:
	reaching a response that even vaguely categorizes a stimulus by
	trading some specificity for range puts an animal in a reasonably
	good position for adaptive behavior.  If that behavior is rewarded,
	the gain or loss in the amount of "information" is an EX POST FACTO
	judgment the efficacy of which is dubious.

=========================================================================

USPS:	Stephen Smoliar
	5000 Centinela Avenue  #129
	Los Angeles, California  90066

Internet:  smoliar@vaxa.isi.edu

"It's only words . . . unless they're true."--David Mamet

dmb@odin.icd.ab.com (David Babuder) (12/04/90)

In article <15882@venera.isi.edu> smoliar@vaxa.isi.edu (Stephen Smoliar) writes:
>
>I would like to contest the assumption behind this calculation.....
>
>	....  But inasmuch as
>	there is always a trade-off between specificity and range in
>	selective systems (see chapter 2), and because there is, in
>**	general, no prior prefixed or coded relation between an animal's
>**	behavior and objects and events in its present environment, it
>	is not illuminating to talk of information (except A POSTERIORI,
>	as an observer).  It is equally fruitless to attempt to measure
>	the capacity of such a system in information theoretical terms...
>
>
While I would support the conclusion that comparisons between the memory
of people, or animals, and that of machines are difficult, I am confused
by the apparent statement that a set of behaviors is not prefixed
or coded.  Is this a misunderstanding of the statement above, a
lack of context, or a basic difference in psychological theory, since
it appears that a number of behaviors are prefixed or coded?  I suspect
the latter, based on the title.

If you don't mind a combination of pun and analogy, read on...

We seem to have an issue of 'information hiding' on the part of
the designer, coupled with a benchmarking problem!
    (1) We cannot see into the object to see how the information
        is stored - we can only inquire of it externally.
    (2) Our 'debugging aid' is to look at some of the bit-level
        storage and guess what might be represented - since we
        do not know what is being stored, or even whether it represents
        'delta' (change from a baseline) or 'absolute' information!

This is like asking a set of computer salespeople
"How many pages of information can your computer store?"
(a sketch after this list makes the disparity concrete):

  - One vendor stores a bitmap, because they manipulate images
  - A second vendor stores ASCII text from the page, because they
    perform text processing
  - A third vendor stores deltas from an original text, because
    they focus on configuration management of source code programs
  - A fourth vendor stores a series of polynomials representing data
    values, such that they can reproduce a long series of data points
    while storing only a few pieces of information
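
Here is that sketch, with made-up but representative figures for a single
page of text under each vendor's representation:

    # Hypothetical bytes to store one "page"; every figure below is an
    # illustrative assumption, not a measurement.
    page_chars = 3000                    # ~3000 characters of text
    bitmap = 300 * 8.5 * 300 * 11 / 8    # 1-bit scan, 300 dpi, 8.5x11 in.
    ascii_text = page_chars              # one byte per character
    delta = 0.05 * page_chars            # assume 5% of the page changed
    polynomial = 16 * 8                  # 16 coefficients, 8 bytes each
    for name, size in [("bitmap", bitmap), ("ASCII", ascii_text),
                       ("delta", delta), ("polynomial", polynomial)]:
        print(f"{name:10s} {size:10.0f} bytes")
    # Same page; the answers span about four orders of magnitude.

Ask all four vendors the same question and the answers run from about a
megabyte down to around a hundred bytes.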

This is why I agree with the conclusion of the referenced article,
that we are not in a good position to measure/compare the 
information storage of people to machines. However, my basis is
that comparing objects whose design intent is very different
produces little in the way of meaningful results.

Dave Babuder - working for, but not representing
Allen-Bradley Company (ICCG)  A Rockwell International Company

smoliar@vaxa.isi.edu (Stephen Smoliar) (12/05/90)

In article <1984@abvax.UUCP> dmb@odin.icd.ab.com (David Babuder) writes:
>In article <15882@venera.isi.edu> smoliar@vaxa.isi.edu (Stephen Smoliar)
>writes:
>>
>>I would like to contest the assumption behind this calculation.....
>>
>>	....  But inasmuch as
>>	there is always a trade-off between specificity and range in
>>	selective systems (see chapter 2), and because there is, in
>>**	general, no prior prefixed or coded relation between an animal's
>>**	behavior and objects and events in its present environment, it
>>	is not illuminating to talk of information (except A POSTERIORI,
>>	as an observer).  It is equally fruitless to attempt to measure
>>	the capacity of such a system in information theoretical terms...
>>
>>
>While I would support the conclusion that comparisons between the memory
>of people, or animals, and that of machines are difficult, I am confused
>by the apparent statement that a set of behaviors is not prefixed
>or coded.  Is this a misunderstanding of the statement above, a
>lack of context, or a basic difference in psychological theory, since
>it appears that a number of behaviors are prefixed or coded?

The misunderstanding comes from your ignoring the word "prior" in the above
passage.  It is not that codes do not exist.  It is that there is no reason
to assume they exist prior to any behavior performed by an animal.

=========================================================================

USPS:	Stephen Smoliar
	5000 Centinela Avenue  #129
	Los Angeles, California  90066

Internet:  smoliar@vaxa.isi.edu

"It's only words . . . unless they're true."--David Mamet

syswerda@bbn.com (Gilbert Syswerda) (12/13/90)

In article <11941@hubcap.clemson.edu> svissag@hubcap.clemson.edu (Steve L Vissage II) writes:
>I've heard estimates of how many neurons the human brain contains, somewhere 
>in the trillions, I believe.  Has there ever been a reliable estimate of
>how much information, in bits or other computer-relevant units, can be
>contained in that structure?
> 
>I ask because, if we don't know exactly HOW we store information (we still
>don't know, do we?)  can we estimate how much?

Jacob Schwartz estimates the capacity of the human brain in an article in
the Winter 1988 Daedalus. It is an interesting article, with estimates
based on brain physiology. With regard to your questions:

  "...Such exceedingly rough quantitative guesses lead us to estimate that
the long-term memory available to the brain is about 10,000 trillion bytes
and that the amount of shorter-term data needed to characterize the state
of each of its synapses is roughly the same. The logical activity of each
neuron can then be regarded as a process that combines approximately 10
thousand input bytes with roughly 40 thousand synapse status bytes at a
rate of 100 times each second. The amount of analog arithmetic required for
this estimate is (again very roughly) 10 million elementary operations per
neuron per second, suggesting that the computing rate needed to emulate the
entire brain on a neuron-by-neuron basis may be as high as 1,000,000
trillion arithmetic operations per second. (Of course, computation rates
many orders of magnitude lower might suffice to represent the logical
content of the brain's activity if it could be discovered what this is.)
   It is interesting to compare these exceedingly coarse estimates with
corresponding figures for the largest supercomputer systems likely to be
developed over the next decade. These will probably not attain speeds in
excess of 1 trillion arithmetic operations per second, which is about
one-millionth of the computation rate that we have estimated for the
brain."
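
Reconstructing the arithmetic in that excerpt (the neuron count below is
my assumption; Schwartz does not state one in the quoted passage):

    neurons = 1e11
    synapse_status_bytes = 40_000    # per neuron, from the quote
    ops_per_neuron = 1e7             # his "10 million elementary operations"
    ltm_bytes = neurons * synapse_status_bytes   # 4e15: the order of
                                                 # "10,000 trillion bytes"
    total_ops = neurons * ops_per_neuron         # 1e18 = 1,000,000 trillion
    print(f"{ltm_bytes:.0e} bytes, {total_ops:.0e} ops/s")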

dfo@tko.vtt.fi (Foxvog Douglas) (12/19/90)

In article <61534@bbn.BBN.COM> syswerda@labs-n.bbn.com (Gilbert Syswerda) writes:
>Jacob Schwartz estimates the capacity of the human brain in an article in
>the Winter 1988 Daedalus. It is an interesting article, with estimates
>based on brain physiology. 

>  "...Such exceedingly rough quantitative guesses lead us to estimate that
>the long-term memory available to the brain is about 10,000 trillion bytes
>and that the amount of shorter-term data needed to characterize the state
>of each of its synapses is roughly the same. 

This number is unbelievable.

>The logical activity of each
>neuron can then be regarded as a process that combines approximately 10
>thousand input 

Of the many kinds of neurons, most have far fewer than 10,000 input
synapses.

>bytes

Why assume that 8 bits of data are transmitted at any instant instead of
up to 2 bits?
 
>with roughly 40 thousand synapse status bytes 

Forty thousand status bytes for ten thousand synapses is 4 bytes per
synapse; I'd guess a synapse has more like 4 states (with all synapses
independent), i.e. 2 bits.

>at a rate of 100 times each second. 

Is arithmetic done for each 100th of a second?  Aren't many signals
pulse trains in which frequency is important?

>The amount of analog arithmetic required for

I dare say that the 4*10^4 operations simultaneously
occurring in a neuron are not independent.  The neuron may activate
depending upon the number of positive inputs (often with a cutoff
function), with the possibility of some inputs alternatively forcing
the neuron either on or off.  You could implement this in an analog
computer with simple components for each synapse and minimal other
computational resources.  Using digital logic, the number of steps would
depend upon the fan-in.
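
A minimal sketch of that threshold model (the threshold value and the
force-on/force-off mechanism are illustrative assumptions):

    # Threshold unit: fires when enough inputs are positive (hard
    # cutoff); designated inputs can force the output on or off.
    def neuron(inputs, force_on=(), force_off=(), threshold=50):
        # inputs: sequence of 0/1; force_*: indices of overriding inputs
        if any(inputs[i] for i in force_off):
            return 0
        if any(inputs[i] for i in force_on):
            return 1
        return 1 if sum(inputs) >= threshold else 0

    print(neuron([1] * 60 + [0] * 40))                  # 60 active -> 1
    print(neuron([1] * 60 + [0] * 40, force_off=(0,)))  # input 0 vetoes -> 0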

Why not use 100 ICs with a fan-in of about 100, with a few bits of output
tied into another such IC?  One such array running at 200 Hz would
emulate a neuron's input & processing (a bus would handle output).  Run
it at 20 MHz and it would emulate 10^5 neurons.  (And it would therefore
process 10^12 elementary operations per second; see below.)
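
The time-multiplexing arithmetic checks out:

    base_rate_hz = 200               # one full update per emulated neuron
    clock_hz = 20e6
    neurons_emulated = clock_hz / base_rate_hz     # 1e5 neurons
    ops_per_sec = neurons_emulated * 1e7           # at 1e7 ops/neuron/s
    print(f"{neurons_emulated:.0e} neurons, {ops_per_sec:.0e} ops/s")
    # 1e+05 neurons, 1e+12 ops/s -- matching the figures above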

>this estimate is (again very roughly) 10 million elementary operations per
>neuron per second,      ^^^^^^^^^^^^

40,000 * 100 = 4,000,000; calling that 10,000,000 is rough indeed.
This seems to suggest that a single neuron has a computing speed higher
than cheap PCs (ignoring the fact that output is at most 100 bits
(bytes???) per second, although widely branched).

>suggesting that the computing rate needed to emulate the
>entire brain on a neuron-by-neuron basis 
                   ^^^^^^^^^^^^^^^^
Why not emulate on a molecule-by-molecule basis?  You'll get a higher
number.  :-)

>may be as high as 1,000,000
>trillion arithmetic operations per second. 

Assuming 100 billion neurons, each of which can perform 10^7 operations
per second: 10^11 * 10^7 = 10^18 = 1,000,000 trillion.

>(Of course, computation rates many orders of magnitude 
                               ^^^^^^^^^^^^^^^^^^^^^^^^
>lower might suffice to represent the logical
>content of the brain's activity if it could be discovered what this is.)

>   It is interesting to compare these exceedingly coarse estimates with
>corresponding figures for the largest supercomputer systems likely to be
>developed over the next decade. These will probably not attain speeds in
>excess of 1 trillion arithmetic operations per second, which is about
>one-millionth of the computation rate that we have estimated for the
>^^^^^^^^^^^^^      brain."

How does "many orders of magnitude" compare with one million?

It seems to me that even these high figures for the computational speed
of the brain, when discounted as suggested in the parenthetical
remark, are not vastly beyond what can be achieved in hardware in the
not-too-distant future.

doug foxvog
dfo@vtt.fi