[comp.ai.philosophy] My view of intelligence...

ttoupin@diana.cair.du.edu (Tory Toupin) (03/23/91)

In article <13577@helios.TAMU.EDU> rpb0804@venus.tamu.edu writes:
>I dunno about this philosophy about intelligence being inbred or evolving.  I
>personally feel that the brain is a large, organic macrocomputer.  After all,
>it stores data in "chips" (cells) using electrical currents.  It constantly 
>takes in data, processes it, acts on it.  One can never really say that it is 
>idle in the real sense; it is always doing something.  

Definitely!  IMHO, the brain is always in a dream-like state.  This is not to
say that nothing is real and this is all a dream.  No, I mean that dreams
are a sort of cacophony of mental symbols trying to come together and form new
mental concepts - and this is what the brain is always up to.  What appears to
be consciousness is a single thread of symbolic association, beginning with
some small group of initial symbols which form the seed for the later
associations that are conscious thoughts.

>                                                       The meta-thought 
>mentioned earlier - it isn't spontaneous, I'd wager.  It was probably a result 
>of a stimulus, whether you consciously realized it or not.  

I don't know which meta-thought you refer to, but I'd guess that any
conceivable meta-thought can be generated spontaneously, just by a bit of
symbolic juggling.

>I think that if a system were created with enough storage capacity to hold all 
>the data it could accumulate through its "senses" (visual, audio, tactile, 
>etc.) and a sufficient algorithm to process the data, reflexes could be 
>conditioned.  Independent actions could result from a "library" of reflexes
>(that's what WE do, after all!).  This would result in an organism as 
>intelligent as the technology supporting it.  It just so happens that we 
>haven't caught up with ourselves yet, hmm?
>
>Another stray thought (or is it really stray?) - Would you consider biological 
>nerve pulses (pulse, no pulse) a form of digital coding and data transfer?  
>That would be a great way to approach the input from the environment...

I think so...  But not so much as a discrete signal with a fixed number of
states as a continuous one (sorry if the terminology is incorrect or
weak here).  This would allow and even force symbols (by symbols I mean a
series of nerve impulses in various parts of the nervous system which stand
for some environmental stimulus) to mutate - perhaps a sort of neural
compression algorithm :).
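One way to picture this "mutation" in a few lines of code.  Everything here
is an invented toy illustration - the vector encoding, the noise level, and
the cycle count are all made up - but it shows the idea: treat a symbol as a
continuous pattern of activity, and each recall/re-storage as a slightly
lossy re-encoding, so the pattern drifts over time.

```python
import random

random.seed(1)  # fixed seed so the drift below is reproducible

def reencode(symbol, noise=0.05):
    """Pass a 'symbol' (a vector of continuous activations) through one
    noisy re-encoding step; repeated passes let the pattern mutate."""
    return [x + random.uniform(-noise, noise) for x in symbol]

symbol = [0.2, 0.9, 0.4]   # some environmental stimulus, encoded
for _ in range(100):       # a hundred recall/re-storage cycles
    symbol = reencode(symbol)

# After many cycles the pattern has drifted from the original:
print(symbol)
```

With a discrete signal and error-free copying, no such drift would occur;
continuity is what lets the representation wander into new territory.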

>Roberto
>RPB0804@TAMVENUS

I'll explain any of this further if anyone so desires...  I think I've left
quite a few gaps up there.  Flames anxiously welcome...
--
Tory S. Toupin                         |
ttoupin@diana.cair.du.edu              | Existence toward perfection...
University of Denver                   |     Life of mediocrity. 
Undergraduate: Math & Computer Sciences| 
Denver, CO  80208                      |             -Tory Toupin

-----
This is not a <<.signature>> file

cs196006@cs.brown.edu (Josh Hendrix) (03/23/91)

In article <13577@helios.TAMU.EDU>, rpb0804@venus.tamu.edu (BATES, ROBERT PATRICK) writes:
|> I dunno about this philosophy about intelligence being inbred or evolving.  I
|> personally feel that the brain is a large, organic macrocomputer.  After all,

I just can't let this go by without a response. Let me say at the outset that I am not trying to get into a flame war over AI vs. neural nets, but I disagree with what you say here (although "...I would defend to the death your right to say it," a quote usually attributed to Voltaire). 

Here's where I disagree:
1. The brain is indeed large (considering the size of its constituent parts, and compared to the brains of other animals), and it is most definitely organic. I have a problem, however, with any sentence, phrase, or other semantically meaningful arrangement that contains the words 'brain' and 'computer' within 10 words of each other, unless the words 'is not a' fall in between :-).

First, nerve cells are actually slow, firing on the order of 10,000 times/second, while the cycles per second of your average Sun workstation are measured in the millions. Second, this means that, to do all the things that we effortlessly do (walk around, avoid things, etc.), the 'computer' in our skulls has to be massively parallel. Third, this massive parallelism means that the neurons, the 'computing elements', are hooked together very differently from the way CPUs, busses, and chips are. Everything that happens in your workstation happens one 'operation' at a time, whereas all of the 'computation' that is going on to allow you to see the screen right now is happening at the same time.

Of course, there's a lot of debate about that last sentence, so I do not claim here to be the final word. But that's what seems to me to be going on.

|> it stores data in "chips" (cells) using electrical currents.  It constantly 

Again, I disagree. Cells may contain information of their own (DNA), but the information that makes up a thought is stored in the connections of many cells.

...Stuff deleted...
 
|> I think that if a system were created with enough storage capacity to hold all 
|> the data it could accumulate through its "senses" (visual, audio, tactile, 
|> etc.) and a sufficient algorithm to process the data, reflexes could be 
|> conditioned.  Independent actions could result from a "library" of reflexes
|> (that's what WE do, after all!).  This would result in an organism as 
|> intelligent as the technology supporting it.  It just so happens that we 
|> haven't caught up with ourselves yet, hmm?

Yes, I suppose if you had the time, money, a fast enough CPU (which, I would wager, does not exist yet), and programmers (lots) you might be able to build something that, say, kicks you when you tap its knee. But the CPU would have to be wicked fast to handle things like processing visual data quickly enough to pattern-match an orange blur with the concept 'trash-can', calculating the current trajectory of its body's center of gravity, and planning a route around the obstacle, all in under a second. Yet people do this with ease. 

If I am reading this into your words, forgive me, but it seems as though you intend all of this in a linear, sequential paradigm, which is not the case with the brain. Furthermore, why bother 'conditioning' when you could just program a reaction in? 


|> Another stray thought (or is it really stray?) - Would you consider biological 
|> nerve pulses (pulse, no pulse) a form of digital coding and data transfer? 

There are arguments (that I can only relate, I don't know enough to confirm or deny them) to the effect that a cell is really a voltage-to-frequency converter, and that the frequencies get re-converted to voltage potentials at the other side of the synapse. I would argue that the brain is not a binary thing.
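That voltage-to-frequency idea can be caricatured in a few lines.  To be
clear, the linear mapping and the constants below (threshold, gain, synaptic
weight) are invented for illustration - this is not a model of real neurons,
just the shape of the argument that the signal is graded rather than binary:

```python
def voltage_to_rate(v, threshold=-55.0, gain=10.0):
    """Map a membrane voltage (mV) to a firing rate (spikes/s).
    Linear-above-threshold is a crude stand-in for the real curve."""
    return max(0.0, gain * (v - threshold))

def rate_to_voltage(rate, weight=0.05):
    """At the far side of the synapse, the spike rate is integrated
    back into a graded potential change (again, a caricature)."""
    return weight * rate

v_in = -50.0                      # presynaptic voltage, mV
rate = voltage_to_rate(v_in)      # 50.0 spikes/s
dv = rate_to_voltage(rate)        # 2.5 mV postsynaptic change
print(rate, dv)
```

The point is that the quantity carrying the information - the rate - can
take any value in a continuous range, which is why "pulse, no pulse" alone
doesn't make the brain a binary machine.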


|> If I'm a little behind my time in this concept, shut me up!  But I couldn't 
|> resist jumping into a good AI philosophical discussion...
|> 

I don't intend this as a flame or to try to 'shut you up'. I just come from a different angle and I thought I'd share it. 

|> Roberto
|> RPB0804@TAMVENUS

Josh Hendrix
cs196006@brownvm.brown.edu

txo6870@cs.rit.edu (Timothy X Oertel) (03/24/91)

In article <69649@brunix.UUCP> cs196006@cs.brown.edu (Josh Hendrix) writes:
<In article <13577@helios.TAMU.EDU>, rpb0804@venus.tamu.edu (BATES, ROBERT PATRICK) writes:
<|> I dunno about this philosophy about intelligence being inbred or evolving.  I
<|> personally feel that the brain is a large, organic macrocomputer.  After all,

[stuff deleted]

<Here's where I disagree:
<1. The brain is indeed large (considering the size of its constituent parts, and compared to the brains of other animals), and it is most definitely organic. I have a problem, however, with any sentence, phrase, or other semantically meaningful arrangement that contains the words 'brain' and 'computer' within 10 words of each other, unless the words 'is not a' fall in between :-).
<
<First, nerve cells are actually slow, firing on the order of 10,000 times/second, while the cycles per second of your average Sun workstation is measured in the millions. Second, this means that, to do all the things that we effortlessly do, (walk around, avoid things, etc.) the 'computer' in our skulls have to be massively parallel. Third, this massive parallelism means that the neurons, the 'computing elements', are hooked together very differently from the way CPU's, busses, and chips are. Everyth
[stuff eaten due to lines too large]

While, as far as I know, the above is mostly true, just because there is no
one-to-one relationship between processors and neurons doesn't mean that a
group of neurons couldn't be modeled by a single processor and a brain modeled
by a large group of processors.  Then again, the memory and processing
required would be mighty monstrous.  I've seen figures that the number of
neurons in the brain is ~1-4 billion and that each, on average (presumably),
has ~1 million connections.  No kidding massively parallel... :)  
  Also, in modeling neural networks, people normally pay attention to a
100-step limit: since we respond in about a second and neurons are slow, no
computation, to be "neurologically correct", can exceed roughly 100 serial
steps.  So, by what we've said, we might need an order of magnitude or so
increase in processor speed AND THEN, to deal with all the computation, we
would need one processor per neuron... goodness.  Seems almost a tad
hopeless for the near future.
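A quick back-of-the-envelope check using the figures quoted above (which, as
noted, are secondhand and should be treated as illustrative, not
authoritative) shows just how monstrous the numbers get:

```python
# Figures from the discussion above - illustrative only.
neurons = 2e9            # middle of the quoted ~1-4 billion range
connections_each = 1e6   # connections per neuron, as quoted
steps_per_second = 100   # the "100-step" serial limit, per second

total_connections = neurons * connections_each
# If every connection is updated once per serial step:
updates_per_second = total_connections * steps_per_second

print(f"total connections:  {total_connections:.0e}")   # 2e+15
print(f"updates per second: {updates_per_second:.0e}")  # 2e+17
```

Even granting these rough inputs, ~10^17 connection-updates per second is
far beyond any workstation of the day, which is the "goodness" above.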

<at happens in your workstation happens one 'operation' at a time, whereas all of the 'computation' that is going on to allow you to see the screen right now is happening at the same time.
<
<Of course, there's a lot of debate about that last sentence, so I do not claim here to be the final word. But that's what seems to me to be going on.
<
<|> it stores data in "chips" (cells) using electrical currents.  It constantly 
<
<Again, I disagree. Cells may contain information of their own (DNA), but the information that makes up a thought is stored in the connections of many cells.
<

Although the thought is distributed, each cell DOES hold a certain amount of
nonspecific information, in its electrical potential and in the strengths of
its connections to other neurons.

[stuff deleted]

<|> Roberto
<|> RPB0804@TAMVENUS
<
<Josh Hendrix
<cs196006@brownvm.brown.edu

I admit that the total of my NN knowledge is by no means state of the
art, but I thought I'd toss in my two cents.  I won't say more since
this is philosophy, not quite NN's.


---> Tim Oertel
txo6870@ultb.isc.rit.edu

velasco@beowulf.ucsd.edu (Gabriel Velasco) (03/24/91)

cs196006@cs.brown.edu (Josh Hendrix) writes:

>First, nerve cells are actually slow, firing on the order of 10,000
>times/second, while the cycles per second of your average Sun
>workstation is measured in the millions. 

What does the speed of the computational element have to do with
whether or not the system is a computer?  How fast do you think Babbage's
Difference Engine would have run?

>Second, this means that, to do all the things that we effortlessly do,
>(walk around, avoid things, etc.) the 'computer' in our skulls have to
>be massively parallel.

Again, how does this affect the comment that the brain is a computer?
Are you saying it can't be a computer if it's massively parallel?

>Third, this massive parallelism means that the neurons, the 'computing
>elements', are hooked together very differently from the way CPU's,
>busses, and chips are. Everyth at happens in your workstation happens
>one 'operation' at a time, whereas all of the 'computation' that is
>going on to allow you to see the screen right now is happening at the
>same time.

I don't think that the type of connectivity determines whether or not a
system is a computer.  The second part of your statement is not
entirely true either.  The brain apparently goes through states just
like "regular" computers.

>Of course, there's a lot of debate about that last sentence, so I do
>not claim here to be the final word. But that's what seems to me to be
>going on.

What seems to be going on is that the brain moves through states.  This
has been shown through semantic net experiments where people take
longer to answer questions that require them to traverse longer paths
through the semantic net.
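The experiments mentioned can be illustrated with a toy graph.  The little
net below (the classic canary/bird/animal hierarchy) and the claim that
answer time grows with hop count are my sketch of the idea, not the actual
experimental data:

```python
from collections import deque

# A toy semantic net: "a canary is a bird, a bird is an animal, ..."
links = {
    "canary": ["bird"],
    "bird":   ["animal", "can fly"],
    "animal": ["can breathe"],
}

def hops(start, target):
    """Breadth-first search; returns the number of links traversed,
    or None if no path exists."""
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for nxt in links.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

# "Is a canary a bird?" is one link away; "Can a canary breathe?" is
# three, and subjects take measurably longer to answer the latter.
print(hops("canary", "bird"))         # 1
print(hops("canary", "can breathe"))  # 3
```

If the brain simply held everything at once with no traversal, there would
be no reason for the longer questions to take longer - hence the claim that
it moves through states.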

>Again, I disagree. Cells may contain information of their own (DNA),
>but the information that makes up a thought is stored in the
>connections of many cells.

How does this make the brain a non-computer?

>Yes, I suppose if you had the time, money, fast enough CPU (which, I
>would wager, does not exist yet), and programmers (lots) you might be
>able to build something that, say, kicks you when you tap its knee. 

Earlier, you stated how slow the computational elements of the brain
are.  Apparently we don't need a fast CPU.  We probably need a huge number
of possibly slow ones connected in a similar manner.

-- 
                              ________________________________________________
 <>___,     /             /  | ... and he called out and said, "Gabriel, give |
 /___/ __  / _  __  ' _  /   | this man an understanding of the vision."      |
/\__/\(_/\/__)\/ (_/_(/_/|_  |_______________________________________Dan_8:16_|

reh@wam.umd.edu (Richard E. Huddleston) (03/24/91)

In article <13577@helios.TAMU.EDU> rpb0804@venus.tamu.edu writes:
 
>              Independent actions could result from a "library" of reflexes
>(that's what WE do, after all!).  This would result in an organism as
>intelligent as the technology supporting it.  It just so happens that we
>haven't caught up with ourselves yet, hmm?
 
One question here, I think.  Since evolution is essentially a response to the
environment, and it is therefore impossible for a species to develop past
the point where the environment 'pushes' it, you seem to be implying that
either the theory of evolution is whacked or that evolution has given us
greater capability than we've learned what to do with.  These are quite
different predicates.  Can you clarify what you mean?
 
Thanks,
 
Richard