[comp.ai.philosophy] Testing Intelligence

pnettlet@gara.une.oz.au (Philip Nettleton) (11/30/90)

I thought it was about time I reposted the general requirements for
determining whether a system (automated or biological) is intelligent,
developed over the last six months on comp.ai and comp.ai.philosophy.
This definition is unbiased in its assessment of the system and uses
standard terms to describe the system's behavioural characteristics.

Constructive criticism will be used to further enhance this set of
requirements; flames will be ignored (i.e., treated with the contempt
they deserve).

----------------------------------------------------------------------
			DEFINITION:
	GENERAL REQUIREMENTS OF AN INTELLIGENT SYSTEM.

a)	The system MUST be able to learn. This implies that the
	system MUST have a memory for learning to be maintained.
	Also learning comes in a number of varieties:

	i)	It MUST be able to learn from its own experiences.
		These can be broken down into further criteria:

		1)	Learning through trial and error (see the
			sketch following these requirements).
		2)	Learning through observation.
		3)	Learning through active reasoning.

	ii)	It SHOULD be able to learn by instruction, but this
		is not necessary. At the very least the system MUST
	have preprogrammed instincts. This is a bootstrap
		for the developing intelligence.  Without a starting
		point, the system cannot progress.

b)	The system MUST be autonomous. That is to say, it MUST be
	able to do things by itself (though it may choose to accept
	aid).  This can be dissected as:

	i)	The system MUST be able to affect its environment
		based on its own independent conclusions.

	ii)	The system MUST be its own master first and foremost,
		and therefore not require operator intervention to
		function. This does not necessarily rule out the
		taking of orders from another system, but the choice
		to obey MUST be made by the system itself.

	iii)	The system MUST be motivated. It must have needs and
	requirements that can be satisfied by its own
		actions.

c)	The system MUST be able to reason. That is to say, it must
	use some form of reasoning, based on known facts and capable
	of producing insights which later become known facts. It
	should be noted that the degree of certainty about the truth
	of a known fact is also an important concept and some way of
	dealing with uncertainty MUST be provided.

d)	The system MUST be able to develop self awareness. This is
	related to autonomy, reasoning and learning, but also
	embodies the need for internal and external senses. Without
	these senses there is no way of appreciating the difference
	between "me" and "outside of me". Sensations of pain and
	pleasure can provide motivation.
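
As an illustration of requirement a)i)1, here is a minimal sketch, in
Python, of one possible trial and error learner. The action set, the
reward signal and the learning rate are assumptions of the example,
not part of the definition:

import random

class TrialAndErrorLearner:
    # Requirement a: the value table is the memory in which
    # learning is maintained.
    def __init__(self, actions, explore=0.1):
        self.values = {a: 0.0 for a in actions}
        self.explore = explore

    def choose(self):
        # Mostly exploit what has worked; occasionally trial
        # something new, and accept the resulting errors.
        if random.random() < self.explore:
            return random.choice(list(self.values))
        return max(self.values, key=self.values.get)

    def learn(self, action, reward, rate=0.2):
        # Error-driven update: nudge the estimate toward the
        # observed outcome.
        self.values[action] += rate * (reward - self.values[action])

Nothing here reasons or is self aware; it merely learns from its own
experience, which is all that a)i)1 asks.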
----------------------------------------------------------------------
			DEFINITION OF TERMS.

1)	A "system" CAN be composed of multiple subsystems, each of
	which could be a system in its own right (systems theory).

2)	The "environment" in which the system exists MUST be external
	to the system, but that is as far as the definition of the
	environment goes (it could be computer generated).

3)	The terms "learning", "reasoning" and "autonomy" are
	BEHAVIOURAL characteristics, further supported by our
	understanding (to date) of how they MIGHT work.

4)	The term "self awareness" is based on learning, reasoning
	and autonomy, and is the state where the system is aware
	(has knowledge) of its own existence as separate from its
	environment.

5)	"Intelligence" is a BEHAVIOURAL phenomenon displayed by
	intelligent systems.

6)	"Truth" about a known fact is SUBJECTIVE with respect to the
	system. Ultimate truth is an ideal which is seldom
	achievable even in "human intelligence".

7)	"Certainty" is a statistical measure of the probability of
	a fact being true.
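
As an illustration of term 7, if certainty is the probability of a
fact being true, Bayes' rule gives one standard way to revise it as
evidence arrives. A minimal sketch in Python, with likelihood numbers
invented for the example:

def update_certainty(prior, p_obs_if_true, p_obs_if_false):
    # P(fact | obs) = P(obs | fact) * P(fact) / P(obs)
    p_obs = p_obs_if_true * prior + p_obs_if_false * (1.0 - prior)
    return p_obs_if_true * prior / p_obs

# A fact held with certainty 0.5, plus an observation three times
# likelier if the fact is true, yields a revised certainty of 0.75.
print(update_certainty(0.5, 0.6, 0.2))   # -> 0.75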
----------------------------------------------------------------------

				Regards,

						Philip Nettleton,
						Tutor in Computer Science,
						University of New England,
						Armidale,
						New South Wales,
						2351,
						AUSTRALIA.

greenba@gambia.crd.ge.com (ben a green) (11/30/90)

In article <4832@gara.une.oz.au> pnettlet@gara.une.oz.au (Philip Nettleton) writes:
   ----------------------------------------------------------------------
			   DEFINITION:
	   GENERAL REQUIREMENTS OF AN INTELLIGENT SYSTEM.

   a)	The system MUST be able to learn. This implies that the
	   system MUST have a memory for learning to be maintained.
	   Also learning comes in a number of varieties:

	   i)	It MUST be able to learn from its own experiences.
		   These can be broken down into further criteria:

		   1)	Learning through trial and error.
		   2)	Learning through observation.
		   3)	Learning through active reasoning.

	   ii)	It SHOULD be able to learn by instruction, but this
		   is not necessary. At the very least the system MUST
	   have preprogrammed instincts. This is a bootstrap
		   for the developing intelligence.  Without a starting
		   point, the system cannot progress.

   b)	The system MUST be autonomous. That is to say, it MUST be
	   able to do things by itself (though it may choose to accept
	   aid).  This can be dissected as:

	   i)	The system MUST be able to affect its environment
		   based on its own independent conclusions.

	   ii)	The system MUST be its own master first and foremost,
		   and therefore not require operator intervention to
		   function. This does not necessarily rule out the
		   taking of orders from another system, but the choice
		   to obey MUST be made by the system itself.

	   iii)	The system MUST be motivated. It must have needs and
		   requirements that can be satisfied by its own
		   actions.

   c)	The system MUST be able to reason. That is to say, it must
	   use some form of reasoning, based on known facts and capable
	   of producing insights which later become known facts. It
	   should be noted that the degree of certainty about the truth
	   of a known fact is also an important concept and some way of
	   dealing with uncertainty MUST be provided.

   d)	The system MUST be able to develop self awareness. This is
	   related to autonomy, reasoning and learning, but also
	   embodies the need for internal and external senses. Without
	   these senses there is no way of appreciating the difference
	   between "me" and "outside of me". Sensations of pain and
	   pleasure can provide motivation.
   ----------------------------------------------------------------------
			   DEFINITION OF TERMS.

   (see original)


A most interesting and helpful posting!

IMHO paragraphs a and b are non-controversial, but paragraphs c and d
would rule out, say, cats, since reasoning and self awareness in any
non-trivial senses require language.

--
Ben A. Green, Jr.              
greenba@crd.ge.com
  Speaking only for myself, of course.

cpshelley@violet.uwaterloo.ca (cameron shelley) (12/01/90)

In article <GREENBA.90Nov30092227@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:

[...]

>
>A most interesting and helpful posting!
>
>IMHO paragraphs a and b are non-controversial, but paragraphs c and d
>would rule out, say, cats, since reasoning and self awareness in any
>non-trivial senses require language.
>

How do you figure that?  Do you mean a mental language?  If so, what
do you consider 'mentalese' to be like?


--
      Cameron Shelley        | "Logic, n.  The art of thinking and reasoning
cpshelley@violet.waterloo.edu|  in strict accordance with the limitations and
    Davis Centre Rm 2136     |  incapacities of the human misunderstanding..."
 Phone (519) 885-1211 x3390  |				Ambrose Bierce

greenba@gambia.crd.ge.com (ben a green) (12/01/90)

In article <1990Nov30.180650.26648@watdragon.waterloo.edu> cpshelley@violet.uwaterloo.ca (cameron shelley) writes:

   In article <GREENBA.90Nov30092227@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:

   > ...
   >reasoning and self awareness in any
   >non-trivial senses require language.
   >

   How do you figure that?  Do you mean a mental language?  If so, what
   do you consider 'mentalese' to be like?

No, not a mental language. An actual, socially derived language. 
What is reasoning without talking to oneself, or actually writing to
oneself? We do this all the time when reasoning with tough problems.

Now cats can solve tough problems, but there is no way to classify
their performance as reasoning beyond just the statement that they
solve the problems. When we humans reason, we clearly use language.

Self awareness is more subtle and perhaps here I am relying on an
unpopular position that self awareness is learned by interacting with
other people. This is not really as strange as it may seem.  Haven't
you often heard therapists say that a large part of their task is to
help the client "get in touch with his feelings"?  Especially men who
don't talk much about their feelings, or realize that they have them.
The therapy is talking and probing with questions, which requires
language.

In another context, how do we teach children to be self aware?  It
seems natural to me to say that children see colored objects without
necessarily seeing colors, as such, before we teach them to name their
colors. It is an even greater achievement for them to see that they
are seeing. We ask "Do you see that bird?" (a probing question like
what is described between the therapist and the client). The pressure
of the question in the circumstance leads the child to recognize that,
yes, he is seeing something.

These are not ideas original with me, but the source is certainly
out of fashion nowadays. Someday ...


--
Ben A. Green, Jr.              
greenba@crd.ge.com
  Speaking only for myself, of course.

cpshelley@violet.uwaterloo.ca (cameron shelley) (12/01/90)

In article <GREENBA.90Nov30154938@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
>In article <1990Nov30.180650.26648@watdragon.waterloo.edu> cpshelley@violet.uwaterloo.ca (cameron shelley) writes:
>
>   In article <GREENBA.90Nov30092227@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
>
>   > ...
>   >reasoning and self awareness in any
>   >non-trivial senses require language.
>   >
>
>   How do you figure that?  Do you mean a mental language?  If so, what
>   do you consider 'mentalese' to be like?
>
>No, not a mental language. An actual, socially derived language. 

I don't mean to be dense, although you may think otherwise :>, but
I'm not sure what a "socially derived" language is either.  Certainly
a large component of human language -- most of it -- is acquired through
social interaction, but is language performance the only real measure
of intelligence?  What about innateness?  Without a bootstrap, no
performance would be possible.

>What is reasoning without talking to oneself, or actually writing to
>oneself?

I give up! :>  This is a contentious issue for sure -- do you have an
answer?

> We do this all the time when reasoning with tough problems.
>

Yes, *we* do.  But does that establish a necessary condition for
*any* form of intelligence?  I am assuming that the poster of the 
summary was directing his comments to intelligence in general, as
has been discussed here, and not limiting them to humans.  I readily
admit that humans are the sole example of that level of intelligence
available for study, but we can still attempt to generalize.

>Now cats can solve tough problems, but there is no way to classify
>their performance as reasoning beyond just the statement that they
>solve the problems. When we humans reason, we clearly use language.
>

These two statements don't produce much of a distinction in my mind.
All you've done is use different vocabulary in describing cats'
abilities and humans'.  How does a person's use of language allow
us to say more about them than about the cat if they're given the same
'tough problem' and both succeed, say?  Do you also mean that if a
person is not using language, he or she is not reasoning?  In 
other words, I still don't see the line you're trying to draw.

>Self awareness is more subtle and perhaps here I am relying on an
>unpopular position that self awareness is learned by interacting with
>other people. This is not really as strange as it may seem.  Haven't
>you often heard therapists say that a large part of their task is to
>help the client "get in touch with his feelings"?  Especially men who
>don't talk much about their feelings, or realize that they have them.
>The therapy is talking and probing with questions, which requires
>language.
>

I've heard of this many times on TV sitcoms and the like, and I still
don't see how it shows what you stated earlier.  Here, you appear
to be associating "self-awareness" (which has a great deal to do
with interaction) with 'sensitivity', and I'm forced to ask what this
has to do with your contention that intelligence requires what we
would recognize as language.  I don't think the existence of
psychotherapy is proof of this.

>In another context, how do we teach children to be self aware?  It
>seems natural to me to say that children see colored objects without
>necessarily seeing colors, as such, before we teach them to name their
>colors. 

It may seem natural to say it, but it doesn't have much basis that
I can see.  What justification is there for this?

>It is an even greater achievement for them to see that they
>are seeing. We ask "Do you see that bird?" (a probing question like
>what is described between the therapist and the client) The pressure
>of the question in the circumstance leads the child to recognize that,
>yes, he is seeing something.
>
>These are not ideas original with me, but the source is certainly
>out of fashion nowadays. Someday ...
>
>
Well, out of fashion or not, could you indicate it?  I'm not attempting
to deny there's a difference between levels of intelligence, but I am
still unsure of what connection you are making between this and having
an anthropomorphic existence and language.     
--
      Cameron Shelley        | "Logic, n.  The art of thinking and reasoning
cpshelley@violet.waterloo.edu|  in strict accordance with the limitations and
    Davis Centre Rm 2136     |  incapacities of the human misunderstanding..."
 Phone (519) 885-1211 x3390  |				Ambrose Bierce

greenba@gambia.crd.ge.com (ben a green) (12/03/90)

The story up to last week:

Philip Nettleton recently posted a summary of what he thinks this
newsgroup regards as requirements on a system for it to be classified 
as intelligent. In brief,

a) The system must be able to learn from experience.
b) The system must be autonomous, independent of any operator.
c) The system must be able to reason.
d) The system must be able to develop self awareness.

I responded with the remark that items c and d would tend to rule
out the system _Felis domesticus_ as intelligent, since the items
would require language and cats don't have it.

Cameron Shelley asked me to explain what I meant by language and why
reasoning and self-awareness require it.

I responded that I referred to ordinary, socially derived language,
and that reasoning is basically talking to oneself, and that self
awareness is taught to us by other people by means of language.

So ...


In article <1990Dec1.020816.1372@watdragon.waterloo.edu> cpshelley@violet.uwaterloo.ca (cameron shelley) writes:

   I don't mean to be dense, although you may think otherwise :>, but
   I'm not sure what a "socially derived" language is either.  Certainly
   a large component of human language -- most of it -- is acquired through
   social interaction, but is language performance the only real measure
   of intelligence?  ... I am assuming that the poster of the 
   summary was directing his comments to intelligence in general, as
   has been discussed here, and not limiting them to humans.  I readily
   admit that humans are the sole example of that level of intelligence
   available for study, but we can still attempt to generalize.

My point, exactly. No, I don't think language performance is the only
real measure of intelligence. I would say that cats ARE intelligent.
That is why I objected to items c and d. 

But that doesn't mean that cats REASON.  This subject is hard to
discuss because there is no accepted vocabulary of technical terms to
use. I guess I base my definition of reasoning on experience with
logic and mathematics, which are based on manipulation of symbols in
the manner of languages.

   >Self awareness is more subtle and perhaps here I am relying on an
   >unpopular position that self awareness is learned by interacting with
   >other people. This is not really as strange as it may seem.  Haven't
   >you often heard therapists say that a large part of their task is to
   >help the client "get in touch with his feelings"?  Especially men who
   >don't talk much about their feelings, or realize that they have them.
   >The therapy is talking and probing with questions, which requires
   >language.
   >

   I've heard of this many times on TV sitcoms and the like, and I still
   don't see how it shows what you stated earlier.  Here, you appear
   to be associating "self-awareness" (which has a great deal to do
   with interaction) with 'sensitivity', and I'm forced to ask what this
   has to do with your contention that intelligence requires what we
   would recognize as language.  I don't think the existence of
   psychotherapy is proof of this.

Again, I don't think that intelligence requires language, but I do
think that we could not learn self awareness without language --
certainly not easily. (This is why I question the inclusion of item d
of Nettleton's summary.)  For a complete statement of the argument,
see the paper "Behaviorism at Fifty" by B.F. Skinner. It originally
appeared in Science magazine in the early '60s. It was reprinted in
_Behaviorism and Phenomenology_, edited by T. W. Wann, and in one of
Skinner's later books, and most recently in _The Behavioral and Brain
Sciences_, Vol.  7, Number 4, pp. 615-620 (1984) with peer commentary.

It is ironic that most readers of this newsgroup will likely think that
B.F. Skinner denies the existence of consciousness, while in truth, he
has made IMHO the greatest contribution to its understanding. Skinner
has been the victim of the big lie for many decades now.



--
Ben A. Green, Jr.              
greenba@crd.ge.com
  Speaking only for myself, of course.

cpshelley@violet.uwaterloo.ca (cameron shelley) (12/04/90)

In article <GREENBA.90Dec3091624@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
>
>The story up to last week:
>
>Philip Nettleton recently posted a summary of what he thinks this
>newsgroup regards as requirements on a system for it to be classified 
>as intelligent. In brief,
>
>a) The system must be able to learn from experience.
>b) The system must be autonomous, independent of any operator.
>c) The system must be able to reason.
>d) The system must be able to develop self awareness.
>
>I responded with the remark that items c and d would tend to rule
>out the system _Felis domesticus_ as intelligent, since the items
>would require language and cats don't have it.
>
[...]

>My point, exactly. No, I don't think language performance is the only
>real measure of intelligence. I would say that cats ARE intelligent.
>That is why I objected to items c and d. 
>
>But that doesn't mean that cats REASON.  This subject is hard to
>discuss because there is no accepted vocabulary of technical terms to
>use. I guess I base my definition of reasoning on experience with
>logic and mathematics, which are based on manipulation of symbols in
>the manner of languages.
>

Then this is where we differ.  The limitation of the terms "reason" and
"self awareness" to _homo sapiens_ I find too anthropomorphic.  It 
implies a very sharp dividing line between our abilites and those of
of other (somewhat) intelligent animals which I don't see justified.
However, as you point out, we may be just arguing over terminology.

[...]

>
>Again, I don't think that intelligence requires language, but I do
>think that we could not learn self awareness without language --
>certainly not easily. 

I think a lot hangs by that qualification, which I don't recall from
your previous posting.  It is possible to learn a division between
oneself and the world by observing that one can will one's limbs
to move (and it usually works), but that one cannot simply
will external objects to move and have any effect.  Certainly language,
in tandem with culture, is a much more powerful way to form a concept
of place and separation, but it is not exclusive.  The first
method I gave here is available to cats and many other organisms with
sufficient intelligence to make the observation I talked about.
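
To make that concrete, here is a toy sketch in Python; the 0/1 sensor
model and the agreement threshold are assumptions of the example,
nothing more:

def classify(commands, changes, threshold=0.9):
    # Label a sensed thing "self" if its observed movement tracks
    # the willed motor commands, "world" otherwise.
    agree = sum(c == m for c, m in zip(commands, changes))
    return "self" if agree / len(commands) >= threshold else "world"

willed = [1, 0, 1, 1, 0, 1]
limb   = [1, 0, 1, 1, 0, 1]   # moves when willed: part of "me"
stone  = [0, 0, 1, 0, 0, 0]   # indifferent to willing: "outside of me"
print(classify(willed, limb))    # -> self
print(classify(willed, stone))   # -> world

Note that no language is involved anywhere in the procedure.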

>(This is why I question the inclusion of item d
>of Nettleton's summary.)  

You should concurrently consider questioning your reading of what item d
means.  I think it is correct if the terminology is understood as being
more general than just referring to humans.

>For a complete statement of the argument,
>see the paper "Behaviorism at Fifty" by B.F. Skinner. It originally
>appeared in Science magazine in the early '60s. It was reprinted in
>_Behaviorism and Phenomenology_, edited by T. W. Wann, and in one of
>Skinner's later books, and most recently in _The Behavioral and Brain
>Sciences_, Vol.  7, Number 4, pp. 615-620 (1984) with peer commentary.
>
>It is ironic that most readers of this newsgroup will likely think that
>B.F. Skinner denies the existence of consciousness, while in truth, he
>has made IMHO the greatest contribution to its understanding. Skinner
>has been the victim of the big lie for many decades now.
>

You seem to finish your postings with some sinister allusion which
remains unqualified.  Exactly what "big lie" do you have in mind?  Who
has benefitted from it?

--
      Cameron Shelley        | "Logic, n.  The art of thinking and reasoning
cpshelley@violet.waterloo.edu|  in strict accordance with the limitations and
    Davis Centre Rm 2136     |  incapacities of the human misunderstanding..."
 Phone (519) 885-1211 x3390  |				Ambrose Bierce

greenba@gambia.crd.ge.com (ben a green) (12/05/90)

In article <1990Dec3.192057.9050@watdragon.waterloo.edu> cpshelley@violet.uwaterloo.ca (cameron shelley) writes:

   In article <GREENBA.90Dec3091624@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
   >
   >The story up to last week:
   >
   >Philip Nettleton recently posted a summary of what he thinks this
   >newsgroup regards as requirements on a system for it to be classified 
   >as intelligent. In brief,
   >
   >a) The system must be able to learn from experience.
   >b) The system must be autonomous, independent of any operator.
   >c) The system must be able to reason.
   >d) The system must be able to develop self awareness.
   >
   >I responded with the remark that items c and d would tend to rule
   >out the system _Felis domesticus_ as intelligent, since the items
   >would require language and cats don't have it.
   >
   [...]

   >My point, exactly. No, I don't think language performance is the only
   >real measure of intelligence. I would say that cats ARE intelligent.
   >That is why I objected to items c and d. 
   >
   >But that doesn't mean that cats REASON.  This subject is hard to
   >discuss because there is no accepted vocabulary of technical terms to
   >use. I guess I base my definition of reasoning on experience with
   >logic and mathematics, which are based on manipulation of symbols in
   >the manner of languages.

   Then this is where we differ.  The limitation of the terms "reason" and
   "self awareness" to _homo sapiens_ I find too anthropomorphic.  It 
   implies a very sharp dividing line between our abilites and those of
   of other (somewhat) intelligent animals which I don't see justified.
   However, as you point out, we may be just arguing over terminology.

What I am trying to say, perhaps poorly, is not that reason and self-awareness
are by definition limited to humans, but that they require language.
Reasoning involves explicit use of language. Self-awareness needs 
language in its learning. The limitation to humans follows from the fact
that only humans have highly developed languages. Maybe Koko (the gorilla)
has enough language for both. I hope so.

My proposal is: drop requirements c and d. Items a and b are enough.

   You should concurrently consider questioning your reading of what item d
   means.  I think it is correct if the terminology is understood as being
   more general than just referring to humans.

But I don't read it as referring to just humans. I argue that to develop
self awareness requires language, since it is socially learned. The
limitation to humans is a conclusion, not a premise.

   You seem to finish your postings with some sinister allusion which
   remains unqualified.  Exactly what "big lie" do you have in mind?  Who
   has benefitted from it?

Well, I didn't mean to be sinister, and perhaps "lie" was a bad choice
of words. But certainly there are many textbooks that have all behaviorists
denying the obvious facts of consciousness. And then there is Chomsky's
review of _Verbal Behavior_, which missed the point of the whole book
and yet turned many away from Skinner. Nobody much has benefitted, but
IMHO those who have been misled have suffered the loss of Skinner's
contributions to the understanding of consciousness.



--
Ben A. Green, Jr.              
greenba@crd.ge.com
  Speaking only for myself, of course.

yamauchi@cs.rochester.edu (Brian Yamauchi) (12/05/90)

In article <GREENBA.90Dec4112748@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
      In article <GREENBA.90Dec3091624@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
      >
      >Philip Nettleton recently posted a summary of what he thinks this
      >newsgroup regards as requirements on a system for it to be classified 
      >as intelligent. In brief,
      >
      >a) The system must be able to learn from experience.
      >b) The system must be autonomous, independent of any operator.
      >c) The system must be able to reason.
      >d) The system must be able to develop self awareness.

   What I am trying to say, perhaps poorly, is not that reason and self-awareness
   are by definition limited to humans, but that they require language.
   Reasoning involves explicit use of language. Self-awareness needs 
   language in its learning. The limitation to humans follows from the fact
   that only humans have highly developed languages.

While I would agree that some form of language is required for logical
reasoning, I don't believe this is the case for self awareness.  What
is required for self awareness is the ability to perceive the world as
separate from the individual, and the ability to generate some model
of one's self.  This requires perception, motor control, and the
ability to interact with a complex environment, but I don't think it
requires language -- unless you want to consider any knowledge
representation or data structure as an example of language.
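
For what it's worth, here is a minimal sketch in Python of a self
model in that sense -- built from nothing but motor commands,
perception and prediction, with no linguistic token anywhere. The
one-dimensional world and the hidden wall are inventions of the
example:

class EmbodiedAgent:
    def __init__(self, wall=3):
        self.self_model = {"position": 0}   # model of "me"
        self.world_model = {}               # model of "not me"
        self._wall = wall                   # the environment itself

    def _sense(self, intended):
        # Stand-in for perception: the world blocks motion at a wall.
        return min(intended, self._wall)

    def act(self, step):
        predicted = self.self_model["position"] + step
        actual = self._sense(predicted)
        if actual == predicted:
            # My prediction of my own act held: credit the self model.
            self.self_model["position"] = predicted
        else:
            # Divergence is evidence of something that is not me.
            self.world_model["obstacle_at"] = predicted
            self.self_model["position"] = actual

agent = EmbodiedAgent()
for _ in range(5):
    agent.act(1)
print(agent.self_model)    # -> {'position': 3}
print(agent.world_model)   # -> {'obstacle_at': 4}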
--
_______________________________________________________________________________

Brian Yamauchi				University of Rochester
yamauchi@cs.rochester.edu		Computer Science Department
_______________________________________________________________________________

cam@aipna.ed.ac.uk (Chris Malcolm) (12/05/90)

In article <GREENBA.90Nov30154938@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
>In article <1990Nov30.180650.26648@watdragon.waterloo.edu> cpshelley@violet.uwaterloo.ca (cameron shelley) writes:

>   In article <GREENBA.90Nov30092227@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:

>   > ...
>   >reasoning and self awareness in any
>   >non-trivial senses require language.
>   >

>   How do you figure that?  Do you mean a mental language?  If so, what
>   do you consider 'mentalese' to be like?

>No, not a mental language. An actual, socially derived language. 
>What is reasoning without talking to oneself, or actually writing to
>oneself? We do this all the time when reasoning with tough problems.

We also solve lots of tough problems without reasoning linguistic-like
at all. Sometimes, e.g. engineering problems, the visualisations can be
cast into words afterwards, albeit with difficulty. Sometimes, as with
musicians who lack a formal musical education, they can't explain the
problem or solution in words at all. People use non-linguistic-like
reasoning very successfully even in very narrow formalisable domains such
as chess, as simultaneous lightning chess displays demonstrate.

Even in cases where linguistic-like reasoning seems to be the method
used, this reasoning is hosted on a substrate of non-linguistic
abilities, such as the kind of immediate comprehension we often refer to
as "seeing".

And I haven't mentioned "action-problems" at all, such as which way to
run when crossing the road and something nasty happens. "Talk to
yourself" then and you're dead!
-- 
Chris Malcolm    cam@uk.ac.ed.aipna   +44 31 667 1011 x2550
Department of Artificial Intelligence, Edinburgh University
5 Forrest Hill, Edinburgh, EH1 2QL, UK             DoD #205

corey@dataco.UUCP (Shawn Corey) (12/05/90)

In article <GREENBA.90Nov30092227@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
[material deleted]
>IMHO paragraphs a and b are non-controversial, but paragraphs c and d
>would rule out, say, cats, since reasoning and self awareness in any
>non-trivial senses require language.

Reasoning and self awareness do _NOT_ require language; expression of these
(to other beings) requires language. Another prime example of "If it ain't
human, it ain't intelligent."

As an aside, I have seen cats that reason in an unstructured environment. My
friend had a cat and a dog. The dog was asleep at the bottom of the stairs. The
cat comes over, looks at the dog, looks up the stairs, looks at the dog, looks
up the stairs. Then it goes up the stairs. Half a minute later, it bounds
down the stairs, lands on the dog and takes off. This is the observed
behavior. My conclusion is that the cat reasoned that it could have a greater
impact by bounding down the stairs and landing on the dog than by merely
jumping on it.
-- 
+---------------+--------------------------------------------------+
| Shawn Corey   | "Never mind, Scotty, we'll test them in combat!" |
| corey@dataco  |   -- famous Starfleet Captain                    |
+---------------+--------------------------------------------------+

greenba@gambia.crd.ge.com (ben a green) (12/05/90)

In article <YAMAUCHI.90Dec4140917@heron.cs.rochester.edu> yamauchi@cs.rochester.edu (Brian Yamauchi) quotes me

      What I am trying to say, perhaps poorly, is not that reason and self-awareness
      are by definition limited to humans, but that they require language.
      Reasoning involves explicit use of language. Self-awareness needs 
      language in its learning. The limitation to humans follows from the fact
      that only humans have highly developed languages.

and adds

   While I would agree that some form of language is required for logical
   reasoning, I don't believe this is the case for self awareness.  What
   is required for self awareness is the ability to perceive the world as
   separate from the individual, and the ability to generate some model
   of one's self.  This requires perception, motor control, and the
   ability to interact with a complex environment, but I don't think it
   requires language -- unless you want to consider any knowledge
   representation or data structure as an example of language.

I don't pretend that the connection between self-awareness and language
is obvious (and no, I'm not extending the definition of language). It
follows from an analysis by Skinner of what is involved in the process
of coming to perceive things. 

There are many things around us every moment that are perceptible to us
but which we do not perceive. What we perceive is typically something that
is important to us. We say that is what we are paying attention to.
In a state of nature, what is important to most animals is safety, food,
and sex, and objects and conditions surrounding these concerns are
perceived very well. We come to perceive these objects and conditions
because we are "reinforced" for doing so.

Now our internal states certainly affect our behavior strongly, e.g.
blood sugar level vs. eating. But there is little in the natural world
(exclusive of other people) that reinforces perception of our internal states.
So we are not likely to learn to perceive our internal states in isolation.
Other people can provide contingencies under which we do learn to perceive
our internal states. They use language to do so.

This is too brief to persuade anyone, I know. Look up Skinner's paper
"Behaviorism at Fifty", most recently reprinted in _The Behavioral and
Brain Sciences_, December 1984.
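
A toy rendering of the contingency argument in Python -- the stimuli,
the weights and the update rule are inventions of the example, not
Skinner's:

# Perceiving modelled as reinforced attending: each reinforcement
# makes attending to that stimulus more probable.
weights = {"food": 0.1, "predator": 0.1, "blood_sugar": 0.1}

def reinforce(stimulus, rate=0.3):
    weights[stimulus] += rate * (1.0 - weights[stimulus])

# The natural world reinforces attending to external stimuli...
for _ in range(10):
    reinforce("food")
    reinforce("predator")

# ...but nothing in it reinforces attending to an internal state.
# Only a verbal community asking "Are you hungry?" supplies that.
reinforce("blood_sugar")

print({k: round(v, 2) for k, v in weights.items()})
# -> {'food': 0.97, 'predator': 0.97, 'blood_sugar': 0.37}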

--
Ben A. Green, Jr.              
greenba@crd.ge.com
  Speaking only for myself, of course.

greenba@gambia.crd.ge.com (ben a green) (12/06/90)

In article <309@dcsun21.dataco.UUCP> corey@dataco.UUCP (Shawn Corey) writes:

   In article <GREENBA.90Nov30092227@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
   [material deleted]
   >IMHO paragraphs a and b are non-controversial, but paragraphs c and d
   >would rule out, say, cats, since reasoning and self awareness in any
   >non-trivial senses require language.

   Reasoning and self awareness do _NOT_ require language; expression of these
   (to other beings) requires language. Another prime example of "If it ain't
   human, it ain't intelligent."

Not at all. I think cats are quite intelligent, although I don't think
they reason or are self-aware. My quarrel is with the definition of
intelligence as requiring reasoning and self-awareness. I think it is
enough if the organism learns and prospers in a range of different
hostile environments.

My difference with Shawn is in the meaning of reasoning and in the
analysis of what it takes to become self-aware, but I won't repeat
what I said in another recent posting.

--
Ben A. Green, Jr.              
greenba@crd.ge.com
  Speaking only for myself, of course.

smiller@aio.jsc.nasa.gov (Stephen Miller) (12/06/90)

In article <4832@gara.une.oz.au> pnettlet@gara.une.oz.au (Philip Nettleton) writes:
>I thought it was about time I reposted the general requirements for
>determining whether a system (automated or biological) is intelligent,
>----------------------------------------------------------------------
>			DEFINITION:
>	GENERAL REQUIREMENTS OF AN INTELLIGENT SYSTEM.
>
>a)	The system MUST be able to learn. This implies that the
>	system MUST have a memory for learning to be maintained.
>	Also learning comes in a number of varieties:
>
>	i)	It MUST be able to learn from its own experiences.
>		These can be broken down into further criteria:
>
>		1)	Learning through trial and error.
>		2)	Learning through observation.
>		3)	Learning through active reasoning.
>

Congratulations!  I am happy someone has undertaken this important job
of beginning to describe (define) intelligence.  (That is, intelligence in
a non-ethnocentric and non-anthropomorphic way.)

I have a comment, however, pertaining to the above passage:
    If you really mean "criteria", something is seriously amiss.  Do you 
mean "groupings" or "classifications"?  Assuming the latter, it still 
gives me shivers, as it is potentially restricting not only the way we
divide learning, but also what the categorical types of learning are.
    Would you settle for adding a fourth category, say "Other"?  Or 
perhaps leaving this whole "i)" breakdown out.  There are TOO MANY ways
we learn;  you haven't listed them all;  there may be other ways we haven't
identified yet, too!
    I really think you are on the right track.

.steve.

***** My comments in no way reflect the views or opinions of my employer *****

greenba@gambia.crd.ge.com (ben a green) (12/06/90)

In article <3608@aipna.ed.ac.uk> cam@aipna.ed.ac.uk (Chris Malcolm) writes:

   In article <GREENBA.90Nov30154938@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
   >In article <1990Nov30.180650.26648@watdragon.waterloo.edu> cpshelley@violet.uwaterloo.ca (cameron shelley) writes:

   >   In article <GREENBA.90Nov30092227@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:

   >   > ...
   >   >reasoning and self awareness in any
   >   >non-trivial senses require language.
   >   >

   >   How do you figure that?  Do you mean a mental language?  If so, what
   >   do you consider 'mentalese' to be like?

   >No, not a mental language. An actual, socially derived language. 
   >What is reasoning without talking to oneself, or actually writing to
   >oneself? We do this all the time when reasoning with tough problems.

   We also solve lots of tough problems without reasoning linguistic-like
   at all. Sometimes, e.g. engineering problems, the visualisations can be
   cast into words afterwards, albeit with difficulty. Sometimes, as with
   musicians who lack a formal musical education, they can't explain the
   problem or solution in words at all. People use non-linguistic-like
   reasoning very successfully even in very narrow formalisable domains such
   as chess, as simultaneous lightning chess displays demonstrate.

I agree with Chris completely except that the problem solving he describes,
I would say, involves intelligence but not reasoning.

Several people have posted objections to my comment that reasoning 
requires language, but they always then jump from "reasoning" to
"intelligence".

For, I hope, the last time, my position is as follows:

	Intelligence does not require language.
	Reasoning does.

	Cats are intelligent.
	Cats solve problems.
	Cats don't reason.

If you want to disagree, fine, but please don't misrepresent me.

--
Ben A. Green, Jr.              
greenba@crd.ge.com
  Speaking only for myself, of course.

smiller@aio.jsc.nasa.gov (Stephen Miller) (12/11/90)

In article <GREENBA.90Dec5101155@gambia.crd.ge.com> greenba@gambia.crd.ge.com (ben a green) writes:
>In article <YAMAUCHI.90Dec4140917@heron.cs.rochester.edu> yamauchi@cs.rochester.edu (Brian Yamauchi) quotes me
>      language in its learning. The limitation to humans follows from the fact
>      that only humans have highly developed languages.
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To fill you in:  whales and dolphins have highly developed languages.  Active
research has been ongoing for the last fifteen years.  In fact, the languages
are so complex and alien that we have not yet cracked them.  That means we 
cannot understand what they are talking about, but we know they are talking. 
    This is not the case with other mammals (such as the wolf or chimp), which
are highly developed socially as well.  (As humans, sea mammals, and other
primates are.)  Wolves and chimps do have language -- in the case of wolves,
the vocabulary is fewer than fifty concepts.  We have "cracked" this language
because it is a simple one, and we can guess some of the animals' concerns
in life, such as food, danger, communication about social relationships and
hierarchies, and so on.  The sea mammals' world and concerns, however, are
alien to us, which obviously makes the decryption job much more difficult.
However, there is another factor making the decoding extremely difficult:
these animals are highly developed, with social relationships equal or
superior to ours in complexity and depth, possibly greater intelligence
than ours, and (again) a world so alien to ours that we have an extremely
hard time imagining it.
    Understanding these creatures and their language is really a problem
akin to understanding an extraterrestrial intelligent life form.

.steve.
*** The opinions and ideas expressed herein in no way reflect those of my
	    employer. ***

valis@athena.mit.edu (John O'Neil) (12/12/90)

In article <717@aio.jsc.nasa.gov> smiller@aio.jsc.nasa.gov (Stephen Miller) writes:

> To fill you in:  whales and dolphins have highly developed languages.  Active
> research has been ongoing for the last fifteen years.  In fact, the languages
> are so complex and alien that we have not yet cracked them.  That means we 
> cannot understand what they are talking about, but we know they are talking.

How do you know? Where's your evidence? If you want to believe that
cetaceans have language while you're eating your dolphin-safe tuna,
please feel free.  If you want to convince anyone else, you'll have to
do better than asserting it repeatedly with great fervor. 

Evidence would help, but you've got a small problem -- there isn't
any.


 John O'Neil
 Organlegger
"From head to toe, you know where to go."
 Spleens a specialty.

G.Joly@cs.ucl.ac.uk (Gordon Joly) (12/13/90)

Animal languages? I saw a clip on TV that showed monkeys with a
different word for each type of danger; e.g. snake, lion, etc.
No syntax, I think, but they had a very large vocabulary.

Gordon Joly                                       +44 71 387 7050 ext 3716
InterNet: G.Joly@cs.ucl.ac.uk          UUCP: ...!{uunet,ukc}!ucl-cs!G.Joly
Computer Science, University College London, Gower Street, LONDON WC1E 6BT