[comp.ai.philosophy] A "working definition" of intelligence.

pnettlet@gara.une.oz.au (Philip Nettleton) (06/24/91)

People are again batting the breeze with notions about Turing Tests,
intelligence and how to define it. Those of you who were not watching
this newsgroup last year (and indeed comp.ai as well) would have
missed some fruitful global participation on a "working definition"
of intelligence (artificial or otherwise, and independent of human
prejudices, etc).

The definition makes no attempt to quantify the "degree" of
intelligence, only whether a particular system can be "classed" as
intelligent. I.e., a human is, a cat is, a beetle is, a brick isn't,
etc.

I shall repost the fourth and most comprehensive version of that
definition of intelligence for those who missed out and ask again
for "constructive" criticisms.

As before, any constructive criticisms will find their way into a
new version of the definition and flames will be duly ignored.

----------------------------------------------------------------------
			DEFINITION:
	GENERAL REQUIREMENTS OF AN INTELLIGENT SYSTEM.

a)	The system MUST be able to learn. This implies that the
	system MUST have a memory for learning to be maintained.
	Also, learning comes in a number of varieties:

	i)	It MUST be able to learn from its own experiences.
		These can be broken down into further groupings:

		1)	Learning through trial and error.
		2)	Learning through observation.
		3)	Learning through active reasoning.

	ii)	It SHOULD be able to learn by instruction, but this
		is not necessary. At the very least, the system MUST
		have preprogrammed instincts. This is a bootstrap
		for the developing intelligence.  Without a starting
		point, the system cannot progress.

b)	The system MUST be autonomous. That is to say, it MUST be
	able to do things by itself (however, it may choose to accept
	aid).  This can be dissected as:

	i)	The system MUST be able to affect its environment
		based on its own independent conclusions.

	ii)	The system MUST be its own master first and foremost,
		and therefore not require operator intervention to
		function. This does not necessarily rule out the
		taking of orders from another system, but the choice
		to obey MUST be made by the system itself.

	iii)	The system MUST be motivated. It must have needs and
		requirements that can be satisfied by its own
		actions.

c)	The system MUST be able to reason. That is to say, it must
	use some form of reasoning, based on known facts and capable
	of producing insights which later become known facts. It
	should be noted that the degree of certainty about the truth
	of a known fact is also an important concept and some way of
	dealing with uncertainty MUST be provided.

d)	The system MUST be able to develop self awareness. This is
	related to autonomy, reasoning and learning, but also
	embodies the need for internal and external senses. Without
	these senses there is no way of appreciating the difference
	between "me" and "outside of me". Sensations of pain and
	pleasure can provide motivation.
----------------------------------------------------------------------
			DEFINITION OF TERMS.

1)	A "system" CAN comprise multiple subsystems, each of which
	could be a system in its own right (systems theory).

2)	The "environment" in which the system exists MUST be external
	to the system, but that is as far as the definition of the
	environment goes (it could be computer generated).

3)	The terms "learning", "reasoning" and "autonomy" are
	BEHAVIOURAL characteristics, further supported by our
	understanding (to date) of how they MIGHT work.

4)	The term "self awareness" is based on learning, reasoning
	and autonomy, and is the state where the system is aware
	(has knowledge) of its own existence as separate from its
	environment.

5)	"Intelligence" is a BEHAVIOURAL phenomenon displayed by
	intelligent systems.

6)	"Truth" about a known fact is SUBJECTIVE with respect to the
	system. Ultimate truth is an ideal which is seldom
	achievable even in "human intelligence".

7)	"Certainty" is a statistical measure of the probability of
	a fact being true.

8)	"Reasoning" can never be independent of a language (read
	any good book on logic and this will become evident). The
	language, however, need have no verbal or social component.
----------------------------------------------------------------------

jamesm@gemma.cs.rpi.edu (Michael James) (06/24/91)

In article <7135@gara.une.oz.au> pnettlet@gara.une.oz.au (Philip Nettleton) writes:
>People are again batting the breeze with notions about Turing Tests,
>intelligence and how to define it. Those of you who were not watching
....stuff omitted


>----------------------------------------------------------------------
>			DEFINITION:
>	GENERAL REQUIREMENTS OF AN INTELLIGENT SYSTEM.
>
.....other stuff omitted
>----------------------------------------------------------------------

I'm glad this was posted at this time because I spent all of last weekend
trying to construct just such a definition.   I never could get it to 
say what I wanted, though (big surprise, eh?).  I have some problems with
the definition that Philip posted;  these are not flames but just problems 
that I am struggling with.

1) What is learning?  I'm not talking about a phrase or off-hand remark. 
   I want an explicit definition.

2) What are instincts?

3) What is instruction?  It seems to me that when we say something learns by
   instruction, it is actually doing what we would call learning by
   observation.  The materials under observation are just manipulated by 
   an outside entity to facilitate the 'learning.'

4) What does it mean to say something is motivated?  When I tell my dog 
   to fetch his bone and he does this because he knows that I will give him
   something to eat, how is this different from a micro-organism which always
   heads toward light sources or a computer which computes the sum of two
   numbers?  I'm not sure I see any difference between them except in degree
   of complexity.

5) (Oh boy)  What does it mean to reason?  I still am pretty clueless on this
   one.  Do we 'learn' how to 'reason?'

6) Is self-awareness the same thing as consciousness, an ingredient of it,
   or neither?

The definition of terms section presented in the original posting says
that things such as learning and reasoning are behavioral phenomena.
I'm not sure that this is explicit enough to serve as a basis for the
recognition of learning or reasoning.



mj

---------------------------------------------------------------------
Michael James                Rensselaer Polytechnic Institute
jamesm@turing.cs.rpi.edu

eb2e+@andrew.cmu.edu (Eric James Bales) (06/25/91)

jamesm@gemma.cs.rpi.edu (Michael James) writes:
> 1) What is learning?  I'm not talking about a phrase or off-hand remark. 
>    I want an explicit definition.

Learning is more than just remembering something, such as a computer
storing a bit of data in a lookup table so that it will know what to
do when presented with a specific situation in the future.  IMHO,
learning is being able to apply that information to other situations,
some of which can be quite dissimilar.  Learning, to me, is applying
your memories.

> 5) (Oh boy)  What does it mean to reason?  I still am pretty clueless on this
>    one.  Do we 'learn' how to 'reason?'

In order to apply what you remember to situations other than the exact
one in which you learned it, you have to be able to reason: to draw
parallels between two situations or events.

-- ---------------------------------------------------------------------
eb2e+@andrew.cmu.edu                              -Eric Kirkbride-
		        -The second dolphin-
Dolphins. Soon you will be one of us, and then you will understand.
Disclaimer:  What do I know about philosophy?  I'm an Engineer!

erwin@trwacs.UUCP (Harry Erwin) (06/25/91)

One concept that I've found of use in this area is
that of a "self-simulation." This is a simulation
of the future that originally evolved to aid in
movement at night and in closed environments.
It seems to be a mammalian characteristic. It
also seems to be associated with a sense of
self in the primates. The evolved uses of that
simulation appear to be associated with many of
the meanings we associate with "intelligence."

-- 
Harry Erwin
Internet: erwin@trwacs.fp.trw.com

jane@latcs1.lat.oz.au (Jane Philcox) (06/28/91)

I am aware that this whole question involves subtleties of which I know
nothing, and that there are some formal studies in this subject of which I
also know nothing.  However, I think I have something to say from the
empirical point of view on this one:

In article <-hclt6l@rpi.edu> jamesm@gemma.cs.rpi.edu (Michael James) writes:
>5) (Oh boy)  What does it mean to reason?  I still am pretty clueless on this
>   one.  Do we 'learn' how to 'reason?'

Without wishing to get into an argument about what it means to reason, on a
normal day-to-day level I would say it means reasoning from cause to effect
(however you wish to define those terms!).  Yes, we do have to learn how to do
it - a small child who is learning to talk is unable to connect the two.  A
couple of years later, the child is starting to be able to connect things like
"I just tripped the baby up and fell over him, and now my shoulder hurts," with
"If I hadn't tripped the baby up, my shoulder probably wouldn't be hurting."
At an earlier stage, the connection was to "If I hadn't fallen over the baby,
my shoulder probably wouldn't be hurting," so it was the baby's fault.  Now it
is the child's fault.  There is an obvious development in ability to follow
a chain of causality (is that the right term?) here, particularly as the
conclusion the older child is reaching is actually _less_ palatable than the
one s/he would have reached earlier.

I would _guess_ that later development in formal reasoning would probably
follow along the same lines. Of course, in both cases, the quality of the
teaching, from the parents in the first instance, and the instructor in the
second, must be crucial.  I suspect it would be a rare person indeed who
could make the jump unassisted.

Regards, Jane.

-- 

           A programmer is a machine for converting coffee into code.