[comp.ai.philosophy] Jack is flying his kite

G.Joly@cs.ucl.ac.uk (Gordon Joly) (02/18/91)

Marvin Minsky writes:
> There has been considerable discussion under this subject of
> differences between human and animal thought.  Has anyone considered
> the conjecture that humans have 3.5 levels of STM, or large-scale
> temporary K-lines -- and procedures capable of learning to use them.
> Maybe chimps have only 2.5 layers of recursion abilities -- and
> earlier mammals only 1.5.  This could account for many aspects of
> human abilities in language, planning, problem-solving, etc.  And note
> the positive feedback: with a larger (yet still small) such stack,
> you also get more time to put more things into LTM to use as
> "virtual" STM stack.
> 
> For example, Marcus grammars can do a lot of "natural language
> grammar" with 3 stack-like registers, but not very much with only two.
> 
> By "2.5" levels of stack, I simply mean that the first register is
> very competent and capacious, the second less so, etc., so the thing
> trails off.  That's why, presumably, you can understand sentences with
> 2 levels of embedding, but have trouble with 3, etc.
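
The "trails off" idea is easy enough to caricature in code. A toy
sketch (mine, not Minsky's model, and the reliability numbers are
invented): each deeper register holds its contents less well, so two
levels of embedding survive the unwind but three do not.

RELIABILITY = [1.0, 0.9, 0.5, 0.1]   # invented: capacity trails off with depth

def can_recover(depth):
    """Chance that something pushed at this depth is still usable later."""
    return RELIABILITY[depth] if depth < len(RELIABILITY) else 0.0

def understands(levels_of_embedding):
    """Push one pending clause per level, then ask if every register held up."""
    stack = [can_recover(level) for level in range(levels_of_embedding)]
    return all(strength > 0.5 for strength in stack)

for n in range(1, 5):
    print(n, "level(s) of embedding:", "ok" if understands(n) else "lost")

(Python; it reports "ok" for one and two levels of embedding and
"lost" for three and four.)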

One consideration is that the passage from 0 to 1 is always much
harder than the passage from 1 to 2 and hence to the rest of the
integers (cf Peano's axioms).

On page 161 of "The Society of Mind", the Recursion Principle is
stated as follows:

``When a problem splits into smaller parts, then unless one can apply
the mind's full power to each subjob, one's intellect will get
dispersed and leave less cleverness for each new task.''

On the facing page is an example of obfuscation in language.

``This is the malt that the rat that the cat that the dog worried
killed ate.''

``This is the dog that worried the cat that killed the rat that ate
the malt.''

I cannot grok the first; doesn't Latin put all the verbs at the end? Why
say it in such a roundabout fashion? The Tao of sentence construction?
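
Out of curiosity, a rough count in Python (crude bookkeeping, not a
real parser) of how many relative clauses each sentence leaves pending
before a verb arrives to discharge them:

def max_pending(tokens):
    """How many 'that'-clauses are open at once before verbs close them."""
    verbs = {"worried", "killed", "ate"}
    open_clauses = deepest = 0
    for word in tokens:
        if word == "that":
            open_clauses += 1
            deepest = max(deepest, open_clauses)
        elif word in verbs:
            open_clauses -= 1      # a verb closes the innermost pending clause
    return deepest

nested = "this is the malt that the rat that the cat that the dog worried killed ate"
flat   = "this is the dog that worried the cat that killed the rat that ate the malt"

print("centre-embedded:", max_pending(nested.split()))   # prints 3
print("right-branching:", max_pending(flat.split()))     # prints 1

The centre-embedded version peaks at three pending clauses; the
right-branching one never holds more than one. Which fits the "2.5
registers" story rather neatly.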

Does comprehension of natural language... oh damn I have lost my
thread!

More later...

Gordon Joly                                       +44 71 387 7050 ext 3716
Internet: G.Joly@cs.ucl.ac.uk          UUCP: ...!{uunet,ukc}!ucl-cs!G.Joly
Computer Science, University College London, Gower Street, LONDON WC1E 6BT

G.Joly@cs.ucl.ac.uk (02/19/91)

Gordon Joly writes:
 > 
 > Marvin Minsky writes:
 > > There has been considerable discussion under this subject of
 > > differences between human and animal thought.  Has anyone considered
 > > the conjecture that humans have 3.5 levels of STM, or large-scale
 > > temporary K-lines -- and procedures capable of learning to use them.
 > > [...]
 > 
[...]
 > Does comprehension of natural language... oh damn I have lost my
 > thread!
 > 
 > More later...

Found a thread. In mathematics, some iterations exhibit a feature
called "period doubling". As the parameter a of the logistic map is
increased, the period of the orbit doubles again and again, and beyond
a certain value of a the outcome is chaotic.
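
A quick sketch of the iteration, x(n+1) = a*x(n)*(1 - x(n)), in Python
(parameter values quoted from memory): below about a = 3 the orbit
settles to a single value, then to 2, then to 4, and past roughly
a = 3.57 it goes chaotic.

def attractor(a, x=0.5, warmup=1000, sample=8):
    """Iterate the logistic map, discard the transient, report visited values."""
    for _ in range(warmup):
        x = a * x * (1 - x)
    seen = set()
    for _ in range(sample):
        x = a * x * (1 - x)
        seen.add(round(x, 4))
    return sorted(seen)

for a in (2.8, 3.2, 3.5, 3.9):
    print("a =", a, "-> orbit visits", attractor(a))

The run prints one value at a = 2.8, two at 3.2, four at 3.5, and a
scatter of values at 3.9.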

The notion, then, is that once evolution reaches a certain point,
"intelligence doubling" takes place, kept in check by (long-term)
homeostasis.