[comp.ai.digest] Biological Models, Their Real Value for AI

wallacerm@AFWAL-AAA.ARPA (12/30/87)

In trying to reconcile the vast amount that is being said on the topic of 
biological modeling, I hear no mention of the de facto requirements of all 
living organisms.  These are: greed, fear, pain, and pleasure.  From my 
observations of experiments performed on vertebrates and of the experimenters' 
results, I find that these drives are foremost in control of every situation 
the subject undergoes.  I am not a terminology bigot, so if you have an 
equivalent word that carries the same semantic content for greed, fear, pain, 
or pleasure, substitute it for the remainder of this squib!

To elaborate, each organism -- once given that spark of electrochemical 
activity -- demonstrates a multilevel control structure that is geared for 
immediate survival.  Fright at the new environment (which is often cooler than 
a womb or clutch of eggs) stimulates the next control structure, the search 
for heat.  The pain of hunger causes the search for food.  Once heat and food 
are found, the characteristics of pleasure and greed (desire for all the food 
and heat that is available) start.  By now you've noticed that I've glossed 
over what are called the instincts, the innate abilities of the organism that 
control all its electromechanical operations.  I will return to these, but 
first it is important to concentrate on the characteristics enumerated above.  
An organism's life is constantly, intrusively altered by its "mother."  I 
quote "mother" because it is generally the female of the species, but it does 
not always have to be the female past birth or the laying of the egg 
(reptiles, fish, birds).  This is an important concept that we often fail to 
recognize.  This is an extremely interactive, phased development of the 
organism's learning process; its "self" is intruded upon, and it learns to 
accept stimuli from a source other than its inanimate environment.

With these four characteristics, the newborn and infant phases pass with much 
teaching from the organism's "mother" and environment (niche).  These phases 
have no definite time span and vary by phylum, family, genus, and species.  
What is taught, of course, varies, and is highly dependent on environment.  
Once the organism has achieved a certain autonomous status, its basic four 
characteristics drive it through its life.

Returning now to the instincts: I have noticed an expectation that the AI 
community puts on its silicon-based electronic, electromechanical 
"protoplasm," and that is that it has connections but no initial 
"programming!"  I feel that this is quite silly, as all of the expected higher 
mental functions are formed by experience and interaction with the "mother" 
and environment (ignoring societal interaction for the moment).  The state of 
the art is still at the instincts stage.  We are therefore expecting a 
non-instinctive, non-greedy, non-pain-feeling, non-pleasure-feeling, 
non-fearing lump of connections to "boot," via our programming, to a state 
past the infant phase!  I feel that this is a gross error in our hypotheses 
about getting non-biological machines to learn.  In our experiments here with 
a one-neuron model, the four characteristics proved to be crucial to the 
development of an "organism" that could learn.

Turning our attention to the task at hand -- creation of expert systems, 
consciousness, and generally a context-adaptive decision-making entity -- we 
must first concentrate on the learning.  To do this we must insert the de 
facto characteristics.  Easily said; now how does one do such a thing?  
Remembering that we ourselves work in fear of pain from our greedy, pain- and 
pleasure-giving taskmasters (a.k.a. "mom"), inserting these characteristics 
can be a time and resource sink in a development process.  Hence we tend not 
to concentrate on putting in a baseline of characteristics, but instead try to 
get our "brain-damaged" systems to exhibit some set of outputs for some set of 
inputs.  First there must be a baseline.  "Baseline" is defined as "the point 
at which infancy ends and autonomy begins."

We are all versed in the concepts, operation, and use of virtual, 
multiprocessing, multiprocessor systems (any function not supported by the 
CPU).  To quote the comic strip character Pogo, "We have met the enemy and he 
is us," because these physical items are innate in us.  Any neurologists care 
to comment?  Our second definition is computer.  "Computer" is defined as "the 
silicon-based, electrically stimulated machine that has computational and 
logic (Boolean expected, others accepted) capability."  To baseline our 
computer in the characteristics that are necessary is quite a task.  Here I am 
going to stop this message, as I hope that it will stimulate a response from 
the reading community.  I have my own opinions and results, but so as not to 
prejudice the respondents' replies, I will not include them yet.  Instead I 
will leave the following question for your rumination.

How is a computer to be baselined with the characteristics of greed, fear, 
pain, and pleasure for it to learn a higher task/function?

Richard M. Wallace
AFWAL/AADE
Wright-Patterson, AFB, OH 45433
ARPA: <wallacerm@afwal-aaa.arpa>

------