[comp.ai] the role of 'emotional'/goal directed components in understanding

silber@voodoo.ucsb.edu (12/15/89)


It occurs to me, with respect to discussions about cognitive competence,
to reflect on the necessary role of emotional components in all aspects
of human cognition. Perhaps a von-neumann strong-ai machine/program of the
traditional kind can never instantiate 'consciousness', not because it
is computationally incompetent, BUT because it is emotionally incompetent????

smoliar@vaxa.isi.edu (Stephen Smoliar) (12/15/89)

In article <3312@hub.UUCP> silber@voodoo.ucsb.edu writes:
>
>It occurs to me, with respect to discussions about cognitive competence,
>to reflect on the necessary role of emotional components in all aspects
>of human cognition. Perhaps a von Neumann strong-AI machine/program of the
>traditional kind can never instantiate 'consciousness', not because it
>is computationally incompetent, BUT because it is emotionally incompetent????


This is not that different from the path of reasoning which eventually led
Marvin Minsky to the material in his SOCIETY OF MIND book.  If you go back
to his original paper on K-lines in COGNITIVE SCIENCE, he argues that
constructs such as frame-based systems, which basically provide powerful
handles on declarative representations, may be the wrong way to approach
models of memory.  Instead, he advocates an approach to memory in which
something more like feelings (he uses the word "dispositions") provides
the primitive elements.
He has yet to get around to being very specific about just what these
dispositions are and how they would relate to an implementation of a
memory model.  The heart of the matter, however, seems to be the ability
to induce a mental state from which an agent is "disposed" to take particular
actions.  In other words, if we think of an intelligent agent (the whole ball
of wax) as some sort of enormous state machine in which each state has an
effect on the actions which may be performed (and this is, admittedly, an
over-generalization for the sake of explanation), then we should be asking
questions like:  "How does perceiving a given situation put the agent into
a state from which it will take an appropriate action?"  Given what we know
about human behavior, it should come as no surprise that emotions often play
a greater role in determining what state we are in than do any objective
operations of inference on a set of facts which delimit a problem statement.
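
To make the state-machine picture a bit more concrete, here is a minimal
sketch in Python (my own illustration, not anything Minsky has proposed;
the situation, disposition, and action names are all invented) of an agent
whose perception induces a disposition, and whose disposition, rather than
inference over a set of facts, selects its action:

    # Illustrative only: perceiving a situation induces a disposition
    # ("feeling"), and the disposition picks the action.
    DISPOSITIONS = {
        "loud_noise": "alarmed",
        "familiar_face": "at_ease",
        "unsolved_puzzle": "curious",
    }

    ACTIONS = {
        "alarmed": "withdraw",
        "at_ease": "approach",
        "curious": "explore",
        "neutral": "wait",
    }

    class DispositionalAgent:
        def __init__(self):
            self.disposition = "neutral"

        def perceive(self, situation):
            # Perception does not deposit a declarative fact; it shifts
            # the agent into a state from which certain actions follow.
            self.disposition = DISPOSITIONS.get(situation, self.disposition)

        def act(self):
            return ACTIONS.get(self.disposition, "wait")

    agent = DispositionalAgent()
    agent.perceive("unsolved_puzzle")
    print(agent.act())          # prints "explore"

The point of the toy is only that the mapping from perception to action
passes through a state variable rather than through explicit inference
over a problem statement.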

=========================================================================

USPS:	Stephen Smoliar
	USC Information Sciences Institute
	4676 Admiralty Way  Suite 1001
	Marina del Rey, California  90292-6695

Internet:  smoliar@vaxa.isi.edu

"For every human problem, there is a neat, plain solution--and it is always
wrong."--H. L. Mencken

traiger@oxy.edu (Saul Traiger) (12/16/89)

In article <3312@hub.UUCP> silber@voodoo.ucsb.edu writes:
>It occurs to me, with respect to discussions about cognitive competence,
>to reflect on the necessary role of emotional components in all aspects
>of human cognition. Perhaps a von Neumann strong-AI machine/program of the
>traditional kind can never instantiate 'consciousness', not because it
>is computationally incompetent, BUT because it is emotionally incompetent????

An excellent starting point for this issue is Chapter 6 of John Haugeland's
book, Artificial Intelligence: The Very Idea (Cambridge: MIT/Bradford Books,
1986).

   ooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo
   o   Saul Traiger		 oooooo  Cognitive Science	   o
   ooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo
   o   Internet:traiger@oxy.edu  *----*  Occidental College	   o
   ooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo
   o   CIS:71631,717		 oooooo  Los Angeles, CA 90041	   o
   ooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo

bwk@mbunix.mitre.org (Kort) (12/16/89)

In article <3312@hub.UUCP> silber@voodoo.ucsb.edu writes:

 > It occurs to me, with respect to discussions about cognitive competence,
 > to reflect on the necessary role of emotional components in all aspects
 > of human cognition.  Perhaps a von Neumann strong-AI machine/program of
 > the traditional kind can never instantiate 'consciousness', not because
 > it is computationally incompetent, BUT because it is emotionally
 > incompetent????

It is my thesis that a learning system necessarily exhibits emotional
behavior.  As a scientist, I frequently experience the emotions of
curiosity, puzzlement, frustration, boredom, exhilaration, anxiety,
confidence, and satisfaction as I explore, ponder, get stuck, make
progress, develop understanding and gain insight into the systems that
I study.

It occurs to me that any learning system must experience such states,
and alter its goals and strategies accordingly.  Metaphorically
speaking, emotions are the first derivative of the learning curve:

	E = dK/dt   where E = Emotional State and K(t) = Knowledge.
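
To show what I mean, here is a toy Python calculation (purely illustrative;
the S-shaped knowledge curve and the emotion labels are my own inventions)
that reads an "emotional state" off the slope of a learning curve:

    import math

    def knowledge(t):
        # A typical S-shaped learning curve: slow start, rapid middle, plateau.
        return 10.0 / (1.0 + math.exp(-(t - 5.0)))

    def emotional_state(knowledge, t, dt=1.0):
        # Approximate E = dK/dt at time t with a finite difference.
        return (knowledge(t + dt) - knowledge(t)) / dt

    def label(e):
        # Invented thresholds: rapid learning feels good, a flat curve does not.
        if e > 1.0:
            return "exhilaration"
        if e > 0.2:
            return "satisfaction"
        return "boredom/frustration"

    for t in range(11):
        e = emotional_state(knowledge, t)
        print(t, round(e, 2), label(e))

Early in the curve and out on the plateau the slope is small and the
"emotion" is dull; in the steep middle, where knowledge is changing
fastest, it is not.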

I once wrote a pair of whimsical Socratic dialogues between two
anthropomorphic self-programming computers to illustrate this idea.
If you are interested, I would be happy to share them with you.

A nice feature of this theory is that it tells me what to do
with my emotions:  Emotions tell me what I (and others) need to
learn next in life.

--Barry Kort