[comp.ai.digest] McDermott model of free will

NICK@AI.AI.MIT.EDU (Nick Papadakis) (06/04/88)

Date: Fri, 3 Jun 88 16:18 EDT
From: Bruce E. Nevin <bnevin@cch.bbn.com>
Subject: McDermott model of free will
To: ailist@ai.ai.mit.edu
cc: bn@cch.bbn.com

DM> Date: 30 May 88 16:40:28 GMT
DM> From: dvm@yale-zoo.arpa  (Drew Mcdermott)
DM> Subject: Free will

DM> More on the self-modeling theory of free will:
DM> . . .
DM> What's pointless is trying to simulate the present period
DM> of time.  Is an argument needed here?  Draw a mental picture: The robot
DM> starts to simulate, and finds itself simulating ...  the start of a
DM> simulation.  What on earth could it mean for a system to figure out
DM> what it's doing by simulating itself?

Introspect about the process of riding a bicycle and you shortly fall
over.  Model for yourself the process of speaking and you are shortly
tongue-tied.  It is possible to simulate what one was just doing, but
only by breaking off the doing for the simulation, then resuming the
doing, then resuming the simulation, and so on.

What might be proposed is a parallel ("shadow mode") simulation, but
it's always going to be a jog out of step, not much help in real time.  
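
To make "one jog out of step" concrete, here is a minimal sketch in
Python (nothing below is from either posting; the counter world and
the function names are invented for illustration).  The shadow model
only ever simulates the state the world is about to leave, so its
picture of the present is always one tick stale:

    def run_with_shadow(world_step, model, state, ticks):
        """Advance the world while a shadow model simulates it in parallel.

        The model is only ever handed the state the world is about to
        leave, so its picture of the present is always one tick stale.
        """
        shadow_view = None
        for _ in range(ticks):
            shadow_view = model(state)   # simulate the state being left behind
            state = world_step(state)    # meanwhile the doing moves on
        return state, shadow_view

    if __name__ == "__main__":
        world_step = lambda s: s + 1               # toy world: a counter
        model = lambda s: "world is at %d" % s     # toy shadow simulation
        final, shadow = run_with_shadow(world_step, model, 0, ticks=5)
        print(final)    # 5
        print(shadow)   # "world is at 4" -- one jog out of step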

What might be proposed is an ongoing modelling of what is >supposed< to
be going on at present.  Behavior is then governed by the model unless
and until interaction with the environment contradicts the model beyond
some threshold (detecting this is another layer of modelling), whereupon
an alternative model is substituted, or the best-fit model is modified
(more work), or the agent deals with the environment directly (a lot of
work indeed).
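
Here is that loop as a minimal sketch in Python (nothing below is from
Drew McDermott's posting; the Model class, the readings, and the
threshold value are invented stand-ins).  Behavior is driven by the
model in force; only when accumulated mismatch crosses the threshold
does the agent swap in an alternative model or, failing that, do the
costlier work of building a new one from the observation itself:

    class Model:
        """Toy model: expects the world's reading to sit near a set point."""
        def __init__(self, setpoint):
            self.setpoint = setpoint
        def predict(self):
            return self.setpoint
        def plan(self):
            return "act as if the reading is %s" % self.setpoint

    def run(observe, models, threshold, steps):
        current = models[0]     # the model presently governing behavior
        mismatch = 0.0          # accumulated disagreement with the world
        for _ in range(steps):
            reading = observe()
            mismatch += abs(reading - current.predict())
            if mismatch > threshold:    # the cheap second layer of modelling
                best = min(models, key=lambda m: abs(reading - m.predict()))
                if abs(reading - best.predict()) <= threshold:
                    current = best              # substitute an alternative model
                else:
                    current = Model(reading)    # modify/rebuild: more work
                mismatch = 0.0
            print(current.plan())       # the model, not the world, drives action

    if __name__ == "__main__":
        readings = iter([10, 10, 11, 30, 30, 30, 5, 5])
        run(lambda: next(readings), [Model(10), Model(30)], threshold=4.0, steps=8)

Dealing with the environment "directly" is left out of the sketch; in
this framing it would mean acting from the raw readings with no model
at all, the most expensive course of the three.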

A great deal of human culture (social construction of reality) may have
the function of introducing and supporting sufficient redundancy to
enable this.  Social conventions have their uses.  We support one
another in a set of simplifications that we can agree upon and that the
world lets us get away with.  (Often there are damaging ecological
consequences.)  We make our environment more routine, more like that of
the robot in Drew McDermott's last paragraph ("Free will is not due to
ignorance.").

It's as if free will must be budgeted:  if everything is a matter for
decision, nothing can happen.  The bumbler is, I suppose, the
pathological extension in that direction, the introspective bicyclist
in all things.  For the opposite pathology (the appeal of
totalitarianism), see Erich Fromm, _Escape from Freedom_.

Bruce Nevin
bn@cch.bbn.com
<usual_disclaimer>