[comp.ai] Beating a Dead Homunculus

pollack@toto.cis.ohio-state.edu (Jordan B Pollack) (01/16/90)

There really are questions which a Turing Machine, or other "pure
program", cannot answer, but which real machines can.  For example:

     What time is it?

In order to answer this question correctly, a program MUST BE DEPENDENT
ON THE HARDWARE upon which it is running.

In fact, to even "play the game", a machine must have I/O facilities,
such as the ability to set a bit which changes a measurable physical quantity
in the universe, or to read a bit which reflects one.
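The point can be made concrete with a short sketch (in Python, purely
for illustration, obviously not part of the original argument).  No pure
function of its symbolic inputs can answer "What time is it?", because
its output is fixed by those inputs alone; a program with access to the
hardware clock can, precisely because it reads a bit pattern reflecting
a physical quantity:

```python
import time

def pure_answer(question: str) -> str:
    # A "pure program": the output depends only on the input symbols.
    # Run it today or in ten years, it returns the same string.
    return "I cannot know the time from my inputs alone."

def grounded_answer(question: str) -> str:
    # Depends on an I/O facility: reading the hardware clock, a bit
    # pattern that reflects a measurable physical quantity.
    now = time.localtime()
    return time.strftime("It is %H:%M:%S.", now)
```

The pure version is hardware-independent and therefore necessarily
wrong about the time; the grounded version is correct only because it
is dependent on the machine it runs on.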

These facilities are definitely missing from Abstract Symbolic
Computing machines.  (Which is precisely why there is a P, not an A, in
the PSSH: it is the *Physical* Symbol System Hypothesis.)  I think
anyone who has ever written a computer program tacitly assumes an
extended computational model with I/O facilities.

So, Searle, if he only knew, could actually win on this one
little point.  So I suggest the (mythological?) proponents of "Hard
AI" agree to these two bits of physical machine dependency.  Why not?
Then it will be up to Searle to prove the necessity of the FULL machine
dependency of his "Brains Only" stance.


--
Jordan Pollack                            Assistant Professor
CIS Dept/OSU                              Laboratory for AI Research
2036 Neil Ave                             Email: pollack@cis.ohio-state.edu
Columbus, OH 43210                        Fax/Phone: (614) 292-4890

kp@uts.amdahl.com (Ken Presting) (01/16/90)

In article <POLLACK.90Jan15193218@toto.cis.ohio-state.edu> pollack@cis.ohio-state.edu writes:
>There really are questions which a Turing Machine, or other "pure
>program", cannot answer, but which real machines can.  For example:
>
>     What time is it?
>
>In order to answer this question correctly, a program MUST BE DEPENDENT
>ON THE HARDWARE upon which it is running.

Bravo!  Let's follow this idea out a little.

In order for the Chinese Room to report the current time, some of the
rules in the books must instruct the homunculus to look at his watch, and
select symbols for output based on the result.  The clever little fellow
will certainly have some hope of learning Chinese expressions for times
of day.
   Furthermore, that portion of the room's symbology which refers to the
time of day might be considered "grounded" in real events, depending on
how extensive the requirements for symbol grounding turn out to be.

Time of day is not the only type of query that depends on the
implementation:

How long since my last question?
How big is the question window?
What color is this piece of paper?
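Each of these queries can be sketched the same way.  For instance,
"How long since my last question?" forces the room to consult a
physical clock between queries (a hypothetical sketch in Python, with
invented names; the original posts contain no code):

```python
import time

class Room:
    """Answers queries; some answers depend on physical events."""

    def __init__(self):
        self.last_query = None  # no physical record yet

    def ask(self, question: str) -> str:
        now = time.monotonic()  # reads the physical clock (I/O)
        if self.last_query is None:
            answer = "This is your first question."
        else:
            elapsed = now - self.last_query
            answer = f"{elapsed:.3f} seconds since your last question."
        self.last_query = now
        return answer
```

The correct answer cannot be computed from the symbols in the question
at all; it depends entirely on when the physical query events occurred.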

>In fact, to even "play the game", a machine must have I/O facilities,
>such as the ability to set a bit which changes a measurable physical quantity
>in the universe, or to read a bit which reflects one.
>
>These facilities are definitely missing from Abstract Symbolic
>Computing machines.  (Which is precisely why there is a P, not an A, in
>the PSSH: it is the *Physical* Symbol System Hypothesis.)  I think
>anyone who has ever written a computer program tacitly assumes an
>extended computational model with I/O facilities.

I would claim that potential execution on a machine with I/O is an
essential component of the concept of "program" - that's what
distinguishes programs from algorithms or functions.  In a previous thread
on "what is a program" I offered the (partial) definition:

   A program is (1) a sentence of a formal language which
                (2) defines an equivalence class of physical devices.

(If anyone wants to say that it's the implementations of a program which
are related to physical computers, that's fine.  It's implementations
that AI research is aiming for.)

>So, Searle, if he only knew, could actually win on this one
>little point.  So I suggest the (mythological?) proponents of "Hard
>AI" agree to these two bits of physical machine dependency.  Why not?
>Then it will be up to Searle to prove the necessity of the FULL machine
>dependency of his "Brains Only" stance.

On the other hand, your point here goes to strengthen the "system reply".

The Chinese Room as Searle sets it up lacks some crucial features which
any AI system would have - the physical machine dependencies.

There is another more subtle difference between the CR and a computer.  By
installing a human being as the processor, the causal relationship between
input, program, and output is obscured.  On some influential theories of
human action, (notably Donald Davidson's "anomalous monism") there can be
no strict laws concerning mental events.  Given such a view of human
behavior, there is no strict causal connection between the inputs and the
outputs of the CR.  This is quite contrary to the situation of an
electronic computer.  Searle's homunculus strengthens the intuition that
the program is pure abstract symbolism, at the cost of distancing his
example from his target.  The homunculus is just a straw man.

>--
>Jordan Pollack                            Assistant Professor
>CIS Dept/OSU                              Laboratory for AI Research
>2036 Neil Ave                             Email: pollack@cis.ohio-state.edu
>Columbus, OH 43210                        Fax/Phone: (614) 292-4890

Thanks for a very interesting contribution!