[comp.ai] Consciousness?

mmt@dciem.UUCP (Martin Taylor) (02/04/87)

(Moved from mod.ai)
> I always thought that a scientific theory had to undergo a number of
> tests to determine how "good" it is.  Needless to say, a perfect score
> on one test may be balanced by a mediocre score on another test.  Some
> useful tests are:
> 
> - Does the theory account for the data?
> - Is the theory simple?  Are there unnecessary superfluities?
> - Is the theory useful?  Does it provide the basis for a fruitful
>         program of research?
All true, and probably all necessary.
> ....
> While the study of consciousness is fascinating and lies at the base of
> numerous religions, it doesn't seem to be scientifically useful.  Do I
> rewrite my code because the machine is conscious or because it is
> getting the wrong answer?
If you CAN write your code without demanding your machine be conscious,
then you don't need consciousness to write your code.  But if you want
to construct a system that can, for example, darn socks or write a fine
sonata, you should probably (for now) write your code with the assumption
of consciousness in the executing machine.

In other words, you are confusing two things: the needless introduction of
consciousness into a system whose working principles you already know, and
the question of whether consciousness is required for certain functions.
>       Is there a program of experimentation
> suggested by the search for consciousness?
Consciousness need not be sought.  You experience it (I presume).  The
question is whether behaviour can better (by the tests you present above)
be described by including consciousness or by not including it.  If, by
"the search for consciousness" you mean the search for a useful definition
of consciousness, I'll let others answer that question.
> Does consciousness change the way artificial intelligence must be
> programmed?  The evidence so far says NO.  [How is that for a baldfaced
> assertion?]
Pretty good.  But for reasons stated above, it's irrelevant if you start
with the idea that AI must be programmed in a silicon (i.e. constructed)
machine.  Any such development precludes the need to use consciousness
in the design, although it does not preclude the possibility that the
end product might BE conscious.
> 
> 
> I don't think scientific theories of consciousness are incorrect, I
> think they are barren.
Now THAT's a bald assertion.  Barren for what purpose?  Certainly barren
for construction purposes, but perhaps not for understanding what evolved
organisms do.  (I take no stand on whether consciousness is in fact a
useful construct.  I only want to point out that it has the potential to
be useful, even if not in devising artificial constructs.)
> 
>                                         Seth
-- 

Martin Taylor
{allegra,linus,ihnp4,floyd,ubc-vision}!utzoo!dciem!mmt
{uw-beaver,qucis,watmath}!utcsri!dciem!mmt
mmt@zorac.arpa