[net.ai] Camus never believed in Robots.

pd (01/31/83)

I would like to make a case/argument against the feasibility of
constructing a machine that exhibits consciousness/identity. The
framework of the argument will be as follows:

1) Consciousness is a non-linguistic, non-mathematical experience.

2) Thus, one cannot convey the nature of one's experience of consciousness
   to another (by mathematical or linguistic means).

3) Thus one can never disprove or prove the existence of one's own or
   anyone else's consciousness through mathematical or linguistic
   arguments.
   
4) Since one cannot mathematically model, or linguistically communicate,
   one's experience of consciousness to another, it is impossible to
   build a machine that has consciousness. Furthermore, one can never
   prove or disprove that a machine has consciousness.

I will defend only statement 1) above, as follows:
Any statement one makes about oneself has an active and a passive aspect.
E.g., consider the statement "'I' am a good person":
'I' makes a descriptive statement about him/her/itself. This duality
will always be the case, whatever statement the entity makes
about him/her/itself. It/he/she may make meta, philosophical, recursively
self-defining statements, but the duality will still prevail. (Go ahead,
try it.) Trying to do otherwise is like chasing one's own tail.
One's experience of one's own identity or consciousness is
essentially incommunicable, since all mathematical and linguistic
descriptions/models are dualistic, whereas consciousness is not: it
is a monadic experience of oneself.

Above is a justification of 1); hence 2), 3), and 4) follow.

Any takers ?

Prem Devanbu

ka (02/01/83)

     4) Since one cannot mathematically model, or linguistically communicate,
	one's experience of consciousness to another, it is impossible to
	build a machine that has consciousness. Furthermore, one can never
	prove or disprove that a machine has consciousness.

The two sentences above are contradictory: if one can never prove or
disprove that a machine has consciousness, then one is in no position
to assert that building a conscious machine is impossible.

Kenneth Almquist

neiman (02/14/83)

Let me ramble for a moment---

Global statements like "It is impossible to define consciousness"
bother me.  "It is currently impossible to describe a consciousness exactly"
would be more accurate.

"It is impossible to define consciousness because it is non-linguistic and
non-mathematical."  So...do what any other scientist does when
dealing with a concept too hairy to be defined explicitly: create a model
with enough simplifying assumptions that it can be represented.


A conscious machine need not be aware in all the ways that a human being 
is aware in order to be conscious.  I would define consciousness (were I not
afraid to boldly go, etc.) as the ability to examine one's own motivations
and internal states.  I am hungry, I am depressed, I am performing this
action to obtain this result.


Current computer systems do not have this capability; their
instruction set/program is not available for examination and/or modification.
A rule-based or script-driven AI program can examine its own state and is
therefore "more" conscious than its predecessors.
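As a purely illustrative sketch (in modern notation, not anything a 1983
system actually ran), here is what "examining one's own motivations and
internal states" might look like in a toy rule-based program.  Every name,
rule, and state variable below is invented for the example:

```python
# Hypothetical toy agent: it holds explicit internal states and explicit
# rules, so it can report not just what it does but *why* it does it.

class Agent:
    def __init__(self):
        # Internal states the agent can inspect (e.g. drives and moods).
        self.state = {"hunger": 0.8, "mood": "depressed"}
        # Rules pair a condition over the state with an action.
        self.rules = [
            ("hunger > 0.5", "seek food"),
            ("mood == 'depressed'", "seek company"),
        ]

    def act(self):
        """Select every action whose condition holds, keeping the reason."""
        decisions = []
        for condition, action in self.rules:
            # The conditions only reference names in self.state.
            if eval(condition, {}, self.state):
                decisions.append((action, condition))
        return decisions

    def introspect(self):
        """Report on the agent's own internal states and motivations."""
        report = [f"I am in state {self.state}."]
        for action, condition in self.act():
            report.append(f"I perform '{action}' because {condition}.")
        return report

if __name__ == "__main__":
    for line in Agent().introspect():
        print(line)
```

The point of the sketch is only that the program's reasons for acting are
explicit data it can read back, which is exactly the capability the
paragraph above says a bare instruction stream lacks.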


A Gedanken experiment:


  Suppose it were possible to take a human mind and copy it atom for atom
  so that you have two identical pieces of wetware.  

  Would you have created another consciousness?  Probably not; what you 
  would have is one mind, and one fairly useless pile of organic matter.
  The difference would be analogous to the difference between a running
  VAX and a VAX taken down for maintenance (their normal state).

  Suppose you were to repeat the experiment a little more carefully, storing
  potentials as well, so that the state of the created object is *exactly*
  the same as that of the original.  Is this conscious?  Is it an artificial 
  intelligence?

  a.  Don't say that this is impossible.  Brains are being created all the 
      time.  I suspect that a (fearfully advanced) fabrication device
      could do the job as well as any genetic mechanism.

  b.  A better argument might be that the construction was done without any 
      real understanding.  Well, yes, but once we've proved that consciousness
      can be achieved by a duplication of state, then the device which
      records that state is immaterial.



One more random thought...


Evolution took three billion years of fumbling in the dark to make something
intelligent enough to be cocky about it.  Computers have been around for 
about thirty years of directed evolution and have already gotten to a point
where the hairless apes are getting nervous.  My guess is that anyone who 
turns off a model-year 2083 IBM/DEC/XXX  will probably get thirty years to life.



					dann
                                         (who, like cognitive scientists, 
					       ought to know better)