govern (02/09/83)
(Long article, as might be expected from the title.)  Prem Devanbu's original article had three points which interested me, so here is more grist for the mill.

1) Consciousness in another being is essentially unprovable (in the dualistic-logical sense of "proof").

(I'll buy that.....unless some really Mind-boggling counter-argument comes along.  However, this was only a side point in the discussion.)

2a) Consciousness is essentially monistic / non-dualistic, and 2b) this implies 1).

2a) This worldview fits in well with Eastern forms of philosophy/religion, but that does not mean that we can take it as an axiom, even if it concurs with other observations.

postnews -n net.religion <<!
It is one (of several) philosophical explanations for the subjective experiences of meditation, but it also fails to explain some of the subjective experiences of Christianity, as well as conflicting with the traditional explanations of the historical events experienced by the very early Jews and the early Christians.
!

2b) Monistic worldviews tend to include the concept that Consciousness pervades everything, which makes the consciousness of an individual entity a moot point.

3) Because a machine's consciousness can be clearly analyzed (in the sense of dualistic logic or mathematics), it must not be REAL consciousness, which cannot be thus analyzed.  (This appeared to be the main point, but I've expressed it rather fuzzily.)

3) Even supposing premises 1) and 2) to be true, this conclusion is somewhat flawed.  The human brain is basically a machine subject to dualistic analysis in the same way that a man-made machine is.  It is a physical entity with distinct states (most of which are probably continuous rather than discrete, but nonetheless subject to dualistic description).  Its behavior is poorly understood, at best, and its complexity is orders of magnitude beyond current technology's abilities, but that doesn't change the basic dualism of the situation.  (Yes, I *am* sidestepping the questions of "does brain = mind", "what is mind (or Mind)", and "how does the mind push the brain around".)

The relationships of consciousness and machines apply to the human brain (as a machine) just as well as to man-made machines, and most of us believe in the existence of (at least our own) consciousness.

If consciousness is strictly a physical thing, i.e. no soul or mind of a non-physical nature (which I don't believe to be the case), then AI must deal with the questions "When is a machine complex enough to sustain consciousness?" and "Can we build something that complex, and if so, how?"  Clearly the human brain is a complex enough machine, and the PDP 11/70 isn't.

If consciousness is non-physical, whether in the Eastern sense of all-pervasive somethingness or in the Western sense of (dualistic) individual entities, the same questions apply, since we don't understand how a physical object (i.e. the brain) gets its Consciousness attached to it, and we have no way of knowing whether a sufficiently complex machine, at some point in its construction, might or might not obtain consciousness in the same way.  (Admittedly, this is a little more far-fetched in the "non-physical" case than in the "physical", but then, God (or The Universe) always does things His (or Its) way whether we understand it or not.)

Bill Stewart (hoscf!bill)
mjs (02/09/83)
One point in your treatise struck me as unfounded (but not by much). You state that the human brain is capable of sustaining consciousness (no argument, mostly), but that a PDP-11/70 clearly is not. Well, it ain't all that clear. How many CPU hours (perhaps the proper measure is Gigainstructions?) have all the 11/70's in the world executed? How many Gigainstructions must a human execute after birth before it reaches consciousness? And what is the relative power of a human "instruction" vs. a PDP-11 instruction? As long as I'm playing Devil's advocate, can anyone prove that a Turing machine is capable (or incapable) of consciousness?
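For what it's worth, here is a rough back-of-envelope sketch of the comparison those questions invite, written out as a small Python program.  Every figure in it (the 11/70's instruction rate, the number of machines in service, the brain's neuron count and firing rate, and the age at which we grant consciousness) is an assumed round number chosen purely for illustration, not a claim from either article.

# Back-of-envelope comparison of cumulative "instruction" counts.
# Every constant below is an assumed round number, for illustration only.

PDP_11_70_MIPS = 0.5          # assumed ~0.5 million instructions per second
MACHINES_IN_SERVICE = 10_000  # assumed number of 11/70s running worldwide
YEARS_IN_SERVICE = 5          # assumed average time in service so far

SECONDS_PER_YEAR = 365 * 24 * 3600

# Cumulative instructions executed by all 11/70s, in Gigainstructions.
all_1170_giga = (PDP_11_70_MIPS * 1e6 * MACHINES_IN_SERVICE
                 * YEARS_IN_SERVICE * SECONDS_PER_YEAR) / 1e9

# One human from birth to an assumed "age of consciousness", counting each
# neural firing as one "instruction" (a very generous simplification).
NEURONS = 1e11                # assumed neuron count
FIRING_RATE_HZ = 10           # assumed average firings per neuron per second
YEARS_TO_CONSCIOUSNESS = 2    # assumed age at which consciousness is granted

one_human_giga = (NEURONS * FIRING_RATE_HZ
                  * YEARS_TO_CONSCIOUSNESS * SECONDS_PER_YEAR) / 1e9

print(f"All PDP-11/70s to date:  {all_1170_giga:.2e} Gigainstructions")
print(f"One human by age {YEARS_TO_CONSCIOUSNESS}:     {one_human_giga:.2e} Gigainstructions")

Under those (entirely made-up) assumptions, the installed 11/70s come out roughly two orders of magnitude behind a single two-year-old, which of course settles nothing about the relative power of a human "instruction" versus a PDP-11 one.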