cugini@icst-ecf.UUCP.UUCP (02/24/87)
> > Me: The Big Question: Is your brain more similar to mine than either
> > is to any plausible silicon-based device?
>
> SH: that's not the big question, at least not mine. Mine is "How does the
> mind work?" To answer that, you need a functional theory of how the
> mind works, you need a way of testing whether the theory works, and
> you need a way of deciding whether a device implemented according to
> the theory has a mind.
> Cugini keeps focusing on the usefulness of "presence of `brain'"
> as evidence for the possession of a mind. But in the absence of a
> functional theory of the brain, its superficial appearance hardly
> helps in constructing and testing a functional theory of the mind.
>
> Another way of putting it is that I'm concerned with a specific
> scientific (bioengineering) problem, not an exobiological one ("Does this
> alien have a mind?"), nor a sci-fi one ("Does this fictitious robot
> have a mind?"), nor a clinical one ("Does this comatose patient or
> anencephalic have a mind?"), nor even the informal, daily folk-psychological
> one ("Does this thing I'm interacting with have a mind?"). I'm only
> concerned with functional theories about how the mind works.

How about the epistemological one (philosophical words sound so, so...
*dignified*): Are we justified in believing that others have
minds/consciousness, and if so, on what rational basis? I thought that was
the issue we (you and I) were mostly talking about. (I have the feeling
you're switching the issue.)

Whether detailed brain knowledge will be terribly relevant to building a
functional theory of the mind, I don't know. As you say, it's a question
of the level of simulation. My hunch is that the chemistry and low-level
structure of the brain are tied very closely to consciousness,
simpliciter. I suspect that the ability to see red, etc. (good ole C-1)
will require neurons.
(I take this to be the point of Searle's remark somewhere or other that
consciousness without a brain is as likely as lactation without mammary
glands.) On the other hand, integer addition clearly is implementable
without wetware.

But even if a brain isn't necessary for consciousness, it's still good
strong evidence for it, as long as one accepts the notion that brains form
a "natural kind" (like stars, gold, electrons, light switches). As I'm
sure you know, there's a big philosophical problem with natural kinds,
struggled with by philosophers from Plato to Goodman. My point was that
it's no objection to brain-as-evidence to drag in the natural-kinds
problem, because that problem is not unique to the issue of other minds.
And it seems to me that's what you are (were?) guilty of when you
challenge the premise that our brains are relevantly similar, the point
being that if they are similar, then the
my-brain-causes-consciousness-therefore-so-does-yours reasoning goes
through.

John Cugini <Cugini@icst-ecf>
------