[net.philosophy] mind vs. brain

dmcanzi@watdcsu.UUCP (David Canzi) (10/26/85)

Let me suggest the following analogy: mind is to brain as digestion is
to stomach.  (Somebody else recently used this analogy in a net
article.  I'm borrowing it.)  Nobody ever talks of digestion/stomach
dualism.  Nobody ever wonders whether digestion is just a function
performed by the stomach, or whether digestion exists, perhaps,
somewhere outside of physical reality, on some "digestive plane" of
existence.  Nobody ever writes articles claiming that "no machine can
produce digestion".  Why not?

Because everybody understands that the word "digestion" is a word we
use to describe activities performed by the stomach, ie. the stomach
digests food.  This is made easy to understand by the fact that, for
the noun "digestion" denoting activities, there is a handy verb,
"digest", which denotes the same activities.

The noun "mind", similarly, denotes some activities performed by the
brain.  Unfortunately, our language doesn't provide us with a handy
verb to denote these activities, so many people tend to sucked into
thinking of "the" mind as a "thing".  So people waste millions of hours
wondering if "the" mind exists, where does "it" exist, etc.

The important thing to remember is that the lack of a handy verb to go
with the word "mind" is a feature of the language with which we attempt
to describe reality, and doesn't imply anything about the reality we
are attempting to describe.
-- 
David Canzi, an entirely physical phenomenon.

mangoe@umcp-cs.UUCP (Charley Wingate) (10/28/85)

David Canzi suggests the following analogy:

  mind:brain::digestion:stomach

and then goes on to say that no one talks about the digestion/stomach
dichotomy.

But I think another analogy fits at least as well, if not better:

    brain:mind::hardware:software

or
  
    brain:mind::hardware:process

and people DO talk about the dichotomies implied by the latter pair.  So I
see no reason to consider the implications of David's analogy until he gives
some evidence that we should choose it over some other one.

I might also add that a reason people don't worry about distinguishing
stomach from digestion is that the construction of artificial stomachs is
neither a pressing nor an interesting issue at this time.

Charley Wingate

ellis@spar.UUCP (Michael Ellis) (10/28/85)

>Let me suggest the following analogy: mind is to brain as digestion is
>to stomach.  (Somebody else recently used this analogy in a net
>article.  I'm borrowing it.)  Nobody ever talks of digestion/stomach
>dualism.  Nobody ever wonders whether digestion is just a function
>performed by the stomach, or whether digestion exists, perhaps,
>somewhere outside of physical reality, on some "digestive plane" of
>existence.  Nobody ever writes articles claiming that "no machine can
>produce digestion".  - David Canzi

    The analogy [mind:brain::digestion:stomach] was an earlier paraphrase by
    me from Searle, who indeed claims that a machine can produce mind,
    provided that machine has the causal properties of a brain.
    
    Where Searle diverges from strong AI is that he claims that mental
    states THEMSELVES are physical entities -- here he makes another
    analogy, between minds (not brains) and hands:
    
        My own approach to mental states and events has been totally
	realistic in the sense that I think there are such things as
	intrinsic mental phenomena that cannot be reduced to something else
	or eliminated by some kind of re-definition. There really are
	pains, tickles, itches, beliefs, fears, hopes, desires, perceptual
	experiences, experiences of acting, thoughts, feelings, and all the
	rest...

        If one were doing a study of hands or kidneys or of the heart,
	one would simply assume the existence of the structures in question,
	and then get on with the study of their structure and function.

	No one would think of saying "Having a hand is just being disposed
	to certain sorts of behavior such as grasping" (manual behaviorism),
	or "Hands can be defined entirely in terms of their causes and effects
	(manual functionalism), or "For a system to have a hand is just for
	it to be in a certain computer state with the right sorts of inputs
	and outputs" (manual Turing machine functionalism), or "Saying that 
	a system has hands is just adopting a certain stance towards it"
	(the manual stance).
	
	How, then, are we to explain the fact that philosophers have said
	apparently strange things about the mental? An adequate answer to
	that question would trace the history of philosophy of mind since
	Descartes.. My brief diagnosis of the persistent anti-mentalistic
	tendency in recent analytical philosophy is that it is largely based
	on the tacit assumption that, unless there is some way to eliminate
	mental phenomena.. we will be left with a class of entities that lies 
	outside the realm of serious science and with an impossible
	problem of relating these entities to the real world of physical
	objects. We will be left, in short, with all the incoherence of
	Cartesian Dualism..
	
	On my account, mental states are as real as any other biological
	phenomena..
	
	-John Searle, "Intentionality", Cambridge U. Press (1983)

    As we read in an interesting recent article from Rich Rosen, Searle's
    position has provoked angry responses from those, like Hofstadter, who
    lean towards the strong-AI thesis that mental states can be created by
    Turing machines. Hofstadter does not like Searle's position at all:
    
	This religious diatribe against AI, masquerading as a serious
	scientific argument, is one of the wrongest, most infuriating
	articles I have ever read in my life.. it seems to me that what
	Searle and I have is, at the deepest level, a religious
	disagreement. - Hofstadter on Searle's "Minds, Brains, Programs"

    Hofstadter is often fascinating, but some of his ideas, for me, require
    a huge leap of faith. Somehow, the undeniable reality of mental
    experience emerges from a program of suitable formal complexity!
    Neither Searle nor Hofstadter is the last word on the
    `mind/brain question', of course.
    
    Those interested in philosophy of mind might also enjoy the perspective
    of Richard Rorty (an eliminative materialist like Paul Feyerabend).
    Rorty argues, among other things, that theories of the mind as an
    intentional network have lumped {feelings,sensations} together with
    {beliefs, intentions} in an ad hoc fashion. In this view, Searle may be
    guilty of kluging sensation into his intentional theory by
    gerrymandering pain, for instance, into a belief that one's tissues
    have been damaged, thus unwittingly recasting mind-body duality into a
    sensation-belief duality. Perhaps I have misunderstood, especially since
    what I know of Rorty's ideas preceded Searle's work by at least a year.
    
    Rorty's major work is `Philosophy and the Mirror of Nature', where he
    reviews the historical invention of the western concept of mind (as an
    `inner eye') by the Greeks, through Descartes, Kant, and Wittgenstein,
    and concludes with an attack on epistemology. Rorty quite possibly
    smashes the mirror...
    
	Feelings are just appearances. Their reality is exhausted in how
	they seem. They are pure seemings. Anything that is not a seeming
	(putting the intentional to the side for the moment) is merely
	physical -- that is, it is something that can appear other than it
	is. The world comes divided into things whose nature is exhausted by
	how they appear and things whose nature is not.
    
	That special sort of subject.. whose appearance IS its reality --
	(eg) phenomenal pain -- turns out to be simply the painfulness of
	the pain abstracted from the person having the pain. To put
	it oxymoronically, mental particulars, unlike mental states of
	people, turn out to be universals...
	
	It turns out, in other words, that the universal-particular
	distinction is the only metaphysical distinction we have..  The
	mental-physical distinction then is parasitic on the
	universal-particular distinction.  Further, the notion of
	mind-stuff as that out of which pains and beliefs are made makes
	exactly as much sense as the notion of `that out of which
	universals are made'. The battle between realists and conceptualists
	over the status of universals is thus empty save that it is made
	of whatever universals are made of..

	We simply lift off a single property from something (the property of
	being red, or painful, or good) and then treat it as if it were the
	subject of a predication and perhaps a locus of causal efficacy.
	
	-Richard Rorty, "Philosophy and the Mirror of Nature" (1979)

    The most anarchic yet somehow thoroughly sensible position, in my
    estimation, comes from the wild Paul Feyerabend, who has persistently
    supported eliminative materialism. Feyerabend, however, somehow
    reconciles materialism with his call for cultural pluralism -- after
    assaulting all the major anti-materialist arguments, he proceeds to
    encourage a materialist outlook that harmonizes with commonsense (or
    even spiritual) reality.

    The quotes below are taken from his "Materialism and the Mind-Body
    Problem". I apologize for the discontinuities induced by my severe
    editing of his arguments [my modifications in brackets]:
    
	The crudest form of materialism will be taken as the basis of
	argument.. A simple atomism such as the theory of Democritos will be
	sufficient for our purpose...
	    
	The first question that arises with the question about [the
	incorrigible certainty of mental experience] concerns the source of
	this certainty concerning mental processes. The answer is very
	simple: it is their lack of content which is the source of their
	certainty..  [as opposed to] statements about physical objects
	[which] possess very rich content...

	A new theory of pains will not change the pains.. It will change the
	meaning of "I am in pain". The causal connection between the
	production of a `mental' sentence and its `mental' antecedent is
	very strong. It is learned very early in life. It is the basis for
	all observations concerning the mind.. The connections [between
	meaning and reference] change all the time anyway. It is much more
	sensible to establish a one to one connection between observational
	terms and their causal antecedents, than between such words and
	their always variable meanings. This procedure has great benefits
	and can do no harm.. But it should not be used to turn intelligent
	people into nervous wrecks...
	        
	Materialism (and for that matter objective spiritualism like the
	Egyptian theory of BA..) recognizes [that mentalistic `facts' are
	peculiarities of spoken language] and suggests that [language] be
	altered... 
	    
	There is therefore, not a single reason why the attempt to give a
	purely physiological account of human beings should be abandoned, or
	why physiologists should leave `soul' out of their considerations...
	    
        [added 1980]

	On the other hand, it must not be admitted that the overthrow of an
	entire worldview, including the most familiar assumptions, can be
	stopped by the decision to make commonsense (and the views of man it
	contains) an essential part of any form of knowledge. Such a
	decision was made by Aristotle, and, much later, by Niels Bohr, in
	his interpretation of the quantum theory.  

-michael

pmd@cbsck.UUCP (Paul M. Dubuc) (10/28/85)

There is another useful analogy on the distinction between "mind"
and "brain".  Donald M. MacKay (a British scientist) makes it in his
book _Science and the Quest for Meaning_ (Eerdmans 1982).  He describes
the "bogey of determinism" (in this area) as arising from a confusion
of levels at which the operation of the mind is described.  The analogy
he uses deals with a computer.

When a computer is used to compute your income tax and you ask the operator
how the computer functioned in doing so, she would take certain pains
to show that the calculations were done according to certain laws of
arithmetic and income tax laws based on your income bracket, etc.
Ask an engineer who designed the computer the same question, however,
and you might be shown the computer's internal workings and be given
a description of how the computer works according to the laws of
physics operating on the mass of transistors and copper that make up
the "brain" of the computer.  Both descriptions are accurate at their
own levels, and the fact that all the calculations of the computer
have some physical/mechanical representation does not take away from
the meaning of those actions implied by the operator's description of
the computer's calculations.  To say that it does is to confuse levels of
description.  It is like saying that the words appearing on your CRT
can be explained by the computer's instructions to light up certain
dots on the screen and that they have no significance beyond that--
they're "nothing but" dots on your screen.

MacKay goes into the problem of "brain" vs. "mind" a little more.  I'll
append the following quote from the book cited above (p. 25):

    Assume for the sake of argument that all that you believe and know
    and feel and think is represented in some sense by the physical
    configuration of your brain.  I have shown elsewhere (for example,
    in _Brains, Machines and Persons_) that even if that configuration
    were fully mechanistic in its workings, no complete specification
    of your brain would exist that you would be bound to accept as
    inevitable, unless the specification were of its past.  [This
    possibility would have no bearing on the free will question since
    that is exercised in the present (or "immediate future"). -- PMD]
    In particular, no complete specification of the immediate future of
    your brain could have an unconditional claim on your assent, even if
    your brain were as mechanical as the workings of a clock.

    The reason for this, of course, is that if all your mental processes
    are represented in your brain, then no change can take place in
    any mental state of yours without a change also taking place in
    the physical state of your brain.  And therefore the validity of
    any complete specification of your brain depends on whether or not
    you believe it.  So it doesn't have an unconditional claim to your
    assent.  No matter how clever I might be in preparing the specification,
    it can't be equally correct whether or not you believe it.  If I
    accurately describe the make-up of your brain at the moment you
    read or hear my description, it will be incorrect by the time you've
    accepted its truth, because the simple act of acquiescence will have
    changed your brain's make-up.  Alternatively, if I can allow for 
    the effects of your believing the description, so as to produce one
    that you would be correct to believe, then its correctness will *depend*
    on your believing it--so you would not be mistaken to disbelieve it!

    In that sense, then, the immediate future of your brain is open to *you*,
    undetermined for *you*, and would remain so even if the physical elements
    making up your brain were as determinate as the solar system. ...

The book MacKay refers to does go into more detail on this and is also
well worth reading.  _Brains, Machines and Persons_ is also published by
Eerdmans (1980).
-- 

Paul Dubuc 	cbsck!pmd

throopw@rtp47.UUCP (Wayne Throop) (10/28/85)

> Let me suggest the following analogy: mind is to brain as digestion is
> to stomach.  [...]  for the noun "digestion" denoting activities,
> there is a handy verb, "digest", which denotes the same activities.
> The noun "mind", similarly, denotes some activities performed by the
> brain.  Unfortunately, our language doesn't provide us with a handy
> verb to denote these activities, so many people tend to be sucked into
> thinking of "the" mind as a "thing".
>               David Canzi, an entirely physical phenomenon.

I think the verb you are looking for is "to think".

stomach, intestines;  digestion;  digest
brain, nervous system;  mind, cognition, "mentation";  think, "mentate"

I think all the parts of speech are there, it's just that

 1) the forms for mental activity (the ones in common use anyhow) are
    irregular, while those for gastric activity are (relatively)
    regular.
 2) There is a common, pre-existing notion that "the mind" is a "thing"
    rather than a process.

I don't, however, think that this notion is "caused by" language.  I think
that the language simply reflects the deep-rooted notion.  To a thinker,
the mind simply seems more *thing-like* than the digestion, and this
seeming is reflected in the language.

I will also note that, in English, almost any noun can be verbed, so
there is no bar to the *notion* of "mind" as a verb.

--
"You find sometimes that a Thing which seemed very
 Thingish inside you is quite different when it gets
 into the open and has other people looking at it."
              Winnie-the-Pooh.
-- 
Wayne Throop at Data General, RTP, NC
<the-known-world>!mcnc!rti-sel!rtp47!throopw

franka@mmintl.UUCP (Frank Adams) (10/30/85)

In article <1794@watdcsu.UUCP> dmcanzi@watdcsu.UUCP (David Canzi) writes:
>Let me suggest the following analogy: mind is to brain as digestion is
>to stomach.  [...] Nobody ever writes articles claiming that "no machine can
>produce digestion".  Why not?
>
>Because everybody understands that the word "digestion" is a word we
>use to describe activities performed by the stomach, ie. the stomach
>digests food.  This is made easy to understand by the fact that, for
>the noun "digestion" denoting activities, there is a handy verb,
>"digest", which denotes the same activities.
>
>The noun "mind", similarly, denotes some activities performed by the
>brain.  Unfortunately, our language doesn't provide us with a handy
>verb to denote these activities, so many people tend to be sucked into
>thinking of "the" mind as a "thing".  So people waste millions of hours
>wondering whether "the" mind exists, where "it" exists, etc.

Yes, there is such a word.  That word is "think".  So we really have:
thinking is to brain as digestion is to stomach.  It is true that there
is no word analogous to mind for the stomach.

Your stomach performs certain chemical reactions, to convert food into
chemicals directly usable by your body.  From this, it is easy to see what
parts of what your stomach does can be abstracted from your actual stomach,
and what cannot.  Specifically, the nature of the reactions can be
abstracted; the fact that they are being performed on the food you ate,
to provide chemicals for your use, cannot.

The brain performs certain computations, to make decisions for actions to
be taken by your body.  The computations can be abstracted; the fact that
the decisions are instantiated in your body cannot.

Now there are two separate questions here.  One is, is the Turing machine/
computer model strong enough to represent the computations performed by
your brain?  On this question, no proof is available.  Personally, I believe
it is.
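
For concreteness, here is a sketch of what the Turing machine/computer
model amounts to: a finite set of states, a tape, and a transition table.
The table below is hypothetical and trivial -- it merely flips the 0s and
1s on its tape and halts -- but the open question above is whether tables
of this general kind are strong enough to represent what the brain
computes.

    /* A minimal Turing machine simulator.  The (hypothetical) transition
     * table flips a string of 0s and 1s, then halts on the first blank. */
    #include <stdio.h>

    #define HALT 1

    struct rule { char write; int move; int next; };   /* move: +1 = right */

    /* state 0: reading '0' -> write '1'; reading '1' -> write '0';
     * reading ' ' (blank) -> halt. */
    static struct rule table[1][3] = {
        { {'1', +1, 0}, {'0', +1, 0}, {' ', 0, HALT} }
    };

    static int symbol_index(char c) { return c == '0' ? 0 : c == '1' ? 1 : 2; }

    int main(void)
    {
        char tape[] = "0110100 ";
        int head = 0, state = 0;

        while (state != HALT) {
            struct rule r = table[state][symbol_index(tape[head])];
            tape[head] = r.write;
            head += r.move;
            state = r.next;
        }
        printf("tape after halt: %s\n", tape);
        return 0;
    }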

The second is, if you successfully abstract the computations done by your
brain into a computer or some other machine, does that machine think?  Is
it you?

First of all, I would assert that a "successful" abstraction would respond
like you, modulo the differences in the sensory information and action
modes available to it.  Anything less than this is not a success.

So, given a machine which responds "like" a person, what grounds can there
be for denying that that machine thinks?  I can see none, except for
mystical concepts like "souls".  (In particular, I don't understand what
an "intentional state" is supposed to be.)

As to whether that machine is "you": as I have argued elsewhere, barring
mysticism, this is a debate about definitions.  I think it is more consistent
with the common usage to answer yes.

>David Canzi, an entirely physical phenomenon.

Frank Adams                           ihpn4!philabs!pwa-b!mmintl!franka
Multimate International    52 Oakland Ave North    E. Hartford, CT 06108
(Also an entirely physical phenomenon.)

mangoe@umcp-cs.UUCP (Charley Wingate) (11/20/85)

In article <1884@watdcsu.UUCP> dmcanzi@watdcsu.UUCP (David Canzi) writes:

>>But I think another analogy fits at least as well, if not better:
>>    brain:mind::hardware:software
>>or
>>    brain:mind::hardware:process
>>and people DO talk about the dichotomies implied by the latter pair. 

>There's an important difference between the brain:mind::hardware:software
>analogy and the brain:mind::hardware:process analogy that you cite as
>if they were the same thing.  A process is performed by a computer.
>Software, on the other hand, is a *description* of a process for the
>computer to perform.  The computer performs processes by interpreting
>(ie. running) software.

The reason why I gave both analogies is not that I think they are identical,
but that one or the other was The Analogy.  And while I'll agree to the
distinction between software and process, I think the last sentence should
be essentially reversed.  Processes within the computer are the dynamic
realization of the software, which is essentially a static thing.
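
To make that distinction concrete, here is a sketch for a Unix system (no
claim about brains intended).  The program text on disk never changes; the
running processes, with their diverging variables and program counters, are
its dynamic realization.

    /* One static program, two dynamic processes.  Error handling omitted
     * for brevity. */
    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        int counter = 0;
        pid_t pid = fork();     /* the one program file now underlies two processes */

        if (pid == 0) {
            counter += 1;       /* the child's dynamic state... */
            printf("child  (pid %d): counter = %d\n", (int)getpid(), counter);
        } else {
            counter += 100;     /* ...diverges from the parent's */
            printf("parent (pid %d): counter = %d\n", (int)getpid(), counter);
            wait(NULL);
        }
        return 0;
    }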

>There is no such thing as software.  We can rephrase any statement
>about software to eliminate all references to software as a thing.  For
>example, "this software is portable" can mean something like "Existing
>equipment can detect the magnetic fields near the surface of this disk,
>and magnetize other disks so that the other disks can cause different
>computers to behave similarly."

But these two statements are in fact NOT equivalent; one can replace the
second statement with "This series of symbols will cause the same behavior
regardless of what system it is entered into."  This bears the same
relationship to the first statement as the original second statement did,
namely, that of instantiation.  But the immediate meaning of this new
statement is not identical to that of the original second statement.

There is in fact a certain level of abstraction on which the notion of
software (or programs, or a number of constructs of equivalent abstraction)
is necessary.  Consider the following statement:

  "This software to implement a relational database is written in Pascal."

Now if you attempt to avoid the word "software" and talk about patterns on
a disk, you will be stating an untruth.  The patterns on the disk are
irrelevant except as a representation, and the most immediate representation
is that of the disk's file system, not of "software".  So you have to back away
from patterns on a disk and abstract them to files.  Then again, at some
point the software was at least partially written on paper, and there is
therefore another level of abstraction necessary: that of abstract
characters.  But not all patterns of characters are software.  Both of the
two statements (about what the software does and what language it is written
in) refer only to software; therefore, these have to be rewritten to
eliminate such references too.  But the rub is that once you get rid of the
semantics of Pascal it ceases to be a Programming Language-- and you can't
get rid of the abstract operational implications of a "relational database"
at all.  Its entire content is abstracted to the user level!
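
As a small illustration of those layers, here is a sketch in C.  The same
(hypothetical) piece of software exists at once as abstract characters, as
a file, and ultimately as patterns on a disk, yet the statement "this
software prints a greeting" is not about any one of those representations.

    #include <stdio.h>
    #include <string.h>

    /* Level of abstract characters: the program text itself. */
    static const char source_text[] =
        "#include <stdio.h>\n"
        "int main(void) { printf(\"hello\\n\"); return 0; }\n";

    int main(void)
    {
        /* Level of files: the same characters become a named file...
         * ("hello.c" is a hypothetical file name) */
        FILE *f = fopen("hello.c", "w");
        if (f == NULL)
            return 1;
        fputs(source_text, f);
        fclose(f);

        /* ...and, below that, patterns on a disk.  What the software
         * *does* -- print a greeting -- is stated at none of these levels. */
        printf("wrote %d characters of program text to hello.c\n",
               (int)strlen(source_text));
        return 0;
    }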

It seems to me therefore that software IS a necessary concept with respect
to computers.  It describes the realization of programmer intent in the form
of what are, to the programmer, abstract operations (i.e. they are not
patterns of electrical charge or the like), but which the computer can (by an
appropriate representation and transformation) actually execute, producing
the desired (abstract) behavior.

It should be quite clear from this that I now think that software is
therefore an inappropriate concept with respect to the mind and brain,
because it presupposes the realization of this level of abstraction (even if
we punt on the clear intentionality implied here by assigning it to
"evolutionary processes").

>Some statements containing the word "mind" can be rephrased to eliminate
>the use of any noun equivalent to "mind".  (For example: "That's what's
>on my mind" can be rephrased as "That's what I'm thinking about.")  Some
>statements containing the word "mind" can't be rephrased this way.

Well, it seems to me that commonly there are two different levels on which
the concept of mind is used.  On the lower level, the mind is simply the
dynamic state of the brain.  If one realizes that one is working at this
level, then the phrase and the word can be exchanged with impunity.  The
other level, that of mental abstractions, is much more troublesome.  Both of
the statements above occur at this level, and it is quite important NOT to
confuse this with the lower level.

>Some people use arguments in which the word "mind" plays an important
>part to prove some point.  If attempting to rephrase such an argument
>to remove the use of the word "mind" as a noun destroys the argument's
>validity, then the argument was dependent on the unstated assumption
>that "the" mind is a "thing".  If I don't accept this unstated
>assumption, then I am not forced to accept the conclusion.

>For example, John Searle's attempt to prove that one must be a dualist
>to believe that "strong AI" is possible:

>    Unless you believe that the mind is separable from the brain both
>    conceptually and empirically -- dualism in a strong form -- you
>    cannot hope to reproduce the mental by writing and running programs
>    since programs must be independent of brains or any other
>    particular forms of instantiation.

>The phrase "the mind is separable from the brain" can be rephrased as
>"some other system can be programmed to behave like the brain", a
>description which doesn't resemble dualism at all.

No, that's NOT what his statement says.  If you want to get rid of the mind
there, you have to rephrase it as

 "For all brains X, some other system can be induced to produce the same
  abstract behavior as brain X."

It should be apparent that this is so close to the notion of software that I
described above that it is essentially the same as the statement

 "Software can be described for the brain."

(Written in NEUROSYS, of course :-)

It's certainly indisputable that strong AI asserts the last.  It's also
evident that if one relaxes the constraints and forgets about the *abstract*
behavior, one can readily model the brain simply by building a machine
composed of a network of electronically simulated neurons, connected
together with appropriate sensors and mechanical manipulators.  But such a
machine isn't what is currently desired; the *abstract* behavior is
*exactly* what AI people are setting out to model.  So they certainly DO
believe in this dualism!
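
(For what such a non-abstract model might look like in miniature, here is a
sketch of a single simulated threshold neuron wired to two "sensors" and one
"manipulator".  The weights and threshold are hypothetical; with these
particular values the output happens to compute AND of its inputs.)

    #include <stdio.h>

    #define N_IN 2

    struct neuron {
        double weight[N_IN];
        double threshold;
    };

    /* all-or-none "firing": 1 if the weighted sum of inputs reaches threshold */
    static int fire(const struct neuron *n, const int input[])
    {
        double sum = 0.0;
        int i;

        for (i = 0; i < N_IN; i++)
            sum += n->weight[i] * input[i];
        return sum >= n->threshold;
    }

    int main(void)
    {
        struct neuron out = { {1.0, 1.0}, 1.5 };   /* hypothetical wiring */
        int a, b;

        for (a = 0; a <= 1; a++)
            for (b = 0; b <= 1; b++) {
                int sensors[N_IN];
                sensors[0] = a;     /* the "sensors" feeding the neuron */
                sensors[1] = b;
                printf("sensors %d %d -> manipulator %d\n",
                       a, b, fire(&out, sensors));
            }
        return 0;
    }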

Charley Wingate

franka@mmintl.UUCP (Frank Adams) (12/04/85)

In article <2274@umcp-cs.UUCP> mangoe@umcp-cs.UUCP (Charley Wingate) writes:
>It should be quite clear from this that I now think that software is
>therefore an inappropriate concept with respect to the mind and brain,
>because it presupposes the realization of this level of abstraction (even if
>we punt on the clear intentionality implied here by assigning it to
>"evolutionary processes").

Whoa!  I agreed with you up to here, but I do not think software is an
inappropriate concept with respect to mind and brain.  The software is
a static description of the process.  I don't see that there need be
a realization of this level of abstraction for the concept to be valid.
In other words, a post facto description would be as much software as
a prescription.

I don't think intentionality enters into the picture at all.  The
intentionality is in the writing of the software, and not in the software
itself.

I will agree that the correct analogy is
   brain:mind::computer:process,
not
   brain:mind::computer:software.

Frank Adams                           ihpn4!philabs!pwa-b!mmintl!franka
Multimate International    52 Oakland Ave North    E. Hartford, CT 06108

franka@mmintl.UUCP (Frank Adams) (12/11/85)

In article <2452@umcp-cs.UUCP> mangoe@umcp-cs.UUCP (Charley Wingate) writes:
[Concerning the applicability of the concept of software to the brain]
>Well, OK, except that it's rather dubious whether the brain has a static
>state which is analogous to stored programs (which is the kind of abstraction
>which software is).  It's entirely likely that the neural structure of the
>brain is at least as important as the "programs" which they "store", and it
>is a fact that the structure is itself dynamic.  That's what I don't like
>about the software paradigm; it's really too restricting given the great
>ignorance we now have.  I would prefer not to try to characterize the kinds
>of abstraction of process and program needed until there's a better basis for
>understanding.

In some sense, there has to be such a description.  Any "dynamic"
description can be converted into a "static" description.  In other words,
no particular characterization of the kinds of abstraction is being
presupposed here.

Specifically, I think you make a mistake in thinking that the changes to
the neural structure are outside the bounds of what a software description
might specify.  The whole point of software is that, when executed, it affects
the hardware.  I don't think the nature of those effects is presupposed in
the concept.
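
For instance, here is a sketch in which the "structure" (a row of
connection strengths) changes as a result of the activity the program
describes, yet the description of that change is itself part of the static
program text.  The update rule and the numbers are entirely hypothetical.

    #include <stdio.h>

    #define N 3

    int main(void)
    {
        double weight[N] = {0.1, 0.1, 0.1};   /* the "neural structure" */
        int input[N]     = {1, 0, 1};         /* a fixed pattern of activity */
        int t, i;

        for (t = 0; t < 5; t++) {
            /* a crude Hebbian-style rule: active connections get stronger */
            for (i = 0; i < N; i++)
                if (input[i])
                    weight[i] += 0.05;
            printf("after step %d: %.2f %.2f %.2f\n",
                   t + 1, weight[0], weight[1], weight[2]);
        }
        return 0;
    }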

Frank Adams                           ihpn4!philabs!pwa-b!mmintl!franka
Multimate International    52 Oakland Ave North    E. Hartford, CT 06108

mangoe@umcp-cs.UUCP (Charley Wingate) (12/16/85)

In article <866@mmintl.UUCP> franka@mmintl.UUCP (Frank Adams) writes:

[With reference to my objection to people referring to the brain's "software"]

>Specifically, I think you make a mistake in thinking that the changes to
>the  neural structure is outside the bounds of what a software description
>might specify.  The whole point of software is that when executed, it affects
>the hardware.  I don't think the nature of those effects is presupposed in
>the concept.

Fine.  Does this really mean that every time I see the word "software" used
with respect to the brain, I can add "The brain's software is not
necessarily anything like computer software as we know it"?

Charley Wingate

franka@mmintl.UUCP (Frank Adams) (12/19/85)

In article <2579@umcp-cs.UUCP> mangoe@umcp-cs.UUCP (Charley Wingate) writes:
>In article <866@mmintl.UUCP> franka@mmintl.UUCP (Frank Adams) writes:
>
>[With reference to my objection to people referring to the brain's "software"]
>
>>Specifically, I think you make a mistake in thinking that the changes to
>>the  neural structure is outside the bounds of what a software description
>>might specify.  The whole point of software is that when executed, it affects
>>the hardware.  I don't think the nature of those effects is presupposed in
>>the concept.
>
>Fine.  Does this really mean that every time I see the word "software" used
>with respect to the brain, I can add "The brain's software is not
>necessarily anything like computer software as we know it"?

Every time is perhaps a bit much.  It wouldn't hurt to remind people of it
every once in a while, particularly if they seem to be assuming it is.
So far, I have seen little evidence that the people arguing for AI in this
group are doing so.

Frank Adams                           ihpn4!philabs!pwa-b!mmintl!franka
Multimate International    52 Oakland Ave North    E. Hartford, CT 06108