rlr@pyuxn.UUCP (Rich Rosen) (10/02/84)
> My point in that article had been to point out that behaviorist objections
> to the term "mind" were based on overloading that term with connotations
> that they don't like. Thus they assume -- falsely -- REPEAT: FALSELY --
> that "mind" implies "nonphysical", "soul", etc. [PAUL TOREK]

This may be true. Others have proposed the notion that the "mind" is not a
physical or extraphysical entity, but is analogous to a program (software)
running on the hardware (brain). Remember, though, that programs exist as
load modules (physical entities). The mind, too, if it is to be thought of
as the "program" which "runs" the brain, is a physical entity, composed of
the chemicals/etc. that make up the "mind" (just as a program running on a
computer occupies physical space in memory for its instructions/data/etc.).

>> Free will implies some agent of choice doing the choosing. No one is
>> denying the swirling around in the brain. It's just that there is no agent
>> that makes a "choice" as to how the chemicals will swirl. [RICH ROSEN]

> Yes there is: THE SAME agent who is constituted by those chemicals. What an
> agent chooses to do determines how the chemicals will swirl. The chemical-
> swirling correlated with choice A is different from that associated with
> choice B. Let me put it this way for you: the agent IS those chemicals;
> or more precisely, is constituted by (or, if the "mind as program" view is
> correct, is instantiated by) those chemicals. [PAUL TOREK]

This would mean that a program "chooses" to do certain things based on the
"current" state of its "chemicals" (e.g., the CPU instruction address
counter, the data in "memory", and input through sensory channels). Can the
program "choose" arbitrarily to do something (like set fire to the disk
drive, or cause the computer to explode -- just like in the movies!!), or
can it only make the "decision" it is programmed to make based on the
contents of its "chemicals"?
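[The "program choosing from the state of its chemicals" idea above can be
made concrete with a toy sketch. This is an editorial illustration, not from
the original posts; the function and variable names are hypothetical. The
point is only that the "choice" is a pure function of current state plus
input.]

```python
# A "choice" that is entirely a function of the machine's current state
# (its "chemicals": memory contents) and its sensory input.

def choose(memory, sensory_input):
    """Return an 'action' determined wholly by state + input."""
    if sensory_input > memory["threshold"]:
        action = "act"
    else:
        action = "wait"
    memory["last_input"] = sensory_input  # the state update is determined too
    return action

state = {"threshold": 5, "last_input": None}
print(choose(state, 7))  # prints "act": same state + same input, same "choice"
print(choose(state, 3))  # prints "wait"
```

Nothing in the sketch can "set fire to the disk drive"; every branch it can
take is already written into it.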
As you say, the agent of decision making in the brain is in fact the
chemicals, and the successive current states of the chemicals themselves
(the ones that "make the decisions" causing other chemical/physical actions
in the body) are DETERMINED by the same physical laws that govern action in
rocks and trees!

>> Given the capacity of the human brain to impose its
>> own preconceived patterns on those things it does not fully understand, and
>> to interpret based on that often faulty patterning (look at the evidence
>> AGAINST viability of hypnotic recall), the subjective perspective simply
>> isn't worth looking at from an analytical viewpoint.

> Let's put it this way: if, contrary to fact but consistent with many people's
> beliefs, the freedom of an action were a matter of its FEELING free, then the
> subjective perspective would be valid. And there ARE some things for which
> subjective (i.e., in the subject -- NOT equivalent to "biased", etc.) feelings
> ARE decisive. For example, if I feel like I'm in pain, I AM in pain. And to
> make matters worse (for behaviorism), being in pain IS IMPORTANT. And that is
> what is wrong with spurning the subjective. This has nothing to do with the
> free will issue, but I felt it was worth picking this bone too.

This is akin to saying "if enough people believe there is a god, there is
one". Yes, feelings are decisive. But when you feel like you are in pain
even if there is no direct physical cause (nerves from a damaged area of
the body sending a message to the brain constitutes a direct physical
cause), the root cause may lie deeper; e.g., a psychological phenomenon
[of course, manifested physically -- a psychological problem is just a
chemical problem with the "program" of "mind" rather than with the brain
per se] ... a psychological phenomenon that physically results in a
simulation of the direct physical symptom: the "mind" hooks into the
hardware and sends a bogus message. It's wrong to say "the subjective
isn't real".
Within the brain it simulates real things happening, but remember that it
is only a simulation. Someone once asked "when you think about a duck, have
you created a real duck?" or something like that. This assumes that the
entirety of the essence of a physical object is YOUR sensory perception of
it. A duck, or any object, is more than just your or my sensory perception
of it; all we "see" or sense in our brains is a SIMulation of the
STIMulation of brain centers by outside sensory input. Instead of being
stimulated from the "outside", the mind simulates the stimulation from
within.

>> What "willed" the chemicals in your brains to move in a certain way to
>> cause movement/action? And what "willed" whatever process in your brain
>> that caused that chemical movement to start?

> Again, the agent and the chemicals are one and the same. Viewed on one level,
> we have an agent (me) making a decision; viewed on a lower (component) level,
> we have certain chemical processes. To deny that there is an agent on the
> ground that it is "just" a bunch of chemicals makes about as much sense as
> denying that there is warmth in the room on the ground that there is "just"
> a bunch of molecules moving around.

Again, this is like saying that a program has free will. On the contrary,
barring system errors, a program's functions are deterministic. You may ask
"How can you say this when you don't know what external data will be
present?" Well, that's the point. Based on the external data, the program
makes a deterministic "decision", though not necessarily one
"pre"-determined or "designed" to occur in a certain way. A program, for
example, may expect arithmetic data and get alphabetic data at some point,
and the program may not be "designed" to handle it, but given the same set
of external and internal variables it will always handle it in the same
way! (The human brain seems more naturally prone to "system errors", but
the resulting randomness also does not imply free will.)
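[The "unexpected alphabetic data" point can be sketched in a few lines.
Again an editorial illustration, not from the posts; `add_one` is a
hypothetical name. Even the case the program was never "designed" for is
handled identically on every run.]

```python
# A program "expecting" arithmetic data. The alphabetic case was not
# "designed" for, yet the response to it is still completely determined
# by the program's state and its input.

def add_one(datum):
    try:
        return int(datum) + 1   # the designed-for arithmetic case
    except ValueError:
        return None             # the undesigned-for case -- handled
                                # the same way every single time

for trial in range(3):
    assert add_one("41") == 42      # same input, same result, every trial
    assert add_one("abc") is None   # even the "error" path is deterministic
```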
*Now* you might say, "that's a bogus restriction: saying that all the
variables must be the same in order for it to be guaranteed to act in the
same way". Well, that's what determinism is: given the same set of external
and internal variables, things will act in the same way. You might say "but
then, there's an agent of choice: the chemicals themselves 'choose' based
on the external input data!" Sorry, again. The chemicals just do what
they're supposed to do given a certain set of circumstances (external
variables). Thus, it's the world around you, consisting of external
variables input into the brain through sensory channels, that is the agent
of your free will. And that's not free will at all, is it?
--
AT THE TONE PLEASE LEAVE YOUR NAME AND NET ADDRESS.  THANK YOU.
Rich Rosen    pyuxn!rlr
esk@wucs.UUCP (Eric Kaylor) (10/05/84)
From: rlr@pyuxn.UUCP (Rich Rosen)
> [Some] have proposed the notion that the "mind" is not a physical
> or extraphysical entity, but is analogous to a program (software)
> running on the hardware (brain). Remember though, that programs exist as
> load modules (physical entities).

I would avoid saying that a program is a physical entity. It is more like
*the functioning process* in that entity. The program is identical with
certain mathematical properties of that process; it is not identical with
any particular physical entity. But I should say that I do not believe in
the "mind as program" view; my view is what is called in philosophy
"type-type materialism". See a recent issue of the journal *Synthese* for
a defense of type-type materialism.

>> Let me put it this way for you: the agent IS those chemicals;
>> or more precisely, is constituted by (or, if the "mind as program" view is
>> correct, is instantiated by) those chemicals. [PAUL TOREK]

> This would mean that a program "chooses" to do certain things based on the
> "current" state of its "chemicals" (e.g., the CPU instruction address
> counter, the data in "memory" and input through sensory channels).

YES: IF the "mind as program" view is true, then SOME programs (Douglas
Hofstadter says "sufficiently complex" programs) can choose.

> Can the program "choose" arbitrarily to do something (like set fire to the
> disk drive, or cause the computer to explode -- just like in the movies!!)

An arbitrary action is BY THAT VERY FACT not a choice. (Contrary to popular
opinion.)

> ... or can it only ...

ONLY?

> make the "decision" it is programmed to make based on the contents of its
> "chemicals".
> As you say, the agent of decision making in the brain is in
> fact the chemicals, and the successive current states of the chemicals
> themselves (the ones that "make the decisions" causing other chemical/
> physical actions in the body) are DETERMINED by the same physical laws
> that govern action in rocks and trees!

FINE. SO?????????????????? The operations of a human brain are, *viewed on
a sufficiently high level* (Hofstadter makes some excellent points on
distinguishing between levels), characteristically (one may hope) RATIONAL.
THAT MAKES THE DIFFERENCE. (See some of my earlier articles. Read Chin-Tai
Kim.)

>> ... if I feel like I'm in pain, I AM in pain. This has nothing to do with
>> the free will issue, but I felt it was worth picking this bone too.

> This is akin to saying "if enough people believe there is a god, there is
> one". Yes, feelings are decisive. But when you feel like you are in pain
> even if there is no direct physical cause ...

Then you are still in pain, even though you may be wrong about its nature
or origin. This is not so with a god. The analogy fails. Also, one can be
mistaken in *belief* about pain, but not in feeling it. The feeling just is.

> It's wrong to say "the subjective isn't real".

Yay, we agree! That's all I was trying to get at on this tangential issue.

>> To deny that there is an agent on the
>> ground that it is "just" a bunch of chemicals makes about as much sense as
>> denying that there is warmth in the room on the ground that there is "just"
>> a bunch of molecules moving around.

> Again, this is like saying that a program has free will.

See above.

> On the contrary,

No, it's not.

> barring system errors, a program's functions are deterministic.

Fine. So?

> ... You might say "but then, there's an agent of choice: the chemicals
> themselves 'choose' based on the external input data!"

EXACTAMOONDO! (--The Fonz)  BY GEORGE, I THINK HE'S GOT IT!

> Sorry, again.
> The chemicals just do what they're supposed to
> do given a certain set of circumstances (external variables).

I guess he doesn't got it. There's that word "just" again.

> Thus, it's the world around you, consisting of external variables input into
> the brain through sensory channels, that is the agent of your free will.

Non sequitur. Just because the chemicals give a specifiable output for any
input doesn't mean they aren't doing anything. It's a typical reductionist
fallacy: show that you can explain B in terms of A, and then pretend that B
is somehow less real or less important than A. It ain't so. Explaining
human behavior does not explain it away.
--
The aspiring iconoclast,
Paul V Torek, ihnp4!wucs!wucec1!pvt1047
Please send any mail directly to this address, not the sender's.  Thanks.
jim@ism780b.UUCP (10/10/84)
#R:pyuxn:-117300:ism780b:27500045:000:2543
ism780b!jim    Oct  8 15:17:00 1984

> This would mean that a program "chooses" to do certain things based on the
> "current" state of its "chemicals" (e.g., the CPU instruction address
> counter, the data in "memory" and input through sensory channels). Can the
> program "choose" arbitrarily to do something (like set fire to the disk
> drive, or cause the computer to explode -- just like in the movies!!) or
> can it only make the "decision" it is programmed to make based on the
> contents of its "chemicals". As you say, the agent of decision making in
> the brain is in fact the chemicals, and the successive current states of
> the chemicals themselves (the ones that "make the decisions" causing other
> chemical/physical actions in the body) are DETERMINED by the same physical
> laws that govern action in rocks and trees!

*Of course* the program can choose to set fire to the disk drive, if it has
the physical means to do it (can a robot "choose" to walk into a furnace?
Why not?). While they didn't have it right in the movies, your notion of a
computer as a purely ethereal thing is highly outmoded, and it leads you to
make these sorts of false distinctions.

And you keep presuming that determinism and free will are mutually
exclusive. It doesn't matter how many times you write it in all caps: it
still doesn't make the tautological fact that all occurrences in the
physical world are determined (ignoring random quantum events) by physical
laws *relevant* to questions of free will.

> It's wrong to say "the subjective isn't real".

Behaviorists say so. They say that only the physically observable
manifestations of the subjective are real. You are failing to distinguish
the two (if you say there really is no difference, you have verified my
claim).

> Again, this is like saying that a program has free will. On the contrary,
> barring system errors, a program's functions are deterministic.
Which of the following have free will:

	programs I write
	bacteria
	me
	programs written after ten thousand years of development of AI

How did you decide? Can't you see how silly you are being? You are begging
the question. Your premises:

1) All behavior is a result of physical processes.
2) All physical processes are determined.
3) Determinism is mutually exclusive with free will.

Conclusion: We have no free will.

Of course. Now justify premise 3. You won't be able to do it until you give
the terms formal definitions. But the definitions will refer to different
meta-levels; they don't talk about the same things.
--
Jim Balter (ima!jim)
jim@ism780b.UUCP (10/18/84)
> It's a typical reductionist
> fallacy: show that you can explain B in terms of A, and then pretend that
> B is somehow less real or less important than A. It ain't so. Explaining
> human behavior does not explain it away.

I think this is the crux. Reductionists don't seem to realize that
discussions about human behavior and discussions about chemicals are
discussions about *different things*, and the fact that one is composed of
the other does not mean they are discussions about *the same thing*. One
can discuss human behavior; one can discuss chemicals; one can discuss the
implications of our knowledge about chemicals on our knowledge about human
behavior, *and vice versa*; but it is a mistake to think that once we know
all about chemicals we know all about human behavior, just as it is a
mistake to think that once you know all about machine instruction sets you
know all about optimizing programs.

We talk about human behavior as opposed to chemicals because human behavior
is an extremely complex and non-obvious manifestation of chemical reactions
for us limited humans, and *because human behavior is interesting in its
own right*. No matter how much we understand about molecular behavior, we
will still talk about fluid mechanics as a separate subject. Just because
we can formulate the Peano axioms doesn't mean that the algebraic
topologists can all go home. Explaining the components does not explain the
whole, no matter how completely implicit the whole is in the components.

But as a reductionist, you won't see that the nature of human discourse is
independent of the mechanistic, hierarchic nature of the universe. And you
won't see that your willingness to accept the case of fluid mechanics or
algebraic topology but not human behavior is politically motivated.
Behaviorists, sociobiologists, libertarians, and free-marketeers cling to
their beliefs because those beliefs justify certain behaviors and policies,
not out of a neutral attempt to determine truth.
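[The levels point above -- warmth vs. molecules, fluid mechanics vs.
molecular physics -- can be shown with a toy calculation. Editorial
illustration with made-up numbers, not from the posts. The higher-level
quantity is fully determined by the lower level, yet the two descriptions
are about different things.]

```python
# Lower-level description: individual molecular speeds (hypothetical values).
molecular_speeds = [2.0, 3.0, 1.0, 4.0]

# Higher-level description: one number summarizing the whole ensemble.
# (Temperature is proportional to mean squared speed; constants omitted.)
mean_sq_speed = sum(v * v for v in molecular_speeds) / len(molecular_speeds)

# The "warmth" fact is completely implicit in the four trajectories, but it
# is a statement about the ensemble, not about any one molecule.
print(mean_sq_speed)  # prints 7.5
```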
Saying that human behavior is completely the result of chemical
interactions states a materialistic view that some idealist types might
have trouble with, but your average non-reductionist or non-behaviorist
would certainly not disagree with it, and to think otherwise is to set up a
straw man. But saying that human behavior is "just" the result of chemical
interactions carries with it the false implication that we are capable of
predicting human behavior given our knowledge of chemical reactions, or
that we will be able to completely understand human behavior given enough
knowledge about chemical reactions, which is stupid and arrogant.
--
Jim Balter (ima!jim)