throopw@agarn.dg.com (Wayne A. Throop) (03/21/89)
> harnad@elbereth.rutgers.edu (Stevan Harnad)
> If you examine the brain with a view to slicing off its "transducers"
> and "effectors," you come up against a problem, because even if you
> yank off the sensory surfaces, what is actually left over is repeated
> analog transforms of the sensory surfaces as you go deeper and deeper
> into the brain.

An interesting assertion.  It seems incorrect on two counts.  First is
the trivial one: the chemical reactions in the brain are, at base,
representable as discrete and symbolizable.  That is, there is a limit
to the "analogness" of the brain's representation of the world around
it.

Second, no case has been made for how much of the "analogness" of the
signal that makes its way to the brain is significant.  There is some
evidence that the "analogness" is, in fact, filtered out quite quickly,
and that what is left are symbolic representations of relationships
among various input stimuli.  In fact, it would be very, VERY surprising
if the analogness mattered, because the analogness that exists in human
neural systems is not accurate.  It seems plausible (and even likely)
that the "analogness" of signals within the brain is not a
representation of analog quantities in the "real world".

--
"Who would be fighting with the weather like this?"
"Only a lunatic."
"So you think D'Artagnan is involved?"
                --- Porthos, Athos, and Aramis
--
Wayne Throop      <the-known-world>!mcnc!rti!xyzzy!throopw
ray@bcsaic.UUCP (Ray Allis) (03/28/89)
> From: throopw@agarn.dg.com (Wayne A. Throop)
>
> > harnad@elbereth.rutgers.edu (Stevan Harnad)
> > If you examine the brain with a view to slicing off its "transducers"
> > and "effectors," you come up against a problem, because even if you
> > yank off the sensory surfaces, what is actually left over is repeated
> > analog transforms of the sensory surfaces as you go deeper and deeper
> > into the brain.
>
> An interesting assertion.  It seems incorrect on two counts.  First is
> the trivial one, that the chemical reactions in the brain are, at
> base, representable as discrete and symbolizable.

Perhaps, but can they be *replaced* by symbols?  I think not.

> That is, there is a
> limit to the "analogness" of the brain's representation of the world
> around it.
>
> Second, no case has been made for how much of the "analogness" of the
> signal that makes its way to the brain is significant.

I assert that the "analogness" is absolutely critical.  My case is based
on the fundamental difference between _representations_ and _symbols_:
the voltages, frequencies, chemical concentrations and so on are
_representations_ of "external reality" rather than symbols.  Symbols
appear at a much "higher" level of cognition, where _representations_
can be associated with each other.

> There is some
> evidence that the "analogness" is, in fact, filtered out quite
> quickly, and what is left are symbolic representations of
> relationships among various input stimuli.

What evidence?  Is there neurological evidence?  The neurological
research I've seen does seem to point to a series of analog transforms.
Do you have some references, please?

> In fact, it would be very,
> VERY surprising if the analogness mattered, because the analogness
> that exists in human neural systems is not accurate.

What value "accuracy"?  True, analog computers fell out of favor because
they didn't perform numerical computation to the same "accuracy" as
digital computers.
But we sometimes tend to gloss over the truth that any computation is
only as good as its input measurements, assumptions and premises.  A
digital computer is the archetypal physical symbol system; it
manipulates symbols according to specified relationships among them,
with absolute disregard for whatever they symbolize.  In contrast, your
nervous system's state at, say, the visual cortex *represents* the
effect your environment is having on your sensory equipment, with nary
a symbol to be found.

> It seems
> plausible (and even likely) that the "analogness" of signals within
> the brain is not a representation of analog quantities in the "real
> world".

I'm sorry, but I don't understand what you mean by this.

> Wayne Throop      <the-known-world>!mcnc!rti!xyzzy!throopw

Ray Allis
ray@atc.boeing.com
bcsaic!ray
ellis@unix.SRI.COM (Michael Ellis) (03/29/89)
> Wayne A. Throop
>> Stevan Harnad
>> If you examine the brain with a view to slicing off its "transducers"
>> and "effectors," you come up against a problem, because even if you
>> yank off the sensory surfaces, what is actually left over is repeated
>> analog transforms of the sensory surfaces as you go deeper and deeper
>> into the brain.

> ..First is the trivial one, that the chemical reactions in the brain
> are, at base, representable as discrete and symbolizable.  That is,
> there is a limit to the "analogness" of the brain's representation
> of the world around it.

This is exactly what you need to show.  I would consider it to be a
miracle if it just happened to turn out that way.  References?

> ..In fact, it would be very, VERY surprising if the analogness mattered,
> because the analogness that exists in human neural systems is not accurate.

The analogness of the brain is not accurate?  What does that mean?  Can
I infer that a digital technician would be a bit confounded by such
signals as are found in the brain?

> It seems plausible (and even likely) that the "analogness" of signals
> within the brain is not a representation of analog quantities in the
> "real world".

Grasping for straws.  Just who have you been reading?  Douglas
Hofstadter?

The brain is clearly analog.  What you *desperately* have to show us is
that it is "at base, representable as discrete".  You have only given us
a wish list of blanket assertions.

-michael
fransvo@htsa.uucp (Frans van Otten) (03/31/89)
Michael Ellis writes:

>Wayne A. Throop writes:
>
>>...First is the trivial one, that the chemical reactions in the brain
>>are, at base, representable as discrete and symbolizable.  That is,
>>there is a limit to the "analogness" of the brain's representation
>>of the world around it.
>
>>...It seems plausible (and even likely) that the "analogness" of signals
>>within the brain is not a representation of analog quantities in the
>>"real world".
>
>Grasping for straws.  Just who have you been reading?  Douglas Hofstadter?
>
>The brain is clearly analog.  What you *desperately* have to show us is that
>it is "at base, representable as discrete".  You have only given us a wish
>list of blanket assertions.

Let me translate this to the digital computer world.  The signals in a
digital computer are, at bottom, also analog.  By your reasoning we
would have to conclude "the entire computer is analog", just as you
conclude "the brain is analog".

Probably many analog signals in the brain are (directly or indirectly)
representations of analog quantities in the real world, though not
necessarily in the same form.  Likewise, the digital representation
(within a computer) of an originally analog signal is itself carried by
analog values, yet it can be represented as discrete and symbolized.

--
Frans van Otten
Algemene Hogeschool Amsterdam
Technische en Maritieme Faculteit
fransvo@htsa.uucp
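[Editorial aside: van Otten's point — that a digitized signal is
physically carried by analog quantities yet can be treated as a discrete
symbol — can be sketched with a toy analog-to-digital round trip.  The
values and the quantization step below are made up purely for
illustration; this is not a model of any real converter.]

```python
# Toy illustration: a continuous "analog" value is snapped onto a
# discrete grid; the resulting code is a symbol that can be copied
# and computed with exactly, losing at most half a step of precision.

def quantize(voltage, step=0.01):
    """Map a continuous voltage onto the nearest discrete code."""
    return round(voltage / step)

def reconstruct(code, step=0.01):
    """Map a discrete code back to a representative voltage."""
    return code * step

v_in = 0.73824                 # a continuous "real world" quantity
code = quantize(v_in)          # the discrete, symbolizable stand-in
v_out = reconstruct(code)

# The symbol loses at most half a quantization step of information...
assert abs(v_in - v_out) <= 0.005
# ...and every further operation on the code itself is exact:
assert quantize(reconstruct(code)) == code
```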
throopw@bert.dg.com (Wayne A. Throop) (04/10/89)
> ray@bcsaic.UUCP (Ray Allis)
>> throopw@agarn.dg.com (Wayne A. Throop)
>> no case has been made for how much of the "analogness" of the
>> signal that makes its way to the brain is significant.

> I assert that the "analogness" is absolutely critical.  My case is based on
> the fundamental difference between _representations_ and _symbols_:
> the voltages, frequencies, chemical concentrations and so on are
> _representations_ of "external reality" rather than symbols.

This seems not to be the case, especially in the visual cortex.  The
firing of a given neuron does not *represent* "a vertical line so long"
in any discernible way, and yet this happens at quite a low level of
processing.  Even the processing that winds up calculating "color" is
not a very simple "representation" of the extent of stimulation of the
three varieties of color-sensitive sensors in the retina... if one could
even warp "representation" to cover it at all.

The point is that even at rather low, automatic levels of processing in
the brain, features of the percept are calculated that are very, very
symbol-like, and not very representation-like at all.  Thus, as I
started out, I'd say that this

> Symbols appear
> at a much "higher" level of cognition, where _representations_ can be
> associated with each other.

seems an interesting theory... but not to be the case.

> What evidence?  Is there neurological evidence?  The neurological research
> I've seen does seem to point to a series of analog transforms.  Do you have
> some references, please?

I don't have the original research referents handy.  They were given in
the credits of the episode of "The Brain" on PBS, and in the various
accounts of this research as reported in Science News and the like.

--
Indeed he knows not how to know who knows not how to un-know.
                --- Sir Richard Francis Burton
--
Wayne Throop      <the-known-world>!mcnc!rti!xyzzy!throopw
throopw@bert.dg.com (Wayne A. Throop) (04/10/89)
> ellis@unix.SRI.COM (Michael Ellis)
>> Wayne A. Throop
>>> Stevan Harnad
>>> If you examine the brain with a view to slicing off its "transducers"
>>> and "effectors," you [..can't do it because of the brain's "analogness"..]
>> [..This seems wrong, because first..] the chemical reactions in the brain
>> are, at base, representable as discrete and symbolizable.  That is,
>> there is a limit to the "analogness" of the brain's representation
>> of the world around it.

> This is exactly what you need to show.  I would consider it to be a
> miracle if it just happened to turn out that way.  References?

Harnad claims that the brain is special because, as one proceeds from
sensor, through processing, to effector, only analog transforms of
other signals are created.  I presented two plausible scenarios under
which his claim is incorrect or moot.  So, no, I don't need to "show"
that my claims are true, merely that Harnad's claims are not the only
plausible ones.

My first proposed scenario is that of considering chemistry to be the
interaction of finite numbers of discrete particles in rather stylized
ways.  Naturally, this is not literally true, but it seems likely that
in the cases where it is not true, the answers differ from the "real"
answers far less than the system can plausibly be sensitive to.  If,
for example, it mattered in any practical sense that one million
photons struck sensors instead of one million and one, the brain would
be reacting at a level that is non-adaptive.  It would be a "miracle"
if the brain *didn't* filter out analog variance of that small degree.

In this trivial sense, given the infinitesimal size (and hence small
degree of discretization error) of atoms, it seems quite plausible that
there is SOME level at which the brain can be digitized and treated as
discrete symbols without losing anything that matters.  Note that I'm
NOT saying that indeterminacy is irrelevant, and I'm not saying that
analog effects are irrelevant.
I'm saying that the argument that what is special about the brain is
its "analog" nature is flawed, because analogness can in principle be
approximated arbitrarily well by discrete processes: simply pick the
point at which the accuracy ceases to be relevant, and substitute
discrete processes whose accuracy is also in this range.  Again, this
is NOT a general argument FOR anything (let alone a particular brain
structure); it merely points out that "analogness is what makes a
brain" is flawed in this trivial way.

>> ..In fact, it would be very, VERY surprising if the analogness mattered,
>> because the analogness that exists in human neural systems is not accurate.

> The analogness of the brain is not accurate?  What does that mean?
> Can I infer that a digital technician would be a bit confounded
> by such signals as are found in the brain?

No, I mean that sensors respond differently to identical stimuli under
different circumstances.  This was intended as supporting material to
show that there is some limit below which the "analogness" no longer
matters, because it MUST be ignored by the brain: there is noise in the
system that would swamp it in any event, far before one part in 10^24
per gram or so.

>> It seems plausible (and even likely) that the "analogness" of signals
>> within the brain is not a representation of analog quantities in the
>> "real world".

> Grasping for straws.  Just who have you been reading?  Douglas Hofstadter?

No, not Hofstadter.  The second and more fundamental scenario I'm
talking about is that the level at which "analogness" ceases to matter
is quite a high level.  This is based on simple research into the
meaning of various signals in the visual cortex, for example, as
presented on "The Brain", and reported in Science News and other
sources widely available to the lay public.
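[Editorial aside: the noise argument — that quantizing below a sensor's
own noise floor discards nothing the system could ever use — can be
sketched numerically.  The noise level and quantization step here are
invented for the sketch; they are not claims about actual neural noise.]

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

NOISE_SD = 0.05   # assumed sensor noise (standard deviation) -- made up
STEP     = 0.001  # quantization step, well below the noise floor

def noisy_reading(true_value):
    """A sensor that responds differently to identical stimuli."""
    return true_value + random.gauss(0.0, NOISE_SD)

def discretized(reading):
    """The same reading, snapped onto a discrete grid."""
    return round(reading / STEP) * STEP

true_value = 1.0
readings = [noisy_reading(true_value) for _ in range(10_000)]

# Error introduced by the sensor's own noise vs. by discretization:
noise_error = max(abs(r - true_value) for r in readings)
quant_error = max(abs(discretized(r) - r) for r in readings)

assert quant_error <= STEP / 2 + 1e-12   # bounded by half a step
assert quant_error < noise_error / 10    # swamped by the sensor's noise
```

Under these (made-up) numbers, the worst-case discretization error is at
least an order of magnitude smaller than the variation the sensor itself
introduces, which is the sense in which the discrete substitute "loses
nothing that matters".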
Quite soon in processing, signals at single neurons are unrelated in
strength to any particular signal in the outside world; instead they
either fire or do not according to a rather high-level criterion, and
in a very binary (discrete) fashion.  Therefore Michael's argument that

> The brain is clearly analog.

is moot, because computer circuitry is also clearly analog if looked at
closely enough.  You could argue that there is something fundamental
about the fact that voltages change from state to state along
continuous curves, and about the fact that THIS transistor is holding
its output line at .4 volts while THAT one is at .45 volts.  But if the
outputs of certain transistors in the computer are discovered to have
clusters of output states that seem to be related to discoverable
binary assertions about the world, this might well lead one to suppose
that all that analog stuff measured earlier isn't as important as it
seemed.

> What you *desperately* have to show us
> is that it is "at base, representable as discrete".  You have only
> given us a wish list of blanket assertions.

I'm sorry, but that's just not the case.  It is Searle and Harnad who
claim to have "proven" something, and hence need to "show" that the
brain is definitely analog in some important way, when this really
seems not to be the case.  The "blanket assertions" I give are really
quite modest, and largely consist of pointing out plausible alternative
scenarios that the mystic-causal-powers crowd simply have not ruled
out, NOT of pronouncing gospel truth.

I'll state it again: I'm not particularly in the camp of the dogmatic
strong AIers.  I'm not claiming to know what subjective experience IS,
or what it means to have a mind, or why neurons fire at different
levels at different times.  I merely find it premature to leap to the
conclusion that subjective experience, minds, and suchlike things
require things which -- what shall I call them -- "typographic engines"
do not have.
The cases of Searle, Lucas, et al. simply have not been convincingly
made.

--
Indeed he knows not how to know who knows not how to un-know.
                --- Sir Richard Francis Burton
--
Wayne Throop      <the-known-world>!mcnc!rti!xyzzy!throopw
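[Editorial aside: Throop's picture of early visual processing — analog
inputs in, a binary verdict out, where the verdict encodes a
relationship among stimuli rather than a copy of any one quantity —
can be caricatured in a few lines.  This is a deliberately crude
sketch with an invented threshold, nothing like a real neuron model.]

```python
def edge_detector(left_intensity, right_intensity, threshold=0.3):
    """Fires iff the analog contrast across a receptive field is large.

    The inputs are continuous; the output is binary, and what it
    asserts ("there is an edge here") is a relationship among the
    stimuli, not a representation of either analog quantity alone.
    """
    return abs(left_intensity - right_intensity) > threshold

# Continuous inputs, discrete verdicts:
assert edge_detector(0.90, 0.20)      # strong contrast -> fires
assert not edge_detector(0.52, 0.48)  # near-uniform field -> silent

# Small analog perturbations of the input leave the verdict unchanged,
# which is the sense in which residual "analogness" ceases to matter:
assert edge_detector(0.90001, 0.20002) == edge_detector(0.90, 0.20)
```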