[net.followup] More on Searle, et al.

tombl (12/03/82)

I really do think Searle has gotten more mileage out of this than
he deserves. As far as I am concerned, his last real contribution
was his work on speech acts.

I recently had the chance to see Searle present the paper several people
have mentioned. A few interesting notes:

	- he has backed off from some of the more definitive (read:
	  uninformed) claims that originally appeared in the Behavioral
	  and Brain Sciences (or whatever it is) article;

	- his presentation was very lively and entertaining -- in
	  fact it was a great deal more entertaining than it was
	  substantive;

	- Searle doesn't make one argument, he makes several. He never
	  defines the term "meaning" when he uses it, as in the emotive
	  meaning of thought (what do the words "Ayatollah Khomeini"
	  really "mean" to you?); he simply expects the listener to
	  grant it as commonly understood. He accuses AI researchers
	  of ignoring this issue; in fact, any logical positivist in
	  his own (Philosophy) department would deny that there is any
	  such issue to address.

	- He claims that semantics (equivalently, the "meaning" of words)
	  cannot be embedded in a formal system. At least he is
	  willing to admit that he cannot prove this!

	- He makes an analogy between the chemistry of water and the
	  structure of intelligence. Specifically, he admits that
	  AI researchers are able to manipulate programs and properties
	  of a system at the level corresponding to the molecular
	  structure of water. The intelligent behavior we observe,
	  a sequence of "mental states", corresponds to the macroscopic
	  properties of water.  (Incidentally, he spoke throughout as
	  if these were conscious mental states; when I asked him about
	  that, he immediately backed off.)  Searle maintains that the
	  link between the two levels (and he does acknowledge that the
	  lower level has a causal relationship to the upper) is its
	  embedding in the mind (whatever, dear heaven, that is),
	  and that it certainly cannot be embedded in a silicon chip.
	  He obviously doesn't know much about statistical mechanics
	  either, or he would see the gaping hole in his own analogy.

	- Searle thinks his dog is intelligent. I can only hope he
	  lives to become a victim of the Turing test.

Generally speaking, I think the man suffers from the Weizenbaum
Syndrome (guess what that is), and it is too bad he could not
either keep up with the pace of his own profession, if that is
the problem, or find an easy early out from what is not only
a faulty but a markedly unconvincing argument.

	Tom Blenko
	decvax!teklabs!tekmdp!tombl
	ucbvax!teklabs!tekmdp!tombl
	tombl.tektronix@rand-relay