[comp.ai.digest] Nanotechnology, Science Priorities

lewis@smu.UUCP (Eve Lewis) (02/22/88)

Urgent  insert:  On the morning of Friday, 12 February 1988, I heard
the  following NPR item:- The National Research Council has issued a
report  calling  for  a  major shift in biology research. The report
says the Federal Government should spend $200 million  a  year,  for
fifteen  years, on a single genetic project. NPR's Science reporter,
Laurie Garrett, has more:

Garrett:  For  several  years,  some  powerful  scientists have been
promoting the largest biological  research  effort  ever  attempted.
They  want  to  decipher the code of over three billion genetic mes-
sages stored in human chromosomes. The National Research Council en-
dorsed the project yesterday, but did not say which  Federal  agency
should  control  it.  Last night, Harvard University Nobel Laureate,
David Baltimore,  addressed  the  Annual  Meeting  of  the  American
Association for the Advancement of Science. Despite his leading role
in  genetics  and biotechnology research, Baltimore came out against
"The Genome Project."

Baltimore: When I walk around and ask people, "Do you feel  in  your
own  research,  that what's holding you up, is a lack of sequence of
the human genome?" I have yet to meet anybody who says, "Yes."

Garrett: Baltimore says Federal money and research efforts would  be
better spent on a massive, co-ordinated attack on the AIDS virus. In
Boston, I'm Laurie Garrett.

     End of this NPR transcript re: Baltimore's horrendous opinion.

1)  Do  researchers always know precisely what is "holding up" their
research, or how said research would  be  spurred,  stimulated,  and
aided by such an important store of data?

2)  From Robert Kanigel's book, "Apprentice to Genius," which I just
finished reading, and which discusses James Shannon of NIH in terms
that I believe support my view that Baltimore's advice is a mistake:

"Shannon's  task  was to align NIH's disease-oriented structure with
the needs of basic research. The strategy he advanced all the  years
of  his  tenure as director was: Don't embark on a narrow search for
disease cures at all." and,

"`Knowledge of life processes and of phenomena underlying health and
disease  is  still grossly inadequate,' he would write. Without such
knowledge, it was a waste of time, money, and manpower  to  aim  for
the solution of a specific medical problem. He blamed the failure of
polio  vaccines  back in the 1930s on lack of knowledge of the polio
virus and techniques needed to culture it. He pulled the  plug  from
an artificial heart program already approved because he didn't think
cardiac functioning was well enough understood.

"He  didn't  like  the  term basic research; he preferred calling it
fundamental. But in the end it was the same. As he put it in an  ar-
ticle he coauthored for "Science" soon after becoming director, "The
potential  relevance  of  research to any disease category is [best]
defined in terms of long-range possibilities and  not  in  terms  of
work  directed  toward  the quick solution of problems obviously and
solely related to a given disease." Additionally, re:

Ca. 1955, Dr. Seymour Kety, Director of Scientific Research for NIMH:

"In the long run, basic science would gain, leading to clinical  ad-
vances more abundant than if they'd been pursued directly.

"Some  years  later,  two  clinical investigators, Julius Comroe and
Robert Dripps, lent analytical force to Kety's  intuition.  The  two
undertook  to examine the origins of the ten most important clinical
advances in heart and lung medicine and  surgery  of  the  preceding
thirty years. They tracked down 529 scientific articles that had, in
retrospect,  proven  crucial  to  those clinical success stories. Of
them, Comroe wrote, fully forty-one percent `reported work that,  at
the  time  it was done, had no relation whatever to the disease that
it later helped to prevent, diagnose, treat, or alleviate.' Penicil-
lin, the anticoagulant heparin, and the  class  of  drugs  known  as
beta-blockers were among them."

Now,  I refer to the above, because I believe that if David "reverse
transcriptase" Baltimore's advice were followed,  it  would  deprive
the AI people and the neuroscience people (let's hope that some have
a  foot  in  each camp) of the BIOLOGICAL COMPILER IN THE NEURONS OF
THE HUMAN BRAIN, to wit: the "reverse transcriptase"  implicated  in
mental  function.  That is why it is too ironic for words, that Bal-
timore should come out with such an  opinion,  in  addition  to  its
being pathetic that someone of his calibre should "think small." One
wonders what interests have gotten to him.

My article continues:

On  page A1 of the New York Times, 20 March 1987, there was a report
of a meeting called by the American Physical Society.  I  tell  you,
these  people were chortling in a state of absolute mania, in regard
to the discoveries in "superconductivity."  The  head:  "Discoveries
Bring a `Woodstock' for Physics." Byline: By James Gleick.

Was I jealous? Was I envious? I tell you that I was totally sick, to
the  max.  But  I  also tell you that less than one year later, I am
beginning to feel rather good, about AI (artificial intelligence, as
we all know) and about NI (natural intelligence).

Interdigitation between these two  disciplines  will  forge  an  ab-
solutely  unbreakable  bond,  and after a struggle of two and a half
millennia, WE will chortle at our own "Woodstock." I can see The New
York  Times  article's  banner  headline  in  my  "mind's eye," now:
"DISCOVERIES BRING A `WOODSTOCK'  FOR  NEUROSCIENCE  AND  ARTIFICIAL
INTELLIGENCE." (We will also be able to define in neurophysiological
and molecular biological terms, just what is that "mind's eye.")

Now, re: the Drexler concept:  Godden  <GODDEN%gmr.com@RELAY.CS.NET>
in an article, "Intelligent Nanocomputers," dated Friday, 15 January
1988 @ 09:46 EST, discusses K. Eric Drexler's "Engines of Creation."
What he explores particularly in his review is the chapter on AI and
nanocomputers.  "Drexler  makes the fascinating claim (no doubt many
will vehemently disagree) that to create a  true  artificial  intel-
ligence  it  is  not necessary to first understand intelligence. All
one has to do is  simulate  the  brain,  which  can  be  done  given
nanotechnology.  He  suggests that a complete hardware simulation of
the    brain    can     be     done,     synapse-for-synapse     and
dendrite-for-dendrite,  in  the  space of one cubic centimeter (this
figure is backed up in the notes)." 
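
As a back-of-envelope check on that cubic-centimeter figure - my own
hedged arithmetic, using the neuron and synapse counts cited later in
this article rather than Drexler's notes - one can ask how much volume
each simulated synapse would get. A rough sketch in Python:

    # Hypothetical sanity check of the one-cubic-centimeter claim.
    # Assumed figures: ~1e12 neurons, ~1e4 connections each (this
    # article's numbers, not Drexler's).
    neurons = 1e12
    synapses_per_neuron = 1e4
    total_synapses = neurons * synapses_per_neuron       # ~1e16

    nm3_per_cm3 = (1e7) ** 3      # 1 cm = 1e7 nm, so 1 cm^3 = 1e21 nm^3
    nm3_per_synapse = nm3_per_cm3 / total_synapses

    print(f"synapses to simulate      : {total_synapses:.1e}")
    print(f"volume budget per synapse : {nm3_per_synapse:.1e} nm^3")
    print(f"cube of about {nm3_per_synapse ** (1/3):.0f} nm on a side")

On those assumptions each simulated synapse gets roughly a 46-nanometer
cube, which is at least the right scale for molecular machinery, so the
claim is not absurd on its face.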

However,  the  real  "Engine  of  Creation"  is,  in  actuality,   a
NANO-NANOTECHNOLOGY. It would be most resourceful, and productive to
access   the   BIOLOGICAL   NANO-NANOTECHNOLOGY,   and   with   this
inspiration,"  with   this   wealth   of   clues,   create   an   AI
NANO-NANOTECHNOLOGY.

Biological Nano-Nanotechnology IS Molecular Biology. But take heart,
because there is no conflict between Biological  Nanotechnology  and
Biological  Nano-Nanotechnology.  Indeed,  the  latter is the raison
d'etre, the vis a tergo, the provider of  templates,  the  sine  qua
non,  for  the  former,  which provides the model, and the source of
reference  for Drexler's "Engine of Creation," which is - one judges
from Godden's review - an isomorph of the human brain.

I  only  recommend that AI nanotechnologists avail themselves of the
wealth   of   experimentation   and   information   in    Biological
Nano-Nanotechnology.  Every  structure  in  the  human body, not ex-
cepting, for sure, the human brain, drags  with  it  a  phylogenetic
residue, from further back in time than perhaps we care to remember,
a  DNA  riddled  with  karma,  "sins,"  choices made at forks in the
evolutionary road, even choices not made at  such  forks,  and  all,
rattling  like  the  chains of Marley's ghost. We may be scared, in-
hibited by built-in protective mechanisms, which is why we have not
yet solved "the biggie." I here refer to precisely what is the human
mind, in neurophysiological and molecular biological terms.

The structural genes of the human genome determine the morphology of
the  human  brain,  not to neglect the neurotransmitters, receptors,
etc. In my view, the human genome is the engineer that  masterminded
the "Engine of Creation." There is no antithesis implicated; indeed,
after  swallowing  the  theories  of  Galileo, Darwin and Freud, the
species (ours) had best prepare itself for another  humongous  gulp,
when  the genetic constraints on human thought are revealed and sub-
stantiated.

Godden's review of the Drexler book apparently elicited  some  reac-
tion.  I  refer  specifically  to the article dated 01 Feb 88 (11:53
PST),  by  John  McCarthy  <JMC@SAIL.Stanford.EDU>,  and  that  from
Wednesday,   03   Feb   1988   (01:25  EST),  by  Marvin  L.  Minsky
<MINSKY%OZ.AI.MIT.EDU@XX.LCS.MIT.EDU>.

Dr. Minsky is optimistic in the extreme, as I am:

"Progress  in  this  direction  certainly  seems  faster than almost
everyone would have expected. I will make a prediction: In the  next
few  years,  various  projects will request and obtain large budgets
for the "human genome sequencing" enterprise. In the meantime, some-
one will succeed in stretching single strands of  protein,  DNA,  or
RNA  across  crystalline  surfaces, and sequence them, using the STM
method. Eventually, it should become feasible to do such  sequencing
at  multi-kilocycle  rates,  so  that  an entire chromosome could be
logged in a few days."

That is why Dr. Baltimore's advice is so alarming. With his prestige
and influence, whom else will he recruit for that bandwagon?

John  McCarthy  (01  Feb 88) describes two extreme approaches to AI,
the "Logic Approach," which is his preference:

"Understand the common sense world well enough to express in a suit-
able logical language the facts known to a person. Also express  the
reasoning methods as some kind of generalized logical inference."

and the "Instrumental Approach," which McCarthy finds less useful:

"Using  nano-technology  to  make  an instrumented person. (This ap-
proach was suggested by Drexler's book and by Eve Lewis's commentary
in AILIST. It may even be what she is suggesting)."
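
Before turning to McCarthy's objections, a toy sketch may make the
first, "Logic Approach" concrete. This is my own illustration, not
McCarthy's code: a couple of common-sense facts plus forward chaining
standing in for "generalized logical inference":

    # Toy sketch of the "Logic Approach" (illustrative; the facts and
    # rules are made up).  Facts are (predicate, object) pairs; each
    # rule says "everything with the first predicate also has the
    # second"; forward chaining applies the rules until nothing new
    # can be derived.
    facts = {("human", "socrates"), ("human", "hypatia")}
    rules = [
        ("human", "mortal"),            # every human is mortal
        ("mortal", "dies_eventually"),  # every mortal dies eventually
    ]

    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            for pred, obj in list(facts):
                if pred == premise and (conclusion, obj) not in facts:
                    facts.add((conclusion, obj))
                    changed = True

    print(sorted(facts))

The chaining mechanism itself is trivial; the difficulty McCarthy is
pointing at lies in expressing the common-sense facts at all.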

McCarthy  points  out  certain  problems  with   the   "Instrumental
Approach,"  which  on a Nanotechnological (Drexler) level would be a
recruitment of a functional isomorph of the human brain,  and  on  a
more  intricate,  biologically  more "basic," Nano-Nanotechnological
level,  would  involve a functioning human genomime ["genomime" is a
word I coined for the genomic equivalent of the brain isomorph].

He says "Sequence the human genome. Which one?  Mostly  they're  the
same, but let the researcher sequence his own." 

Dr. McCarthy is overlooking, in the most anti-serendipitous manner,
what neuroscience, embryology, and most particularly PHYLOGENY (which
he did not mention) have to offer. Just for starters, the evolution
of the genome, pari passu with the phylogenesis of the pineal organ
and the phylogenesis of the inner ear, as traced throughout the
vertebrate phylum, would give artificial intelligence such a wealth
of data, such an embarrassment of riches, that it would hardly know
what to do with it.

This  article  is  fast  becoming a "gontzeh megillah," and I really
must wind it up, but when McCarthy refers to "the human genome," and
says, "which one?" that is enough to get me started again. As far as
feeding "culturgens"  (E.O.  Wilson's  term)  into  the  super-duper
parallel  computer,  does  McCarthy really believe that there are no
racial differences, no sexual differences, not to mention individual
differences, in human brains? Does  he  think  that  it's  "cultural
pressure"  that  accounts for the Oriental reverence for the dragon,
and for the Occidental St. George, slayer of same? 

"Evolution keeps going, and even if we don't do anything artificial,
we won't be the same in ten million years." - Marvin L. Minsky

Meanwhile, before McCarthy abandons the "Engine  of  Creation,"  en-
tirely,  I suggest that he check out the "Sexually Dimorphic Nucleus
of the Hypothalamus," S.D.N., for short - just for starters.

McCarthy states: "However, experience since the 1950s shows that  AI
is a difficult problem, and it is very likely that fully understand-
ing  intelligence  may  take  of  the  order  of  a  hundred  years.
Therefore, the winning approach is likely to be tens of years  ahead
of the also-rans."

In fact, we are not referring just to the experience of the last few
decades, but that of the last two and a half millennia.

If you would read the "Works of Plato,"  specifically  the  Socratic
dialogues in "Phaedrus" and "Theatetus" and compare the Dialogues to
Minsky's "Society of Mind," or to Lopate's interview with him, or to
Sir  Francis  Crick's  seminar  on  "The  Impact  of Biochemistry on
Neurobiology," at Cornell University on 6 May 1986, or  to  Jonathan
Winson's  "Brain  and Psyche," or Michael S. Gazzaniga's "The Social
Brain," or J.Z. Young's "Programs of the Brain," or the  other  per-
ceptive  contributions  in  this  area,  then you would have to ack-
nowledge that as far as the struggle to comprehend  memory,  or  in-
ternal  representation,  or  vision, or "How We Know Universals," is
concerned, we would have to paraphrase the  Queen  in  "Through  the
Looking-Glass," and admit that it's taken all the running we can do,
just to keep in the same place! 

Nonetheless,  solving this problem is do-able, and we are better off
with the information about the trillion  neurons,  with  the  10,000
connections  each,  and the plethora of neurotransmitters and recep-
tors. And we will be better off yet with sequencing the human genome
with the three billion base pairs, and the introns  and  the  exons,
and the promoters and the enhancers and the repressors and even
the "junk" DNA. Maybe, especially the "junk" DNA.

Surely, we will have come up with a robust theory, or  theories,  to
interpret  all  that data, but let it be there to interpret, just as
we have to make sense out of all the neural pathways and connections
and peptides and receptors.

We  don't  need  any  "bottom-line," pessimistic, "applied research"
types shoving us off that track, or plunging us back into  the  dark
ages.  So much for Baltimore's suggestion. Keep the faith!