[sci.nanotech] FI Update # 10

nanotech@cs.rutgers.edu (12/08/90)

+---------------------------------------------------------------------+
|  The following material is reprinted *with permission* from the     |
|  Foresight Update No 10, 11/15/90.                                  |
|  Copyright (c) 1990 The Foresight Institute.  All rights reserved.  |
+---------------------------------------------------------------------+

Nanotechnology Progress:
Evolution in a Drum
by K. Eric Drexler

Advanced molecular nanotechnology will use strong, rigid molecular
components. These can be assembled to form molecular systems much like
the machinery found in manufacturing plants today--far from identical,
yet much more similar than one might naively expect. Machines like
these will typically be straightforward to design, confronting the
engineer with only the familiar stubbornness of physical devices. In
taking the early steps toward this technology, however, the problems
are messier. If they weren't, we'd be there by now.

Natural molecular machines are made from proteins and nucleic acids,
long molecular chains which can fold to form knobby, elastic objects.
Protein design has made great strides in the last ten years, but
remains difficult. Yet how can we copy nature's molecular machines
without mastering the art of protein design?

One answer is to copy nature further, by replacing (or at least
supplementing) design with evolution. Here, technologies have made
great strides in the last ten months.

Evolution works through the variation and selection of replicators. It
works most powerfully when the best results of one round of selection
can be replicated, varied, and selected again. Molecules lend
themselves to evolutionary improvement because they are cheap to make
and handle: a cubic centimeter can hold well over 10^16 protein
molecules. With so many variations, even one round of selection can
often find molecules that behave as desired. Biomolecules, in
particular, lend themselves to evolutionary improvement because they
can be made by bioreplicators: a cubic centimeter can hold well over
10^10 bacterial cells, programmed by genetic engineering techniques
to make on the order of 10^10 variations on a chosen molecular theme.
All the techniques developed so far produce molecules having a
property useful from a nanotechnological perspective: they are
selected to stick to another, pre-selected molecule.
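These counts are easy to sanity-check. A minimal sketch in Python,
using assumed typical values (a modest 1 mM protein concentration,
~1 um^3 bacteria filling 1% of the volume), none of which come from
the article:

```python
# Sanity check of the counts quoted in the text, under assumed
# (not article-supplied) typical lab values.
AVOGADRO = 6.022e23

# Protein molecules: a 1 mM solution (about 30 g/L for a ~30 kDa protein).
conc_mol_per_L = 1e-3
molecules_per_cm3 = conc_mol_per_L * AVOGADRO / 1000  # 1 L = 1000 cm^3
print(f"protein molecules per cm^3: {molecules_per_cm3:.1e}")

# Bacterial cells: ~1 um^3 each, filling 1% of the volume.
cell_volume_cm3 = 1e-12
packing_fraction = 0.01
cells_per_cm3 = packing_fraction / cell_volume_cm3
print(f"bacterial cells per cm^3: {cells_per_cm3:.1e}")
```

Both results comfortably exceed the 10^16 and 10^10 figures quoted
above, so the article's numbers are conservative.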

An earlier issue of Update reported on work by Huse et al. [Science,
246:1275, 1989] in which bacteria were used to make ~10^7
different antibody fragments. Selection in this system involves
growing a film of bacteria, infecting them with phage particles
bearing genes for the fragments, and sampling from areas where a
labeled molecule is observed to be bound. In only two weeks of work,
this procedure generated many different antibody fragments able to
bind a predesignated small molecule. Standard techniques can be used
to fine-tune protein molecules like these by further variation and
selection.

This approach, as presently implemented, uses human eyes and hands to
do the selection. Two more recent approaches do not.

Scott and Smith [Science, 249:386, 1990] have made many millions of
phage particles having surface proteins with different short peptide
chains dangling from them. These phage particles can be poured through
an affinity purification column, a tube filled with a porous medium
having molecules attached to it which in turn will be sticky for some
complementary sub-population of peptide chains. The phage particles
which display such chains don't wash away with the rest; they can be
recovered and allowed to multiply in a bacterial culture. Again,
further rounds of variation and selection are feasible, if there is
room for improvement in molecular stickiness. Scott and Smith rapidly
found novel peptides that bound to their molecule of choice.

Tuerk and Gold [Science, 249:505, 1990] have developed a procedure
they term systematic evolution of ligands by exponential enrichment
(SELEX). They make a
diverse population of RNA molecules, then use an affinity column to
select molecules that bind (at least weakly) to a target molecule.
Those that bind are recovered and enzymatically replicated via reverse
transcription to DNA. The result after four rounds was a population of
RNA molecules with strong, selective binding to the target molecules.
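The exponential enrichment that gives SELEX its name can be
illustrated with a toy simulation. Everything numerical below is
invented for illustration (the pool size, the binding probabilities,
the one-in-a-thousand frequency of good binders); only the
select-then-amplify loop reflects the procedure described above:

```python
import random

random.seed(0)
POOL = 100_000  # molecules per round (toy number)

def new_pool():
    # A rare fraction of random sequences happen to bind well.
    return [0.9 if random.random() < 0.001 else 0.01
            for _ in range(POOL)]

def selex_round(pool):
    # Affinity selection: each molecule sticks with its own probability.
    bound = [p for p in pool if random.random() < p]
    # Amplification: replicate the survivors back up to pool size.
    return [random.choice(bound) for _ in range(POOL)]

pool = new_pool()
for rnd in range(4):  # four rounds, as in the article
    pool = selex_round(pool)
    strong = sum(1 for p in pool if p > 0.5) / POOL
    print(f"round {rnd + 1}: strong binders = {strong:.0%}")
```

With these numbers the strong binders go from 0.1% of the pool to
nearly all of it within the four rounds -- even weak selection,
compounded by replication, converges quickly.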

At the end of their article, they suggest that the same basic approach
be applied to the variation and selection of protein molecules: it is
well known that many proteins fold and achieve function while still
being manufactured by--and hence attached to--a ribosome, which is in
turn still attached to the RNA molecule which encodes the structure of
the protein. By applying SELEX methods to this protein-translation
complex, the selective stickiness of the protein molecules can be used
to recover the RNA replicators needed for further evolution (and
eventual production). The article ends with the observation that these
methods could be used to generate "nucleic acids and proteins with any
number of targeted functions."

What does this mean for molecular systems engineering on the path to
molecular nanotechnology? To build molecular devices, or to attach
molecular tools to atomic force microscope systems, researchers will
find it useful to make molecules that spontaneously adhere in a
planned manner. This is the basic requirement for molecular
self-assembly. The growing ability to package the evolutionary process
and use it as a reliable, convenient tool may substantially accelerate
progress in molecular systems engineering.

	*	*	*	*	*	*	*	*

Recent Progress
Steps Toward Nanotechnology
by Russell Mills

Chymotrypsin-like enzyme synthesized

Chemists at the University of Colorado have designed and made a
modest-size peptide molecule called "CHZ-1" that imitates the activity
of chymotrypsin. (Chymotrypsin is an enzyme that cleaves bonds on the
acid side of the amino acids phenylalanine, tyrosine, and tryptophan.)
This is the first report of a catalytically active peptide having gone
all the way from de novo design to functioning molecule.

While the activities of CHZ-1 and chymotrypsin are similar, their
structures have almost nothing in common except at the active site
where substrate molecules are bound and transformed. In chymotrypsin
the actual work of catalysis is carried out by a particular
configuration of three amino acids: histidine, serine, and aspartate.
One of the main tasks of the rest of the enzyme is to maintain these
amino acids in their relative positions. CHZ-1 was designed with the
same three amino acids held in a similar configuration.

CHZ-1 is much smaller than chymotrypsin: 73 amino acids versus 245.
The two catalysts have many substrates in common; for these, CHZ-1
cleaves bonds at about 1% of the rate of chymotrypsin -- a
100,000-fold acceleration over the background rate. Heat tolerance of
CHZ-1 is
greater than that of chymotrypsin, but this difference may be
attributable to the chemists' employment of a non-standard amino acid
and several non-peptide bonds to hold the molecule together. These
would not be found in an enzyme of biological origin.
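As a consistency check, note what these figures imply together: if
CHZ-1 runs at about 1% of chymotrypsin's rate and is still a
100,000-fold acceleration over the uncatalyzed reaction, then
chymotrypsin itself must accelerate the reaction roughly 10^7-fold.
The arithmetic:

```python
# Arithmetic implied by the figures in the text (no new data).
chz1_vs_chymotrypsin = 0.01   # CHZ-1 works at ~1% of chymotrypsin's rate
chz1_vs_background = 1e5      # ...yet is 100,000x the uncatalyzed rate
chymotrypsin_vs_background = chz1_vs_background / chz1_vs_chymotrypsin
print(f"implied acceleration by chymotrypsin: "
      f"{chymotrypsin_vs_background:.0e}")
```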

CHZ-1 was made in a protein synthesis machine; it could not have been
produced by recombinant DNA methods, since it contains nonstandard
parts. But having shown that the basic activity of an enzyme can be
transferred to a very different molecule simply by copying the design
of the active site, chemists will no doubt soon develop active
peptides consisting of single chains of amino acids that can be
produced in quantity by engineered microorganisms.  [See Science
248:1544-1547, 22Jun90]

New mechanisms for chemistry at surfaces

In his book Engines of Creation, Eric Drexler envisioned machines able
to assemble structures with atomic accuracy by thrusting each part
into an appropriate site on the workpiece, using an angle and velocity
likely to promote formation of the desired bond. The reasonableness of
this picture of an assembler, not so obvious five years ago, is
becoming more apparent as chemists explore the mechanisms of chemical
reactions. A good example is provided by the work at MIT of Sylvia T.
Ceyer and her colleagues who have been using molecular beams to study
the adsorption of small molecules onto metal surfaces. Metal-catalyzed
reactions constitute a large class of chemical processes that have
been widely used but poorly understood -- until now.

Ceyer's group investigated one such reaction in great detail: the
adsorption of methane onto nickel. The key factor is the velocity of
the molecular beam -- specifically, the speed at which the incident
molecules approach the nickel surface. Since a methane (CH4) molecule
is a carbon atom surrounded by four hydrogen atoms, the first atom to
reach the surface is always a hydrogen. If the impact speed is great
enough, this hydrogen will be pushed aside, allowing the carbon atom
to approach and bind to a nickel atom; the hydrogen atom, now free,
binds to a different nickel atom. At lower speeds, the methane
molecules remain intact; some are trapped by forces near the metal
surface, others bounce off and escape.

The MIT researchers found they could control this and similar
reactions by varying the parameters of the molecular beam (e.g., the
velocity and angle of incidence) and the temperature of the nickel
surface. They discovered that the reactions occur at lower incident
velocities when the methane molecules are given extra vibrational
energy before being sent to the nickel surface -- presumably because
the vibrational distortions give carbon and nickel atoms easier access
to each other. And they found that unreacted methane molecules trapped
near a metal surface can be forced to react with and bind to it simply
by "hammering" them with a beam of neutral atoms (such as argon).

This work confirms the assembler concepts put forward in Engines of
Creation -- atoms and molecules can indeed be added to a workpiece by
hammering them against it, and they can be pre-processed to enhance
their reactivity. [See Science 249:133-139,13Jul90]

Clusters

Small clusters of metal or semiconductor atoms give rise to properties
not seen in bulk samples of the same composition. Adding or removing a
few atoms from an ordinary sample will not change its properties, but
this is not true of samples whose component particles each contain
only a few dozen atoms or fewer.
For example, a cluster of nine cobalt atoms is practically inert to
hydrogen or nitrogen gas, whereas a cluster of ten cobalt atoms is
quite reactive.

Cluster research is aimed partly at finding ways to make clusters in
quantity. Current methods produce a mixture of cluster sizes,
complicating the study of their structure and behavior.

The fact that the properties of these substances depend so critically
upon cluster size has mixed implications for nanotechnology. On the
positive side, it suggests that the range of possible characteristics
that materials may possess could be much broader than we realize. But
on the negative side, it means that the characteristics of materials
can be very sensitive to small errors in design or construction. [See
Science 248:1186-1188,8Jun90]

Chiral metal complexes as molecular catalysts

Nanotechnology uses assemblers; biochemistry uses enzymes; chemistry
uses catalysts; carpentry uses tools. Assemblers, enzymes, catalysts,
tools -- four examples of objects that control the processing of other
objects.

We're all familiar with the evolution of tools, from crude hammers and
chisels capable of only the roughest sort of production, to complex
machine tools that control the shapes of manufactured objects with
micron accuracy. Enzymes underwent a similar evolution more than a
billion years ago, developing a complexity and variety that enabled
them to conduct the biochemistry of life.

Analogous to these two traditional lines of development is the current
progress in chemical catalysis. Catalysts are substances that direct
the course of chemical reactions without themselves being used up;
catalysts participate in the reactions, but they emerge intact and so
are available for another round. Generally speaking, simple catalysts
are less specific than complex catalysts. If a catalyst is to promote
specific reactions and not others, then it must contain sufficient
structure to enable it to distinguish between the reactants it is to
use and those it is to ignore.

In recent years a sophisticated class of catalysts has emerged from
research laboratories such as that of Ryoji Noyori at Nagoya
University. Noyori has been studying what are called "chiral metal
complexes" in which a metal atom is bound to an asymmetric molecule to
form a catalytic complex. Such catalysts distinguish between reactants
not only on the basis of their chemical structure, but their chirality
as well. (Chirality is the symmetry property that causes certain
structures to be mirror images of each other but not identical -- the
same property that prevents left-handed nuts from fitting on
right-handed bolts.) Ruthenium-BINAP catalysts are especially
promising examples -- their superiority over conventional catalysts
has been demonstrated for the production of dozens of commercially
important chiral chemicals.

Noyori says, "In principle, any chiral structure can be generated
through rational modification of the catalyst's molecular structure."
From a traditional chemical viewpoint it seems hard to believe that
there would not be some chiral structures for which no appropriate
catalyst could be designed. After all, traditional chemistry generally
takes place in solution where substrate molecules bump around randomly
and often prefer different reactions than the chemist does. On the
other hand, if chemistry is a stage in the development of
nanotechnology, then catalysts should be thought of as rudimentary
assemblers that are slightly "programmable" through changes in the
reaction milieu (i.e., changes in pH, temperature, etc.). Plain metal
catalysts, like platinum or nickel, have played a major role in
chemistry despite their simplicity. In chiral metal catalysts the
unique catalytic features of metal atoms are combined with structures
that aid in the recognition and handling of desired substrates, and
that can be more readily "programmed" by the milieu.

As catalysts become more sophisticated, they will become more complex,
more varied, more programmable, and more selective; their descendants
sometime in the 21st Century may well turn out to be the molecular
assemblers we discuss in Update. If they do, then Noyori's claim might
evolve into this one: "In principle, any physically realizable
molecular structure can be constructed by appropriately programmed
assemblers." [See Science, 248:1194-1199,8Jun90]

Telomeres and aging

Rejuvenation buffs will be interested in the work of Calvin B. Harley,
et al. at McMaster University and Cold Spring Harbor Laboratory. These
researchers have shown that human fibroblast cells undergo gradual
losses at the ends of DNA molecules.

In organisms having linear chromosomes (such as yeast and higher
organisms), the replication of DNA during cell division is often
incomplete -- base pairs are lost at the ends of the DNA molecules. To
guard against the loss of important information, the end segments of
the DNA consist of repetitive sequences of base pairs that contain no
essential information; these are called "telomeres."

Organisms that do not age (like yeast) have "telomerase" enzymes that
maintain the length of telomeres by adding repetitive sequences when
necessary. Higher organisms also have telomerases, but these appear to
be active only in the production of reproductive cells (e.g., sperm)
and in tumors. Consequently -- and this is what Harley et al. have
shown -- human somatic cells lose about 50 base pairs per DNA terminus
per cell division, on the average. Since sperm DNA has about 9000 base
pairs of repetitive DNA at each terminus, the process of incomplete
replication would have eaten into critical parts of the DNA at a given
terminus after about 180 cell divisions. There are, however, 92
different telomeres in each human cell (46 chromosomes x 2 telomeres
per chromosome). A cell may die or become impaired if even
one of these 92 telomeres begins losing critical information -- an
event that would generally occur sooner than the average.
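The 180-division figure follows directly from the numbers given, and a
toy calculation shows why the first failure among 92 telomeres comes
sooner than the average. The spread in loss rates below is an
assumption for illustration, not a measured value:

```python
import random

# Figures from the article: ~9000 bp of repetitive DNA per terminus,
# ~50 bp lost per division on average.
bp_per_terminus = 9000
loss_per_division = 50
avg_divisions = bp_per_terminus // loss_per_division
print(avg_divisions)  # 180

# Toy model: give each of the cell's 92 telomeres a slightly different
# loss rate (assumed ~10% spread) and find the first one to run out.
random.seed(1)
rates = [random.gauss(loss_per_division, 5) for _ in range(92)]
first_failure = min(bp_per_terminus / r for r in rates)
print(f"first telomere exhausted after ~{first_failure:.0f} divisions")
```

Whatever the exact spread, the minimum over 92 termini falls below the
180-division average, which is the article's point.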

If telomere shortening proves to be a major mechanism of aging, then
gene therapy offers a possible way to deal with it. We can envision a
day when genes can be introduced into the human genome to provide a
telomerase system that has been redesigned to be active in somatic
cells. [See Nature 345:458-460,31May90]

Nano-Mechanism Project

Shoichiro Yoshida and his research team with the Research Development
Corporation of Japan have completed a five-year project aimed at
developing instruments and techniques for measuring and processing at
nanometer scales. Among the fruits of this effort are:

1. Systems for measuring and positioning samples with subnanometer
accuracy.

2. A combination scanning electron microscope/scanning tunneling
microscope to provide a wide range of magnifications.

3. An STM in an ultra-high-vacuum chamber to enable ion-etched
surfaces to be studied before they get contaminated.

4. Techniques for producing x-ray multilayer mirrors by sputter
deposition. The mirrors will be used in x-ray microscopes, x-ray
lithographic steppers, and other instruments.

5. Improved zone plates for x-ray microscopy.

6. Compilation of data on optical constants of various materials for
use in making multilayer mirrors and zone plates.

7. Methods for making atomically smooth surfaces by low-energy
ion/atom beam sputter etching.

This is just one of 21 projects in Japan's national ERATO program.
With efforts like these taking place, progress toward nanotechnology
should be rapid. [See Nanotechnology 1:13-18,1990]

New design for AFM probes

Atomic force microscopes construct images by scanning a sharp tip over
a sample at sub-nanometer distances and measuring the force between
tip and sample. The tip is fastened to a cantilever arm; samples lie
on an atomically-flat surface (or "stage").

Lacking techniques for making atomically perfect tips, researchers
have had problems with resolution, interpretation and reproducibility.
Earlier this year Eric Drexler at Stanford and John Foster of IBM
suggested that these problems could be alleviated if AFMs were
equipped with engineered molecular tips [See Nature 343:600, 15Feb90].
A variety of different molecules could be designed to have desired
characteristics and then synthesized with atomic precision by chemical
methods.

This earlier work left unanswered the important question of how such
molecular tips could be installed and placed on the AFM's cantilever.
In a paper presented in July at the Fifth International Conference on
Scanning Tunneling Microscopy/Spectroscopy and First International
Conference on Nanometer Scale Science and Technology, Drexler suggests
an answer to this question: the tips need not be installed on the
cantilever at all. In the new arrangement, the sample is to be held on
a round bead fastened to the cantilever; a variety of tips are bound
to the stage, not necessarily in an organized pattern. To image a
sample, the operator must first find an appropriate molecular tip on
the stage by broadly scanning the stage with the bead -- in this mode
of operation, the stage with its array of tips serves as the sample,
and the bead acts as a probe. When a molecular tip is found, a
confined scan is carried out so that the molecular tip can image a
sample bound to the bead; in this scan, the bead and stage have
exchanged roles.

An even more interesting application of this new design would be in
molecular construction. The array of molecular tips could be designed
so that each tip binds a reactive atom or molecule. As these "parts"
are added to a workpiece located on the bead, they would be
replenished from the surrounding solution. [See Journal of Vacuum
Science and Technology B, in press]


Russell Mills is research director at Group 9 Research Associates in
Palo Alto, California.

	*	*	*	*	*	*	*	*

BioArchive Progress

One of the Foresight Institute's goals is to stimulate efforts toward
building a BioArchive of endangered species: a set of safe
repositories of properly-stored genetic material and tissue samples.
The goal is that if the worst happens--if every living member of the
species is lost--irreplaceable information will survive.  With
sufficiently advanced technology and a biosphere on the mend, species
restoration could then be accomplished.  If enough samples have been
taken, even substantial genetic diversity within the species would
survive this hiatus.

Tania Ewing reports in Nature (7 June 1990) that the Centre for
Genetic Resources and Heritage (CGRH) in Australia has begun to work
explicitly toward this goal.  Based at the University of Queensland,
it will serve as a library of genetic material from Australia's rare
and endangered species.  Director John Mattick calls the center a
"genetic Louvre," which will store tissues, cells, and isolated DNA
samples which have been preserved cryogenically or by desiccation.
Mattick points out that if work like this isn't done, "subsequent
generations will see we had the technology to keep [DNA] software and
will ask why we didn't do it."

Exactly so: CGRH's work is critical to the future health of the
biosphere.  Yet they are preserving only Australian samples, and are
having trouble finding funds to do even that much.  We need to
encourage the powerful, mainstream environmental groups to become
active in support of this work, and to help establish similar efforts
around the world.

To contact them, write CGRH, Centre for Molecular Biology and
Biotechnology, University of Queensland, Australia.  Environmental
activists interested in furthering the BioArchive concept should
contact the Foresight Institute.

	*	*	*	*	*	*	*	*

International News
by Stewart Cobb

Sixth Generation Project

Japan's Ministry of International Trade and Industry (MITI) is now
planning its Sixth Generation Computer project.  This is a
three-pronged effort involving theory, technology, and applications.

The "Basic Theory" program is intended to develop a theoretical basis
for "processing of ambiguous and incomplete information," using ideas
from artificial intelligence, neuroscience, psychology, and cognitive
science.  MITI aims to build systems capable of "learning and
self-organization," "approximately correct problem solving," and
"integration of mass information."

Under "Fundamental Technology," MITI will study architectures for
massively parallel and distributed computer systems.  This effort
includes a wide-ranging investigation of new technologies for
computing devices, including superconductivity, quantum electronics,
optical switches, wafer-scale integration, and three-dimensional
integrated circuits.  The report mentions molecular devices as a
targeted technology.

MITI is not neglecting product development, the commercial payoff for
this massive project.  The final research area is "Novel Functions,"
investigating applications of the new computer technology. MITI
expects these applications to include large-scale system simulations,
self-organizing databases, autonomous and cooperative robot
controllers, high-level pattern recognition and understanding, and
generalized problem solving with soft knowledge.  Some of these areas,
such as large-system simulation and robot control, should be useful in
advanced applications of molecular manufacturing.
[Nature, 345:279, 24May1990; Intelligence, Vol7, No4, pp1-3, Aug1990]

Eastern Europe

With the opening of Eastern European countries, more is being learned
about research efforts and interests there.  Some of these are
relevant to nanotechnology:

In Czechoslovakia, the former president of that country's Academy of
Sciences--now director of the Institute of Molecular Genetics--has as
a goal "the bringing together of information theory and systems
engineering with the reductionist pursuit of molecular mechanisms."
Bulgaria's Institute of Electronics holds an annual meeting on quantum
electronics; it is attended by scientists from all over the world.
The president of the Romanian Academy of Sciences--also the Deputy
Prime Minister--wants to set up ten "advanced study groups" working on
specific problems like molecular engineering, materials science, and
tunneling microscopy.
[Nature, 344:609-619, 12Apr90]

Comparisons

Two government studies this spring compared the technological
capabilities of the US with the rest of the world.  The overall
results came as no great surprise.

The Department of Defense reported that the US significantly leads the
USSR in all but four of 20 militarily critical technologies, while the
Soviets lead in only one.  The European NATO countries were considered
roughly equal or slightly behind the US in all 20 areas. Japan,
however, was called the leader in five of the 20 technologies,
generally those with near-term commercial applications.
[Science 248:299, 20April1990]

The other study, from the Department of Commerce, compared the
relative standing of the US, Japan, and the European Community in
twelve emerging technologies.  According to the report, the US
generally leads both Japan and Europe in research and development in
these technologies, but lags Japan in creating new products.  The
Commerce Department also described the trends in these areas, showing
the US holding even with Europe but rapidly falling behind Japan.  The
report covered twelve new technologies which together are expected to
reach $1 trillion in sales worldwide in the year 2000.
[Science 248:1185, 8June1990]

The Office of Science and Technology Policy has been asked to combine
these two studies into a single report, which is expected in late
October.

Human Frontiers

The announcement of the initial grants in the Human Frontiers Science
Program has made clear that this international effort will not favor
any one country, as was feared earlier when Japan proposed the
program.  Grants were in approximate proportion to the number of
applications from each country: Japan (9), US (8), Britain (5), France
(3).  Despite this reassuring result, the US and Western European
countries reacted negatively to a Japanese proposal for an
international effort on intelligent manufacturing systems.
[Nature, 344:579, 12Apr90; Nature, 345:560, 14Jun90]

Degree patterns

Meanwhile, the US National Science Foundation reports that the number
of undergraduate degrees in science and engineering granted to US
students fell 10 percent from 1986 to 1988.  The total number of
undergraduate degrees granted to US students during the same two years
rose slightly to an all-time high.
[Nature 345:655, 21June1990]

Stewart Cobb is an aerospace engineer and was an early member of the
MIT Nanotechnology Study Group.

	*	*	*	*	*	*	*	*

Federal Comments on Nanotechnology

	Views on nanotechnology from three U.S. government research
agencies were expressed recently in response to inquiries by
Congressman Bill Green.

	The most detailed response was received from the head of a
laboratory at the National Cancer Institute, part of the National
Institutes of Health (NIH).  Excerpts follow:

	"I share the view that these [Drexler's] calculations and
reasonings are interesting and promising and should be considered
seriously...Stunning examples from biomedical research, chemistry, and
physics demonstrate the potential of engineering at the molecular
level.  All this has been accomplished by scientists from these
disciplines cooperatively applying the traditional scientific
principles of experimentation and theory.  Recently, a third
discipline has been added: computational science, or computational
simulation.  The latter approach depends on powerful, scientific
computers; it permits simulations of such realism (even at the atomic
level) that it is possible to explore the boundary between the
feasible and the infeasible.  This is certainly relevant to the
nanotechnology we are discussing.

	"Much of the current revolution in biology arose from the
study of viruses; they were treated as prototypical organisms and, at
the same time, research showed that, indeed, they are precise,
engineerable assemblages of molecules...

	"Strong, long-term public support, and to a lesser extent,
private support, have paid off mightily.  However, we must be certain
of continued support in such new directions as nanotechnology.
Predictions of the future can be unreliable, but there is clearly
justification for optimism and committed effort to further study and
research on nanoscale, self-replicating molecular machines in return
for the expectation of long-term practical developments."

	A Division Director at the National Science Foundation wrote,
"With the exception of medical applications, nanotechnology is a
research area that would be appropriate for support by the National
Science Foundation.  Aspects of this research area are already
supported by several research programs at the Foundation."

	A spokesman for R&D at the Environmental Protection Agency
was less informed--"a direct relationship to work being done at the
EPA was not readily apparent"--but requested further information on
environmental applications.

	Both of the positive responses were obtained from government
scientist/administrators who were already acquainted with the
nanotechnology concept through traditional sources of technical
information.  One of the Foresight Institute's goals is to maximize
the number of scientists introduced to the idea in this way, rather
than in the media or from nontechnical sources.  FI members who wish
to introduce the concept of nanotechnology to scientists or government
leaders are urged to call our office (415-324-2490) for advice before
proceeding.

	Thanks to Congressman Bill Green and FI member Alvin Steinberg
for stimulating the above correspondence.

	*	*	*	*	*	*	*	*

Market-Based Foresight: A Proposal
by Robin Hanson
	
	We need to evolve better fact-finding institutions to speed
the growth of foresight.  A better way for people to "stake their
reputations" might help.

	At present, when a technological question becomes a matter of
public concern, advocates often engage in trial-by-media combat.
Opposing experts fling sharp words, and accuse each other of bias and
self-interest.  Debates quickly descend into hyperbole and
demagoguery.  Paralysis or folly often follows, seriously undermining
our ability to deal with important issues like space development,
nuclear energy, pesticides, and the greenhouse effect.  Greater issues
lie ahead, such as nanotechnology, where the consequences of such
folly could be devastating.

	We want better institutions for dealing with controversies
over science facts, so we can have better inputs for our value-based
policy decisions.  Yes, with enough study and time, most specific
science questions seem to get resolved eventually.  But we want to
form a consensus more quickly about which facts we're sure of, and
what the chances are for the rest.  Rather than depending on the good
nature of the people involved, we want explicit procedures that
provide a clear incentive for participants to be careful and honest in
their contributions.  And we want as much foresight as possible for a
given level of effort, with the temporary consensus now correlating as
closely as possible with the eventual resolution later.

	One institution for doing all this is the "fact forum" (or
"science court"), proposed by Arthur Kantrowitz.  In this, competing
sides would agree to an impartial but technically knowledgeable jury,
present their cases, and then submit to cross-examination.  The jury
isolates areas of agreement as specifically as possible, and writes a
summary at the end.  Advocates not willing to submit their claims to
such criticism are to be discounted.  This is a clever suggestion,
worthy of attention and further exploration.

	Even so, I would like to offer an alternative proposal, and
encourage people to think up yet more ideas.  Fact forums have
problems which alternative institutions might be able to remedy.
Forum participants have an incentive to avoid making claims that will
look silly under cross-examination, but as every lawyer knows, that is
not the same as trying to get the truth out.  Debates favor articulate
intellectuals over those with good "horse-sense."  Who should get to
represent the different sides for questions of wide interest?  What if
few potential jurors are both knowledgeable and impartial?  These
ambiguities, and the non-trivial costs involved, give excuses for the
insincere to decline participation.

	In contrast, the alternative I will describe can, for a
well-posed question, create a consensus that anyone can contribute to,
with less bias against the inarticulate.  It offers a clear incentive
for contributors to be careful, honest, and expert.  Such a consensus
can come much cheaper than a full debate, and once created can
continuously and promptly adjust to new information.  Any side can
start the process, and leave the resulting consensus as an open
challenge for other sides to either accept or change by participating.
And there is reason to believe that such a consensus will express at
least as much foresight, as defined above, as any competing
institution.

	You may be skeptical at this point.  But, in fact, similar
institutions have functioned successfully for a long time, and are
well-grounded in our best theories of decision.  I'm talking about
markets in contingent assets, more commonly known as "bets."  Bets
have long been seen as a cure for excessive verbal wrangling; you "put
your money where your mouth is."  I propose we create markets where
anyone can bet on controversial scientific and technological facts,
and that we take the market odds as a consensus for policy decisions.

	Can this make any sense?  Consider how it might work.  Imagine
(hypothetically, of course) that there was a disagreement on whether a
programmable nanoassembler would be developed in the next twenty
years, and that policy makers were in danger of not taking this
possibility seriously enough.  What could markets do?

	Policy makers could take the position that they don't know
much about technology, or even about who the best experts are.  They
would simply use the market odds in making policy decisions.  If some
market said there was a 20% chance of nanoassemblers by 2005, policy
makers might decide the issue was serious enough for them to set up
their own market. They would carefully form a claim to bet on, such
as:

	By 2005, there will be a device, made to atomic
specifications, fitting in less than 1 cubic mm., able to run C
programs requiring 10MB memory at 1 MIPS, and able to replicate itself
in less than one year from a bath of molecules, each of which has less
than 100 atoms.

They would choose a procedure for selecting judges to decide the
question in 2010, and a financial institution to "hold the stakes" and
invest them prudently.  And then a market would be set up, where
offers to trade would be matched; policy makers could even subsidize
the market to encourage participation.

	Ordinary people could take the attitude that those who claim
the consensus is mistaken should "put up or shut up" and be willing to
accompany claims with at least token bets.  (Statistics about how well
people do could then be compiled.)  When people on the "pro" side buy
bets, they would drive the consensus price up, toward what they
believe.  "Con" people would have to accept this or buy compensating
bets to push the consensus down; they could not just suppress the idea
with silence.
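The pro/con dynamic just described can be sketched in code.  The following is a hypothetical illustration only, not anything the article proposes as an implementation: a claim asset pays $1 if the claim comes true, a matching engine pairs offers to buy and sell, and the last traded price is read as the consensus probability.  The class name, the midpoint-pricing rule, and the order-book details are all invented for the sketch.

```python
# Hypothetical sketch of a betting market on a single claim.  A share pays
# $1 if the claim comes true, so a share price of 0.20 is read as a 20%
# consensus probability.  Trading at the bid/ask midpoint is an arbitrary
# choice made for illustration.

class BettingMarket:
    def __init__(self):
        self.bids = []          # (price, size): offers to buy "true" shares
        self.asks = []          # (price, size): offers to sell "true" shares
        self.last_price = None  # price of the most recent trade

    def offer(self, side, price, size):
        """Post an offer to trade, then match any crossing offers."""
        (self.bids if side == "buy" else self.asks).append((price, size))
        self._match()

    def _match(self):
        self.bids.sort(key=lambda o: -o[0])  # highest bid first
        self.asks.sort(key=lambda o: o[0])   # lowest ask first
        while self.bids and self.asks and self.bids[0][0] >= self.asks[0][0]:
            (bid, bsize), (ask, asize) = self.bids[0], self.asks[0]
            traded = min(bsize, asize)
            self.last_price = (bid + ask) / 2      # trade at the midpoint
            self.bids[0] = (bid, bsize - traded)   # shrink the filled offers
            self.asks[0] = (ask, asize - traded)
            self.bids = [o for o in self.bids if o[1] > 0]
            self.asks = [o for o in self.asks if o[1] > 0]

    def consensus(self):
        """Last traded price, interpreted as the probability of the claim."""
        return self.last_price
```

A "con" trader offering to sell at 0.15 and a "pro" trader buying at 0.25 would trade near 0.20, leaving a 20% consensus standing; a later wave of "pro" buying at higher prices would push that consensus up, exactly as described above.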

	If the different sides soon came to largely agree, they could
gradually sell and leave the market, leaving the consensus standing.
Judges need only be paid when they actually judge, and incentives to
"settle out of court" (not described here) can make the need for
formal judging rare.  Thus, even obscure questions could afford
expensive judging procedures.

	Individuals or groups who believe they have special insight
could use it to make money if they were willing to take a risk.
Arbitragers would keep the betting markets self-consistent across a
wide range of issues, and hedgers would correct for various common
human biases, like overconfidence.  Traders could base their trades on
the results of other relevant institutions, like fact forums, and so
the markets should reflect the best insights from all co-existing
institutions.

	Risk reduction, i.e. insurance, is also possible.  Policy
bodies and anyone else could bet that things will go against them, and
so be less sensitive to uncertainties surrounding, for example, a
nanoassembler breakthrough.

	Of course, like fact forums, betting markets have problems and
limitations.  There is no escaping the costs of thinking carefully
about what exactly one is claiming; a clearly worded claim is much
easier to judge impartially.  Science bets can take longer than most
to resolve, making investments in them less attractive.  There are
tradeoffs in how long to wait to resolve a bet, and in how many
variations on a question can be supported.

	"Moral hazard," where someone might do harm to keep a claim
like "A person on Mars by 2030" from coming true just to win a bet,
should be avoided.  Judges should be kept impartial, though judging in
hindsight should be
easier than foresight.  Market procedures should discourage
conflict-of-interest cheating, such as brokers who trade for both
others and themselves.  Perhaps most limiting, explicit betting
markets on science questions seem to be legal only in the U.K.

	Some apparent problems are really not problems.  Markets may
look like opinion polls where any fool can vote and the rich get more
votes, but they are actually quite different.  In practice, markets
like corn futures are dominated by those who have managed to play and
not go broke.  Explicit betting markets cannot be cornered or
monopolized.  So, rich people who bet large sums carelessly or
insincerely give their money away to those with better information.
If word of this behavior gets out, they lose this money quickly, as
anyone can make money by correcting such a manipulation.

	While betting markets may be untried as a way to deal with
policy-related fact disputes, they are not untried as a human
institution.  Bets are a long-established reputation mechanism and
phrases like "you bet" are deeply embedded in our language.
Scientists have been informally challenging each other to reputation
bets for centuries, with a recent wave of such bets about "cold
fusion."  Illegal sports betting markets are everywhere, and England
has had science betting markets for decades. Many people there won
bets on the unlikely claim that men would walk on the Moon.

	Since June 1988, astrophysicist Piers Corbyn has bet to gain
publicity for his theory of long-term weather prediction, betting
against London bookies who use odds posted by the British
Meteorological Service.  Over the last six months alone, the chance of
randomly matching his record of 25 bets a month won at a
better-than-80% rate is less than one in 10^10.  Yet the Service still
refuses to take Piers seriously, or to bet against him.  Bookies have
taken on the bets for the publicity, but are tired of losing, and have
adjusted their odds accordingly.  These are the odds that should be
used for official British agricultural policy.
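The one-in-10^10 figure can be cross-checked with a binomial tail calculation.  The sketch below assumes, purely for illustration, that each bet against the bookies' posted odds is a fair 50/50 gamble (the article does not state per-bet odds), and asks how likely it is to win at least 80% of roughly 150 bets (25 a month for six months) by luck alone:

```python
from math import comb

def tail_prob(n, k, p=0.5):
    """P(at least k wins in n independent bets, each won with probability p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Winning at least 120 of 150 fair bets by luck alone is astronomically
# unlikely -- well below the article's one-in-10^10 figure.
luck = tail_prob(150, 120)
```

The assumed 50% per-bet odds are conservative in Corbyn's favor only if the bookies' posted odds were roughly even; the qualitative conclusion, that sustained 80%+ success is not luck, is robust to that assumption.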

	Betting markets are also well established in economic theory.
A standard way to analyze financial portfolios is to break them into
contingent assets, each of which has value in only one possible world.
In fact, stock and other securities markets can be considered betting
markets relative to the return one can get by buying the "market"
asset.  A "complete market"--where one can bet on everything--is best,
allowing investors to minimize risk and maximize expected return.
Explicit betting markets, usually on elections, are widely used to
teach MBA students about markets, and one experimenter claims they
predict the final vote tallies better than national polls.

	A famous economics article argues that Eli Whitney would have
benefited more from his cotton gin by speculating in cotton-bearing
land than by trying to enforce his patent.  Finally, in the presence
of betting markets, perfect decision theory agents will make all
external actions as if they agreed with the market odds, making those
odds a real "consensus."

	Our biggest problem may be how we solve problems.  I suggest
that policy makers would do well to use estimates on technology issues
that are as unbiased and predictive as the odds at most any racetrack.
If you agree, help me find a way to give the idea a try.


Robin Hanson researches artificial intelligence and Bayesian
statistics at NASA Ames, has master's degrees in physics and
philosophy of science, and has done substantial work on hypertext
publishing.  To receive his longer paper on the above topic, send us a
self-addressed envelope (with 45 cents postage within the US), or send
electronic mail to hanson@charon.arc.nasa.gov.

References

K.E. Drexler, Engines of Creation, Doubleday, New York, 1986.

"Feedback Column," New Scientist, 7/14/90. See also 2/10, 6/23, 7/28.

R. Forsythe, F. Nelson, G. Neumann, J. Wright, "The Explanation and 
  Prediction of Presidential Elections:  A Market Alternative to Polls" 
  Economics Working Paper 90-11, U. of Iowa, Iowa City, 4/12/90.

R. Hanson, paper to appear in Proc. Eighth International Conference on Risk and Gambling, 8/90.

J. Kadane, R. Winkler, "Separating Probability Elicitation from 
  Utilities," Journal of the American Statistical Association, June 1988, 
  83(402), Theory and Methods, pp. 357-363. 

J. Hirshleifer, "The Private and Social Value of Information and the
  Reward to Inventive Activity," American Economic Review, 61(4),
  Sept. 1971, pp. 561-74. 

W. Sharpe, Investments, 3rd Ed., Prentice Hall, NJ, 1985.

	*	*	*	*	*	*	*	*

JBIS on Nanotechnology

In honor of its 60th anniversary, the Journal of the British
Interplanetary Society is dedicating the opening issue of its
Celebration Series to the topic "Nanotechnology in Space."  JBIS is
known for publishing exploratory engineering work on space development
and exploration.

The issue is scheduled for October 1992 and will be edited by
Salvatore Santoli.  Dr. Santoli reports that he plans to interpret the
term nanotechnology as Foresight does, i.e. molecular manufacturing,
or thorough control of the structure of matter.  He is actively
soliciting papers for the issue.

For further information on this special issue, or to propose a paper
topic, contact the issue's editor at the following address: Salvatore
Santoli FBIS, via A. Zotti 86, I-00121, Rome, Italy.  To submit
completed papers, enclose a note indicating that they are meant for
this issue and mail to: Executive Secretary, British Interplanetary
Society, 27/29 South Lambeth Road, London SW8 1SZ, England.  Decimal
paragraphing and SI units must be used; contact BIS for their
"Guidelines for Authors."

	*	*	*	*	*	*	*	*

Molecular Carpentry
by Ted Kaehler

On his sabbatical from Apple Computer, Ted Kaehler worked with the
Foresight Institute on molecular systems design and modeling:

In the everyday world, we work with building materials that can be cut
to size.  When you are building a structure out of wood, you measure
the raw material and cut it to the proper dimensions.  Designs in the
everyday world commonly use fixed 90 degree angles, but variable
lengths.  Nanomechanical designs operate under a different set of
rules: bond lengths are virtually fixed, but angles can be varied
substantially.  Designing a nanomechanical structure is thus different
from designing with a material that can be cut to any desired size.

During the summer of 1990, I worked with Eric Drexler to develop a
system for designing molecular "parts" with arbitrary length or angle
requirements.  Here is a typical problem we considered: a designer has
a nanometer scale device that needs to be supported and held firmly in
place (the required rigidity varies with the application).  Suppose
the surrounding matrix is a diamond crystal lattice (Fig. 1).  For
convenience, the designer has chosen hexagonal carbon rings to serve
as a standard interface.  Every piece, including the diamond lattice,
has a triangle of three bonds coming straight out of a hexagonal
carbon ring which serves as its attachment point.  The problem is
this: what arrangement of atoms will bridge from the diamond crystal
to a mounting ring on the device and hold it firmly?  Within the
diamond, there are only four distinct bond directions available, and
at a limited set of points in space.  If we extend the crystal right
up to the mounting ring, it is unlikely that any of the bonds will
closely match it in angle or location.  Thus we need a new arrangement
of atoms which will form a strong and stiff bridge from the crystal to
the device.

Three methods for designing the bridge come to mind.  The first is to
build the bridge atom by atom and "search" for the proper
configuration.  This is much like a computer program for playing
chess.  Placing an additional atom on the end of the structure is like
making a move in chess.  One wants it to be a step toward the
solution, but one can't tell if it is the right step, except by trying
it.  Only after more atoms are added (more chess moves are made), can
we tell whether the bridge matches up at the far end (whether these
moves lead to a better chess position).  If not, one must take back
those moves and try others.  In the absence of a good predictive
theory, this kind of search takes a tremendous amount of computation,
just as chess playing programs do.  Without this, designing a new
bridge from scratch every time one has a specific need does not look
like such a good idea.

The second method is to design a "universal" structure that has length
and angle adjustments.  This would be a flexible structure with many
two-position adjustment points.  These might be chains that could be
shortened by one atom, or atoms with different bond lengths that could
be substituted.  By changing which adjustment points were set to
"long" and which were "short," the length of the whole structure could
be varied by small amounts.  Such a structure might have some
disadvantages.  It would have to be large in order to get sufficient
variability.  It is unlikely that it could be made very stiff without
being so large as to dwarf the device it was meant to hold.  We have
not been able to think of any good structures that avoid these
problems, and this area is still open for innovation.

The third method is to design hundreds of short, strong molecular
brackets and then classify them by offset and angle.  After each
arrangement of atoms is designed, a program computes its detailed
shape, and the results are stored in a dictionary.  The designer uses
the dictionary to choose the proper bracket to support the device in
its proper place.  To choose the right bridge from the catalog, we
first imagine the diamond crystal extended up past the mounting ring
we are trying to secure.  For the "number one" atom on the mounting
ring, we find its location within a unit-cell of diamond crystal.  We
also note the angle in 3-space of a vector that expresses the
orientation of the ring.  We then look up the position and angle in
the dictionary to find the closest match.  We find an entry for a
known bracket and the (x,y,z) offsets to each of its three attachment
points in the diamond lattice.  The dictionary tells the designer what
bracket to use and where in the diamond lattice it will attach.  The
bracket is free standing, attaching to the diamond crystal with just
three bonds (Fig. 1).  In the final design, the diamond only comes as
close to the device as the offset says to, and the bracket spans the
remaining distance.
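The lookup step in this third method can be sketched as a nearest-match search over the catalog.  The bracket names, the catalog entry format, and the weighting that trades position error off against angle error are all assumptions made for illustration:

```python
import math

def mismatch(entry, target, angle_weight=0.01):
    """Combined mismatch: ring position (angstroms) plus weighted angle (degrees).
    The weight is an arbitrary illustrative choice."""
    ex, ey, ez, eang = entry
    tx, ty, tz, tang = target
    return math.dist((ex, ey, ez), (tx, ty, tz)) + angle_weight * abs(eang - tang)

def closest_bracket(catalog, target):
    """catalog maps bracket name -> (x, y, z, ring_angle); return the best match."""
    return min(catalog, key=lambda name: mismatch(catalog[name], target))
```

Given a catalog holding a bracket "A" whose far ring sits at (0, 0, 3.1) with a 5-degree tilt and a bracket "B" at (0.2, 0, 3.3) tilted 40 degrees, a query near (0.1, 0, 3.2) at 8 degrees selects "A".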

To design a family of brackets, we begin with a stack of six-membered
carbon rings.  Such stacks are found within the structure of hexagonal
diamond (lonsdaleite) and are a strong, compact structure (Fig. 2).
Each ring has three covalent bonds to the ring below and three to the
ring above.  This gives good stiffness.  A barrel-like stack of
six-membered rings is straight, so we must introduce some variation to
make it bend.  One way is to use seven-membered rings.  Each
seven-membered ring has three attachments above and three below.  The
seventh atom distorts the ring in some direction.  A second
seven-membered ring on top of the first has six different places where
the seventh atom can interrupt the ring.  The many combinations of
seventh atoms on different levels give a range of combined twists,
bends, and offsets from the normal lattice.  All extra carbon bonds
that hang out of the structure are capped with hydrogen.

Figure 3 shows a typical two layered bracket with a hexagonal mounting
ring on each end.  Even a structure of just two layers can have quite
a bit of twist and offset.  The structure is compact and stiff, with
three or more covalent bonds at each cross-section.  Here are the
major ways that a normal stack of six-membered carbon rings can be
varied to make brackets for cataloging:

1) Add a seventh atom in one of six places on a given ring.

2) Add another layer to the structure.  Try both six-membered and
seven-membered rings in the new layer.

3) Substitute silicon for any of the carbon atoms.  Silicon has longer
bonds and distorts the structure.

4) Substitute nitrogen for carbon.  Substitute oxygen or sulfur for
carbon at the seventh atom (it is only bonded to two other atoms).

5) Use a C=C double bond instead of a single bond.  This only works at
certain places in the structure, and is strained.

6) A side view of the stack shows six-membered rings facing outward.
When there is a seventh atom in a layer, the side view shows a
seven-membered ring.  When two seven-membered layers have their extra
atom above each other, the side view shows an eight-membered ring.
That ring can be split in two by adding a fourth bond between the
layers.  The side view now shows two five-membered rings.  The extra
bond between the layers changes the shape of the bracket.

7) Similarly, a six-membered ring facing outward can be bridged in a
direction along the axis by adding a carbon, oxygen, sulfur, or
silicon (along with any needed hydrogen atoms).

The computer program to build the catalog proceeds as follows:
Enumerate all the possible brackets using the above rules, starting
with the shortest first.  For each bracket, compute its shape using a
molecular mechanics program.  The most important aspect of its shape
is the three bonds coming out of the mounting ring on each end.  With
one end attached to a diamond lattice, we compute the offset and angle
of the ring on the other end, and enter it into the catalog.  Since
computing the shape of the bracket is the hard part, we save time by
making catalog entries for the mirror image of the bracket, the
bracket upside down, and the bracket attached to a vertical face of
the diamond crystal.
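The enumeration step can be sketched for the first two variation rules alone (six- versus seven-membered rings, layer by layer); rules 3 through 7 are omitted, and the string encoding of a layer is an invented convention:

```python
from itertools import product

# A layer is either a plain six-membered ring ("6") or a seven-membered ring
# with its extra atom in one of six positions ("7@0" .. "7@5").
LAYER_CHOICES = ["6"] + [f"7@{i}" for i in range(6)]

def enumerate_brackets(max_layers):
    """Yield every stack of layers, shortest brackets first."""
    for n_layers in range(1, max_layers + 1):
        for stack in product(LAYER_CHOICES, repeat=n_layers):
            yield stack
```

Even with only these two rules, two layers already give 7 + 7*7 = 56 candidate stacks; each candidate would then be handed to the molecular mechanics program to compute its shape.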

Not every structure we compute will become an entry in the catalog.
When many brackets reach the same place and angle, we only want the
shortest and stiffest one.  The catalog will be made to a certain
spatial and angular resolution.  If we try to find one entry for every
0.154 Å (a tenth of a carbon-carbon bond length), then the number of
position points in a unit cell will be around 1029.  For each of these, we
need a variety of angles.  Since bonds can bend much more easily than
they can change length, an angular accuracy of plus or minus 10
degrees may suffice.  Accounting for all the spherical symmetries, we
need 66 different angle entries per approximate position, derived from
as few as 4250 bracket designs.  (A single bracket may be entered into
the table in as many as 16 different ways.)  The shape of many more
than 4250 brackets will have to be computed to get a sufficient
variety of angles and locations.  It will be interesting to see how
clumpy the distribution of brackets is, and to see if there are any
regularities that will allow us to predict the shape of an
as-yet-uncomputed bracket.
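The counts in this paragraph hang together arithmetically: 1029 position points, each needing 66 angle entries, with each bracket design entering the table in up to 16 ways, implies a minimum of about 4245 distinct designs, consistent with the "as few as 4250" quoted above:

```python
# Cross-check of the catalog-size arithmetic quoted in the text.
positions = 1029          # position points per unit cell at 0.154 Å resolution
angles_per_position = 66  # angle entries needed per approximate position
entries_per_design = 16   # symmetry copies of a single bracket design

min_designs = positions * angles_per_position / entries_per_design
# about 4245 distinct bracket designs at a minimum
```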

It is possible that reaching the full diversity of the catalog will
require putting too many layers in the bracket.  Such a bracket would
be too long and floppy to be of much use.  If this is true, all
brackets with more than a certain number of layers will be designed with
thick bases.  Imagine the thick base as a short bracket made from
three parallel hexagonal tubes.  It is short and stiff.  On top of
this is a normal one-tube bracket.  The richer structure of the
thicker bracket allows it to have many more variations per layer,
making a diverse set of shapes easier to generate.

If the designer is not happy with the spatial and angular resolution
he finds in the catalog, he can pull a few tricks.  The device he is
building is likely to be anchored at several places.  If one of those
anchors is at a slightly wrong place, he can pick the other anchors to
push the structure back in the right direction.  Likewise, slightly
wrong angles can be pitted against each other to give a correct final
position.  Such a mildly strained structure should work just fine.

To begin the project, we selected an existing molecular mechanics
program.  Programs that compute the shapes of molecules come in a
variety of speeds.  The structures we are simulating contain nothing
but the atoms and bonds of locally-unremarkable organic molecules.  We
are not studying unstable transition states in chemical reactions, so
we don't need "molecular orbital" programs that model the quantum
mechanics of electron clouds.  Instead we used a "molecular mechanics"
program that treats each chemical bond as a spring with a certain
resting length.  Additional springs handle the desire of an atom to
keep its bonds at certain angles to each other.  By using only forces
between the centers of atoms, this program can go very fast.  The
program we selected is STRFIT3 by Martin Saunders and Ronald Jarret of
Yale University, which gives results closely approximating those of
the classic MM2 program.  Around this we are building programs to
generate the brackets and enter them in the catalog after their shape
is known.
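The bond-as-spring model can be illustrated with toy harmonic terms.  This is not STRFIT3 or MM2, and the force constants below are invented for illustration; only the C-C rest length (1.54 Å) and the tetrahedral rest angle (109.5 degrees) are standard values.  The stiff stretch constant and soft bend constant reflect the point made later in the text, that bonds bend far more easily than they change length.

```python
def bond_energy(r, r0=1.54, k=300.0):
    """Harmonic stretch term: a bond of length r (angstroms) resists leaving r0.
    The force constant k is illustrative only."""
    return 0.5 * k * (r - r0) ** 2

def angle_energy(theta, theta0=109.5, k=0.02):
    """Harmonic bend term: bond angles (degrees) resist deviating from theta0.
    The force constant k is illustrative only."""
    return 0.5 * k * (theta - theta0) ** 2

def strain(bond_lengths, bond_angles):
    """Total strain energy as a sum of per-bond and per-angle harmonic terms."""
    return sum(map(bond_energy, bond_lengths)) + sum(map(angle_energy, bond_angles))
```

Minimizing such a sum over atomic positions, using only forces between atom centers, is what lets a molecular mechanics program find a structure's shape quickly.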

This system is implemented in Digitalk Smalltalk/V Mac on an
accelerator-assisted Macintosh II.  After we have verified that
STRFIT3 is producing shapes that agree with known molecules, we intend
to run the system every night and build a catalog of nanomechanical
brackets.

The interesting thing about this project is confronting design
problems in a world in which angles can be varied but lengths cannot,
with the fixed lengths and flexible angles found in real molecules.
The catalog we build now will probably not be the one used when
nanostructures are actually built.  By the time fabrication technology
is available, designers will want to use the latest modeling programs
and the fastest computers to rebuild the catalog with high accuracy.
By creating the tools to build a catalog today, we can get a glimpse
of the techniques and pitfalls of designing mechanical structures in
which `every atom is in its place.'

Reference

Martin Saunders and Ronald Jarret, "A New Method for Molecular
Mechanics," Journal of Computational Chemistry, Vol. 7, No. 4, 578-588
(1986).

Ted Kaehler is a computer scientist who spent his sabbatical from
Apple Computer working with the Foresight Institute.  He and Foresight
would like to thank Martin Saunders of Yale for allowing us to use the
program STRFIT3 and for his additional help.  Ted's participation was
funded by the Restart Program of Apple Computer, Inc.

Figure 1.  A diamond surface with a six-membered ring attached.  The
upper three carbon atoms (dark gray) are shown with missing bonds
where the bracket would be extended. The lower carbon atoms are shown
likewise, where the diamond crystal would be extended.  The free
surfaces are terminated with hydrogen atoms (white) save for three
embedded nitrogens (light gray) included to avoid the need for crowded
hydrogens.  The illustrated structure was minimized using the MM2
potential energy function (Chem3D Plus implementation).

Figure 2.  A stack of four six-membered carbon rings.  'D' indicates
the three bonds to the diamond substrate.  The top ring attaches to
the device being supported.  Hydrogen atoms attached to the two middle
rings are not shown.  (The structure appears to be curving slightly to
the left.  It should be completely straight, and we are looking for
the bug in our software.)

Figure 3.  The same structure with a seventh atom inserted in two of
the rings.  The top ring is rotated, displaced sideways, and tilted.
Thousands of such variations will be classified in a catalog
according to the location of their top ring.  The designer selects the
bracket that matches the location of the part he wishes to support.

	*	*	*	*	*	*	*	*

STM '90

This year's conference on scanning tunneling microscopy was broadened
to include scanning probe microscopy and spectroscopy, as well as
immediately adjacent technical fields.  To reflect this increased
breadth, the parallel title NANO I was added to the conference.
Sponsored by a wide variety of organizations, it was held in Baltimore
on July 23-27, 1990.

Interest in the meeting was intense, with many hundreds of abstracts
submitted.  Chairman James Murday of the Naval Research Laboratory
reports that the meeting drew 675 attendees--forty percent larger than
previous meetings.  With so many submissions, the great majority had
to be presented in mammoth poster sessions.  The abstract "booklet"
had 372 pages.  One paper, a proposal for molecular tip arrays for
atomic force microscopy, is described in this issue's "Recent
Progress" column.

Of special interest to Foresight was the session on "Nanometer Science
and Technology--Prospects, Priorities, and Programs."  It included
presentations from Japan's Nanomechanism Project, Britain's
Nanotechnology Project, NIST's Micro-metrology Group, and NSF's group
on Quantum Electronics, Waves, and Beams.  Eric Drexler spoke on the
results of the First Foresight Conference on Nanotechnology (October
1989) and our perspective on molecular systems engineering as a path
to molecular nanotechnology.

Instead of publishing a separate conference proceedings volume, STM
'90 is working with the American Vacuum Society to publish a special
edition of the Journal of Vacuum Science and Technology (JVST) with
conference papers.  We'll let you know when it is available.

The AVS is so interested in nanometer-scale work that it has agreed to
host the next meeting as part of its larger meeting in Seattle,
November 11-15, 1991.  This first AVS National Symposium on "Science
and Technology at the Nanometer Scale" will include topics similar to
the 1990 meeting; we suggest you check the conference papers to get a
feel for which areas will be covered.  Foresight Update will publish
more information as it becomes available.

In addition to hosting the next meeting, AVS is renaming one of its
journals to reflect its new focus: the new subtitle for JVST B is
"Microelectronics and Nanometer Structures--Processing, Measurement
and Phenomena."  Publications on proximal probes (STM, AFM, etc.) are
expected to continue to increase.  While much of this work is not done
in vacuum, AVS's enthusiasm for the field should overcome any initial
confusion this may cause.

To contact STM '90, write to the Conference Office, 750 Audubon Road,
East Lansing, MI 48823.

	*	*	*	*	*	*	*	*

Journal Review:
Nanotechnology
by Chris Peterson

The new journal Nanotechnology takes as its subject a broad range of
fields which have, or hope to have, some connection to the nanometer
scale: machining, imaging, metrology (measurement), micromachines,
instrumentation and machine tools, scanning probe microscopy,
fabrication of components, nanoelectronics, molecular engineering, and
so on.  Based on the first issue, the journal will be worth the
attention of those with broad interests in nanometer-scale
technologies, particularly those interested in the nuts-and-bolts of
developing and implementing various enabling technologies.

Published by the Institute of Physics, based in the U.K., it has
pulled together regional editors and an editorial board from around
the world, including the USSR, Bulgaria, and Poland.  Most are from
the US, Japan, Britain, Germany, and Switzerland.  Some names are
familiar to those who follow progress in work leading to molecular
nanotechnology: regional editor E. Clayton Teague (from NIST) attended
our first nanotechnology conference, as did editorial board member
Robert Birge (U. Syracuse), who presented molecular electronics work
at the meeting.  The editorial board also includes Robert T. Bate
(quantum electronics, Texas Instruments), Paul K. Hansma (STM and AFM,
U. Cal. Santa Barbara), Richard S. Muller (micromachines, U. Cal.
Berkeley), and James S. Murday (Naval Research Lab, chaired
STM'90/NANO I meeting).

The challenge for the journal will be to maintain the quality shown by
the first issue.  This included a number of broad review articles of
interest to the newcomer and helpful in orienting new readers to the
interests of the publication.  To avoid repetition, however, later
issues will inevitably move toward more specialized material, such as
reports of STM experimental results, e.g. "Voltage dependence of the
morphology of the GaAs(100) surface observed by scanning tunnelling
microscopy" in the first issue.  While worthy, such a report is more
relevant to those working with GaAs than it is to nanotechnology per
se. There is a great deal of this work available, as shown by the huge
poster sessions at NANO I.

A promising sign is the inclusion in the first issue of a proposal for
the design of a new instrument.  This focus on future tools is unusual
and could provide a valuable niche for the journal to fill.

The scope of this journal once again shows that the word
`nanotechnology,' without a modifier, can no longer be taken to refer
to the technology at the core of Foresight's concerns.  In introducing
the subject of "thorough control of the structure of matter," one must
be more specific, speaking of molecular nanotechnology, or molecular
manufacturing.

We'll keep an eye on this publication and report how it progresses.
Two issues are planned for volume 1 in 1990, with four in the works
for volume 2 in 1991.  To subscribe in the US, Canada, or Mexico,
write to American Institute of Physics, Subscriber Services, 500
Sunnyside Blvd., Woodbury, NY 11797-2999.  Elsewhere write to Order
Processing Dept., IOP Publishing Ltd, Techno House, Redcliffe Way,
Bristol BS1 6NX, UK.  Volume 1 is $99, with a single issue price of
$49.50; volume 2 is $215, with a single issue price of $54.  If you
subscribe to both volumes together, the price is $270.00.

If the prices look a bit steep, ask your favorite technical library to
subscribe, or have them request the first issue as a sample copy.

	*	*	*	*	*	*	*	*

Advisor Profile
Kantrowitz: Solutions, not Sacrifice
by Dan Shafer

        Prof. Arthur Kantrowitz of Dartmouth is at it again.  A man
whose life has been filled with, and perhaps characterized by, the
building of bridges and the creation of transitions has made another
grand leap.  If the past is any indicator, we'd all be well-advised to
pay attention to where he's landing.

        His career has moved from a starting point in atomic science
into fluid mechanics, where he applied the ideas of modern physics and
thereby made direct contributions to the space program, particularly
in re-entry from space.  From there, he bridged disciplines once again
as he applied the principles of fluid dynamics to blood flow and
hematology.  As a result, he has played a
major role in research and development of cardiac assist devices and
has made contributions to the understanding of blood clotting
processes.  From physics to hematology is two giant steps, but
Kantrowitz has made them seem natural, almost inevitable.

        Kantrowitz, a member of the Foresight Institute Board of
Advisors, is focusing most of his energies these days on advocating a
philosophy of optimism.  More than a philosophy, his approach to the
great scientific questions of our time is a hard-boiled,
policy-oriented method for dealing with the problems being created by
our technologies.  When such an idea emanates from the mind that
conjured up the Science Court of the mid-1970s and called attention to
"The Weapon of Openness" available to America in the 1980s (see
Foresight Background, No. 4), it ought to carry more than the usual
weight of authority.

        The greenhouse effect?  If it turns out to be real--and
Kantrowitz leaves little doubt he believes the jury is still out on
the issue--it ought, he says, to serve as a trigger to solution, not a
call to sacrifice.  "If the ozone layer is depleted and is being
consistently further depleted, we have to figure out something to do
about it.  Rather than using such phenomena as instruments to force us
to sacrifice, we should see them as a call to find solutions.  There
must be some creative way of fixing it.  For example, maybe some
chemistry grad student is sitting right now with a chemical solution
that would create ozone where it's needed.  I don't suggest that this
is a real solution, only that we ought to be thinking about solving
the problem rather than sacrificing.  Religious movements, rather than
technological breakthroughs, are built on sacrifice.  If--without
considering how to solve the problem--we go into a program of
sacrifices whose costs are measured in trillions of dollars, this is a
creature of the deepest pessimism.  Maybe it's time we took climate
control seriously instead of simply succumbing to the problem."

        Space colonization?  Again, only pessimism has kept it from a
success that we would already be enjoying.  "If we had a really
adventuresome space program, we'd already have people living in space.
If we had a space transportation system designed by some competitive
process rather than by some version of a centrally planned
economy--which works no better for such tasks than it did for Eastern
Europe--we'd have solved the problems years ago."

        In general, Kantrowitz tells us, "Pessimism leads us to
minimize risks and therefore reduces the rate of change by making
innovation difficult."  He points out that, "An optimistic society
realizes that mistakes will be in proportion to our technology.
Furthermore, we must remember that the problems we bequeath our
successors will be solved by their technologies, which will inevitably
be well beyond ours."  As a result, Kantrowitz finds himself largely
unworried by some of the technological issues that cause hand-wringing
by many other scientists and the general public.

        Although he advocates an open and optimistic approach to
science and technology, he does not believe such development should be
completely unbridled.  "In a pessimistic society such as the one we
have created, regulation is good.  We place paramount importance on
being safe.  If anyone wants to innovate, we must see that he takes
all responsibility for any harm.  As a result, our young people are
turning away from science and medicine and towards law.  Although it
is clear that we need some method for correcting malpractice of
various sorts, we can't survive if the only such system is one
designed of, by, and for the lawyers."

        It is at this point that Kantrowitz echoes some of the highly
original thinking that characterized his mid-1970s proposal to create
something called a Science Court.  This body would resolve factual
disputes between scientists by means of an adversary proceeding.
Kantrowitz proposed the idea as a result of his 1975-76 work with the
Presidential Advisory Group on Anticipated Advances in Science and
Technology.  Among other things, Kantrowitz saw this approach as a way
of avoiding the "trial by public opinion" in which he sees too many
such disagreements being resolved today.  "This process," he explains,
"would be conducted as an academic function.  Instead of addressing
themselves to the public, scientists with legitimate opposing views
would address themselves to each other as expert adversaries." In
fact, Kantrowitz called for a new norm in scientific behavior which
would insist that, "Any scientist who addresses himself to the public
must then be willing to answer questions from expert adversaries."

        Besides the inherent problems in public opinion holding sway
over scientific evaluation, Kantrowitz sees another evil in the
current system.  "The present pernicious practice of advertising
factual information in all kinds of media is aggravated by the fact
that someone decides which scientists and facts will receive even that
kind of a hearing."  He points out that when politicians and media
representatives have questions, they call people they view as experts,
generally people who appear on their call lists or on the call lists
of their colleagues.  "This practice means that the new thinkers, the
innovative idea people, are the least likely to receive an objective
hearing, or even to get access at all," Kantrowitz points out.


Dan Shafer is an author and consultant in computation and emerging
technologies.

	*	*	*	*	*	*	*	*

Books of Note

Books are listed in order of increasing specialization and reading
challenge.  Your suggestions are welcome.  And remember, if a book's
price looks too high, your library should be able to get it through
interlibrary loans.  --Editor

Intellectual Compromise: The Bottom Line, by Michael T. Ghiselin,
Paragon House, 1989, cloth, $24.95.  A critique of academia,
explaining how and why it strays from its own ideals.  Explains why a
large portion of intellectual work is now going on outside academia
in, e.g., think tanks.  Warning: may frighten students away from
academic careers.

A Handbook of Computational Chemistry, by Tim Clark,
Wiley-Interscience, 1985, cloth, $38.50.  A practical guide to
molecular mechanics and molecular orbital calculations.  Includes
information on MM2, a molecular mechanics program well-suited to the
design of molecular machinery.  For working chemists and molecular
systems engineers.

Intermolecular and Surface Forces, by Jacob Israelachvili, Academic
Press, 1985, cloth, $107.  Densely-packed information for the serious
molecular systems engineer; a modern classic.


	*	*	*	*	*	*	*	*

Upcoming Events

Compcon, Feb. 26-28, 1991, San Francisco, sponsored by IEEE.  Includes
plenary talk on nanotechnology, Feb. 26, 9:30 AM.  Contact Michelle
Aden, 408-276-1105.

Molecular Graphics Society Meeting, May 14-17, 1991, University of
North Carolina, Chapel Hill, NC.  Interactive graphics, presentation
graphics, interfaces, networking, novel display techniques; includes
vendor exhibition.  Contact Molecular Graphics Conference Office, c/o
Dr. Frederick P. Brooks, Jr., Dept. of Computer Science, University of
North Carolina, Chapel Hill, NC 27599-3175.

Space Development Conference, May 22-27, 1991, Hyatt Regency, San
Antonio, TX, sponsored by National Space Society, Southwest Research
Institute.  Cosponsored by Foresight Institute.  Will have a session
and possibly a workshop on nanotechnology.  Talk abstracts due Nov. 15
to Bob Blackledge, 719-548-2329.  Register before Jan. 1 at cosponsor
rate of $60: contact Beatrice Moreno, 512-522-2260.

STM '91, International Conference on Scanning Tunneling Microscopy,
August 12-16, 1991, Interlaken, Switzerland.  Contact Ch. Gerber, fax
(1) 724 31 70.

Second Foresight Conference on Nanotechnology, Nov. 1991, a technical
meeting sponsored by Foresight Institute, Stanford Dept. of Materials
Science and Engineering, University of Tokyo Research Center for
Advanced Science and Technology.  Dates and details to be determined;
please wait for future announcements.

Science and Technology at the Nanometer Scale, American Vacuum Society
National Symposium, Nov. 11-15, 1991, Seattle, WA.  See article
elsewhere in this issue.

Media Watch

The Summer 1990 issue of Caltech's magazine Engineering & Science
mentioned nanotechnology in an article reviewing various
nanometer-scale efforts at that institution.  Included was coverage of
STM work by Prof. John Baldeschwieler, a participant in the First
Foresight Conference on Nanotechnology.

The June 3 issue of The Sunday Correspondent (London) explained
nanotechnology as part of a review of the book Engines of Creation,
now available in Britain from Fourth Estate.  The June/July High
Technology Careers magazine covered nanotechnology as an approach to
building exotic materials.  The July 1990 Computer Shopper (London)
described the bottom-up approach to nanotechnology in an article by
Adrian Owen.  The August 1990 issue of Self magazine briefly covered
the prospects for advanced medicine using nanotechnology, including
"molecular surgery."


Thanks

Special thanks go to Jeannine Smyth for her extensive work in
redesigning the Foresight logo and other materials; readers should
start to see the results soon.

Thanks to Bob Kirby of the Technology and Society Committee for
arranging a lecture on nanotechnology to his group, and to Dave
Kilbridge for converting IBM text to Macintosh format.

Thanks to the following for sending technical articles and media
coverage; please keep these coming: Robert Allgeier, Keith Davison,
Allan Drexler, Jerry Fass, David R. Forrest, Robin Hanson, Mark
Haviland, Alan Hold, Wlodek Mandecki, B. Molnar, Anthony Oberley,
Roger J. Plog, Jack Powers, Edward Rietman, Jack Veach, Steven C.
Vetter, Michael Weber.


Letters

The Foresight Institute receives hundreds of letters requesting
information and sending ideas.  Herewith excerpts:

I am writing to represent the academic debate team of Henry Ford II
High School, Sterling Heights, Michigan.  We have studied and
researched the topic of nanotechnology for some time now, and have
developed a debate case to increase space exploration via
nanotechnology which won first place at a recent tournament.  On
behalf of my team, I extend our thanks to Eric Drexler and the
Foresight Institute for developing this captivating and important
field.  If possible, I would like you to send us any available
information on the subject of nanotechnology.  Anything you send will
be greatly appreciated.

Brian Wassom
Sterling Heights, Michigan

We have prepared a package of materials for high school debaters.  Due
to the large number of debaters, we ask that a $4 donation accompany
each request.

Do proceedings exist for the First Foresight Conference on
Nanotechnology?  How may I obtain or purchase them?

Also, I am very interested in the idea of simulating nanotechnological
concepts in order to examine problems or potential designs of
simplified molecular machines, mainly for educational or instructional
purposes.  By this I do not mean complex protein folding computations.
Are you aware of work being done in this area?  I am thinking of a
demonstration program more like Richard Dawkins's BIOMORPH, a program
based on simplified physical laws.  Please let me know if you know of
anything along these lines, and whether writing a program like this
would be a waste of time at this point.

Robert L. Virkus
Dallas, TX

A proceedings volume is in progress, edited by James Lewis of Oncogen
in Seattle.  We'll let you know when it's available.

A great deal of work has been done, and continues to be done, on
computer modeling of molecular systems, both simple and complex.  The
simplest programs are used to draw molecules: they may know how many
bonds each atom can make and at what angles.  Molecular mechanics
programs can take a designed structure and minimize its energy, i.e.
find the most stable configuration.  The most computation-intensive
programs use quantum mechanical methods to calculate the properties of
molecules and of chemical reactions.  In our next issue we plan to
review a new molecular mechanics package for the Mac II.  Before you
write your own software, we'd advise a thorough inspection of programs
already available.  --Editor
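To give a feel for what "minimize its energy" means, here is a toy
sketch, not MM2 or any real force field.  Real molecular mechanics
packages sum bond, angle, torsion, and nonbonded terms over thousands
of atoms; this example assumes only a single Lennard-Jones pair
interaction (with epsilon and sigma set to 1 for illustration) and
finds the most stable separation of two atoms by steepest descent.

```python
def lj_energy(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones potential for two atoms at separation r."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def lj_force(r, epsilon=1.0, sigma=1.0):
    """Negative derivative of the potential: the force along the pair axis."""
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * sr6 * sr6 - sr6) / r

def minimize_separation(r=1.5, step=0.01, tol=1e-8, max_iter=10000):
    """Steepest descent: follow the force downhill until it vanishes."""
    for _ in range(max_iter):
        f = lj_force(r)
        if abs(f) < tol:
            break
        r += step * f  # moving along the force lowers the energy
    return r

r_min = minimize_separation()
# The analytic minimum lies at r = 2**(1/6) * sigma, about 1.1225
```

The same idea, applied per atom with a vastly richer energy function,
is what lets a molecular mechanics program relax a designed structure
into its most stable configuration.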

I'm passing along another tidbit related to Internet computer access,
which may interest the Chicago/Midwest readers of Foresight Update.
In the Chicago area there is a public-access UNIX bulletin board
system; the modem number is 312-714-6568.  The first two weeks of full
Internet access are free.  A donation of $40 per year is asked to
continue Internet access.  All the Internet goodies are available as
far as I know, including sci.nanotech. . .

John Papiewski
Palatine, IL

Foresight encourages those with online capability to join in the
nanotechnology discussion taking place in the sci.nanotech USENET
newsgroup.


Gift Subscriptions

Foresight Update and many of our other publications (e.g. Backgrounds,
new Briefings) are not sold as subscriptions per se, but are sent to
all who make a minimum donation to the organization.  Currently the
minimum donation we request is $25 per year.

The holiday season is approaching fast.  This is the last issue of
Update to be published before then, so we'll take this opportunity to
invite you to give Foresight for your holiday gifts.  A donation of
$25 will bring Foresight publications to your gift recipient for
twelve months, along with a note identifying you as the giver.


Foresight Email

The Foresight Institute can now be reached using electronic mail.  Our
address is foresight@cup.portal.com.  This address should be reachable
from most nets, including Internet and USENET.  There is also a
gateway to connect CompuServe with addresses in this format.

Wish List

We could use the following materials and help: Macintosh computers, an
additional Apple Laserwriter, an additional fax machine, and a small
photocopier.  Office space in the Palo Alto area is needed as well.
We are in need of volunteer help with laying out our publications,
using PageMaker software on the Macintosh.  Fundraising experience,
including grantwriting, would be of great use.  Note that donations of
equipment or funds are tax-deductible as charitable contributions.

If you or your company can help, call us at 415-324-2490.

Molecular Artworks

Under the title "Nanotechnology and the Miniature Arts," the journal
Leonardo has issued a call for papers dealing with artworks created on
very small scales.  The journal focuses on the interface between art,
science, and technology.  Specific topics of interest include: history
of miniature art, genetically engineered artworks, artworks invisible
to the naked eye, theoretical aspects of scale, and scientific
visualization of microscopic phenomena.

Anyone who has seen high-quality molecular modeling programs running
on a color monitor may agree that molecular artwork is already being
routinely created.  Interested authors should send manuscript
proposals to Pamela Grant-Ryan, Managing Editor, Leonardo, 2030
Addison St., Suite 400, Berkeley, CA 94704, USA.  Electronic mail can
be sent to isast@garnet.berkeley.edu.



+---------------------------------------------------------------------+
|  Copyright (c) 1990 The Foresight Institute.  All rights reserved.  |
|  The Foresight Institute is a non-profit organization:  Donations   |
|  are tax-deductible in the United States as permitted by law.       |
|  To receive the Update and Background publications in paper form,   |
|  send a donation of twenty-five dollars or more to:                 |
|    The Foresight Institute, Department U                            |
|    P.O. Box 61058                                                   |
|    Palo Alto, CA 94306 USA                                          |
+---------------------------------------------------------------------+