[sci.nanotech] Update #4

nanotech@athos.rutgers.edu (Nanotechnology Newsgroup Nexus) (04/15/89)

[hacked, as usual, from a MAC binary file. any errors my fault.  --JoSH]

+---------------------------------------------------------------------+
|  The following material is reprinted *with permission* from the     |
|  Foresight Update No 4, 10/15/88.                                   |
|  Copyright (c) 1988 The Foresight Institute.  All rights reserved.  |
+---------------------------------------------------------------------+

Inside:
Manufacturing with Nanotechnology
Upcoming Events
Talks
Letters to FI
How Many Bytes in Human Memory?
Hypertext Publishing Progress
The Road to Nanomachine Design
Biostasis Research
Kantrowitz Joins Board
Books of Note
Media Coverage
Nanotechnology BBS
Recent Progress
"Human Frontiers" Advances
Precollege Training
Next issue:
AI Directions 
 Government policy affects nanotechnology, hypertext 
 Nanotechnology course at U of T
 Biotechnologists introduced to nanotechnology 
 Meet the advisors: Part 1 
 Technical advances

A publication of the Foresight Institute
Preparing for future technologies
 Board of Advisors
Stewart Brand
Gerald Feinberg
Arthur Kantrowitz 
Marvin Minsky
Board of Directors
K. Eric Drexler, President
Chris Peterson, Secretary-Treasurer
James C. Bennett
Editor   Chris Peterson
Publisher   Fred A. Stitt
Assembler   Russell Mills
Publication date  15 Oct 88
(c) Copyright 1988 The Foresight Institute.
All Rights Reserved.

If you find information and clippings of relevance to FI's goal of
preparing for future technologies, please forward them to us for
possible coverage in FI Update. Letters and opinion pieces will also
be considered; submissions may be edited. Write to the Foresight
Institute, Box 61058, Palo Alto, CA 94306; Telephone 415-364-8609.


BioArchive Project

Saving Species through Nanotechnology

by Chris Peterson

In the 1990s, over 10,000 species per year are expected to become
extinct. Three-quarters of the world's animal species may vanish in
the next 25 years. Besides losing the intrinsic value of these animals
and plants as part of today's environment, we face the destruction of
priceless genetic information evolved over millions of years.
Nanotechnology will one day let us restore lands torn by industry and
agriculture, but without this genetic information, today's species and
ecosystems will be lost forever. The simplest way to preserve species
is to preserve their habitats, but the immediate survival needs of
nearby human populations often make this practically impossible.

An alternate way to preserve endangered species was suggested in Eric
Drexler's book Engines of Creation: preserving tissue samples in
cryogenic storage.  He pointed out that "preserving just tissue
samples doesn't preserve the life of an animal or an ecosystem, but it
does preserve the genetic heritage of the sampled species. We would be
reckless if we failed to take out this insurance policy against the
permanent loss of species. The prospect of cell repair machines thus
affects our choices today." To pursue this option, the Foresight
Institute is initiating the BioArchive Project.

This project will coordinate existing field workers with existing
cryopreservation facilities to collect and store samples from
endangered animal and plant species, establishing a group of low-cost
gene banks distributed around the U.S. and--ideally--the world. Since
the rainforest environment of the Amazon River basin is both rich in
species and under intense pressure from human populations, it is a
natural focus for early efforts.

We are fortunate that the task of freezing species samples was begun
even before understanding of nanotechnology showed how they could be
restored. Germ cells of endangered species, along with other cells
from common animals, are stored at liquid nitrogen temperatures at the
Center for Reproduction of Endangered Species (CRES) at the San Diego
Zoo. This effort focuses on freezing germ cells and embryos, since
when warmed up they are often viable without the need for cell repair
technology. The freezers containing these treasures have been labeled
"Frozen Zoo: Twentieth Century Ark."

Dr. Barbara Durrant told Update that "Right now the Frozen Zoo
contains cells representing virtually every mammalian species on
Earth. Mostly, these are blood and skin cells for chromosome studies
that help us in making breeding decisions." Dr. Durrant explained that
quite a few bird, reptile, and amphibian species were included, but no
insects.

FI's goal is to spread awareness of the long-term value of such
samples, to establish multiple sites as backups in case of disaster,
and to develop a collection program so broad that even the many
unknown, unclassified species are included, besides the well-known
larger animals.

Seeds from today's plants are protected in seed banks, but again more
sites are needed for redundancy's sake. We need to verify that storage
is at sufficiently low temperatures, and that non-agricultural plants
and even "weeds" are sampled.

To ensure that ecosystems--not just individual species--can someday be
restored, we will encourage sampling of the widest possible range of
plants and animals in an endangered area. This can be done far less
expensively if no effort is made to identify each species or to keep
them separate. With future technology to sort out the sampled cells,
present day techniques can be quick and crude: To sample rainforest
trees, use a helicopter to drag a bucket-rake through the canopy, then
freeze the leaf fragments. To sample soil insects, use standard
progressive-drying techniques on soil samples to drive them out for
freezing. A variety of techniques used by ecologists to sample
populations will be applicable. We will need to freeze only a small
volume of material from each area; this volume can be minimized (and
costs reduced) by pulverizing and mixing samples before sending
portions to each storage facility. To succeed, one need get only a few
cells from each organism, and a cubic millimeter of tissue typically
contains a million cells.
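
As a rough check on that last figure, assuming a typical cell diameter
of about 10 micrometers (an illustrative value, not from the article):

    # Rough check on "a cubic millimeter of tissue typically contains
    # a million cells", assuming 10-micrometer cells.
    cell_diameter_mm = 0.01                  # 10 micrometers
    cells_per_edge = 1.0 / cell_diameter_mm  # ~100 cells span one millimeter
    cells_per_mm3 = cells_per_edge ** 3      # ~1,000,000 per cubic millimeter
    print(f"{cells_per_mm3:,.0f} cells per cubic millimeter")  # 1,000,000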

There may well be sample preservation efforts of which we're unaware,
but an education effort is still required: the keepers of these
samples need to understand that the DNA information itself is
valuable, not just viable germ cells or viable seeds. (Literature from
CRES, for example, stresses that they store "living cells and
tissues.") Without this understanding, samples might be discarded if
they could not spontaneously recover from freezing.

Why is the Foresight Institute the right group to start this project?
First, we have the interest: our surveys show that over 95% of us see
saving endangered species as an issue of "major," "historic," or
"life-and-death" importance. Second, we can play a role in educating
and coordinating at the start, and then step back: the project has a
high payoff-to-effort ratio, so it isn't too ambitious for us to
tackle. But most important, we are the only group which already sees
the potential of restoring species from DNA only, via nanotechnology.

Heroic efforts to save species are already underway; groups like
Conservation International play a key role in the habitat-preservation
effort. FI fully supports this work, but we recognize that the high
cost of preserving habitats means that many species will perish unless
another method is tried. We see the BioArchive as a way to save the
many species which existing conservation efforts can't reach in time,
and a way to restore ecosystems when (as we expect) room for natural
habitat begins to expand.

There may be objections to the BioArchive approach. Nature reports
that "Scientists in the IUCN [Union for the Conservation of Nature and
Natural Resources] argue that genetic resources are better protected
in situ, by preserving species in their natural habitat to protect the
full range of genetic variability, an advantage not shared by
conservation in gene banks..." The IUCN advocates financing this work
through a tax on commercial and industrial users of genetic material.
But such an international tax would not be enough, even if it were
collected: the total net profits for 1987 of the top fifteen
biotechnology firms were only $7.7 million, with over half of the
companies having a loss (The Economist, 30 Apr 88). These profits will
increase over time, but species need preservation now.

Some might claim that a BioArchive effort will encroach on
habitat-preservation resources and decrease the sense of urgency now
felt for conservation efforts. We argue that the resources needed are
minimal in comparison, and that the sense of urgency will only
decrease to the extent that people are sure our approach will
work--and if it will work, and if we know that it will, what would the
future think of us for condemning so many species to an avoidable
extinction? Our goal must be to maximize the results given the
resources available, which are frighteningly modest. Besides, starting
an additional effort will likely draw additional press coverage and
additional funding.

The benefits of the project are clear. It will:
-- save species and ecosystems in the long term
-- make clear the lifesaving potential of nanotechnology
-- build our natural alliance with politically-powerful environmental
groups
-- draw attention to the potential of advanced technologies--if used
well--to help us heal and restore the Earth.

We're looking for volunteers to get the project off the ground. Here's
what needs to be done:

Research: What efforts are already underway? Do the people involved
understand the value of DNA as distinct from viable germ cells? Which
scientists are already collecting and freezing samples?

Networking: Which environmental groups are interested? Is there
another group in a better position than FI to coordinate this, and can
they be convinced to take it on?

Planning: We need creative ideas on how to make this happen. For
example, perhaps funds could be raised through an "Adopt-a-Species"
approach, in which donors are rewarded with a certificate saying
"You helped save 1000 endangered species."

Those interested should write FI to volunteer. Be aware that this is
still in the idea stage; we are just now starting a database of
volunteers. If you are experienced and concerned enough to head up the
effort, perhaps to coordinate other volunteers, please call us at
415-364-8609.

Nanotechnology Course

About fifty students attended the ten-week course on "Nanotechnology
and Exploratory Engineering" taught by FI's president Eric Drexler at
Stanford during the spring quarter. The main body of the course was
highly technical, drawing from the disciplines of physics, chemistry,
computation, and engineering. Later sessions addressed applications in
space development, warfare, and medicine, along with policy issues and
an analysis of where we are today in developing the technology.

The course was audiotaped, and Jim Stevenson of NASA Ames and Jim
Turney of Liberty Audio have volunteered to help produce a set of
tapes suitable for distribution by FI.  Please do not contact us yet
for copies; we'll announce when they're ready. The tapes will probably
not be transcribed, since they run for about twenty hours.

Many people have contacted us for further information on the course,
such as which textbook was used. Engines of Creation was recommended
reading, but copies of Drexler's technical papers were used as the
main course notes. These are the same papers we've referenced in the
FI Updates and Backgrounds. We can send you a copy of the syllabus;
just send a stamped, self-addressed envelope.  A molecular mechanics
handout is also available, but will only be comprehensible to those
who already have some knowledge of molecular mechanics.

Many have also asked when the course will be taught again: there are
no plans to repeat the course in the immediate future. Instead, the
instructor is working on a technical book which will both contain much
more information than can be conveyed in the classroom and will be
available to many more people.



Manufacturing with Nanotechnology

by Jerry Fass

 Nanotechnology-based manufacturing techniques should yield great
increases in productivity and wealth. Improvements in two techniques
in particular will greatly decrease resource requirements: the
incorporation of voids, and wearproofing.

Voids

Whenever possible, objects can incorporate carefully shaped voids to
save cost and mass. Generally, voids are more useful for large systems
or those under low loads. They can range in size from arbitrarily
large down to a fraction of a nanometer wide; the upper limit is set
by device size, the lower by the scale of atoms. For structures under
light compressive loads, voids formed in fractal patterns can yield
maximum efficiency.

Today's bulk manufacturing can produce large, irregular voids at
reasonable cost, as in foam rubber and insulation. Nanomachines should
be able to produce uniform voids down to one atom across, thereby
cutting the mass, cost, energy, and time needed for production. The
biggest gains will be for objects with structural loads in pure
compression or mixed compression and tension; fortunately this
includes the majority of objects we use, such as furniture, doors,
most walls, and appliances. The void fraction of these could be very
high, perhaps 99% or more. Highly loaded objects (e.g., engine parts)
will benefit less, and highly loaded tension systems (e.g., cables and
pressure vessels) will benefit little.
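
A rough illustration of how strongly the void fraction cuts material
use; the object, material, and density below are hypothetical:

    # Material needed for an object of fixed outer volume as the void
    # fraction rises.  Numbers are illustrative only.
    def solid_mass_kg(volume_m3, density_kg_m3, void_fraction):
        """Mass of material actually required when a fraction of the
        outer volume is empty space."""
        return volume_m3 * density_kg_m3 * (1.0 - void_fraction)

    volume = 2.0 * 1.0 * 0.04   # a door-sized panel, 0.08 m^3
    density = 3500.0            # a diamond-like carbon, kg/m^3

    for vf in (0.0, 0.90, 0.99):
        print(f"void fraction {vf:.0%}: "
              f"{solid_mass_kg(volume, density, vf):.1f} kg of material")
    # 0%: 280.0 kg,  90%: 28.0 kg,  99%: 2.8 kg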

Incorporating voids, combined with scavenging heavy pre-nanotechnology
parts, will allow us to recycle old systems into multiple new ones
without new material resources, reducing the need for mining and
refining.

Wearproofing

Wear limits the lives of mechanical and structural systems, which
often attain a reasonable lifetime only by having worn-out parts
replaced. (An annoying example is the modern automobile.) Wear is
cumulative and can seem exponential, as worn parts increase wear on
other parts. The aim of wearproofing is to head off the wear process,
with the increasingly ambitious goals of longer-lasting parts,
zero-wear parts, and finally self-repair. Using nanotechnology, we can
expect improvements in:

-- Tribology--the science of why and how objects wear. Nanomachines
should greatly aid collection of data needed to further advance this
field.

-- Hardness--surfaces of harder materials wear more slowly. Surfaces
of ceramic or diamond, or perhaps the new form of carbon, "C8",
reported by Soviet researchers, will last much longer. Nanomachines
could apply such coatings, and powerful computers may allow us to
design new ones.

-- Friction control--lubricants and bearings. All three means of
lubrication--solids (Teflon, graphite), liquids (oil), and gases
(air)--are improving rapidly thanks to improved data and computer
analysis. Contact bearings will benefit from ever-tighter tolerances
and more rugged materials. A revolutionary non-contact bearing, the
electromagnetic bearing, repulsively or attractively levitates moving
parts in a magnetic or electric field, with zero friction; wear can
often be practically eliminated by having such a bearing gently "land"
a part after it stops moving. The new high-temperature superconductors
will make such bearings smaller, stronger, and more precise; since
they are often computer-controlled, nanocomputers will be helpful too.
And of course, Drexler's suggested atomically-precise sigma bond and
van der Waals bearings will not wear in any conventional sense.

Wear on tools can be reduced even for bulk processes by forming parts
using non-contact methods such as explosives, lasers, electron beams,
plasma torches, water jets, and electromagnetic forming instead of
drill bits, grinding wheels, and the like. There may be uses where
such macro-tools forever outperform nano-tools: perhaps in well
drilling, tunneling, and excavating.

Synergies between the above techniques can be expected; for example,
making an object with voids but covering the surface with diamond. And
besides saving energy in manufacturing, we can expect to do so in
transport as well: objects will last longer and so need to be
delivered less often, they will weigh less when they do need
transport, and with nanoproduction systems--quiet, small, flexible,
and clean--manufacturing on-site becomes a possibility.

But eventually, we can expect self-repair to solve the wear problem.

Jerry Fass is a part-time science writer based in Wisconsin. He also
coordinates FI's journal monitoring project.

Upcoming Meetings: [I've deleted those that already happened -- JoSH]

Second Conference on Molecular Electronics and Biocomputers, Sept.
1989, Pushchino, USSR. Contact Dr. P.I. Lazarev, Research Computing
Centre of the Academy of Sciences of the USSR, Pushchino, Moscow
Region, 142292 USSR.

4th International Symposium on Molecular Electronic Devices, Oct.
1989, Baltimore/DC area. Contact Dr. Richard Potember, 301-953-6251.

Letters:

The Foresight Institute receives hundreds of letters requesting
information and sending ideas. Herewith some excerpts:

One of the things that would be most helpful to me right now is a
micro-hypertext system that could be used for organizing my personal
and professional work. If you're going to be developing hypertext, why
not plan from the beginning for a PC version to which it could
interface but which could be marketed separately?

John Simms
Math Dept.
Marquette University



An interesting question for proponents of nanotechnology: The
prototype products could well cost trillions in research and
development. Producers are faced with bringing very expensive products
to market, while new competitors, privy to many of the same memes,
could bring very similar products to market literally dirt cheap.
Where is the incentive for pioneering efforts?

Robert J. Hurt
Denver, CO


I am interested in doing molecular graphics on my own computer.
However, it's not a Macintosh, but rather an IBM-PC XT clone. Do you
know of any (reasonably priced) molecular graphics programs for IBM
compatibles? I've been working on writing my own, but I'm a better
programmer than chemist.

I believe that the time is ripe for a low end molecular CAD
[computer-aided design] program. The hardware is adequate, if you
don't require real time animation. The interest is there. It would
really aid this field if a standard data format could be established,
to avoid the incompatibilities found between rival mechanical CAD
programs. The more people we can get hacking away at new molecular
devices, and the better they can communicate, the sooner we will get
assembler technology. I would just as soon have the breakthrough made
by private industry or individuals [rather than governments].

Brett P. Bellmore
Capac, MI


Computer modeling of molecules, and eventually of molecular machines,
is a key part of the path to nanotechnology. Jerry Fass has brought to
our attention a shareware program called MoleculeM for building and
displaying 3D models of molecular structures. It is said to have
built-in bonding and ionization rules and full rotation abilities. The
companion program Chemview is said to make animated 3D rotation models
with each atom a different color. For a free catalog call Public Brand
Software at 800-426-3475.

However, much more sophisticated programs will be required to do the
modeling we need. There's a commercial opportunity here. --Editor



Talks

Talks on nanotechnology continue to expose the concepts to critique
and refinement. These have included: a presentation to an academic
audience at the University of Colorado at Colorado Springs; a keynote
talk for DEC's Futures of Computing Workshop; a talk at IBM Santa
Teresa as part of their Advanced Education Series, and a presentation
to Upjohn executives. Other recent talks include a talk by Dr. Ralph
Merkle on "Nanotechnology: Implications for Life Extension" at a
conference in early September, a talk to the Government Systems
Management Club of Control Data, and a lecture to a class at the LBJ
School at the University of Texas at Austin. Past talks at the Third
International Conference on Supercomputing and the National Space
Society's Space Development Conference will appear as papers in the
proceedings volumes; we'll let you know when they're available.

Talks on other topics of interest to FI have included a presentation
by Mark S. Miller on Agoric Open Systems at the Open Systems Workshop
held at Xerox PARC in June, and a talk on "Hypertext Publishing and
the Evolution of Knowledge" at Sun Microsystems in July.

How Many Bytes in Human Memory?

by Ralph Merkle

Today it is commonplace to compare the human brain to a computer, and
the human mind to a program running on that computer. Once seen as
just a poetic metaphor, this viewpoint is now supported by most
philosophers of human consciousness and most researchers in artificial
intelligence. If we take this view literally, then just as we can ask
how many megabytes of RAM a PC has, we should be able to ask how many
megabytes (or gigabytes, or terabytes, or whatever) of memory the
human brain has.

Several approximations to this number have already appeared in the
literature based on "hardware" considerations (though in the case of
the human brain perhaps the term "wetware" is more appropriate). One
estimate of 10^20 bits is actually an early estimate (by Von Neumann in
The Computer and the Brain) of all the neural impulses conducted in
the brain during a lifetime. This number is almost certainly larger
than the true answer. Another method is to estimate the total number
of synapses, and then presume that each synapse can hold a few bits.
Estimates of the number of synapses have been made in the range from
10^13 to 10^15, with corresponding estimates of memory capacity.
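
For a sense of scale, the capacities implied by those synapse counts
are easy to work out; taking "a few bits" as 2 bits per synapse for
illustration:

    # Capacity implied by the synapse-counting estimates above,
    # using 2 bits per synapse as an illustrative value.
    bits_per_synapse = 2
    for synapses in (1e13, 1e15):
        total_bits = bits_per_synapse * synapses
        terabytes = total_bits / 8 / 1e12
        print(f"{synapses:.0e} synapses -> {total_bits:.0e} bits "
              f"(~{terabytes:.1f} terabytes)")
    # 1e13 synapses -> ~2.5 terabytes; 1e15 synapses -> ~250 terabytes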

A fundamental problem with these approaches is that they rely on
rather poor estimates of the raw hardware in the system. The brain is
highly redundant and not well understood: the mere fact that a great
mass of synapses exists does not imply that they are in fact all
contributing to memory capacity. This makes the work of Thomas K.
Landauer very interesting, for he has entirely avoided this hardware
guessing game by measuring the actual functional capacity of human
memory directly (See "How Much Do People Remember? Some Estimates of
the Quantity of Learned Information in Long-term Memory", in Cognitive
Science 10, 477-493, 1986).

Landauer works at Bell Communications Research--closely affiliated
with Bell Labs where the modern study of information theory was begun
by C. E. Shannon to analyze the information carrying capacity of
telephone lines (a subject of great interest to a telephone company).
Landauer naturally used these tools by viewing human memory as a novel
"telephone line" that carries information from the past to the future.
The capacity of this "telephone line" can be determined by measuring
the information that goes in and the information that comes out, and
then applying the great power of modern information theory.

Landauer reviewed and quantitatively analyzed experiments by himself
and others in which people were asked to read text, look at pictures,
and hear words, short passages of music, sentences, and nonsense
syllables. After delays ranging from minutes to days the subjects were
tested to determine how much they had retained. The tests were quite
sensitive--they did not merely ask "What do you remember?" but often
used true/false or multiple choice questions, in which even a vague
memory of the material would allow selection of the correct choice.
Often, the differential abilities of a group that had been exposed to
the material and another group that had not been exposed to the
material were used. The difference in the scores between the two
groups was used to estimate the amount actually remembered (to control
for the number of correct answers an intelligent human could guess
without ever having seen the material). Because experiments by many
different experimenters were summarized and analyzed, the results of
the analysis are fairly robust; they are insensitive to fine details
or specific conditions of one or another experiment. Finally, the
amount remembered was divided by the time allotted to memorization to
determine the number of bits remembered per second.

The remarkable result of this work was that human beings remembered
very nearly two bits per second under all the experimental conditions.
Visual, verbal, musical, or whatever--two bits per second. Continued
over a lifetime, this rate of memorization would produce somewhat over
10^9 bits, or a few hundred megabytes.
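
A back-of-envelope version of that lifetime extrapolation: the 2
bits/second rate is Landauer's, while the waking-hours and lifespan
figures below are illustrative assumptions:

    # Lifetime memory accumulation at Landauer's measured rate.
    # Waking hours and lifespan are assumptions chosen for illustration.
    import math

    bits_per_second = 2.0
    waking_seconds_per_day = 16 * 3600   # assume 16 waking hours per day
    days = 70 * 365                      # assume a 70-year span

    total_bits = bits_per_second * waking_seconds_per_day * days
    total_megabytes = total_bits / 8 / 1e6
    print(f"about 10^{math.log10(total_bits):.1f} bits "
          f"(~{total_megabytes:.0f} megabytes)")
    # -> about 10^9.5 bits, i.e. "somewhat over 10^9 bits", or a few
    #    hundred megabytes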

While this estimate is probably only accurate to within an order of
magnitude, Landauer says "We need answers at this level of accuracy to
think about such questions as: What sort of storage and retrieval
capacities will computers need to mimic human performance? What sort
of physical unit should we expect to constitute the elements of
information storage in the brain: molecular parts, synaptic junctions,
whole cells, or cell-circuits? What kinds of coding and storage
methods are reasonable to postulate for the neural support of human
capabilities? In modeling or mimicking human intelligence, what size
of memory and what efficiencies of use should we imagine we are
copying? How much would a robot need to know to match a person?"

What is interesting about Landauer's estimate is its small size.
Perhaps more interesting is the trend--from Von Neumann's early and
very high estimate, to the high estimates based on rough synapse
counts, to a better supported and more modest estimate based on
information theoretic considerations. While Landauer doesn't measure
everything (he did not measure, for example, the bit rate in learning
to ride a bicycle, nor does his estimate even consider the size of
"working memory") his estimate of memory capacity suggests that the
capabilities of the human brain are more approachable than we had
thought. While this might come as a blow to our egos, it suggests that
we could build a device with the skills and abilities of a human being
with little more hardware than we now have--if only we knew the
correct way to organize that hardware.

Dr. Merkle's interests range from neurophysiology to computer
security. He recently spoke on nanotechnology and biostasis at the
Life Against Death Conference in San Francisco.

Molecular CAD

A computer graphics researcher at the Research Institute of Scripps
Clinic (Department of Molecular Biology) would like to collaborate
with others sharing his interest in computer-aided design tools
leading toward nanotechnology. Interested readers with skills useful
to such a project should send a letter to FI for forwarding to
Scripps.


Hypertext Publishing Progress
by Chris Peterson

Over twenty years after it was first envisioned, the goal of hypertext
publishing is finally near. As late as a year ago interest in a system
for publishing, not just swapping stand-alone hypertexts, was still
confined to a few scattered proponents. As a last resort, FI was even
considering trying to fund development of a system ourselves, since
the commercial sector seemed so uninterested. But now this has
changed.

Why has interest in the topic bloomed after so many years? Ironically,
much of it may be traceable to a misunderstanding. When Apple Computer
was ready to bring out a new software construction kit named
"Wildcard," they found the name already taken, and the owner unwilling
to sell it. (Files created by the final program are still labeled
internally as created by "WILD.") Marketing settled on the substitute
name "HyperCard." Long-time hypertext proponents were annoyed by the
name and by advertising which touted the product as hypertext, since
HyperCard is not hypertext as the word had been used. They correctly
assumed that confusion would result. But as Xanadu hypertext pioneer
Ted Nelson has pointed out, the publicity has been good for hypertext:
people assumed that if Apple was interested in hypertext, it must be
good. Suddenly it was all the rage, and in this avalanche of interest
there were a few farsighted people who focused on the original vision.
And those people are making all the difference.

John Walker of Autodesk, who had been interested in hypertext long
before HyperCard, is said to have assumed that (surely!) some large
company was funding hypertext development. He was reportedly appalled
to find instead that the classical hypertext development group,
Xanadu, was scraping along on volunteer labor. Fortunately, as
chairman of Autodesk--a company best known for its highly-popular
AutoCAD computer aided design products--he was in a position to solve
this problem.

Arranging for Autodesk to acquire 80% of Xanadu Operating Company was
a challenge: in its many years of struggling corporate existence
Xanadu had accumulated many stakeholders and piles of confusing legal
paperwork. Closing the deal became a task for Roger Gregory (longtime
leader of the group) working with Phil Salin of the aptly-named
consulting team, Venture Acceleration. The heartfelt thanks of all of
us who've longed for real hypertext go to these three people.

A side note: we were pleased to hear from John Walker that FI's
president Eric Drexler played a role in this as well: the vision of
hypertext publishing presented in his book, Engines of Creation,
helped convince Autodesk to proceed with the unorthodox deal.

Now Xanadu is rolling: the company has offices, equipment, and a
programming team at work turning out product. At this spring's West
Coast Computer Faire they announced plans to release their first
product within 18 months. This will be hypertext software for
individuals and small groups of under ten people, using technology of
the sort needed for a full public hypertext publishing system, and
providing a stepping stone toward the larger system.

The Xanadu system is divided conceptually into two parts, the backend
and the frontend. The backend handles storage, retrieval, versioning,
linking, and editing of data with no knowledge of the nature of the
data being handled, and with no direct contact with the user.
Frontends are advanced application packages that:

-- interact with users,
-- interface to input and output devices,
-- interpret the stored data as text, graphics, or music,
-- format data for display,
-- incorporate numeric manipulations and other operations to be
carried out on data,
-- are specialized to serve particular market needs,
-- create, implement, and manage the metaphors for working in
hypertext and hypermedia environments.

Frontends for Xanadu will be primarily produced by third party
developers. Late this year, experimental versions of the backend
software will begin to be licensed to researchers and advanced
software developers interested in starting hands-on hypertext
experience and thinking about design issues for front-end software.
(The functionalities of the experimental version will be carried
forward in the software product, but the syntax is expected to
change.) Such developers should write for further information: Xanadu
Operating Company, 550 California Ave., Suite 101, Palo Alto, CA
94306, Attn: Gayle Pergamit.
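
For readers who think in code, here is a minimal sketch of that
backend/frontend division of labor. The class and method names are
hypothetical illustrations, not the actual Xanadu interface:

    # Hypothetical sketch of a storage/linking backend that knows
    # nothing about data types, plus one specialized frontend.
    class Backend:
        """Stores, versions, and links raw byte spans; has no idea
        whether the bytes are text, graphics, or music."""
        def __init__(self):
            self.documents = {}   # doc_id -> list of versions (bytes)
            self.links = []       # (from_doc, to_doc) pairs

        def store(self, doc_id, data):
            self.documents.setdefault(doc_id, []).append(data)

        def retrieve(self, doc_id, version=-1):
            return self.documents[doc_id][version]

        def link(self, from_doc, to_doc):
            self.links.append((from_doc, to_doc))

    class TextFrontend:
        """Interprets stored bytes as text and formats them for the
        user; a graphics or music frontend would interpret the same
        backend data differently."""
        def __init__(self, backend):
            self.backend = backend

        def show(self, doc_id):
            print(self.backend.retrieve(doc_id).decode("utf-8"))

    backend = Backend()
    backend.store("note-1", b"Hypertext publishing is finally near.")
    TextFrontend(backend).show("note-1")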

Interest in hypertext publishing now extends beyond Xanadu. ON
Technology--run by Mitch Kapor, founder of Lotus Development--is
developing an object-oriented software platform which could be
extended to support hypertext publishing. ON is rumored to be
considering this possibility, which may be within reach now that
(according to MacWeek) they have obtained $3 million in venture
capital.

Apple has formed a working group on "collaborative hypertext." Combine
this with their existing efforts in hypertext and it adds up to real
interest; Apple will have a big impact on this field if it chooses.

Keep an eye on Doug Engelbart & Co. too; as a hypertext pioneer he is
well-placed to stimulate the creation of a valuable system. He and
colleagues Howard Franklin and Christina Engelbart have not yet
announced their plans, but if Doug doesn't build a new system himself,
he will inspire further efforts by others.

In the nonprofit sphere, software developers are aiming to incorporate
hypertext publishing features into USENET, the giant international
computer network running on UNIX-based computers. Several
brainstorming meetings were held in the San Francisco area in May, led
by Eric Raymond (eric@snark.uucp or uunet!snark!eric).

Meanwhile Kirk Kelley at Sun Microsystems is working to ensure that
the various hypertext systems will be able to exchange information.
This is a critical effort--having conflicting standards in hypertext
publishing would be like having conflicting standards for phone lines
or fax machines: isolated systems would offer inferior service,
hampering communication and the growth of knowledge (but eventually
linking up or disappearing).

There is enough known activity that there are likely to be other
hypertext publishing efforts still under wraps. We'll try to keep you
up to date in these pages, since so many FI participants have an
intense interest in this field. Another publication to watch is the
new magazine HyperAge, six issues a year for $20 in the US. To
subscribe using a credit card, call 800-682-2000.

The Road to Nanomachine Design
by Thomas Donaldson

One of the contributions by K. Eric Drexler to nanotechnology was his
success with estimating the behavior of nanomachines by using simple
mechanical calculations. Ultimately, however, these exploratory
engineering calculations remain approximations only. Serious
nanomachine design will require much more. Almost certainly it needs
very powerful computers able to carry out dynamic calculations on
large molecules. These calculations need lots of computer power.
Specialized chemical workstations with prices in the range of $200,000
already exist.

To speed nanotechnology along, what we really want is lower-priced
computers, ideally costing no more than a Mac II. There is a wide open
road to just such a computer. Technology for chemical design
workstations costing about $40,000 exists right now; all it takes is
the trouble of assembling a system from standard boards (unfortunately
not yet done). The same parts will cost far less in a few years (so Popular
NanoMechanics may start publication soon).

The technology depends on the Transputer, a chip specially designed
for parallel processing. Computer System Architects sells IBM PC
boards with 16 Transputer chips and 16 megabytes of memory for
$28,000.

Chemical Design Ltd, a British company, already sells a chemical
design workstation, the MITIE 1000, which can contain as many as 36
independent Transputers. The smallest MITIE 1000 sells for $170,000.
The MITIE calculates as much as 72 times faster than a VAX 8600,
analyzing the conformation and dynamics of large molecules at
supercomputer speeds.

The MITIE contains a microVAX as a host machine; the remaining modules
run on the VAX. Chemical Design has about 250 customers around the
world, including Glaxo, Rhone-Poulenc, Fisons, Dupont, American
Cyanamid, Merck, and Hoffmann-LaRoche. The program ChemX contains
specific modules for building and displaying the molecule (ChemCore),
modelling molecules (fitting, analyzing the conformation: ChemModel),
designing proteins (ChemProtein), and carrying out calculations to
find minimum energy states (ChemQM). There are also library modules to
maintain a large database (ChemLib: the recommended size of hard disk
for a single-user system is 70 Megabytes).

Any molecular machine we design must be chemically stable in the
environment for which we design it. We must therefore make sure not
just that the molecule would be stable if isolated from all other
chemicals but also that the system will withstand likely chemical
attack. Molecules will try to attain minimum energy states, and their
excited states are also of interest. To resolve all of these issues
will require very fast chemical design software. Ultimately software
for nanomachine design will do much more, but even existing chemical
design software running on an affordable workstation puts us far
ahead.
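
As a toy illustration of the kind of energy calculation such packages
perform (not the ChemQM method itself), here is a two-term
molecular-mechanics energy with placeholder parameters:

    # Toy molecular-mechanics energy: harmonic bond stretching plus a
    # Lennard-Jones nonbonded term.  Parameters are placeholders, not
    # a real force field; production codes add angle, torsion, and
    # electrostatic terms with carefully fitted constants.
    import math

    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    def bond_energy(r, r0=0.15, k=4.0e5):
        """Harmonic stretch; lengths in nm, k in kJ/mol/nm^2."""
        return 0.5 * k * (r - r0) ** 2

    def lj_energy(r, epsilon=0.4, sigma=0.34):
        """Lennard-Jones 12-6 term; epsilon in kJ/mol, sigma in nm."""
        x = (sigma / r) ** 6
        return 4.0 * epsilon * (x * x - x)

    def total_energy(coords, bonds):
        """coords: list of (x, y, z) in nm; bonds: list of index pairs."""
        e = sum(bond_energy(dist(coords[i], coords[j])) for i, j in bonds)
        for i in range(len(coords)):
            for j in range(i + 1, len(coords)):
                if (i, j) not in bonds:
                    e += lj_energy(dist(coords[i], coords[j]))
        return e

    coords = [(0.0, 0.0, 0.0), (0.15, 0.0, 0.0), (0.30, 0.05, 0.0)]
    bonds = [(0, 1), (1, 2)]
    print(f"total energy: {total_energy(coords, bonds):.1f} kJ/mol")

A minimum-energy search simply varies the coordinates to drive a total
like this downward; real packages do so for thousands of atoms at once.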

What about the software? Unfortunately, porting software to a parallel
computer usually requires a total rewrite of any modules in which you
expect to use parallelism. Porting should be a cooperative effort
between someone versed in parallel computing and someone versed in
chemical design software. When someone will get a chemical design show
on the road, porting software to a MAC II-transputer system, isn't
clear to me. My own expertise lies in parallel computing. Anyone
interested can reach me through Foresight Update.

Dr. Thomas Donaldson currently writes software for a transputer
machine for the FEM market.  He pioneered the idea of artificial
enzyme systems as an approach to cell repair.

Biostasis Research

Biostasis research at the Alcor Life Extension Foundation has been
disrupted by a grandstanding local official who, presumably in an
effort to generate media coverage to help his re-election, has been
harassing Alcor. This intimidation has ranged from confiscation of
equipment and records to threats of criminal charges. To our knowledge
Alcor has done nothing to merit this treatment, but this fact doesn't
lessen the legal bills they are accumulating as a result of defending
against these bogus charges.

Those who are interested in ensuring the continuation of
state-of-the-art biostasis research and services should send their
contributions to Alcor Life Extension Foundation, 12327 Doherty St.,
Riverside, CA 92503, phone 714-736-1703.

Thanks

As usual there are too many people who deserve thanks for all to be
listed here, but the following is a representative group: Michael
Whitelock, John Alden, and Ray Alden for budget help; Gayle Pergamit,
Brian Quig, and Dennis Gentry for help in our Executive Director
search; Jeff Soreff for technical work; Marvin Minsky for
encouragement; Gerald Feinberg for talking to the press; Stewart Brand
and Nils Nilsson for spreading the word; Blair Newman for making
valuable contacts; Ed Niehaus for marketing advice; James Dinkelacker
for strategic advice; Steve Hyde for setting up the Colorado Springs
talk; Jerry Fass (and many others) for sending information; Pat Wagner
and Leif Smith for book recommendations.

Books of Note

Books are listed in order of increasing specialization and level of
reading challenge. Your suggestions are welcome.  --Editor

Prisons We Choose to Live Inside, by Doris Lessing, Harper & Row,
1987, paper, $6.95. A small book with high impact, as asserted on the
cover.  An eloquent plea for integrating what little we know of the
social sciences into education, to help us primates stop repeating
Milgram experiment-type horrors.

Moving Mountains, by Henry M. Boettinger, Collier Macmillan, 1975,
paper, $5.95. A practical treatise on convincing others to share your
ideas. "The first truly modern and truly searching essay on
rhetoric--in the classical meaning of the term--in the last three or
four hundred years."--Peter Drucker

The Social Brain, by Michael S. Gazzaniga, Basic Books, 1987, paper,
$8.95. A neuroscientist argues that the brain is more a social entity,
a vast confederacy of relatively independent modules, each of which
processes information and activates its own thoughts and actions. This
view has some similarity to Minsky's Society of Mind theory. The
writing is anecdotal and enjoyable.

The Visual Display of Quantitative Information, by Edward R. Tufte,
Graphics Press, 1983, hardcover, $34. A beautiful book explaining the
right ways (and ridiculing the wrong ways) to present numerical
information. Amusing and visually enjoyable, it inspires the reader to
support Tufte's high standards. Fun to browse; makes a great gift.

How Superstition Won and Science Lost, by John C. Burnham, Rutgers,
1987, paper, $16. Tracks the decline in the quality of science
popularization by the media over the past century and shows how this
has undermined the impact of science and strengthened the forces of
irrationalism.

What Sort of People Should There Be?, by Jonathan Glover, Pelican,
1984, paper, $5.95. An Oxford philosopher looks at the emotional and
ethical issues raised by (hypothetical) advanced technologies able to
alter the human form, control the brain, and create artificial
intelligences. Covers such topics as possible abuse of the
technologies, and what people will do once there is no need to "work."

Neurophilosophy, by Patricia Smith Churchland, MIT Press, 1986,
hardcover, $29.95. Begins with the neurosciences, then proceeds
through AI, connectionist research, and philosophy to give a picture
of how the brain works. Skillfully written and very readable.

Evolutionary Epistemology, Theory of Rationality, and the Sociology of
Knowledge, ed. by Gerard Radnitzky and W.W. Bartley, III, Open Court,
1987, paper. A collection of essays on a powerful theory of how
knowledge grows: by evolution through variation and selective
retention. Treats knowledge as an objective evolutionary product, and
offers insights into evolutionary processes in general. Authors
include Sir Karl Popper.

Neural Darwinism, by Gerald Edelman, Basic Books, 1987, hardcover,
$29.95. Having won a Nobel Prize for his work in immunology, the
author now examines how the brain works, presenting his theory of
neuronal group selection. A difficult book with significant ideas.

Nanotechnology BBS

For those with access to computers on the USENET, there is now a
Netnews group, sci.nanotech, for the discussion of nanotechnology.
(The USENET newsgroups form a large, distributed, hierarchical
electronic bulletin board.) If your site receives some Netnews groups
but not sci.nanotech, tell your system administrator that it is a
"moderated, technical, low-volume" group.  The moderator is J. Storrs
Hall (rutgers!klaatu.rutgers.edu!josh or josh@klaatu.rutgers.edu), who
can answer specific questions about the group by electronic mail.

There is also a newly formed biostasis mailing list, run by Kevin Q.
Brown. Send queries to ho4cad!kqb@att.att.com.



Copyright Policy

FI's standard arrangement with our writers is as follows: we copyright
the material and may use it in the future, including in other forms
such as recordings, videos, and electronic publications. The writer
also is welcome to use the material; we ask that the credits indicate
where it was first published. Writers desiring different arrangements
can be accommodated; please consult the editor. We urge those who
write for commercial publications to retain electronic publishing
rights for their own use on future hypertext publishing systems.



Help

Many libraries do not have Engines of Creation indexed under the
subject "nanotechnology." Readers are asked to check their favorite
libraries, especially those at universities, and if necessary ask the
librarian to correct this omission.

Kantrowitz Joins Board

Arthur Kantrowitz has joined Gerald Feinberg, Stewart Brand, and
Marvin Minsky on FI's Board of Advisors. Now a professor of
engineering at Dartmouth, Dr. Kantrowitz is the founder and former CEO
of the Avco Everett Research Laboratory. His technical interests have
ranged from space transportation to power generation to artificial
hearts, but FI readers may know him better as the originator of the
Science Adversary Procedure, popularly known as the Science Court. Dr.
Kantrowitz is also active in the space development movement and served
for years as Chairman of the L5 Society.

We plan in future issues to give profiles of all four Advisors.

The Materials of Nanotechnology
by Russell Mills

The road to nanotechnology consists of several converging paths, each
leading independently to the Assembler Breakthrough -- the building of
the first general molecular fabricators. Biotechnology is one of these
paths, but not necessarily the shortest one.

Biotechnology seeks to understand and manipulate the molecules we have
inherited through traditional evolutionary processes, focusing
particularly on two chainlike molecules: proteins (chains of amino
acids) and nucleic acids (chains of sugar and phosphate molecules with
pyrimidine and purine bases attached).

Nanotechnology, by contrast, deals with any material, chainlike or
not, that can be designed and assembled atom-by-atom. In this sense
nanotechnology is broader than biotechnology.

What materials will form the basis of the Assembler Breakthrough? One
could argue that proteins and nucleic acids have the best (in fact,
the only) "track record" as substrates for nanomachinery, and that
these are therefore the materials of choice for building
nanomachinery. But the qualities that made nucleic acids and proteins
good choices as biological materials on the Earth several billion
years ago are less relevant to nanotechnology today. Evolution
selected them because of their chainlike structure and the ready
availability of their component parts on prebiotic Earth. Molecular
chains are favored over other structures because they can be copied
and repaired by relatively simple molecular machines; Earth's
evolutionary process places a premium on simplicity by emphasizing
individual self-reliance -- each individual organism is forced to
contain most of the machinery needed for its own maintenance and
replication.

Nanotechnology presents a very different situation: we do not want
self-reliant assemblers. We will build assemblers that rely on us for
support, and cannot function without externally supplied information,
energy, or assistance in replication. This freedom from traditional
evolutionary constraints opens up design possibilities that have never
been exploited biologically. Even if, for historical reasons, the
easiest route to nanotechnology turns out to lead through
protein-based assemblers programmed with information conveyed by
nucleic acid molecules, we should expect a rapid transition to better
materials.

Let's look at where we stand in understanding and using traditional
nanomachinery, then look at some developments in less traditional
areas.

Protein structure & applications

The ability to redesign existing proteins (e.g., enzymes, regulatory
proteins, receptor proteins), or to design new ones, depends on
understanding the detailed relationship between function and
configuration.

The amino acid sequences making up proteins are determined by direct
analysis or from translation of the DNA or RNA sequences that encode
them. These methods generate data rapidly.

On the other hand, 3D maps of proteins in their functional
configurations are obtained by X-ray crystallography, sometimes with
the aid of nuclear magnetic resonance (NMR). These are time-consuming
methods.

The different rates at which these techniques can be used have given
rise to a growing gap between the availability of sequence data and
its interpretation and application:

-- Sequence data is available for more than 8000 proteins and is
accumulating at an exponential rate (doubling time about 2 years) [1].

-- Only about 400 proteins have been spatially mapped. The number of
these maps increases linearly (about 40 proteins per year). [1, 4]
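
Projecting those two growth rates forward shows how quickly the gap
widens (a rough extrapolation from the figures above, not a published
forecast):

    # Sequenced proteins doubling every ~2 years versus solved
    # structures growing by ~40 per year, starting from the counts
    # quoted above.
    sequences, structures = 8000.0, 400.0
    for year in range(0, 11, 2):
        print(f"year {year:2d}: {sequences:8.0f} sequenced, "
              f"{structures:4.0f} mapped, ratio {sequences / structures:4.0f}:1")
        sequences *= 2.0        # doubling time ~2 years
        structures += 2 * 40.0  # ~40 new structures per year
    # ratio grows from 20:1 at year 0 to roughly 320:1 at year 10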

Sequence data alone gives little indication of function. Progress in
understanding protein function requires spatial maps, but proteins are
difficult to crystallize in forms suitable for X-ray crystallography.
This obstacle is now being surmounted by growing protein crystals on a
mineral substrate, such as magnetite. The atomic spacing in the
mineral surface seems to affect the pattern of deposition of protein
molecules; the result has been the ability to grow some protein
crystals with unprecedented ease, and in forms never before seen [5].

THE FOLDING PROBLEM.

Proteins fold up into their functional conformations with little or no
outside help; this implies that the amino acid chain itself contains
the information needed to specify the folding pattern. A fast way to
acquire useful data on protein function might therefore be to compute
the most stable spatial configuration of protein chains from energy
considerations and sequence data alone. This approach, known as "the
folding problem", has slowly been yielding to efforts to solve it [7].
The general case has proved too difficult to carry out with
present-day computers, but the problem size can be reduced in several
ways [1, 4]:

-- For proteins with sequences similar to proteins of known structure,
take parts of the known structure as givens.

-- Statistical properties of a sequence can identify segments that lie
inside or outside the folded protein, or segments that make contact
with a lipid matrix (suggesting a protein destined for a cell
membrane).

-- NMR data can put constraints on distances between specific amino
acid residues in the folded protein.

-- Exon shuffling (the swapping of DNA segments within the genome that
is known to occur in genes associated with the immune system, and may
turn out to be a much more general phenomenon) suggests that proteins
are actually composed of a relatively small number of modular units. A
number of such modules have already been identified, but it is not
known to what extent all proteins are modular in this sense. To the
extent that they are, the folding problem would reduce to a
calculation of the packing configuration of a given set of prefolded
modules.

THE ACTIVITY PROBLEM

Some investigators, ignoring spatial conformation, are trying to
determine the functions of proteins from statistical properties of
their sequences. They have determined, for example, that antigenic
activity correlates with certain periodic variation of hydrophobic
residues along a sequence. [4]
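
Both the buried-versus-exposed heuristic mentioned under the folding
problem and the hydrophobicity correlation above rest on simple
statistics computed over the sequence itself. A minimal sliding-window
hydropathy scan, using an abbreviated and approximate scale chosen
purely for illustration, looks like this:

    # Sliding-window hydropathy scan of an amino-acid sequence.
    # The scale is abbreviated and approximate (illustration only);
    # strongly positive windows suggest buried or membrane-contacting
    # segments, strongly negative ones suggest exposed segments.
    HYDROPATHY = {  # unlisted residues default to 0.0
        "I": 4.5, "V": 4.2, "L": 3.8, "F": 2.8, "A": 1.8,
        "G": -0.4, "S": -0.8, "D": -3.5, "E": -3.5, "K": -3.9, "R": -4.5,
    }

    def hydropathy_profile(sequence, window=7):
        """Average hydropathy over a sliding window along the sequence."""
        values = [HYDROPATHY.get(aa, 0.0) for aa in sequence]
        return [sum(values[i:i + window]) / window
                for i in range(len(values) - window + 1)]

    seq = "MKKLLVILAVFAGSDEEKRKEDSA"   # made-up sequence for illustration
    for i, h in enumerate(hydropathy_profile(seq)):
        tag = "buried?" if h > 1.5 else ("exposed?" if h < -1.5 else "")
        print(f"{i:3d} {seq[i:i+7]} {h:5.2f} {tag}")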

THE DESIGN PROBLEM

Despite difficulties with the folding problem and the activity
problem, progress has been made (as predicted in 1981 [17]) in solving
the design problem: to design a protein sequence that will give rise
to a given activity. Several approaches are being pursued:

-- Limit the design to include only those aspects of protein folding
which are already understood. For example, W. DeGrado at duPont has
designed and built a protein that self-folds into a 4-helix bundle. It
might be modified to incorporate biological functions [7].

-- Design a protein from native components. T. A. Jones of Univ. of
Uppsala has used this approach to build retinol-binding protein by
fitting together 22 fragments from other proteins. The resulting
protein has a different amino-acid sequence than the protein it
mimics, but has the same shape [1].  Similar modeling of triose
phosphate isomerase and lactate dehydrogenase has been done by S.
Wodak of l'Universite Libre de Bruxelles.

-- Modify existing proteins.

One recent effort at protein modification involves a redesign of the
antimicrobial drug trimethoprim (TMP) to make it less toxic. Toxicity
results from TMP attacking human dihydrofolate reductase (dHFR) in
addition to bacterial dHFR, its intended target. The strategy being
taken is to reduce the floppiness of the TMP molecule, so that it
fits only its target and not human dHFR. [1]

Another example is a redesign of glucose isomerase (commercially
important in corn syrup production) to improve its efficiency, by
taking cues from the structure of triose phosphate isomerase, an
enzyme that catalyzes a different reaction but does so 10,000 times
faster. [1]

Genex has developed a technique for redesigning antibody molecules.
The result is a much smaller antibody that consists of a single chain
instead of four chains, is much easier to produce in quantity, elicits
fewer side effects when used in patients, is more stable, and binds
better to the target molecules. The trick is to use computer-designed
sequences of amino-acids to link together binding sites which formerly
were located on separate protein chains. The technique may lend itself
to the redesign of many other useful protein molecules besides
antibodies. [15]

Nucleic acid structure

Nucleic acids are sequenced either by chopping them into pieces of all
possible lengths, or by causing them to grow into such a set of pieces
in the first place, and then separating the pieces by electrophoresis.
The sequencing procedure is even easier than that of proteins, and
some of the steps have been automated.

-- About 20 million nucleotides from hundreds of organisms have been
sequenced and the number is increasing exponentially. The doubling
time, currently 2 to 3 years, is expected to decrease sharply soon. A
sequencing rate of one million bases per day is anticipated by 1996.
[4]

As with proteins, to know the sequence is not to know the function.
Some of the most interesting and useful biological information resides
in the local geometry of nucleic acids: information about gene
boundaries, regulatory binding sites, polymerase binding sites,
ribosomal sites, posttranslational modification sites, etc. While the
average spatial architecture of nucleic acids is known in detail,
local variations in this architecture are hard to study and data is
sparse [4].

-- The number of nucleic acid structures known from crystallographic
studies is less than 40.

-- Statistical analysis of nucleic acid sequences can identify some of
these structures in DNA, and can be done by computer. Reliability
varies greatly, but is as high as 90% in some cases.

Nontraditional materials

SELF-ASSEMBLING MEMBRANES

A traditional cell membrane is like a sea of inert material with, here
and there, a floating island of protein machinery. The sea is a
mixture of fatty molecules (phospholipids, like lecithin) and
cholesterol molecules, the relative proportions of which determine how
wavy and flexible the surface is. Typically the protein machines
extend all the way through the cell membrane, providing specialized
communications links (or in some cases pores) between the inside and
outside of the cell.

By determining what goes in and comes out of a cell, the cell membrane
defines the relations a cell has with the external world. It is
therefore intriguing to think of what might be possible if such
membranes could be deliberately altered, or if entirely different
kinds of active membranes could be designed and synthesized.

At the Weizmann Institute of Science a group led by Israel Rubinstein
is making membranes from molecules chosen for their ability to mimic
one function of biological membranes: the ability to recognize ions in
the solution surrounding the cell. These investigators have found that
a mixture of 2,2'-thiobisethyl acetoacetate (TBEA) and n-octadecyl
mercaptan (OM) will spontaneously assemble into a layer one molecule
thick on a gold electrode. TBEA is the active element; OM plugs gaps
between TBEA molecules preventing direct access to the gold substrate.
When the coated electrode is put in a solution with copper and iron
ions, it is found that copper ions are reduced to elemental copper,
whereas iron ions are unaffected. The mechanism depends on the fact
that TBEA molecules have two arms that open just wide enough for a
copper ion to slip in and bind to four oxygens projecting from the
arms. This brings the copper ion to within 7 angstroms (0.7 nm) of the
gold substrate -- close enough for electrons to pass by
quantum-mechanical tunneling from the substrate to the copper. Because
of their geometry, iron atoms are not accepted into the arms of TBEA.
[13, 14]

SYNTHETIC NANO-EFFECTORS

A group at UCLA led by Donald J. Cram has launched a full-scale attack
on the problem of nano-effector design [16]. Working entirely away
from the protein/nucleic acid path blazed by terrestrial evolution
over the past several billion years, this group has designed hundreds
of molecules of varying shapes, hoping to learn how to make molecules
with desired catalytic properties. Cram's co-workers synthesized more
than 75 of these designed molecules and subjected them to X-ray
crystallography to check the correspondence between design and actual
structure. A series of compounds of gradually increasing complexity
was then tested for the intended activity: in one case the ability to
selectively bind certain ions (lithium, sodium, potassium, and
others). The compounds performed extremely well.

In another set of experiments, the aim was to build molecules able to
discriminate between D- and L- amino acids and ester salts -- a task
that seemed intractable earlier in this century. So successful were
their efforts that the investigators were able to build a machine
based on the designed molecules; when a 50-50 D-L mixture was poured
into the machine, the machine delivered two solutions with 86 to 90%
separation of the two substances.

In yet another branch of their work, Cram's group is designing
molecules that mimic the actions of enzymes. Free of the requirement
to build everything out of amino acids, they have been able to come up
with molecules far smaller (though not easier to make) than the
enzymes being imitated. Their mimic for the enzyme chymotrypsin has
been synthesized and tested; it proved to have some, but not all, of
the functionality of chymotrypsin itself.

DIAMOND

Diamond is in the news, and this is good news for nanotechnology.
Diamond is a prime candidate material for building nanomachines for
several reasons: the tetrahedral geometry of its bonds lets it be
shaped in three dimensions without becoming floppy; it is made of
carbon, the chemistry of which is well understood; and carbon atoms
make a variety of useful bonds with other types of atoms. Diamond
research may therefore advance nanotechnology even when it is pursued
for its short-term commercial potential. Progress in understanding and
making diamonds has been driven mainly by work done in the Soviet
Union [8, 9]:

-- In the 1930s Soviet scientists calculated a phase diagram for
diamond and began looking for easy ways to synthesize diamond.

-- In the 1950s, while American industry started manufacturing
diamonds at 2,000 C and 55,000 atmospheres pressure, Soviet
scientists developed a vapor deposition method for growing diamond
fibers at 1,000 C and low pressures.

-- During the 1960s and 1970s, the Soviet group improved on this
process, aiming to produce diamond films.

The technological implications of diamond films have recently been
realized in Japan and the U.S., and so a race has begun to develop
this technology. Dramatic discoveries are being made:

-- At the University of Texas 10-nanosecond laser pulses are being
used to vaporize graphite, which then deposits as a film 20 nm thick
over areas as large as 10 square centimeters. The film is
diamond-like, but may turn out to be something new. [3]

-- Soviet researchers report the discovery of a new form of carbon
much harder than diamond, called C8. They use an ion beam of low
energy to produce thin films of the substance. Carbon atoms in C8
appear to have tetrahedral bonds, but the lattice is somehow different
than in diamond--it may simply be somewhat random, resembling a glass
rather than a crystal. [8]

Much of the new interest in diamond is motivated by near-term
commercial applications like diamond-coated razor blades,
scratch-resistant windows and radiation-resistant semiconductors for
nuclear missiles. The C8 results, however, are of special relevance to
nanotechnology, showing us that diamond is just the default form of
more general tetrahedral bonding patterns for carbon. Choosing from
among the many possible departures from crystalline regularity may
turn out to be an important part of nanomachine design.

Speaking of crystallinity ... a "new state of matter" has been
announced, called the nanocrystal [6]. The nanocrystalline state is
one in which roughly half the atoms occupy sites in crystal grains,
while the other half are free to move between and around the grains.
Both populations of atoms have the same chemical composition (titanium
oxide, for example), and atoms are easily exchanged between the grains
and the matrix. The response of such a material to strain is plastic
rather than brittle, because grains can change shape quickly instead
of hammering against each other or being forced apart (cracking). This
flow of atoms and restructuring of grains does not turn the material
into a liquid or a putty; at macro scales, nanocrystalline materials
are as solid as their ordinary counterparts.

Nanocrystallinity is a function of grain size.  In nanocrystals the
grains are about 10 nanometers across -- 1000 times smaller than in
ordinary materials. Small grain size implies large surface-to-volume
ratio and short diffusion "circuits" around the grains -- hence, rapid
response to strain. In the case of nanocrystalline copper,
self-diffusion at 20-120 C is increased by 19 orders of
magnitude over ordinary copper!
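
A rough geometric estimate (ours; it assumes spherical grains and a
boundary layer about 1 nm thick, both round numbers chosen for
illustration) shows why 10-nanometer grains put roughly half the atoms
into the mobile boundary regions while larger grains do not:

    def boundary_fraction(grain_diameter_nm, boundary_thickness_nm=1.0):
        """Fraction of a spherical grain's volume lying in its boundary
        shell; the ~1 nm shell thickness is an illustrative assumption."""
        core = max(grain_diameter_nm - 2.0 * boundary_thickness_nm, 0.0)
        return 1.0 - (core / grain_diameter_nm) ** 3

    # nanocrystal, fine-grained material, ordinary metal
    for d_nm in (10.0, 100.0, 10_000.0):
        print(f"grain {d_nm:>8.0f} nm  ->  boundary fraction ~ {boundary_fraction(d_nm):.3f}")

At 10 nm the boundary shell holds nearly half the material, consistent
with the description above; at ordinary micron-scale grain sizes it
holds well under a tenth of a percent.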

J. Israelachvili and collaborators are studying the properties of bulk
materials as one or more dimensions of a system are reduced to the size
of a few molecules or less [10, 11]. Previous work has shown that some
properties remain similar to bulk properties: e.g., refractive index,
dielectric constant, and surface energy. Now they have undertaken to
measure viscosity in thin films trapped between two solid surfaces.
They report that as the liquid layer thins to less than 10 molecular
diameters the liquid stops acting like a continuum and comes to
resemble a series of layers; the principles of viscosity no longer
describe the relationship between shear forces and sliding motion.

The amount of force required to initiate sliding (the critical shear
stress) is much greater in such systems than that predicted by
extrapolating from bulk properties. Taken at face value, this suggests
that nanomachines with moving parts would get stuck, even when
lubricants are present, unless the parts remained in continuous motion. But a
better interpretation is that the concept of liquid lubrication
becomes meaningless at the nanometer scale.

Liquids, the atoms of which are not tied down, evade part of the
design process. This is acceptable in a bulk machine, but not in a
nanomachine, the design of which must specify the behavior of every
atom. "Lubrication" in a nanomachine would consist of an optimization
of the chemical type, location, and orientation of each atom in the
machine; it would inhere in the design of the solid parts themselves
rather than in a separate liquid substance [18].

Dr. Mills has a degree in biophysics and runs a business in Palo Alto.
He also assists with the production of Foresight Update.



REFERENCES

1. Barbara Jasny, Science 240:722-723 (6May88)
2. Sci News 134:116 (6Aug88)
3. Sci News 134:94 (6Aug88)
4. Charles DeLisi, Science 240:47-52 (1Apr88)
5. Ivars Peterson, Sci News 133:154-155 (5Mar88)
6. Robert W. Cahn, Nature 332:112-113 (10Mar88)
7. Thomas E. Creighton, Science 240:267 (15Apr88)
8. Mike Simpson, New Scientist p50-53 (10Mar88)
9. The Economist p92 (23Apr88)
10. Jacob N. Israelachvili, et al., Science 240:189-191 (8Apr88)
11. Ivars Peterson, Sci News 133:283 (30Apr88)
12. Nature 332:374-376 (24Mar88)
13. R.J.P. Williams, Nature 332:393 (31Mar88)
14. Israel Rubinstein, et al., Nature 332:426-429 (31Mar88)
15. The Economist p75 (27Feb88)
16. Donald J. Cram, Science 240:760-767 (6May88)
17. K. Eric Drexler, Proc. Nat. Acad. Sci. 78:5275-5278 (1981)
18. K. Eric Drexler, "Gears and Bearings," in Proceedings of the IEEE
Micro Robots and Teleoperators Workshop, IEEE 87TH0204-8 (1987)

"Human Frontiers" Advances

Although its technical scope has been refined and its budget cut,
Japan's proposed international Human Frontiers Science Program is
still relevant to development of both nanotechnology and artificial
intelligence. The Economist calls the planned effort "the world's
first truly international government research program."

Originally budgeted at $6 billion to be spent over twenty years, with
Japan contributing about half of the funds, Frontiers' initial goals
were very broad and, some said, overambitious: from neural-style
computing to "the elucidation of biological functions." Even Japan's
former Prime Minister Yasuhiro Nakasone, a strong promoter of the
program, criticized its vagueness. A one-year $1.4 million study to
clarify these goals was completed in spring 1988 and reviewed by
scientists from the Western summit nations and the European Community,
who advocated an immediate start on the program.

The newly refined goals are (1) to study the higher-order functions of
the brain, especially its ways of visualizing objects and understanding
words, and (2) to study molecular recognition and response functions.
The new proposed budget is $60-100 million, to be spent on
30-50 three-year research grants, 100-200 post-doc fellowships, and
10-20 workshops.

Frontiers got a boost from the June 1988 economic summit of Western
nations, when it was endorsed in the final communique: "We note the
successful conclusion of the Japanese feasibility study on the Human
Frontiers Science Program and are grateful for the opportunity our
scientists were given to contribute to the study. We look forward to
the Japanese government's proposals for implementation of the program
in the near future."

Japanese officials had originally hoped to get the other six summit
nations--Canada, Britain, France, Italy, West Germany, and the
U.S.--to commit funds to the project at the summit, but not
surprisingly these nations are waiting for Japan to make a commitment
first. The communique's statement of support will strengthen the
position of the program's advocates, the Science and Technology Agency
and the Ministry for International Trade and Industry, when they
approach the Ministry of Finance for funds later in 1988.

In a move unprecedented in Japan, these agencies propose that the
program be run by an international foundation to be established in
Switzerland, and to be funded entirely by Japan in the initial phase
(at least $20 million in fiscal year 1989). The U.S. National Science
Foundation and the European Community will provide experienced
personnel for the administrative secretariat, and scientists from all
summit nations will participate in the governing council and peer
review committees.

Precollege Training

Many of today's researchers were first confirmed in their vocation
when they participated in an NSF-sponsored summer program for high
school students. Now a guide to these programs is available: the 1988
Directory of Student Science Training Programs for High Ability
Precollege Students. The Directory lists institutions in the U.S. that
will be conducting student science training programs in the academic
year 1988-89.

The 507 programs listed are of three general types: courses, research,
and combinations of courses and research.  Residential and commuter
programs are offered; some charge for participation, some do not.
Scholarships are often available. Programs are provided in science,
engineering, and mathematics. For each copy send a $1 check made out
to "Science Service Directory" to 1988 SSTP Directory, 1719 N St., NW,
Washington, DC 20036. Only domestic orders are accepted; those outside
the U.S. should send $4 to the Foresight Institute and we will order a
copy and send it to you by airmail.

+---------------------------------------------------------------------+
|  Copyright (c) 1988 The Foresight Institute.  All rights reserved.  |
|  The Foresight Institute is a non-profit organization:  Donations   |
|  are tax-deductible in the United States as permitted by law.       |
|  To receive the Update and Background publications in paper form,   |
|  send a donation of twenty-five dollars or more to:                 |
|    The Foresight Institute, Department U                            |
|    P.O. Box 61058                                                   |
|    Palo Alto, CA 94306 USA                                          |
+---------------------------------------------------------------------+