rick@cs.arizona.edu (Rick Schlichting) (04/09/91)
[Dr. David Kahaner is a numerical analyst visiting Japan for two years
under the auspices of the Office of Naval Research-Asia (ONR/Asia).
The following is the professional opinion of David Kahaner and in no
way has the blessing of the US Government or any agency of it. All
information is dated and of limited life time. This disclaimer should
be noted on ANY attribution.]
[Copies of previous reports written by Kahaner can be obtained from
host cs.arizona.edu using anonymous FTP.]
To: Distribution
From: David Kahaner ONR Asia [kahaner@xroads.cc.u-tokyo.ac.jp]
Re: New Information Processing Technology Symposium (6th Generation Proj)
7 April 1991
ABSTRACT. The International Symposium on New Information Processing
Technologies (NIPT '91), held 13-14 March 1991 in Tokyo, Japan, is
described. NIPT
is to be the successor to the 5th Generation Project.
INTRODUCTION.
On 13-14 March 1991, an open International Symposium on New Information
Processing Technologies was held in Tokyo, Japan, attended by almost 400
people, including about 30 from 12 overseas countries. The Symposium
program is attached at the end of this report. Three months ago, a much
smaller workshop was held in Hakone on the same topic, and was reported
on in "nipt.90, 28 Dec 1990". Except when required for clarity, I will
omit material that duplicates information given there. Consequently the
earlier report should be thought of as an important supplement to this
one.
WHAT IS NIPT?
There are several parts to NIPT, but the main thread is that some things
are easy for humans and difficult (presently) for computers, such as
reading and interpreting comics. The reverse is also true: computers
seem much better at floating point arithmetic than we are. NIPT seeks to
concentrate on those areas where computers are currently weak. These
include friendliness, flexibility, processing diverse and parallel
information such as speech and image, adaptation, and integration of
ambiguous information. Humans can perform logical symbol manipulation,
intuitive thinking based on pattern dynamics, integration of multimodal
incomplete information, learning from examples, fault tolerance, self
organization, etc. The Japanese believe that computers should be
made more human-like. Until now computer programming has emphasized
logical sequential processing, corresponding to activities in the human
left-brain, but they feel it will be important to build computers that
mimic right brain capabilities. In preparing this report I am deeply
grateful for many helpful comments and corrections from Mr. Satoshi
Ishihara (ETL), Dr. Paul Refenes (DTI and UCL), Prof. Bruce MacLennan
(UTenn), and others.
In addition, several Japanese participants had an opportunity to review a
draft of this report and voiced no specific objections to it.
The most succinct and complete description of NIPT was provided by
Professor Takemochi Ishii (U-Tokyo, Department of Mechanical
Engineering), who chairs the research committee studying the project. I
could not do better than quoting his remarks, which follow. My comments
[bracketed] are inserted when appropriate.
"Since the 1989 fiscal year, the Research Committee on NIPT has been
conducting a two-year preliminary study for MITI to explore the
possibility of a new international R&D program to realize advanced
information processing technology which can break through the limitations
of conventional computers and must be a basis of the Information Network
Society of the 21st century. [A one-year feasibility study begins now.
The program per se, if approved, will begin in 1992. An official budget
has not been announced, but MITI claims it will be around US$30 million.
There are also likely to be industrial contributions that may be
difficult to measure.]
The Committee has three sub-committees, Fundamental Theory, Computer
Science, and Social Impact, consisting of more than 100 researchers
participating from various fields such as neuro-physiology, cognitive
science, economics, philosophy as well as computer science and
engineering. Here is a perspective of the outcome of the preliminary
study as an introduction to the detailed reports by chairs of the sub-
committees. [Each of the three sub-committee chairs presented detailed
papers at this Symposium. The computer science sub-committee was the
largest of these, and was broken into two working groups, Integrated
Computing, and Optical Computer/Devices. These were in turn broken into
several smaller sub-working groups.]
1. Objective
The goals of the NIPT program should be:
* Establish humanized flexible advanced information processing
technologies which will be the basis of the information network
society of the 21st century; [Since the earlier meeting there is now
a more definite statement that the focus and goals of the program are
diverse, multi-directional and pluralistic. It was admitted that some
research groups will fail, or get lost. I felt that the program has
been enunciated much more clearly than previously, although parts are
still vague. On the other hand, some Western attendees who were not at
the Hakone workshop were confused by its imprecision. One called it
"alice in wonderland", but I think that progress is definitely being
made.]
* Encourage international cooperative research to contribute to the
development of fundamental generic technologies in high-tech fields and
to construct an international co-prosperous relationship. [The lofty
nature of the goals and limited resources available require cooperation
and coordination. Overseas participation was welcomed, but admittedly
difficult to manage. The 5th Generation Project was thought of as
entirely Japanese, whereas NIPT has international cooperation as a
major stated goal.]
2. Strategy
These goals will be accomplished through promoting an international
cooperative research program to:
* Realize humanized advanced flexible computers through integration of
logical information processing and intuitive information processing by
** Development of massively parallel and distributed hardware with
new device technologies such as optical devices and
[At the earlier meeting, it was apparent that building a
massively parallel computer was to be a part of this project,
but this is the first time I have seen it explicitly stated. In
later sessions this was spelled out in more detail, but also
specifically proposed as something more challenging than that
selected by industry, such as a one million processor parallel
computer, a fully optical computer, or a neural computer.
It was left undecided how a neural computer will be
integrated into this program, and whether the focus will be on a
general purpose computer using a neural engine as one part, or on a
more specialized neural-optical computer.]
** Controlling the hardware in a flexible and adaptive way based
on the theory of flexible information processing. [Later,
Amari made clear that bold new theories are needed, and that what is
envisioned is not an extension of current ideas.]
3. Background and Needs for the NIPT
In the 21st century, because of the development of sensor technologies
and telecommunication services, the trend toward multi-media information
processing, the popularization of HDTV, and the shift in weight from
material production to information production, the quantity and variety
of information flowing through society can be expected to increase
explosively. The computational ability of conventional computers seems
insufficient to process such massive and multi-modal information. [It is
extremely
important to note the role that networking plays in Japan's view
of the future. In virtually every plan, application, and product, the
existence of high performance networks connecting substantial segments
of the society is a given, something to be folded in and used, rather
than something for which a case needs to be made.]
To cope with the information explosion of the 21st century, we will
need new information processing technologies enabling us to realize
revolutionary advanced information processing devices with features
such as the following:
* Learning and self-organizing ability for software reduction;
* Adaptability for individual users or situations;
* Information integration ability, enabling not only the analysis of
information but also synthetic decision-making and the cooperative use
of many kinds of knowledge for problem solving;
* Sufficient complexity of systems compared with the complexity of the
phenomena which the systems treat; and
* Affinity for optical communication technologies, enabling the
integration of communication and information processing.
4. Fundamental Policy
In carrying out this program, it is important to keep to the following
fundamental policies:
* Construction of international cooperative and co-prosperous
relationship in high-tech fields by keeping the door open to foreign
companies and universities as well as domestic ones, and guaranteeing
impartial distribution of the results; [Several Western governments
are already involved in discussions as are a few private companies.
Ishii admitted that handling the results was crucial, could
eventually become a model for international R&D in generic high-tech
fields, and that the Japanese people felt the need to become more
global and share world intellectual assets. In particular he noted
that advanced research needed to be used by the developing countries,
and that cooperation with their researchers was important. Although
several private meetings occurred there still is no public statement
of how cooperation and intellectual property are to be handled.
However, at about the time of the Symposium some proposed changes
were announced in the way MITI treats patent rights arising out of
R&D projects that are funded either by MITI or by NEDO (New Energy
and Industrial Technology Development Organization). The proposals
have been submitted to the Diet as a revision of the "NEDO Law". The
claim is that the measure is intended to promote international
research cooperation among the private sectors in order to develop
new technology, and is a first step toward realizing the government's
"technoglobalism" policy. Until now, all patent rights arising from
projects funded by MITI have been held by the government.
Participating companies have had to pay license fees to use these
patent rights. (In the case of projects funded by NEDO, companies can
hold up to 50% of the patent rights, but they still have to pay
license fees.) Under the proposed new measure, companies can hold up
to 50% of patent rights, even if those patent rights arise from
government-funded R&D projects. In addition, companies can use patent
rights arising from R&D projects funded by MITI or NEDO free of
charge or for a small fee. The law is expected to be passed and go
into effect this summer. I assume that the proposed legislation will
also apply to research collaborations with universities.
Because of MITI's support for Japanese industry, as contrasted with
MOMBUSHO which supports Japanese universities, some Western
participants were anxious to assure themselves that cooperation
will mean that benefits flow in both directions. Perhaps it would
make sense to offer up parts of the program directly through
MOMBUSHO. I hope as the next study begins we will get even more
details of who is going to do what, and with what resources. If this
information is open and complete it will go a long way to alleviate the
unease behind a number of comments that I heard. See also remarks from
Refenes below.]
* Challenge for creating fundamental and revolutionary generic
technologies by encouraging the participation of the universities and
national laboratories; and
* Shift from R&D for technology itself to R&D for the human beings by
encouraging the participation of users of the technologies and
researchers from related basic research fields such as brain science
and cognitive science. " [End of Ishii's remarks. Western
participants in the Symposium who also attended the workshop were
impressed with the increased role assigned to biology, a topic whose
absence was criticized at the earlier workshop. See also the summary of
Suzuki's lecture below.]
REMARKS.
Both formal and informal discussions suggested that at the moment four
major ideas are active. These were also mentioned by Amari in his lecture
describing the computer science research subcommittee.
(1) Develop a theory of flexible information processing, parallel and
distributed computation, learning and self organization, neural and
optical computation, etc.
(2) Develop a theory and implement an integrated information processing
model of the brain that can be useful for recognition,
understanding, and control in a real world of information, such as
control by complex speech input.
(3) Build a (probably) general purpose, object-oriented, data-driven massively
parallel computer system with 10^6 to 10^9 processors, using optical
or neural chips.
(4) Continue research and development on optical computing, including
optical interconnects, devices, optical neural computers, and fully
optical computers.
With respect to a massively parallel system, Amari described what could
be built using technology that would be working within the next ten
years. A 32-bit processing element (PE) would contain 2K words of
memory, and require 600K transistors. Eight processors would be on one
chip, 1K processors would be on a one-board module, and 1000 boards
would compose a one-million-processor system. He emphasized that the NIPT
program would only go so far as to develop proof of concept or do the
research necessary to overcome bottlenecks. Further work would be left
for industry.
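To make the arithmetic of this hierarchy explicit, here is a minimal
sketch (my own illustration, nothing of the sort was shown at the
Symposium; the chips-per-board figure is inferred from the numbers
Amari quoted):

    # Back-of-the-envelope check of the proposed hierarchy (my own
    # illustration; the chips-per-board figure is inferred, not stated).
    PE_TRANSISTORS = 600_000   # transistors per 32-bit processing element
    PES_PER_CHIP   = 8         # eight processors on one chip
    PES_PER_BOARD  = 1024      # "1K processors" on a one-board module
    BOARDS         = 1000      # 1000 boards compose the full system

    chips_per_board   = PES_PER_BOARD // PES_PER_CHIP   # 128 chips/board
    total_pes         = PES_PER_BOARD * BOARDS          # 1,024,000 PEs
    total_transistors = total_pes * PE_TRANSISTORS      # ~6.1e11

    print(chips_per_board, "chips/board;", total_pes, "PEs;",
          "%.1e transistors in the PEs alone" % total_transistors)

The transistor total, about 6x10^11 for the processing elements alone,
gives some feeling for why the NIPT program stops at proof of concept.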
Professor Shun-ichi Amari
Faculty of Engineering
University of Tokyo
Bunkyo-ku, Hongo 7-3-1
Tokyo 113 Japan
Phone: 03 812-2111, ext 6910, Fax: 03 816-7805
AMARI@SAT.T.U-TOKYO.AC.JP
PANEL DISCUSSION.
One of the most interesting parts of the Symposium was a panel composed
of eight well known researchers.
M.A. Arbib Univ Southern Cal, USA
R. Eckmiller Univ of Dusseldorf, Germany
A. Hartmann MCC USA
B. MacLennan Univ of Tenn, USA
P.N. Refenes Dept Trade and Industry and Univ College London, UK
J. Shimada ETL, Japan
K. Toyama Kyoto Pref Univ of Medicine, Japan
T. Yuba ETL, Japan
The topic was "Towards Computing in the 21st Century", and each panelist
was given an opportunity to make some formal remarks and then there was
to be a dialog. One or two questions from the floor were allowed to each
speaker. Although all of the presentations were exceptionally well
thought out and extremely valuable, most of the non-Japanese panelists
seemed unable to keep to the schedule and an important opportunity for a
real discussion was lost, save for Arbib, who spoke last and was able to
make a few comments about earlier presentations. As readers will
discover scanning the summaries below, panelists dealt with very different
topics and an interchange among them would have been most useful. My own
feeling is that the Japanese should market an overhead projector which
displays a large clock on its face showing the time remaining (to be set
via remote control by the chair), blinks to alert the speaker when time
is almost up, and after an appropriate grace interval turns the projector
light off. This would also eliminate the embarrassing need for the
secretary to try and catch the speaker's attention with a sign reading
"no more time".
Brief summaries of the significant remarks are given below. Eckmiller,
Hartmann, Refenes, Shimada, and Toyama submitted papers which are
included in the proceedings. Arbib and Yuba gave other lectures which are
also in the proceedings, and in those two cases I have used their papers
as reference material.
Toyama: Neurobiologist, discussed aspects of cortical machinery that
might help in future design of parallel computers. (Another biology paper
was presented by Hideo Sakata of Nippon University. I am not qualified to
comment on either of these.)
Eckmiller: A very general appeal to use the international cooperation
aspect of this project to work on solutions to the Earth's major
problems: population, environmental protection, etc. Also pointed out
that the European Esprit model was probably not the right one for MITI to
follow for cooperation. MITI's Nishikawa responded to a question that as
yet there was no contract with any other country, or any real model for
how cooperation should be done. On the other hand, I have learned that
some negotiations are already underway with Western universities for the
exchange of scholars, setting up of research institutes, etc.
Prof Rolf Eckmiller
Dept of Biophysics
Heinrich-Heine-Universitat Dusseldorf
D-4000 Dusseldorf, FR Germany
Tel: (211) 311-4540
MacLennan: Current "hot" computer science areas, AI, expert systems,
fuzzy logic, etc., cannot cope well with flexibility (too brittle--nice
term). Urged the NIPT planners to consider studying how discrete objects
come from continuous ones (neural processes and subsymbolic cognition).
His paper was not included in the proceedings, but copies can be obtained by
contacting the author.
Professor Bruce J. MacLennan
University of Tennessee at Knoxville
Computer Science Department
107 Ayres Hall
Knoxville, TN 37996-1301
Tel: (615) 974-5067
Email: MACLENNAN@CS.UTK.EDU
After the meeting, MacLennan forwarded to me some remarks that are
quoted below. I believe that they agree with my own observations as well
as with those of other participants.
"One of the most important characteristics of the Japanese initiative,
as I understand it, is that it is not a narrowly technological project.
First, it is based on a comprehensive vision of "the information
network society of the 21st century."* This is seen to imply a need
for: (1) flexible computing ("intuitive information processing");
(2) adaptive computing; and (3) massively parallel computing, including
optical computers. Second, the Japanese are aware that much basic
research remains to be done before this vision can be fully realized,
and so they are including researchers in "neuro-physiology, cognitive
science, economics, philosophy as well as computer science and
engineering." Furthermore, they apparently realize that the same
fundamental understanding of the cooperative/competitive dynamics
of complex systems that informs our understanding of neural networks
will also inform our understanding of the multinational projects,
societies and economies that produce them. Thus they are abandoning
the top-down, unidirectional organization of the Fifth Generation
project in favor of a bottom-up, pluralistic, cooperative/competitive
strategy. All of this shows, I think, a breadth of vision that will
carry the Japanese project much further than would the view that
the goal is no more than a new computer technology."
Refenes: The most focused of the panel presentations in my opinion.
Claimed that the NIPT program was ambitious. Pointed out that soft
information processing is not yet a theory, and in fact that there are
several competing theories, no accepted model nor evidence of the
emergence of any unifying theory. Developing the technologies necessary
is a considerable task requiring large resources and long lead times.
Claimed it was unclear if a massively parallel computer should be MIMD,
SIMD, etc., and that problems of overhead explosion are not solved.
Finally, claimed that basic system software development is a significant
task. Suggested that as much as possible the various problems should be
treated independently, and that objectives should be narrowed, in
particular by decoupling the architecture from neurocomputing. Also
claimed that high
speed was not really necessary for neurocomputing except during training,
and that neural networks will typically be trained once, off-site, using
general purpose computers. (Arbib disagreed with this; I think I do too.)
Pointed out that there was an important need to provide network
development tools, and network compilation tools. Finally noted that
technology transfer was an essential element in international
cooperation.
Subsequent to the Symposium, Refenes sent me some additional comments
specifically related to cooperation. He agrees with most other observers
that NIPT is more a Program than a Project, and that a central research
facility, such as ICOT (5th Generation) is inappropriate. He notes that
"From the Japanese point of view, there are two main reasons for
seeking International collaboration in NIPT. Firstly, to establish
a strong presence in the world research community and thus
"legitimize" their Industrial exploitation of world R&D results. The aim
is to counteract long standing criticism of the Japanese
contribution to global R&D efforts, and is in line with the policy
of establishing research Laboratories in the west (e.g. NEC and Sharp in
the USA and Europe respectively). Secondly, NIPT requires significant
numbers of high caliber researchers which, for many but not all areas
of NIPT, are not readily available in Japan."
He also points out that in the past there was not much Western interest
in collaboration because it was felt that Western R&D was significantly
ahead of Japan's and that collaboration would have too many one-sided
benefits. In many areas, especially those related to device technology,
this is no longer true. Collaboration on this current project could
provide Western access to Japanese markets (as much by exposure as for
any other reason), and could also pay off through new markets
developing from NIPT technology. This might be particularly true in
consumer electronics, as these are likely candidates for intelligence.
(I agree emphatically, and have repeatedly emphasized the role this part
of the industry plays here in Japan.) Finally, Refenes notes that
horizontal collaboration between Japanese companies and Western
universities is likely to happen anyway, especially if NIPT becomes a
major funding source. He notes that intellectual property rights are the
key issue to be worked out, that the Eureka model is not a bad one to copy,
and that there need not be any central funding source, except to cover
pure collaboration costs, allowing participating governments to apply
their normal rules.
Dr. Paul N. Refenes
Information Technology Division
Department of Trade and Industry (DTI)
Kingsgate House
66-74 Victoria Street
London SW1E 6SW
United Kingdom
Tel: +44 (0)71-215-5000, Fax: 071-215-8318
Email: P.REFENES@CS.UCL.AC.UK
Yuba: Explained that NIPT is thinking about a "super parallel" machine,
which he defined as one with more than one million processors. A starting
point for this is the ETL EM-4 parallel data-flow machine. Also explained
that they have proposed an EM-5, to be built by 1995, with 16K
processors, 1.3 T-Ops, 655 GIPS, using an object-oriented model, a
universal network, and a robust architecture that uses both
hardware and software to adapt to load scheduling.
Dr. Toshitsugu Yuba
Director of Intelligent Systems Division
Electrotechnical Laboratory
Tsukuba Science City
305 Japan
Fax: 0298-58-5176, Tel: 0298-54-5412
Email: YUBA@ETL.GO.JP
Hartmann: Showed an interesting slide giving the potential payoff of each
of three technologies in processing, communication, and storage.
(Read this table across.)
                           Processing   Communication   Storage
  Photonics                Third        First           Second
  Electronics/Semicond     Second       Third           First
  Electronics/Supercond    First        Second          Third
Thought that 21st century computing will be characterized by an
additional dimension, processing planar data through volumetric
processors. "Dense data planes can be communicated photonically, while a
compatible ultrafast ultradense processing capability could be achieved
in superconductor electronics, using a storage hierarchy of semiconductor
and photonic storage devices."
Dr. Alfred Hartmann
MCC Computer Physics Lab
3500 W. Balcones Center Drive
Austin, Texas 78759-6509
Shimada: Discussed the pros and cons of optical computers. Clear
description of technical problems, but conclusion "Optical
interconnection is strongly advocated as a basis of optical computers"
was conservative.
Arbib: Began by saying he didn't like the term "soft" and suggested the
use of "flexible" instead. Felt that the de-emphasis on programming was
wrong, and that all computers will need to be programmed. Rather than
reducing the need for programmers, we will be making programming easier,
and making it easier to describe more complex issues. Also (similar to
Refenes) suggested that the program set more modest goals, establish
benchmarks, and develop specific applications. He felt that 10 years was
a very short time, and wondered if time lines of 100 or 1000 years would
be necessary to computerize "wisdom". His vision of 6th generation
computer is one of "cooperative computation; the computer will be a
problem solving network, rather than a single serial machine. The average
user will use an expert system to configure a network of standard
components with established network protocols, whereas the advanced user
will "program" new networks for new applications, using environments for
distributed programming, including design of new components (silicon
compilers' mechatronics) and network protocols. Each 6th generation
computer will thus be a network of subsystems, including general-purpose
engines and special-purpose machines some of which (such as the front
ends for perceptual processors, and devices for matrix manipulation) will
be highly parallel machines. Some subsystems will use optical computing;
more speculatively, some may employ biomaterials. Another key aspect ...
is the use of learning in artificial neural networks, which can adapt
automatically to new tasks in a manner based on the learning principles
of the brain. We will also see devices and computers more tightly
integrated so the perceptual robotics will be an integral part of the 6th
generation design, with computers including robotic actuators and multi-
modal intelligent interfaces among their subsystems."
Professor Michael A. Arbib
Director, Center for Neural Engineering
University of Southern California
Room 03, Nedco Neurosciences Building
Los Angeles, CA 90089-2520
Tel: (213) 740-9220, Fax: (213) 746-2863
Email: ARBIB@POLLUX.USC.EDU
Subsequently, Arbib read a draft of this report and agreed that it
correctly summarized the content and spirit of the meeting.
OPTICAL COMPUTING.
Last year I wrote a summary of optical computing activities in Japan, see
"optical, 17 August 1990", but other remarks are also given in the
"nipt.90" report cited earlier. An excellent survey of Japanese research
in optical computing is given in "Optical Computing in Japan", S.
Ishihara (ed) 1990, Nova Science Publishers Inc., 283 Commack Road, Suite
300, Commack NY 11725. For additional information contact the editor
Mr. Satoshi Ishihara
Senior Researcher, Optical Information Section
Electrotechnical Laboratory
Tsukuba Science City, 305 Japan
Tel: (0298) 58-5625, Fax: (0298) 58-5627
Email: ISHIHARA@ETL.GO.JP
At this meeting, two lectures were presented on this topic, by
Prof. Takanori Okoshi
Research Center for Advanced Science and Technology (RCAST)
University of Tokyo
4-6-1 Komaba, Meguro-ku, Tokyo 153 Japan
Tel: (03) 3481-4436
and
Dr. Alan Huang
Head, Optical Computing Research Dept
AT&T Bell Labs
Room 4G514
Crawfords Corner Road
Holmdel, NJ 07733
Tel: (201) 949-7631
Okoshi explained that about 35 university professors have just launched a
Grant-in-Aid Special Research Project, to end March 1994, entitled
"Ultrafast Ultra-Parallel Optoelectronics", and his talk centered on
three examples of the work associated with that project. He displayed a
figure showing operating time versus operating power on which various
devices (silicon transistors, GaAs, Josephson Junction, etc) were
plotted, along with boundaries associated with cooling, numbers of
photons, and uncertainty, showing what kinds of devices will make sense
in different regions. For example, with 1 picowatt of power, the
uncertainty limit requires no less than 20 picosecond operations,
whereas with 1 milliwatt this can be reduced to 0.001 picosecond.
However, the photon limit, that is, the point below which not enough
photons are being delivered to reliably make decisions about "0" or "1",
is much more restrictive, forcing operation times of more than 1
microsecond with one picowatt of power. Details of this work are cited
as T. Kamiya, "private communication" in Okoshi's paper, but the work
was first published in English in the Nova book mentioned above
(T. Kamiya, pp. 407-417). [Thanks to Mr. Ishihara for pointing this
out to me.]
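These limits can be roughly reconstructed (my own sketch, not taken
from Okoshi's paper; the wavelength and the photon-count threshold are
assumptions). The uncertainty limit follows from dE*dt >= hbar with
dE = P*dt, giving dt >= sqrt(hbar/P); the photon limit requires N
photons of energy h*nu to arrive within one operation, dt >= N*h*nu/P.

    import math

    HBAR = 1.055e-34   # Planck's constant over 2*pi, J*s
    H    = 6.626e-34   # Planck's constant, J*s
    NU   = 2.3e14      # Hz, roughly 1.3 um light (an assumption)
    N    = 10          # photons per reliable 0/1 decision (an assumption)

    def uncertainty_limit(p):
        # dE*dt >= hbar with dE = P*dt  =>  dt >= sqrt(hbar/P)
        return math.sqrt(HBAR / p)

    def photon_limit(p):
        # N photons of energy h*nu must arrive within one operation
        return N * H * NU / p

    for p in (1e-12, 1e-3):    # 1 picowatt and 1 milliwatt
        print("P = %.0e W: uncertainty %.1e s, photon %.1e s"
              % (p, uncertainty_limit(p), photon_limit(p)))

With these assumptions the sketch reproduces the orders of magnitude
quoted above: roughly 10 picoseconds (uncertainty) and over 1
microsecond (photon) at 1 picowatt, and a fraction of a femtosecond at
1 milliwatt.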
Finally Okoshi concluded that "if an ultrafast optical computer is to be
realized in the future, we will be obliged to take advantage of the
parallel computation capability of the optical approach, because its
advantage cannot be emphasized too much on the power-speed trade-off
graph."
Okoshi then went on to describe a parallel logic system OPALS, two
dimensional surface-emitting laser arrays, and an experiment in
fabricating both AND and EOR units using a semi-insulating GaAs-wafer-
based high-mobility epitaxial GaAs layer as active medium. As these have
already been published in English their descriptions are omitted here.
He ended with the general remark that optical transmission is
still ahead of optical computing. With respect to the latter, research
is still at a premature stage, working toward prototypes, and many years
will be required, with lots of room for new ideas. However some
technology is already practical, such as optical memory (CD-ROM),
although opportunities are open for innovation. In the near future,
optical interconnections will become more important via fiber optics,
free space optics, and wave guides. With optical interconnections we
can already (1990) transmit at 10 Gbits/second. Okoshi also showed a table
of the rate of improvement in communication capability, which is
summarized below.
Year   Mbits/second   #Telephone Lines   #TV Channels
1981             32                460              1
1987           1600              23000             48
1990          10000             150000            200
Finally, he pointed out the improvement in long distance transmission
capability in 1990: about 10 km between repeaters using coaxial cable,
vs (experimentally realized) 364 km using optical lines.
My impression of this lecture was that it was very conservative, with a
great deal of hesitancy to commit as to whether optical computing will
really work, and if so how long it will take. The tone was entirely
different from that of Huang's lecture. Huang gave the last, and one of
the most upbeat, presentations, describing the work that his group is
engaged in.
He began his lecture by noting that today's supercomputer has a clock in
the range of a few nanoseconds, while the transistor runs at a few
picoseconds, a factor of one thousand difference, which he feels can be made
up by use of optical connections. Again, much of this has been published
so I only summarize his conclusions, i.e., he expects that using optical
output pads and various architecture modifications will allow 100 Gbit/sec
output. Using more speculative weak nonlinear optical materials he also
believes that femtosecond reaction times might be achievable. He also
stated that "optics can easily achieve over 50 times more connectivity"
(parallelism).
SPECIAL LECTURE.
A dazzlingly elegant lecture on symmetry was presented by Professor
T.D. Lee, Columbia University Physics Dept, and winner of the 1957 Nobel
Prize in Physics. Nevertheless, as far as I could tell, the only
connection with this Symposium was his remarks about quantum
chromodynamics (QCD) calculations requiring very fast parallel computers.
He showed a graph on which various special purpose QCD computers' speeds
were plotted against time. Early machines (mid 1980s) were capable of
about 100 MFlops, current machines are in the range of 10 GFlops, including
one (GF11) at the speaker's institution (Columbia), and another (QCDPAX)
at Tsukuba University. These are still orders of magnitude below the
performance that is required. At exactly the same time as this meeting, a
collection of high energy physicists were also conferring at Tsukuba to
discuss the same problem of Computing in High Energy Physics, and one of
the speakers there showed essentially the same slide as Lee did.
OTHER LECTURES.
A few other lectures are worth noting briefly.
H. Tanaka (U-Tokyo) described the Expectations and Problems in the World
of New Info Processing in a blitz talk loaded with facts and figures,
going far too fast for me to take in or get much out of. A few details:
he pointed out that ICOT's PIM/P machine will be generating 8 GIPS next
year. He also mentioned (***check this***) development of Micro 2000,
using 0.1 um (micrometer) CMOS, 25x25 mm chips (with FPU, 5*10^7 ?
transistors), 64-bit words, 4 PE per chip, capable of 2000 MIPS. Neural
chips with 125x125 (connections?) or more, from Hitachi and Mitsubishi.
W. Giloi (GMD-Berlin) discussed two research topics that his group has
been working on related to methods of programming massively parallel
systems. These are virtual shared memory, in which a distributed memory
machine can be programmed as if it had shared memory, and the virtual
processor model, in which the user can pretend that there are as many
synchronized processors as are appropriate for the application. He
claimed that such ideas are very well suited for real applications such
as lattice gauge (QCD) and finite difference computations. As Giloi
publishes in English it is not necessary to detail this further, except
to say that he made a very persuasive case (to me) about his activities,
and seemed deeply involved in system building and testing of these ideas.
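To give the flavor of the virtual processor model, here is a toy sketch
of my own (emphatically not Giloi's system): the user writes the update
for one virtual processor per grid point of a 1-D relaxation problem,
and a thin mapping layer folds the virtual processors onto the physical
ones.

    # Toy sketch of the "virtual processor" idea (my own illustration):
    # N_VIRTUAL virtual processors, one per grid point of a 1-D Jacobi
    # relaxation, are folded cyclically onto N_PHYSICAL physical
    # processors by a thin mapping layer.
    N_VIRTUAL  = 16
    N_PHYSICAL = 4
    LEFT, RIGHT = 1.0, 0.0     # fixed boundary values

    def virtual_step(u, i):
        # The work of one virtual processor: average its two neighbors.
        left  = u[i - 1] if i > 0 else LEFT
        right = u[i + 1] if i < len(u) - 1 else RIGHT
        return 0.5 * (left + right)

    def sweep(u):
        # Mapping layer: each physical processor executes its share of
        # virtual processors; all updates are applied synchronously.
        new = list(u)
        for phys in range(N_PHYSICAL):
            for virt in range(phys, N_VIRTUAL, N_PHYSICAL):
                new[virt] = virtual_step(u, virt)
        return new

    u = [0.0] * N_VIRTUAL
    for _ in range(200):
        u = sweep(u)
    print(" ".join("%.2f" % x for x in u))  # converges to a linear ramp

On a real distributed memory machine the mapping layer would also
handle the neighbor exchanges between physical processors; here
everything runs in one address space.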
Dr. W.K. Giloi
GMD Research Center for Innovative Computer
Systems and Technology
Technical University of Berlin
Hardenbergplatz 2, Berlin 12, Germany
Email: GILOI@GMDTUB.UUCP
Ryoji Suzuki (U-Tokyo) discussed general principles that biology can
teach us about computing. Suzuki is the chair of the fundamental theory
subcommittee. These principles are:
(1) Highly parallel distributed processing, including the architecture of
the brain, the role of efferent signals, how information is represented
in the brain (including the possibility that chaotic behavior of a
network could be a candidate for information coding), and understanding
the neuron as a processing unit.
(2) Learning and self organization, including the multilevel organization
of the brain (molecular, network, behavioral).
(3) Integrated processing of patterned and symbolic information (this
includes unconscious parallel processing, and later conscious serial
processing in the recognition system, integrated processing in the motor
control system, and mutual interaction between these).
International Symposium on New Information Processing Technologies '91
13-14 March 1991, Tokyo Japan
Program:
Greetings.
Eiji Kageyama, President JIPDEC (Japan Information Processing
Development Center)
Kohsuke Yamamoto, Director-General of Machinery and Information
Industries Bureau, MITI
Keynote Speech
Toward New Information Processing Technologies
Takemochi Ishii, Prof Univ of Tokyo & Chairperson, Research Committee
on The New Information Processing Technology
Special Lecture
Symmetry Principles in Physics
T.D. Lee, Prof Columbia Univ, Winner 1957 Nobel Prize in Physics
Research Reports
Information Processing Age in the 21st Century--an Impact of the New
Information Processing Technologies in Society
Tadashi Sasaki, Senior Advisor, Sharp Corp & Chairperson Social
Impact Subcommittee
What the Brain Tells Us: Towards a New Computational Principle
Ryoji Suzuki, Prof Univ of Tokyo & Chairperson Fundamental Theory
Subcommittee
Perspectives of New Information Processing Technologies
Shun-ichi Amari, Prof Univ of Tokyo & Chairperson Computer Science
Subcommittee
Panel Discussion
Towards Computing in the 21st Century
Coordinator: Shun-ichi Amari
Panelists:
M.A. Arbib Univ Southern Cal, USA
R. Eckmiller Univ of Dusseldorf, Germany
A. Hartmann MCC USA
B. MacLennan Univ of Tenn, USA
P.N. Refenes Univ College London, UK
J. Shimada ETL, Japan
K. Toyama Kyoto Pref Univ of Medicine, Japan
T. Yuba ETL, Japan
Fundamental Theory Session
Implementation of Learning Computational Principles in the Cerebellar
Neuronal Circuitry
Masao Ito, Inst of Physical and Chemical Research, Japan
(because of illness, this was replaced at short notice by)
Neural Mechanisms in Association Cortex
Hideo Sakata, Nippon Univ
Schemas and Neural Networks: From Brain Models to Cooperative Computation
M.A. Arbib, Univ of Southern Cal, USA
Integrated Computing Session
The World of New Information Processing--The Expectations and Problems
H. Tanaka, Univ of Tokyo, Japan
Programming Models for Massively Parallel Systems
W.K. Giloi, Prof GMD, Germany
Optical Computer and Devices
Ultrafast Ultra-Parallel Optical Information Processing and Transmission
T. Okoshi, Univ of Tokyo Japan
The Evolving Relationship Between Optics and Electronics
A. Huang, AT&T Bell Labs, USA
Closing Remarks
Masao Teruyama, Executive Director JIPDEC
-----------------END OF REPORT----------------------------------------