[comp.research.japan] Kahaner Report: NIPT Workshop

rick@cs.arizona.edu (Rick Schlichting) (12/29/90)

  [Dr. David Kahaner is a numerical analyst visiting Japan for two years
   under the auspices of the Office of Naval Research-Asia (ONR/Asia).  
   The following is the professional opinion of David Kahaner and in no 
   way has the blessing of the US Government or any agency of it.  All 
   information is dated and of limited life time.  This disclaimer should 
   be noted on ANY attribution.]

  [Copies of previous reports written by Kahaner can be obtained from
   host cs.arizona.edu using anonymous FTP.]

To: Distribution
From: David Kahaner ONR Asia [kahaner@xroads.cc.u-tokyo.ac.jp]
Re: New Information Processing Technology Workshop (6th Generation Proj)
26 Dec 1990

ABSTRACT. A two-day workshop was held 1-2 Dec 1990 in Hakone, Japan, to
discuss aspects of a possible new ten-year Japanese program to follow the
5-th Generation (ICOT) program that is to end in 1992. A summary of the
discussions is presented, along with some opinions about the directions
that this program might take.

INTRODUCTION AND SUMMARY. 
Over the past year I have written several reports on long term Japanese
research programs in computing. The most famous of these programs is the
"5-th Generation" project, also known as the ICOT.  ICOT is scheduled to
end in 1992 and the Japanese Government is studying possible follow on
projects. The most exciting of these, and the largest, is NIPT, or New
Information Processing Technology. The name "6th Generation Project" is
Western, not Japanese.  I reported on early plans for this in [nipt, 25
June 1990] listing the main goals of the project and names/affiliations
of various committee members.  It also gave some comparative information
about projects in the U.S. and E.C.  that overlap. See also
[japgovt.upd, 30 July 1990]. In March 1990 the Ministry of International
Trade and Industry (MITI) issued "Report of The Research Committee on
The New Information Processing Technology", in English, describing the
status and goals of the program. Both my June 25 report and MITI's
should be considered as significant appendices to this one. Please write
to me for copies of either of these.  (The MITI report is not available
in electronic form.)

The overall goals of the program may have evolved slightly since those 
reports, or perhaps different people have been talking about them.  
Briefly, NIPT is to perform research and development of new paradigm 
information processing technologies based on "soft (flexible) information 
processing" and "integrated computing." The actual meanings of these 
terms are vague enough that a great deal can be subsumed under them. The 
program is proposed to have three major components.  

1.    Theoretical.  To establish new theoretical foundations for soft
information processing on integrated computing systems.

2.    Technological.  To develop integrated computing systems with new 
architectures, including artificial neural networks and optical computing 
systems, which are well suited to soft information processing.  

3.    Application. To create and expand application domains of soft 
information processing functions on integrated computing systems.  

More detailed goals of the program have not been articulated and its 
targets are explicitly to be left flexible.  

If this program goes forward as its proponents hope, it will be funded 
over 10 years at the level of $30-40 Million (U.S.) per year, beginning 
in 1992.  

The general Research Committee on New Information Processing 
Technologies draws upon three subcommittees, in Fundamental Theory, 
Computer Science, and Social Impact. The CS subcommittee is in turn 
broken into two collections of working groups. The overall organization 
is shown below.

 NIPT Committee
  Fundamental Theory Subcommittee
  Computer Science Subcommittee
    Integrated Computing Working Groups
      Theory and New Functions Sub-Working Group
      Neural Systems Sub-Working Group
      Massively Parallel Systems Sub-Working Group
    Optical Computer and Devices Working Groups
      Needs for Optical Computing Sub-Working Group
      Parallel Optical Digital Computer Sub-Working Group
      Optical Neural Computer Sub-Working Group
      Optical Interconnection Sub-Working Group
  Social Impact Subcommittee


Because of the intense international interest, and also to solicit new
ideas as part of the planning process, the 
    Advanced Information Technology Research Office
    Japan Information Processing Development Center (JIPDEC)
    3-5-8 Shiba-Koen, Minato-ku, Tokyo 105 Japan
    Tel: +81 3 432-5405, Fax: +81 3 431-4324
organized a Workshop in Hakone, about one hour's train ride south of
Tokyo, 1-2 December 1990, to bring together the working groups and to 
discuss the new NIPT proposed program.  

Invitations to the Workshop were limited to about 50 persons from 
the Japanese NIPT working groups, MITI, and JIPDEC, plus researchers 
from the U.S., France, and Germany. A few other foreign researchers were 
invited but unable to attend. In addition, six U.S. Government officials 
attended as observers, as did one person from the German GMD Liaison
Office in Tokyo.  The U.S. observers, other than myself, were

        Dr. Eugene Wong
        Associate Director
        Office of Science and Technology Policy
        The White House
        Washington, DC
         (202) 395-3902

        Dr. Lance Glasser
        Program Manager
        Defense Advanced Research Projects Agency (DARPA)
        Information Science and Technology Office (ISTO)
        1400 Wilson Blvd
        Arlington VA 22209-2308
         (703) 614-5800
         GLASSER@DARPA.MIL

        Mr. John E. McPhee
        Director
        Office of Computers and Business Equipment
        U.S. Dept of Commerce
        International Trade Administration
        Room HCH-1104
        Washington, DC 20230
         (202) 377-0572

        Dr. Robert A. Kamper
        Director, Boulder Laboratory
        National Institute of Standards and Technology
        Electromagnetic Technology Division
        Boulder Colorado
         NICOLETI@CENTRAL.BLDRDOC.GOV

        Dr. Edward Malloy
        Counselor for Scientific and Technological Affairs
        Embassy of the United States of America
        10-5 Akasaka, 1-chome
        Minato-ku, Tokyo 107, JAPAN

The Workshop was organized as follows. 
 * A half morning general session (Japanese speakers)
 * Two parallel sessions lasting into the night of the first day and 
     half the morning of the second. 
       "Integrated Computing Track"
            Approach to Integrated Computing session
              (reports from the three Japanese sub-working groups)
            Overseas Activities in Integrated Computing session (US & EC)
            Informal Discussion on Future Information Technologies (all)
            Comments on NIPT session (US & EC)
       "Optical Computer and Device Track." 
            Approach to Optical Computing session 
              (reports from the four Japanese sub-working groups)
            Overseas Activities in Optical Computing session (US & EC)
            Informal Discussion on Future Information Technologies (all)
            Comments on NIPT session (US & EC)
 * A late morning general session titled "Toward International 
     Cooperation" with formal presentations by the Japanese, and informal 
     presentations by U.S., EC, & GMD.  

Shortly after the conference ended Dr. Kamper sent me a copy of his 
excellent trip report. Kamper attended the Optical Computing track, while 
I attended the Integrated Computing track.  Because our views on the 
combined sessions seemed similar I have decided to merge his report into 
mine in the following way. In the section labelled Optical Computing, all 
the comments are Kamper's. Comments about Integrated Computing are mine.  
Remarks about the combined sessions are a mixture, sometimes quoted, 
although I take responsibility for the content. For a complete copy of 
Kamper's report please write to him directly.  I also had an opportunity 
for discussions with the other attendees including Dr. Wong, Dr. Glasser, 
and Mr. McPhee. Nevertheless all the comments below are my own and, as 
usual, do not represent any official policy.  If other attendees send 
summaries to me, I will revise this report to reflect them.  

It was the impression of most of the foreign attendees that major
aspects of this program are still very vaguely defined. In fact, with
the exception of some industrial research projects, the majority of the
factual information that was contributed seemed to be from outside
Japan.  Partially this was because the researchers came prepared to talk
about specific research activities, and the Japanese seemed to be more
interested in discussing the general directions of the program. This was
reflected in comments as to whether this was a "project", "program", or
"initiative". Ignoring the semantics, I have used "program" consistently
here without attempting to differentiate it from the other terms.

MITI's description of possible international cooperation was also very 
vague. The hand drawn overhead transparency on this was greeted by good 
natured howls because of its complexity as well as warm support for the 
speaker's willingness to present it.  The official U.S. response by Dr. 
Wong was also vague, as there was little to respond to. Nevertheless, 
several U.S. and EC researchers admitted that they were quite interested in 
accepting money from any source that was handing it out, independent of 
official government-to-government agreements.  

RECOMMENDATIONS.
It is not possible yet to know what form this program will take, or even 
if it will definitely be funded. At the moment it seems a long way from a 
coordinated project, and looks more like a general umbrella under which a 
large number of research topics will be covered (see Yuba's comments 
below for a list).  The optical computing portion will certainly go 
forward. The integrated computing portion, where most of the software 
research is centered, will probably be supported but I feel that it needs 
to be more clearly defined first. If the massive parallelism portion 
results in an effort to design and build a very large system, that 
activity will attract world class researchers like bears to honey.  
Aspects of the NIPT program probably overlap significantly with the U.S. 
High Performance Computing Initiative.  

The preceding program (5-th Generation Project) did not emphasize 
international cooperation to the extent that this one appears to do, and 
so it is very important that all relevant scientific organizations be 
involved, firstly in the shaping of that cooperation, and then in its 
implementation. The technical aspects of the program also need to be 
followed and reported outside Japan. Another meeting is to be held March 
13-14, in Tokyo. Current plans are for a total attendance of about 400, 
with 100 from MITI. Talks will be in Japanese and English with 
simultaneous translation, and it is essential that international 
organizations participate.  

As the program firms up it would be very healthy to invite several of the 
key scientists to technical meetings in the U.S. and elsewhere so they can 
articulate their ideas personally and discuss them with other (normally
skeptical) scientists. 

DETAILED SUMMARIES, GENERAL.
From what I heard in the combined sessions about optical computing, it
seemed that this was the better focused of the two tracks.  Research has
gone from fiber optics, to optical interconnections, and is now
beginning to move to optical computing at the device level.  This seemed
to me to be entirely related to hardware research. Another colleague who
attended this track also mentioned to me that there was almost a
complete absence of discussion of software issues there.  See also my
report [optical, 17 August 1990]. However, Kamper felt that this track
was not yet focused enough.

The integrated computing track was very poorly focused. It 
was not clear if any concrete ideas have crystallized yet, particularly in 
the software area. In fact, the only software/computer science talk was
by Agha (U Illinois). Oyanagi (Toshiba) did touch on the question of
software, but mostly to comment on the difficulties in bridging the
hardware/software gap, and his talk focused more directly on
technologies for implementation. Opening remarks by Amari (U Tokyo),
discussing the difference between logical and intuitive computing and
the need for new information principles and more basic mathematical
theory, were not followed up by the Japanese with any general plan for
how to go about doing this. However, Amari himself has made
significant and deep theoretical contributions in neural networks and
related learning theory. He was also the organizing chair of this
Workshop and his ideas are held in very high regard. I was disappointed
that he was unable to participate beyond the opening sessions, as his
perspective would have been very helpful.

Massive parallelism was mentioned frequently, but I found 
myself confused by what the speakers actually meant. At first I thought 
that it referred only to neural computing, especially when terms like 
"adaptable", "self organization", "learning", "advanced human interface", 
and "brain machine" were repeatedly used. Many of the examples, especially 
those related to optical computing, emphasized neural networks such as 
for English to Japanese translation and intelligent feature extraction 
from noisy images.  This might perhaps be generalized in some way, such 
as to genetic computing, which was enthusiastically described by 
Muehlenbein, (GMD-Germany).  The Japanese, informally, agreed that genetic 
algorithms were an important element that was not emphasized strongly
enough.  Later there were some discussions of data-flow computing by
attendees from the ETL lab in Tsukuba where much of this work has been
going on.  See my report [etl, 2 July 1990]. Eventually I asked
specifically if massive parallelism was meant to be more than neural
networks, and here the Japanese seemed honestly to be undecided. T. Yuba
(ETL), who will play an important role in the management of this program,
had a "who knows" expression, and Shinada (ETL), a data-flow researcher,
said "I hope so".  

The committee is clearly not interested in the type of massive
parallelism that has resulted in the Connection Machine.  As far as I
could tell, there was no discussion about engineering applications such
as those that drive current supercomputing activities.  There was no
discussion of biological computing, and this was criticized by several
attendees. I asked if "soft" computing was related to "fuzzy logic" and
after some hesitation one Japanese scientist admitted that he felt the
latter was a "quick and dirty" approach and that NIPT was hoping to look
at much more fundamental ideas. None of the other Japanese disputed that
statement.

Kamper made the following observations.
"Nobody made any commitments and nobody revealed any technical 
information or even technical opinions that were not already common 
knowledge.  Nishikawa (MITI) stated that the purpose of the meeting was 
simply to collect ideas as a basis for developing more concrete 
proposals, and I suppose there was some merit in airing the field of 
common knowledge, all in one place, with the participants alert to the 
context.  Certainly we all came away with a mutually understood view of 
the status and prospects of the field.  Otherwise there was very little 
progress to report.  

No one person can speak for a democratic nation, so the process of 
arriving at a consensus must be slow and iterative.  It would help to 
have "straw man" proposals to study ahead of time, so that delegates 
could go to a meeting knowing in advance what their nations regard as the 
limits of negotiation.  This requires a clear vision of goals of the 
program, and I sense that we are a long way from that.  We need some 
good, strong ideas supported by arguments that can survive discussion by 
skeptical people.  The origin of the optical computing program at UCSD is 
a good example to study.  S.H. Lee's original vision was tempered by the 
different views of colleagues and funding agencies to become a very 
productive program.  Perhaps we should start with a similar vision for an 
international program." 

At the outset, it was emphasized to the attendees that the talks should
be very frank and open; the Japanese had been encouraged beforehand not
to wear ties!  Nevertheless, many of the public statements were
cautious, especially those from Government representatives, both
Japanese and U.S.  A number of Japanese were clearly uncomfortable
speaking to officials whom they did not know.  This was amplified by the
relatively large U.S.  official presence, which was definitely a topic
of conversation.  My own feeling was that the Japanese scientists were
very frank and open with me. I believe that they sincerely want Western
opinions, and hope that the program will have a strong international
cooperative component. Of course, the challenge is to arrange that
cooperation so that all parties benefit. Perhaps being here more than a
year has helped a little to make communication easier. I have had almost
no interaction with MITI or other Japanese Government officials, and
have no comments about their views.

I would like to suggest that the non-Japanese Workshop participants were 
fortunate to take part in what would normally be considered internal 
discussions about a program that is still taking shape, and were given 
an opportunity to see aspects of the Japanese decision making process at 
work. It is difficult to inject a systematic Western view into the 
Japanese consensus oriented one without both sides being ill at ease.  
Perhaps a few foreign attendees came expecting a more definitely 
structured plan.  There were a great many "this is my personal opinion" 
disclaimers from the Japanese, eventually generating good natured 
laughter. A Western 
perspective might be that none of these people can make a commitment for 
the Government.  Perhaps a more generous statement would be that the 
decision process goes around and around until at last everyone's personal 
opinion agrees, and then this naturally becomes the official opinion.  

On the other hand, a final report is due March 1991, and there does not 
appear to be much time left to define the program well enough that it can 
be funded coherently. One of the attendees who often distributes research 
funds remarked that "I'm glad that I don't have to defend this program." 
The current phase of the program is viewed as preliminary, ending this 
month. A feasibility study will begin now and continue through most of
1991.  The actual national program, if it is funded, will begin in late
1991. In the U.S. it is unlikely that a program this vague would be
funded without significant changes. The Japanese system may be
different; it is almost certain that something will be funded.


DETAILED SUMMARIES, COMBINED SESSIONS.

(Combined Session)  OVERVIEW OF THE NEW INFORMATION PROCESSING INITIATIVE

S. Amari (U. of Tokyo), who was chairman of this session, opened the 
meeting with a somewhat abstract overview of the new technologies for 
information processing.  Amari is one of the leading figures in Japanese
neural net research, and his papers have international readership. He
compared present-day computers and "hard" logic to the human brain and
"soft" logic, and proposed a 10-year program with flexible targets to
develop theories and models, system architecture, and device technology.
He emphasized the flexibility of the program's targets. Finally, he
remarked that it is just as difficult to cultivate cooperation among
institutions in Japan as it is to cultivate international cooperation.

T. Yuba (ETL) gave a summary of the goals that MITI hopes to achieve
with the NIPT program. He went on to describe difficulties of
conventional information processing, such as its inability
      - to represent and process ambiguous or incomplete information,
      - to describe and solve problems in which a multitude of
          inter-related information factors are involved,
      - to adapt or generalize itself to environmental information
          which is changing dynamically.

He also gave some examples of the usage of the term "Soft":

               Soft Information:
                  1) Ambiguous
                  2) Incomplete
                  3) Multi-modal
                  4) Mutually Dependent
                  5) Massive

               Soft Control:
                  1) Learning
                  2) Self-organizing
                  3) Optimal
                  4) Adaptive
                  5) Massively parallel

               Soft Processing:
                  1) Processing of soft information
                     - Robust, reliable, high-speed, etc.
                  2) Processing based on soft control
                     - Learning, self-organizing, optimizing, etc.

               Soft Evaluation:
                  1) Evaluation based on soft information
                     - Robust, reliable, high-speed, etc.
                  2) Evaluation with allowance for uncertainty 
                     - Ambiguous/incomplete criteria, adaptive criteria, etc.

Finally, in the three subdomains of the initiative, theory, technology,
and applications, he described the research subjects that the initiative
would support.
  1. Fundamental Theory
      Theory of soft information processing, which is the theoretical
      foundation for such functions as:
         - Processing of ambiguous/incomplete information
         - Solving of approximately correct problems
         - Integration of massive information
         - Learning and self-organization

  2. Fundamental Technology
         - Integrated computing systems with advanced architectures,
             utilizing neural computing and optical computing
             technologies.
         - Massively parallel and distributed operating systems and
             high-level languages.

  3. Application Domains
         - Recognition via constraint satisfaction and active
             perception (Image/speech processing)
         - Problem solving by using soft knowledge and soft
             inference (Expert systems)
         - Self-organizing information base and ambiguous query
             systems (Database)
         - Open system simulation (Simulation)
         - Autonomous and cooperative control of multiple robots (Robotics)

N. Otsu (ETL) talked about fundamental theory and soft logic.  His 
lecture was essentially a summary of his own research on nonlinear 
feature extraction in pattern recognition using Bayesian methods to 
define a best estimate. Many people were puzzled that he chose to present 
this extremely technical talk at the combined session of the Workshop.  
In the final few minutes Otsu described some impressive applications of 
his work to an adaptively trainable vision system.  Unfortunately, by 
this time most of the audience had lost the main point.  Except for a few 
in the audience who were well versed in probabilistic logic, not much 
information was communicated.  

T. Kamiya (U. of Tokyo) discussed the prospects for optical technology.  
He started with a brief history of the topic and referred to three 
reports that summarize the present status: "Optical computing in Japan," 
edited by S. Ishihara; proceedings of the Kobe meeting, 1990; and the 
JIPDEC Report.  Kamper notes that "Having read two of these documents, I 
can confirm that they cover nearly all of the technical material 
discussed at this Workshop." The Kobe meeting is also discussed in my 
report [optical, 17 August 1990].  

Kamiya reviewed the present status of optical devices.  Among those 
already demonstrated are:  switches with picosecond switching time; 
optical interconnects for wafer scale integration; an optical system 
using spatial light modulators (2 dimensional) for parallel digital 
processing; and an opto-electronic neural chip.  

For a research strategy, he proposed a focus on optical interconnects and 
various types of optical computers:  dedicated, special-purpose 
computers; optical digital signal processors; image processors; 
workstations; intelligent robots; supercomputers and mainframes. To me 
and many others in the audience, these directions appear perfectly 
reasonable.  

(Combined Session) TOWARD INTERNATIONAL COOPERATION.

This session brought in some higher officials from MITI including
        Mr. Hidetoshi Nishimura
        Director
        Information, Computer and Communications Policy Office
        Machinery and Information Industries Bureau
        MITI
        1-3-1 Kasumigaseki, Chiyoda-ku, Tokyo 100
         Tel: (03) 501-1511, ext 3321-5, (03) 501-2964 (direct)
and
        Mr. Taiso Nishikawa
        Deputy Director
        Industrial Electronics Division
        Machinery and Information Industries Bureau
        MITI
        1-3-1 Kasumigaseki, Chiyoda-ku, Tokyo 100
         Tel: (03) 501-1511, ext 3341-6, (03) 501-1074 (direct),
              (03) 501-2788 (direct).

Edward Malloy (U.S. Science Counselor, from the Embassy in Tokyo) also
attended.  An interpreter was present to help avoid misunderstanding,
but she was occasionally corrected by bilingual participants.  Prepared
talks were interspersed with informal discussion, moderated by T. Kamiya
(U Tokyo).

T. Nishikawa (MITI) repeated that this Workshop was part of the process 
to establish a program to follow the 5th Generation Project when it ends
in March 1992.  The preliminary study started in 1989 and a final report
is due in March 1991, when it will be followed by a feasibility study.

He then carefully defined four distinct types of international 
cooperation, divided by source(s) of funds, location of research 
laboratories, and the option of short term exchange of scientists.  All 
four had the common feature that scientists from both countries 
participate.  

With respect to the NIPT program, he expressed MITI's commitment to 
international cooperation but stated directly that there is as yet no 
concrete proposal, and that time and discussion as well as ideas are 
needed.  He then showed a diagram of his personal view of the form the 
organization of the cooperation should take.  It was a very complicated 
combination of the simpler diagrams he had used to describe his four 
types.  He proposed that a central office should control a pool of funds 
supplied by Japan, USA, and EC.  MITI and counterpart government 
organizations in the other participating countries should be involved.  
Laboratories should be located in all participating countries, with 
exchange of scientists.  The central pool should also fund international 
consortia.  The whole organization should have complete symmetry with 
respect to national borders.  It should operate under the U.S./Japan 
Agreement on Cooperation in Science and Technology and a counterpart 
EC/Japan agreement.  Intellectual property rights should be allocated 
according to degree of participation and in accordance with the various 
S&T agreements.  

Discussion --------
G. Agha (U. of Illinois) asked what mechanism will be used by the 
Japanese government to choose among the options.  Nishikawa replied that 
a survey team had been sent out before the Workshop, but he did not 
comment on its findings.  He stated that the Japanese government has no intention 
to press foreign participants to follow concepts it had developed 
unilaterally.  He expects the 1991 feasibility study will be funded in 
Japan, and encourages the U.S. and EC to organize parallel studies.  He 
declined to suggest the form of these studies, but called for a 
commitment to cooperation.  

M.E. Prise (Bell Labs) commented that as an individual he is eager for
international cooperation, but as an employee of a U.S. corporation he
cannot make any commitments without concrete proposals and the
establishment of company and government policy.  Nishikawa replied that
he understands the point.  He said that the purpose of this Workshop is
to collect ideas and comments from individual researchers as a basis for
further development.-----------

E. Wong (OSTP) presented the U.S. government view.  He explained that the 
U.S. government officials at this meeting came as observers, not 
participants, and were attracted by the prominence of international 
cooperation on the agenda.  He recognized many theoretical advantages of 
cooperation, such as economy in the use of R&D funds that could be used 
for other means of promoting economic growth in a time of world-wide 
capital shortage.  But he pointed out that excellence is driven by 
competition, that Japan has learned better than the U.S. that cooperation 
and competition can coexist, and praised MITI for fostering both 
successfully.  He pointed out that several members of the U.S.  
delegation represented science and technology agencies that are eager to 
participate in the early stages of planning, and that in a well designed 
program, individual institutions and national interests are accommodated.  
Finally, he suggested that this new initiative should be organized under 
the U.S./Japan S&T Agreement.  

Discussion ------------
G. Agha (U. Illinois) asked what role the federal government would have, 
and pointed out that the U.S. has a decentralized system in which 
individual organizations respond mainly to sources of funding, and that 
he personally was anxious to begin discussions with NIPT staff.  Wong 
recognized our tradition of decentralization and the generally 
independent nature of scientists.  He saw the government's role as 
catalyzing and coordinating the effort, with everyone's goodwill.  T.  
Kamiya (U. Tokyo) asked what the motivation to cooperate is.  Wong 
replied that all parties should gain more from a cooperation than they 
would as individuals.  S. Lee (UCSD) remarked that he is attracted to 
cooperation because he prefers fighting nature to fighting people.  He 
emphasized the need for fairness, considering both past and future 
investments.-----------

H.W. Muehlenbein (GMD, Germany) presented the European perspective, and 
commented that funding in the EC's ESPRIT program is much larger than 
NIPT's is likely to be, but that the optical computing component looked 
about the same.  [The major European Community research funding agency is 
the European Strategic Programme for Research in Information Technology 
(ESPRIT).  This has a billion dollar budget, spent mostly on short-term 
research in electronic computing.  About 5% is spent on optical computing 
projects.  Another, but somewhat less prominent, program funding optical 
computer research is Basic Research in Industrial Technologies for Europe 
(BRITE).  
There is a conference series (ISOC), but the major forum for coordination 
of research is the series of ESPRIT project meetings.] Muehlenbein 
remarked that the members of EC are experienced in international 
cooperation, but it works well for them because ESPRIT has plenty of 
funds.  Without that there would be more conflict than cooperation.  He 
emphasized a point that was taken up by other university researchers, 
that without the promise of new funds they will not be interested in 
participating.  He advised the governments involved in this initiative to 
concentrate on that aspect of the organization.  

T. Hagemann (GMD-Tokyo) defined the issues as:  what to do; what to do 
with the results (intellectual property rights); and how to do it 
(organization and funding). The property rights issue should be clarified 
at the very beginning to avoid headaches later on. He suggested that 
distributing property rights to the contributing researchers was the best 
mechanism.  Using ESPRIT as a model, he suggested that funds should stay 
with the contributing organizations rather than being collectively 
managed or divided up according to the shareholding ratio of the 
partners.  He did not favor the establishment of a central research 
laboratory, nor the exchange of scientists among participating 
laboratories, but considered that communication and coordination are 
enough.  He believes that consortia should have balanced partners (e.g., 
company with company, or university with university) and that their 
funding should also be balanced and come from local sources.  He asserted 
that the most significant effect of ESPRIT has been to get the European 
scientists to know one another and to cooperate.  Finally, he recommended 
that Japan approach the EC Commission rather than individual national 
governments.  

Discussion -----------
W.T. Cathey (U. Colorado) commented that most industrial participation in 
ESPRIT is on short-term projects only.  Industry/university partnerships 
should not be excluded for long term work.  Also, central management of 
funds would bring coordination that would not otherwise occur.  Hagemann 
said he was not convinced.  P. Chavel (CNRS-France) agreed with 
Hagemann, particularly on the virtue of local organization of research 
with coordination.  He felt that funding and participation need be 
balanced only on average.  S. Lee agreed strongly with Muehlenbein's 
comments about funding.  He stated that he is not interested in a zero 
sum game.  Several people in the audience declared that they had been the 
victims of zero sum games in the past.  G. Agha said that university 
researchers welcome international cooperation, and neither know nor care 
what attitude the U.S. government has towards it. ------

H. Nishimura (MITI) closed the meeting with a few brief remarks and much 
good humor.  He asserted that a mood of trust had been established (Not 
everyone would agree.  People were very careful what they said to those 
they didn't know and free-form discussion tended to dry up).  He also 
stated that MITI is drafting a policy to support basic research in Japan.  


OPTICAL COMPUTING TRACK.--------------------------------------
 BACKGROUND.

Most people define an optical computing system as one in which some 
functions are performed by optical devices.  These will be accepted as 
replacements for the corresponding electronic devices only when they 
demonstrate a 
clear advantage in system performance.  It is unlikely that we will see 
an all-optical computer except perhaps for some very specialized purpose.  

The supreme advantage of optics lies in parallel processing.  Closely 
packed and intersecting channels do not crosstalk except at detectors or 
other non-linear devices.  Interconnection can be made without the energy 
penalty of mismatched transmission lines, although at present there is 
another energy penalty from the inefficiency of electrical/optical 
conversion.  One of the goals of optical computer development is 
therefore a massively parallel digital processor.  Another goal derived 
from this is the exploration of neural networks and "soft" logic.  In 
this respect one of the speakers showed a plot comparing speed and 
complexity that put the potential performance of an optical computer 
comfortably ahead of that of the brain of a bee but a long way short of 
the human brain.  

There is research in progress in many parts of the world that has already 
demonstrated some very respectable devices, such as the Self Electro-
Optic Effect Device (SEED) developed at AT&T Bell Laboratories, and an 
optical neural chip developed at Mitsubishi.  Optical interconnects among 
the chips and boards of an electronic computer are developed almost to 
the stage of becoming commercial products.  Between these and a neural 
network dreaming away in soft logic lies a very wide field in which to 
develop new practical systems and devices, and part of this field could 
be very appropriate for an international collaborative program.  The 
problem is to define which part, and this Workshop attempted to do that 
with very limited success.  

(Session) APPROACH TO OPTICAL COMPUTING.
Planning for the Optical Computing part of the NIPT initiative has been 
in the hands of four Sub-Working Groups, who reported their progress in 
this session.  

Y. Ichioka (Osaka U.) talked about optical digital computing.  The goals 
are to develop:  optical parallel computing systems; parallel and 
distributed optical functional circuits; electronic systems with optical 
components (e.g., interconnects); parallel inputs and outputs, concurrently 
addressable; and parallel memory systems. From these he derived a list 
of the components that are needed: LED and laser diode arrays; functional 
array devices (e.g., threshold devices); parallel shutter and memory 
arrays; spatial light modulators; opto-electronic integrated circuits; 
holographic elements; microlens arrays and high performance lenses; and 
diffractive optics.  

He offered no selection from this list, and talked of a 10-year 
development program leading to an optical mainframe computer.  

K. Kyuma (Mitsubishi) talked about optical neural computing.  After a 
discussion of the advantages of neurocomputing and optical implementation 
thereof, and a summary of the current status of research, he listed the 
research targets that his working group had identified: neural models for 
optical implementation; neural models and architectures for direct image 
processing; modular and expandable models and their optical 
architectures; and optical architectures for multi-parallel systems.  For 
device development, he emphasized the need for computer aided design 
systems specialized for optics.  

O. Wada (Fujitsu) talked about optical interconnects.  He reviewed the 
requirements and defined the ranges of computing speed and clock 
frequency in which optical interconnects will most likely find their 
place.  Then he categorized the various devices that already exist in 
some form according to function on a three dimensional plot that was in 
itself a wonderful example of "soft logic," with axes that change 
character from one end to the other, like something out of "Alice through 
the looking glass."  The result was a surprisingly expressive and easy to 
follow global categorization of relationships.  He was careful to 
distinguish the characteristics required for telecommunications from 
those required for interconnects in computers, and laid out a logical 
progression of development that could lead from one to the other.  Apart 
from that, and a comprehensive list of problems that could become 
research topics, he did not venture a specific course that a formal 
program should take.  

S. Ishihara (ETL) reported the deliberations of the working group on 
needs for optical computing.  He spent a lot of time describing the 
strategy and mechanics of the committee itself, but never came to the 
point of offering any conclusions.  He promised they will be reported in 
the final report of the working group, due in March 1991.  He offered his 
personal view that it is difficult to get international cooperation on 
the development of concrete, practical applications.  An international 
program should be on a long-term, fundamental level.  

Comment: Kamper notes that "It was clear to me from these presentations
that none of the four working groups has been able to develop a specific
program plan or even to define priorities among the major fields.  I
doubt if the discussions at this Workshop were of much help in this
respect.  We are a long way from defining a program that could form the
basis for negotiations or terms of partnership."

(Session) OVERSEAS (I.E., NON-JAPANESE) ACTIVITIES IN OPTICAL COMPUTING.
S.H. Lee (UCSD) described the considerations and arguments that were used 
to plan the rather well coordinated optical computer program that he 
directs at the University of California at San Diego (UCSD).  He 
presented a clear view of the strengths of optical computing and the 
components of the supporting technology on which the development effort 
should be focused.  He described a conceptual Programmable Opto-
Electronic Multiprocessor (POEM) System that had provided a framework for 
planning.  Applying quite general principles showed that an optimized 
realization of the POEM system would be faster than an electronic system 
for processing anything larger than a 100x100 array.  His group is 
working systematically to develop all the components needed to realize 
the concept, including architecture, processors, memory, interconnects, 
and packaging.  His talk was a fine demonstration that it is possible to 
plan and conduct a systematic program to develop a practical optical 
computer, even though the job will not be finished for one or two decades 
and the final form the system will take is unknown as yet.  

W.T. Cathey (Colorado U.) described the program at the Optoelectronic 
Computing Systems Center, an NSF-funded Center of Excellence at the 
University of Colorado.  This appeared to be less highly coordinated than 
that at UCSD, but has projects in several areas that will obviously 
contribute to the development of one or another of the concepts for 
optical computing that appear promising at present.  In fact, proof-of-
principle projects are emphasized at the Center.  One original device he 
described manipulates a sequence of bits circulating in an optical fiber 
loop, using an electrically driven crossbar switch.  It is capable of 
time-division multiplexing and generates pretty results.  

P. Chavel (CNRS) described research on optical computing in the European 
Community.  He did not cover work in Eastern Europe, which excludes a fair 
body of Russian work, nor did he include the "Outer Six" countries of the 
European Free Trade Association (EFTA).  

The three major research groups are at Erlangen, Germany; Edinburgh, 
Scotland; and Paris, France.  Some of them have good fabrication 
facilities, but most focus their attention on device physics and devices 
that can be made with modest resources.  Professor S.D. Smith, a leading 
figure in ESPRIT, believes that the highest priority should be given to 
developing the enabling technologies.  Several pretty devices have been 
invented and demonstrated.  These include a non-linear Fabry-Perot (NLFP) 
device (Edinburgh) that is optically bistable.  This was adapted to GaAs 
technology in Paris, where extensive work with multiple quantum wells has 
been reported and an 8x8 electrically addressed Spatial Light Modulator 
(SLM) using GaAs technology has been demonstrated.  There is much work on 
interconnects both in Paris and in Erlangen, and some work on optical 
analog computing.  The overall impression of optical computer research in 
Europe is of plenty of flourishing, productive device development 
projects with minimal coordination.  

M.E. Prise (AT&T) summarized some highlights of optical computer research 
at AT&T.  After the usual review of the benefits of optical technology in 
computing, he described some impressive device development.  This 
included extensions of the Self Electro-Optic Effect Device (SEED) 
principle (invented at AT&T), especially to arrays of devices, and much 
work on interconnects, all with characteristic Bell Laboratories quality.  
Good fabrication facilities can certainly be recognized in the products 
of a device research program.  

(Evening Session)  DISCUSSION ON FUTURE OPTICAL INFORMATION PROCESSING.
This was planned as a session for spontaneous discussion, following a 
formal Japanese dinner.  It was not very successful.  Several people gave 
short, unprepared, unfocused talks on what seemed to be a random 
selection of small topics and general truths.  Analog computing and 
photorefractive devices were discussed, but no conclusion was reached.  
The audience was very coy about participating in a discussion.  Finally 
the session lapsed into silence.  Someone got up with a set of viewgraphs 
to give a short, prepared description of his own project (a device called 
VSTEP, similar to SEED), and when he finished the session ended without 
regret.  

(Session) COMMENTS ON THE NIPT INITIATIVE WITH REGARD TO OPTICAL COMPUTING.
The speakers in this session had been asked in advance to talk and were 
prepared.  Their comments were more in the nature of general advice than 
critique of the rather formless Initiative.  

S. Lee shared the experience of planning the optical computing program at 
UCSD.  His view is that an optimum computer would combine electronic and 
optical functions where each excels, so it is important to look for 
opportunities to combine technologies.  The main focus he chose was on 
optoelectronic packaging and interconnects, looking towards large arrays 
with wide bandwidth.  As a criterion for comparing technologies he used 
the operation of an NxN perfect shuffle as a benchmark.  He asserted that 
there is no point in developing a new technology unless it can be 
predicted to offer at least two orders of magnitude improvement over the 
technology it is to replace.  He said it is necessary to choose among the 
technical options at an early stage and to have a clear vision of what 
type of computing one is trying to develop:  digital optical computing, 
neuro (or fuzzy logic) computing, or a data base machine?  He did not 
venture to suggest choices for the NIPT Initiative.  
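
For readers unfamiliar with it, the perfect shuffle is the permutation
that interleaves the two halves of a data array, the way a riffle shuffle
interleaves a deck of cards; repeated shuffles (with pairwise exchanges)
can route arbitrary permutations, which is why it is a natural benchmark
for interconnection networks. The short Python sketch below is my own
illustration of the permutation, not anything presented by Lee.

    def perfect_shuffle(x):
        # Interleave the first and second halves of x: the output is
        # x[0], x[n], x[1], x[n+1], ... for an input of length 2n.
        n = len(x) // 2
        assert len(x) == 2 * n, "input length must be even"
        out = [None] * len(x)
        for i in range(n):
            out[2 * i] = x[i]          # i-th element of the first half
            out[2 * i + 1] = x[n + i]  # i-th element of the second half
        return out

    print(perfect_shuffle(list(range(8))))   # [0, 4, 1, 5, 2, 6, 3, 7]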

W.T. Cathey tried a little harder to come to grips with the problem of 
program definition facing MITI.  He pointed to cross fertilization among 
technologies, citing optical communications and display technology as 
promising contributing fields.  He discussed joint research 
projects, and emphasized the importance of answering the basic questions 
of mutual benefit, complementarity and funding.  He was the first speaker 
to raise the important topic of funding.  As possible topics for 
collaborations he listed:  pattern classification; architecture for 
optical computing; system specification of devices; impact of massive 
parallel or very fast interconnects on architecture; and potential neural 
network applications.  

He was also the first speaker to acknowledge that the Workshop did not 
appear to be converging on any definite conclusions.  He suggested there 
should be another meeting with different structure.  First, the 
integrated computing and optical groups should not be separated.  Then 
expert system architects and computer scientists should be brought in to 
remind the audience what the rival technologies can do, evaluate 
suggested systems, define needs, and design a suitable architecture.  A 
final report of the meeting should be required before it closes.  

In the discussion, a question was raised about whether a university would 
be capable of mounting a "critical mass" effort.  Since both speakers had 
done just that, the question was not received with much sympathy.  

P. Chavel started with S.D. Smith's list of important devices to develop:  
logic; memory; sources; detectors; spatial light modulators; and micro-
optics.  He then reminded the audience of a few simple truths: the 
development of present-day computers cost a lot of money; most of the 
optical computing devices we have today are many orders of magnitude 
below the projected performance that makes them attractive; but a few 
very specialized systems of very high performance do exist already.  

Then he presented a little lecture on general principles, emphasizing the 
ability of optics to handle many parallel channels in a small volume but 
discussing the limits set by diffraction and the aberrations of lenses.  
He suggested that arrays of about a million pixels would be practical and 
useful for operations such as:  matrix/vector multiplication; Fourier 
transformation; and correlation.  These are important operations for:  
fixed-shape pattern recognition; symbolic substitution; matched 
filtering; and "understanding."  Apart from these basic principles he 
drew no general conclusions.  
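
As a concrete instance of the operations he listed, the following small
Python/NumPy sketch (my own, assuming nothing from Chavel's talk beyond
the operation itself) locates a fixed-shape target in an image by
correlation computed through the Fourier transform; an optical system
performs the transform with a lens, while electronics must compute it.

    import numpy as np

    def correlate(image, template):
        # Correlation theorem: corr(f, g) = IFFT( FFT(f) * conj(FFT(g)) ).
        F = np.fft.fft2(image)
        G = np.fft.fft2(template, s=image.shape)
        return np.real(np.fft.ifft2(F * np.conj(G)))

    img = np.zeros((1024, 1024))        # about a million pixels
    img[100:110, 200:210] = 1.0         # a small fixed-shape "target"
    tmpl = np.zeros_like(img)
    tmpl[0:10, 0:10] = 1.0              # the matched-filter template
    c = correlate(img, tmpl)
    print(np.unravel_index(np.argmax(c), c.shape))   # (100, 200): found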

M.E. Prise started with a careful discussion of the distinction between 
data links and interconnects.  He pointed out that future technology will 
erase the distinction, but at present it marks the line between 
commercial products and advanced research.  The former are the subjects 
of competition, the latter is a possible subject of international 
cooperation.  He asserted that there is an organizational problem because 
people who understand the technical end of the spectrum do not understand 
the commercial end.  Not very helpful to the purpose of the Workshop.  


INTEGRATED COMPUTING TRACK.-------------------------------------------

(Session) APPROACH TO INTEGRATED COMPUTING.
Planning for the Integrated Computing part of the NIPT initiative has been 
in the hands of three Sub-Working Groups, who reported their progress in this 
session.  

Theory and New Functions Sub-Working Group (Kawahara, NTT). The thrust of 
this presentation was the issue of how to put "intimate" machines into 
everyday life. That is, what kind of new functions are necessary for 
machines to cope with the real world, what kind of theories are necessary 
to formulate these new functions, and how can they be integrated? Again, 
the stress was on autonomous systems, heterogeneous information, 
intuitive information processing, illogical situations, flexible and 
natural interaction with humans and the environment, etc.  Kawahara did 
not present any new theories but only emphasized that new theories will be 
needed to deal with information representation, integration, evaluation, 
learning and self-organization. He did list several theoretical 
frameworks in which some of these new theories may arise, including the 
following.  

  - Probability
     Pattern recognition, Multivariate data analysis, Probabilistic Inference
  - Constraint satisfaction and regularization
     Neural computation, Approximation and Optimization theory
  - Modularization
  - Formal treatment of interacting autonomous systems
  - Physical and developmental algorithms
     Simulated annealing, Genetic algorithms, Immune system
  - Multiple paradigmatic processing, heuristics

He used visual and auditory processing, as well as robotics examples to 
illustrate some of the techniques that are now being used and their 
difficulties, such as inability to work under noisy conditions, inability 
to satisfy multiple constraints, fault intolerance, etc. He also 
described the transformation of ill-posed problems into minimization 
problems by introducing "subjective" constraints, in much the same way 
that regularization adds smoothness constraints (a small sketch of this 
idea follows the list below). Without going into any detail he also 
listed a number of wide ranging applications, including 

  - Music transcription system
  - Prosthesis of sensory-motor function
  - Autonomous cleaner and maintenance system
  - Alert system for social security
  - Mesoscopic scale simulator
  - Self-organizing database
  - Quantum mechanical computer.
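
To make the regularization remark concrete, here is a minimal sketch (my
own illustration, assuming Kawahara had in mind standard Tikhonov-style
regularization): an ill-posed linear inversion y = Ax is replaced by the
well-posed minimization of ||Ax - y||^2 + lambda*||Dx||^2, where the
penalty term expresses the "subjective" preference for smooth solutions.

    import numpy as np

    def regularized_solve(A, y, lam):
        n = A.shape[1]
        # D = first-difference operator; ||D x||^2 penalizes roughness.
        D = np.eye(n) - np.eye(n, k=1)
        # Solve the normal equations of the penalized least-squares problem.
        return np.linalg.solve(A.T @ A + lam * (D.T @ D), A.T @ y)

    rng = np.random.default_rng(0)
    A = rng.normal(size=(20, 50))            # 20 equations, 50 unknowns: ill-posed
    x_true = np.sin(np.linspace(0, 3, 50))   # smooth unknown signal
    y = A @ x_true + 0.01 * rng.normal(size=20)
    x_hat = regularized_solve(A, y, lam=1.0)
    print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))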

There was no concrete plan presented, nor specific details, and for me, 
the speech was far too vague to bite into. I think that most of the other 
foreign attendees had the same impression and as a result there was 
almost no discussion. One point I did note though was that an important 
aspect will be research in the area of very advanced human interfaces, 
including audio, visual, touch, smell, etc.  

Neural Systems Sub-Working Group (Okabe, U Tokyo). Neural networks (NN) are 
sufficiently well studied that it is possible to imagine the directions 
that research might take over the next ten years, and Okabe articulated 
several perfectly reasonable ones here. He pointed out that 
modularization of NNs is already taking place, with networks built in 
serial, and in parallel, and to a lesser extent, hierarchically. However, 
he felt that not nearly enough has been done on recurrent networks with 
multiple layers, or on learning algorithms for training collections of 
differently organized NNs. Similarly, learning algorithms, primarily with 
teacher signals, are common, but self organizing structures can be much 
more powerful. He suggested three specific research topics:
  - Inclusion of structural development process into conventional algorithms
  - Evolutional algorithms such as genetic or chaotic algorithms
  - Self-organization of structured neural networks.
He also gave one view of the system image of an NN front and back end to a 
massively parallel processor (MPP), which would be rule based and 
focused on symbolic processing. The MPP might be a heterogeneous 
combination of neural structures, including layered, circuit (randomly 
interconnected), completely connected, tree, and dynamically connected. 
The physical organization of such an MPP would be hierarchical: chips 5cm 
square (neurons) organized 64 to a board, boards connected via a 3x3 grid 
at the subsystem level and stacked on planes into a system, and systems 
connected together somehow. He called this a "Recursively Modular 
Architecture", and felt it would be a CMOS MPP Supercomputer. 

He claimed that a one million neuron system with 2 Tera updates 
(multiplication and addition) per second (2 TCUPS) is definitely within 
range. For example, Hitachi has already built via wafer scale integration 
(WSI) a board containing 8 wafers with 100 neurons per wafer. Okabe feels 
that a 1,000,000 neuron system can be built with 1995 technology, using 
2E07 transistors/chip, 6 inch wafers, and 60 chips per wafer. The 
neuron circuit would be completely digital, with a learning algorithm 
using back propagation and fully 8 bit input, output, and weights.  
(It was pointed out during several sessions, that for significant 
numerical computation it will be necessary to have at least 32 bit 
capability.) This assumes about 1.2E5 transistors per neuron. A one 
million neuron system would consist of 1000 subsystems each composed of 
1000 fully connected neurons. He even showed a slide of this entire one 
million neuron system on a single board 50x70cm, composed of two rows of 
50 WSI cards, each card 20x20cm containing a 6 inch wafer with 10,000 
neurons.  This part of the program is so much more detailed than the 
first that it will need no help to get going. Quite the opposite, one
gets the impression that it will occur independently of any massive
government push.
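
Okabe's numbers are internally consistent, which can be checked with a
few lines of arithmetic (the figures are his, the check is mine):

    transistors_per_chip   = 2.0e7
    chips_per_wafer        = 60
    transistors_per_neuron = 1.2e5

    neurons_per_wafer = (transistors_per_chip * chips_per_wafer
                         / transistors_per_neuron)
    print(neurons_per_wafer)            # 10000.0, the quoted 10,000/wafer

    cards = 2 * 50                      # two rows of 50 WSI cards
    print(cards * neurons_per_wafer)    # 1000000.0 neurons, as claimed
    # 2 TCUPS over 1e6 neurons would mean 2e6 updates/sec per neuron.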

Massively Parallel Systems Sub-Working Group (Oyanagi, Toshiba).
This talk was divided into four subtopics, Framework, Research Themes, 
Software, and Hardware. Framework meant robust, reliable, failsafe 
hardware and adaptable, self-organizing, optimizing, learning software. 
These are the same words we have heard before, and again at this level no 
specifics were given. In the context of research themes we do get somewhat 
more detail. He listed the following.
 Research Themes:
  Soft Model: Multi paradigm model integrating object oriented and data 
      flow models. 
  Soft Architecture: Reconfigurable, and integrated with a neural network.
  Soft Software: Resource management, load balancing, and a super 
      parallel language are necessary. 
  Soft Human Interface: Multi paradigm interface, and interactive 
      environment. 
  Devices: Wafer scale integration, optics, high density, cooling
  Processor: High speed, low power consumption
  Interconnection Network: High connectivity and reliability (may lead to  
      optical interconnection)
  Systems: Maintenance, debugging, integration with neural network

There were no further details given about software except that work needs 
to be done on parallel languages, and that the computational model is 
probably going to be a combination of object oriented programming and 
concurrent programming. 

Oyanagi felt that by the year 2000 we should expect 20,000,000 
transistors per chip on an 8 inch wafer. There would be 1E5 transistors per cell, 
200 cells per chip, 30 chips per wafer, 1000 wafers per stack and 16 
stacks per system. He estimated that four stacks could be built on a 
100x100cm board, and that a 1E8 cell system would generate about 160 kW, 
thus heat dissipation would be a significant problem. Nevertheless he 
felt that building a BILLION cell system would not be impossible. He then 
went on to describe a three dimensional implementation using printed 
circuits and VLSI (not WSI) that is being built by Matsushita: 

     500K transistors/cell
      32 cells/chip
     128 chips/board
      32 boards/module
       8 modules/system
This will give about 1,000,000 cells in one cubic meter. Power 
dissipation is around 320 kW, cooled by heat pipes. He concluded with a 
table describing two target systems, one for 1995 and another for 2000.

                   1995 (silicon)         2000
   Design Rule     0.3-0.5 mu-m           0.13-0.2 mu-m
   Integration     WSI or VLSI            WSI
   Cell            1E5 - 1E6              1E6 - 1E9
   Purpose         Testbed and            Integrated system
                    software development
   Environment     Optical network        Optical network
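
Here too the arithmetic can be checked (my own sketch; all figures as 
quoted above):

    # Check of Oyanagi's year-2000 stack arithmetic and the Matsushita
    # 3-D system, using the figures quoted above.
    cells = 200 * 30 * 1000 * 16    # cells/chip * chips/wafer * wafers/stack * stacks
    print(cells)                    # 96,000,000, i.e. about a 1E8 cell system

    matsushita = 32 * 128 * 32 * 8  # cells/chip * chips/board * boards/module * modules
    print(matsushita)               # 1,048,576, i.e. about one million cells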

Once again, the hardware issues seem very much clearer than the software
ones, and Oyanagi acknowledged this afterwards. Also, there was no
mention of the software and design issues related to reliability and
fault tolerance of such huge systems. 

(Session) OVERSEAS ACTIVITIES IN INTEGRATED COMPUTING.
My description of this session is deliberately brief. 

      H. Muehlenbein (Parallel Genetic Algorithms)
      GMD 
      Schloss Birlinghoven
      D-5205 Sankt Augustin 1

      G. Cottrell (Grounding Meaning in Perception)
      Computer Science and Engineering Dept
      University of California, San Diego
 
      G. Agha (Foundations for Building Massively Parallel Computers)
      Computer Science Department
      University of Illinois
      Urbana-Champaign

These speakers, like the rest of us, really did not know what integrated
computing was, and hence, appropriately, spoke about their individual
research activities. My own interest was most captured by Muehlenbein
who talked about early random search methods using evolutionary
principles in the 1960's. These were not influential, but new extensions
are based on "genetics", that is, the addition of some clever randomness
not to the search but to the capabilities of the searchers. This
approach appears to be much more powerful and also very suitable for
parallelization.  Muehlenbein claims that his algorithm beat three other
neural network algorithms at large Travelling Salesman Problems, and is
much faster than any other published algorithm on a benchmark ("beam")
Graph Partitioning Problem. He also claims that it is the fastest
general purpose unconstrained minimizer. I was not familiar with this
but it certainly caught the attention of the audience. A well written
survey of his work, along with a good bibliography, is in his paper 
"Parallel Genetic Algorithms and Combinatorial Optimization", to be 
published in the SIAM Journal on Optimization. Finally, it is worth 
noting that research in genetic algorithms is going on in the U.S. too. 
In fact, 
at the Naval Research Lab, John Grefenstette [gref@aic.nrl.navy.mil] has
also written a survey paper on a similar topic. 
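
For readers who have not seen one, the skeleton of a genetic algorithm is 
easy to state. The toy sketch below is mine, not Muehlenbein's parallel 
algorithm; it maximizes the number of 1 bits in a string using selection, 
crossover, and mutation.

    import random

    # Toy genetic algorithm (my sketch): maximize the count of 1 bits
    # in a string of length N.
    N, POP, GENS = 32, 40, 60

    def fitness(s):                    # objective: number of 1 bits
        return sum(s)

    def crossover(a, b):               # one-point crossover
        cut = random.randrange(1, N)
        return a[:cut] + b[cut:]

    def mutate(s, rate=0.01):          # flip each bit with small probability
        return [bit ^ (random.random() < rate) for bit in s]

    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP // 2]       # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP - len(parents))]
        pop = parents + children
    print(max(map(fitness, pop)))      # approaches N

The parallel versions distribute the population across processors, which 
is one reason the approach is so suitable for parallelization.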


(Evening Session) DISCUSSION OF FUTURE INFORMATION TECHNOLOGIES.
While this session was informal in the sense that numerous bottles of 
beer were available, the discussion consisted of descriptions by Hitachi, 
Fujitsu, and Toshiba of neural net research projects. Since these were 
essentially all hardware (although exceptionally interesting), the 
software people in the audience were hardly in a position to make any 
serious comments. 

Hitachi described a 2.3 GCUPS neuro system built with 8 (5 inch) wafers, 144 
neurons per wafer (30,000 transistors per neuron), using a 0.8 mu-m CMOS 
gate array. The system is 30x21x23cm and dissipates about 50 watts. The
interesting thing about this system, in addition to its speed (about 4
times faster than their previous system), is that the weights and
connections are
dynamically changeable and that the weight values can be full 16 bits.
The speed comes from a clever use of two separate busses, one each in
the input and output direction. Learning is implemented via back
propagation.  Several applications have already been programmed
including signature verification and stock prediction. This device was
announced formally three or four days before the Workshop. 
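
Back propagation comes up in nearly every system described here, so a 
textbook sketch may help readers who have not seen it. This is generic 
numpy code of my own for one hidden layer; it is NOT Hitachi's circuit, 
which works in fixed point with dual busses in hardware.

    import numpy as np

    # Generic back propagation, one hidden layer (my sketch).
    rng = np.random.default_rng(0)
    X = rng.random((4, 3)); T = rng.random((4, 2))   # toy inputs and targets
    W1 = rng.standard_normal((3, 5))                 # input-to-hidden weights
    W2 = rng.standard_normal((5, 2))                 # hidden-to-output weights
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(1000):
        H = sigmoid(X @ W1)            # forward pass
        Y = sigmoid(H @ W2)
        dY = (Y - T) * Y * (1 - Y)     # output-layer error signal
        dH = (dY @ W2.T) * H * (1 - H) # error propagated backwards
        W2 -= 0.5 * H.T @ dY           # gradient-descent weight updates
        W1 -= 0.5 * X.T @ dH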

Fujitsu gave an overview of their own neuro-computing research
activities. They have built or are working on three different systems
including a PC board, but the most interesting is Sandy, a collection of
256 Texas Instruments floating point digital signal processors (DSPs) on a
ring network. Each DSP is presently functioning as a single neuron, but
Fujitsu claims that the software can allow each to be four (or more)
neurons. Because of the DSPs, honest 32 bit floating point operations are
possible. Currently an 8 processor prototype is running. The 256
processor version will be capable of 6 GCUPS for back propagation.
Several applications of this were cited, including mobile robot control,
stock forecasting, convertible bond rating (this was demonstrated to me
during my last visit to the Fujitsu lab) and, more practically, a
process failure prediction system to be used during continuous casting
of steel, to determine "breakout time", the time at which the cooling
process
fails.  Fujitsu is also experimenting with combining a neural net with a
fuzzy reasoning interface.
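
Mapping several logical neurons onto one physical processor is a standard 
technique. Below is a rough sketch of my own (not Fujitsu's software) of 
how activation blocks might circulate on such a ring so that every 
processor eventually sees every activation.

    # Ring circulation sketch (mine, not Fujitsu's software): each of P
    # processors owns N/P logical neurons; activation blocks are passed
    # one hop per step, so after P-1 shifts every processor has seen all.
    P, NEURONS_PER_PROC = 8, 4        # the 8-processor prototype scale
    blocks = [["a%d_%d" % (p, i) for i in range(NEURONS_PER_PROC)]
              for p in range(P)]

    seen = [list(blocks[p]) for p in range(P)]  # own block first
    hold = list(blocks)
    for _ in range(P - 1):
        hold = [hold[(p - 1) % P] for p in range(P)]  # shift one hop
        for p in range(P):
            seen[p].extend(hold[p])
    assert all(len(s) == P * NEURONS_PER_PROC for s in seen)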

Toshiba described a 512 processor system organized as an 8x8x8 set of
cross bar switches representing the faces of a 3-cube. In other words,
any processor can communicate with any other in at most three hops.
No other details were given, and the current status of the project was
not made clear. It was also mentioned that work is continuing on a
Japanese word processor that uses a neural network to select kanji from
input kana.
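
The three hop bound follows from the organization itself: give each 
processor coordinates (x,y,z) in the 8x8x8 arrangement and correct one 
coordinate per hop. A small sketch (my reading of the description, not 
Toshiba's actual routing scheme):

    # Dimension-order routing on an 8x8x8 arrangement (my reading of
    # the description): each hop uses a crossbar to correct one
    # coordinate, so any route takes at most three hops.
    def route(src, dst):
        path, cur = [], list(src)
        for dim in range(3):           # fix x, then y, then z
            if cur[dim] != dst[dim]:
                cur[dim] = dst[dim]
                path.append(tuple(cur))
        return path                    # length <= 3

    print(route((0, 0, 0), (7, 5, 3)))  # [(7,0,0), (7,5,0), (7,5,3)]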

(Session) COMMENTS ON INTEGRATED COMPUTING PORTION OF NIPT.
This was the opportunity for the foreign speakers at the integrated
computing track to give some opinions about what they had heard. They
had been asked to do this in advance. 

Muehlenbein began by reminding the audience that many of the goals of
the 5th Generation Project are unfulfilled, perhaps because the
announced goals were "artificially definite". He felt that many EC
projects seemed better thought out, but that to be honest, in 10 years
Japanese industry had come much further than European. With respect to
the current NIPT proposal, he was happy to see the emphasis on the
theory component, which he felt was lacking in the 5th Generation
project, but also noted that the numerically intensive engineering
applications (such as fluid dynamics, etc.) were missing. This was a
theme that several of us commented on. He was happy to see a
pluralistic, multi-paradigm approach, and was certainly enthusiastic
about the possibility of international cooperation. He wondered, though,
why Japanese industry was interested, as similar interest is difficult to
generate in European industry. (It seems to me that government money is
a very good way to generate interest.) He emphasized what we all had
been saying, that the software component was almost totally missing, and
that this was not a project in the usual sense of targets or schedules.
Finally he remarked that hardware speed is not the issue, organization
(computational model) and software are more important.

Cottrell listed a number of neural networks that were not discussed,
such as multiplicative connections, oscillating networks, and shared
weights. (I had heard some discussion about studying oscillation earlier.)
He mentioned the relationship between neural networks and statistics,
and various other issues such as neuroscience and self organization. He
felt the Human Frontier Science Project was a good model for NIPT and
that cooperation should include support for students, post docs, and
exchange of researchers. He also asked why, if the goals of the program
were to help all of mankind, he had heard nothing about medicine, the
environment, or social applications. Personally, I thought that
the waters of this Workshop were muddy enough with the scientists who
participated, and to have included some of these others would have been
a disaster. 

Agha felt that NIPT should focus on massively parallel processing and
work on building basic principles and conceptual development. 

Shinoda (ETL) ventured a comment that he wanted to develop a massively
parallel processor, not a neural network. It is not clear how this will
evolve.

Bryant (Carnegie Mellon Univ and Fujitsu) gave a thoughtful description.
He felt that the theory portion (computational model and mathematical
understanding) was the most difficult to plan, but that substantial
progress should be made before any implementation begins. This progress
cannot be scheduled by MITI, would take years, and many models would
never make it. He felt that programming (languages and compilers) as
well as hardware and applications should come afterwards, and that
perhaps a good view was to develop implementations towards the end of
the decade.
In other words, if neural network theory began in earnest in 1980, its
implementations are only getting serious now. So we might imagine that
"soft logic" theory developed during NIPT would be more appropriate for
implementation during the next 10 year program.


FINAL COMMENT.
Hakone, the site of this Workshop, is a beautiful area with a large lake
and many mountains. The Workshop was held in a plush hotel with
wonderful facilities including hot Japanese baths. But Hakone is
difficult to reach from Tokyo. The Workshop sessions lasted from
breakfast until late at night. Unfortunately, as far as I could tell,
few of the foreign attendees had much spare time for sightseeing, and
some of them felt that it might have been more efficient to have the
Workshop in a less attractive but more accessible business hotel in
Tokyo.

----------------------- END OF REPORT --------------------------------