[comp.ai.neural-nets] Neuron Digest V7 #37

neuron-request@HPLMS2.HPL.HP.COM ("Neuron-Digest Moderator Peter Marvit") (06/26/91)

Neuron Digest   Tuesday, 25 Jun 1991
                Volume 7 : Issue 37

Today's Topics:
        Re: ANN and GA application to chaotic dynamical systems?
                         3D object recognition?
           Neural Network in handwritten character recognition
                     Neuroprose Turbulence Expected
                          hertz.refs.bib patch
                         Job Announcement - GTE
         POST-DOCTORAL VACANCY : Connectionism and Oral Dialogue
     IJCNN'91 Presidents' Forum (new announcement from Prof. Marks)
            On-line dialog data wanted (for ANN interpreter)
                  genetic algorithms + neural networks
                         ANNOUNCEMENT: NEW BOOK
        a new book; special issue on emergence; preprint available


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

----------------------------------------------------------------------

Subject: Re: ANN and GA application to chaotic dynamical systems?
From:    erwin%trwacs@uunet.UU.NET (Harry Erwin)
Date:    Wed, 12 Jun 91 08:13:07 -0400

MSANDRI%IVRUNIV.BITNET@ICINECA.CINECA.IT asks:
>Do you know applications of neural network, genetic algorithms and so on,
>to chaotic dynamical systems? I am very interested in such areas.

This is something of a vague question. You can get chaotic dynamics in
neural networks; I'll refer you to the NIPS proceedings for that material,
since I don't have much interest in that area. Neural networks are good at
tracking chaotic systems. See Lapedes, Alan S., and Farber, Robert M.,
1987. Nonlinear signal processing using neural networks: prediction and
system modelling. Technical Report LA-UR-87-2662, Los Alamos National
Laboratory. The Dynamics of Computation Group has done related work. See
Weigend, A. S., Huberman, B. A., and Rumelhart, D. E., 1990. Predicting
the future: a connectionist approach. Submitted to the International
Journal of Neural Systems. An open issue that I am investigating is how
the non-stationary statistics seen in dissipative chaos affect the
definition of training sets. Since any finite training set represents a
transient, it's not clear whether the resulting net will be able to track
the real process when it's evolving in some other region of its state
space.  I've also seen some stuff in the most recent NIPS proceedings
(III).
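
The Lapedes/Farber setup, one-step-ahead prediction of a chaotic map, is
easy to reproduce in miniature. The following is a hedged sketch (not
anyone's actual code): a tiny numpy network trained by batch gradient
descent to predict the logistic map x' = 4x(1-x) one step ahead. The
hidden size, learning rate, and step count are arbitrary choices for the
sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generate a chaotic trajectory of the logistic map x' = 4x(1-x).
x = np.empty(501)
x[0] = 0.3
for t in range(500):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

X, y = x[:-1, None], x[1:, None]  # one-step-ahead training pairs

# One-hidden-layer network: tanh hidden units, linear output.
H = 8
W1 = rng.normal(0.0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, p0 = forward(X)
mse_initial = float(np.mean((p0 - y) ** 2))

lr = 0.05
for step in range(5000):
    h, p = forward(X)
    err = (p - y) / len(X)               # d(MSE)/d(pred), up to a factor of 2
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = err @ W2.T * (1.0 - h ** 2)     # backprop through tanh
    gW1 = X.T @ dh; gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, p1 = forward(X)
mse_final = float(np.mean((p1 - y) ** 2))
print(round(mse_initial, 4), round(mse_final, 4))
```

The open issue above shows up here directly: the 500-point training set
is a single transient, so low training error says nothing about other
regions of the state space.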

On genetic algorithms--I know less about work in this area.  My son
(Jeremy A. Erwin) used a genetic algorithm to investigate the evolution
of the population strategy for a highly chaotic game related to the one
discussed in Glance, N. S., and Huberman, B. A., 1991. Dynamics of
expectations. Submitted to the Journal of Economic Behavior and
Organization. Jeremy did this work in 1987-88. He discovered that natural
selection eliminated temporarily inferior strategies so rapidly that the
chaos was damped out, and the population strategy fixated on the best
_available_ strategy in the initial set (and mild mutants).
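
The fixation effect described above can be reproduced in miniature. The
following is a hedged toy (no relation to the actual program): strategies
are numbers in [0,1], fitness is a hypothetical single-peaked payoff, and
fitness-proportional selection with mild Gaussian mutation drives the
population onto the best strategy available in the initial set, plus mild
mutants of it.

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(s):
    # Hypothetical single-peaked payoff; the true optimum is s = 0.7.
    return 1.0 - (s - 0.7) ** 2

# The initial strategy set deliberately excludes the optimum;
# 0.6 is the best _available_ strategy.
pop = rng.choice([0.1, 0.2, 0.4, 0.6], size=100)
mean_start = pop.mean()

for gen in range(200):
    f = fitness(pop)
    probs = f / f.sum()                      # fitness-proportional selection
    pop = rng.choice(pop, size=pop.size, p=probs)
    pop += rng.normal(0.0, 0.01, pop.size)   # mild mutation
    pop = np.clip(pop, 0.0, 1.0)

mean_end = pop.mean()
print(round(mean_start, 3), round(mean_end, 3), round(pop.std(), 3))
```

Inferior strategies are eliminated within a few generations, and the
population concentrates near the best initial strategy, with mutation
supplying only small local variation.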

I hope this helps.

 Harry Erwin    erwin@trwacs.fp.trw.com


------------------------------

Subject: 3D object recognition?
From:    ytan@caip.rutgers.edu (Yi Tan)
Date:    Mon, 17 Jun 91 00:50:26 -0400

I am seeking references to recent literature on model-based 3-D object
recognition using neural network approaches. Any related responses would
be appreciated.  Thanks in advance.

Yi Tan
ytan@caip.rutgers.edu


------------------------------

Subject: Neural Network in handwritten character recognition
From:    <EKANG%NTUVAX.BITNET@CUNYVM.CUNY.EDU>
Date:    Tue, 18 Jun 91 11:44:00 +0800


I am currently working on the application of neural networks to
handwritten alphanumeric character recognition. What is the current
status of research in this area? Has an effective NN model been
determined for this application?  I have read articles from IJCNN
proceedings that analysed and compared different NN architectures for
classifying handwritten digits. Has more been established besides these?
By 'effective' I mean good training time, recognition time, and
recognition accuracy.

Any feedback on models that have been conclusively researched or
implemented would be appreciated.

Thanks.

ekang@ntuvax.BITNET

------------------------------

Subject: Neuroprose Turbulence Expected
From:    Jordan B Pollack <pollack@cis.ohio-state.edu>
Date:    Tue, 18 Jun 91 11:28:35 -0400

Cheops, the pyramid machine upon which NEUROPROSE resides, will be
decommissioned. The Neuroprose archive will move, with luck, to a new
Sparcserver, also called Cheops, at the same IP address.  But between today
and July 1, all cis.ohio-state.edu systems (including email) will be
pretty wobbly, so expect delays.

Jordan Pollack                            Assistant Professor
CIS Dept/OSU                              Laboratory for AI Research
2036 Neil Ave                             Email: pollack@cis.ohio-state.edu
Columbus, OH 43210                        Phone: (614)292-4890 (then * to fax)



------------------------------

Subject: hertz.refs.bib patch
From:    Nici Schraudolph <schraudo@cs.UCSD.EDU>
Date:    Wed, 19 Jun 91 18:39:56 -0700

In adding the "HKP:" prefix to the citation keys in the BibTeX version of
the Hertz/Krogh/Palmer bibliography I forgot to modify the internal
cross-citations accordingly.  I've appended the necessary patch below; it
only involves three lines, but those who don't feel up to the task can
ftp the patched file (still called hertz.refs.bib.Z) from neuroprose.

My apologies for the inconvenience,

 - Nici Schraudolph.


Here's the patch:

*** hertz.refs.bib      Wed Jun 19 18:23:36 1991
***************
*** 73,80 ****
  @string{snowbird = "Neural Networks for Computing"}
  
  % -------------------------------- Books ---------------------------------
! @string{inAR = "Reprinted in \cite{Anderson88}"}
! @string{partinAR = "Partially reprinted in \cite{Anderson88}"}
  @string{pdp = "Parallel Distributed Processing"}
  
  % ------------------------------- Journals ---------------------------------
--- 73,80 ----
  @string{snowbird = "Neural Networks for Computing"}
  
  % -------------------------------- Books ---------------------------------
! @string{inAR = "Reprinted in \cite{HKP:Anderson88}"}
! @string{partinAR = "Partially reprinted in \cite{HKP:Anderson88}"}
  @string{pdp = "Parallel Distributed Processing"}
  
  % ------------------------------- Journals ---------------------------------
***************
*** 3500,3506 ****
          pages = "75--112",
        journal =  cogsci,
         volume =  9,
!          note = "Reprinted in \cite[chapter 5]{Rumelhart86a}",
           year =  1985
    }
  
--- 3500,3506 ----
          pages = "75--112",
        journal =  cogsci,
         volume =  9,
!          note = "Reprinted in \cite[chapter 5]{HKP:Rumelhart86a}",
           year =  1985
    }
  


------------------------------

Subject: Job Announcement - GTE
From:    Rich Sutton <rich@gte.com>
Date:    Thu, 20 Jun 91 10:24:53 -0400


The connectionist machine learning project at GTE Laboratories is looking
for a researcher in computational models of learning and adaptive
control.  Applications from highly qualified candidates are solicited.  A
demonstrated ability to perform and publish world-class research is
required.  The ideal candidate would also be interested in pursuing
applications of their research within GTE businesses.  GTE is a large
company with major businesses in local telephone operations, mobile
communications, lighting, precision materials, and government systems.
GTE Labs has had one of the largest machine learning research groups in
industry for about seven years.

A doctorate in Computer Science, Computer Engineering or Mathematics is
required.  A demonstrated ability to communicate effectively in writing
and in technical and business presentations is also required.

Please send resumes and correspondence to:

June Pierce
GTE Labs MS-44
40 Sylvan Road
Waltham, MA 02254
USA


------------------------------

Subject: POST-DOCTORAL VACANCY : Connectionism and Oral Dialogue
From:    Haffner Patrick <haffner@lannion.cnet.fr>
Date:    21 Jun 91 17:36:19 +0200

Applications are invited for research assistantship(s) for post-doctoral
or sabbatical candidates. Funding at the French National
Telecommunications Research Centre (Centre National d'Etudes des
Telecommunications, CNET) will commence in September '91 for a two-year
period; the work location will be Lannion, Brittany, France. Experience
is required in Natural Language Processing, especially Oral Dialogue
Processing, by Connectionist methods. Applicants should specify the
period between Sept '91 and Sept '93 which interests them.  Applications,
including CV/Resume, should be sent to:

Mme Christel Sorin

CNET LAA/TSS/RCP
BP 40
 22301 LANNION CEDEX
FRANCE

TEL : +33 96-05-31-40
FAX : +33 96-05-35-30
E-MAIL : sorin@lannion.cnet.fr




------------------------------

Subject: IJCNN'91 Presidents' Forum (new announcement from Prof. Marks)
From:    " J. N. Hwang" <hwang@pierce.ee.washington.edu>
Date:    Fri, 21 Jun 91 11:54:47 -0700


News release
IEEE NEURAL NETWORKS COUNCIL IS SPONSORING A PRESIDENTS' FORUM AT
IJCNN `91 IN SEATTLE, WASHINGTON

Robert J. Marks II, Professor at the University of Washington and
President of the IEEE Neural Networks Council (NNC), has announced that
for the first time the IEEE/NNC will be sponsoring a Presidents' Forum
during IJCNN `91 in Seattle, Washington, July 8-12, 1991.

The participants of the Presidents' Forum will be the Presidents of the
major artificial neural network societies of the world, including the
China Neural Networks Committee, the Joint European Neural Network
Initiative, the Japanese Neural Networks Society and the Russian Neural
Networks Society.  The Forum will be open to conference attendees and the
press on Wednesday evening, 6:30-8:30 pm, July 10, 1991, at the
Washington State Convention Center in Seattle.  Each President will give
a short (15-20 minute) presentation of the activities of their society,
followed by a short question/answer period.  Robert J. Marks II will be
this year's moderator.



------------------------------

Subject: On-line dialog data wanted (for ANN interpreter)
From:    Kingsley Morse <kingsley@hpwrce.mayfield.hp.com>
Date:    Fri, 21 Jun 91 16:54:04 -0700

        I've developed an algorithm which learns by example, and I'd like
to train it to translate a conversation between two people. So, I'm
looking for a text file that contains a dialog between two people, with
everything each person says in two languages. See Appendix A for an
example. If you don't have such a file, can you direct me to someone who
does?

        Also, I'm looking for a file that contains a dialog between two
people, but with the entire conversation in English. Again, if you don't
have such a file, can you direct me to someone who does?

        Failing the above, does anyone know how to subscribe to the
mailing list called "Linguist"? (I would like to ask for help there
also.)

        Also, I've seen similar work by Eiichiro Sumita at Kyoto University
in Japan, and Hitoshi Iida at the ATR Interpreting Telephony Research
Labs.  Does anyone have their email addresses? (I would like to ask them
for a copy of ATR's linguistic database of spoken Japanese with English
translations.)

        Finally, does anyone have email addresses for anyone at:

                     ATR Interpreting Telephony Research Labs

                                       or

            Center for Machine Translation at Carnegie Mellon University

Thanks,

Kingsley Morse Jr.
kingsley@hpwrce.hp.com

- ------------------------------------------------------------------------------
                        Appendix A.

        This is a sample dialog translated between English and Filipino
(Tagalog). Any pair of languages will do, but I would prefer that one of
them be English. Ideally the file will be from 10,000 to 100,000 bytes long.

Speaker # 1 in English: "Hello".

Speaker # 1 in Filipino : "Kumusta."


Speaker # 2 in Filipino: "Anong oras na?"

Speaker # 2 in English: "What time is it?"


Speaker # 1 in English: "Noon".

Speaker # 1 in Filipino: "Tanghali"




------------------------------

Subject: genetic algorithms + neural networks
From:    Bernd Rosauer <rosauer@ira.uka.de>
Date:    Tue, 25 Jun 91 14:29:18 +0700


I am interested in any kind of combination of genetic algorithms and
neural network training. I am aware of the papers presented at

        * Connectionist Models Summer School, 1990
        * First International Workshop on Parallel Problem Solving from
          Nature, 1990
        * Third International Conference on Genetic Algorithms, 1989
        * Advances in Neural Information Processing Systems 2, 1989.

Please let me know if there is any further work on this topic.  Reply to
<rosauer@ira.uka.de> and I will summarize here.

Thanks a lot
                        
               Bernd


------------------------------

Subject: ANNOUNCEMENT: NEW BOOK
From:    CORTEX@buenga.bu.edu
Date:    Thu, 30 May 91 13:01:00 -0400

           ____________________________________________
           COMPUTATIONAL-NEUROSCIENCE BOOK ANNOUNCEMENT
           --------------------------------------------

                 FROM THE RETINA TO THE NEOCORTEX
                  SELECTED PAPERS OF DAVID MARR
                   Edited by Lucia Vaina (1991)

                  Distributor: Birkhauser Boston
________________________________________________________________________
ISBN 0-8176-3472-X
ISBN 3-7643-3472-X
Cost: $49
For more information: DORAN@SPINT.Compuserve.COM 
To order the book: call George Adelman at Birkhauser: (617) 876-2333.
________________________________________________________________________

MAIN CONTENTS: the book contains papers by David Marr, which are placed
in the framework of current computational neuroscience by leaders in each
of the subfields represented by these papers.

(1) Early Papers 

           1. A Theory of Cerebellar Cortex [1969]
              with commentary by Thomas Thach

           2. How the Cerebellum May be Used (with S. Blomfield) [1970]
              with commentary by Jack D. Cowan

           3. Simple Memory: A Theory of Archicortex [1971]
              with commentaries by David Willshaw & Bruce McNaughton

           4. A Theory of Cerebral Neocortex [1970]
              with commentary by Jack D. Cowan

           5. A Computation of Lightness by the Primate Retina [1974]
              with commentary by Norberto M. Grzywacz


(2) Binocular Depth Perception

           6. A Note on the Computation of Binocular Disparity in a 
              Symbolic, Low-Level Visual Processor [1974]     

           7. Cooperative Computation of Stereo Disparity 
              (with T.Poggio) [1976]

           8. Analysis of a Cooperative Stereo Algorithm 
              (with G.Palm, T.Poggio) [1978]

           9. A Computational Theory of Human Stereo Vision 
              (with T. Poggio) [1979]
              with commentary on Binocular Depth Perception by
              Ellen C. Hildreth and W. Eric L. Grimson   


(3) David Marr: A Pioneer in Computational Neuroscience
                by Terrence J. Sejnowski

(4) Epilogue: Remembering David Marr
              by former students and  colleagues: Peter Rado, Tony Pay,
              G.S. Brindley, Benjamin Kaminer, Francis H. Crick,
              Whitman Richards, Tommy Poggio, Shimon Ullman,
              Ellen Hildreth.


------------------------------

Subject: a new book; special issue on emergence; preprint available
From:    Kampis Gyorgy <h1201kam@ella.hu>
Date:    Sun, 16 Jun 91 13:05:00



                     ANNOUNCEMENTS

****************************************************************

1. a new book
2. a Special Issue on emergence
3. preprint available

****************************************************************
1. the book

                                George Kampis   

        SELF-MODIFYING SYSTEMS IN BIOLOGY AND COGNITIVE SCIENCE:
        a New Framework for Dynamics, Information and Complexity

      Pergamon, Oxford-New York, March 1991, 546pp with 96 Figures


About the book:

The main theme of the book is the possibility of generating information
by recursive self-modification and self-redefinition in systems.

The book offers technical discussions of a variety of systems (Turing
machines, input-output systems, synergetic systems, connectionist
networks, nonlinear dynamic systems, etc.) to contrast them with the
systems capable of self-modification.

What the book characterizes as 'simple systems' involve a fixed
definition of their internal modes of operation, with variables, parts,
categories, etc. held invariant. Such systems can be represented by single
schemes, like the computational models above. A relevant observation
concerning model schemes is that any scheme grasps but one facet of
material structure, and hence to every model there belongs a complexity
excluded by it.  In other words, to every simple system there belongs an
implicit complex one.

Self-modifying systems are 'complex' in the sense that the author
characterizes them as capable of accessing an implicate material
complexity and turning it into the information-carrying variables of a
process. An example of such a system would be a tape recorder which
spontaneously accesses new modes of information processing (e.g. bits
represented as knots on the tape). A thesis discussed in the book is that,
unlike current technical systems, many natural systems know how to do
that trick, and make it their principle of functioning.

The book develops the mathematics, philosophy and methodology for dealing
with such systems, and explains how they work. A constructive theory of
models is offered, with which the modeling of systems can be examined in
terms of algorithmic information theory. This makes possible a novel
treatment of various old issues like causation and determinism, symbolic
and nonsymbolic systems, the origin of system complexity, and, finally,
the notion of information. The book introduces technical concepts such as
information sets, encoding languages, material implications, supports,
and reading frames, to develop these topics, and a class of systems
called 'component-systems' to give examples of self-modifying systems.
As an application, the book discusses how the latter can be used to
understand aspects of evolution and cognition.


From the Foreword by John Casti:

        In this thought-provoking volume, George Kampis argues
(among other things) that the Turing-Church Thesis is false, at least for
the kinds of physical systems that concern developmental biologists,
cognitive scientists, economists, and others of that ilk. [...]

        This book represents an exciting point of departure from ho-hum
traditional works on the philosophy of modeling, especially
noteworthy being the fact that instead of offering mere complaints
against the status quo, Kampis also provides a paradigm holding out the
promise of including both the classical systems of the physicist and
engineer and the neoclassical processes of the biologist and psychologist
under a single umbrella. As such, the ideas in this pioneering book merit
the attention of all philosophers and scientists concerned with the way
we create reality in our mathematical representations of the world and
the connection those representations have with the way things "truly are".


How to order if interested:

Order from              Pergamon Press plc, Headington Hill Hall, 
                    Oxford OX3 0BW, England or a local Pergamon office

        ISBN 0-08-036979-0     100 USD / 50 pounds sterling

Hotline Service:        USA (800) 257 5755
                             elsewhere (+44) 865 743685 
                                   FAX (+44) 865 743946
****************************************************************

2. Forthcoming: A SPECIAL ISSUE ON EMERGENCE AND CREATIVITY


It's a Special Issue of 
**************************************************************
*    World Futures: The Journal of General Evolution         *
*          (Gordon & Breach), to appear August 1991          *
*                                                            *
*    Guest Editor:  G. Kampis                                *
*                                                            *
* Title:     Creative Evolution in Nature, Mind and Society  *
**************************************************************

Individual copies will hopefully be available at a special rate
(under negotiation).

List of contents:

Kampis, G.              Foreword
Rustler, E.O.           "On Bodyism" (Report 8-80 hta 372)
Salthe, S.              Varieties of Emergence
Csanyi, V.              Societal Creativity
Kampis, G.              Emergent Computations, Life and Cognition
Cariani, P.             Adaptivity and Emergence in Organisms
                            and Devices
Fernandez,J., Moreno,A. 
and Etxeberria, A.      Life as Emergence
Heylighen, F.           Modelling Emergence
Tsuda, I.               Chaotic Itinerancy as a Dynamical Basis  
                            for Hermeneutics in Brain and Mind
Requardt, M.            Godel, Turing, Chaitin and the Question 
                            of Emergence as a Meta-Principle of
                            Modern Physics. Some Arguments       
                            Against Reductionism

****************************************************************

3. preprint from the Special Issue on Emergence


            EMERGENT COMPUTATIONS, LIFE AND COGNITION

                      by George Kampis

          Evolutionary Systems Group, Dept. of Ethology
             L. Eotvos University of Budapest, Hungary

                                          and

             Department of Theoretical Chemistry
        University of Tubingen, D-7400 Tubingen, FRG


ABSTRACT        This is a non-technical text discussing general ideas of
information generation. A model for emergent processes is given.
Emergence is described in accordance with A.N. Whitehead's theory of
'process'. The role of distinctions and variable/observable definitions
is discussed. As applications, parallel computations, evolution, and
'component-systems' are discussed with respect to their ability to
realize emergence.

KEYWORDS: emergence, distinctions, modeling, information set, evolution,
cognitive science, theory of computations.

AVAILABLE from the author at

                        h1201kam@ella.hu       or
                        h1201kam@ella.uucp

or by mail at H-1122 Budapest Maros u. 27.



------------------------------

End of Neuron Digest [Volume 7 Issue 37]
****************************************