[comp.ai.neural-nets] NEURON Digest - V2 / #27

NEURON-Request@ti-csl.csc.ti.COM (NEURON-Digest moderator Michael Gately) (11/05/87)

NEURON Digest	Thu Nov  5 08:19:28 CST 1987 - Volume 2 / Issue 27
Today's Topics:

 need info
 Books WANTED.
 Sixth-Generation Computing
 Product Review
 Time-averaging of neural events/firings
 Shift-Invariant Neural Nets for Speech Recognition
 Names for Neuromorphic Systems
 Contest to name etc... (several)
 Workshop on Neural Computers
 Neural Networks Applied Optics Issue

----------------------------------------------------------------------

Date: 28 Oct 87 14:50:39 GMT
From: Duc Tran <dgis!duc@lll-tis.arpa>
Subject: need info
 
HELP !! HELP !!!
I need to get a hold of the following conference proceedings:
 
	Conference on Neural Networks for Computing,
	Snowbird, Utah (1986)
 
If anybody can provide pointers to where I can get it, I would
appreciate it very much!
 
 
Duc Tran
duc@dgis
uunet!dgis!duc
tel: 703-998-4647

------------------------------

Date: 28 Oct 87 20:38:28 GMT
From: John Eckrich <astroatc!johne@speedy.wisc.edu>
Subject: Books WANTED.
 
I am relatively new to neural networks/architectures and am interested in
learning more.  I would greatly appreciate any assistance you could provide
in helping me in this endeavor.  If you know of some good books, articles, or
journals, etc. please send me some E-mail.
 
10Q in advance.
 
-------------------------------------------------------------------------
Jonathan Eckrich                 |   (rutgers, ames)!uwvax!astroatc!johne 
Astronautics Technology Center   |   ihnp4!nicmad!astroatc!johne
Madison, WI                      |   (608) 221-9001

------------------------------

Date: 3 Nov 87 15:13:25 PST (Tuesday)
Subject: Sixth-Generation Computing
From: Michael_R._Emran.OsbuSouth@xerox.com
 
Can anybody out there direct me to up-to-date information about Sixth-
Generation Computing and Japanese progress since the 1984 proposals?
I have a presentation next week for my Expert Systems course.
Any comments, information, or leads to articles are appreciated in
advance.
Mike  

------------------------------

Date: Wed, 28 Oct 87 12:53:26 CST
From: mnorton@rca.com
Subject: product review
 
 
I recently saw a presentation by Nestor on their neural-network-based
recognition systems.  Fellow readers might be interested to know that
they claim to have fielded 12 of their handwriting recognition systems
and that, to the best of their knowledge (and mine), this is the first
commercial application of a neural network.
 
The presentation included a demonstration of handwriting recognition on
a Toshiba laptop.  Future products will include object recognition from
photographs (target recognition from aerial photography) and 3-D solid
recognition.  I suspect they are far ahead of any competition in terms
of producing net-based products.
 
The network model they use is a feedforward, three-layer, perceptron-like
network which they call RCE (Reduced Coulomb Energy).  It was mentioned
that a paper might be included in the March 1988 issue of IEEE Computer
(Special Issue on Neural Networks) which describes their model formally.
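 
For readers new to this class of model, here is a minimal sketch of an
RCE-style classifier in Python/numpy.  It is my reconstruction of the
commonly published training rule, not Nestor's implementation (which is
proprietary): each hidden unit is a stored prototype carrying a class
label and a threshold radius; a wrong-class unit that fires on a training
pattern has its radius shrunk, and a pattern covered by no correct-class
unit commits a new prototype.  The Euclidean metric, the parameters
r_init and eps, and the voting rule are illustrative assumptions.
 
    import numpy as np
 
    class RCEClassifier:
        """Sketch of an RCE-style prototype classifier (assumed details)."""
 
        def __init__(self, r_init=1.0, eps=1e-3):
            self.protos = []   # prototype centers (hidden units)
            self.radii = []    # each unit's region of influence
            self.labels = []   # class label per unit
            self.r_init, self.eps = r_init, eps
 
        def _dists(self, x):
            return np.array([np.linalg.norm(x - p) for p in self.protos])
 
        def train_one(self, x, y):
            x = np.asarray(x, dtype=float)
            if self.protos:
                d = self._dists(x)
                fired = d < np.asarray(self.radii)
                for j in np.flatnonzero(fired):
                    if self.labels[j] != y:
                        # Shrink a wrong-class unit so it no longer covers x.
                        self.radii[j] = d[j] - self.eps
                if any(self.labels[j] == y for j in np.flatnonzero(fired)):
                    return                  # x is already covered correctly
            # No correct-class unit covers x: commit a new prototype at x.
            self.protos.append(x)
            self.radii.append(self.r_init)
            self.labels.append(y)
 
        def predict(self, x):
            d = self._dists(np.asarray(x, dtype=float))
            active = np.flatnonzero(d < np.asarray(self.radii))
            if active.size == 0:            # no unit fires: use the nearest
                return self.labels[int(d.argmin())]
            votes = [self.labels[j] for j in active]
            return max(set(votes), key=votes.count)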
 
Mark J. Norton
RCA Advanced Technology Laboratories, AI Lab
mnorton%henry@RCA.COM

------------------------------

Date: Thu, 29 Oct 87 15:53:23 est
From: Michael Cohen <mike@bucasb.bu.edu>
Subject: Time-averaging of neural events/firings
 
We never did time-averaging of neural events here at the Center
for Adaptive Systems.
Our architectures are far more general.  You should
look at a general bibliography.
Michael Cohen ---- Center for Adaptive Systems
Boston University (617-353-7857)
Email: mike@bucasb.bu.edu
Smail: Michael Cohen
       Center for Adaptive Systems
       Department of Mathematics, Boston University	
       111 Cummington Street
       Boston, Mass 02215

------------------------------

Date: Fri, 30 Oct 87 20:31:32+0900
From: kddlab!atr-la.atr.junet!waibel@uunet.UU.NET (Alex Waibel)
Subject: Shift-Invariant Neural Nets for Speech Recognition
 
A few weeks ago there was a discussion on AI-list about connectionist
(neural) networks being afflicted by an inability to handle shifted patterns.
Indeed, shift-invariance is of critical importance to applications such as
speech recognition.  Without it, a speech recognition system has to rely
on precise segmentation, and in practice reliable, error-free segmentation
cannot be achieved.  For this reason, methods such as dynamic time warping
and now Hidden Markov Models have been very successful and have achieved high
recognition performance.  Standard neural nets have done well in speech
so far, but due to this lack of shift-invariance (as discussed on AI-list)
a number of these nets have been limping along in comparison to these other
techniques.
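 
To make the point concrete, here is a toy numpy illustration (mine, not
from the AI-list discussion): a net that assigns separate weights to each
time frame produces an unrelated response when the pattern shifts, while
a net that reuses the same weights at every frame produces a response
that merely shifts along with the input.
 
    import numpy as np
 
    rng = np.random.default_rng(0)
    x = np.zeros(10)
    x[2:5] = [1.0, 2.0, 1.0]          # a short pattern starting at frame 2
    x_shifted = np.roll(x, 3)         # the same pattern, three frames later
 
    # Position-specific weights: the response changes completely.
    w_dense = rng.normal(size=10)
    print(w_dense @ x, w_dense @ x_shifted)   # two unrelated numbers
 
    # Weights reused at every frame (a time-delay arrangement): the
    # response merely shifts, so its peak value is unchanged.
    w_shared = rng.normal(size=3)
    r = np.convolve(x, w_shared, mode='valid')
    r_shifted = np.convolve(x_shifted, w_shared, mode='valid')
    print(r.max(), r_shifted.max())           # identical peak response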
 
Recently, we have implemented a time-delay neural network (TDNN) here at
ATR, Japan, and demonstrated that it is shift-invariant.  We have applied
it to speech and compared it to the best of our Hidden Markov Models.  The
results show that its error rate is about four times lower than that of the
best of our Hidden Markov Models.
The abstract of our report follows:
 
	      Phoneme Recognition  Using Time-Delay Neural Networks
 
	      A. Waibel, T. Hanazawa, G. Hinton^, K. Shikano, K. Lang*
		 ATR Interpreting Telephony Research Laboratories
 
				Abstract
 
	In this paper we present a Time Delay Neural Network (TDNN) approach
	to phoneme recognition which is characterized by two important
	properties: 1.) Using a 3 layer arrangement of simple computing
	units, a hierarchy can be constructed that allows for the formation
	of arbitrary nonlinear decision surfaces.  The TDNN learns these
	decision surfaces automatically using error backpropagation.
	2.) The time-delay arrangement enables the network to discover
	acoustic-phonetic features and the temporal relationships between
	them independent of position in time and hence not blurred by
	temporal shifts in the input.
 
	As a recognition task, the speaker-dependent recognition of the
	phonemes "B", "D", and "G" in varying phonetic contexts was chosen.
	For comparison, several discrete Hidden Markov Models (HMM) were
	trained to perform the same task.  Performance evaluation over 1946
	testing tokens from three speakers showed that the TDNN achieves a
	recognition rate of 98.5 % correct while the rate obtained by the
	best of our HMMs was only 93.7 %.  Closer inspection reveals that
	the network "invented" well-known acoustic-phonetic features (e.g.,
	F2-rise, F2-fall, vowel-onset) as useful abstractions.  It also
	developed alternate internal representations to link different
	acoustic realizations to the same concept.
 
^ University of Toronto
* Carnegie-Mellon University
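 
The core mechanism is easy to state in code.  Below is a minimal
forward-pass sketch of a time-delay layer in Python/numpy, reconstructed
from the abstract above; it is not the authors' code, and it omits the
backpropagation training and the particular unit counts and nonlinearities
of their network.  Because the same weights are applied at every time step
and the final decision sums evidence over all frames, a temporal shift of
the input shifts the hidden activations but leaves the decision unchanged.
 
    import numpy as np
 
    def tdnn_layer(x, W, b, n_delays):
        """One time-delay layer (assumed shapes).
        x: (T, F) array of T frames with F coefficients each.
        W: (n_delays + 1, F, H) weights over current + delayed frames.
        b: (H,) biases.  Returns (T - n_delays, H) activations."""
        T = x.shape[0]
        out = np.empty((T - n_delays, b.shape[0]))
        for t in range(T - n_delays):
            window = x[t:t + n_delays + 1]    # current + delayed frames
            out[t] = np.tanh(np.einsum('df,dfh->h', window, W) + b)
        return out
 
    def classify(x, layers, V, c):
        """Stack time-delay layers, then integrate evidence over time."""
        for W, b, d in layers:
            x = tdnn_layer(x, W, b, d)
        frame_scores = x @ V + c                  # per-frame class evidence
        return frame_scores.sum(axis=0).argmax()  # position-independent vote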
 
For copies please write or contact:
Dr. Alex Waibel
ATR Interpreting Telephony Research Laboratories
Twin 21 MID Tower, 2-1-61 Shiromi, Higashi-ku
Osaka, 540, Japan
phone: +81-6-949-1830
Please send Email to my net-address at Carnegie-Mellon University:
						  ahw@CAD.CS.CMU.EDU

------------------------------

Date: Fri 30 Oct 87 09:17:25-PST
From: Ken Laws <LAWS@iu.ai.sri.com>
Subject: Names for Neuromorphic Systems
 
I'll vote for adaptive networks.  I'm not sure that fits constraint
relaxation via Hopfield networks or hill climbing with stochastic
annealing, but it fits better than any of the other suggested terms.
(I'm in the camp that sees no relation to simulation of neurons, other
than the coincidence that biological neural networks have some
capabilities that we would like to understand and then surpass.)
 
					-- Ken

------------------------------

Date: Fri, 30 Oct 87 11:10:09 CST
From: im4u!rutgers!m.cs.uiuc.edu!matheus (Chris J. Matheus)
Subject: Re:  NEURON Digest - V2 / #26
 
This summer I picked up the following name for computer simulated
neural networks:
 
                   "Artificial Neural Systems"
 
Unfortunately, I cannot identify the originator of the term.  I simply
recall hearing it used in a few presentations and reading it occasionally
in papers.  Other than being a bit long to say (it can be shortened to
ANS's: "anzes"), the name seems appropriate in the way it captures the
general flavor of this field of research.  But this matter is not going
to be decided by a simple vote.  Rather, it will depend upon what
name(s) end up being adopted in the literature and accepted by the
scientific community at large.
 
------------------------------------------------------------------------------
Christopher J. Matheus        usenet: {ihnp4, convex, philabs}!uiucdcs!matheus
Inductive Learning Group      arpa:   matheus@a.cs.uiuc.edu
University of Illinois        csnet:  matheus@uiuc.csnet
Urbana, IL   61801            phone:  (217) 333-3965
------------------------------------------------------------------------------

------------------------------

Date: Fri, 30 Oct 87 11:33:39 PST
From: Dr Josef Skrzypek <skrzypek@cs.ucla.edu>
Subject: Contest
 
 
How about
 
	NEURONIA -- field of euphoric neuro-builders

------------------------------

Date: Fri, 30 Oct 87 21:41:59 PST
From: Dr Jacques J Vidal <vidal@cs.ucla.edu>
Subject: Contest to name etc...
 
I have used  - Neuromimetic Systems
             - Neuromimetic Networks
to designate artificial neural nets,
plus "Neuromimetics" and, in French, "Neuroinformatique"
("Neuroinformatics"?) to refer to the whole field.
However, "Artificial Neural Networks" (ANNs) seems OK and should
appease the neuron-modeling purists.
 
PDP should be avoided.  It applies just as well to models of
computation that have almost no neuronal flavor.

------------------------------

Date: 4 November 87, 11:05 CET
From: ECKMILLE%DD0RUD81.BITNET@wiscvm.wisc.edu
Subject: Workshop on Neural Computers
 
 
            Dear Fellow Scientist,
 
            The proceedings of the NATO Workshop (ARW) on NEURAL COMPUTERS
            in Neuss/Duesseldorf, 28 September - 2 October 1987,
            will be published as:
                                 NEURAL COMPUTERS
 
                                 R. Eckmiller and C. v.d. Malsburg (eds.)
                                 at Springer-Verlag, Heidelberg
 
            The book will be distributed in January 1988.
            During the pre-publication sale you have the opportunity to order
            one or more copies for only $25 (25 US dollars) if you send me
            the exact mailing address and a check before 10 December 1987.
            The official price as of January 1988 will be about $100.
            Please note that this book includes an author index, a subject
            index, and a collection of references from all contributions.
            The list of contributors and the table of contents are enclosed
            for your information.
                         Sincerely yours,
                                         Rolf Eckmiller, Ph.D.
                                         Department of Biophysics
                                         Universitaetsstr.1
                                         D-4000 Duesseldorf, FRG
                                         Tel.(211)311-4540
 
 
 
cut here-------------------and mail this slip--------------------------------
 
 
            I order _____ copies of the book NEURAL COMPUTERS for $ 25 each
 
            during the pre-publication sale (payment before 10 December 87).
 
            Please send the copies upon delivery (Jan.1988) to the following
 
 
            address:________________________________________________________
 
 
            ________________________________________________________________
 
            I enclose a check for ___US $ or transfer ___DM (corresponding to
 
            ___US $) to your account: No. 626 171 at SPARDA Bank Wuppertal,
 
            (Account Holder: Rolf Eckmiller, Ph.D.), Bankleitzahl 330 605 92.
 
 
            Signature_______________________        Date_______________1987
 
  
            NEURAL COMPUTERS
            R. Eckmiller and C. v.d. Malsburg, eds.
            Springer-Verlag, Heidelberg (January 1988)
 
 
            LIST OF CONTRIBUTORS
 
            Akers, Lex A. (USA)
            Aleksander, Igor (UK)
            de Almeida, Luis B. (PORTUGAL)
            Anderson, Dana Z. (USA)
            Anninos, Photios (GREECE)
            Arbib, Michael A. (USA)
            Atlan, H. (ISRAEL)
            Barhen, J. (USA)
            Beroule, Dominique (FRANCE)
            Berthoz, Alain (FRANCE)
            Bienenstock, Elie (FRANCE)
            Bilbro, G.L. (USA)
            Buhmann, J. (W.GERMANY)
            Caianiello, Eduardo R. (ITALY)
            Carnevali, P. (ITALY)
            Cotterill, Rodney M. J. (DENMARK)
            Daunicht, Wolfgang (W.GERMANY)
            Dress, William (USA)
            Dreyfus, Gerard (FRANCE)
            Droulez, J. (FRANCE)
            Eckmiller, Rolf (W.GERMANY)
            Feldman, Jerome A. (USA)
            Ferry, D.K. (USA)
            Fukushima, Kunihiko (JAPAN)
            Gardner, E. (UK)
            Garth, Simon (UK)
            Ginosar, R. (ISRAEL)
            Graf, H.P. (USA)
            Grondin, R.O. (USA)
            Gulyas, B. (BELGIUM)
            Guyon, I. (FRANCE)
            Hancock, P.J.B. (UK)
            Hartmann, Georg (W.GERMANY)
            Hecht-Nielsen, Robert (USA)
            Hertz, John (DENMARK)
            Hoffmann, Klaus-Peter (W.GERMANY)
            Huberman, Bernardo A. (USA)
            Iverson, L. (CANADA)
            Jorgensen, C.C. (USA)
            Koch, Christof (USA)
            Koenderink, Jan J. (NETHERLANDS)
            Kohonen, Teuvo (FINLAND)
            Korn, Axel (W.GERMANY)
            Mackie, Stuart (USA)
            Mallot, Hanspeter (W.GERMANY)
            v. d. Malsburg, Christoph (W.GERMANY)
            Marinaro, M. (ITALY)
            May, David (UK)
            Moller, P. (DENMARK)
            Moore, Will R. (UK)
            Negrini, R. (ITALY)
            Nylen, M. (DENMARK)
            Orban, Guy (BELGIUM)
            Palm, Guenther (W.GERMANY)
            Patarnello, Stefano (ITALY)
            Pellionisz, Andras J. (USA)
            Personnaz, L. (FRANCE)
            Phillips, William A. (UK)
            Reece, M. (UK)
            Ritter, Helge (W.GERMANY)
            Sami, M.G. (ITALY)
            Scarabottolo, N. (ITALY)
            Schulten, Klaus (W.GERMANY)
            Schwartz, D.B. (USA)
            v. Seelen, Werner (W.GERMANY)
            Sejnowski, Terrence J. (USA)
            Shepherd, Roger (UK)
            Singer, Wolf (W.GERMANY)
            Smith, L.S. (UK)
            Snyder, Wesley (USA)
            Stefanelli, Renato (ITALY)
            Stroud, N. (UK)
            Tagliaferri, R. (ITALY)
            Torras, Carme (SPAIN)
            Treleaven, Philip (UK)
            Walker, M.R. (USA)
            Wallace, David J. (UK)
            Weisbuch, Gerard (FRANCE)
            White, Mark (USA)
            Willson, N.J. (UK)
            Zeevi, Joshua Y. (ISRAEL)
            Zucker, Steven (CANADA)
            Zuse, Konrad (W.GERMANY)


------------------------------

Date:     Wed, 4 Nov 87 18:10 EDT
From: MIKE%BUCASA.BITNET@wiscvm.wisc.edu
Subject:  Neural Networks Applied Optics Issue
 
             NEURAL NETWORKS:  A special issue of Applied Optics
                     December 1, 1987 (vol. 26, no. 23)
            Guest editors: Gail A. Carpenter and Stephen Grossberg
 
 
     The Applied Optics special issue on neural networks brings together a
selection of research articles concerning both biological models of brain and
behavior and technological models for implementation in government and
industrial applications.  Many of the articles analyze problems in pattern
recognition and image processing, notably those classes of problems for which
adaptive, massively parallel, fault-tolerant solutions are needed, and for
which neural networks provide solutions in the form of architectures that will
run in real time when realized in hardware.
 
     The articles are grouped into several topics: adaptive pattern recognition
models, image processing models, robotics models, optical implementations,
electronic implementations, and opto-electronic implementations. Each type of
neural network model is typically specialized to solve a variety of problems.
Models of back propagation, simulated annealing, competitive learning, adaptive
resonance, and associative map formation are found in a number of articles.
Each of the articles may thus be appreciated on several levels, from the
development of general modeling ideas, through the mathematical and
computational analysis of specialized model types, to the detailed explanation
of biological data or the fabrication of hardware. The table of contents
follows.
 
     Single copies of this special issue are available from the Optical Society
of America, at $18/copy. Orders may be placed by returning the form below, or
by calling (202) 223-8130 (ask for Jeana Macleod).
-------------------------------------------------------------------------------
 
 Please send ____ copies of the Applied Optics special issue on neural networks
 
 (vol. 26, no. 23) to:
NAME: __________________________________________________
 
ADDRESS:  _______________________________________________
 
_______________________________________________
 
_______________________________________________
 
 
TELEPHONE(S):___________________________________________
 
 TOTAL COST: $ ____________ $18/copy, including domestic or foreign surface
                                postage (+ $10/copy for air mail outside U.S.)
 
 PAYMENT: _____ Check enclosed (payable to Optical Society of America, or OSA)
 
        or _____ Credit card: American Express ____ VISA ____ MasterCard ____
                             Account number  __________________________________
                             Expiration date _________________________________
 
                             Signature (required)
                                        ____________________________
 
 SEND TO: Optical Society of America
           Publications Department
           1816 Jefferson Place NW
           Washington, DC 20036 USA
 Or call: (202) 223-8130 (credit cards; ask for Jeana Macleod)
_______________________________________________________________________________
 
 
             NEURAL NETWORKS: A special issue of Applied Optics
                     December 1, 1987 (vol. 26, no. 23)
            Guest editors: Gail A. Carpenter and Stephen Grossberg
 
 
 
                            TABLE OF CONTENTS
 
 
 
ADAPTIVE PATTERN RECOGNITION MODELS
 
   Teuvo Kohonen.  Adaptive, associative, and self-organizing functions in
   neural computing
 
   Gail A. Carpenter and Stephen Grossberg.  ART 2: Self-organization of
   stable category recognition codes for analog input patterns
 
   Jean-Paul Banquet and Stephen Grossberg.  Probing cognitive processes
   through the structure of event-related potentials during learning: An
   experimental and theoretical analysis
 
   Bart Kosko.  Adaptive bidirectional associative memories
 
   T.W. Ryan, C.L. Winter, and C.J. Turner.  Dynamic control of an artificial
   neural system: The Property Inheritance Network
 
   C. Lee Giles and Tom Maxwell.  Learning and generalization in high order
   neural networks: An overview
 
   Robert Hecht-Nielsen.  Counterpropagation networks
 
   Kunihiko Fukushima.  A neural network model for selective attention in
   visual pattern recognition and associative recall
 
 
 
IMAGE PROCESSING MODELS
 
   Michael H. Brill, Doreen W. Bergeron, and William W. Stoner.  Retinal
   model with adaptive contrast sensitivity and resolution
 
   Daniel Kersten, Alice J. O'Toole, Margaret E. Sereno, David C. Knill, and
   James A. Anderson.  Associative learning of scene parameters from images
 
 
 
ROBOTICS MODELS
 
   Jacob Barhen, N. Toomarian, and V. Protopopescu.  Optimization of the
   computational load of a hypercube supercomputer onboard a mobile robot
 
   Stephen Grossberg and Daniel S. Levine.  Neural dynamics of attentionally
   modulated Pavlovian conditioning: Blocking, inter-stimulus interval, and
   secondary reinforcement
 
 
 
OPTICAL IMPLEMENTATIONS
 
   Dana Z. Anderson and Diana M. Lininger.  Dynamic optical interconnects:
   Volume holograms and optical two-port operators
 
   Arthur D. Fisher, W.L. Lippincott, and John N. Lee.  Optical implementations
   of associative networks with versatile adaptive learning capabilities
 
   Clark C. Guest and Robert Te Kolste.  Designs and devices for optical
   bidirectional associative memories
 
   Kelvin Wagner and Demetri Psaltis.  Multilayer optical learning networks
 
 
 
ELECTRONIC IMPLEMENTATIONS
 
   Larry D. Jackel, Hans P. Graf, and R.E. Howard.  Electronic neural-network
   chips
 
   Larry D. Jackel, R.E. Howard, John S. Denker, W. Hubbard, and S.A. Solla.
   Building a hierarchy with neural networks: An example - image vector
   quantization
 
   A.P. Thakoor, A. Moopenn, John Lambe, and Satish K. Khanna.  Electronic
   hardware implementations of neural networks
 
 
 
OPTO-ELECTRONIC IMPLEMENTATIONS
 
   Nabil H. Farhat.  Opto-electronic analogs of self-programming neural nets:
   Architectures and methodologies for implementing fast stochastic learning
   by simulated annealing
 
   Yuri Owechko.  Opto-electronic resonator neural networks
(Please post this to your mailing list)
 

------------------------------

End of NEURON-Digest
********************