[comp.ai.neural-nets] Neuron Digest V7 #5

neuron-request@HPLMS2.HPL.HP.COM ("Neuron-Digest Moderator Peter Marvit") (01/26/91)

Neuron Digest   Friday, 25 Jan 1991
                Volume 7 : Issue 5

Today's Topics:
                   PDP/nnets work in intuitive physics
                       Re: Kohonen's Network again
                              Backprop s/w
                               GA software
   Re: Neuron Digest, 1-12-91. Vol.7, Issue 4, "Brain Size and Sulci"
                         p.s. on cerebral sulci
                 Job Opportunity at Stanford University
                            IJCNN-91-SEATTLE
                      Neural Network Council Awards
                      intelligent tutoring systems
                 Introductory Texts (Cleaned-Up Version)


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: PDP/nnets work in intuitive physics
From:    hplabs!ames!gatech!ames!scenic.wa.com!pauld (Paul Barton-Davis)
Date:    Sun, 13 Jan 91 11:56:19 -0800


Does anyone know of any work being done on using a PDP/nnets approach to
the problems of "intuitive physics"?  The latter term refers to what I
believe is a long-recognised problem concerning the human ability to
perform simple tasks like catching a ball.  There seems to be some
consensus that neural computational power isn't enough to solve Newton's
laws of motion for such cases, and that instead some "intuitive" method
is used which, although not totally accurate, is sufficiently precise to
work most of the time.  This seems like an ideal area for PDP work:
rapidly building models of motion, using back-propagation (or even just
plain old feedback) to get the model into shape.
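
To make the idea concrete, here is a minimal, illustrative Python sketch
(all names and constants invented for the example, not taken from any
published system) of a tiny feed-forward net trained by back-propagation
to predict a falling ball's next state from its current one, so that the
net "intuits" the dynamics without ever being given Newton's laws:

    import numpy as np

    rng = np.random.default_rng(0)
    DT, G = 0.05, -9.8

    def step(s):                   # true physics, used only to make data
        x, y, vx, vy = s
        return np.array([x + vx*DT, y + vy*DT, vx, vy + G*DT])

    S = rng.uniform(-1, 1, (1000, 4))      # random states (x, y, vx, vy)
    T = np.array([step(s) for s in S])     # their true successors

    W1 = rng.normal(0, 0.1, (4, 16)); b1 = np.zeros(16)   # hidden layer
    W2 = rng.normal(0, 0.1, (16, 4)); b2 = np.zeros(4)    # output layer

    for epoch in range(2000):
        H = np.tanh(S @ W1 + b1)           # forward pass
        P = H @ W2 + b2
        err = P - T                        # squared-error gradient
        dH = (err @ W2.T) * (1 - H**2)     # backprop through tanh
        W2 -= 0.01 * H.T @ err / len(S); b2 -= 0.01 * err.mean(0)
        W1 -= 0.01 * S.T @ dH / len(S); b1 -= 0.01 * dH.mean(0)

    print("final prediction error:", (err**2).mean())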

It also seems like something anyone doing nnet-inspired robotics might be
looking at, so does anyone know of such work?

Paul Barton-Davis                       <pauld@scenic.wa.com>
ScenicSoft, Inc.      
(206) 776-7760

[[ Editor's note: This query makes me wonder at the assumptions behind
Paul's question.  I certainly understand the difference between "naive
physics" and Newtonian.  For example, non-physicists don't really think
that the Earth is pushing back on you as you walk, even though the laws
of physics state otherwise.  However, I'm not sure what Paul has in mind
here.  Inverse kinematics?  Simple trajectory determination?  Eye-hand
coordination?  I'm not aware of any artificial neural net systems which
use Newton's laws directly to achieve their goal.  Readers, can you help
both Paul and me? -PM ]]

------------------------------

Subject: Re: Kohonen's Network again
From:    Giorgos Bebis <bebis@csi.forth.gr>
Date:    Mon, 14 Jan 91 22:20:35 +0200

In comp.ai.neural-nets you write:

>        I am working on the Kohonen network, and I have had lots of
>trouble trying to find the correct parameters k1 and k2 in the learning
>algorithm.  Does anyone know how to find them?

>        Besides, the convergence of the learning procedure is guaranteed
>by the decreasing nature of the alpha gain factor.  But is it
>guaranteed that it will converge to the right vectors?  In other
>clustering algorithms, the procedure does not end until convergence of
>the cluster mean vectors is reached (e.g. k-means), and I think this is
>more correct.

>        By the way, is anybody working on Kohonen's network?  I have seen
>it quoted thousands of times, but the quotes are always from the same
>papers by Kohonen himself.  I don't know of anybody who has got a Kohonen
>net working (maybe Aleksander, as he says in his book, but that is the
>only case I know of).  I think it *must* work, but I have got mixed
>results.  Besides, it is boring to keep on trying new parameters.

>        I hope to get some help,

>JJ Merelo
>Dpto. de Electronica y Sistemas Informaticos
>Facultad de Ciencias
>Campus Fuentenueva, s/n
>18071 Granada ( Spain )
>e-mail JMERELO@UGR.ES

I used the Kohonen algorithm some time ago for a character recognition
experiment.  I found the following paper very helpful:

T. Kohonen, K. Makisara and T. Saramaki, "Phonotopic Maps - Insightful
Representation of Phonological Features for Speech Recognition,"
Proceedings of the IEEE 7th International Conference on Pattern
Recognition, Montreal, Canada, 1984.

A way to choose k1 and k2, as well as the gain parameter, is indicated
in this paper.  I can send you the paper if you don't have it.  In
addition, if you want, I can tell you about my experiences training
Kohonen's algorithm.
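
For readers without the paper, here is a minimal sketch of the general
shape of the algorithm (illustrative Python; the constants below are
placeholders, not the k1/k2 schedules the paper recommends):

    import numpy as np

    rng = np.random.default_rng(1)
    GRID, DIM, STEPS = 10, 2, 5000
    W = rng.uniform(0, 1, (GRID, GRID, DIM))     # map of weight vectors
    X = rng.uniform(0, 1, (STEPS, DIM))          # training inputs

    ii, jj = np.meshgrid(np.arange(GRID), np.arange(GRID), indexing="ij")
    for t, x in enumerate(X):
        frac = t / STEPS
        alpha = 0.5 * (1.0 - frac)               # decreasing gain factor
        sigma = 1.0 + (GRID / 2) * (1.0 - frac)  # shrinking neighbourhood
        d = ((W - x) ** 2).sum(axis=2)           # distance to every unit
        bi, bj = np.unravel_index(d.argmin(), d.shape)  # best-matching unit
        h = np.exp(-((ii - bi)**2 + (jj - bj)**2) / (2 * sigma**2))
        W += alpha * h[:, :, None] * (x - W)     # Kohonen update rule

The convergence JJ asks about comes from alpha shrinking toward zero;
whether the map settles on the "right" vectors still depends on the
schedules, which is exactly why the choice of k1 and k2 matters.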


Bye, George.


++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
                      George Bebis (Graduate student),
                         Dept. of Computer Science,
                           University of Crete,
                                 Greece.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

------------------------------

From:    Shawn Lockery <shawn@helmholtz.sdsc.edu>
Date:    Mon, 14 Jan 91 14:59:54 -0800
Subject: Backprop s/w

Several months ago I asked about canned backprop simulators.  At long
last, here is the result of my query:

========================================

Barak Pearlmutter has written a dynamical backprop simulator.  A version
of his program that solves a toy problem and that is readily modifiable
is available by anonymous ftp from helmholtz.sdsc.edu.  The directory is
pub/ and the filename is pearlmutter.tar.

========================================

Yoshiro Miyata (miyata@dendrite.colorado.edu) has written an excellent
public domain connectionist simulator with a nice X Windows or SunView
interface.  It is called SunNet.  He provides a fairly easy-to-learn
"general" definition language, so a user can experiment with quite varied
back-prop and non-conventional architectures.  Examples are provided of
back-propagation, Boltzmann learning, and others.  Source code is
available by anonymous ftp from boulder.colorado.edu; look for
SunNet5.5.tar.Z.

========================================

Yann Le Cun (Department of Computer Science, University of Toronto,
Toronto, Ontario, M5S 1A4, Canada) has written a commercial simulator
called SN/2 that is powerful and well documented.

========================================

The Rochester Connectionist Simulator (RCS) is obtainable by anonymous
ftp from cs.rochester.edu.  You will find the code in the directory
pub/simulator.

========================================

The speech group at the Oregon Graduate Institute has written a
conjugate-gradient optimization program called OPT to train fully
connected feed-forward networks.  It is available by anonymous ftp from
cse.ogi.edu.  The code is in the directory pub/speech.  Copy the file
opt.tar.  You will need the unix "tar" command (tar xf opt.tar) to
unpack the file once you have it on your computer.

========================================

For the Macintosh, there is the commercial program called MacBrain
(Neuronics, Inc., One Kendall Square #2200, Cambridge, MA 02139).  It has
the usual Macintosh bells and whistles and costs $400.

========================================

For the Macintosh, there is also a public domain program called
Mactivation.  Version 3.3 is available via anonymous ftp from
alumni.Colorado.EDU (internet address 128.138.240.32); the file is in
/pub and is called mactivation.3.3.sit.hqx.  Mactivation is an
introductory neural network simulator which runs on all Apple Macintosh
computers.  A graphical interface provides direct access to units,
connections, and patterns.  Basic concepts of network operations can be
explored, with many low-level parameters available for modification.
Back-propagation is not supported (coming in version 4.0).  A user's
manual containing an introduction to connectionist networks and program
documentation is included.  The ftp version includes a plain text file
and an MS Word version with nice graphics and footnotes.  The program
may be freely copied, including for classroom distribution.  You can
also get a copy by mail: send $5 to Mike Kranzdorf, Box 1379, Nederland,
CO 80466-1379.

========================================

For 386-based PCs, you may purchase ExploreNet from HNC, 5501 Oberlin
Drive, San Diego, CA 92121.  You don't get source code for your $750, but
it's powerful and flexible.

========================================

For IBM PCs, there is a disk that comes with the third volume of the PDP
books (Parallel Distributed Processing, Rumelhart, McClelland and the PDP
Research Group, MIT Press, 1986).  You get lots of source code, and the
third volume itself is a nice manual.


------------------------------

Subject: GA software
From:    sct60a.sunyct.edu!sct60a.sunyct.edu!stu@sct60a.sunyct.edu (Stu Card)
Date:    Wed, 16 Jan 91 10:57:31 -0500


Does anybody know of any public domain, freeware, shareware, or other
low-cost software for genetic algorithms (a la Holland) and/or
evolutionary programming (a la Fogel)?  I will, of course, post a summary
of what I receive; please respond by direct e-mail rather than by
posting.
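
In case it helps frame replies, here is the general shape of what I mean
by a Holland-style GA: a minimal, illustrative Python sketch of the
classic "one-max" toy problem (all parameters invented for the example):

    import random

    random.seed(0)
    POP, BITS, GENS, PMUT = 50, 32, 100, 0.01
    pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
    fitness = sum                          # fitness = number of 1-bits

    for gen in range(GENS):
        nxt = []
        while len(nxt) < POP:
            # tournament selection of two parents
            a, b = (max(random.sample(pop, 3), key=fitness) for _ in range(2))
            cut = random.randrange(1, BITS)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < PMUT) for bit in child]  # mutate
            nxt.append(child)
        pop = nxt
    print("best fitness:", fitness(max(pop, key=fitness)), "of", BITS)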

stu

------------------------------

Subject: Re: Neuron Digest, 1-12-91. Vol.7, Issue 4, "Brain Size and Sulci"
From:    "Harry J. Jerison" <IJC1HJJ@MVS.OAC.UCLA.EDU>
Date:    Wed, 16 Jan 91 16:56:00 -0800

     Although I had not read the earlier discussion to which Cavonius
(below) contributed, his topic is one that I began looking into a decade
ago.  It is a pleasure to read about WET neural networks for a change,
and I would like to comment and add references:

     1.  First, let me quote Cavonius's commentary dated 19 Dec 90:
"On the matter of brain size and number of sulci: exceptions -e.g.,
Gauss- have been reported; but there's no clear pattern. A century ago
measuring brains of famous savants was popular, but went out of fashion
when it turned out that geniuses tended to fall into the normal range.
Incidentally, although we don't know the surface area of their cortex,
the Neanderthal probably had slightly larger brains than ours.

*   C.R. Cavonius                       BITNET:uap001@ddohrz11     *
*   Inst. f. Arbeitsphysiologie        (Note: uap-zero-zero-one,   *
*   an der Universitaet Dortmund          not uap-oh-oh-one)       *
*   Ardeystr. 67                        Tel: +49 231 1084 261      *
*   D-4600 Dortmund 1, F.R. Germany     Fax: +49 231 1084 308      *"

     2.  I agree with this commentary, and will add some references to
the literature, a minor correction, and more comments.  The 19th-century
work on sulcal complexity is best summarized by Broca (1871), and (in
English, but with prejudice) by Gould (1981).  "Neanderthal" is
misspelled; the modernized spelling after 1913 is "Neandertal," a
forgivable oddity (even for Cavonius, at a German institute), since the
"scientific" spelling remains "Homo sapiens neanderthalensis."  For more
on this and on the evidence for the difference in brain size, see
Kennedy et al. (in press).

     3.  There are additional and better reasons why the work was
abandoned.  First, it had racist and sexist overtones, unrecognized at
the time as "unscientific" until exposed by Boas (1938) and other early
20th-century critics; these attitudes were later correlated with
horrible social consequences under Adolf Hitler and Nazi Germany.
Second, nobody knew how to measure the "complexity" of a sulcal pattern
or quantify the "genius" of a person whose brain was measured, so even a
bivariate analysis (of "intelligence" as a function of "convolutedness")
could be no more than a program for research and could not be carried
out.  That it remains no more than a program today, when we know enough
to attempt the necessary analysis, is mainly a social rather than
scientific decision.  The decision might be based on balancing the
benefit of the scientific knowledge against the cost of gaining it.  In
dollars and cents, the cost would be about $100,000 at UCLA if done
right.  The social cost could be much greater, because the knowledge (if
the discovered function was not trivial) could be used to justify
discrimination against certain human populations or individuals,
especially on the basis of sex, race, or ethnicity.  The misuse of
knowledge is always a problem, and in this case misuse must be
anticipated, because demagogues would exploit our unavoidable xenophobia
and our typical ignorance of the mathematics and statistics of the
analyses.  (Avoiding the misuse is possible, and its cost can be added
to the other social costs.)  There is no purely scientific reason for
the issue to remain unresolved, although the resolution may turn out to
be scientifically trivial.

     4.  The scientific problem as recognized today is primarily that
of the relationship between the surface area of the cerebral cortex
(including tissue buried in the sulci) and the size of the brain.  I
have discussed this, and its relationship to convolutedness, in several
publications (Jerison, 1982a,b; 1983; 1987; and in press).  The
relationship is extremely strong "between-species," but there is a
partial de-coupling of between-species from within-species (individual
differences) variance.  I plan to publish on the decoupling shortly.  On
the why of convolutions, see Rakic (1988) and Welker (1990).

      References (diacritical marks and italics not transmitted)

Boas, F. 1938. The mind of primitive man (2nd ed.). New York, Macmillan.
Broca, P. 1871. Sur le volume et la forme du cerveau suivant les
   individus et suivant les races. Bulletins de la Societe
   d'anthropologie. 1861, 2(ser I):139-204.
Gould, S.J. 1981. The mismeasure of man. New York, Norton.
Jerison, H.J. 1982a. The evolution of biological intelligence. In
   Sternberg, R. J. (ed.). Handbook of Human Intelligence. pp. 723-791.
   New York & London, Cambridge Univ. Press.
Jerison, H.J. 1982b. Allometry, brain size, cortical surface, and
   convolutedness. In Armstrong, E. & Falk, D. (eds.). Primate Brain
   Evolution: Methods and Concepts. pp. 77-84. New York, Plenum.
Jerison, H.J. 1983. The evolution of the mammalian brain as an
   information processing system. In Eisenberg, J. F. & Kleiman, D. G.
   (eds.) Advances in the Study of Mammalian Behavior pp. 113-146.
   Special Publication No. 7, American Society of Mammalogists.
Jerison, H.J. 1987. Brain size.  In Adelman, G. Encyclopedia of
   Neuroscience. Vol. 1:168-170. Boston, Basel, Stuttgart, Birkhauser.
Jerison, H.J. in press. Brain size and the evolution of mind: 59th James
   Arthur Lecture on the Evolution of the Human Brain.  New York,
   American Museum of Natural History.
Kennedy, G. in press. On the autapomorphic traits of Homo erectus.
   Journal of Human Evolution.
Rakic, P. 1988. Specification of cerebral cortical areas. Science
   241:170-176.
Welker, W.I. 1990. Why does cerebral cortex fissure and fold?  A review
   of determinants of gyri and sulci. In Jones, E.G. & Peters, A.
   Cerebral Cortex Vol. 8B. pp. 1-132. New York, Plenum Press.


------------------------------

Subject: p.s. on cerebral sulci
From:    "Harry J. Jerison" <IJC1HJJ@MVS.OAC.UCLA.EDU>
Date:    Fri, 18 Jan 91 13:28:00 -0800

Please add the following postscript, if you send out my message of last
night: p.s. The NN community may also be interested in the discussion of
this issue by Benoit Mandelbrot in his "The Fractal Geometry of Nature"
(San Francisco, Freeman, 1982), pp. 112 and 162.

------------------------------

Subject: Job Opportunity at Stanford University
From:    Dave Rumelhart <der%psych@Forsythe.Stanford.EDU>
Date:    Sat, 19 Jan 91 12:46:22 -0800

[[ Editor's Note: PLEASE NOTE the 18 February deadline! -PM ]]

        The Psychology Department at Stanford University currently has
two job openings, at least one of which may be appropriate for a
connectionist.  I enclose a copy of the advertisement which appeared in
several publications.  If you feel you may be appropriate, or know of
someone who may be, please apply for the position.  Note from the ad
that we are open to people at any level and with a variety of interests.
In short, we are interested in the best person we can attract within
reasonably broad guidelines.  I personally hope that this person has
connectionist interests.

        David Rumelhart
        Chair of the Search Committee




                     Stanford University Psychology Department

          The Department of Psychology plans two tenure-track appointments
          in the Sensory/Perceptual and/or Cognitive Sciences (including
          the biological basis of cognition) beginning in the academic
          year 1991.  Appointments may be either at the tenured or
          non-tenured (assistant professor) level.  Outstanding scientists
          who have strong research records in sensory/perceptual
          processes, cognitive neuroscience, and/or
          computational/mathematical models of cognitive processes are
          encouraged to apply.  Applicants should send a current
          curriculum vitae, copies of their most important scholarly
          papers, and letters of recommendation to: The Cognitive Sciences
          Search Committee, c/o Ms. Frances Martin, Department of
          Psychology, Bldg. 420, Stanford University, Stanford,
          California 94305.  The deadline for application is February 18,
          1991, but applicants are encouraged to submit their materials as
          soon as possible.  Stanford University is an Equal Opportunity
          Employer.

------------------------------

Subject: IJCNN-91-SEATTLE
From:    Dave Rumelhart <der%psych@Forsythe.Stanford.EDU>
Date:    Mon, 21 Jan 91 12:22:00 -0800


        In my role as conference chairman of the International Joint
Conference on Neural Networks to be held this summer (July 8-12) in
Seattle, Washington, I would like to remind readers of this mailing list
that the deadline for paper submissions is February 1, 1991.  I would
encourage submissions.  The quality of a conference is largely determined
by the quality of the submitted papers.

        As a further reminder, or in case you haven't seen a formal call
for papers, I provide some of the details below.

        Papers may be submitted in the areas of neurobiology, optical and
electronic implementations, image processing, vision, speech, network
dynamics, optimization, robotics and control, learning and
generalization, neural network architectures, applications and other
areas in neural networks.

        Papers must be submitted in English (one original and seven
copies), maximum six pages, camera-ready on 8 1/2" x 11" white paper with
1" margins on all sides, unnumbered.  Centered at the top of the first
page should be the complete title, author name(s), affiliation(s) and
mailing address(es), followed by a blank space and then the abstract (up
to 15 lines), followed by the text.  Include a cover letter giving the
corresponding author's name, mailing address, telephone and fax numbers,
technical area, and oral or poster presentation preference.

        Send papers to IJCNN-91-SEATTLE, University of Washington,
Conference Management, Attn: Sarah Eck, MS/GH-22, 5001 25th Ave. N.E.,
Seattle WA 98195.

        The program planning for this meeting is outstanding.  The site
of the meeting will, I think, be outstanding.  A major contribution to
the success of the meeting (and, I think, the success of the field) will
be made by each quality paper submitted.  I look forward to an exciting
meeting and hope to see a strong contribution from participants on the
connectionist mailing list.

        Thank you for your consideration.


        David E. Rumelhart,
        Conference Chair, IJCNN-91-SEATTLE

------------------------------

Subject: Neural Network Council Awards
From:    Bradley Dickinson <bradley@ivy.Princeton.EDU>
Date:    Wed, 23 Jan 91 13:14:09 -0500

Nominations Sought for IEEE Neural Networks Council Awards

The IEEE Neural Networks Council is soliciting nominations for two new
awards.  Pending final approval by the IEEE, it is planned to present
these awards for the first time at the July 1991 International Joint
Conference on Neural Networks.  Nominations for these awards should be
submitted in writing according to the instructions given below.



IEEE Transactions on Neural Networks Outstanding Paper Award

This is an award of $500 for the outstanding paper published in the IEEE
Transactions on Neural Networks in the previous two-year period.  For
1991, all papers published in 1990 (Volume 1) in the IEEE Transactions on
Neural Networks are eligible.  For a paper with multiple authors, the
award will be shared by the coauthors.

Nominations must include a written statement describing the outstanding
characteristics of the paper.  The deadline for receipt of nominations is
March 31, 1991.  Nominations should be sent to Prof. Bradley W.
Dickinson, NNC Awards Chair, Dept. of Electrical Engineering, Princeton
University, Princeton, NJ 08544-5263.



IEEE Neural Networks Council Pioneer Award

This award has been established to recognize and honor the vision of
those people whose efforts resulted in significant contributions to the
early concepts and developments in the neural networks field.  Up to
three awards may be presented annually to outstanding individuals whose
main contribution has been made at least fifteen years earlier.  The
recognition is engraved on the Neural Networks Pioneer Medal specially
struck for the Council.

Selection of Pioneer Medalists will be based on nomination letters
received by the Pioneer Awards Committee.  All who meet the contribution
requirements are eligible, and anyone can nominate.  The award cannot be
made posthumously.  Written nomination letters must include a
detailed description of the nominee's contributions and must be
accompanied by full supporting documentation.  For the 1991 Pioneer
Award, nominations must be received by March 1, 1991.  Nominations should
be sent to Prof. Bradley W. Dickinson, NNC Pioneer Award Chair,
Department of Electrical Engineering, Princeton University, Princeton, NJ
08544-5263.


Questions and preliminary inquiries about the above awards should be
directed to Prof. Bradley W. Dickinson, NNC Awards Chair; telephone:
(609)-258-4644, electronic mail: bradley@ivy.princeton.edu


------------------------------

Subject: intelligent tutoring systems 
From:    Jo Cove <hplms2!logcam!joc>
Date:    Thu, 24 Jan 91 14:59:10 +0000

Peter

I am keen to obtain information about how neural networks can be used in
the development of intelligent tutoring systems, and in learning
technology generally, and wondered if the following message could be
mailed to the readers of Neuron Digest.

==========================

Neural Networks in Intelligent Tutoring Systems

Logica Cambridge is carrying out a small (10-week) project to provide the
Employment Department with an in-depth briefing on neural networks, their
existing and future applications (with particular emphasis on
training-related issues) and a consideration of their potential use in
learning technology.

Of interest to the department are the implications of neural networks
for learning technology, and in particular for the development of more
sophisticated intelligent tutoring systems.

Further information on what role neural networks can play in the area of
training would be gladly received. Please contact joc@logcam.co.uk (Jo
Cove 104 Hills Rd Cambridge CB2 1LQ UK).

Thank you
Jo (Cove)


------------------------------

Subject: Introductory Texts (Cleaned-Up Version)
From:    bakker@batserver.cs.uq.oz.au (Paultje Bakker)
Organization: Computer Science Department, The University of Queensland,
        Brisbane, Australia
Date:    24 Jan 91 23:58:54 +0000

(* Sorry for wasting bandwidth. Here's a cleaner version of the list I
posted the other day. I'll post it every 2-3 months from now on with
updates. *)

A List of Introductory Texts for Neural Networks.

 -----------------------------------------------------------------
 - I haven't checked the accuracy of many of these titles. Beware!
 - Wasserman's book is by far the most popular.
 - Please send new additions or comments/corrections on existing 
 - items to bakker@batserver.cs.uq.oz.au. 
 -----------------------------------------------------------------

Aleksander, I. and Morton, H. (1990). An Introduction to Neural
Computing.  Chapman and Hall.

Anderson, J. A. and Rosenfeld, E. (1988). Neurocomputing: Foundations of
Research. The MIT Press, Cambridge, MA.

Beale, R. and Jackson, T. (1990). Neural Computing, an Introduction.
Adam Hilger, IOP Publishing Ltd. (ISBN 0-85274-262-2).
Comments: "It's clearly written.  Lots of hints as to how to get the
adaptive models covered to work (not always well explained in the
original sources).  Consistent mathematical terminology.  Covers
perceptrons, error-backpropagation, Kohonen self-org model, Hopfield type
models, ART, and associative memories."

Caudill, M. and Butler, C. (1990). Naturally Intelligent Systems.
MIT Press: Cambridge, Massachusetts. (ISBN 0-262-03156-6)
Comments: "I guess one of the best books I read."

Hinton, G. E. (1989). Connectionist learning procedures.
Artificial Intelligence, Vol. 40, pp. 185--234.
Comments: "One of the better neural networks overview papers, although
the distinction between network topology and learning algorithm is not
always very clear.  Could very well be used as an introduction to neural
networks."

Lippmann, R. P. (1987). An introduction to computing with neural nets.
IEEE ASSP Magazine, vol. 4, no. 2, pp. 4-22.
Comments: "Much acclaimed as an overview of neural networks, but rather
inaccurate on several points.  The categorization into binary and
continuous-valued input neural networks is rather arbitrary, and may be
confusing for the inexperienced reader.  Not all networks discussed are
of equal importance."

McClelland, J. L. and Rumelhart, D. E. (1988).
Explorations in Parallel Distributed Processing: Computational Models of 
Cognition and Perception (software manual). The MIT Press.
Comments: "Written in a tutorial style, and includes 2 diskettes of NN
simulation programs that can be compiled on MS-DOS or Unix (and they do
too !)"; "The programs are pretty reasonable as an introduction to some
of the things that nns can do."

Pao, Y. H. (1989). Adaptive Pattern Recognition and Neural Networks.
Addison-Wesley Publishing Company, Inc. (ISBN 0-201-12584-6)

[A paper by Rumelhart et al. published in Nature at the same time (vol.
323, October 1986) gives a very good potted explanation of backprop NNs.
It gives sufficient detail to write your own NN simulation; a minimal
sketch appears at the end of this list.]

Rumelhart, D. E. and McClelland, J. L. (1986). Parallel Distributed
Processing: Explorations in the Microstructure of Cognition (volumes 1 &
2).  The MIT Press.
Comments: "As a computer scientist I found the two Rumelhart and
McClelland books really heavy going and definitely not the sort of thing
to read if you are a beginner."; "It's quite readable, and affordable
(about $65 for both volumes).";

Stanley, J. (1988, 1989). Introduction to Neural Networks. California
Scientific Software.
Comments: "This is provided with the BrainMaker NN package. It is,
however, just what it claims to be: an introductory text. Perhaps a bit
simplistic for some."

Wasserman, P. D. (1989). Neural Computing: Theory and Practice. Van
Nostrand Reinhold. (ISBN 0-442-20743-3)
Comments: Generally considered to be the best introductory text so far.

Wunsch, D. (Ed.) (July, 1991). Neural Networks: An Introduction.
Pergamon Press.

Zeidenberg, M. (1990). Neural Networks in Artificial Intelligence. Ellis
Horwood, Ltd., Chichester.
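
As promised above, here is a taste of what the Nature paper's recipe
yields: a minimal, illustrative Python sketch of the generalized delta
rule (backprop) learning XOR.  All constants are invented for the
example, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(42)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)   # inputs
    T = np.array([[0], [1], [1], [0]], float)               # XOR targets

    def sig(z):
        return 1 / (1 + np.exp(-z))

    W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # 2-4-1 network
    W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

    for _ in range(10000):
        H = sig(X @ W1 + b1)              # forward pass
        Y = sig(H @ W2 + b2)
        dY = (Y - T) * Y * (1 - Y)        # delta at the output unit
        dH = (dY @ W2.T) * H * (1 - H)    # delta at the hidden layer
        W2 -= 0.5 * H.T @ dY; b2 -= 0.5 * dY.sum(0)
        W1 -= 0.5 * X.T @ dH; b1 -= 0.5 * dH.sum(0)

    print(Y.round(2).ravel())

With these settings the outputs should come close to [0, 1, 1, 0],
though backprop on XOR can occasionally stick in a local minimum with an
unlucky initialization.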

 --Paul Bakker         email: bakker@batserver.cs.uq.oz.au
 --Dept. of Scatology    "Love between the ugly
 --University of Qld        Is the most beautiful love of all" 
 --Gondwanaland                         - T. Rundgren

------------------------------

End of Neuron Digest [Volume 7 Issue 5]
***************************************