[comp.ai.neural-nets] SUMMARY OF RESPONSES: Introduction to NN

bakker@batserver.cs.uq.oz.au (Paultje Bakker) (10/01/90)

Thanks to everyone for the great response! So, here follows.....

SUMMARY OF RESPONSES to the request for: 
"A good, general, *readable* introduction to neural networks"
October 1990.

*******************************************

Rumelhart, D.E. and McClelland, J.L.
Parallel Distributed Processing, Vols 1 & 2
MIT Press, Cambridge, MA, 1986

It's quite readable, and affordable (about $65 for both volumes).

A companion volume, 'Explorations in PDP' by McClelland, is written in
a tutorial style and includes 2 diskettes of NN simulation
programs that can be compiled under MS-DOS or Unix (and they do compile, too!)

A paper by Rumelhart et al. published in Nature at the same time
(vol. 323, October 1986) gives a very good potted explanation of backprop NNs.
It gives sufficient detail to write your own NN simulation.
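For a sense of what such a simulation involves, here is a minimal sketch of the generalised delta rule from that paper (one hidden layer, sigmoid units, trained on XOR). This is a hypothetical modern illustration in Python/NumPy, not code from any of the cited sources; all names and parameters are my own.

```python
# Minimal backprop sketch in the spirit of the Rumelhart/Hinton/Williams
# Nature paper: batch gradient descent on a 2-4-1 sigmoid network.
# Hypothetical illustration; layer sizes and learning rate are assumptions.
import numpy as np

rng = np.random.default_rng(1)

# XOR: the classic problem a single-layer perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# 2 inputs -> 4 hidden -> 1 output, small random initial weights.
W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)

lr = 1.0
for epoch in range(20000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)          # hidden activations
    Y = sigmoid(H @ W2 + b2)          # network outputs
    # Backward pass: error signals propagated through the layers.
    dY = (Y - T) * Y * (1 - Y)        # output-layer delta
    dH = (dY @ W2.T) * H * (1 - H)    # hidden-layer delta
    # Gradient-descent weight updates.
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

print(np.round(Y.ravel(), 2))  # ideally approaches [0, 1, 1, 0]
```

The whole trick of the paper is those two delta lines: the output error, scaled by the sigmoid derivative, is passed back through the weights to assign blame to the hidden units.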

-----------------------------------

   I Aleksander, H Morton: An Introduction to Neural Computing
   Chapman and Hall, 1990.

-----------------------------------

Books:

From a CS point of view:
%A P. D. Wasserman
%T Neural Computing: Theory and Practice
%I Van Nostrand Reinhold
%C New York
%D 1989

From an AI point of view:
%A M. Zeidenberg
%C Chichester
%D 1990
%I Ellis Horwood, Ltd.
%T Neural Networks in Artificial Intelligence

From a Psych point of view (note the bulk):
%A D. E. Rumelhart
%A J. L. McClelland
%D 1986
%I The MIT Press
%K PDP-1
%T Parallel Distributed Processing: Explorations in the Microstructure of
Cognition
%o (volume 1)

%A J. L. McClelland
%A D. E. Rumelhart
%D 1986
%I The MIT Press
%K PDP-2
%T Parallel Distributed Processing: Explorations in the Microstructure of
Cognition
%o (volume 2)

%A J. L. McClelland
%A D. E. Rumelhart
%D 1988
%I The MIT Press
%T Explorations in Parallel Distributed Processing:
Computational Models of Cognition and Perception


Papers:
%A R. P. Lippmann
%D April 1987
%J IEEE ASSP Magazine
%V 4
%N 2
%P 4--22
%T An introduction to computing with neural nets
%X Much acclaimed as an overview of neural networks, but rather inaccurate
on several points.  The categorization into binary and continuous-valued
input neural networks is rather arbitrary, and may be confusing for
the inexperienced reader.  Not all networks discussed are of equal importance.

%A G. E. Hinton
%T Connectionist learning procedures
%J Artificial Intelligence
%V 40
%D 1989
%P 185--234
%X One of the better neural networks overview papers, although the
distinction between network topology and learning algorithm is not always
very clear.  Could very well be used as an introduction to neural networks.

------------------------

D. Wunsch (ed.) (1991) Neural Networks : An Introduction.

------------------------

"Naturally Intelligent Systems" by Maureen Caudill and Charles Butler.
Cambridge, Massachusetts: MIT Press, (1990). ISBN 0-262-03156-6
(about 300 pages)

-------------------------

	Yoh-Han Pao, Adaptive Pattern Recognition and Neural Nets,
	c. 1989 by Addison-Wesley Publishing Company, Inc.

------------------------


Neural Computing: An Introduction by R. Beale and T. Jackson.
It's $30.00 and published by Adam Hilger (ISBN 0-85274-262-2).

It's clearly written.  Lots of hints as to how to get the
adaptive models covered to work (something not always well explained in
the original sources).  Consistent mathematical terminology.  Covers
perceptrons, error backpropagation, Kohonen's self-organizing map,
Hopfield-type models, ART, and associative memories.

************************************


Wasserman seemed to be the most popular choice.

Thanks to James Tizard, Patrick van der Smagt, Guszti Bartfai, Don Wunsch,
Andy, Lilly Spirkovska, Nathan Brown, and others.


Paul Bakker
bakker@batserver.cs.uq.oz.au
--
Paul Bakker             | Internet    bakker@batserver.cs.uq.oz.au
Dept of Computer Science| Bitnet:     bakker%batserver.cs.uq.oz.au@uunet.uu.net
Uni of Qld              | JANET:      bakker%batserver.cs.uq.oz.au@uk.ac.ukc
Australia               | EAN:        bakker@batserver.cs.uq.oz

reynolds@bucasd.bu.edu (John Reynolds) (10/03/90)

A clearly written introduction to the field is Patrick K. Simpson's
Artificial Neural Systems: Foundations, Paradigms, Applications, and
Implementations, 1st edition, Pergamon Press (1990).

It presents the following paradigms/models in a unified notation which
facilitates comparison:

UNSUPERVISED FEEDFORWARD LEARNING AND RECALL SYSTEMS: learning matrix,
drive reinforcement, sparse distributed memory, linear associative
memory, optimal linear associative memory, fuzzy associative memory,
learning vector quantizer, and counterpropagation

UNSUPERVISED FEEDBACK LEARNING AND RECALL SYSTEMS: additive Grossberg,
shunting Grossberg, binary adaptive resonance theory (ART1), analog
adaptive resonance theory (ART2), discrete autocorrelator, continuous
Hopfield, discrete bidirectional associative memory, adaptive
bidirectional associative memory, temporal associative memory

SUPERVISED FEEDBACK LEARNING AND RECALL SYSTEMS: brain-state-in-a-box,
fuzzy cognitive map

SUPERVISED FEEDFORWARD LEARNING AND RECALL SYSTEMS: perceptron,
adaline/madaline, backpropagation, boltzmann machine, cauchy machine,
adaptive heuristic critic, associative reward-penalty, avalanche
matched filter  

It also describes in precis form the past contributions of the
following researchers, and indicates their present research interests:

McCulloch and Pitts, Hebb, Minsky, Uttley, Rosenblatt, Widrow,
Steinbuch, Grossberg, Amari, Anderson, Longuet-Higgins, Willshaw and
Buneman, Fukushima, Klopf, Kohonen, Cooper, Elbaum, Sejnowski,
McClelland, Rumelhart, Sutton and Barto, Feldman, Ballard,
Hecht-Nielsen, Hopfield and Tank, Mead, and Kosko.