[comp.ai.neural-nets] Who's doing what in neural net?

mkkam@sun1.cs.uh.edu (Francis Kam) (07/05/88)

Whatever delayed my posting of the replies to my request
six months ago for "Who's doing what in neural net?", I
apologise to those who replied, waited, and were
frustrated.

The first part is a bibliography of the papers I have,
gathered either from readers' replies or from indirect
references.  Most of them were sent directly by the
authors or by their departments, to whom I am very
grateful.  I have not read through all of them; it is up
to the readers to judge their value.  The list is in a
simple format with a $ as the field delimiter.  The first
field is the title, the second the author(s), and the
third the journal in which the paper was published.  The
list is sorted in ascending order by the first field, and
each record is separated by a <newline>.  It is posted
this way to facilitate further processing and storage.
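
For readers who want to process the list mechanically,
here is a minimal sketch in Python, assuming the records
have been saved one per line in a file named biblio.txt
(a hypothetical name):

    # Parse the $-delimited bibliography:
    # title $ author(s) $ journal.
    records = []
    with open("biblio.txt") as f:
        for line in f:
            fields = line.rstrip("\n").split("$")
            if len(fields) == 3:
                records.append(tuple(fields))

    # Re-sort by the second field (authors) and print.
    for title, authors, journal in sorted(records,
                                          key=lambda r: r[1]):
        print(authors, "--", title)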

Readers who are interested in getting a copy of a paper
should contact the author directly.  I can't reproduce
copies myself because I don't know what it would cost me
to photocopy and mail them.  Please understand.

The second part consists of the readers' replies, for
which I owe them many thanks.

I haven't seen many postings on comp.ai.neural-nets
recently.  I would be grateful if someone could inform me
of any new mailing lists for neural net research.

Last, Hecht-Nielsen Neurocomputer Corporation is producing
some neural hardware.  Its address:
	5893 Oberlin Dr., San Diego, CA 92121.  619/546-8877
I would also love to hear of any other hardware/software
manufacturers in neural computing.

Many thanks.

-------------
Francis Kam                           Computer Science Department
Internet: mkkam@sun1.cs.uh.edu        University of Houston
CSNET:    mkkam@houston.csnet         4800 Calhoun
Phone: (713)749-1748                  Houston, TX 77004.



----------------------<< Cut Here >>-----------------------------
A Connectivity Analysis of a Class of Simple Associative Neural Networks$Hammerstrom, Dan$Technical Report No. CS/E-86-009, Jan 1988, Oregon Graduate Center, 19600 N.W. von Neumann Dr., Beaverton, Oregon 97006.
A Question of Levels: Comment on McClelland and Rumelhart$Broadbent, Donald$Journal of Experimental Psychology: General, 1985, Vol. 114, No. 2, 189-192.
A Temporal-Difference Model of Classical Conditioning$Sutton, Richard S., Barto, Andrew G.$TR87-509.2, Mar 1987. GTE Laboratories Incorporated, 40 Sylvan Road, Waltham, MA 02254.  Also appears in: Proceedings of the Ninth Annual Conference of the Cognitive Science Society, July 1987.
An Adaptive Network That Constructs and Uses an Internal Model of Its World$Sutton, Richard S., Barto, Andrew G.$Cognition and Brain Theory, 1981, 4(3)217-248.
An Introduction to Computing with Neural Nets$Lippmann, Richard P.$IEEE ASSP Magazine, Apr 1987, pp. 4-22.
Applications of the Connection Machine$Waltz, David L.$Computer, Jan 1987, pp. 85-97.
Associative Learning, adaptive pattern recognition, and cooperative-competitive decision making by neural networks$Carpenter, Gail A., Grossberg, S.$SPIE Vol. 634 Optical and Hybrid Computing (1986), pp. 218-247.
Associative Search Network: A Reinforcement Learning Associative Memory$Barto, Andrew G., Sutton, Richard S.,and Brouwer, Peter S.$Biological Cybernetics, 40, 201-211(1981).
Cognitive and Psychological Computation with Neural Models$Anderson, James A.$IEEE Transactions on Systems, Man, and Cybernetics, Vol. SMC-13, No. 5, Sep/Oct 1983.
Collective Computation in Neuronlike Circuits$Tank, David W., Hopfield, John J.$Scientific American, Dec 1987, pp 104-114.
Competitive Learning: From Interactive Activation to Adaptive Resonance$Grossberg, S.$Cognitive Science 11, 23-63 (1987).
Computing with Structured Connectionist Networks$Feldman, Jerome A., Fanty, Mark A., Goddard, Nigel H., and Lynne, Kenton J.$Communications of the ACM, Feb 1988, Vol. 31, No. 2, pp 170-187.
Connectionist Architectures for Artificial Intelligence$Fahlman, Scott E., Hinton, Geoffrey E.$Computer, Jan 1987, pp 100-109.
Connectionist Expert Systems$Gallant, Stephen I.$Communications of the ACM, Feb 1988, Vol. 31, No.2, pp 152-169.
Connectionist Models and Their Properties$Feldman, J.A., Ballard, D.H.$Cognitive Science 6, 205-254 (1982).
Data Parallel Algorithms$Hillis, W. D., Steele, Guy L., JR.$Communications of the ACM, Dec 1986, Vol 29, No. 12.
Distributed Memory and the Representation of General and Specific Information$McClelland, James L., Rumelhart, David E.$Journal of Experimental Psychology: General, 1985, Vol. 114, No. 2, 159-188.
Distributed Representations$Hinton, Geoffrey E.$Technical Report CMU-CS-84-157, Oct 1984. Computer Science Dept., Carnegie-Mellon University, Pittsburgh PA 15213.
Feature Discovery by Competitive Learning$Rumelhart, David E., Zipser, David$Cognitive Science 9, 75-112 (1985).
Landmark Learning: An Illustration of Associative Search$Barto, Andrew G., Sutton, Richard S.$Biological Cybernetics, 42, 1-8(1981).
Learning to Predict by the Methods of Temporal Differences$Sutton, Richard S.$TR87-509.1, revised Feb. 1988. GTE Laboratories Incorporated 40 Sylvan Road, Waltham, MA 02254.
Levels Indeed! A Response to Broadbent$Rumelhart, David E., McClelland, James L.$Journal of Experimental Psychology: General, 1985, vol. 114, No. 2, 193-197.
Neural Networks and Physical Systems with Emergent Collective Computational Abilities$Hopfield, J.J.$Proc. Nat. Acad. Sci. USA, Vol. 79, pp. 2554-2558, Apr. 1982.
Neural Networks, Part 1: What are they and why is everybody so interested in them now?$Eliot, Lance B.$IEEE Expert, Winter 1987, pp 10-14.
Neural Networks, Part 2: What are they and why is everybody so interested in them now?$Wasserman, Philip D.$IEEE Expert, Spring 1988, pp 10-15.
Neural Problem Solving$Barto, Andrew G., Sutton, Richard S.$pp. 123-152 in "Synaptic Modification, Neuron Selectivity, and Nervous System Organization" edited by William B. Levy, James, A. Anderson, and Stephen Lehmkuhle, 1985, Lawrence Erlbaum Associates, London.
Neural Processing Systems$Miceli, Bill$SPIE Vol. 634 Optical and Hybrid Computing (1986) p. 349.
Neural net models and optical computing: a brief overview$Farhat, Nabil H.$SPIE Vol. 634 Optical and Hybrid Computing (1986) pp. 307-311.
Neuronlike Adaptive Elements That Can Solve Difficult Learning Control Problems$Barto, Andrew G., Sutton, Richard S., Anderson, Charles W.$IEEE Transactions on Systems, Man, and Cybernetics, Vol. SMC-13, No. 5, Sep/Oct 1983.
On the Storage Capacity of an Associative Memory with Randomly Distributed Storage Elements$Palm, G.$Biological Cybernetics, 39, 125-127(1981).
Optical Neural Computers$Abu-Mostafa, Yaser S., Psaltis, Demetri$Scientific American, Mar 1987, pp. 88-95.
Panel Discussion$Szu, Harold H.$SPIE Vol. 634 Optical and Hybrid Computing (1986), pp. 331-348.
Pattern Formation and Chaos in Networks$Pickover, Clifford A.$Communications of the ACM, Feb 1988, Vol. 31, No. 2, pp 136-151.
Performance Limits of Optical, Electro-Optical, and Electronic Neurocomputers$Hecht-Nielsen, Robert$SPIE Vol. 634 Optical and Hybrid Computing (1986), pp 277-306.
Putting Knowledge in its Place: A Scheme for Programming Parallel Processing Structures on the Fly$McClelland, James L.$Cognitive Science 9, 113-146 (1985).
Recognition Cones: A Neuronal Architecture for Perception and Learning$Honavar, Vasant, and Uhr, Leonard$Computer Science Technical Report #717, Sep 1987. Computer Sciences Department, University of Wisconsin-Madison.
Representation of sensory information in self-organizing feature maps, and relation of these maps to distributed memory networks$Kohonen, Teuvo$SPIE Vol. 634 Optical and Hybrid Computing (1986), pp. 248-259.
Selected Bibliography on Connectionism$Selfridge, O.G., Sutton, R.S., Anderson C.W.$TR87-509.4, Dec. 1987. To appear in the review volume Evolution, Learning, and Cognition, edited by Lee, Y.C., World Scientific Publishing.
Strategic Learning with Multilayer Connectionist Representations$Anderson, Charles W.$TR87-509.3, Apr. 1987, GTE Laboratories Incorporated, 40 Sylvan Road, Waltham, MA 02254.
The ART of Adaptive Pattern Recognition by a Self-Organizing Neural Network$Carpenter, Gail A., Grossberg S.$Computer, Mar 1988, pp 77-88.
The Fundamental Physical Limits of Computation$Bennett, Charles H., Landauer, Rolf$Scientific American, Jul 1985, pp. 48-56.
Three Layers of Vector Outer Product Neural Networks for Optical Pattern Recognition$Szu, Harold$SPIE Vol. 634, Optical and Hybrid Computing (1986), pp. 312-330.
Toward Memory-Based Reasoning$Stanfill, Craig and Waltz, David$Communications of the ACM, Dec 1986, Vol 29, No 12.
Two Problems with Backpropagation and Other Steepest-Descent Learning Procedures for Networks$Sutton, Richard S.$Proceedings of the 8th Annual Conference of the Cognitive Science Society, 1986, pp. 823-831.
Unit Activation Rules for Cognitive Network Models$Williams, Ron$ICS Report No. 8303, Nov 1983, Institute for Cognitive Science, UCSD.
----------------------<< Cut here >>-----------------------------
From:	IN%"@RELAY.CS.NET,@speedy.cs.wisc.edu:honavar@cs.wisc.edu"  8-FEB-1988 19:38
To:	mkkam@houston.csnet
Subj:	Neuronal models



>I am working on the learning aspects of the neural net model in computing
>and would like to know what's happening in the rest of the neural net
>community in the following areas:
>  1) neural net models
>  2) neural net learning rules
>  3) experimental (analog, digital, optical) results of any kind with
>     figures;
>  4) neural net machines (commercial, experimental, any kind);
>  5) any technical reports in these areas;

>For information exchange and discussion purpose,
>please send mail to mkkam@houston.edu.
>Thank you.


	Greetings. I am also interested in learning in topologically
constrained (structured) neuronal architectures - especially in the
domain of vision. I am currently working on extending the "recognition
cones" model for perception (Uhr, 1972) to handle learning. The general
framework of this model (with its connectionist, neuronal interpretation
as it pertains to my work) is described in the paper "Recognition Cones:
A Neuronal Architecture for Perception and Learning" (Vasant Honavar and
Leonard Uhr, Computer Sciences Tech. report # 717, September 1987, 
Computer Sciences Department, University of Wisconsin-Madison; submitted 
to the Cognitive Science journal).  I have some preliminary
results, which are rather encouraging, from simulating the
model as it learns to distinguish between simple objects.
The model is able to evolve arbitrarily complex feature
detectors by combining simpler ones as needed, in addition
to reweighting existing detectors (as other connectionist
models do).

	I would be interested in hearing from you about your work.

------------------------------------

Vasant Honavar
Computer Sciences Dept.
University of Wisconsin-Madison
Madison, WI 53706.

honavar@ai.cs.wisc.edu
------------------------------------


From:	IN%"GODDEN@gmr.com" 10-FEB-1988 03:12
To:	MKKAM@houston.csnet
Subj:	RE: Nanocomputer

It's published by Anchor Books (Doubleday) and I believe the copyright
is 1986 (?) (either that or '87).
-Kurt

From:	IN%"@RELAY.CS.NET,@ai.cs.wisc.edu:honavar@cs.wisc.edu" 10-FEB-1988 03:13
To:	MKKAM@houston.csnet
Subj:	RE: Neuronal models


	1. You can write to me or to our tech. reports librarian,
	giving your postal address and asking for tech report # 717.
	Our tech reports librarian is Linda McConnell; she can be
	reached by e-mail at linda@speedy.cs.wisc.edu, and a copy
	will be mailed to you.

>	Summary of his research:        mmk
	----	
	"Layered, converging-diverging neuronal 
	architectures for perception have been proposed 
	(e.g. "Recognition Cones" Uhr, 1972; Uhr, 1987) and programs 
	simulating such models have been studied
	(Uhr and Douglass, 1979; Li and Uhr, 1987).
	A variety of learning mechanisms in such 
	topologically constrained architectures
	are currently under investigation (Honavar and Uhr, 1987)."
	----

>	Readers please refer to the technical reports from the
>	Computer Science Dept. U. of Wisconsin-Madison.  mkk

	3. I would be interested in hearing about your work and I will 
	keep you posted when I have some interesting results.


	Good luck on your research.

	Vasant Honavar
	honavar@ai.cs.wisc.edu

 
From:	IN%"@RELAY.CS.NET:puswad@life.pawl.rpi.edu" 12-FEB-1988 04:01
To:	mkkam@houston.csnet
Subj:	Neural Nets

In-Reply-To: <727@uhnix1.UUCP>
Organization: Rensselaer Cellular Automaton Development Group

  I have embarked on a project to decide whether neural nets will be
useful in the field of applied geometry.  Accordingly, could you please
forward to me a copy of any responses you get?

  Thank you,
   C. Dominus


-------------------------------------------------------------------------------
  |   |   |   | 1 | 1 | 0 | 1 | * | 0 | $ | 1 | 0 | 0 | * | 1 | 1 | $ |   |   |
-------------------------------------------------------------------------------
"Turing Machine tape is strongly reminiscent of toilet paper."  

USERFG8M@rpitsmts.BITNET             puswad@pawl.rpi.EDU
USERFG8M%rpitsmts@itsgw.rpi.EDU      dominusm@b21.cs.rpi.EDU  <-- Preferred

"Crash Dominus is the world's most dangerous math major."  - A. Ihnatko
-------------------------------------------------------------------------------
 
From:	IN%"@RELAY.CS.NET:irani@umn-cs.cs.umn.edu" 12-FEB-1988 04:12
To:	mkkam@houston.edu.csnet
Subj:	Re: Who's doing What in neural net research?

I'm doing my PhD thesis on applying neural nets to the statistical
analysis of a large clinical trial database.  So far the
back-propagation model is being experimented with.  An extended
abstract follows (accepted at the AAAI workshop on medicine at
Stanford in spring '88).

****************************************

Experimenting with artificial neural nets for analyzing clinical trial
databases: The POSCH A.I. Project (** FOOTNOTE 1 **)
(** FOOTNOTE 1 **:  This material is based partly on work supported by
the National Science Foundation grant # DCR8512857, by NHLBI grant #
2R10HL15265, and by the Microelectronics and Information Sciences
Center of the University of Minnesota )

Erach A. Irani, John M. Long, James R. Slagle

University of Minnesota,
Minneapolis, MN 55455.
Contact: (612) 627 4850 

Clinical trials involve the collection, evaluation, and analysis of
data.  The POSCH (Program for Surgical Control of the Hyperlipidemias)
clinical trial has developed two expert systems, Exercise Test
Analyzer (ETA) and Evaluator of Serial Coronary Angiograms (ESCA), for
the evaluation of trial data.  The size of the trial database is 838
patients * 1400 variables/patient/year * 10 years of follow-up per
patient.
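(Multiplied out, that comes to roughly 11.7 million recorded values.)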
Efforts are underway to use A.I. techniques to help with the
statistical analysis of the trial database. 

	Systems such as RX/RADIX have been developed to do
statistical analysis of medical databases using expert system
techniques [2].  Given the range of statistical techniques that can
be used, and the diversity of the information recorded, the knowledge
engineering effort is substantial.  Even then it is not clear that
such systems can capture all the statistical relationships in the
database.  We are looking at neural nets as a possible solution to both
these shortcomings.

	Neural net computation is organized as a collection of
deterministic or stochastic units, with feedback.  There can be
several layers of such units.  Several training algorithms have been
developed recently [3] that enable the net to learn non-linear
associations such as those involved in XOR or a parity encoder.  A
neural net can be trained on a training set and then used with a test
set that it has not encountered before, to see how well it predicts
the outputs in the test set.  This performance can then be compared
with that of statistical techniques such as multiple linear
regression.
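
For concreteness, here is a minimal sketch of such a net learning XOR
by back-propagation, in present-day Python/NumPy (the four hidden
units, squared-error loss, and learning rate are arbitrary
illustrative choices, not parameters from this project):

    import numpy as np

    rng = np.random.default_rng(1)

    # XOR input patterns and target outputs.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer of 4 units and one sigmoid output unit.
    W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

    lr = 0.5
    for _ in range(20000):
        h = sigmoid(X @ W1 + b1)      # forward pass: hidden layer
        out = sigmoid(h @ W2 + b2)    # forward pass: output layer
        # Backward pass, squared error; sigmoid'(s) = s * (1 - s).
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

    # After training, the outputs should approach [0, 1, 1, 0].
    print(out.round(2).ravel())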

	We are currently experimenting with the data used for testing
the expert system ESCA.  The expert system uses the percentages of
stenoses read on two different occasions in 23 coronary arterial
segments to come up with an overall assessment of the change in
atherosclerotic disease on an 8-point scale.  The expert system is
accurate within one point in 92% of the 200 cases we tested it on.
Preliminary results for a neural net trained with the back-propagation
algorithm are that it is within one point in 69% of 75 test cases
after training on 125 different cases.  The performance of the net
can be changed by changing the setup of the network and modifying its
convergence parameters.  To get a feel for how good this performance
is compared with that of statistical techniques, we are applying
multiple regression analysis to the same data.
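
The "within one point" figure is simply the fraction of cases whose
predicted score differs from the expert reading by at most one point
on the 8-point scale; as a sketch (the function and variable names
below are hypothetical):

    def within_one_accuracy(predicted, actual):
        # A prediction is a hit if it lands within one point of
        # the expert's 8-point assessment.
        hits = sum(1 for p, a in zip(predicted, actual)
                   if abs(p - a) <= 1)
        return hits / len(actual)

    # 69% of 75 test cases corresponds to about 52 hits:
    # within_one_accuracy(net_scores, expert_scores) -> 0.69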

	There are several neural net models, and it is possible that
other models, such as Kohonen's linear associative networks or a
Boltzmann machine that has been trained on the test set without
the output value, will do better than the back-propagation model.  At
POSCH we have data relating to various tests and diseases of numeric
and non-numeric (yes/no, one of several classes) nature.  We plan to
test different neural-net models on different sets of data and
compare their performance with that of multiple linear regression.
In this way we shall gain an empirical understanding of the potential
of neural nets as a statistical tool.

ACKNOWLEDGEMENTS

Discussions with John Matts clarified the statistical issues involved
in analyzing POSCH data.
	
REFERENCES

[1] J M Long, J R Slagle, M Wick, E Irani, J Matts and the POSCH group,
"Use of expert systems in medical research data analysis: The POSCH AI
project," National Computer Conference, Vol. 56, pp. 769 - 776, 1987.

[2] M G Walker, and R L Blum, "Towards Automated Discovery from Clinical
Databases: The RADIX Project," pp. 32-36, Proceedings of the Fifth
Conference on Medical Informatics, Vol. 5, 1986.

[3] D E Rumelhart, J L McClelland and the PDP Research Group, "Parallel
Distributed Processing, Explorations in the Microstructure of
Cognition", Vols 1 and 2,  MIT Press, 1986.
-- 
My opinions don't represent those of my employers, but you can make
them your own for free.
Phone : (Work) (612) 782-7484                  (Home) (612) 378-2336 
ARPANET : irani@umn-cs.cs.umn.edu       UUCP: ..ihnp4!umn-cs!irani

 
From:	IN%"@RELAY.CS.NET:JimDay.Pasa@xerox.com" 12-FEB-1988 20:41
To:	mkkam@houston.csnet
Subj:	Re: Neural nets

I know very little about neural nets, but there is a book describing the
structure of the Connection Machine computer:

     The Connection Machine
     W. Daniel Hillis.
     MIT Press.  1986.
 
 
From:	IN%"wyle@solaris.ifi.ethz.ch" 15-FEB-1988 21:38
To:	mkkam%houston.edu@RELAY.CS.NET
Subj:	re: Who's doing What in neural net research?

Although I just dally in the field, the leader of our local
group, Tony Bell (tony@solaris.uucp), is forming a group drawn
from both this school and some others in this city.  Contact him
for details.

I posted to usenet a delta-rule (perceptron) system written
in Modula-2.  The response has been underwhelming.  If you
have not already done so, you should contact Dave Touretzky
at CMU and ask for his mailing list of connectionists.  I
was on it for a short time, but he pruned me off :-(.
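
For anyone who has not met the delta rule, a minimal sketch (in
Python here rather than the Modula-2 of the posted system; the AND
task in the usage lines is an arbitrary example):

    # Delta-rule training of a single threshold unit: nudge each
    # weight in the direction that reduces the output error.
    def train_delta_rule(samples, lr=0.1, epochs=100):
        n = len(samples[0][0])
        w, b = [0.0] * n, 0.0
        for _ in range(epochs):
            for x, t in samples:       # t is the 0/1 target
                s = sum(wi * xi for wi, xi in zip(w, x)) + b
                err = t - (1 if s > 0 else 0)
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err
        return w, b

    # Learn logical AND (linearly separable, so this converges).
    w, b = train_delta_rule([((0, 0), 0), ((0, 1), 0),
                             ((1, 0), 0), ((1, 1), 1)])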

You should contact Mike Gately (if you have not
already done so) about his list as well.

Finally, Rik Belew at UCSD has collected a bibliography of
nn research.

Ciao,

-Mitchell F. Wyle            wyle@ethz.uucp
Institut fuer Informatik     wyle%ifi.ethz.ch@relay.cs.net
ETH Zentrum                  
8092 Zuerich, Switzerland    +41 1 256-5237
 
From:	IN%"@RELAY.CS.NET:walters@cs.buffalo.edu" 17-FEB-1988 04:49
To:	mkkam@houston.csnet
Subj:	neural network research


I am doing research in neural networks, though on topics you
didn't mention; they are perhaps general enough to be of interest
to you.  One topic is a theoretical analysis of representations
in neural networks.  This work has appeared in the First International
Neural Network Conference, and will be presented at Snowbird this year.
The other aspect of my work concerns an application: the use of
neural networks in early vision.  I have been part of the DARPA
study group on neural networks and vision.

If you'd like any further information, feel free to contact me.

Deborah Walters
SUNY/Buffalo
 
 
From:	IN%"@RELAY.CS.NET:hgr001@pyr.gatech.edu" 26-FEB-1988 20:58
To:	mkkam@houston.csnet
Subj:	

Francis,

It's me again.  I haven't seen my partner lately, but I can refer you
to some good references.

1. Anything by Hopfield

2. The two-volume "Parallel Distributed Processing"

3. Join the INNS (International Neural Network Society). The premier
   issue (vol.1 no.1) is EXCELLENT. You can reach them at:

	Editor for INNS
	Stephen Grossberg
	Center for Adaptive Systems
	Mathematics Department
	Boston University
	111 Cummington Street
	Boston, MA  02215
	USA

Also, with reference to Andre deKorvin, I understand he is with the
"other" campus that is more involved with mathematics. Does this make
sense? If you see him, tell him I said hi.

-harvey
 
From:	IN%"@RELAY.CS.NET:mcvax!pi1!hjm@uunet.uu.net" 26-FEB-1988 21:00
To:	"cernvax! mkkam@houston.edu"@uunet.uu.net
Subj:	Re: Who's doing what in NNs. 

Francis,

   I read your request to find out who's doing what in NNs some time ago, but
didn't have time to reply until now.

   I'm a Dutch computer science student, doing my practical year.
During this year I have to work in two different places, each for
approximately half a year.  Currently I'm working in a small Swiss
software house.

   However, for the last half year I worked at the Centre for Speech
Technology Research at the University of Edinburgh (Scotland).

   There I worked with neural-net simulators, and wrote some, under
the supervision of Dr. Richard Rohwer.  The last thing we tried to
implement was a couple of ideas for merging and splitting nodes,
given certain conditions and rules.  What we want is a system that,
while training, creates and deletes hidden nodes, and eventually
yields a network with the *optimum* number of hidden nodes.  We test
our ideas on a strictly layered feed-forward network, using the
backprop training algorithm.
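
The letter does not spell out the merge/split conditions, so purely
as an illustration of the general grow-and-prune idea (the trigger
criteria and the methods on the hypothetical net object below are
invented for this sketch):

    # Grow-and-prune training loop: add a hidden node when the error
    # stalls; delete a node whose outgoing weights have decayed to
    # (nearly) zero.  Both criteria are illustrative guesses.
    def train_with_growth(net, data, epochs=1000, stall=50, eps=1e-3):
        best_err, since_improved = float("inf"), 0
        for _ in range(epochs):
            err = net.backprop_epoch(data)  # one plain backprop pass
            if err < best_err - 1e-6:
                best_err, since_improved = err, 0
            else:
                since_improved += 1
            if since_improved >= stall:     # stalled: create a node
                net.add_hidden_node()
                since_improved = 0
            for node in list(net.hidden_nodes()):
                if node.max_outgoing_weight() < eps:
                    net.remove_hidden_node(node)  # contributes nothing
        return net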

  As you will by now have an idea of 'who's doing what in NNs', I'd
like to know from you whether there are other people working on this
idea, and if so, whether they have results yet.  Our results are
reasonable, but (even though I'm not working in Edinburgh anymore)
this is on-going research, and we hope to get better results in the
near future.


						--Erik Meinders.

						erik@paninfo
						erik@eusip
						hjm@pi1

						And if that all fails:

						Erik Meinders
						Haldenstrasse 6
						8600 Dubendorf
						Switzerland.


 
From:	IN%"@RELAY.CS.NET:lynne@mimsy.umd.edu"  3-MAR-1988 20:07
To:	mkkam@houston.csnet
Subj:	re: request for info on neural net research


A group of us at the University of Maryland have developed a
general-purpose simulator for developing and evaluating connectionist
(neural) models.  Our system, called MIRRORS/II, is written in Franz
LISP and runs on many machines, including VAXen and Suns.

I'm sending you through the U.S. Mail a paper which describes our
system in more detail.

Lynne D'Autrechy
University of Maryland
Computer Science Department

lynne@mimsy.umd.edu
 
From:	IN%"franke@irl1"  "Hubertus Franke" 29-JUN-1988 20:25
To:	mkkam@houston.CSNET
Subj:	Information Request


Dear Francis Kam!

I am currently studying general implementation issues for neural
networks in a truly distributed and parallel environment.  I saw your
request from February in the Neuron Digest!  Did you get any
responses from NN folks?  If so, could you share them with me?  I am
basically interested in information concerning your points:

 1)  Neural net model (or PDP model) as a general model of 
     parallel computation...

 2)  Neural net programming environment -- languages ,....


If you could send me some papers, paper references, or addresses to
contact, I would be really glad.


Thanks in advance



Hubertus Franke
Center for Intelligent Systems
Vanderbilt University
P.O. 1804, Station B
Nashville, TN 37235
--------------------------<< Cut here >>--------------------------