[comp.ai.neural-nets] Supervised ART Model

reynolds@park.bu.edu (John Reynolds) (05/15/91)

The following note appeared in Volume 7, Issue 15 of Neuron-Digest:

>I am looking for articles on the application of ART in supervised
>                                                       ==========
>learning situations. Can anyone help?

>Thanks.

>Kok Wee Gan
>Department of Information Systems and Computer Science
>National University of Singapore
>bitnet address: gankw@nusdiscs.bitnet

>[[Editor's Note: Perhaps someone from Boston U. could answer in a future
>Digest? I thought ART was, strictly speaking, unsupervised only. -PM]]

Gail Carpenter, Stephen Grossberg and I have recently introduced a
supervised ART system, called ARTMAP, that autonomously learns to
classify arbitrarily many, arbitrarily ordered vectors into
recognition categories based on predictive success.  Tested on a
benchmark machine learning database in both on-line and off-line
simulations, the ARTMAP system learns orders of magnitude more
quickly, efficiently, and accurately than alternative algorithms.  It
achieves these properties by using an internal controller that
conjointly maximizes predictive generalization and minimizes
predictive error by linking predictive success to category size on a
trial-by-trial basis, using only local operations.
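For anyone who wants a concrete feel for that controller, here is a
rough Python sketch of the match-tracking idea: when the category
chosen for an input makes the wrong prediction, vigilance is raised
just above the current match value, forcing the search to settle on,
or create, a finer category.  This is only an illustration under
simplifying assumptions (one simplified ART-style module, fast
learning, no complement coding, invented names and parameter values);
it is not the ARTMAP architecture itself, which couples two ART
modules through a map field.

# A minimal, illustrative sketch of the ARTMAP "match tracking" idea.
# This is NOT the published algorithm: it uses one simplified ART-style
# module, fast learning, and made-up parameter values, and it omits the
# second ART module and map field of the real architecture.

import numpy as np

class SimpleARTMAP:
    def __init__(self, n_features, base_vigilance=0.5, choice=0.001):
        self.n = n_features
        self.rho_base = base_vigilance   # baseline vigilance
        self.alpha = choice              # choice parameter
        self.weights = []                # one prototype vector per category
        self.labels = []                 # class predicted by each category

    def _match(self, x, w):
        # |x AND w| / |x|; element-wise min plays the role of AND
        return np.minimum(x, w).sum() / (x.sum() + 1e-9)

    def _choice(self, x, w):
        return np.minimum(x, w).sum() / (self.alpha + w.sum())

    def train_one(self, x, label):
        rho = self.rho_base              # vigilance resets on every trial
        order = sorted(range(len(self.weights)),
                       key=lambda j: self._choice(x, self.weights[j]),
                       reverse=True)
        for j in order:
            m = self._match(x, self.weights[j])
            if m < rho:
                continue                 # fails vigilance; keep searching
            if self.labels[j] == label:
                # correct prediction: fast learning updates the prototype
                self.weights[j] = np.minimum(x, self.weights[j])
                return j
            # wrong prediction: raise vigilance just above the current
            # match, so only a smaller (finer) category can now win
            rho = m + 1e-6
        # nothing both matched and predicted correctly: new category
        self.weights.append(x.copy())
        self.labels.append(label)
        return len(self.weights) - 1

    def predict(self, x):
        if not self.weights:
            return None
        j = max(range(len(self.weights)),
                key=lambda k: self._choice(x, self.weights[k]))
        return self.labels[j]

# Toy usage, purely for illustration:
net = SimpleARTMAP(n_features=4)
for x, y in [([1, 1, 0, 0], "A"), ([0, 0, 1, 1], "B"), ([1, 0, 1, 0], "A")]:
    net.train_one(np.array(x, dtype=float), y)
print(net.predict(np.array([1, 1, 0, 0], dtype=float)))   # prints: A

In the full system, this same trial-by-trial vigilance adjustment is
what links predictive error to category size using only local
operations.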

ARTMAP was presented last week at the Wang Institute Conference on
Neural Networks for Vision and Image Processing and will also be
presented at the upcoming IJCNN meeting (Lecture, Friday, July 12,
Session 2, 9:10-9:30 AM).  The paper will appear in Neural Networks
(volume 4, in press) and is now available as Technical Report
CAS/CNS-TR-91-001.

To request a copy of the technical report, write to:

	Boston University Center for Adaptive Systems
	and Cognitive and Neural Systems Department
	111 Cummington Street, Rm. 244
	Boston, MA 02215

or contact Cindy Suchta (cindy@park.bu.edu).

-John Reynolds