manning@gap.caltech.edu (Evan Marshall Manning) (07/18/90)
jgk@osc.COM (Joe Keane) writes:
>Don't get me wrong, i think neural nets are very interesting, and they have
>produced good results in some areas.  But i see them being used where more
>mundane methods would work quite well, and probably much faster.
>It seems like NN is the newest trick, so people want to use it everywhere.
>But in the process they don't hear about the old things, which is too bad.  Is
>it just me, or are others bothered by this trend?

It did perturb me when a former boss announced that we'd be trying to use
NNs in every aspect of our work, despite his ignorance of what NNs were.
But the truth is that it really was worth a try.  As long as one doesn't
forget the older techniques (or takes the trouble to learn them) and
honestly compares results, no ill can come of the exercise.

I freely admit that I have not always done well by the above standards, by
the way.  But I'm feeling *much* better now. :-)

No, what really ticks me off are those who repackage statistical techniques
as neural nets.  Of course the definitions are ill-defined, but...  I don't
want to name names, but let's just say that Nestor's product might be ideal
for those who feel pressured to use this new trick but don't really trust
anything without a sound statistical basis.

-- 
                                 Evan
***************************************************************************
Your eyes are weary from staring at the CRT for so |       Evan M. Manning
long.  You feel sleepy.  Notice how restful it is  |            is
to watch the cursor blink.  Close your eyes.  The  |manning@gap.cco.caltech.edu
opinions stated above are yours.  You cannot       | manning@mars.jpl.nasa.gov
imagine why you ever felt otherwise.               | gleeper@tybalt.caltech.edu