rgg@munnari.oz.au (Rupert Graham Goldie) (09/17/90)

I have been trying to train a neural network to add two 4-bit numbers, with little success, and I was wondering if anyone else has tried this and succeeded. I first tried a binary representation for the numbers; when that got me nowhere, I tried representing each number by the number of neurons activated (i.e. 2 is represented by 1.0 1.0 0.0 .... 0.0). This representation wasn't much more successful.

The problem has 256 possible example patterns, so I have tried training sets of between 32 and 256 patterns. With up to about 60 patterns I was able to get the net to converge, but the resulting network doesn't generalize well to examples outside the training set. With larger sets I have trained for up to 15000 epochs, but the total error just oscillates and doesn't approach zero.

I have been using between 10 and 80 hidden units with quickprop, and have also tried cascade-correlation with up to 160 new nodes; the latter has been spectacularly unsuccessful. With quickprop I mainly use one hidden layer, but I have also tried more, again with little success.

If there is any interest I could post the short C program I use to generate the training sets so that others can try this too.

Thanks,
Rupert
----
Rupert G. Goldie                            rgg@munmurra.cs.mu.OZ.AU
Computer Science Honours Student, University of Melbourne
"necessity is a mother"
I have been trying to train a neural network to add two 4 bit numbers with little success and I was wondering if anyone else has tried to do this and succeeded. I first tried using binary representation for the numbers and with little success with that I then tried representing the numbers by the number of neurons activated (ie 2 is represented by 1.0 1.0 0.0 .... 0.0). This representation wasn't much more successful. This problem set has 256 example patterns and so I have tried training with 32 to 256 patterns in the training set. With up to about 60 patterns I was able to get the net to converge but the resulting network doesn't generalize very well to non-training set examples. I have tried training with larger sets for up to 15000 epochs but the total error just appears to oscillate and not to be approaching zero. I have been using between 10 and 80 hidden units with quickprop and have tried cascade-correlation with upto 160 new nodes. The latter has been spectacularly unsuccessful. With quickprop I mainly have one hidden layer but have also tried using more, also with little success. If there is any interest I could post the short C program I use to generate the training sets so that others can try this too. Thanks, Rupert ---- Rupert G. Goldie rgg@munmurra.cs.mu.OZ.AU Computer Science Honours Student, University of Melbourne "necessity is a mother"