lowe@cs.ubc.ca (David Lowe) (03/18/91)
I am sure that many people in this newsgroup would be interested in the announcement in comp.newprod of a new neural network computer from a company called Adaptive Solutions. It was posted on March 5, and it sounds almost too good to be true.

They claim that for $55,000 you will get a 256-processor SIMD machine designed to do back-prop and a range of other types of network training. They claim a pretty amazing speed of 1 billion connections per second for full back-prop learning (100 times the speed of a Cray 2), so that you could train NETtalk in 6 seconds instead of the 4 hours needed on a SPARCstation. If this is true, it will have a very large impact on most people's perception of the slowness of back-prop and lead to its use on much larger problems. A 3-orders-of-magnitude improvement in price/performance over what most people are using now would be the practical equivalent of discovering some amazing new convergence method.

They claim that the machine will be available for beta testing this summer and for volume shipping by the end of this year. Since this is a marketing announcement, does anyone else have more information to post on possible limitations? Will the machine be able to handle very large networks and data sets? (One suspicious omission was any mention of the amount of memory.)

--
David Lowe (lowe@cs.ubc.ca)
Computer Science Dept., University of British Columbia
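
[Editorial addendum: a quick back-of-the-envelope check of how the announced figures hang together. This is only a rough sketch; the 203-80-26 NETtalk topology (roughly 18,000 weights) is an assumption on my part, not a figure from the announcement. The 1 billion connections/second, 6 seconds, and 4 hours are taken from the post above.]

    # Rough sanity check of the claimed figures (Python).
    # NOTE: the NETtalk topology below is an assumption, not stated in the announcement.
    claimed_cps = 1e9                     # claimed connection updates/second for full back-prop
    nettalk_weights = 203 * 80 + 80 * 26  # ~18,300 weights for an assumed 203-80-26 network
    claimed_time = 6.0                    # seconds claimed to train NETtalk
    sparc_time = 4 * 3600.0               # 4 hours quoted for a SPARCstation

    total_updates = claimed_cps * claimed_time       # connection updates implied by the claim: 6e9
    presentations = total_updates / nettalk_weights  # ~330,000 pattern presentations
    sparc_cps = total_updates / sparc_time           # implied SPARCstation rate: ~400,000 updates/s
    speedup = sparc_time / claimed_time              # raw time ratio: 2400x

    print(presentations, sparc_cps, speedup)

The implied SPARCstation rate of a few hundred thousand connection updates per second seems plausible for that class of workstation, and the 2400x time ratio is in line with the claimed 3 orders of magnitude, so the numbers in the announcement are at least self-consistent.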