[comp.ai.neural-nets] State of the art continued...

summer@trl.oz.au (Mark Summerfield) (06/20/91)

In article <1991Jun19.024201.2246@trl.oz.au> I wrote:
|> Hello all,
|> 
|> Sorry to ask such an inane question, but I have been delegated the task
|> of reporting to my supervisor on "the state of the art in neural networks".

Well, I have received a couple of replies (thanks to those who responded)
and now have a better idea of the sort of things I'd need to know.
For example:

	- Which way is the current implementation trend heading, analog
		or digital?
	- What is the state of the art in VLSI implementation? What speeds,
		and how much complexity (synapses/chip, for example)?
	- What technology is being used to achieve this? CMOS? BiCMOS?
		What feature size (0.8 micron?)
	- What sorts of structures are used in these implementations?
		Arrays of processors? Something else?
	- What kinds of applications are such chips being used in?

If you are working in this area and can provide me with concrete data
and examples on the above topics, it would be very much appreciated.

Thanks again,

Mark
-- 
            +---------------------------------------------------+
            |  Mark Summerfield, Telecom Research Laboratories  |
            | ACSnet[AARN/Internet]:  m.summerfield@trl.oz[.au] |
            |      Snail: PO Box 249, Clayton, Vic., 3168       |
            +---------------------------------------------------+
  Everything you know is wrong -- but some of it is a useful approximation!