[comp.ai.neural-nets] Maximum Storage Capacity of a Net

mde@ecs.soton.ac.uk (Martin Emmerson) (11/24/89)

Can anybody supply me with references or other information
concerning the theoretical maximum storage capacity of
a neural network?  I need to know how many different
inputs a network can distinguish, given the size
of its hidden layer (for a backprop net with a single
hidden layer).

Thank you.
Martin Emmerson.  (Researching image recognition and fault
tolerance at Southampton University).

bwk@mbunix.mitre.org (Kort) (12/02/89)

In article <1800@ecs.soton.ac.uk> mde@ecs.soton.ac.uk
(Martin Emmerson) writes:

 > Can anybody supply me with references or other information
 > concerning the theoretical maximum storage capacity of
 > a neural network?  I need to know how many different
 > inputs a network can distinguish, given the size
 > of its hidden layer (for a backprop net with a single
 > hidden layer).

The connection weights of a neural net are typically stored at
low precision (about 8 bits each), so a rough upper bound on the
information held in the weight matrix is 8 bits times the number
of connections.
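As a back-of-the-envelope sketch (Python, purely illustrative;
the 8-bit figure is the assumption above, and the 64-16-10 layer
sizes are made up for the example):

    # Rough upper bound on information held in the weight matrix:
    # bits per weight times number of connections (biases ignored).
    def weight_capacity_bits(n_in, n_hidden, n_out, bits_per_weight=8):
        n_connections = n_in * n_hidden + n_hidden * n_out
        return bits_per_weight * n_connections

    print(weight_capacity_bits(64, 16, 10))  # 8 * (1024 + 160) = 9472 bits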

For a 3-stage net with M neurons in the hidden layer, there are
2^M possible hidden-layer "states" if you think of the neurons as
2-state devices, and this is an upper bound on the number of
distinguishable patterns that could be classified.

If you think of neurons as producing analog signal levels with
8 bits of precision, you then have 256 states per neuron and
256^M theoretical states for the whole net.  (Continuous
classification makes sense if you are recognizing colors or
audio frequencies or other continuously graded stimuli.)
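
Counting states the same way (again just a sketch; M = 16 and the
8-bit precision are example assumptions):

    # Number of distinguishable hidden-layer states.
    def hidden_states(m_hidden, levels_per_neuron=2):
        # 2 levels   -> binary neurons,     2^M states
        # 256 levels -> 8-bit analog units, 256^M states
        return levels_per_neuron ** m_hidden

    print(hidden_states(16))       # 2^16 = 65536
    print(hidden_states(16, 256))  # 256^16 = 2^128, astronomically large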

--Barry Kort