[comp.ai.neural-nets] continuous vs discrete values for weights

ssingh@watserv1.waterloo.edu (The Sanj-Machine aka Ice) (02/02/91)

Could someone tell me whether there is any significant difference in the
properties of neural networks whose connection strengths take values from a
finite set, as opposed to continuous values? Which is more biologically
accurate?

I always thought that neurons assume one of a finite set of strengths. It
is just that it is a very large set, so from our vantage point it
appears continuous. I would like to explore the dynamical properties of
nonlinear neural networks, so this is important.
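If you want to poke at this numerically, here is one way (a toy sketch in
modern Python; the network, the quantizer, and all the names are invented
here, not taken from any particular paper): store a few patterns in a small
Hopfield-style net, round the weights to a finite set of levels, and see
whether the recall dynamics change.

```python
# Toy sketch: dynamics of a small Hopfield-style net with continuous
# weights vs. the same weights rounded to a finite set of levels.
# Everything here is invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def hopfield_weights(patterns):
    # Hebbian outer-product rule with zero self-connections.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def quantize(W, levels):
    # Map each weight to the nearest of `levels` evenly spaced values.
    lo, hi = W.min(), W.max()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((W - lo) / step) * step

def run(W, s, steps=20):
    # Synchronous sign (threshold) dynamics.
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

patterns = np.sign(rng.standard_normal((3, 32)))   # 3 stored +/-1 patterns
W = hopfield_weights(patterns)

probe = patterns[0].copy()
probe[:4] *= -1                                    # corrupt a few bits

exact = run(W, probe)                  # continuous-valued weights
coarse = run(quantize(W, 16), probe)   # 16 weight levels ~ 4 bits

# Overlap with the stored pattern (32 = perfect recall).
print(int(exact @ patterns[0]), int(coarse @ patterns[0]))
```

In runs like this the attractor structure often survives fairly coarse
quantization, which is one concrete sense in which a "very large finite
set" of strengths can behave like a continuum.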

Thanks in advance for your time.


-- 
"No one had the guts... until now!"  
$anjay $ingh     Fire & "Ice"     ssingh@watserv1.[u]waterloo.{edu|cdn}/[ca]
ROBOTRON Hi-Score: 20 Million Points | A new level of (in)human throughput...
"The human race is inefficient and therefore must be destroyed."-Eugene Jarvis

rao@gabber.kodak.com (Arun Rao) (02/06/91)

In article <1991Feb2.001242.3473@watserv1.waterloo.edu>, ssingh@watserv1.waterloo.edu (The Sanj-Machine aka Ice) writes:
... [stuff deleted ]
|> 
|> I always thought that neurons assume one of a finite set of strengths. It
|> is just that it is a very large set, so from our vantage point it
|> appears continuous.
... [stuff deleted ]

	How large is very large?  It seems unlikely to me that neuron
	activation could have as much resolution as, say, even a typical
	binary floating-point representation.  I don't remember seeing any
	numbers, but I would tend to think that if a neural computational
	model needs more than 8 bits of resolution to work, its biological
	plausibility is suspect.

	This is not to say, of course, that biological plausibility should be
	the acid test in evaluating models, especially for application-oriented
	work.

	I'd be glad to hear about any experimental evidence that supports a
	considerably higher resolution in individual neuron activation.
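	One crude way to get a feel for what 8 bits buys you (a toy sketch
	in Python; the unit, the quantizer, and the numbers are all invented
	for illustration, not measurements of anything biological): round
	the weights of a single sigmoidal unit to b bits and see how much
	its output moves.

```python
# Toy sketch: output perturbation of one sigmoidal unit when its
# weights are squashed to b bits.  All names invented for illustration.
import numpy as np

rng = np.random.default_rng(1)

def quantize_bits(w, bits):
    # Uniform quantizer: 2**bits levels spanning the weight range.
    levels = 2 ** bits
    lo, hi = w.min(), w.max()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((w - lo) / step) * step

def unit_output(w, X):
    # Logistic activation of one "neuron" over a batch of inputs.
    return 1.0 / (1.0 + np.exp(-(X @ w)))

w = rng.standard_normal(100)          # 100 "synaptic strengths"
X = rng.standard_normal((1000, 100))  # random input vectors

y = unit_output(w, X)
for bits in (2, 4, 8):
    err = np.max(np.abs(unit_output(quantize_bits(w, bits), X) - y))
    print(f"{bits} bits: max output change {err:.4f}")
```

	In sketches like this the 8-bit curve is already very close to the
	full-precision one, which is at least consistent with the suspicion
	above that models demanding much more than 8 bits are doing
	something biologically questionable.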

	-Arun