[comp.ai.neural-nets] backpropagation

jgk@osc.COM (Joe Keane) (11/29/90)

In article <4916@trantor.harris-atd.com> mlaprade@x102a.ess.harris.com
(laprade maria 42641) writes:
>In my neural nets class we were assigned to build a BPP net to approximate
>the function z = sin(2PIx)sin(2PIy) for
>0<=x<=1 and 0<=y<=1.

This is a fairly difficult function, basically a sinusoidal version of XOR.
It can also be expressed as z = 1/2*(cos(2*pi*(x-y))-cos(2*pi*(x+y))).
Hopefully the nets will learn to use the sum and difference.
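A quick numeric check of that product-to-sum form, just to be sure the identity is stated right (a throwaway sketch, function names my own):

```python
import math
import random

def z(x, y):
    # the assigned target function
    return math.sin(2 * math.pi * x) * math.sin(2 * math.pi * y)

def z_sumdiff(x, y):
    # product-to-sum form: 1/2 * (cos(2*pi*(x-y)) - cos(2*pi*(x+y)))
    return 0.5 * (math.cos(2 * math.pi * (x - y))
                  - math.cos(2 * math.pi * (x + y)))

random.seed(0)
for _ in range(1000):
    x, y = random.random(), random.random()
    assert abs(z(x, y) - z_sumdiff(x, y)) < 1e-12
print("identity holds on 1000 random points")
```

So a net that internally forms x-y and x+y only has to learn two one-dimensional cosines instead of a genuinely two-dimensional surface.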

>Weights were to be initialized to random values between +-0.1.

I'd say these weights are too small.  You need to give the net a large amount
of asymmetry to start with, or it will tend to converge to zero.
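Here's a rough NumPy sketch of the point (the layer sizes and the uniform init are my own choices, not from the assignment): with weights in +-0.1, tanh hidden units sit in the nearly linear part of their range and all look alike, so there is little asymmetry for backprop to amplify.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden_acts(scale, n_hidden=20, n_samples=500):
    # one tanh hidden layer; weights and biases uniform in [-scale, scale]
    W = rng.uniform(-scale, scale, size=(2, n_hidden))
    b = rng.uniform(-scale, scale, size=n_hidden)
    X = rng.uniform(0.0, 1.0, size=(n_samples, 2))
    return np.tanh(X @ W + b)

small = hidden_acts(0.1)   # the assigned +-0.1 init
large = hidden_acts(1.0)   # a larger init for comparison

# with the tiny init, hidden outputs barely vary across units and inputs
print("spread of hidden outputs, scale 0.1:", float(small.std()))
print("spread of hidden outputs, scale 1.0:", float(large.std()))
```

With the +-0.1 init the hidden-unit outputs are nearly flat, so the early gradients are tiny and nearly identical across units; a larger init breaks that symmetry from the start.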