[comp.ai.neural-nets] Ask Kolmogorov's paper info

dli@uceng.UC.EDU (dapeng li) (07/15/90)

Hi, netters,

	Can anyone here give me the reference information for the ENGLISH
VERSION of Kolmogorov's paper "On the representation of continuous functions
of many variables by superposition of continuous functions of one variable
and addition"?

	Also, if you have any information about the development of
Kolmogorov's work in relation to back-propagation networks, I would like
to know (papers on this topic).

	I will post the collected information on the net.

	Thanks for your help in advance.

						Dapeng Li
						University of Cincinnati

baindur@jhunix.HCF.JHU.EDU (Satyen G Baindur) (07/16/90)

In article <5458@uceng.UC.EDU> dli@uceng.UC.EDU (dapeng li) writes:

>	Can anyone here give me the reference information for the ENGLISH
>VERSION of Kolmogorov's paper "On the representation of continuous functions
>of many variables by superposition of continuous functions of one variable
>and addition"?
>
>	Also, if you have any information about the development of
>Kolmogorov's work in relation to back-propagation networks, I would like
>to know (papers on this topic).
>
>						Dapeng Li


The reference is 

Kolmogorov, A.N., "On the representation of continuous functions of 
many variables by superposition of continuous functions of one variable 
and addition," American Mathematical Society Translations, Series 2, 
Volume 28, pp. 55-59, 1963. 

It should be read in conjunction with the Girosi and Poggio paper in 
Neural Computation, Vol. 1, No. 4, pp. 465-469. They point out that the 
theorem is irrelevant for network learning because it can require 
even differentiable functions to have a representation in terms of 
"wildly" behaving inner functions. 
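
For readers without the paper at hand, the theorem (as usually stated)
says that any continuous function of n variables on the unit cube has an
exact representation

```latex
f(x_1,\ldots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \psi_{pq}(x_p) \right)
```

where the Phi_q and psi_pq are continuous functions of a single variable,
and the inner functions psi_pq do not depend on f. Girosi and Poggio's
objection is exactly that these inner functions can be badly behaved
(highly non-smooth) even when f itself is differentiable.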

Then again, you may have read this already, and be asking the 
question because they give only the reference to the Russian original, 
[which is 
Dokl. Akad. Nauk SSSR, Vol. 114, pp. 953-956, 1957.] 

Satyen Baindur