[sci.electronics] Sigmoid transfer function

krishna@aecom.YU.EDU (Krishna Ambati) (08/05/88)

I am looking for a "black box" circuit that has the transfer
function:

Output voltage = 0.5 ( 1 + tanh ( Input voltage / "Gain" ) )

	       = 1 / ( 1 + exp ( -2 * Input voltage / "Gain" ) )

When plotted, this function looks like an elongated S

When IV = - infinity, OV = 0
When IV = + infinity, OV = 1
When IV = 0         , OV = 0.5

By the way, this question arose in connection with a neural network
problem.
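
For reference, the two algebraic forms above really are the same
function; a quick numerical check (a sketch, not part of the original
post) confirms this and the limiting values:

```python
import math

def sigmoid(v, gain=1.0):
    # Output voltage = 0.5 * (1 + tanh(v / gain))
    return 0.5 * (1.0 + math.tanh(v / gain))

def sigmoid_exp(v, gain=1.0):
    # Equivalent form: 1 / (1 + exp(-2v / gain))
    return 1.0 / (1.0 + math.exp(-2.0 * v / gain))

# The two forms agree, and OV -> 0, 0.5, 1 at IV = -inf, 0, +inf:
for v in (-50.0, -1.0, 0.0, 1.0, 50.0):
    assert abs(sigmoid(v) - sigmoid_exp(v)) < 1e-12
```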

Thanks.

Krishna Ambati
krishna@aecom.uucp

jbn@glacier.STANFORD.EDU (John B. Nagle) (08/06/88)

     Recognize that the transfer function in a neural network threshold unit
doesn't really have to be a sigmoid function.  It just has to look roughly
like one.  The behavior of the net is not all that sensitive to the 
exact form of that function.  It has to be continuous and monotonic, 
reasonably smooth, and rise rapidly in the middle of the working range.
The hyperbolic-tangent form of the transfer function is really just a
notational convenience.

     It would be a worthwhile exercise to come up with some other forms
of transfer function with roughly the same graph, but better matched to
hardware implementation.  How do real neurons do it?
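
The requirements listed above are easy to check on candidate shapes.
A sketch (illustrative functions, not from the thread) of three
"roughly sigmoid" candidates, all continuous, monotonic, 0.5 at zero,
and saturating at 0 and 1:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def piecewise_linear(x):
    # Cheap in hardware: a clamped straight line
    return min(1.0, max(0.0, 0.5 + 0.25 * x))

def atan_squash(x):
    # Arctangent, rescaled from (-pi/2, pi/2) to (0, 1)
    return 0.5 + math.atan(x) / math.pi

for f in (logistic, piecewise_linear, atan_squash):
    assert f(-100.0) <= 0.01
    assert abs(f(0.0) - 0.5) < 1e-12
    assert f(100.0) >= 0.99
```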

					John Nagle

ankleand@athena.mit.edu (Andy Karanicolas) (08/07/88)

In article <1945@aecom.YU.EDU> krishna@aecom.YU.EDU (Krishna Ambati) writes:
>
>I am looking for a "black box" circuit that has the transfer
>function:
>
>Output voltage = 0.5 ( 1 + tanh ( Input voltage / "Gain" ) )
>
>	       = 1 / ( 1 + exp ( -2 * Input voltage / "Gain" ) )
>
>When plotted, this function looks like an elongated S
>
>When IV = - infinity, OV = 0
>When IV = + infinity, OV = 1
>When IV = 0         , OV = 0.5
>
>By the way, this question arose in connection with a neural network
>problem.
>
>Thanks.
>
>Krishna Ambati
>krishna@aecom.uucp

The function you are looking for is not too difficult to synthesize from
a basic analog circuit building block; namely, a differential amplifier.
The accuracy of the circuit will depend on the matching of components, among
other things.  The differential amplifier is discussed in many textbooks
concerned with analog circuits (analog integrated circuits especially).

You can try:

Electronic Principles: Physics, Models, and Circuits, Gray and Searle, Wiley 1969.
Bipolar and MOS Analog Integrated Circuit Design, Grebene, Wiley 1984.
Analysis and Design of Analog Integrated Circuits, Gray and Meyer, Wiley 1984.

Unfortunately, drawing circuits on a text editor is a pain; I'll
attempt it if these or other books are not available or helpful.   
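
In fact, the large-signal transfer characteristic of an ideal bipolar
differential pair is exactly the requested function: the collector
current of one side is IC1 = IEE / (1 + exp(-Vd/VT)), i.e. the
original post's curve with "Gain" = 2*VT (about 52 mV at room
temperature); a load resistor then converts the current to a voltage.
A numerical sketch (example component values; base currents and the
Early effect are ignored):

```python
import math

VT = 0.02585           # thermal voltage kT/q at ~300 K, volts
IEE = 1e-3             # tail current, amps (example value)

def ic1(vd):
    # Collector current of one transistor vs. differential input vd
    return IEE / (1.0 + math.exp(-vd / VT))

# Normalized, this is exactly 0.5 * (1 + tanh(vd / (2 * VT))):
for vd in (-0.2, -0.05, 0.0, 0.05, 0.2):
    assert abs(ic1(vd) / IEE
               - 0.5 * (1.0 + math.tanh(vd / (2.0 * VT)))) < 1e-12
```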


Andy Karanicolas
Microsystems Technology Laboratory

ankleand@caf.mit.edu

munro@icsia.berkeley.edu (Paul Munro) (08/08/88)

In article <17615@glacier.STANFORD.EDU> jbn@glacier.UUCP (John B. Nagle) writes:
[JN]Recognize that the transfer function in a neural network threshold unit
[JN]doesn't really have to be a sigmoid function.  It just has to look roughly
[JN]like one.  The behavior of the net is not all that sensitive to the 
[JN]exact form of that function.  It has to be continuous and monotonic, 
[JN]reasonably smooth, and rise rapidly in the middle of the working range.
[JN]The hyperbolic-tangent form of the transfer function is really just a
[JN]notational convenience.
[JN]
[JN]   It would be a worthwhile exercise to come up with some other forms
[JN]of transfer function with roughly the same graph, but better matched to
[JN]hardware implementation.  How do real neurons do it?
[JN] 
[JN] 					John Nagle


Try this one :   f(x) = x / (1 + |x|)


It is continuous and differentiable:  

f'(x)  =  1 / (1 + |x|) ** 2   =   ( 1 - |f(x)| ) ** 2 .
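
Both claims check out numerically (a quick sketch, not part of the
original post; the derivative is compared against a central
difference):

```python
def f(x):
    # Munro's squashing function
    return x / (1.0 + abs(x))

def fprime(x):
    return 1.0 / (1.0 + abs(x)) ** 2

for x in (-3.0, -0.5, 0.0, 0.5, 3.0):
    h = 1e-6
    # Central-difference approximation of the derivative
    numeric = (f(x + h) - f(x - h)) / (2.0 * h)
    assert abs(numeric - fprime(x)) < 1e-5
    # The identity f'(x) = (1 - |f(x)|)**2
    assert abs(fprime(x) - (1.0 - abs(f(x))) ** 2) < 1e-12
```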

- Paul Munro

jbn@glacier.STANFORD.EDU (John B. Nagle) (08/08/88)

In article <25516@ucbvax.BERKELEY.EDU> munro@icsia.UUCP (Paul Munro) writes:
>
>Try this one :   f(x) = x / (1 + |x|)
>

      The graph looks OK, although some scaling is needed to make it comparable
to the sigmoid.  Someone should try it in one of the popular neural net
simulators and see how the results change.
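
The scaling is straightforward: 0.5 * (1 + f(x/gain)) maps Munro's
function onto the same (0, 1) range and midpoint as the tanh sigmoid
(a sketch; the `gain` parameter is an assumed knob analogous to the
original post's "Gain"):

```python
import math

def f(x):
    # Munro's squashing function
    return x / (1.0 + abs(x))

def squash(x, gain=1.0):
    # Rescaled to (0, 1), with squash(0) = 0.5 and limits 0 and 1
    return 0.5 * (1.0 + f(x / gain))

def sigmoid(x, gain=1.0):
    return 0.5 * (1.0 + math.tanh(x / gain))

# Same endpoints and midpoint, but the rational form approaches its
# limits only polynomially fast, where tanh saturates exponentially:
for x in (-10.0, -1.0, 0.0, 1.0, 10.0):
    print(x, round(sigmoid(x), 4), round(squash(x), 4))
```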

					John Nagle