Bipolar Sigmoid vs Tanh Activation Functions

saumitrajagdale profile image Saumitra Jagdale ・1 min read

Activation functions play an important role in determining the output of a neural network. The aggregated value computed by the network is fed into the activation function to produce the output value.

Input ---> Neural Network (Computation and Aggregation) ---> Activation Function ---> Output

Bipolar sigmoid and tanh (hyperbolic tangent) are continuous activation functions that give a gradual output value in the range [-1, 1]. The shapes of the two graphs look similar, but they are not identical: the tanh curve is steeper than the bipolar sigmoid.

import numpy as np

# Sample input values
x = np.linspace(-5, 5, 100)

# Bipolar sigmoid: (e^x - 1) / (e^x + 1)
y1 = (np.exp(x) - 1) / (np.exp(x) + 1)

# Tanh: (e^(2x) - 1) / (e^(2x) + 1)
y2 = (np.exp(2 * x) - 1) / (np.exp(2 * x) + 1)

Hence, it can be observed that tanh has the factor '2x' where the bipolar sigmoid has 'x'; in other words, tanh(x) equals the bipolar sigmoid evaluated at 2x. Differentiating both functions shows that tanh has twice the slope of the bipolar sigmoid at the origin (1 versus 0.5), which explains its steeper curve.
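As a quick check of this claim, the slope of each function at x = 0 can be estimated numerically with a central difference. The helper function names below are chosen for illustration; only NumPy is assumed:

```python
import numpy as np

# Bipolar sigmoid: (e^x - 1) / (e^x + 1)
def bipolar_sigmoid(x):
    return (np.exp(x) - 1) / (np.exp(x) + 1)

# Tanh via its exponential form: (e^(2x) - 1) / (e^(2x) + 1)
def tanh(x):
    return (np.exp(2 * x) - 1) / (np.exp(2 * x) + 1)

# Central-difference estimate of the derivative at x = 0
h = 1e-5
slope_bs = (bipolar_sigmoid(h) - bipolar_sigmoid(-h)) / (2 * h)
slope_tanh = (tanh(h) - tanh(-h)) / (2 * h)

print(slope_bs)    # ≈ 0.5
print(slope_tanh)  # ≈ 1.0
```

The estimates come out to roughly 0.5 for the bipolar sigmoid and 1.0 for tanh, matching the factor-of-two difference in slope described above.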
