Activation functions play an important role in determining the output of a neural network. The aggregated value computed by the network is fed into the activation function to produce the output value.
Input ---> Neural Network (Computation and Aggregation) ---> Activation Function ---> Output
Bipolar sigmoid and tanh (hyperbolic tangent) are continuous activation functions that give a gradual output value in the range [-1, 1]. The two graphs look similar in shape, but they are not identical: the tanh curve is steeper than the bipolar sigmoid.
import numpy as np

x = np.linspace(-5, 5, 100)

# Bipolar sigmoid, equivalent to tanh(x/2)
y1 = (np.exp(x) - 1) / (np.exp(x) + 1)

# Tanh
y2 = (np.exp(2 * x) - 1) / (np.exp(2 * x) + 1)
Hence, it can be observed that tanh has a factor of 2x in the exponent where the bipolar sigmoid has x; in other words, tanh(x) is exactly the bipolar sigmoid evaluated at 2x. Taking the derivative of both functions, the chain rule brings down that factor of 2, so tanh has the larger derivative at the origin (slope 1 versus 0.5 for the bipolar sigmoid), which explains its steeper slope.
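As a quick numerical sketch of this relationship, the snippet below (an illustrative check, not part of the original text; the helper name bipolar_sigmoid is chosen here for clarity) confirms that tanh(x) matches the bipolar sigmoid at 2x and compares their slopes at the origin using a small central difference:

```python
import numpy as np

def bipolar_sigmoid(x):
    # Bipolar sigmoid: (e^x - 1) / (e^x + 1), equivalent to tanh(x/2)
    return (np.exp(x) - 1) / (np.exp(x) + 1)

x = np.linspace(-5, 5, 201)

# tanh(x) is the bipolar sigmoid evaluated at 2x
assert np.allclose(np.tanh(x), bipolar_sigmoid(2 * x))

# Estimate the slope of each function at x = 0 with a central difference
h = 1e-5
slope_tanh = (np.tanh(h) - np.tanh(-h)) / (2 * h)
slope_bipolar = (bipolar_sigmoid(h) - bipolar_sigmoid(-h)) / (2 * h)

print(slope_tanh)     # close to 1.0
print(slope_bipolar)  # close to 0.5
```

The two printed slopes make the steepness claim concrete: tanh rises twice as fast through the origin as the bipolar sigmoid.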