Tanh is a rescaled and shifted sigmoid: tanh(x) = 2 · sigmoid(2x) − 1. Its gradient is stronger than the sigmoid's, that is, its derivatives are steeper: the maximum slope of tanh is 1 (at x = 0), versus 0.25 for the sigmoid.
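A minimal sketch of this relationship, using only the Python standard library: it builds tanh from the sigmoid via the identity above and compares the two derivatives at zero.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh_via_sigmoid(x):
    # tanh(x) = 2 * sigmoid(2x) - 1
    return 2.0 * sigmoid(2.0 * x) - 1.0

def sigmoid_grad(x):
    # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)); peaks at 0.25 when x = 0
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    # tanh'(x) = 1 - tanh(x)^2; peaks at 1.0 when x = 0
    t = math.tanh(x)
    return 1.0 - t * t
```

At x = 0 the tanh gradient is four times the sigmoid gradient, which is what "steeper derivatives" means in practice.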
Whether to use sigmoid or tanh depends on how strong you need the gradient to be. As long as the activations of the network stay small, tanh closely resembles a linear function, which makes the behavior of a tanh network closer to that of a simple linear model and easier to reason about.