Tanh is a scaled and shifted sigmoid: tanh(x) = 2 · sigmoid(2x) − 1. The gradient is stronger for tanh than for sigmoid, that is, its derivative is steeper: tanh′(0) = 1, while sigmoid′(0) = 0.25.
Which of sigmoid or tanh to use depends on how strong you need the gradients to be. As long as the activations of the network can be kept small, tanh behaves almost like a linear function, since tanh(x) ≈ x near zero. This makes the behavior of a tanh network easier to reason about.
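A minimal sketch of these properties, using only the standard library (the function names `sigmoid`, `sigmoid_grad`, and `tanh_grad` are illustrative, not from any particular framework):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: maps x to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    """Derivative of sigmoid: s(x) * (1 - s(x)), peaks at 0.25."""
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    """Derivative of tanh: 1 - tanh(x)^2, peaks at 1.0."""
    t = math.tanh(x)
    return 1.0 - t * t

# tanh is a scaled, shifted sigmoid: tanh(x) = 2*sigmoid(2x) - 1
x = 0.5
assert abs(math.tanh(x) - (2 * sigmoid(2 * x) - 1)) < 1e-12

# The tanh gradient is steeper: at x = 0 it is 1.0 vs 0.25 for sigmoid
print(sigmoid_grad(0.0))  # 0.25
print(tanh_grad(0.0))     # 1.0

# Near zero, tanh is approximately linear: tanh(x) ~ x
print(math.tanh(0.1))
```

Comparing the two derivative values at the origin shows concretely why gradients flowing through tanh units are up to four times larger than through sigmoid units.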