The Swish activation function is intended as a straightforward replacement for the ubiquitous ReLU function.
The Swish function shares the strengths of the ReLU function, while adding smoothness and non-monotonicity. It has been benchmarked on a variety of tasks and was shown to consistently give better results.
It is a non-linear function that is easily expressed as the product of the identity function with the sigmoid function:

swish(x) = x · sigmoid(x) = x / (1 + e⁻ˣ)
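The product form above can be sketched in a few lines of NumPy; the `swish` and `sigmoid` function names here are illustrative, not part of any particular library's API:

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def swish(x):
    """Swish activation: the identity times the sigmoid, x * sigmoid(x)."""
    return x * sigmoid(x)

# Behaves like ReLU for large |x|: near-identity for large positive inputs,
# near-zero for large negative inputs, but smooth everywhere.
print(swish(np.array([-10.0, -1.0, 0.0, 1.0, 10.0])))
```

Note that unlike ReLU, Swish dips slightly below zero for moderately negative inputs (its minimum is around x ≈ -1.28) before approaching zero, which is what makes it non-monotonic.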