
ReLU

The ReLU (Rectified Linear Unit) is the go-to function for many neural networks since it is cheap to compute and still works well enough for many applications.

It is a non-linear function that returns the same output as its input if the input is above 0; otherwise the output is 0. That is,

  • Output = input, if the input is above 0

  • Output = 0, if the input is 0 or below

The ReLU function also helps with the problem of vanishing gradients in deep networks, since it does not saturate (squash its output) at both ends of its input range: the gradient is 1 for every positive input.

\[f(x) = \max(x, 0)\]
Figure 1. ReLU graph
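As a quick illustration, here is a minimal NumPy sketch of ReLU and its derivative (the function names and example values are our own, not part of any particular framework). Note how the gradient is exactly 1 for every positive input, which is why ReLU avoids the saturation that contributes to vanishing gradients.

```python
import numpy as np

def relu(x):
    """ReLU activation: element-wise max(x, 0)."""
    return np.maximum(x, 0)

def relu_grad(x):
    """Derivative of ReLU: 1 for positive inputs, 0 otherwise."""
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```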