ReLU

The ReLU (Rectified Linear Unit) is the go-to activation function for many neural networks, since it is cheap to compute and works well in practice.

It is a non-linear function that outputs the input unchanged when the input is greater than 0, and outputs 0 otherwise. That is,

  • Output = input, if input is greater than 0

  • Output = 0, if input is 0 or below

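As a rough illustration, here is a minimal NumPy sketch of this rule (the function name relu is just for illustration, not a specific library API):

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: keep positive values, zero out everything else
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# [0.  0.  0.  1.5 3. ]
```
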
The ReLU function also helps with the problem of vanishing gradients in deep networks, because it does not squash its output at both ends the way saturating activation functions do.

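One way to see this is through the derivative: for positive inputs the slope of ReLU is exactly 1, so gradients pass through unattenuated during backpropagation (at 0 the derivative is conventionally taken as 0). A small sketch under the same assumptions as above:

```python
import numpy as np

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, 0 otherwise
    return (x > 0).astype(x.dtype)

print(relu_grad(np.array([-2.0, 0.0, 1.5, 3.0])))
# [0. 0. 1. 1.]
```
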
Figure 1. Graph of the ReLU activation function.