The ReLU (Rectified Linear Unit) is the go-to activation function for many neural networks: it is cheap to compute and works well enough for a wide range of applications.
It is a non-linear function that passes its input through unchanged when the input is above 0, and outputs 0 otherwise. That is,

Output = input, if input is above 0
Output = 0, if input is 0 or below
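A minimal sketch of this definition in NumPy (the function name relu and the sample values are illustrative, not part of any particular library):

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: keep positive values, zero out everything else.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# [0.  0.  0.  1.5 3. ]
```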
The ReLU function also helps with the problem of vanishing gradients in deep networks because, unlike sigmoid or tanh, it does not squash its output at both ends: for any positive input, the gradient stays at 1 instead of shrinking toward 0.
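A short sketch of that gradient behaviour, assuming the same illustrative setup as above:

```python
import numpy as np

def relu_grad(x):
    # Gradient of ReLU: 1 for positive inputs, 0 otherwise,
    # so backpropagated gradients through active units are not attenuated.
    return (x > 0).astype(float)

print(relu_grad(np.array([-2.0, 0.0, 1.5])))
# [0. 0. 1.]
```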