The sigmoid function produces a smooth, non-linear curve that squashes incoming values into the range (0, 1). It works well as the output of a binary classifier, but it suffers from vanishing gradients for large input values; that is, y changes very slowly once x is large.
Example: if you have input values x of [1, 3, 10, 500, 10000, 10000000], y changes noticeably for the lower values but is essentially flat for the high ones. The information carried by the large values is therefore lost.
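The saturation described above can be seen directly by evaluating the sigmoid and its derivative on those sample inputs. Below is a minimal sketch (the function names `sigmoid` and `sigmoid_grad` are illustrative, not from any particular library):

```python
import math

def sigmoid(x):
    # Numerically stable logistic function: 1 / (1 + e^(-x))
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

def sigmoid_grad(x):
    # Derivative: sigma(x) * (1 - sigma(x)), which shrinks toward 0 as |x| grows
    s = sigmoid(x)
    return s * (1.0 - s)

for x in [1, 3, 10, 500, 10000, 10000000]:
    print(f"x={x:>9}  sigmoid={sigmoid(x):.6f}  grad={sigmoid_grad(x):.2e}")
```

For x = 1 the gradient is about 0.20, at x = 10 it has already fallen below 1e-4, and beyond that the output is indistinguishable from 1.0 in floating point, so the gradient is effectively zero and no learning signal flows back through the unit.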