# Hard sigmoid

Hard sigmoid is a piecewise linear approximation of the sigmoid activation function. A common form clamps the line `0.2 * x + 0.5` to the range [0, 1]. Because it uses only a multiply, an add, and a clamp, and avoids the exponential, it is cheaper to compute than the ordinary sigmoid.
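A minimal sketch of this approximation, using the `0.2 * x + 0.5` form clamped to [0, 1] (the slope and offset vary between frameworks):

```python
import numpy as np

def hard_sigmoid(x):
    # Piecewise linear approximation of sigmoid:
    # 0 for x <= -2.5, 1 for x >= 2.5, and 0.2 * x + 0.5 in between.
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

def sigmoid(x):
    # Ordinary sigmoid, for comparison.
    return 1.0 / (1.0 + np.exp(-x))

xs = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(hard_sigmoid(xs))
print(sigmoid(xs))
```

Both functions pass through 0.5 at `x = 0` and saturate at 0 and 1; the hard version simply reaches the saturation points exactly at `x = ±2.5` instead of approaching them asymptotically.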

Hard sigmoid is the default recurrent activation for LSTM layers in some frameworks (for example, earlier versions of Keras).