Activation

The Activation block performs the same operation as the activation function part of a layer of nodes: every input is passed through the selected function and output. A Dense layer with Linear activation, connected to an Activation layer set to ReLU, is equivalent to a Dense layer with ReLU activation.
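
A minimal sketch of that equivalence in Keras-style code (an assumption for illustration; the platform itself is graphical, and the layer names and arguments below are Keras's, not the platform's):

    import tensorflow as tf
    from tensorflow.keras import layers

    inputs = tf.keras.Input(shape=(32,))

    # Dense layer with Linear activation feeding a separate Activation layer set to ReLU
    x = layers.Dense(64, activation="linear")(inputs)
    out_a = layers.Activation("relu")(x)

    # Equivalent: Dense layer with ReLU activation built in
    out_b = layers.Dense(64, activation="relu")(inputs)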

This block is useful together with a regularization block such as batch normalization or dropout. In those cases, the regularization block is usually placed between the weighting block, e.g., Dense or Convolution, and the Activation block.
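
A hedged Keras-style sketch of that ordering (weighting block, then regularization, then activation); the layers below are an illustration under that assumption, not the platform's own implementation:

    import tensorflow as tf
    from tensorflow.keras import layers

    inputs = tf.keras.Input(shape=(32,))

    # Weighting block (Dense) with Linear activation
    x = layers.Dense(64, activation="linear")(inputs)

    # Regularization block placed between the weights and the activation
    x = layers.BatchNormalization()(x)

    # Activation block applied last
    outputs = layers.Activation("relu")(x)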

Parameters

Activation: The function used to transform each input.