Activation
The Activation block performs the same operation as the activation-function part of a layer of nodes: every input is passed through the selected function and sent to the output.
A Dense block with Linear activation, connected into an Activation block set to ReLU, is equivalent to a Dense block with ReLU activation.
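To relate this to code, here is a minimal sketch of that equivalence using Keras layers. The Keras API is an illustrative assumption; the blocks described here are not tied to any particular library, and the layer sizes are arbitrary.

```python
import tensorflow as tf

# A Dense block with ReLU activation in a single step ...
combined = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
])

# ... is equivalent to a Dense block with Linear (identity) activation
# followed by a separate Activation block set to ReLU.
split = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=None, input_shape=(32,)),
    tf.keras.layers.Activation("relu"),
])
```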
This block is useful together with a regularization block such as batch normalization or dropout. In those cases, the regularization block is usually placed between the weighting block, e.g., Dense or Convolution, and the Activation block.
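As a sketch of that ordering, again assuming Keras layers purely for illustration, batch normalization sits between the linear Dense layer and the standalone activation, so it operates on the pre-activation values:

```python
import tensorflow as tf

# Weighting block -> regularization block -> Activation block.
# Keeping the Dense layer linear lets batch normalization act on the
# pre-activation values before ReLU is applied.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=None, input_shape=(32,)),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Activation("relu"),
])
```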
Parameters
Activation: The activation function applied to each input.