The Activation block applies the selected activation function to every input it receives, performing the same operation as the activation-function part of a layer of nodes. For example, a Dense block with Linear activation, connected to an Activation block set to ReLU, is equivalent to a single Dense block with ReLU activation.
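The equivalence above can be checked numerically. This is a minimal NumPy sketch, not platform code; the weight shapes and the `relu` helper are illustrative assumptions:

```python
import numpy as np

def relu(x):
    # ReLU activation: zero out negative values
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))  # hypothetical dense-layer weights
b = rng.normal(size=3)       # hypothetical bias
x = rng.normal(size=4)       # a single input vector

# Dense with Linear activation, followed by a separate ReLU Activation block
linear_out = x @ W + b
separate = relu(linear_out)

# Dense with ReLU activation built in
fused = relu(x @ W + b)

# Both paths produce identical outputs
print(np.allclose(separate, fused))
```

Because Linear activation is the identity, splitting the activation into its own block changes nothing about the computed values.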
This block is useful when combined with a regularization block such as Batch normalization or Dropout. In those cases, one usually places the regularization block between the weighting block, e.g., Dense or Convolution, and the Activation block.
Activation: The function used to transform each input.
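The weighting → regularization → activation ordering can be sketched as follows. This is a hedged NumPy illustration of the data flow only, with an inference-free, training-style batch normalization (no learned scale/shift) as a stand-in for the platform's block:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def batch_norm(x, eps=1e-5):
    # Normalize each feature over the batch (training-time statistics,
    # learned gamma/beta omitted for brevity)
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 4))  # hypothetical batch of 8 inputs
W = rng.normal(size=(4, 3))
b = rng.normal(size=3)

pre_act = X @ W + b          # weighting block (Dense, Linear activation)
normalized = batch_norm(pre_act)  # regularization block in between
out = relu(normalized)       # Activation block applied last
```

Normalizing the pre-activation values keeps their distribution well-behaved before the nonlinearity is applied, which is why the regularization block sits between the two.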