
Activation

The Activation block performs the same operation as the activation function part of a layer of nodes: every input value is passed through the selected function and the result is output. A Dense layer with Linear activation, followed by an Activation block set to ReLU, is equivalent to a Dense layer with ReLU activation.
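As an illustration of this equivalence, here is a minimal Keras sketch (not Peltarion-specific; the layer sizes and input shape are made up). With identical weights, a linear Dense layer followed by a separate ReLU activation computes the same function as a Dense layer with ReLU built in:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Dense (linear) followed by a separate Activation block set to ReLU.
model_a = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(8, activation="linear"),
    layers.Activation("relu"),
])

# Dense with ReLU activation built in.
model_b = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(8, activation="relu"),
])

# With the same weights, the two models produce the same output.
model_b.set_weights(model_a.get_weights())
x = np.random.rand(1, 4).astype("float32")
assert np.allclose(model_a.predict(x), model_b.predict(x))
```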

This block is useful when using a regularization block such as batch normalization or dropout. In those cases, the regularization block is usually placed between the weighting block, e.g., Dense or Convolution, and the Activation block, as sketched below.
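A sketch of that ordering (again in Keras, purely illustrative, with made-up layer sizes): the batch normalization block sits between the linear Dense block and the Activation block.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Weighting block (Dense, linear) -> regularization (BatchNormalization) -> Activation (ReLU).
model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(8, activation="linear"),
    layers.BatchNormalization(),
    layers.Activation("relu"),
])
```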

Parameters

Activation: The function used to transform each input.
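
As a simple illustration of the element-wise transform (assuming NumPy; the input values are arbitrary), ReLU maps each negative value to zero and passes positive values through, while sigmoid squashes each value into (0, 1):

```python
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
relu_out = np.maximum(x, 0.0)            # ReLU: [0.0, 0.0, 0.0, 1.5, 3.0]
sigmoid_out = 1.0 / (1.0 + np.exp(-x))   # Sigmoid: each value mapped into (0, 1)
```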