The dense block represents a fully connected layer of artificial nodes.
Each of the nodes (as many as the Nodes attribute) has one weight per input feature, and its output is a function of its inputs according to the formula:

y = f(w · x + b) = f(∑ᵢ wᵢxᵢ + b)

where the sum inside the activation function f can be seen as the dot product of the weight vector w with the feature vector x, plus a bias term b.
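The computation above can be sketched in a few lines of NumPy. This is a minimal illustration of the formula, not the tool's actual implementation; the shapes, the tanh activation, and the random values are assumptions for the example.

```python
import numpy as np

def dense_layer(x, W, b, activation=np.tanh):
    # Each of the N nodes computes activation(w . x + b).
    # W has shape (N, P): one weight per input feature per node.
    return activation(W @ x + b)

rng = np.random.default_rng(0)
P, N = 4, 3                   # 4 input features, 3 nodes (arbitrary sizes)
W = rng.normal(size=(N, P))   # weight matrix
b = np.zeros(N)               # one bias per node
x = rng.normal(size=P)        # one input sample

y = dense_layer(x, W, b)
print(y.shape)  # (3,) -- one output per node
```

Note that `W @ x` computes all N dot products at once, which is exactly the "one weight per input feature, per node" structure described above.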
Dense blocks are the only node-based blocks in multilayer perceptrons, the simplest form of deep neural network.
If the input has more than one dimension of features (for example, an image has height, width, and channels), the data should be flattened with a flatten block before being fed into the nodes.
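Flattening simply reshapes the multi-dimensional input into a single feature vector. A minimal sketch, using an assumed 28×28 RGB image:

```python
import numpy as np

# A small image: height x width x channels (sizes chosen for illustration).
image = np.zeros((28, 28, 3))

# Flattening turns it into one feature vector the dense nodes can consume.
flat = image.reshape(-1)
print(flat.shape)  # (2352,) == 28 * 28 * 3
```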
A layer with N nodes and P input features has N*P weights, since every node is connected to every input feature. These layers therefore grow quickly in memory usage with the size of the input data, and are not well suited to data like images, where P = height x width x number of channels (at least one million for a standard-definition image). Convolutional layers take better advantage of the structure within image data and need far less memory.
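The N*P growth is easy to check with back-of-the-envelope arithmetic. The layer width and image size below are assumptions chosen only to show the scale:

```python
# Weight count of a fully connected layer: N nodes x P features,
# plus one bias per node.
def dense_params(n_nodes, n_features):
    return n_nodes * n_features + n_nodes

# A 640x480 RGB image flattened into a feature vector:
P = 640 * 480 * 3            # 921,600 features
print(dense_params(128, P))  # roughly 118 million parameters for just 128 nodes
```

Even a modest 128-node layer on a standard-definition image already needs over a hundred million weights, which is why convolutional layers are preferred for image inputs.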
Nodes: the number of nodes in the layer.
Initializer: the procedure used to set the initial values of the weights and the bias before starting training.
Default: Glorot uniform initialization
Activation: the function used to transform the output of the dot product inside the layer.
Trainable: whether the training algorithm is allowed to change the values of the weights during training. In some cases one will want to keep parts of the network static, for instance when using the encoder part of an autoencoder as preprocessing for another model.
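The attributes above can be sketched together in a toy NumPy class: Glorot uniform initialization for the weights, and a trainable flag that decides whether gradient updates are applied. This is a hypothetical illustration of the concepts, not the tool's implementation; the layer sizes, learning rate, and dummy gradients are assumptions.

```python
import numpy as np

class Dense:
    def __init__(self, n_features, n_nodes, trainable=True, seed=0):
        rng = np.random.default_rng(seed)
        # Glorot uniform: U(-limit, limit), limit = sqrt(6 / (fan_in + fan_out))
        limit = np.sqrt(6.0 / (n_features + n_nodes))
        self.W = rng.uniform(-limit, limit, size=(n_nodes, n_features))
        self.b = np.zeros(n_nodes)  # biases start at zero
        self.trainable = trainable

    def apply_gradients(self, dW, db, lr=0.01):
        # A frozen (non-trainable) layer silently keeps its weights.
        if self.trainable:
            self.W -= lr * dW
            self.b -= lr * db

frozen = Dense(4, 3, trainable=False)
before = frozen.W.copy()
frozen.apply_gradients(np.ones((3, 4)), np.ones(3))
print(np.allclose(frozen.W, before))  # True -- weights unchanged
```

Setting `trainable=False` this way is how a pretrained encoder can be reused as a fixed preprocessing stage while the rest of the model trains.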