DenseNet snippet

We recommend DenseNet for image-based tasks such as image classification, regression, and feature extraction.

On the Peltarion Platform there are two types of DenseNet: DenseNet 121 and DenseNet 169. DenseNet 169 has more layers and will therefore probably give better results, but it is slower to train and to get predictions from than DenseNet 121.
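Both variants follow the standard DenseNet architectures. For reference only (the platform itself is configured through the Modeling view, not code), a minimal Keras sketch comparing the two could look like this; the input shape and class count are placeholder values chosen for illustration:

```python
from tensorflow.keras.applications import DenseNet121, DenseNet169

# Placeholder input shape and class count; weights=None trains from scratch.
small = DenseNet121(weights=None, input_shape=(224, 224, 3), classes=10)
large = DenseNet169(weights=None, input_shape=(224, 224, 3), classes=10)

# DenseNet 169 has more layers, and therefore more parameters to train.
print(small.count_params())
print(large.count_params())
```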

DenseNet architecture

DenseNet expands on the skip connections that are common in, e.g., ResNet, but instead of summing the forwarded activation maps, it concatenates them all together.

DenseNet layers are very narrow, adding only a small set of activation maps to the “collective knowledge” of the network while keeping the remaining activation maps unchanged. The final classifier then makes its decision based on all activation maps in the network.
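To make the difference concrete, here is a minimal TensorFlow sketch (not part of the platform) contrasting ResNet-style summation with DenseNet-style concatenation; the tensor shapes are arbitrary example values:

```python
import tensorflow as tf

x = tf.random.normal((1, 8, 8, 16))  # activation maps forwarded from earlier layers
y = tf.random.normal((1, 8, 8, 16))  # new activation maps produced by a layer

# ResNet-style skip connection: element-wise sum, channel count unchanged.
summed = x + y                        # shape (1, 8, 8, 16)

# DenseNet-style skip connection: concatenation along the channel axis,
# so every forwarded activation map is kept unchanged.
stacked = tf.concat([x, y], axis=-1)  # shape (1, 8, 8, 32)
```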

Figure 1. DenseNet architecture overview

DenseNet “dense blocks”

A DenseNet “dense block” (not to be confused with the Dense block in the Modeling view, which is a fully connected neural network layer) is built from repeated composite functions whose outputs are concatenated with the forwarded activation maps via skip connections. A skip connection bypasses a layer and connects to the next available layer. Through these skip connections, each composite function adds only a small set of activation maps to the “collective knowledge” of the network, while the remaining activation maps are kept unchanged, as sketched below.
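A minimal functional-style sketch of such a dense block, assuming the bottleneck composite function (BN, ReLU, 1x1 convolution, BN, ReLU, 3x3 convolution) and a growth rate of 32 as in the original paper; these values come from the paper, not from the platform documentation:

```python
import tensorflow as tf
from tensorflow.keras import layers

def composite_function(x, growth_rate):
    # BN -> ReLU -> 1x1 conv (bottleneck) -> BN -> ReLU -> 3x3 conv
    y = layers.BatchNormalization()(x)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(4 * growth_rate, 1, use_bias=False)(y)
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(growth_rate, 3, padding="same", use_bias=False)(y)
    # The skip connection: concatenate the small set of new activation
    # maps with all forwarded ones, which stay unchanged.
    return layers.Concatenate()([x, y])

def dense_block(x, num_layers, growth_rate=32):
    for _ in range(num_layers):
        x = composite_function(x, growth_rate)
    return x

# Example: a 6-layer dense block on a placeholder 32x32 input.
inputs = tf.keras.Input((32, 32, 16))
outputs = dense_block(inputs, num_layers=6)
```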

Figure 2. A “dense block” in the DenseNet architecture

Transition layer

Downsampling blocks that change the size of the activation maps are an essential part of convolutional networks. In DenseNet, the layers between the “dense blocks” are called transition layers. They compress the number of activation maps with a 1x1 2D convolution block and reduce their spatial size with a 2D average pooling block.
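As a sketch, a transition layer could be written as follows; the compression factor of 0.5 is the value used in the original paper and is an assumption here, not a platform setting:

```python
from tensorflow.keras import layers

def transition_layer(x, compression=0.5):
    # 1x1 convolution compresses the number of activation maps...
    filters = int(int(x.shape[-1]) * compression)
    x = layers.BatchNormalization()(x)
    x = layers.Activation("relu")(x)
    x = layers.Conv2D(filters, 1, use_bias=False)(x)
    # ...and 2x2 average pooling halves their spatial size.
    x = layers.AveragePooling2D(pool_size=2, strides=2)(x)
    return x
```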

How to use the DenseNet snippet

To add a DenseNet snippet, open the Snippet section in the Inspector and click DenseNet.

The images in the dataset should be 32x32 pixels or larger.

For the optimizer, we recommend Adam with learning rate 0.001, or SGD with momentum and learning rate 0.01.
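For reference, the equivalent Keras optimizer settings are sketched below; the momentum value 0.9 is a common default and our assumption, since it is not specified here:

```python
import tensorflow as tf

adam = tf.keras.optimizers.Adam(learning_rate=0.001)
sgd = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)  # 0.9 is assumed
```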

Reference

Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger: “Densely Connected Convolutional Networks”, arXiv:1608.06993, 2016.
