Inception

Inception neural networks are used for image classification, regression and feature extraction.

How to use the Inception pretrained block

To add an Inception block, open the Inspector and click Inception v3.

The images in the dataset should be 299x299 pixels, the input size expected by Inception v3.
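If your images are a different size, resize them to 299x299 before feeding them to the block. As a minimal sketch (the `resize_nearest` helper below is illustrative, not part of the platform; in practice you would use your framework's resize function), nearest-neighbor resizing can be done with plain numpy:

```python
import numpy as np

def resize_nearest(img, size=(299, 299)):
    """Resize an H x W x C image to `size` with nearest-neighbor sampling."""
    h, w = img.shape[:2]
    rows = np.arange(size[0]) * h // size[0]  # source row for each output row
    cols = np.arange(size[1]) * w // size[1]  # source column for each output column
    return img[rows][:, cols]

img = np.zeros((480, 640, 3), dtype=np.uint8)  # e.g. a VGA-sized photo
resized = resize_nearest(img)
print(resized.shape)  # (299, 299, 3)
```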

Between the Input block and the Inception block, we recommend adding a Random transformation block.

Inception architecture

A popular way to build convolutional networks is to stack layers on top of each other. The idea behind the Inception network is instead to build a wider network rather than a deeper one.

Inception modules

The wide part of the Inception network is built from Inception modules, in which blocks are executed in parallel in branches with different filter sizes. The intuition behind these branches is that filters of different sizes pick up features at different scales; for example, a 7x7 filter picks up larger features than a 1x1 filter.

Factorized branches

Another insight was that the convolutions in the branches could be factorized: an nxn convolution is replaced by a combination of a 1xn and an nx1 convolution, reducing the number of weights per filter from n² to 2n. For example, a 7x7 convolution is replaced by two convolutions: first a 1x7 convolution, then a 7x1 convolution (49 weights become 14).
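The factorization is exact when the nxn kernel is separable (a rank-1 outer product of a column and a row filter); in Inception the 1xn and nx1 layers are learned independently, so this is an approximation of a general kernel. A minimal numpy sketch of the separable case (the `corr2d` helper is illustrative, not the platform's implementation):

```python
import numpy as np

def corr2d(x, k):
    """Naive 'valid' 2-D cross-correlation."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i+kh, j:j+kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((12, 12))
u = rng.standard_normal(7)   # 7x1 column filter
v = rng.standard_normal(7)   # 1x7 row filter
k = np.outer(u, v)           # separable 7x7 kernel: 49 weights

full = corr2d(x, k)                                   # one 7x7 pass
factored = corr2d(corr2d(x, v[None, :]), u[:, None])  # 1x7, then 7x1: 14 weights
print(np.allclose(full, factored))  # True
```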

Concatenate parallel branches

At the end of the Inception module, the outputs from the parallel branches are concatenated and sent forward.
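The branch-and-concatenate structure can be sketched in a few lines of numpy. This is a simplified, single-channel illustration (real Inception modules also include 1x1 channel reductions, pooling branches, and nonlinearities; the `corr2d_same` helper is illustrative):

```python
import numpy as np

def corr2d_same(x, k):
    """'Same'-padded 2-D cross-correlation, so every branch keeps H x W."""
    kh, kw = k.shape
    xp = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.empty_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i+kh, j:j+kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))  # one-channel input for clarity

# Three parallel branches with different receptive fields.
b1 = corr2d_same(x, rng.standard_normal((1, 1)))  # 1x1: fine detail
b3 = corr2d_same(x, rng.standard_normal((3, 3)))  # 3x3: medium features
b5 = corr2d_same(x, rng.standard_normal((5, 5)))  # 5x5: larger features

# Concatenate the branch outputs along a channel axis and send forward.
out = np.stack([b1, b3, b5], axis=-1)
print(out.shape)  # (8, 8, 3)
```

Because every branch preserves the spatial size, the outputs can be stacked along the channel axis, and the next layer sees all scales at once.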

Figure 1. Schematic of the Inception-B block showing factorized parallel branches with filters of different sizes.

If you want to know more about the Inception network architecture, read this well-written blog post: A Simple Guide to the Versions of the Inception Network.

References

Christian Szegedy, Vincent Vanhoucke, Sergey Ioffe, Jonathon Shlens, Zbigniew Wojna: Rethinking the Inception Architecture for Computer Vision. 2015

Christian Szegedy, Sergey Ioffe, Vincent Vanhoucke, Alex Alemi: Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning. 2016

Parameters

Trainable: Whether the training algorithm may change the weights of this block during training. Disable this to keep the pretrained weights fixed, for example when using the block as a static feature extractor while only the layers after it are trained.
