DenseNet (Dense Convolutional Network) is a type of convolutional neural network that uses dense connections between layers: within its Dense Blocks, all layers with matching feature-map sizes are connected directly to each other. The architecture lets networks go deeper while remaining efficient to train, because these shorter connections ease the flow of information and gradients.
Some advantages of DenseNet:
- Parameter efficiency – Each layer adds only a small number of parameters; for example, with a growth rate of 12, only about 12 new kernels are learned per layer.
- Implicit deep supervision – Gradient flow through the network improves because the feature maps in every layer have direct access to the loss function and its gradient.
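To make the parameter-efficiency point concrete, here is a small sketch (with assumed, hypothetical layer sizes) counting the weights one 3×3 convolutional layer adds when the growth rate is 12:

```python
# Sketch: weight count of one 3x3 conv layer inside a dense block.
# The channel count of 64 is an assumed example, not from the paper.
def conv_params(in_channels, out_channels, kernel_size=3):
    # weights only (bias omitted)
    return in_channels * out_channels * kernel_size * kernel_size

k = 12          # growth rate: each layer learns only k new feature maps
channels = 64   # feature maps accumulated by earlier layers (assumed)

print(conv_params(channels, k))  # 64 * 12 * 9 = 6912 weights
```

Even as the accumulated input grows, each layer only ever learns `k` new feature maps, which keeps the per-layer parameter count modest.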
Other key terms in the DenseNet architecture:
- Growth rate – Determines how many feature maps each layer inside a dense block outputs.
- Dense connectivity – Within a dense block, each layer receives the feature maps of all preceding layers as input, as seen in the figure.
- Composite function – Each layer applies a fixed sequence of operations: batch normalization, followed by ReLU, followed by a convolution layer.
- Transition layers – Transition layers sit between dense blocks; they aggregate the feature maps of a block and reduce their dimensions, using a 1×1 convolution followed by 2×2 average pooling.
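The terms above can be tied together with a minimal NumPy sketch. This is not the official implementation: the random arrays stand in for the BN-ReLU-Conv composite function, and the block/channel sizes are assumptions chosen for illustration.

```python
import numpy as np

k = 12                                  # growth rate
x = np.random.rand(3, 32, 32)           # input: 3 channels, 32x32 (assumed)

# Dense connectivity: each layer sees the concatenation of all
# preceding feature maps and contributes k new maps.
features = [x]
for _ in range(4):                      # 4 layers in the dense block (assumed)
    inp = np.concatenate(features, axis=0)   # all preceding feature maps
    new_maps = np.random.rand(k, 32, 32)     # stands in for BN-ReLU-Conv
    features.append(new_maps)

block_out = np.concatenate(features, axis=0)
print(block_out.shape)                  # (3 + 4*12, 32, 32) = (51, 32, 32)

# Transition layer: 2x2 average pooling halves height and width
pooled = block_out.reshape(51, 16, 2, 16, 2).mean(axis=(2, 4))
print(pooled.shape)                     # (51, 16, 16)
```

Note how the channel count grows by exactly `k` per layer (the growth rate), while the transition layer shrinks the spatial dimensions between blocks.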