ReLU Layer – A ReLU (Rectified Linear Unit) operation is applied after each convolution. ReLU is a non-linear activation function that works element-wise: it replaces every negative value in the feature map with zero and leaves positive values unchanged.

Convolution itself is a linear operation, but the relationship between an image's pixels and the patterns we want to recognize is highly non-linear. A stack of purely linear layers could only ever learn linear mappings, which makes correct predictions very difficult. Applying ReLU after each convolution introduces the non-linearity the network needs to model these relationships.

Therefore this layer helps with feature detection: by converting negative pixel values to zero it keeps only the strongest activations, which makes variations of features easier to detect, and it introduces non-linearity into the otherwise linear convolution operation.
![Rectified Linear Unit (ReLU)](https://www.researchgate.net/profile/Saad-Albawi/publication/328048988/figure/fig17/AS:677675828011008@1538581916318/Figure-316-Rectified-Linear-Unit-94.jpg)
Image Source: Google Images
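
To make the element-wise behaviour concrete, here is a minimal NumPy sketch. The feature-map values are made up purely for illustration; in a real network they would come out of a convolution layer.

```python
import numpy as np

# A small hypothetical feature map produced by a convolution.
# Negative entries are common because filter weights can be negative.
feature_map = np.array([
    [ 3.0, -1.5,  2.0],
    [-0.5,  4.0, -2.5],
    [ 1.0, -3.0,  0.0],
])

def relu(x):
    """Element-wise ReLU: f(x) = max(0, x)."""
    return np.maximum(x, 0)

activated = relu(feature_map)
print(activated)
# [[3. 0. 2.]
#  [0. 4. 0.]
#  [1. 0. 0.]]
```

Notice that every negative value has been set to zero while the positive activations pass through unchanged, which is exactly the behaviour shown in the plot above.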