Explain the significance of the ReLU activation function in a Convolutional Neural Network

ReLU Layer – A ReLU operation is applied after each convolution operation. ReLU (Rectified Linear Unit) is a non-linear activation function defined as f(x) = max(0, x). It is applied element-wise to the feature map and replaces every negative value with zero while leaving positive values unchanged.
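A minimal sketch of this element-wise operation, using NumPy and a small made-up feature map purely for illustration:

```python
import numpy as np

# Toy 3x3 feature map, as if produced by a convolution (values are illustrative only)
feature_map = np.array([[-2.0,  1.5, -0.5],
                        [ 3.0, -1.0,  0.0],
                        [ 0.5, -4.0,  2.0]])

# ReLU: f(x) = max(0, x), applied element-wise
relu_output = np.maximum(feature_map, 0)

print(relu_output)
# [[0.  1.5 0. ]
#  [3.  0.  0. ]
#  [0.5 0.  2. ]]
```

Every negative entry is clipped to zero, and the positive responses pass through untouched.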

Real-world images are highly non-linear: the relationships between pixel values cannot be captured well by a purely linear model. Convolution, however, is a linear operation, so without a non-linear activation the network could only learn linear mappings. Applying ReLU after each convolution lets the network model this non-linearity and makes correct predictions feasible.

This layer therefore helps in feature detection: by converting negative activations to zero it keeps only the strongest responses, which in turn makes it easier for the network to capture variations in features.

In short, convolution is a linear operation, and non-linearity is introduced into the network by following it with a non-linear activation function such as ReLU.
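A short sketch of a single convolution followed by ReLU, written with PyTorch; the layer sizes and input shape are assumptions chosen only for illustration:

```python
import torch
import torch.nn as nn

# Illustrative conv + ReLU block (channel counts and kernel size are arbitrary)
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)
relu = nn.ReLU()

x = torch.randn(1, 3, 32, 32)       # a fake 32x32 RGB image
linear_response = conv(x)            # convolution alone is a linear operation
activated = relu(linear_response)    # ReLU makes the layer's output non-linear

print(activated.min().item() >= 0)   # True: all negative responses are clipped to zero
```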

Figure: Rectified Linear Unit (Image Source: Google Images)