The topic of activation functions deserves an article of its own, but here I will give a general overview. If you remember, I mentioned that a natural neuron has a switch-like activation. In computer and math jargon, we call this a **step function**.

Following the formula

*1 if x > 0; 0 if x ≤ 0*

the step function makes the neuron return 1 if its input is greater than 0, and 0 otherwise. This mimics a natural neuron, which fires only when its stimulus crosses a threshold. The neuron's output is the step function applied to the weighted sum of its inputs plus the bias:

*output = step(sum(inputs · weights) + bias)*
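As a minimal sketch, the formula above can be written as a few lines of Python. The weights and bias values here are purely illustrative, chosen so the neuron behaves like an AND gate:

```python
import numpy as np

def step(x):
    # Step activation: 1 if x > 0, else 0
    return 1 if x > 0 else 0

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus the bias, passed through the step function
    return step(np.dot(inputs, weights) + bias)

# Hypothetical weights/bias making the neuron act as an AND gate
print(neuron([1, 1], [0.5, 0.5], -0.7))  # 1: 0.5 + 0.5 - 0.7 = 0.3 > 0
print(neuron([1, 0], [0.5, 0.5], -0.7))  # 0: 0.5 - 0.7 = -0.2 <= 0
```

Changing the weights or the bias changes the threshold at which the neuron fires, which is exactly what training adjusts.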

The step function is very simple, but its output is flat almost everywhere, so it provides no useful gradient for training. In the AI field there is therefore a tendency to use more complex activation functions, such as the rectified linear unit (**ReLU**) and, typically in output layers, **SoftMax**.
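To make the comparison concrete, here is a short sketch of both functions using NumPy; the input scores are arbitrary example values:

```python
import numpy as np

def relu(x):
    # ReLU: passes positive values through unchanged, clips negatives to 0
    return np.maximum(0, x)

def softmax(x):
    # SoftMax: turns raw scores into probabilities that sum to 1
    # (subtracting the max is a standard numerical-stability trick)
    e = np.exp(x - np.max(x))
    return e / e.sum()

scores = np.array([-1.0, 0.0, 2.0])
print(relu(scores))     # [0. 0. 2.]
print(softmax(scores))  # three probabilities summing to 1
```

Unlike the step function, ReLU keeps a nonzero gradient for positive inputs, and SoftMax produces a smooth probability distribution over classes, which is why both train well with gradient-based methods.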