Weights and Biases in Deep Learning

Weights and biases can be thought of as a system of knobs that we can turn to optimize our model — much like tuning a radio by turning its knobs until we find the desired frequency. The main difference is that a neural network has thousands, if not millions, of knobs to turn to achieve the final result.

Since weights and biases are the parameters of the network, they are what changes when we turn these imaginary knobs. Because each weight is multiplied by its input, it scales the input's contribution to the output. The bias, on the other hand, is added to the whole expression, so it shifts the function along the output axis: in the simple case of a line y = wx + b, the weight w sets the slope and the bias b sets the intercept.
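A minimal sketch can make this concrete. The `neuron` function below is a hypothetical single artificial neuron (just the linear part, before any activation): doubling a weight doubles that input's contribution, while changing the bias shifts the output by a constant amount without changing the slope.

```python
import numpy as np

def neuron(x, w, b):
    """Linear part of a single neuron: w . x + b."""
    return np.dot(w, x) + b

x = np.array([1.0, 2.0])

# Weights scale the inputs' contribution to the output.
print(neuron(x, np.array([0.5, 0.5]), 0.0))  # 0.5*1 + 0.5*2 = 1.5
print(neuron(x, np.array([1.0, 0.5]), 0.0))  # 1.0*1 + 0.5*2 = 2.0

# The bias shifts the whole output up or down by a constant.
print(neuron(x, np.array([0.5, 0.5]), 1.0))  # 1.5 + 1 = 2.5
```

Training a network means adjusting exactly these two kinds of parameters, for every neuron, until the outputs match the targets.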