Bias-Variance Trade-off in ML

If the model is too simple (e.g. a linear hypothesis), it tends to have high bias and low variance: it underfits and is error-prone even on the training data. If the model is too complex (e.g. a hypothesis with a high-degree equation), it tends to have high variance and low bias: it fits the training data closely, but it will not perform well on new entries.
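As a sketch of this behaviour (not taken from the article), we can fit polynomials of different degrees to noisy synthetic data and compare training and test error; the target function, noise level, and chosen degrees here are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data: noisy samples of a smooth target function
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, size=x.shape)

# Alternate points between a training set and a test set
x_tr, y_tr = x[::2], y[::2]
x_te, y_te = x[1::2], y[1::2]

def train_test_mse(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_tr, y_tr, degree)
    tr = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)
    te = np.mean((np.polyval(coeffs, x_te) - y_te) ** 2)
    return tr, te

for d in (1, 4, 15):  # too simple, moderate, too complex
    tr, te = train_test_mse(d)
    print(f"degree {d:2d}: train MSE = {tr:.3f}, test MSE = {te:.3f}")
```

Typically, degree 1 shows high error on both sets (high bias, underfitting), while a very high degree drives the training error down yet gives a less reliable test error (high variance, overfitting); a moderate degree does well on both.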

Between these two extremes lies a sweet spot, known as the Bias-Variance Trade-off.

This trade-off in complexity is why there is a trade-off between bias and variance: an algorithm cannot be more complex and less complex at the same time. On a graph, the ideal trade-off appears as the point of intermediate complexity where total error is lowest.


The best fit is given by the hypothesis at the trade-off point.
The error-versus-complexity graph illustrating the trade-off is shown below.

This is referred to as the best point for training the algorithm: it gives low error on the training data as well as on the test data.
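One simple way to locate that point in practice is to scan over model complexities and keep the one with the lowest held-out error. The sketch below (the data and degree range are illustrative assumptions, not from the article) does this for polynomial degree:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative noisy data split into training and validation halves
x = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, size=x.shape)
x_tr, y_tr = x[::2], y[::2]
x_va, y_va = x[1::2], y[1::2]

def val_mse(degree):
    """Validation MSE of a polynomial fit of the given degree."""
    coeffs = np.polyfit(x_tr, y_tr, degree)
    return np.mean((np.polyval(coeffs, x_va) - y_va) ** 2)

# Scan complexities and pick the degree with the lowest validation error
errors = {d: val_mse(d) for d in range(1, 13)}
best = min(errors, key=errors.get)
print("best degree:", best, "validation MSE:", round(errors[best], 3))
```

The selected degree approximates the trade-off point: simpler models are rejected for high bias, more complex ones for high variance.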