Bias - Variance trade-off


Bias is the error introduced by the simplifying assumptions a model makes in order to make the target function easier to approximate. Variance is the amount by which the estimate of the target function would change if the model were trained on different training data. The trade-off is the tension between the error contributed by bias and the error contributed by variance.
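This tension is usually formalised by the bias-variance decomposition of expected squared error (not stated in these notes, but standard; here \hat{f}(x) is the learned model and \sigma^2 is the irreducible noise):

```latex
\mathbb{E}\left[(y - \hat{f}(x))^2\right]
  = \left(\mathrm{Bias}[\hat{f}(x)]\right)^2
  + \mathrm{Var}[\hat{f}(x)]
  + \sigma^2
```

Lowering one of the first two terms typically raises the other, which is exactly the trade-off described above.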

Bias - In these notes, bias refers to the error between the predicted values and the actual values on the training data. If the bias is high, the model is not able to predict the training outputs correctly (it underfits). If the bias is low, the model predicts the training outputs well; in the extreme, it is effectively memorising the training data.

Variance - Variance is the opposite: it refers to the error between the predicted values and the actual values on test (unseen) data. If the variance is high, the model is not able to predict unseen data correctly, and vice versa. Bias and variance are negatively correlated: when bias is high, variance tends to be low, and vice versa.

High variance means the model is overfitted: it has memorised the training data and is tightly bound to it. Such a model has low bias, since it predicts the training values almost perfectly, but it generalises poorly to new data.
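The two failure modes above can be sketched with a toy example (hypothetical, not from the notes): a mean-only predictor stands in for a high-bias model, and a 1-nearest-neighbour predictor, which memorises the training set, stands in for a high-variance model. Comparing their train and test errors shows the pattern described in the text.

```python
import random

random.seed(0)

def make_data(n):
    # Toy data: y = x^2 plus a little Gaussian noise (assumed example).
    xs = [random.uniform(-1, 1) for _ in range(n)]
    return [(x, x * x + random.gauss(0, 0.1)) for x in xs]

train = make_data(30)
test = make_data(30)

def mse(model, data):
    # Mean squared error of a prediction function over a dataset.
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

# High-bias model: ignore x entirely and always predict the mean of the
# training targets. It underfits, so its error is similarly high on both sets.
mean_y = sum(y for _, y in train) / len(train)
def high_bias(x):
    return mean_y

# High-variance model: 1-nearest-neighbour. It memorises the training data,
# so its training error is zero, but its test error is noticeably worse.
def high_variance(x):
    return min(train, key=lambda p: abs(p[0] - x))[1]

print("high bias      train MSE:", mse(high_bias, train),
      " test MSE:", mse(high_bias, test))
print("high variance  train MSE:", mse(high_variance, train),
      " test MSE:", mse(high_variance, test))
```

The mean predictor shows high bias (large error everywhere, including on the training set), while the nearest-neighbour predictor shows the "copying all the data" behaviour: zero training error but a clear gap on unseen data.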