Underfitting and Overfitting
In simple terms, overfitting means the model learns too much from the training data, while underfitting means it learns too little. A good model should generalize: it should make accurate predictions on data it has never seen, not just on the data it was trained on.
Overfitting occurs when a model learns the detail and noise in the training dataset to such an extent that its performance on new data suffers. The random fluctuations in the training data are picked up and learned as concepts by the model; because those concepts do not hold for new data, they negatively impact the model's ability to generalize.
Underfitting refers to a model that can neither fit the training data nor generalize to new data. An underfit machine learning model performs poorly even on the training data, because it is too simple to capture the underlying structure.
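Both behaviors can be illustrated by comparing training and test error as model capacity grows. The sketch below (a hypothetical toy setup, using NumPy polynomial fits on synthetic quadratic data) shows that a degree-1 model underfits, a degree-2 model matches the true signal, and a high-degree model drives training error down by memorizing noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dataset: a quadratic signal with Gaussian noise
x_train = np.linspace(-3, 3, 30)
y_train = x_train**2 + rng.normal(0, 2, size=x_train.shape)

# Held-out test data drawn from the noise-free underlying signal
x_test = np.linspace(-3, 3, 101)
y_test = x_test**2

def fit_and_errors(degree):
    """Fit a polynomial of the given degree and return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

# degree 1: underfit, degree 2: well-matched, degree 15: prone to overfit
results = {d: fit_and_errors(d) for d in (1, 2, 15)}
for d, (tr, te) in results.items():
    print(f"degree {d:2d}: train MSE {tr:8.3f}, test MSE {te:8.3f}")
```

The linear model has high error on both sets (underfitting), while the degree-15 model achieves a lower training error than the degree-2 model yet tends to track the noise rather than the signal, which is the signature of overfitting.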