Explain the difference between Accuracy, Precision, and Recall

The difference between Accuracy, Precision, and Recall:

  • Accuracy: the ratio of correctly predicted observations to the total observations.

Accuracy = (TP + TN) / (TP + TN + FP + FN)

  • Precision: the ratio of correctly predicted positive observations to the total predicted positive observations.

Precision = TP / (TP + FP)

  • Recall: the ratio of correctly predicted positive observations to all observations in the actual positive class.

Recall = TP / (TP + FN)
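
As a quick illustration, here is a minimal Python sketch of these three formulas, using hypothetical example counts (the numbers are made up for demonstration only):

```python
# Minimal sketch: the three metrics computed directly from confusion-matrix counts,
# assuming a binary problem where class 1 is the positive class.
def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

def precision(tp, fp):
    return tp / (tp + fp)

def recall(tp, fn):
    return tp / (tp + fn)

# Hypothetical counts: 40 true positives, 50 true negatives,
# 5 false positives, 5 false negatives.
tp, tn, fp, fn = 40, 50, 5, 5
print(accuracy(tp, tn, fp, fn))  # 0.90
print(precision(tp, fp))         # 0.888...
print(recall(tp, fn))            # 0.888...
```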

Accuracy, precision, and recall are evaluation metrics for machine learning/deep learning models. Accuracy indicates how many of the samples in the test set are predicted correctly by the model compared to their actual labels.

However, consider an imbalanced binary dataset where 99% of the data belongs to class 0 and the remaining 1% belongs to class 1. If your model classifies everything as class 0, its accuracy will be 99%, which looks very high, yet the model does not discriminate between the classes at all. Accuracy alone is therefore not a sufficient metric, and we need other notions such as precision and recall.
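
This pitfall is easy to reproduce. Below is a small sketch, assuming scikit-learn is available, with a made-up dataset of 1000 samples (990 of class 0, 10 of class 1) and a degenerate "model" that always predicts class 0:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Hypothetical imbalanced labels: 99% class 0, 1% class 1.
y_true = np.array([0] * 990 + [1] * 10)
y_pred = np.zeros_like(y_true)  # a "model" that classifies everything as class 0

print(accuracy_score(y_true, y_pred))                    # 0.99 -- looks great
print(precision_score(y_true, y_pred, zero_division=0))  # 0.0  -- no positives predicted
print(recall_score(y_true, y_pred))                      # 0.0  -- no positives found
```

Accuracy comes out at 99%, yet precision and recall on the minority class are both 0, which exposes the model as useless for detecting class 1.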

In the binary classification setting, precision and recall are defined as follows:

Precision: measures how often an instance that is predicted as positive (e.g., class 1) is actually positive.

Precision = TP / (TP + FP)

where TP is true positive and FP is false positive.

Recall: measures how often a positive instance in the dataset is predicted as positive by the classifier.

Recall = TP / (TP + FN)

where TP is true positive and FN is false negative.
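
In practice, the counts TP, FP, FN (and TN) are usually read off a confusion matrix rather than tallied by hand. A small sketch, again assuming scikit-learn and toy labels chosen only for illustration:

```python
from sklearn.metrics import confusion_matrix

# Toy binary labels and predictions (illustrative only).
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# For binary 0/1 labels, ravel() returns the counts in the order TN, FP, FN, TP.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

precision = tp / (tp + fp)  # 3 / (3 + 1) = 0.75
recall = tp / (tp + fn)     # 3 / (3 + 1) = 0.75
```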