Explain the Confusion Matrix with respect to model evaluation.

The Confusion Matrix is a table used to evaluate the performance of a classification model. Each cell counts one combination of predicted and actual class, so the main diagonal holds all the correct predictions. From it you can compute metrics such as accuracy, precision, and recall, as well as the true and false positive rates that underlie the AUC-ROC curve. The size of the matrix depends on the number of classes in the dependent variable: for N classes the matrix is N*N.
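As a quick illustration, here is a minimal sketch, assuming scikit-learn is available; the label arrays are made up for the example. It builds the 2*2 confusion matrix (N = 2) for a binary classifier's predictions.

```python
# Minimal sketch: building a confusion matrix for binary predictions.
# Assumes scikit-learn is installed; the labels below are hypothetical.
from sklearn.metrics import confusion_matrix

y_actual    = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]  # ground-truth labels
y_predicted = [1, 0, 0, 1, 0, 1, 1, 0, 0, 1]  # model outputs

# Rows correspond to actual classes, columns to predicted classes,
# so correct predictions sit on the main diagonal.
cm = confusion_matrix(y_actual, y_predicted)
print(cm)
# [[4 1]
#  [1 4]]
```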

  • True Positive: Actual Value = Predicted Value when the output is 1 (the positive class is correctly predicted)
  • True Negative: Actual Value = Predicted Value when the output is 0 (the negative class is correctly predicted)
  • False Positive: Actual Value = 0 but Predicted Value = 1 (Type I Error)
  • False Negative: Actual Value = 1 but Predicted Value = 0 (Type II Error)
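From these four counts the standard metrics follow directly. The snippet below continues the hypothetical example above and computes accuracy, precision, recall, and F1 from the matrix.

```python
# Unpack the four cells of the 2x2 matrix and derive the common metrics.
# scikit-learn's ravel() orders the cells as TN, FP, FN, TP for binary labels.
tn, fp, fn, tp = cm.ravel()

accuracy  = (tp + tn) / (tp + tn + fp + fn)  # share of all predictions that are correct
precision = tp / (tp + fp)                   # of predicted positives, how many are actually positive
recall    = tp / (tp + fn)                   # of actual positives, how many were found
f1        = 2 * precision * recall / (precision + recall)

print(accuracy, precision, recall, f1)  # 0.8 0.8 0.8 0.8 for the toy labels above
```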