Model Evaluation

  • Confusion Matrix

  • Precision and Recall

• Precision refers to how many of the retrieved items are relevant, i.e., out of the items predicted as positive, how many are actually positive.

• Recall refers to how many of the relevant items are retrieved. The relevant items are the ones that are actually positive.

$$\text{Precision} = \frac{\text{TP}}{\text{Predicted Positive}} = \frac{\text{TP}}{\text{TP} + \text{FP}}$$

    $$\text{Recall} = \frac{\text{TP}}{\text{Actual Positive}} = \frac{\text{TP}}{\text{TP} + \text{FN}}$$

Note: the numerator in both is the same $\text{TP}$, while the denominator is different. In precision, the denominator is the total number of items predicted as positive, and in recall it's the number of items that actually belong to the positive class.
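
As a quick check of these definitions, here is a minimal sketch (assuming scikit-learn is available; the toy labels are made up for illustration) that reads TP, FP, and FN off a confusion matrix and compares hand-computed precision and recall against the library's built-in scores.

```python
# Minimal sketch: confusion matrix, precision, and recall on toy binary labels.
from sklearn.metrics import confusion_matrix, precision_score, recall_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]  # actual labels (toy data)
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]  # model predictions (toy data)

# For binary labels, sklearn's confusion matrix is laid out as:
# [[TN, FP],
#  [FN, TP]]
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

# Precision = TP / (TP + FP), Recall = TP / (TP + FN)
print("precision:", tp / (tp + fp), "vs sklearn:", precision_score(y_true, y_pred))
print("recall:   ", tp / (tp + fn), "vs sklearn:", recall_score(y_true, y_pred))
```

For this toy data there are 4 true positives, 1 false positive, and 1 false negative, so both the manual ratios and the library scores come out to 0.8.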

