Calculate precision, recall, and F1 score for your machine learning models effortlessly.
True Positives (TP): positive cases correctly predicted as positive
False Positives (FP): negative cases incorrectly predicted as positive
True Negatives (TN): negative cases correctly predicted as negative
False Negatives (FN): positive cases incorrectly predicted as negative
Accuracy: the ratio of correct predictions to total predictions. Formula: (TP + TN) / (TP + FP + TN + FN)

Precision: how many of the predicted positives were actually positive. Formula: TP / (TP + FP)

Recall: how many of the actual positive cases were correctly identified. Formula: TP / (TP + FN)

F1 Score: the harmonic mean of precision and recall, combining both into a single score. Formula: 2 × (Precision × Recall) / (Precision + Recall)
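The four formulas above can be sketched as a small Python function. The function name and the guard that returns 0.0 when a denominator is zero (e.g. no positive predictions at all) are assumptions for this sketch, not part of the calculator itself:

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute accuracy, precision, recall, and F1 from confusion-matrix counts.

    Counts must be non-negative; undefined (zero-denominator) cases return 0.0.
    """
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total if total else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return accuracy, precision, recall, f1

# Example confusion matrix: TP=40, FP=10, TN=45, FN=5
acc, prec, rec, f1 = classification_metrics(40, 10, 45, 5)
print(f"Accuracy:  {acc:.0%}")   # 85%
print(f"Precision: {prec:.0%}")  # 80%
```

Note the zero-denominator guards: precision is undefined when the model makes no positive predictions (TP + FP = 0), so a real implementation must decide how to report that case.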

1. From your confusion matrix, identify the four values: True Positives, False Positives, True Negatives, and False Negatives.

2. Enter each value in the corresponding field. All values must be non-negative whole numbers (counts).

3. Click "Calculate Metrics" to display all four performance metrics instantly.