Confusion (Error) Matrix Calculator
trauma.ai
Inputs
Counts · non-negative integers

                   Pred. Positive         Pred. Negative
Actual Positive    True Positive (TP)     False Negative (FN)
Actual Negative    False Positive (FP)    True Negative (TN)

Diagonal cells (TP, TN) are correct predictions; off-diagonal cells (FP, FN) are errors.
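The four counts can be tallied directly from paired label lists. A minimal sketch (the helper `confusion_counts` and its 1/0 label encoding are illustrative assumptions, not part of this calculator):

```python
def confusion_counts(actual, predicted):
    """Tally TP, FN, FP, TN from paired binary labels (1 = positive, 0 = negative)."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    return tp, fn, fp, tn

# Row order matches the table above: (TP, FN) for actual positives,
# (FP, TN) for actual negatives.
```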
F-β weight
β = 0.5 (weights precision ×2)
β = 1.0 (F1, balanced)
β = 2.0 (weights recall ×2)
β = 3.0 (weights recall ×3)
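How β shifts the score between precision and recall follows directly from the formula. A small sketch (the function name `f_beta` is illustrative):

```python
def f_beta(precision, recall, beta):
    """F-beta score: beta < 1 favors precision, beta > 1 favors recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0  # conventional value for the undefined 0/0 case
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# For a high-precision, low-recall classifier (P = 0.9, R = 0.5),
# beta = 0.5 scores higher than beta = 2.0, which penalizes the weak recall.
```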
Results
Total samples: N = TP + FP + FN + TN
Accuracy (ACC) = (TP + TN) / N
Matthews CC (MCC) = (TP·TN − FP·FN) / √((TP+FP)(TP+FN)(TN+FP)(TN+FN))
F-β score (Fβ) = (1+β²)·P·R / (β²·P + R), where P = precision and R = recall
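MCC needs a guard whenever a row or column of the matrix is empty, since the denominator then vanishes. A sketch of how this might be computed (returning 0.0 for the undefined case is the usual convention, but that choice is an assumption here):

```python
import math

def mcc(tp, fp, fn, tn):
    """Matthews correlation coefficient: +1 perfect, 0 random, -1 fully inverted."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        return 0.0  # any empty marginal makes MCC undefined; return 0 by convention
    return (tp * tn - fp * fn) / denom
```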
Predictive values
Precision (PPV) = TP / (TP + FP)
Negative Predictive Value (NPV) = TN / (TN + FN)
False Discovery Rate (FDR) = FP / (FP + TP) = 1 − PPV
False Omission Rate (FOR) = FN / (FN + TN) = 1 − NPV
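All four predictive values share one hazard: each ratio is undefined when its predicted column is empty. A sketch that returns None in that case (both the convention and the helper name `predictive_values` are assumptions):

```python
def predictive_values(tp, fp, fn, tn):
    """PPV, NPV, FDR, FOR; None when the relevant predicted column is empty."""
    ppv = tp / (tp + fp) if tp + fp else None  # divides by predicted positives
    npv = tn / (tn + fn) if tn + fn else None  # divides by predicted negatives
    fdr = 1 - ppv if ppv is not None else None
    fom = 1 - npv if npv is not None else None  # "FOR" ('for' is a Python keyword)
    return ppv, npv, fdr, fom
```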
True / false rates
Sensitivity (TPR, Recall) = TP / (TP + FN)
Specificity (TNR) = TN / (TN + FP)
False Positive Rate (FPR) = FP / (TN + FP) = 1 − TNR
False Negative Rate (FNR) = FN / (TP + FN) = 1 − TPR
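The two rates and their complements follow the same pattern, guarded against an absent actual class (the helper name `true_false_rates` is illustrative):

```python
def true_false_rates(tp, fp, fn, tn):
    """TPR (sensitivity), TNR (specificity), FPR, FNR; None when a class is absent."""
    tpr = tp / (tp + fn) if tp + fn else None  # divides by actual positives
    tnr = tn / (tn + fp) if tn + fp else None  # divides by actual negatives
    fpr = 1 - tnr if tnr is not None else None
    fnr = 1 - tpr if tpr is not None else None
    return tpr, tnr, fpr, fnr
```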
Class distribution & error
Positive prevalence = (TP + FN) / N
Negative prevalence = (TN + FP) / N
Balanced Accuracy = (TPR + TNR) / 2
Error rate = (FP + FN) / N = 1 − ACC
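Balanced accuracy is the number to watch under class imbalance, where plain accuracy can look deceptively high. A sketch of the gap (the scenario counts below are invented for illustration):

```python
def acc_and_balanced(tp, fp, fn, tn):
    """Return (accuracy, balanced accuracy) for a filled-in confusion matrix."""
    n = tp + fp + fn + tn
    acc = (tp + tn) / n
    tpr = tp / (tp + fn)  # sensitivity
    tnr = tn / (tn + fp)  # specificity
    return acc, (tpr + tnr) / 2

# Predicting "negative" for all 100 samples when only 5 are positive:
# accuracy is 95%, but balanced accuracy collapses to 50% (chance level).
```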