Confusion (Error) Matrix Calculator

Inputs

Counts · non-negative integers

                  Pred. Positive   Pred. Negative
Actual Positive        TP               FN
Actual Negative        FP               TN

Shaded cells show correct predictions (light green) vs. errors (light red); the coloring reflects prediction correctness only.
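The four cells above can be tallied directly from paired label lists. A minimal sketch, assuming binary labels with 1 = positive (the function name is illustrative, not part of the calculator):

```python
def confusion_counts(y_true, y_pred):
    """Count the four confusion-matrix cells for binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
    return tp, fn, fp, tn
```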


Results

Total samples: —

Accuracy ACC
(TP + TN) / N
Matthews CC MCC
(TP·TN − FP·FN) / √((TP+FP)(TP+FN)(TN+FP)(TN+FN))
F-β score Fβ
(1+β²)·P·R / (β²·P + R); F1 is the β = 1 case
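These three overall scores follow directly from the cell counts. A minimal sketch of each formula as written above (function names are my own; the MCC helper returns 0 when the denominator is 0, a common convention not stated by the calculator):

```python
import math

def accuracy(tp, fn, fp, tn):
    """ACC = (TP + TN) / N."""
    return (tp + tn) / (tp + fn + fp + tn)

def mcc(tp, fn, fp, tn):
    """Matthews correlation coefficient; 0 if any marginal is empty."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def f_beta(tp, fn, fp, beta=1.0):
    """F-β from precision P and recall R; β = 1 gives F1."""
    p = tp / (tp + fp)
    r = tp / (tp + fn)
    return (1 + beta**2) * p * r / (beta**2 * p + r)
```

For example, with TP=40, FN=10, FP=20, TN=30 (N=100): ACC = 0.7, and F1 uses P = 40/60 and R = 40/50.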

Predictive values

+ marks metrics about the positive class (what you're testing for); unmarked metrics describe the negative class. Neither is "good" or "bad": positive often just means the outcome of concern (e.g. injury present).

+Precision PPV
TP / (TP + FP)
Neg. Pred. Value NPV
TN / (TN + FN)
+False Disc. Rate FDR
FP / (FP + TP)  = 1 − PPV
False Omission FOR
FN / (FN + TN)  = 1 − NPV
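The four predictive values come in complementary pairs, so only two divisions are needed. A minimal sketch of the formulas above (function name is my own):

```python
def predictive_values(tp, fn, fp, tn):
    """PPV/NPV and their complements FDR/FOR, per the formulas above."""
    ppv = tp / (tp + fp)  # precision: correct among predicted positives
    npv = tn / (tn + fn)  # correct among predicted negatives
    return {"PPV": ppv, "NPV": npv, "FDR": 1 - ppv, "FOR": 1 - npv}
```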

True / false rates

+Sensitivity TPR · Recall
TP / (TP + FN)
Specificity TNR
TN / (TN + FP)
False Pos. Rate FPR
FP / (TN + FP)  = 1 − TNR
+False Neg. Rate FNR
FN / (TP + FN)  = 1 − TPR
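Likewise, the two error rates are complements of the two true rates. A minimal sketch (function name is my own):

```python
def rates(tp, fn, fp, tn):
    """TPR/TNR and their complements FPR/FNR, per the formulas above."""
    tpr = tp / (tp + fn)  # sensitivity / recall: positives caught
    tnr = tn / (tn + fp)  # specificity: negatives correctly rejected
    return {"TPR": tpr, "TNR": tnr, "FPR": 1 - tnr, "FNR": 1 - tpr}
```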

Class distribution & error

+Positive prevalence
(TP + FN) / N
Negative prevalence
(TN + FP) / N
Balanced Accuracy
(TPR + TNR) / 2
Error rate
(FP + FN) / N  = 1 − ACC
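The distribution-level summaries combine the row totals with the rates defined earlier. A minimal sketch, restating the formulas above (function name is my own):

```python
def class_summary(tp, fn, fp, tn):
    """Prevalence, balanced accuracy, and error rate, per the formulas above."""
    n = tp + fn + fp + tn
    tpr = tp / (tp + fn)  # sensitivity
    tnr = tn / (tn + fp)  # specificity
    return {
        "pos_prevalence": (tp + fn) / n,       # share of actual positives
        "neg_prevalence": (tn + fp) / n,       # share of actual negatives
        "balanced_accuracy": (tpr + tnr) / 2,  # mean of the two true rates
        "error_rate": (fp + fn) / n,           # 1 - ACC
    }
```

Balanced accuracy averages the per-class true rates, so unlike plain accuracy it is unaffected by class imbalance.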