Incorrect value of AUROC when plotting a PrecisionRecallCurve metric with score=True #2405
🐛 Bug
test.csv.gz
Unfortunately, the documentation does not give details of how AUROC is computed for a PrecisionRecallCurve. I would expect it to match the value of BinaryAveragePrecision for each class. At least for non-overlapping curves, I would expect the AUROC values to agree with visual inspection of the curves: the one outside should be larger than the one inside. This is not what is shown in the image provided. I would expect the value for class 1 to be close to 1, and the value for class 0 to be close to 0. In fact, this is what BinaryAveragePrecision gives me.

It would also be nice to label the axes on MulticlassPrecisionRecallCurve, similarly to BinaryPrecisionRecallCurve, especially given how the axes appear to have been flipped between versions (1.2.1 vs 1.3.1).

In the documentation for the latest version (1.3.1), the plot shows a negative value for AUROC (-0.639).
To Reproduce
Compute a MulticlassPrecisionRecallCurve metric from the dataset attached and plot it:

Code sample
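Something along these lines (the column names `target`, `score_0`, `score_1` are my assumption, since the exact layout of test.csv.gz is not shown here):

```python
import pandas as pd
import torch
from torchmetrics.classification import (
    MulticlassPrecisionRecallCurve,
    BinaryAveragePrecision,
)

# Assumed layout of the attached file: a "target" column with class indices
# and "score_0"/"score_1" columns with per-class probabilities.
df = pd.read_csv("test.csv.gz")
preds = torch.tensor(df[["score_0", "score_1"]].values, dtype=torch.float)
target = torch.tensor(df["target"].values, dtype=torch.long)

# Plot the multiclass curve with the per-curve score enabled.
curve = MulticlassPrecisionRecallCurve(num_classes=2)
curve.update(preds, target)
fig, ax = curve.plot(score=True)

# Compare against the average precision computed per class.
for cls in range(2):
    ap = BinaryAveragePrecision()(preds[:, cls], (target == cls).long())
    print(f"class {cls}: BinaryAveragePrecision = {ap:.2f}")
```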
Expected behavior
The AUCs on the plot should match the printed values of BinaryAveragePrecision for each class (0.18 and 0.96). The axes should be labelled.

The documentation should be clearer about how the AUC is computed, and should not contain obvious errors (like negative values).
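For reference, a hedged sketch (again with synthetic data, not the attached file) of how integrating the returned curve with the trapezoid rule gives a negative signed area when recall is not sorted in ascending order, which might explain a value like -0.639 in the docs, and how it compares to BinaryAveragePrecision:

```python
import torch
from torchmetrics.classification import (
    MulticlassPrecisionRecallCurve,
    BinaryAveragePrecision,
)

torch.manual_seed(0)
preds = torch.softmax(torch.randn(200, 2), dim=-1)  # synthetic scores
target = torch.randint(0, 2, (200,))

precision, recall, _ = MulticlassPrecisionRecallCurve(num_classes=2)(preds, target)

for cls in range(2):
    # Signed trapezoidal area over the curve exactly as returned: if recall
    # is not in ascending order, this can come out negative.
    raw_area = torch.trapz(precision[cls], recall[cls])
    # Sorting by recall before integrating yields a proper, positive PR-AUC.
    order = torch.argsort(recall[cls])
    sorted_area = torch.trapz(precision[cls][order], recall[cls][order])
    # Average precision for the same class, for comparison.
    ap = BinaryAveragePrecision()(preds[:, cls], (target == cls).long())
    print(f"class {cls}: raw={raw_area:.3f}, sorted={sorted_area:.3f}, AP={ap:.3f}")
```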
Environment
- TorchMetrics version (and how you installed it, e.g. conda, pip, build from source):

Additional context