Reference metric in multiclass precision/recall unit tests provides wrong answer when ignore_index
is specified with average = 'macro'
#2828
🐛 Bug
In the unit tests, sklearn's recall_score and precision_score are used as the reference. So even though the _reference_sklearn_precision_recall_multiclass() function uses remove_ignore_index to drop the predictions whose targets belong to the ignore_index class before passing them to recall_score, it does not matter: whenever average='macro', sklearn's recall_score and precision_score always return the mean over the total number of classes, because all the classes are passed in the labels argument of recall_score() and precision_score().

To Reproduce
Issue #2441 already discusses the wrong behaviour of MulticlassRecall's macro average when ignore_index is specified. Although ignore_index is covered by the tests, the test case passed because of this wrong reference implementation.
The same error applies to multiclass precision.
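
A minimal sketch of the behaviour described above, using only sklearn. The toy data and the manual filtering step stand in for remove_ignore_index and are not the actual test code:

```python
import numpy as np
from sklearn.metrics import recall_score

num_classes = 3
ignore_index = 2

target = np.array([0, 1, 2, 0, 1, 2])
preds = np.array([0, 1, 1, 0, 1, 0])

# Stand-in for remove_ignore_index: drop samples whose target is the ignored class.
keep = target != ignore_index
target, preds = target[keep], preds[keep]

# Reference as used in the unit tests: labels still cover ALL classes, so sklearn
# divides the macro mean by num_classes and counts the ignored class as 0.0
# (it also emits an UndefinedMetricWarning for the class with no true samples).
ref_all_labels = recall_score(
    target, preds, average="macro", labels=list(range(num_classes))
)

# What a correct reference would arguably compute: the macro mean over the
# classes that are not ignored.
kept_labels = [c for c in range(num_classes) if c != ignore_index]
ref_without_ignored = recall_score(target, preds, average="macro", labels=kept_labels)

print(ref_all_labels)       # (1.0 + 1.0 + 0.0) / 3 ≈ 0.667
print(ref_without_ignored)  # (1.0 + 1.0) / 2 = 1.0
```

The gap between the two values is the discrepancy described above: the first number is what the test reference returns, while the second is the macro average taken only over the non-ignored classes.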
Expected behavior
Environment