From dde338bb85bf9ed3d20fc5ef2a7a087d972486fd Mon Sep 17 00:00:00 2001
From: DavidRosen
Date: Tue, 7 Dec 2021 09:19:06 -0500
Subject: [PATCH] DOC improve the UG documentation of the classification
 report (#868)

Co-authored-by: Guillaume Lemaitre
---
 doc/metrics.rst | 14 ++++++++++----
 1 file changed, 10 insertions(+), 4 deletions(-)

diff --git a/doc/metrics.rst b/doc/metrics.rst
index e83bf792b..f7e249c02 100644
--- a/doc/metrics.rst
+++ b/doc/metrics.rst
@@ -65,10 +65,16 @@ each class and averaged over classes, giving an equal weight to each class.
 Summary of important metrics
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-The :func:`classification_report_imbalanced` will compute a set of metrics
-per class and summarize it in a table. The parameter `output_dict` allows
-to get a string or a Python dictionary. This dictionary can be reused to create
-a Pandas dataframe for instance.
+The :func:`classification_report_imbalanced` function computes a set of metrics
+per class and summarizes them in a table. The parameter `output_dict` allows
+getting either a string or a Python dictionary. This dictionary can be reused,
+for instance, to create a Pandas dataframe.
+
+The bottom row (i.e. "avg/total") contains the average of each column weighted
+by the support (i.e. the "sup" column).
+
+Note that the weighted average of the class recalls is also known as the
+classification accuracy.
 
 .. _pairwise_metrics:
 
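
Below is a minimal sketch of the behaviour the added paragraphs describe,
assuming a toy imbalanced dataset and a LogisticRegression classifier; the
filtering of the per-class entries is an assumption, since the exact layout of
the dictionary returned with `output_dict=True` is not spelled out here::

    # Illustrative sketch; the dataset, classifier, and per-class filtering
    # below are assumptions, not taken from the patch.
    import pandas as pd
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    from imblearn.metrics import classification_report_imbalanced

    # A small imbalanced toy problem.
    X, y = make_classification(n_samples=1000, weights=[0.9, 0.1],
                               random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    y_pred = LogisticRegression().fit(X_train, y_train).predict(X_test)

    # Default: a formatted string whose bottom "avg/total" row holds the
    # support-weighted average of each column.
    print(classification_report_imbalanced(y_test, y_pred))

    # With output_dict=True a Python dictionary is returned instead. Keeping
    # only the nested per-class entries gives a mapping that loads cleanly
    # into pandas; aggregate averages, if stored as flat keys, are dropped.
    report = classification_report_imbalanced(y_test, y_pred, output_dict=True)
    per_class = {label: scores for label, scores in report.items()
                 if isinstance(scores, dict)}
    df = pd.DataFrame(per_class).T
    print(df)

If the dictionary reuses the abbreviated column names of the printed table
("rec", "sup", ...), the note about accuracy can be spot-checked by comparing
(df["rec"] * df["sup"]).sum() / df["sup"].sum() against
sklearn.metrics.accuracy_score(y_test, y_pred).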