How to add f1_macro as a metric for multi-class classification? #3954
tiefenthaler started this conversation in General · Replies: 0 comments
I want to use `f1_score(average='macro')` as a metric for multi-class classification. According to the documentation, we can add metrics like this:

```python
from sklearn.metrics import log_loss
add_metric('logloss', 'Log Loss', log_loss, greater_is_better=False)
```

This does not work for `sklearn.metrics.f1_score` when I want `f1_score(average='macro')`, because writing `f1_score(average='macro')` calls the function immediately instead of passing a callable. So I can only add the plain `f1_score` as a metric to the setup, but the plain `f1_score` uses a `'weighted'` average by default.

The `compare_models` method, where the scoring is actually run, does not provide any settings for configuring our metrics. Is there an easy way to use `f1_score(average='macro')`? Of course I could implement a custom `f1_score_macro` metric that uses `'macro'` as its default, but it would be nice to use the sklearn implementation directly. Also worth mentioning: the sklearn metrics rely on parent classes, which would make translating them into a custom metric some effort.