Metrics for multi-label classification for use with tf.keras #28074
@Abhijit-2592 I have already opened a pull request about adding multi-label classification, but I am not finding the correct location to add those lines so that they work properly.
Can you link the pull request?
@Abhijit-2592 Have you tried the new 2.0 metrics? https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/metrics.py#L1072 Please take a look and let me know if they would work for you. We have implementations of precision at K and recall at K.
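For reference, a minimal sketch (my own illustration, assuming TF 2.x) of the top-K behavior these new metrics provide via the `top_k` argument:

```python
import tensorflow as tf

# Sketch (assumes TF 2.x): the rewritten Keras metrics accept a top_k argument,
# so precision@K can be tracked without the old tf.metrics.*_at_k ops.
m = tf.keras.metrics.Precision(top_k=2)
m.update_state([[0, 1, 1, 0]], [[0.1, 0.9, 0.8, 0.2]])
# The two highest-scoring classes (indices 1 and 2) are both true labels,
# so precision@2 is 1.0.
print(float(m.result()))
```

The same objects can be passed directly to `model.compile(metrics=[...])`, which is what makes them usable with `tf.keras` training loops.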
Closing this issue for now; please feel free to re-open if this isn't resolved.
@pavithrasv I'd like to reopen the issue if possible. My reasons are as follows.

The built-in metrics assert that predictions are probabilities in [0, 1]:

```python
with ops.control_dependencies([
    check_ops.assert_greater_equal(
        y_pred,
        math_ops.cast(0.0, dtype=y_pred.dtype),
        message='predictions must be >= 0'),
    check_ops.assert_less_equal(
        y_pred,
        math_ops.cast(1.0, dtype=y_pred.dtype),
        message='predictions must be <= 1')
]):
```

Why is this annoying? Multi-label targets are independent, so the natural loss is a sigmoid cross entropy computed from logits (e.g. `tf.nn.sigmoid_cross_entropy_with_logits`). Since that loss applies a sigmoid transformation to the logits (the model's output) itself, the user shouldn't put a sigmoid activation in the final layer; otherwise the sigmoid would be applied twice. But then the model emits raw logits, which fail the assertion above.
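To see why applying the sigmoid twice is harmful, here is a small NumPy sketch (my own illustration, not from the thread):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

logits = np.array([-4.0, 0.0, 4.0])
once = sigmoid(logits)   # proper probabilities, spanning roughly [0.018, 0.982]
twice = sigmoid(once)    # what happens if the loss applies sigmoid again
# A second sigmoid squashes everything into (0.5, 0.732): even a confident
# negative (logit -4) now looks like a weak positive.
print(once.round(3), twice.round(3))
```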
```python
# inside Precision.update_state
return metrics_utils.update_confusion_matrix_variables(
    {
        metrics_utils.ConfusionMatrix.TRUE_POSITIVES: self.true_positives,
        metrics_utils.ConfusionMatrix.FALSE_POSITIVES: self.false_positives
    },
    y_true,
    y_pred,
    thresholds=self.thresholds,
    top_k=self.top_k,
    class_id=self.class_id,
    sample_weight=sample_weight)  # <-- multi_label is not set to True,
                                  #     nor do we as users have the option to do so
```
```python
# function signature
def update_confusion_matrix_variables(variables_to_update,
                                      y_true,
                                      y_pred,
                                      thresholds,
                                      top_k=None,
                                      class_id=None,
                                      sample_weight=None,
                                      multi_label=False,  # <-- defaults to False
                                      label_weights=None):
```

Collectively, this means that the built-in metrics neither accept raw logits nor give users a way to enable `multi_label`. While the built-in metrics can't be used directly, a custom metric can fill the gap in the meantime, e.g. based on the subclassing `Metric` documentation:

```python
class MultiLabelMacroSpecificity(tf.keras.metrics.Metric):

    def __init__(self, name='multi_label_macro_specificity', threshold=0.5, **kwargs):
        super(MultiLabelMacroSpecificity, self).__init__(name=name, **kwargs)
        self.threshold = tf.constant(threshold)
        # TODO: replace these with the tf confusion-matrix utils
        self.true_negatives = self.add_weight(name='tn', initializer='zeros')
        self.false_positives = self.add_weight(name='fp', initializer='zeros')

    def update_state(self, y_true, y_pred, sample_weight=None):
        # Compare predictions (and labels, in case of soft labeling) to the threshold.
        pred_is_pos = tf.greater(tf.cast(y_pred, tf.float32), self.threshold)
        label_is_pos = tf.greater(tf.cast(y_true, tf.float32), self.threshold)
        pred_is_neg = tf.logical_not(pred_is_pos)
        label_is_neg = tf.logical_not(label_is_pos)
        # A true negative requires both the label and the prediction to be negative.
        self.true_negatives.assign_add(
            tf.reduce_sum(tf.cast(tf.logical_and(pred_is_neg, label_is_neg), tf.float32)))
        self.false_positives.assign_add(
            tf.reduce_sum(tf.cast(tf.logical_and(pred_is_pos, label_is_neg), tf.float32)))

    def result(self):
        # specificity = TN / (TN + FP); divide_no_nan guards the no-negatives case
        return tf.math.divide_no_nan(
            self.true_negatives, self.true_negatives + self.false_positives)
```
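As a sanity check on the TN/FP bookkeeping in the metric above, the same micro-averaged specificity can be computed in plain NumPy (my own illustrative sketch, not part of the thread):

```python
import numpy as np

def specificity(y_true, y_pred, threshold=0.5):
    """Micro-averaged specificity = TN / (TN + FP) over all labels."""
    pred_pos = y_pred > threshold
    label_neg = y_true <= threshold  # threshold labels too, for soft labels
    tn = np.sum(~pred_pos & label_neg)
    fp = np.sum(pred_pos & label_neg)
    return tn / (tn + fp) if (tn + fp) else 0.0

y_true = np.array([[1, 0, 0], [0, 1, 0]])
y_pred = np.array([[0.9, 0.6, 0.1], [0.2, 0.8, 0.3]])
# 4 negative labels in total; one of them (score 0.6) is predicted
# positive, so specificity = 3 / 4.
print(specificity(y_true, y_pred))  # 0.75
```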
Thank you, I see you have opened another issue for the pending features; I will reply on that.
Top-K metrics are widely used in assessing the quality of multi-label classification, but `tf.metrics.recall_at_k` and `tf.metrics.precision_at_k` cannot be directly used with `tf.keras`! Even if we wrap them accordingly for `tf.keras`, in most cases they will produce NaNs because of numerical instability. Since we don't have out-of-the-box metrics that can be used for monitoring multi-label classification training with `tf.keras`, I came up with the following plugin for TensorFlow 1.X. It can also easily be ported to TensorFlow 2.0.

Is this something we can integrate into TensorFlow? If so, I will be glad to open up a Pull Request.
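The NaNs mentioned above typically come from 0/0 divisions when a batch contains no positives (or no negatives); a guarded division avoids them. A NumPy sketch (my own illustration, mirroring the behavior of `tf.math.divide_no_nan`):

```python
import numpy as np

def divide_no_nan(numerator, denominator):
    # Return 0 wherever the denominator is 0, instead of NaN/Inf,
    # mirroring tf.math.divide_no_nan.
    denominator = np.asarray(denominator, dtype=float)
    out = np.zeros_like(denominator)
    np.divide(numerator, denominator, out=out, where=denominator != 0)
    return out

# recall = TP / (TP + FN): a batch with no positive labels gives 0/0 -> NaN
# with a plain division, but 0.0 with the guarded one.
r = divide_no_nan(np.array([3.0, 0.0]), np.array([4.0, 0.0]))
print(r)  # [0.75 0.  ]
```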