Closed
Labels
comp:keras (Keras related issues), type:docs-feature (Doc issues for new feature, or clarifications about functionality), type:feature (Feature requests)
Description
Issue Type
Documentation Feature Request
Have you reproduced the bug with TF nightly?
Yes
Source
binary
Tensorflow Version
2.14.0-dev20230611
Custom Code
Yes
OS Platform and Distribution
Linux
Mobile device
NA
Python version
3.11.3
Bazel version
NA
GCC/Compiler version
NA
CUDA/cuDNN version
No response
GPU model and memory
No response
Current Behaviour?
tf.keras.metrics.Precision appears to compute precision assuming the labels are binary. See:
In [5]: import tensorflow as tf
In [6]: m = tf.keras.metrics.Precision()
...: m.update_state([0, 1, 2, 3], [0, 1, 2, 2])
...: m.result().numpy()
Out[6]: 1.0
In [7]: tf.__version__
Out[7]: '2.14.0-dev20230611'
In [8]: m = tf.keras.metrics.Precision()
...: m.update_state([0, 5, 3, 3], [0, 1, 2, 2])
...: m.result().numpy()
Out[8]: 1.0
The above shouldn't be 1.0 if the labels were treated as non-binary. It appears that 0 is treated as 0, while any non-zero value is treated as 1. But nowhere does the documentation mention this behavior, and I can't find a categorical precision metric or similar either. Please update the documentation to explain this behavior.
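The observed results are consistent with the following sketch (an assumption about the behavior, not Keras's actual implementation): labels and predictions are both thresholded, so every non-zero class id collapses to 1 before precision is computed.

```python
import numpy as np

def binary_precision(y_true, y_pred, threshold=0.5):
    """Hypothetical helper mimicking the observed behavior: values above
    `threshold` (default 0.5) count as positive, everything else as negative,
    regardless of the original class ids."""
    y_true = np.asarray(y_true) > threshold
    y_pred = np.asarray(y_pred) > threshold
    true_positives = np.sum(y_true & y_pred)
    predicted_positives = np.sum(y_pred)
    return true_positives / predicted_positives if predicted_positives else 0.0

# Both sessions above reduce to the same binary problem:
# [0, 1, 2, 3] -> [F, T, T, T] and [0, 5, 3, 3] -> [F, T, T, T],
# so each yields precision 1.0 against predictions [0, 1, 2, 2].
print(binary_precision([0, 1, 2, 3], [0, 1, 2, 2]))  # 1.0
print(binary_precision([0, 5, 3, 3], [0, 1, 2, 2]))  # 1.0
```

This would explain why distinct multi-class labels such as 5 and 3 make no difference to the result.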
Standalone code to reproduce the issue
See above.
Relevant log output
No response