Add precision and recall summaries #71
I print out the precision, recall, and F1-score only at the end of the test evaluation (using metrics from sklearn):
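For reference, that end-of-evaluation approach with sklearn might look like the sketch below (the label/prediction arrays are illustrative, not from the actual model):

```python
from sklearn.metrics import precision_score, recall_score, f1_score

# Illustrative binary labels and predictions collected after the test pass
y_true = [0, 1, 1, 0, 1]
y_pred = [0, 1, 0, 0, 1]

precision = precision_score(y_true, y_pred)  # TP / (TP + FP)
recall = recall_score(y_true, y_pred)        # TP / (TP + FN)
f1 = f1_score(y_true, y_pred)                # harmonic mean of precision and recall

print(precision, recall, f1)
```

This is simple, but as noted below it only gives one number at the end of evaluation rather than a curve you can watch in TensorBoard.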
I did that too, but it is not useful if you want to use TensorBoard.
You should initialize the local variables. Run sess.run(tf.local_variables_initializer()) before the first metric update.
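In TF 1.x, tf.contrib.metrics.streaming_precision keeps running true-positive and false-positive counts in *local* variables, which is why they must be initialized before use. Conceptually the metric behaves like this pure-Python sketch (the class and method names are illustrative, not the TF API):

```python
class StreamingPrecision:
    """Accumulates TP/FP counts across batches, like TF's streaming metrics."""

    def __init__(self):
        # These counters play the role of TF's local variables; reading the
        # metric before they exist is what raises FailedPreconditionError.
        self.true_positives = 0
        self.false_positives = 0

    def update(self, predictions, labels):
        """Analogue of the metric's update_op: run once per batch."""
        for p, y in zip(predictions, labels):
            if p == 1 and y == 1:
                self.true_positives += 1
            elif p == 1 and y == 0:
                self.false_positives += 1

    def result(self):
        """Analogue of the metric's value tensor: read at any time."""
        denom = self.true_positives + self.false_positives
        return self.true_positives / denom if denom else 0.0


metric = StreamingPrecision()
metric.update([1, 0, 1], [1, 0, 0])  # batch 1: TP=1, FP=1
metric.update([1, 1], [1, 1])        # batch 2: TP=3, FP=1
print(metric.result())               # 3 / (3 + 1) = 0.75
```

Note also that TF's streaming metrics return a (value, update_op) pair: the summary should be attached to the value tensor, while update_op is run on each batch to refresh the counts.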
@ferasodh Hi, I was running into similar errors too. The session is initialized as follows for the dev process, and there is a new named scope for precision. Can someone help with this?
How can the code be modified to log precision and recall? I tried to add precision in the text_cnn class as follows:
self.precision = tf.contrib.metrics.streaming_precision(logits1, self.input_y, name="precision")
and I added the summary to the train file:
precision_summary = tf.summary.scalar("precision", cnn.precision)
but this triggers the following error:
FailedPreconditionError (see above for traceback): Attempting to use uninitialized value accuracy/precision/true_positives/count [[Node: accuracy/precision/true_positives/count/read = Identity[T=DT_FLOAT, _class=["loc:@accuracy/precision/true_positives/count"], _device="/job:localhost/replica:0/task:0/cpu:0"](accuracy/precision/true_positives/count)]]
Any ideas how to solve this, or any other way to add precision and recall?
Thanks,