
Remove a TF usage warning and rework the documentation #9756

Merged
5 commits merged into huggingface:master from remove-warning on Jan 27, 2021

Conversation

@jplu (Contributor) commented Jan 22, 2021

What does this PR do?

Recently, we moved the warning for misused boolean arguments in TF models from warnings.warn(...) to tf.print in order to avoid overwhelming the output with messages. Nevertheless, the use of tf.print prevents all our TF models from being compiled and executed with XLA and from being quantized with TFLite, because the Print operator is supported by neither XLA nor TFLite.

As a solution to both issues (the flood of log messages and XLA compilation/execution), I propose to simply remove the logs and state the use case directly in the documentation of all the TF models.
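For illustration, here is a minimal sketch of the pattern described above; it is not the actual transformers code, and the function and argument names (call, output_attentions) are made up. jit_compile is the current name of the tf.function flag that requests XLA compilation (older releases called it experimental_compile).

import tensorflow as tf

@tf.function(jit_compile=True)  # request XLA compilation of this function
def call(hidden_states, output_attentions=False):
    if output_attentions:
        # Per the PR description, the Print op emitted by tf.print is supported
        # by neither the XLA compiler nor the TFLite converter, so a graph that
        # contains it cannot be XLA-compiled or converted/quantized for TFLite.
        tf.print("The parameter output_attentions has no effect in graph mode.")
    return tf.reduce_sum(hidden_states)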

@sgugger (Collaborator) left a comment


I'm not a big fan of removing the warning, but if there is no alternative, using the documentation to say the same thing is good enough.

@patrickvonplaten (Contributor) commented Jan 25, 2021

Not in favor of this, to be honest... Warnings / print statements are super important IMO. It's weird that tf.print is not supported in XLA -> how are logs / warnings / print statements produced in XLA then?

@jplu (Contributor, Author) commented Jan 25, 2021

XLA/graph/TFLite execution is not designed to print things; this applies to some uses of assert as well.

@jplu (Contributor, Author) commented Jan 25, 2021

@patrickvonplaten TF XLA has a few other known issues: https://www.tensorflow.org/xla/known_issues

@patrickvonplaten (Contributor) commented

> @patrickvonplaten TF XLA has a few other known issues: https://www.tensorflow.org/xla/known_issues

Thanks for sharing! I'm still very surprised by your message:

> Nevertheless, the use of tf.print prevents all our TF models from being compiled and executed with XLA and from being quantized with TFLite

To me, this means that every TF repo that wants to be executable with XLA must have no tf.print(...) ops. That's a pretty hard requirement, no?

@jplu (Contributor, Author) commented Jan 26, 2021

> To me, this means that every TF repo that wants to be executable with XLA must have no tf.print(...) ops. That's a pretty hard requirement, no?

I agree, it is, but I very rarely see tf.print(...) being used. As far as I know, I have never seen it used at runtime in the official TF models (you can easily check with a quick search on https://github.com/tensorflow/models); it is usually used when evaluating a model, which is a use case that is not XLA-related.

@sgugger (Collaborator) commented Jan 26, 2021

Can we use

import tensorflow as tf
tf_logger = tf.get_logger()
tf_logger.warn(xxx)

instead?
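For concreteness, here is a minimal sketch of what this suggestion looks like inside a layer; the class name MySelfAttention and the output_attentions argument are invented for the example and are not taken from the transformers codebase.

import tensorflow as tf

class MySelfAttention(tf.keras.layers.Layer):
    def call(self, inputs, output_attentions=False):
        if output_attentions and not tf.executing_eagerly():
            # tf.get_logger() returns the standard Python logger named "tensorflow",
            # so the warning goes through ordinary Python logging instead of adding
            # a Print op to the graph.
            tf.get_logger().warning(
                "The parameter output_attentions cannot be used in graph mode."
            )
        return inputs

Because the logger call is plain Python, it only runs when the function is traced rather than on every graph execution, which also avoids the flood of messages the PR description mentions.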

@jplu (Contributor, Author) commented Jan 26, 2021

@sgugger Yes, using the internal TF logger should work as expected, but I'm not sure whether it will create any conflict with the actual transformers logger in terms of configuration.

Do you want me to use it instead, and we will see later if there is indeed any conflict?

@sgugger (Collaborator) commented Jan 26, 2021

Yeah, I think it would be better to have a warning the user can't silence with our centralized logging than no warning at all.
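A small sketch of that point, assuming the standard transformers.utils.logging helpers: the "tensorflow" logger returned by tf.get_logger() is a separate Python logger from the "transformers" one, so the library's verbosity settings do not silence it.

import tensorflow as tf
from transformers.utils import logging as hf_logging

hf_logging.set_verbosity_error()  # silences only the loggers under "transformers"
tf.get_logger().warning("This warning is still shown.")  # the "tensorflow" logger is unaffected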

@jplu (Contributor, Author) commented Jan 26, 2021

OK, done! I restored the warning, but with the internal TF logger.

@LysandreJik (Member) left a comment


Works for me, thanks for taking care of it.

@jplu merged commit a172069 into huggingface:master on Jan 27, 2021
@jplu deleted the remove-warning branch on Jan 27, 2021 at 10:04
Qbiwan pushed a commit to Qbiwan/transformers that referenced this pull request Jan 31, 2021
Remove a TF usage warning and rework the documentation (#9756)

* Rework documentation

* Update the template

* Trigger CI

* Restore the warning but with the TF logger

* Update convbert doc