[Fix doc example] Fix 2 PyTorch Vilt docstring examples #16076
Conversation
@@ -969,7 +972,8 @@ def forward(
     >>> selected_token = ""
     >>> encoded = processor.tokenizer(inferred_token)
-    >>> processor.decode(encoded.input_ids[0], skip_special_tokens=True)
+    >>> output = processor.decode(encoded.input_ids[0], skip_special_tokens=True)
+    >>> print(output)
Think it is better to print(output) here.
We can add an expected output here and add ViLT to the doc tests
There is already an expected output; that's why I think it is better to print the output.
Otherwise the code example won't show any output, and the expected output would then look (logically) strange.
(I am not sure if this is a valid argument though.)
Yes, if we print something it should be shown below :-)
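For reference, a minimal self-contained sketch of the decode-then-print doctest pattern under discussion; it uses bert-base-uncased as a stand-in checkpoint rather than the ViLT processor from the actual example, and the printed sentence is only illustrative:

>>> from transformers import AutoTokenizer
>>> tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
>>> # mirror the shape of the ViLT example: a list with one sentence
>>> encoded = tokenizer(["a bunch of cats laying on a couch."])
>>> output = tokenizer.decode(encoded.input_ids[0], skip_special_tokens=True)
>>> # printing makes the example emit a line that the expected output below can match
>>> print(output)
a bunch of cats laying on a couch.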
The documentation is not available anymore as the PR was closed or merged.
Thanks for fixing! Can we add ViLT to the doctests? Namely this file: https://github.com/huggingface/transformers/blob/master/utils/documentation_tests.txt
Added (after discussing with Patrick).
Thanks for fixing those!
@NielsRogge I added ViLT to the doctest list file.
Will merge later today (unless there is further comment 🙂).
Can you confirm the doc tests are passing?
Hi, yes, I confirmed! Here is the doctest run: https://github.com/huggingface/transformers/runs/5543735270?check_suite_focus=true
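For anyone wanting to reproduce such a check locally, one generic option is Python's built-in doctest module; this is only a rough sketch, not the project's actual runner, which drives these checks through pytest and utils/documentation_tests.txt:

import doctest

import transformers.models.vilt.modeling_vilt as modeling_vilt

# Run the >>> examples embedded in the module's docstrings.
# Note: the ViLT examples download checkpoints and an image, so this
# needs network access and can take a while.
results = doctest.testmod(modeling_vilt, verbose=False)
print(f"attempted={results.attempted}, failed={results.failed}")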
LGTM!
* fix 2 pytorch vilt docstring examples
* add vilt to doctest list file
* remove device

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
What does this PR do?
Fix 2 PyTorch Vilt docstring examples