
Test XLA examples #5583

Merged
merged 5 commits into master from tpu-ci on Jul 9, 2020
Conversation

LysandreJik (Member)

Add a script to test examples on XLA.

LysandreJik requested a review from julien-c on July 7, 2020 at 18:55
@sshleifer (Contributor) left a comment

Love this!

@@ -73,7 +73,7 @@ jobs:
     - checkout
     - run: sudo pip install .[sklearn,torch,testing]
     - run: sudo pip install -r examples/requirements.txt
-    - run: python -m pytest -n 8 --dist=loadfile -s ./examples/ | tee output.txt
+    - run: python -m pytest -n 8 --dist=loadfile -s ./examples/ --ignore=examples/test_xla_examples.py | tee output.txt
Contributor

you could also make a require_xla decorator.

LysandreJik (Member, Author)

Very good idea!
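
A minimal sketch of what such a decorator could look like, assuming XLA availability is detected by trying to import torch_xla; the name require_torch_xla and the detection strategy are illustrative assumptions, not code from this PR:

import unittest

try:
    import torch_xla.core.xla_model as xm  # noqa: F401
    _torch_xla_available = True
except ImportError:
    _torch_xla_available = False


def require_torch_xla(test_case):
    """Skip the decorated test unless torch_xla (XLA/TPU support) is installed."""
    if not _torch_xla_available:
        return unittest.skip("test requires torch_xla")(test_case)
    return test_case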

# Assert that the model trains
self.assertGreaterEqual(value, 0.70)

# Assert that the script takes less than 100 seconds to make sure it doesn't hang.
Contributor

(nit) You can do this with pytimeout

LysandreJik (Member, Author)

Nice, will do!
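
The nit presumably refers to the pytest-timeout plugin, which fails a test that exceeds a time budget instead of asserting on elapsed time by hand. A minimal sketch under that assumption (the test name is hypothetical; the 100-second budget mirrors the comment above):

# Requires `pip install pytest-timeout`.
import pytest


@pytest.mark.timeout(100)  # fail the test if it runs for more than 100 seconds
def test_xla_example_does_not_hang():
    ...  # launch the example script, then assert the reported metric is >= 0.70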

codecov bot commented Jul 9, 2020

Codecov Report

Merging #5583 into master will increase coverage by 1.52%.
The diff coverage is 40.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #5583      +/-   ##
==========================================
+ Coverage   76.31%   77.83%   +1.52%     
==========================================
  Files         145      145              
  Lines       25049    25053       +4     
==========================================
+ Hits        19116    19500     +384     
+ Misses       5933     5553     -380     
Impacted Files | Coverage Δ
src/transformers/testing_utils.py | 76.47% <40.00%> (-4.39%) ⬇️
src/transformers/modeling_tf_distilbert.py | 66.25% <0.00%> (-32.52%) ⬇️
src/transformers/modeling_tf_utils.py | 88.09% <0.00%> (-1.03%) ⬇️
src/transformers/file_utils.py | 79.26% <0.00%> (-0.34%) ⬇️
src/transformers/data/processors/utils.py | 27.63% <0.00%> (+1.31%) ⬆️
src/transformers/tokenization_xlnet.py | 90.09% <0.00%> (+1.80%) ⬆️
src/transformers/generation_tf_utils.py | 86.21% <0.00%> (+2.25%) ⬆️
src/transformers/tokenization_roberta.py | 98.63% <0.00%> (+2.73%) ⬆️
src/transformers/training_args.py | 77.55% <0.00%> (+11.22%) ⬆️
src/transformers/data/processors/glue.py | 49.09% <0.00%> (+17.09%) ⬆️
... and 7 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 33e43ed...8f84df6. Read the comment docs.

LysandreJik merged commit 0533cf4 into master on Jul 9, 2020
LysandreJik deleted the tpu-ci branch on July 9, 2020 at 13:19