
extend the test files #23043

Merged · 3 commits merged into main from extend_test_files on Apr 28, 2023
Conversation

@ydshieh (Collaborator) commented Apr 28, 2023

What does this PR do?

Extend the list of test files run in the cross-test CI jobs.

In #21023, the Flax BigBird modeling file was changed, and the Flax test file skips the PT/Flax tests. However, the test fetcher is not designed to take the corresponding PyTorch test file into account, and we got a test failure on `main` in the nightly run.

This PR extends the test files for the torch_and_tf and torch_and_flax jobs to avoid such situations.

The effect can be seen in this run:

https://app.circleci.com/pipelines/github/huggingface/transformers/63246/workflows/84722f3a-1259-4226-973d-267c74ca9aee/jobs/780372
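The extension logic can be sketched roughly as follows. This is a minimal illustration, not the actual test-fetcher code; the helper name `extend_cross_test_files` and the exact handling of the `test_modeling_<framework>_x.py` naming scheme are assumptions:

```python
import os

def extend_cross_test_files(test_files):
    """For each framework-specific modeling test file, add the counterpart
    test files of the other frameworks, so the cross (PT/TF, PT/Flax) tests
    run even when only one framework's file changed.

    Sketch only: assumes the test_modeling_<fw>_<model>.py naming scheme.
    """
    extended = set(test_files)
    for f in test_files:
        d, name = os.path.split(f)
        if not name.startswith("test_modeling_"):
            continue
        # Strip an existing framework prefix to recover the model part.
        model = name[len("test_modeling_"):]
        for prefix in ("tf_", "flax_"):
            if model.startswith(prefix):
                model = model[len(prefix):]
        # Candidate counterparts in all three frameworks.
        for candidate in (
            f"test_modeling_{model}",
            f"test_modeling_tf_{model}",
            f"test_modeling_flax_{model}",
        ):
            path = os.path.join(d, candidate)
            if os.path.isfile(path):  # only add test files that exist
                extended.add(path)
    return sorted(extended)
```

With a convention like this, a change touching only the Flax test file would pull the PyTorch counterpart into the cross-test job as well, which is the failure mode described above.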

@HuggingFaceDocBuilderDev commented Apr 28, 2023

The documentation is not available anymore as the PR was closed or merged.

@ydshieh ydshieh requested a review from sgugger April 28, 2023 10:01
@ydshieh ydshieh marked this pull request as ready for review April 28, 2023 10:01
@ydshieh (Collaborator, Author) commented Apr 28, 2023

I need to check that the new test files exist before adding them to the list. Will update later.

@sgugger (Collaborator) left a comment

I have already given my thoughts a thousand times on how such failures should be fixed, but since I'm being ignored again...

This is not my preferred solution, but if we go for this, I'd like the added tests to be logged somewhere so we can inspect the results. Otherwise we can't debug a failure of the test fetcher, as the generated config is not exactly readable.

@ydshieh (Collaborator, Author) commented Apr 28, 2023

> I have already given my thoughts a thousand times on how such failures should be fixed, but since I'm being ignored again...

@sgugger I don't mean to ignore your suggestion, but the only discussion I remember is this Slack discussion, where you replied to my previous messages with:

> Mmmm, maybe there is a post-processing function that could do that, yes. (Note that the whole thing for the pipeline will disappear with the new test fetcher and the new way pipeline tests are designed.)

Therefore I assumed the approach in this PR (current version) was fine.

I might have forgotten something you mentioned earlier somewhere else. In that case, please share the link and I am happy to take a look at your suggestion. Otherwise, feel free to propose what you think is best. Thank you.

> This is not my preferred solution, but if we go for this, I'd like the added tests to be logged somewhere, so we can inspect the results. Otherwise we can't debug if there is a failure of the test fetcher, as the generated config is not exactly readable.

If we keep this approach, I am happy to save files with these modified versions.

@ydshieh (Collaborator, Author) commented Apr 28, 2023

Note that I am not really in favor of duplicating the tests. We already have these tests in both the PT/TF and PT/Flax test files, which is already a kind of duplication. If we copy each version to the other framework again, that is duplication of duplication.

@sgugger (Collaborator) commented Apr 28, 2023

Let's go with logging the modified test files somehow, so we can inspect the result of the code you add.

```diff
@@ -45,7 +45,7 @@
 )
 from ...utils import ModelOutput, add_start_docstrings, add_start_docstrings_to_model_forward, logging
 from .configuration_big_bird import BigBirdConfig

+a = 1
```
@ydshieh (Collaborator, Author) commented:
Will revert before merge: this is just to trigger the tests and check that everything works as expected.

@sgugger (Collaborator) commented Apr 28, 2023

Looks like it didn't touch any new file though.

@ydshieh (Collaborator, Author) commented Apr 28, 2023

> Looks like it didn't touch any new file though.

Yeah, I am bad at getting the correct path. Now it works (when I trigger the tests via a change in a file), see:

https://app.circleci.com/pipelines/github/huggingface/transformers/63288/workflows/721ab3d6-b1af-4f2d-b036-9bebbdaef2cc/jobs/780955

@sgugger (Collaborator) left a comment

Thanks! One last ask to make it easier to navigate and we should be good to go.

Comment on lines 66 to 77
```yaml
- run: |
    if [ -f filtered_test_list_torch_and_tf.txt ]; then
      mv filtered_test_list_torch_and_tf.txt test_preparation/filtered_test_list_torch_and_tf.txt
    else
      touch test_preparation/filtered_test_list_torch_and_tf.txt
    fi
- run: |
    if [ -f filtered_test_list_torch_and_flax.txt ]; then
      mv filtered_test_list_torch_and_flax.txt test_preparation/filtered_test_list_torch_and_flax.txt
    else
      touch test_preparation/filtered_test_list_torch_and_flax.txt
    fi
```
@sgugger (Collaborator) commented:
Let's put everything in one report for cross tests (named filtered_crossed_test_list for instance)

Comment on lines 472 to 473
```python
with open(f_path, "w") as fp:
    fp.write(" ".join(job.tests_to_run))
```
@sgugger (Collaborator) commented:
You can append to `f_path` in your `for` loop, with the job name as a title, so that everything ends up in the same file.
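The suggested append-style report could look something like this. This is a sketch of the idea only, not the merged implementation; the function name, the report path, and the section-header format are assumptions, while `job_name` and `tests_to_run` mirror the snippet above:

```python
def write_cross_test_report(jobs, report_path="test_preparation/filtered_cross_test_list.txt"):
    """Write every cross-test job's test list into a single report file,
    with the job name as a section title, so the generated config can be
    inspected after the fact. Sketch only."""
    with open(report_path, "w") as fp:
        for job in jobs:
            if job.job_name not in ["tests_torch_and_tf", "tests_torch_and_flax"]:
                continue
            # Job name as a title, then its test list on one line.
            fp.write(f"=== {job.job_name} ===\n")
            fp.write(" ".join(job.tests_to_run) + "\n")
```

One file with per-job sections keeps the report human-readable, which addresses the "generated config is not exactly readable" concern.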

@ydshieh (Collaborator, Author) commented Apr 28, 2023

@sgugger Could you check the last commit and see if it is OK? (I will run a test tomorrow before merging to make sure everything works as expected; I'm already tired today after some personal meetings.)

See the new run page and here

One thing to note: I put everything regarding cross tests into a single file, as well as into the cross-test jobs. So test_modeling_tf_xxx.py might appear in the tests_torch_and_flax job, and vice versa, in some cases. It doesn't really matter, as we install only the necessary libraries in each job.

```python
extended_tests_to_run = sorted(extended_tests_to_run)
for job in jobs:
    if job.job_name in ["tests_torch_and_tf", "tests_torch_and_flax"]:
        job.tests_to_run = extended_tests_to_run
```
@ydshieh (Collaborator, Author) commented:
one single list for all cross test jobs
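The "one single list for all cross-test jobs" pattern can be sketched as a small helper. This is an illustration only; the function name `unify_cross_test_jobs` is hypothetical, while the job names and attributes mirror the snippet above:

```python
def unify_cross_test_jobs(jobs):
    """Give every cross-test job the same, sorted union of test files,
    so the PT/TF and PT/Flax jobs run one shared list. Sketch only."""
    cross = [j for j in jobs if j.job_name in ["tests_torch_and_tf", "tests_torch_and_flax"]]
    # Union of all cross-test files, deduplicated and sorted.
    extended = sorted({t for j in cross for t in j.tests_to_run})
    for j in cross:
        j.tests_to_run = extended
    return jobs
```

Sharing one list means a TF test file can land in the Flax job and vice versa, which is harmless here since each job installs only the libraries it needs.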

@sgugger (Collaborator) left a comment

Thanks!

@ydshieh ydshieh merged commit dfeb5aa into main Apr 28, 2023
@ydshieh ydshieh deleted the extend_test_files branch April 28, 2023 20:25
gojiteji pushed a commit to gojiteji/transformers that referenced this pull request Jun 5, 2023
* fix

---------

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
novice03 pushed a commit to novice03/transformers that referenced this pull request Jun 23, 2023
* fix

---------

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>