
Add TF<>PT and Flax<>PT everywhere #14047

Conversation

@patrickvonplaten (Contributor) commented Oct 18, 2021

What does this PR do?

This PR adds the PT<>TF and PT<>Flax equivalence tests to the common PyTorch tests so that the equivalence tests are also fetched when just the PyTorch files are changed. The PR also uncovered a couple of smaller bugs in TFHuBERT and FlaxAlbert.
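
To illustrate what such an equivalence test checks, here is a minimal, self-contained sketch of the general idea; it is not the code added in this PR, BERT is used only as a stand-in, and the tolerance is a placeholder. A PyTorch model is built from a small dummy config, converted to Flax with `from_pt=True`, and both models are run on the same inputs:

```python
import tempfile

import numpy as np
import torch
from transformers import BertConfig, BertModel, FlaxBertModel

# Tiny dummy config, in the spirit of the PyTorch test suite's model testers.
config = BertConfig(
    vocab_size=99,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=4,
    intermediate_size=64,
)
pt_model = BertModel(config).eval()

# One shared set of inputs for both frameworks.
input_ids = np.random.randint(0, config.vocab_size, size=(2, 7), dtype="int64")
attention_mask = np.ones_like(input_ids)

# Convert the randomly initialized PyTorch weights to Flax.
with tempfile.TemporaryDirectory() as tmpdir:
    pt_model.save_pretrained(tmpdir)
    fx_model = FlaxBertModel.from_pretrained(tmpdir, from_pt=True)

with torch.no_grad():
    pt_out = pt_model(
        input_ids=torch.tensor(input_ids),
        attention_mask=torch.tensor(attention_mask),
    ).last_hidden_state.numpy()

fx_out = np.asarray(fx_model(input_ids, attention_mask=attention_mask).last_hidden_state)

# 4e-2 is a placeholder tolerance, not necessarily the one used in the test suite.
assert pt_out.shape == fx_out.shape
assert np.allclose(pt_out, fx_out, atol=4e-2), "PT and Flax outputs diverged"
```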

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@patrickvonplaten patrickvonplaten mentioned this pull request Oct 18, 2021
@@ -314,35 +314,13 @@ def __call__(
return outputs


class FlaxAlbertLayers(nn.Module):
@patrickvonplaten (Contributor, Author) commented on this diff:
There were some hidden bugs in the Flax implementation of Albert: the architecture didn't allow config.inner_group_num > 1. The issue is now fixed in a fully backwards-compatible way.
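
As an aside, here is a minimal flax.linen sketch of the grouped-layer layout ALBERT uses; the module and field names are made up for illustration and this is not the code from the PR. The point is that each parameter-shared group owns inner_group_num inner layers, and the encoder maps every layer index to its group, so config.inner_group_num > 1 works without adding per-layer parameters:

```python
import flax.linen as nn
import jax
import jax.numpy as jnp


class InnerGroup(nn.Module):
    """One parameter-shared group holding `inner_group_num` inner layers."""

    hidden_size: int
    inner_group_num: int

    @nn.compact
    def __call__(self, hidden_states):
        # Run every inner layer of the group in sequence.
        for i in range(self.inner_group_num):
            hidden_states = nn.Dense(self.hidden_size, name=f"layer_{i}")(hidden_states)
            hidden_states = nn.gelu(hidden_states)
        return hidden_states


class GroupedEncoder(nn.Module):
    hidden_size: int = 32
    num_hidden_layers: int = 4
    num_hidden_groups: int = 2
    inner_group_num: int = 2  # > 1 is the case the original Flax code mishandled

    @nn.compact
    def __call__(self, hidden_states):
        groups = [
            InnerGroup(self.hidden_size, self.inner_group_num, name=f"group_{g}")
            for g in range(self.num_hidden_groups)
        ]
        layers_per_group = self.num_hidden_layers // self.num_hidden_groups
        for layer_idx in range(self.num_hidden_layers):
            # ALBERT-style parameter sharing: each layer index reuses its group.
            hidden_states = groups[layer_idx // layers_per_group](hidden_states)
        return hidden_states


# Smoke test: parameters exist per group/inner layer, not per encoder layer.
params = GroupedEncoder().init(jax.random.PRNGKey(0), jnp.ones((1, 5, 32)))
```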

@sgugger (Collaborator) left a comment:

Thanks for adding those. The only downside is that they will be run twice when we do the full suite, but I don't think it's a big issue.

@patrickvonplaten (Contributor, Author) replied:

Thanks for adding those. The only downside is that they will be run twice when we do the full suite, but I don't think it's a big issue.

Yes, but I think it's much more important to make sure that if someone changes PyTorch's CLIP, the test fetcher runs both the PT<>Flax and PT<>TF equivalence tests.

Also, the tests are actually slightly different, in the sense that the ones I added:

  • Use the dummy model config of the PyTorch test suite
  • Use the model inputs created in the PyTorch tests

whereas the previous tests used Flax's or TF's config and inputs. So I think having those tests also forces us to be consistent with the testing configs and inputs across frameworks.
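
Roughly, the shape of that idea, as a hedged sketch rather than the PR's actual code: a single helper takes the PyTorch dummy inputs, converts them to numpy for the Flax model, runs both, and compares the first output within a tolerance (the real tests' tolerance may differ).

```python
import numpy as np
import torch


def check_pt_flax_equivalence(pt_model, fx_model, pt_inputs_dict, atol=4e-2):
    """Run a PT model and its Flax counterpart on the same PT-defined inputs."""
    pt_model.eval()

    # Reuse the PyTorch test inputs; only convert tensors to numpy for Flax.
    fx_inputs = {
        k: v.numpy() for k, v in pt_inputs_dict.items() if isinstance(v, torch.Tensor)
    }

    with torch.no_grad():
        pt_outputs = pt_model(**pt_inputs_dict)
    fx_outputs = fx_model(**fx_inputs)

    # Compare the first output of each framework within a tolerance.
    pt_first = pt_outputs[0].numpy()
    fx_first = np.asarray(fx_outputs[0])
    assert pt_first.shape == fx_first.shape
    assert np.allclose(pt_first, fx_first, atol=atol), "PT and Flax outputs diverged"
```

In the common suite, something like this would be driven once per model class with the PyTorch model tester's config and inputs, which is what keeps the testing configs and inputs identical across frameworks.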

@patrickvonplaten patrickvonplaten merged commit 0c3174c into huggingface:master Oct 25, 2021
@patrickvonplaten patrickvonplaten deleted the add_conversion_tests_everywhere branch October 25, 2021 21:55
Albertobegue pushed a commit to Albertobegue/transformers that referenced this pull request Jan 27, 2022
* up

* up

* up

* up

* up

* up

* up

* add clip

* fix clip PyTorch

* fix clip PyTorch

* up

* up

* up

* up

* up

* up

* up