
Adds missing module_specs for usages of _LazyModule #15230

Merged (4 commits into huggingface:master, Jan 21, 2022)

Conversation

jkuball (Contributor) commented Jan 19, 2022

What does this PR do?

This PR adds a missing __spec__ to the _LazyModule instances of all models.

When a module is missing a spec, it cannot be imported via importlib.import_module, which is what torchmetrics does and how I found this in the first place.
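
For context, here is a minimal sketch of how the missing spec surfaces downstream. It assumes a transformers version from before this fix; the exception text matches the linked issue.

```python
import importlib.util

# Importing the package places the lazy submodule in sys.modules; before this
# PR, its __spec__ attribute was left as None.
import transformers.models.auto

# find_spec() checks sys.modules first and, for an already-imported module
# whose __spec__ is None, raises
#   ValueError: transformers.models.auto.__spec__ is None
# which is how availability checks in libraries such as torchmetrics break.
importlib.util.find_spec("transformers.models.auto")
```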

Related to #13321 (pr)
Fixes #15212 (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

HuggingFaceDocBuilder commented Jan 19, 2022

The documentation is not available anymore as the PR was closed or merged.

jkuball marked this pull request as ready for review on January 19, 2022, 16:12
eroehrig commented:

😍 Exactly what I need

sgugger (Collaborator) left a comment

Thanks for fixing this. Note that it should be added in every _LazyModule but I can take care of that in a separate PR.

Comment on lines 32 to 33

```python
assert transformers.models.auto.__spec__ is not None
assert importlib.util.find_spec("transformers.models.auto") is not None
```
sgugger (Collaborator):

Please use self.assertIsNotNone here.

jkuball (Contributor, author):

Since this is not part of a unittest.TestCase, I can't.

sgugger (Collaborator):

It should be in the AutoConfigTest.

jkuball (Contributor, author):

I can and will do that! I just based my changes on #13321, which also uses the plain pytest variant.

sgugger (Collaborator):

Yes, I didn't catch the test was not up to our usual standards in that one :-) Thanks for fixing here!
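
A rough sketch of the moved test, following the thread above: the assertions live in a unittest.TestCase and use self.assertIsNotNone. The exact class and file they end up in are an assumption here.

```python
import importlib.util
import unittest

import transformers.models.auto


class AutoConfigTest(unittest.TestCase):
    # Sketch only: in the repository this would be added to the existing
    # AutoConfigTest rather than defined as a standalone class.
    def test_module_spec(self):
        self.assertIsNotNone(transformers.models.auto.__spec__)
        self.assertIsNotNone(importlib.util.find_spec("transformers.models.auto"))
```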

jkuball (Contributor, author) commented Jan 20, 2022

I have just moved the test. I also have a local commit lying around that adds the module_spec to all occurrences of _LazyModule, but I am not sure where to put the tests for that.

Looking at the test_modeling_{{cookiecutter.lowercase_modelname}}.py template, I could add the test to both variants of the {{cookiecutter.camelcase_modelname}}ModelTester, but I am not sure how to import the model beforehand, since I am not entirely sure how the lazy modules work exactly.

Can I just add an import transformers.models.{{cookiecutter.lowercase_modelname}} to all model test files, even outside of is_torch_available, or would this break things?

sgugger (Collaborator) commented Jan 20, 2022

Oh, that's great! If you want to include that commit in this PR and rename it, that would be great. Regarding tests, I don't think we need more than the test for the auto submodule and the test for transformers itself (since it's the same thing everywhere). To answer your question, however: it does not need to be under an is_torch_available().

To have the PR be perfectly complete, could you also add the change to the modeling template (if you don't have it in your commit already), as well as fix the test for the transformers spec so it is in a unittest.TestCase?
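
For reference, the per-module change being discussed looks roughly like this in a model's __init__.py. This is a sketch: the placeholder names and the exact import location of _LazyModule are assumptions, but passing module_spec=__spec__ is the addition this PR makes.

```python
# Sketch of transformers/models/<model>/__init__.py with the fix applied;
# "configuration_xxx" / "XxxConfig" are placeholder names.
import sys

from ...file_utils import _LazyModule  # assumed location of _LazyModule at the time

_import_structure = {
    "configuration_xxx": ["XxxConfig"],
}

# Replace the plain module with a _LazyModule that now carries the module
# spec, so that importlib.util.find_spec() keeps working.
sys.modules[__name__] = _LazyModule(
    __name__,
    globals()["__file__"],
    _import_structure,
    module_spec=__spec__,  # the previously missing spec
)
```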

jkuball (Contributor, author) commented Jan 21, 2022

Okay, I figured as much regarding the tests, but I wasn't sure.
In theory more tests wouldn't hurt, but yeah, it's probably okay to only test the _LazyModule once.

From my side this would now be mergeable, unless I made an error somewhere!

Edit: We could also get rid of the Optional hint for module_spec (or at least the None default) to prevent these issues from arising again later; a sketch of that idea follows below.
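
A sketch of that follow-up idea only (it is not part of this PR); the signature below is an approximation of _LazyModule.__init__ at the time.

```python
from importlib.machinery import ModuleSpec
from types import ModuleType
from typing import Optional


class _LazyModule(ModuleType):
    # Today: module_spec is Optional with a None default, so call sites can
    # silently forget to pass it and __spec__ ends up as None.
    def __init__(self, name, module_file, import_structure,
                 module_spec: Optional[ModuleSpec] = None, extra_objects=None):
        super().__init__(name)
        self.__spec__ = module_spec  # stays None when the argument is omitted

    # Suggested: make the parameter required, e.g. `module_spec: ModuleSpec`
    # with no default, so forgetting it raises a TypeError when the submodule
    # is defined instead of breaking importlib much later.
```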

jkuball changed the title from "Add missing __spec__ for transformers.models.auto" to "Adds missing module_specs for usages of _LazyModule" on Jan 21, 2022
sgugger (Collaborator) commented Jan 21, 2022

Thank you so much for the wide fix! With the templates properly fixed, I don't think we will get new inits without the module_spec (if someone adds another new submodule, they will copy an existing init), so I think we're good.

@sgugger sgugger merged commit c962c2a into huggingface:master Jan 21, 2022
Successfully merging this pull request may close these issues.

ValueError: transformers.models.auto.__spec__ is None. causing import errors