Fix test_configuration_tie in FlaxEncoderDecoderModelTest #14076

Merged: 5 commits merged into huggingface:master on Nov 2, 2021

Conversation

@ydshieh (Collaborator) commented Oct 20, 2021

What does this PR do?

Fix the test_configuration_tie in FlaxEncoderDecoderModelTest.

The test_configuration_tie in FlaxEncoderDecoderModelTest currently fails, as shown in the run_tests_flax job. See below.

FlaxEncoderDecoderModel doesn't have encoder and decoder attributes: they live inside FlaxEncoderDecoderModule, so an indirect way to access them is needed (see the sketch below).

However, it might be too much to add these properties just for a test, without another compelling use case.
If we don't want to add them to FlaxEncoderDecoderModel, we could probably move the changes inside test_configuration_tie instead.
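
For context, a rough sketch of the structure described above (the config choices are illustrative assumptions, not taken from this PR):

    from transformers import BertConfig, GPT2Config, EncoderDecoderConfig, FlaxEncoderDecoderModel

    # Illustrative, assumed configs; any compatible encoder/decoder pair would do.
    config = EncoderDecoderConfig.from_encoder_decoder_configs(BertConfig(), GPT2Config())
    model = FlaxEncoderDecoderModel(config)

    sub_config = model.config.encoder  # the encoder sub-config is reachable via the model's config
    # model.encoder                    # ...but this raises AttributeError (see the traceback below):
                                       # the submodules live on the inner FlaxEncoderDecoderModule
    inner = model.module               # the FlaxEncoderDecoderModule instance
    # inner.encoder / inner.decoder are created in the module's setup(), so they can only be
    # reached indirectly, e.g. by binding parameters to the module (see the suggestion further down).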

BTW, this test is marked @slow, so it won't be run by CircleCI. "How to contribute to transformers?" mentions that CircleCI does not run the slow tests, but GitHub Actions does every night.

So test_configuration_tie failed on GitHub Actions, and I am wondering: when a slow test fails, what happens next? Is the report only visible internally at Hugging Face?
(I finally found the report under the GitHub Actions tab.)

CircleCI test results

    def _check_configuration_tie(self, model):
>       assert id(model.decoder.config) == id(model.config.decoder)
E       AttributeError: 'FlaxEncoderDecoderModel' object has no attribute 'decoder'

@ydshieh changed the title from "[WIP ] Fix test_configuration_tie in FlaxEncoderDecoderModelTest" to "Fix test_configuration_tie in FlaxEncoderDecoderModelTest" on Oct 20, 2021
@ydshieh marked this pull request as ready for review on October 20, 2021 08:54
@patrickvonplaten (Contributor) commented:

Thanks a lot for fixing the test! The implemented approach is clean and makes total sense. However, I'm not really in favor of exposing the encoder module this way, for the following reason. In PyTorch, calling model.encoder (with model = EncoderDecoderModel(...)) gives you the encoder class with all the weights and PreTrainedModel functions attached. However, this would not be the case in Flax: model.encoder.params would not work, since model.encoder is a Flax module, not a model. It's not really possible to retrieve the Flax model in the Flax encoder-decoder design IMO, so I would prefer not to add such properties and rather change the config test.
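
To make the contrast concrete, a minimal sketch (the checkpoint names are assumptions for illustration, not taken from this PR):

    from transformers import EncoderDecoderModel, FlaxEncoderDecoderModel

    # PyTorch: model.encoder is a full PreTrainedModel, with its own weights and the usual API.
    pt_model = EncoderDecoderModel.from_encoder_decoder_pretrained(
        "bert-base-uncased", "bert-base-uncased"
    )
    pt_model.encoder.save_pretrained("tmp-encoder")  # works: the encoder carries its own parameters

    # Flax: parameters live on the model (fx_model.params), not on the inner linen module,
    # so there is no model-level `encoder` that could expose something like `.params`.
    fx_model = FlaxEncoderDecoderModel.from_encoder_decoder_pretrained(
        "bert-base-cased", "gpt2"
    )
    encoder_weights = fx_model.params["encoder"]  # encoder weights are just a sub-tree of the params dict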

@patil-suraj - could you take a look here as well?

@ydshieh (Collaborator, Author) commented Oct 29, 2021

I agree with @patrickvonplaten, especially since the exposure here would be only for testing purposes.

By changing the config test, do you mean we still want to keep the testing logic of test_configuration_tie, but get the encoder/decoder modules inside it rather than exposing them in FlaxEncoderDecoderModel? (I am OK with this option.)

P.S. Currently, FlaxVisionEncoderDecoderModelTest doesn't have this test, following our discussion in another thread:

#13359 (comment)

@patil-suraj (Contributor) commented:

Agree with @patrickvonplaten.

@ydshieh I think we could still keep the test. For this use case, we could use the bind method, which makes the Flax module stateful, so we can directly access the encoder/decoder:

module = model.module.bind(model.params)
enc_config = module.encoder.config
dec_config = module.decoder.config

WDYT @patrickvonplaten?

@ydshieh (Collaborator, Author) commented Nov 2, 2021

> Agree with @patrickvonplaten.
>
> @ydshieh I think we could still keep the test. For this use case, we could use the bind method, which makes the Flax module stateful, so we can directly access the encoder/decoder:
>
>     module = model.module.bind(model.params)
>     enc_config = module.encoder.config
>     dec_config = module.decoder.config

It works, thank you @patil-suraj.

https://app.circleci.com/pipelines/github/huggingface/transformers/29684/workflows/1392381e-9fbe-488d-9164-0bd4add6b393/jobs/299243
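
For reference, a sketch of how the check can look with this approach, combining the snippet above with the original assertion (plus the symmetric encoder check; the exact merged test may differ):

    def _check_configuration_tie(self, model):
        # Binding the parameters to the module makes the encoder/decoder submodules
        # accessible, so their configs can be compared to the model config by identity.
        module = model.module.bind(model.params)
        assert id(module.decoder.config) == id(model.config.decoder)
        assert id(module.encoder.config) == id(model.config.encoder)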

@patrickvonplaten (Contributor) commented:

I'll let you merge if you're happy with the PR @patil-suraj :-)

@patil-suraj (Contributor) left a review comment:

LGTM now! Thanks a lot for fixing this @ydshieh!

@patil-suraj merged commit 4a394cf into huggingface:master on Nov 2, 2021
Albertobegue pushed a commit to Albertobegue/transformers that referenced this pull request Jan 27, 2022
…e#14076)

* check test_configuration_tie

* Fix test_configuration_tie

* make test slow again

* Remove property and use model.module.bind

* revert to slow test

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
@ydshieh deleted the fix_flax_enc_dec_tie_test branch on May 5, 2022 10:38