
Removed unused encoder_hidden_states and encoder_attention_mask #8972

Merged

Conversation

guillaume-be (Contributor)

What does this PR do?

This PR removes the unused encoder_hidden_states and encoder_attention_mask arguments from the MobileBert forward methods. These arguments are used by decoder models, but MobileBert does not include a cross-attention mechanism.

Fixes #8969
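For context, here is a minimal sketch of the kind of signature change this PR makes. This is not the actual huggingface/transformers source: the class name and method body are simplified placeholders, though the argument names match the real MobileBert forward signatures.

```python
import torch


# Illustrative sketch only, not the actual transformers code.
# Before this PR, MobileBert forward methods accepted cross-attention
# arguments that were silently ignored (simplified):
class MobileBertLayerBefore(torch.nn.Module):
    def forward(
        self,
        hidden_states,
        attention_mask=None,
        head_mask=None,
        encoder_hidden_states=None,   # unused: MobileBert has no cross-attention
        encoder_attention_mask=None,  # unused: MobileBert has no cross-attention
        output_attentions=False,
    ):
        # ... self-attention over hidden_states only; the encoder_* inputs
        # are never read anywhere in the layer ...
        return hidden_states


# After this PR, the dead parameters are removed, so passing decoder-style
# inputs raises a TypeError instead of being silently ignored:
class MobileBertLayerAfter(torch.nn.Module):
    def forward(
        self,
        hidden_states,
        attention_mask=None,
        head_mask=None,
        output_attentions=False,
    ):
        # ... self-attention over hidden_states only ...
        return hidden_states
```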

Who can review?

albert, bert, XLM: @LysandreJik

@guillaume-be (Contributor, Author)

@LysandreJik I had to remove some tests that were testing the decoder mode for MobileBert.

One test (flax) still fails; the error seems unrelated to this PR, unless I am missing something?

LysandreJik self-requested a review on December 8, 2020.
LysandreJik (Member) left a comment:

This LGTM; it doesn't remove any supported behavior. @patrickvonplaten, what do you think?

patrickvonplaten (Contributor) left a comment:

Thanks a lot for cleaning that up, @guillaume-be

LysandreJik merged commit 7809eb8 into huggingface:master on Dec 8, 2020.
Closes: MobileBERT decoder capabilities (#8969)