
Compatibility with latest Huggingface implementation of Longformers? #140

Open
MarkusSagen opened this issue Nov 24, 2020 · 2 comments

@MarkusSagen

Hi! I've been running the convert_model_to_long.ipynb notebook and it works great with 🤗 transformers 3.0.2.
When upgrading to the latest version of Huggingface transformers (3.5.1), the notebook doesn't work, since they seem to have changed their implementation of Longformer and LongformerSelfAttention.

Is there an easy way to change RobertaLongSelfAttention to match the current Huggingface implementation, or are there plans for updating the notebook?
Following hitskyer, the dimensions become correct, but I don't know how to proceed and change the old forward pass into the new implementation's forward.
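For anyone else hitting this, here is a minimal sketch of the kind of adapter I mean, assuming the newer LongformerSelfAttention.forward signature (attention_mask, is_index_masked, is_index_global_attn, is_global_attn, output_attentions) and the 4.x import path; it is not the contents of any particular PR, and mask shapes/argument names may differ between transformers versions:

```python
import torch
# In transformers 4.x the class lives under transformers.models.longformer;
# in 3.5.x it is transformers.modeling_longformer instead.
from transformers.models.longformer.modeling_longformer import LongformerSelfAttention


class RobertaLongSelfAttention(LongformerSelfAttention):
    """Adapter so RobertaAttention can call the newer Longformer attention."""

    def forward(
        self,
        hidden_states,
        attention_mask=None,
        head_mask=None,
        encoder_hidden_states=None,
        encoder_attention_mask=None,
        output_attentions=False,
    ):
        # RobertaEncoder passes the extended mask of shape (batch, 1, 1, seq_len);
        # Longformer self-attention works on a (batch, seq_len) mask, so drop the
        # broadcast dimensions first. (Assumes seq_len is already padded to a
        # multiple of the attention window, as the conversion notebook does.)
        attention_mask = attention_mask.squeeze(dim=2).squeeze(dim=1)

        # Derive the flags the newer Longformer attention expects: after the
        # extended-mask transform, padding positions are large negative values
        # and global-attention positions (mask value 2) become positive.
        is_index_masked = attention_mask < 0
        is_index_global_attn = attention_mask > 0
        is_global_attn = is_index_global_attn.flatten().any().item()

        return super().forward(
            hidden_states,
            attention_mask=attention_mask,
            is_index_masked=is_index_masked,
            is_index_global_attn=is_index_global_attn,
            is_global_attn=is_global_attn,
            output_attentions=output_attentions,
        )
```

The head_mask and encoder_* arguments are only kept so the signature lines up with what RobertaAttention passes positionally; they are not forwarded, since the Longformer attention of that era does not use them.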

@ibeltagy
Collaborator

Unlikely that we will find time to update the notebook, but PRs are really appreciated.

@adamwawrzynski

I've added a PR with an updated RoBERTa conversion script for transformers v4.2.0.
