
Fix XLNet and Transformer-XL Execution #2191

Merged 6 commits into flairNLP:master on Apr 7, 2021

Conversation

@tiagokv (Contributor) commented Mar 29, 2021

Executing the following snippet throws OverflowError: int too big to convert:

from flair.embeddings import TransformerWordEmbeddings
from flair.data import Sentence

# Embedding any sentence with an XLNet-based model triggers the OverflowError
model = TransformerWordEmbeddings("xlnet-large-cased", "all", "mean", layer_mean=False)
model.embed(Sentence("This is a test that will not work"))

This happens because both XLNet and Transformer-XL have no fixed maximum sequence length, so tokenizer.model_max_length is set to a very large sentinel integer rather than a usable limit.
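As a minimal sketch of the issue (not the exact patch in this PR), the sentinel can be observed and guarded against directly on the Hugging Face tokenizer; the 10**20 threshold below is an arbitrary illustrative assumption:

from transformers import AutoTokenizer

# XLNet has no fixed maximum input length, so the tokenizer reports a huge
# sentinel integer in model_max_length instead of a real limit.
tokenizer = AutoTokenizer.from_pretrained("xlnet-large-cased")
print(tokenizer.model_max_length)  # very large sentinel value, not a usable limit

# Guard against the sentinel before forwarding the value (e.g. as a truncation
# length); the cutoff here is an assumption chosen only for illustration.
max_length = tokenizer.model_max_length
if max_length > 10**20:
    max_length = None  # treat as "no fixed limit" instead of passing on the sentinel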

@alanakbik (Collaborator)

@tiagokv thanks for fixing this!

@alanakbik merged commit 498a674 into flairNLP:master on Apr 7, 2021