OverflowError: int too big to convert #2210
Comments
I think I've seen something like this before. You could check whether this property is set correctly on the tokenizer and/or the model objects used inside `TransformerWordEmbeddings`. According to this, it should be 128.
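The property in question is not named in the surviving text; assuming it is the tokenizer's `model_max_length` (a common cause of this error: when a checkpoint does not configure it, `transformers` fills in a huge sentinel value that the fast Rust tokenizer's integer type cannot represent), a minimal sketch of the check and fallback looks like this:

```python
# Sketch of the likely failure mode, assuming the property discussed above is
# the tokenizer's model_max_length. When unconfigured, transformers falls back
# to a huge sentinel, and passing that sentinel down to the fast (Rust)
# tokenizer overflows its integer type.
VERY_LARGE_INTEGER = int(1e30)  # transformers' "unset" sentinel (assumption)

def safe_max_length(model_max_length: int, cap: int = 128) -> int:
    """Replace the sentinel with a real limit (128, per the comment above)."""
    return cap if model_max_length >= VERY_LARGE_INTEGER else model_max_length

print(safe_max_length(VERY_LARGE_INTEGER))  # falls back to the cap
print(safe_max_length(512))                 # a configured value passes through
```

This is only an illustration of the logic, not Flair's actual code path; the right cap depends on the model you load.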
@alanakbik Can confirm it fixed one case for me where I was getting this issue!
Hello! It might be too late, but just in case anyone has the same issue: you just need to add the model here:
As you can see here, I've added the BERT model, since for me it used to cause the same error. Remember to import it from the Hugging Face 🤗 hub. It looks like the PR fix made by @tiagokv is needed for the other transformer models besides XLNet as well.
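The snippet referenced above did not survive the formatting, but the description suggests a per-model mapping that the embedding code consults. A hypothetical sketch of that pattern (the dict and function names here are illustrative, not Flair's actual source):

```python
# Illustrative per-model max-length table, of the kind the comment above
# describes adding an entry to. Names are hypothetical, not Flair's code.
MODEL_MAX_LENGTHS = {
    "xlnet-base-cased": 512,
    "emilyalsentzer/Bio_ClinicalBERT": 512,  # entry added for the model in this thread
}

def lookup_max_length(model_name: str, default: int = 512) -> int:
    """Look up a usable max length for a model, with a safe default."""
    return MODEL_MAX_LENGTHS.get(model_name, default)

print(lookup_max_length("emilyalsentzer/Bio_ClinicalBERT"))  # 512
```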
@S-glitch I have a similar issue when using the
Hi @seyyaw, yes, you need a modifiable installation of the Flair framework so you can edit the source. I think the easiest way would be to just clone the repository and use it directly. In the case of RoBERTa, for
The rest should be handled automatically thanks to PR #2191. Now, in the case of
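The model-specific edit is truncated above, but in the same spirit as cloning Flair and editing the source, one workaround is to cap the tokenizer's limit after constructing the embeddings rather than patching the library. A hedged sketch with a stand-in tokenizer object, so it runs offline (with real Flair you would apply the same cap to the embedding's tokenizer attribute, which is an assumption about Flair's internals, not verified here):

```python
# Post-construction workaround sketch: clamp model_max_length after the
# tokenizer is built, instead of editing Flair's source. A stub stands in
# for the real tokenizer so this runs without downloading a model.
class StubTokenizer:
    model_max_length = int(1e30)  # the unconfigured sentinel (assumption)

def cap_max_length(tokenizer, cap: int = 512):
    """Clamp an absurdly large model_max_length to a usable limit."""
    if tokenizer.model_max_length > cap:
        tokenizer.model_max_length = cap
    return tokenizer

tok = cap_max_length(StubTokenizer())
print(tok.model_max_length)  # 512
```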
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Hello,

I'm trying to train a named entity recognition model with this embedding:

`TransformerWordEmbeddings('emilyalsentzer/Bio_ClinicalBERT')`

However, it always fails with `OverflowError: int too big to convert`. This also happens with some other transformer word embeddings such as `XLNet`. However, `BERT` and `RoBERTa` work fine.

Here is the full traceback of the error:

I have tried changing the `embedding_storage_mode`, `hidden_size`, and `mini_batch_size`. None of these fixed the issue.

Does anyone have the same issue? Is there any way to resolve it?

Thanks