Positional Encoding RuntimeError #1157

Closed
VarunGumma opened this issue Apr 4, 2023 · 1 comment
VarunGumma commented Apr 4, 2023

I converted a fairseq transformer model checkpoint to int8 using CTranslate2 and used the converted model to evaluate on FLORES-101. When calling the translate_batch method I get the error RuntimeError: No position encodings are defined for positions >= 210, but got position 210. I checked my pre-processing stage and confirmed that my input sequence lengths were < 210 tokens. What could be causing this, and is anyone else facing it?
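
(For reference, a minimal sketch of the conversion step described above, using CTranslate2's Python Fairseq converter; the checkpoint and data paths here are hypothetical, not the ones from this report:)

import ctranslate2

# Convert a fairseq Transformer checkpoint to the CTranslate2 format
# with int8 weight quantization (all paths are hypothetical).
converter = ctranslate2.converters.FairseqConverter(
    model_path="checkpoint_best.pt",  # fairseq checkpoint
    data_dir="data-bin",              # fairseq data dir containing the dictionaries
)
converter.convert("ct2_model_int8", quantization="int8")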

@guillaumekln (Collaborator)

You should probably limit the maximum decoding length:

translator.translate_batch(..., max_decoding_length=210)
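
For context, a fuller sketch of that call (the model path and tokens below are hypothetical; translate_batch expects pre-tokenized input):

import ctranslate2

# Load the converted int8 model (path is hypothetical).
translator = ctranslate2.Translator("ct2_model_int8", device="cpu")

# translate_batch takes pre-tokenized input: a list of token lists.
batch = [["▁Hello", "▁world", "."]]

# Cap decoding at the model's maximum supported position (210 here);
# the default max_decoding_length is larger and can run past it.
results = translator.translate_batch(batch, max_decoding_length=210)
print(" ".join(results[0].hypotheses[0]))

Note that the position limit applies to the decoder's output positions as well, so even when every input is shorter than 210 tokens, a generated translation that runs long will reach position 210 during decoding, which is presumably why the error appears despite the short inputs and why capping max_decoding_length resolves it.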
