I converted a fairseq transformer model checkpoint to int8 using ctranslate2 and used the converted checkpoint to evaluate FLORES-101. I get the error `RuntimeError: No position encodings are defined for positions >= 210, but got position 210` when calling the `translate_batch` method. I checked my pre-processing stage and confirmed that my input sequence length was < 210 tokens. What could be causing this, and is anyone else seeing it?
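A common cause here is that the position limit applies to the token count the model actually sees: the number of subword units after BPE/SentencePiece splitting, plus any special tokens the converter appends (fairseq typically adds an end-of-sentence token to the source). A line that is under 210 whitespace-separated words can still reach position 210 after subword splitting. A minimal sanity check, assuming one appended special token (the exact count depends on your model):

```python
# Hypothetical sanity check: the position-encoding limit counts *subword*
# tokens plus special tokens, not raw words in the pre-processed text.
MAX_POSITIONS = 210      # limit reported by the RuntimeError
NUM_SPECIAL_TOKENS = 1   # assumption: one </s> appended to the source side

def exceeds_position_limit(tokens, max_positions=MAX_POSITIONS,
                           num_special=NUM_SPECIAL_TOKENS):
    """Return True if a pre-tokenized line would overflow the
    position-encoding table once special tokens are counted."""
    return len(tokens) + num_special >= max_positions

# 209 subword tokens + 1 special token = position 210, which overflows,
# even though the line may have far fewer than 209 surface words.
long_line = ["▁tok"] * 209
print(exceeds_position_limit(long_line))   # True
print(exceeds_position_limit(["▁tok"] * 208))  # 208 + 1 = 209 < 210
```

If overlong lines are the issue, truncating the tokenized source before calling `translate_batch` (CTranslate2 also exposes a `max_input_length` option on `translate_batch` in recent versions, which truncates inputs for you) should make the error go away.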