Currently, the simplest way to do this is to binarize the data you want to translate (making sure it uses the same BPE codes as the training data), and then reload the model with the --eval_only 1 parameter, specifying your binarized dataset as the test set. The model will then generate hypotheses for your dataset in the directory where the experiment is dumped. I'm sorry if this is not very convenient; I'll try to add a script later that only takes the source file and the MT model as input.
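As a rough sketch of that workflow (the BPE tool, script names, file paths, and the reload flag below are illustrative assumptions; only `--eval_only 1` comes from the comment above, so check the repo's README for the exact commands):

```bash
# 1. Apply the SAME BPE codes that were used at training time to the new source text.
#    (Tool is an assumption; fastBPE or subword-nmt are common choices.)
./fastBPE/fast applybpe my_input.bpe my_input.txt bpe_codes

# 2. Binarize the BPE'd text with the repo's preprocessing script.
#    (Script name and argument order are assumptions.)
python preprocess.py vocab my_input.bpe

# 3. Reload the trained model in evaluation-only mode, pointing the test set
#    at the binarized file. Hypotheses are written to the experiment dump directory.
#    (--reload_model and the data-path flags are assumptions; --eval_only 1 is from the comment above.)
python train.py --eval_only 1 --reload_model checkpoint.pth ...
```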
Hello,
Can you help me run inference with a saved checkpoint?
I am stuck at loading the model back into PyTorch and calling model.eval().
Thanks