This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

Inference #15

Closed
ptamas88 opened this issue Sep 11, 2018 · 1 comment

Comments

@ptamas88

Hello,
Can you help me run inference with a saved checkpoint?
I am stuck on loading the model back into torch and calling model.eval().
Thanks
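For the generic part of the question (reloading a checkpoint and switching to evaluation mode), the standard PyTorch pattern is a minimal sketch like the one below. The `nn.Sequential` model and the `checkpoint.pth` path are placeholders for illustration only; in this repo the real model architecture must be reconstructed to match what was trained before the state dict can be loaded.

```python
import torch
import torch.nn as nn

# Hypothetical small model standing in for the MT model;
# the architecture must match the one used at training time.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Save a checkpoint the usual way...
torch.save(model.state_dict(), "checkpoint.pth")

# ...then rebuild an identically shaped model, load the weights,
# and switch to eval mode (disables dropout / batch-norm updates).
restored = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
restored.load_state_dict(torch.load("checkpoint.pth"))
restored.eval()

with torch.no_grad():
    out = restored(torch.zeros(1, 4))
print(out.shape)  # torch.Size([1, 2])
```

Note that `load_state_dict` only restores weights, which is why the model object has to be constructed first; this is also why the checkpoint alone is not enough for inference here.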

@glample
Contributor

glample commented Sep 11, 2018

Hi,

Currently, the simplest way to do this is to binarize the data you want to translate (making sure it uses the same BPE codes as the training data), then reload the model with the --eval_only 1 parameter, specifying your binarized dataset as the test set. The model will then generate hypotheses for your dataset in the directory where the experiment is dumped. Sorry if this is not very convenient; I'll try to add a script that takes only the source file and the MT model as input later.
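The workflow above can be sketched as a command sequence. Only the --eval_only 1 flag comes from the comment; the script names, the fastBPE invocation, and all file paths are assumptions for illustration and must be replaced with this repository's actual preprocessing scripts and experiment parameters.

```shell
# 1. Apply the SAME BPE codes that were used at training time
#    (fastBPE invocation and paths assumed).
./fastBPE/fast applybpe data/test.src.bpe data/test.src bpe_codes

# 2. Binarize the BPE-split file with the repo's preprocessing
#    script (script name and vocab path assumed).
python preprocess.py vocab data/test.src.bpe

# 3. Reload the trained model in evaluation-only mode, pointing the
#    test set at the binarized file; hypotheses are written to the
#    experiment dump directory (other training flags elided).
python main.py --eval_only 1 ...
```

The key constraint is step 1: if the test data is segmented with different BPE codes than the training data, the token vocabulary no longer matches and the model's output will be degraded.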
