obtain alignment/attention information #694

Closed
lhk opened this issue Mar 22, 2020 · 1 comment

Comments

lhk commented Mar 22, 2020

I would like to access information on alignment, or even the full attention matrix. This is possible with the fairseq CLI (via the --print-alignment flag) and, to some extent, through the fairseq Python interface (as logging messages).
Is there a way to obtain this information with pytorch-translate?

jmp84 (Contributor) commented Apr 12, 2020

Hi @lhk, please use https://github.com/pytorch/fairseq instead of Translate, as Translate is deprecated (see the announcement at the top of the README in https://github.com/pytorch/translate). We are developing some Translate features, such as TorchScript support, directly in fairseq. If you have any questions or feature requests, please open an issue directly in https://github.com/pytorch/fairseq.
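
For reference, a minimal sketch of how the attention/alignment information mentioned above might be pulled out through fairseq's torch.hub interface (the CLI equivalent being fairseq-generate with --print-alignment). The pretrained model name, the tokenizer/bpe arguments, and the 'attention'/'alignment' keys in the hypothesis dicts are assumptions that may differ across fairseq versions:

```python
# Sketch: retrieving attention/alignment from a fairseq hub model.
# Model name and hypothesis-dict keys are assumptions; check your fairseq version.
import torch

# Load a pretrained translation model via torch.hub (downloads on first use).
en2de = torch.hub.load(
    'pytorch/fairseq', 'transformer.wmt19.en-de.single_model',
    tokenizer='moses', bpe='fastbpe',
)
en2de.eval()

# Encode the source sentence and run beam search; generate() returns
# hypothesis dicts rather than just the decoded string.
tokens = en2de.encode('Machine translation is useful.')
hypos = en2de.generate(tokens, beam=5)

best = hypos[0]
print(en2de.decode(best['tokens']))   # translated sentence
print(best.get('attention'))          # cross-attention weights, if the model exposes them
print(best.get('alignment'))          # hard alignment; may require an alignment-aware setup
```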

jmp84 closed this as completed on Apr 12, 2020