Shared Embeddings #2
Comments
@adrian-spataru yes! Would you be willing to submit a PR? My desktop machine broke, and we have stay-at-home orders because of this pandemic 👍
Sorry to hear that @lucidrains! Hopefully, no data loss.
thanks Adrian :)
@adrian-spataru added it as a keyword
Sharing the token_emb between Encoder & Decoder is not enabled by default. Many transformers like BART/T5 use a shared encoder/decoder embedding.
Would this be enough?
Furthermore, the Encoder/Decoder example in the README doesn't work out of the box; it also needs a value for `enc_num_memory_tokens`.
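The shared-embedding setup described above (as in BART/T5) can be sketched in minimal PyTorch. This is an illustrative assumption, not the x-transformers API: `SimpleEncoder` and `SimpleDecoder` are hypothetical stand-ins, and the point is only that both modules hold a reference to the same `nn.Embedding`, so one weight table serves both sides.

```python
# Hypothetical sketch of encoder/decoder token-embedding sharing.
# SimpleEncoder/SimpleDecoder are illustrative names, not real API.
import torch
import torch.nn as nn

class SimpleEncoder(nn.Module):
    def __init__(self, token_emb: nn.Embedding):
        super().__init__()
        self.token_emb = token_emb  # reference to the shared module, not a copy

    def forward(self, x):
        return self.token_emb(x)

class SimpleDecoder(nn.Module):
    def __init__(self, token_emb: nn.Embedding):
        super().__init__()
        self.token_emb = token_emb  # same module instance as the encoder's

    def forward(self, x):
        return self.token_emb(x)

# One embedding table, passed to both halves of the model.
shared_emb = nn.Embedding(num_embeddings=100, embedding_dim=16)
enc = SimpleEncoder(shared_emb)
dec = SimpleDecoder(shared_emb)

# Both paths read (and backprop into) the same weight tensor.
assert enc.token_emb.weight is dec.token_emb.weight
```

Because the weight tensor is shared, gradients from either the encoder or decoder path accumulate into the single embedding table, which is the behavior BART/T5-style models rely on.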