Mask for TransformerDecoder in the end-to-end Transformer (chapter11_part04_sequence-to-sequence-learning.ipynb) #209

Open
balvisio opened this issue Jul 29, 2022 · 0 comments


balvisio commented Jul 29, 2022

In chapter11_part04_sequence-to-sequence-learning.ipynb, the TransformerDecoder receives its mask from the PositionalEmbedding layer applied to the target sequence:

```python
x = PositionalEmbedding(sequence_length, vocab_size, embed_dim)(decoder_inputs)
x = TransformerDecoder(embed_dim, dense_dim, num_heads)(x, encoder_outputs)
```

Shouldn’t the mask be the one derived from the source sequence, given that the decoder’s cross-attention attends over the encoder outputs, which is where the source padding lives?
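
For context on why the decoder ends up with the target-side mask: in the Keras functional API, the padding mask a layer emits via `compute_mask()` travels with that layer’s output tensor, and a layer whose `call()` accepts a `mask` argument has it filled from the mask attached to its first argument only. A minimal toy demonstration of this (the `MaskEcho` layer and the token values are mine for illustration, not the notebook’s code):

```python
import tensorflow as tf
from tensorflow.keras import layers

class MaskEcho(layers.Layer):
    """Toy decoder-shaped layer that just reports which mask Keras hands it."""
    supports_masking = True

    def call(self, inputs, encoder_outputs, mask=None):
        # `mask` is filled from the mask attached to `inputs` (the first
        # positional argument), not from the one attached to `encoder_outputs`.
        tf.print("mask received:", mask)
        return inputs

emb = layers.Embedding(input_dim=10, output_dim=4, mask_zero=True)
target_tokens = tf.constant([[3, 5, 0, 0]])  # 0 = padding (target side)
source_tokens = tf.constant([[7, 0, 0, 0]])  # 0 = padding (source side)

x = emb(target_tokens)                # carries mask [[True, True, False, False]]
encoder_outputs = emb(source_tokens)  # carries mask [[True, False, False, False]]

MaskEcho()(x, encoder_outputs)        # prints the *target* mask
```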

For example, I have seen that this TF tutorial uses the mask from the source sequence instead.
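
If one did want the cross-attention to mask the padded source positions explicitly, rather than relying on the automatically propagated target mask, one way to wire it might look like the sketch below (hedged: `cross_attend` and its argument names are illustrative, not the book’s API):

```python
import tensorflow as tf
from tensorflow.keras import layers

def cross_attend(attn: layers.MultiHeadAttention,
                 decoder_hidden, encoder_outputs, source_mask):
    """Cross-attention that ignores padded *source* positions.

    source_mask: boolean padding mask of the source sequence,
    shape (batch, source_len), e.g. from the encoder-side embedding layer.
    """
    # MultiHeadAttention accepts an attention_mask broadcastable to
    # (batch, target_len, source_len); shape (batch, 1, source_len)
    # broadcasts over the target axis.
    attention_mask = tf.cast(source_mask[:, tf.newaxis, :], dtype="int32")
    return attn(
        query=decoder_hidden,
        value=encoder_outputs,
        key=encoder_outputs,
        attention_mask=attention_mask,
    )
```

This reflects what the tutorial mentioned above appears to do: the mask applied to the cross-attention scores comes from the source sequence rather than the target sequence.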

Any clarification would be greatly appreciated.
