This repository has been archived by the owner on Aug 1, 2023. It is now read-only.

Remove pad tokens from attention during decoding #226

Closed
wants to merge 1 commit

Conversation

pmichel31415 (Contributor) commented

Incidentally this fixes #225
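
For context, the change described by the title amounts to masking padding positions so they receive no attention weight at decode time. The actual diff is not reproduced here; the snippet below is only a minimal sketch of that idea using PyTorch's `torch.nn.MultiheadAttention` and its `key_padding_mask` argument. The pad index, tensor shapes, and names (`pad_idx`, `src_tokens`, `encoder_out`) are illustrative assumptions, not code from this PR.

```python
import torch
import torch.nn as nn

pad_idx = 1                                               # assumed pad token id
src_tokens = torch.tensor([[4, 5, 6, pad_idx, pad_idx]])  # (batch, src_len), padded

# True where the source token is padding; these key positions are ignored by attention.
encoder_padding_mask = src_tokens.eq(pad_idx)             # (batch, src_len), bool

attn = nn.MultiheadAttention(embed_dim=8, num_heads=2, batch_first=True)
query = torch.randn(1, 3, 8)        # decoder states for the steps decoded so far
encoder_out = torch.randn(1, 5, 8)  # encoder states, pad positions included

out, weights = attn(
    query, encoder_out, encoder_out,
    key_padding_mask=encoder_padding_mask,
)

# Attention weights over the padded source positions are exactly zero.
print(weights[..., encoder_padding_mask[0]])
```

The same idea applies to decoder self-attention, with the mask built from target-side pad positions instead of source-side ones.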

@jmp84 (Contributor) left a comment


Sorry for the late review, can you rebase?

@facebook-github-bot commented

Hi @pmichel31415!

Thank you for your pull request. We require contributors to sign our Contributor License Agreement, and yours needs attention.

You currently have a record in our system, but we do not have a signature on file.

In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g., your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!

@jmp84 closed this Nov 1, 2020
Development

Successfully merging this pull request may close these issues.

Unk replacement doesn't work with the transformer
3 participants