This repository has been archived by the owner on Nov 22, 2022. It is now read-only.

Add option to skip 2 stage tokenizer and bpe decode sequences in the debug file #1257

Closed
wants to merge 1 commit

Conversation


@anchit commented Feb 19, 2020

Summary:

  • When using the transformer decoupled model for normal text sequences, we want to use the GPT2 tokenizer directly, skipping the two-stage tokenizer.
  • The debug files contained BPE tokens, which were not human-readable; this diff decodes them before writing the file.

Reviewed By: einolghozati

Differential Revision: D19881420
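The second bullet can be illustrated with a minimal sketch. The function name, the toy byte-decoder table, and the debug-file format below are all hypothetical, not PyText's actual implementation; the sketch only shows the general idea of joining GPT2-style BPE pieces and mapping their byte symbols (e.g. "Ġ" for a leading space) back to readable text before writing a debug line.

```python
# Hypothetical sketch: turn BPE token strings back into readable text
# before writing them to a debug file. The names and the toy
# byte-decoder are illustrative only.

def decode_bpe_tokens(bpe_tokens, byte_decoder):
    """Join BPE pieces and map GPT2-style byte symbols back to bytes."""
    merged = "".join(bpe_tokens)
    # Symbols not in the table are assumed to already be plain characters.
    data = bytearray(byte_decoder.get(ch, ord(ch)) for ch in merged)
    return data.decode("utf-8", errors="replace")

# Toy byte-decoder: GPT2's encoding maps the space byte (0x20) to "Ġ".
byte_decoder = {"Ġ": 0x20}

# BPE pieces as they would appear undecoded in a debug file.
tokens = ["Hello", "Ġworld"]
readable = decode_bpe_tokens(tokens, byte_decoder)
print(readable)  # the decoded, human-readable sequence
```

With this applied, the debug file would contain the decoded sequence rather than raw pieces like `Ġworld`.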

…debug file

fbshipit-source-id: d6ef4b1a3907e6939856e2e9b2d6b86c43896728
@facebook-github-bot added the CLA Signed and fb-exported labels on Feb 19, 2020
@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D19881420

@facebook-github-bot

This pull request has been merged in 253e483.
