
[WIP?] Trying to fix Transformers tokenizer issues when models have extended vocabulary and/or special tokens #766

Merged 3 commits on Apr 19, 2024

Commits on Apr 18, 2024

  1. 058b289
  2. 3285dfa
  3. run black
     Harsha-Nori committed Apr 18, 2024
     b34a20b
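
For context on the class of problem named in the PR title: when a Hugging Face tokenizer has tokens added on top of its base vocabulary, `len(tokenizer)` and `tokenizer.vocab_size` diverge, and code that only enumerates the base vocabulary can miss the added or special token ids. The sketch below only illustrates that behaviour (the `<|mytok|>` token is a placeholder); it is not the change made in this PR.

```python
from transformers import AutoTokenizer

# Load a base tokenizer and extend it with an extra special token
# ("<|mytok|>" is just an illustrative placeholder).
tok = AutoTokenizer.from_pretrained("gpt2")
tok.add_special_tokens({"additional_special_tokens": ["<|mytok|>"]})

# The base vocabulary size no longer matches the full tokenizer length,
# and the added token only appears in get_added_vocab().
print(tok.vocab_size)         # 50257 (base vocabulary only)
print(len(tok))               # 50258 (base vocabulary + added tokens)
print(tok.get_added_vocab())  # includes {'<|mytok|>': 50257, ...}

# An integration that builds its token table by iterating
# range(tok.vocab_size) never sees id 50257 at all.
for tid in [50256, 50257]:
    print(tid, tok.convert_ids_to_tokens(tid))
```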